Now that OpenAI’s ChatGPT is publicly available, the larger discussion revolves around what impact it will have on society and on the enterprise itself.
When it comes to the latter, the London, Ontario-based Info-Tech Research Group describes the release as a “watershed moment in the history of generative AI,” as it provides human-like conversations on a wide variety of topics and can do everything from writing poetry to helping to debug code and even troubleshooting software and hardware issues.
Earlier this month, the research firm hosted a webinar exploring potential uses of the chatbot, which it says, unlike other chatbots or intelligent software assistants, is far more adept at engaging in conversation with its users, and can even respond to feedback, request clarification, and iterate on its answers based on individual user input.
The information session, which was moderated by Jeremy Roberts, Info-Tech’s director of research, and Jack Hakimian, the organization’s senior vice president of workshop and advisory research, examined three specific areas in which generative AI can be used in the enterprise:
- Enterprise Support: ChatGPT or other conversational AI tools can serve as the back end of an information concierge that automates enterprise support (a minimal sketch of this idea follows the list). According to the firm’s research, “Chatbots already exist, but ChatGPT could be a game changer.”
- Customer Contact: The automated workflow of current chatbots or website search functions can frustrate users when they return lists of semi-related results. With this in mind, the research showed that “Generative AI can answer questions more cheaply, intelligently direct users to appropriate products and services, and improve the customer journey so that it can be a differentiator.”
- Product Development: Info-Tech says that key business applications for generative AI include creating marketing copy, summarizing long documents, and authoring communications. “Anyone creating content could look to supplement their workflow with an intelligent solution like ChatGPT.”
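To make the first item concrete, here is a minimal, purely illustrative sketch of an enterprise-support “information concierge” written against the current OpenAI Python SDK, which postdates the webinar; the model name, helpdesk notes, and helper function are hypothetical placeholders rather than anything Info-Tech or OpenAI prescribes.

```python
# Hypothetical sketch of the "information concierge" idea above, assuming
# the current OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable. Model name and helpdesk notes are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Stand-in for an internal knowledge base the concierge may draw on.
HELPDESK_NOTES = """
VPN access: connect with the corporate VPN client; MFA is required.
Password resets: use the self-service portal, or open a ticket with IT.
"""

def answer_support_question(question: str) -> str:
    """Route an employee question through a conversational model,
    constrained to the vetted notes above."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "You are an internal IT support concierge. Answer only "
                    "from the notes below; if the answer is not there, tell "
                    "the user to open a support ticket.\n" + HELPDESK_NOTES
                ),
            },
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(answer_support_question("How do I reset my password?"))
```

Constraining the bot to a vetted knowledge base and deferring to a human ticket queue when it is unsure reflects the webinar’s later advice to start with augmentation rather than full automation.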
Roberts suggests that “there are a few steps IT departments should take to refine their use case for generative AI. First, [they] should review their capability map for high-value processes, then perform a basic cost-benefit analysis for the technology, and finally, explore the vendor landscape to find the solution that best meets their needs.”
The online session yielded three key recommendations that IT managers should consider:
- First, ChatGPT and other generative AI solutions are tools, nothing more, and as such, “there are things this technology is particularly good at and others for which it is not particularly useful. The key is understanding your business processes and their friction points,” with the aim of uncovering opportunities to reduce costs, enhance the quality of the service experience, and increase efficiency.
- Second, although it may be appealing to implement an aggressive AI strategy, IT teams should start with augmentation. Generative AI, Info-Tech says, is “an incredible technology, but it’s still not self-sufficient. It still needs guidance and feedback from human curators.”
- Third, talk to a lawyer and get legal advice before implementing the technology. The firm says chatbots that manage workflow aren’t complex, “but a bot that will interact with users and customers and produce content could put you at legal risk.”
In an interview with IT World Canada, Roberts said that once ChatGPT is commercialized and procured, standard practices that currently exist within an IT organization will need to be followed in terms of security precautions.
As for next steps, he said that OpenAI is still in the research phase with its language tools, and while ChatGPT is currently free to use and has attracted more than a million users, he questioned whether the company “can continue to subsidize all of our entertainment” indefinitely.
According to a recent report from Reuters, the San Francisco-based company, co-founded by Elon Musk, who is no longer involved, and investor Sam Altman, and backed by US$1 billion in funding from Microsoft Corp., is hoping to grow its business.
The story goes on to say that, according to three sources briefed on OpenAI’s recent pitch to investors, the organization expects US$200 million in revenue next year and US$1 billion by 2024.
Microsoft, for one, said Roberts, “is probably going to expect some sort of return on their investment at some point. They’re nice guys, but they’re not philanthropic. I don’t doubt we’ll see (the ChatGPT functionality) gradually built into Microsoft products, but it will be fascinating to see what that might look like. Is it going to be another feature that they don’t charge extra for under that OpenAI license, or is it going to be a new product? Clippy coming back, but actually useful this time?”
It is also likely that, due to the compute costs of the ChatGPT launch, which Altman described as “eye-watering,” OpenAI will “slap some sort of paywall on it.”
The tool’s launch represents a huge leap forward in the world of bots, he said, one that’s far more advanced than a bot writing poetry “in the style of John Milton about the burrito you left in the microwave too long.”
“Now they’re getting Napoleon Dynamite fan fiction of Twilight. The thing about technology and societal change as a whole is that it happens very slowly, and then all at once. OpenAI was like that for technology in a way that a lot of things haven’t been recently.”
Also interesting is how the advent of ChatGPT will affect Google, and whether it will impact the organization’s overall revenue model.
“The challenge isn’t technical for them,” said Roberts, adding that they could probably introduce a similar AI tool on Google.com and create a sensation. “I don’t think they have figured out what the end game is.”