Tech giants banking on AI for their future growth often overlook the cost of the technology. OpenAI, Microsoft, and Google declined to comment on the matter when approached by the paper. Let's take a closer look at AI chatbot costs.
However, experts believe that the cost of AI is the most significant obstacle to the big tech vision of generative AI revolutionizing various industries, reducing staff, and enhancing efficiency.
The computational intensity of the AI involved is why OpenAI withheld its powerful new language model, GPT-4, from the free version of ChatGPT, which still runs on the weaker GPT-3.5 model.
Unfortunately, ChatGPT's underlying dataset was last updated in September 2021, rendering it useless for research or discussion of recent events. Even those who pay $20 (about R$100 in direct conversion, at current rates) a month for GPT-4 can send only 25 messages every three hours due to the high cost of running it, and it responds much more slowly.
AI chatbot costs may also be one of the reasons Google has yet to integrate an AI chatbot into its flagship search engine, which answers billions of queries every day. When Google launched its Bard chatbot in March 2023, it opted not to use its largest language model.
Dylan Patel, lead analyst at semiconductor research firm Semi Analysis, estimated that a single ChatGPT chat can cost up to a thousand times more than a simple Google search. In a recent AI report, the Biden administration identified the computational costs of generative AI as a national concern. The White House wrote that the technology is expected to “drastically increase computational demands and associated environmental impacts” and that there is an “urgent need” to design more sustainable systems.
Hidden costs of AI
Generative AI requires massive amounts of computing power and specialized computer chips known as GPUs, which only the wealthiest companies can afford.
The competition for access to these chips has turned their leading suppliers into tech giants in their own right, holding the keys to the industry's most valuable resource.
Silicon Valley came to dominate the internet economy by offering free services such as online search, email, and social media, initially incurring losses but eventually making vast profits through personalized advertising.
As a result, AI chatbots are likely to feature ads. However, analysts suggest that ads alone won't be enough to make AI tools profitable in the near future.
Companies offering AI models for consumer use must balance their desire to gain market share against the financial losses they are incurring.
The pursuit of more capable AI is expected to generate profits primarily for the chip makers and cloud computing giants that already dominate the digital space and supply the hardware required to run these models.
It is no coincidence that the companies leading the development of AI language models are among the largest cloud computing vendors, such as Google and Microsoft, or have close partnerships with them, as OpenAI does with Microsoft.
However, companies that purchase AI tools from these vendors may not realize that they are locked into heavily subsidized services that cost far more to run than what they currently pay, according to Clem Delangue, CEO of Hugging Face, an open-source AI company.
OpenAI CEO Sam Altman indirectly acknowledged this issue during a US Senate hearing in May, when Senator Jon Ossoff cautioned that if OpenAI made ChatGPT addictive in a way that harms children, Congress "will be very strict." Altman reassured the senator that there was no need to worry.
Learn more about AI and Machine Learning here.