The cost dilemma: Will it be the insurmountable obstacle to the success of ChatGPT?

Reliance on AI like ChatGPT will be expensive and companies and investors are considering alternatives to at least lower the bill.


The explosion of generative artificial intelligence (AI) has taken the world by storm, but there is a question that is rarely asked: Who can afford it? OpenAI spent around $540 million last year developing ChatGPT and says it needs $100 billion to fulfill its ambitions, according to The Information.

“We’re going to be the most capital-intensive startup in Silicon Valley history,” Sam Altman, co-founder and CEO of OpenAI, recently declared at a roundtable discussion. And when Microsoft, which has invested billions of dollars in OpenAI, is asked how much its AI venture will cost, the company says it is keeping an eye on its bottom line.

Building anything even close to what OpenAI, Microsoft, or Google offer would require an exorbitant investment in next-generation chips and the hiring of award-winning researchers.

“People don’t realize that to do a significant amount of AI stuff, like ChatGPT, you need huge amounts of processing power. And training those models can cost tens of millions of dollars,” said Jack Gold, an independent analyst. “How many companies can afford to buy 10,000 Nvidia H100 systems that cost tens of thousands of dollars each?” Gold wondered.
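To put Gold's figures in perspective, a rough back-of-envelope calculation can be sketched as below. The $30,000 unit price is an illustrative assumption standing in for "tens of thousands of dollars each"; it is not a figure quoted in the article.

```python
# Back-of-envelope estimate of the hardware bill Gold describes.
# Assumption: $30,000 per GPU, chosen only to illustrate the scale.
num_gpus = 10_000
price_per_gpu_usd = 30_000

total_usd = num_gpus * price_per_gpu_usd
print(f"${total_usd:,}")  # $300,000,000
```

Even under this conservative per-unit assumption, the hardware alone runs into the hundreds of millions of dollars, before counting power, data centers, and researcher salaries.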

The answer is virtually no one, and in technology, if the infrastructure can’t be built, it’s rented. That’s what companies are already doing en masse by outsourcing to Microsoft, Google, and Amazon’s AWS. And with the advent of generative AI, this cloud computing sector is growing while leaving the same tech giants in a dominant position, experts warn.


The unpredictable costs of cloud computing are “a very underestimated problem for many companies,” said Stefan Sigg, head of product at Software AG, which develops enterprise software. Sigg compares cloud costs to electricity bills, saying companies that don’t keep track will be in for a “big surprise” if they let their engineers rack up expenses in the fast-paced race to create technology, including AI.

Azure is Microsoft’s flagship cloud offering, and some observers believe the giant’s bet on AI is really about protecting Azure’s success and ensuring the future of this cash cow. For Microsoft, “the goose that lays the golden eggs is to monetize the cloud with Azure, because we are talking about what could be a $20 billion, $30 billion, or $40 billion-a-year opportunity in the future if the bet on AI succeeds,” said Dan Ives of Wedbush Securities.

Microsoft CEO Satya Nadella insists that generative AI is “moving fast in the right direction.” Deeply respected on Wall Street, Nadella will have a grace period of six to nine months to prove that his bet is a winner, Ives predicted.

Challenge for regulators

Sustaining profits at the company founded by Bill Gates can only mean passing the cost of AI on to customers. Reliance on AI will be expensive, and companies and investors are considering alternatives to at least lower the bill.

Regulators hope they can keep up and prevent the giants from taking charge and imposing their conditions on smaller companies. “Lawmakers (must) ensure that (…) opportunities and openings for competition (…) are not crushed by incumbents,” Federal Trade Commission (FTC) chair Lina Khan told CNBC. But it might be too late, at least when it comes to which companies have the wherewithal to lay the groundwork for generative AI.