
The AI monetization conundrum rages on as OpenAI’s costs rocket

Despite its immense popularity, OpenAI is reportedly burning through money at an unsustainable rate and could face a staggering $5 billion loss by the end of 2024.

That’s according to a report from The Information, which cites unreleased internal financial statements and industry figures showing that OpenAI has already spent roughly $7 billion on training models and as much as $1.5 billion on staffing. 

Dylan Patel of SemiAnalysis previously told The Information that OpenAI was spending some $700,000 a day to run its models in 2022, posting losses of nearly $500 million that year alone. 

Despite generating substantial revenue, estimated at $3.5 billion to $4.5 billion annually, OpenAI’s expenses far outpace its income.

The company has already raised over $11 billion through seven rounds of funding and is currently valued at $80 billion. 

However, despite ChatGPT being a household name with millions of users worldwide, OpenAI could prove a money pit for investors if nothing changes. 

Microsoft, OpenAI’s biggest backer by far, has already poured billions into the company in recent years. 

Its most recent injection of cash, $10 billion in early 2023, was rumored to include a 75% share of OpenAI’s profits and a 49% stake in the company, alongside the integration of ChatGPT into Bing and other Microsoft products. 

In return, OpenAI receives access to Azure cloud servers at a substantially reduced rate.

But in the world of generative AI, there are never enough chips, cloud hardware, or groundbreaking, world-changing ideas that require billions to get off the ground. 

OpenAI is heavily invested in being the first to achieve artificial general intelligence (AGI), an ambitious and incredibly expensive endeavor.

CEO Sam Altman has already hinted that he simply won’t stop until that goal is achieved. 

He’s involved in nuclear fusion ventures and has discussed an international chip project, worth trillions, backed by the UAE and US governments. 

Competition is red-hot

Competition in the generative AI space is also intensifying, with big players like Google, Amazon, and Meta all vying for a slice of the pie. 

While ChatGPT remains the most widely known AI chatbot, it’s capturing an increasingly smaller share of the total revenue up for grabs.

Plus, the open-source camp, led largely by Mistral and Meta, is building increasingly powerful models that are cheaper and more controllable than closed lab projects from OpenAI, Google, and others. 

As Barbara H. Wixom, a principal research scientist at the MIT Center for Information Systems Research, aptly puts it, “Like any tool, AI creates no value unless it’s used properly. AI is advanced data science, and you need to have the right capabilities in order to work with it and manage it properly.” 

And therein lies a critical point. If a company has the money and technical know-how to harness generative AI, it doesn’t necessarily have to partner with closed-source companies like OpenAI. Instead, it can create its own, more bespoke, sovereign solutions.

Salesforce recently proved that by releasing a cutting-edge compact model for API calls that outperformed frontier models from OpenAI, Anthropic, and others. 

OpenAI and others are trying to push the envelope with enterprise solutions like ChatGPT Enterprise, but it’s tough going, as generative AI is both costly and of questionable value right now. 

Adam Selipsky, CEO of Amazon Web Services (AWS), said himself in 2023, “A lot of the customers I’ve talked to are unhappy about the cost that they are seeing for running some of these models.”

AI companies are responding by cutting the prices of their models and releasing lighter-weight versions like GPT-4o mini, but that, too, presents a conundrum. When do companies take the leap into AI when the options are changing all the time?

2023 provided few answers for AI monetization

2023 acted as a testing ground for various AI monetization approaches, but none proved a silver bullet for the industry’s mounting costs. 

One of the biggest challenges of AI monetization is that it doesn’t offer the same economics as conventional software. 

Each user interaction with a model like ChatGPT requires fresh computation, which consumes energy and creates ongoing costs that scale as more users join the system. 

This poses a major challenge for companies offering AI services at flat rates, as expenses can quickly outpace revenues.

If subscription prices rise too much, people will simply bail. Economic surveys suggest that subscriptions are among the first things culled when people want to cut their spending.

Microsoft’s recent collaboration with OpenAI on GitHub Copilot, an AI coding assistant, served as a prime example of how subscriptions can backfire. 

Microsoft charged a $10 monthly subscription for the tool but reportedly lost more than $20 per user per month on average. Some power users inflicted losses of up to $80 per month.

It’s likely a similar situation with other generative AI tools. Many casual users subscribe to just one of the many available tools month to month and can readily cancel and switch to a different one. At the other extreme, unprofitable power users consume resources without contributing to profits.
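The Copilot figures illustrate the underlying unit-economics problem. A minimal back-of-the-envelope sketch, using the reported $10 subscription price and per-user losses, with a purely hypothetical usage distribution for illustration:

```python
# Back-of-the-envelope flat-rate subscription economics.
# The $10 price and the rough cost magnitudes come from the reported
# Copilot figures; the user mix below is hypothetical.

PRICE = 10.00  # flat monthly subscription, USD

# (share of user base, estimated monthly compute cost per user, USD)
user_mix = [
    (0.70, 15.00),  # casual users
    (0.25, 45.00),  # regular users
    (0.05, 90.00),  # power users (up to ~$80 monthly loss each)
]

avg_cost = sum(share * cost for share, cost in user_mix)
avg_margin = PRICE - avg_cost

print(f"average cost per user:   ${avg_cost:.2f}")
print(f"average margin per user: ${avg_margin:.2f}")
# A negative margin means every new subscriber deepens the loss --
# the opposite of conventional software, where the marginal cost of
# serving one more user is close to zero.
```

With this mix the average cost per user comes out above the subscription price, so growth makes the hole bigger, not smaller; that is the dynamic the Copilot numbers point to.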

Some believe OpenAI has resorted to questionable tactics to keep the money flowing. For example, the GPT-4o demo, timed to coincide with Google I/O, showed real-time speech features that seemed to break new ground and outshine Google’s announcements.

We’re still waiting for those much-hyped voice features to roll out. OpenAI has yet to release them to anyone, citing safety concerns.

“We’re improving the model’s ability to detect and refuse certain content,” OpenAI said of the delay. 

“We’re also working on improving the user experience and preparing our infrastructure to scale to millions while maintaining real-time responses. As part of our iterative deployment strategy, we’ll start the alpha with a small group of users to gather feedback and expand based on what we learn.”

Premium sign-ups spiked as people rushed to try the new features. Was OpenAI chasing a short-term revenue boost driven by features that were never ready?

Energy costs are another roadblock

There’s another snag in generative AI’s monetization mission: power and water consumption. 

By 2027, the energy consumed by the AI industry could rival that of a small nation. Recent spikes in water usage at Microsoft and Google are largely attributed to intensive AI workloads.

Google recently disclosed that AI was throwing its sustainability strategies off track. The company’s CO2 emissions have surged by 48% since 2019, and executives have all but admitted that AI workloads are to blame. 

AI-related water shortages recently gripped Taiwan, which began redirecting water from agricultural use to AI amid a drought in a bid to keep chip manufacturing online. Water shortages hit parts of the US in 2023, too, so there are real environmental impacts to contend with.

Speaking at the World Economic Forum, Altman said, “We do need way more energy in the world than we thought we needed before. We still don’t appreciate the energy needs of this technology.”

This all comes at a cost, both at the corporate level for Microsoft, Google, and others, and for local and national economies.

The coming years will be pivotal in shaping generative AI’s trajectory, both in terms of its return on investment and its sustainability, and the tension between the two. 

As Barbara H. Wixom from MIT warns, “You have to find a way to pay for this. Otherwise, you can’t sustain the investments, and then you have to pull the plug.”

Will generative AI ever grind to a halt? You have to think it’s too big to fail. But it does seem stuck in monetization purgatory right now, and something, from somewhere, needs to deliver another jolt of progress. 

It may not take much to push generative AI toward a necessary flashpoint where progress comes cheaply and naturally. 

Fusion power, analog low-power AI hardware, lightweight architectures: it’s all in the pipeline. We can only wait and watch to see when it all clicks into place. 
