
AI training costs are rising exponentially – IBM says quantum computers could be an answer

Earlier this month, the Wall Street Journal reported that a third of all nuclear power plants are in talks with tech companies to power their new data centers. Meanwhile, Goldman Sachs predicted that AI will drive a 160% increase in data center electricity consumption by 2030, pushing carbon dioxide emissions to more than double current levels. It's estimated that each ChatGPT query consumes at least ten times as much energy as a Google search. The question is: will the exponentially rising cost of training AI models ultimately limit AI's potential?
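For a rough sense of scale, here is a back-of-envelope sketch (mine, not from the report); the per-query energy figure and daily query volume are assumptions chosen only to illustrate the "ten times" claim.

```python
# Back-of-envelope energy comparison (illustrative assumptions, not measured data).
google_search_wh = 0.3                    # assumed energy per Google search, in Wh
chatgpt_query_wh = google_search_wh * 10  # "at least ten times" per the claim above

queries_per_day = 100_000_000             # assumed daily query volume, for illustration
daily_mwh = queries_per_day * chatgpt_query_wh / 1_000_000  # convert Wh to MWh

print(f"~{daily_mwh:,.0f} MWh/day at {queries_per_day:,} queries/day under these assumptions")
# ~300 MWh per day under these assumptions, before any training workloads are counted.
```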

VB Transform 2024 addressed the topic in a panel led by Hyunjun Park, co-founder and CEO of CATALOG. To discuss the scale of the problem and potential solutions, Park welcomed Dr. Jamie Garcia, Director of Quantum Algorithms and Partnerships at IBM; Paul Roberts, Director of Strategic Accounts at AWS; and Kirk Bresniker, Chief Architect at Hewlett Packard Labs and HPE Fellow and VP, to the stage.

Unsustainable resources and unfair technology

“The 2030 timeframe is just far enough away to be able to make some course corrections, but it is also real enough to consider the impacts of what we’re doing now,” Bresniker said.

Somewhere between 2029 and 2031, the cost of training a single model will exceed US GDP, he added, and by 2030 it may exceed global IT spending. So we are heading toward a hard cap, and the time to make decisions is now, and not simply because costs are becoming prohibitive.

“Because the question of sustainability is also a question of justice,” he explained. “If something is demonstrably unsustainable, then it is inherently unjust. So if we want to have widespread and hopefully universal access to this incredible technology, we need to look at what we can do. What do we need to change? Is there something about this technology that needs to change drastically so that we can make it universally accessible?”
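To make the shape of Bresniker's cost argument concrete, here is a small illustrative extrapolation (not his model); the starting cost, growth rates, and GDP figure are assumptions used only to show how quickly compounding reaches an economy-sized ceiling.

```python
# Illustrative exponential extrapolation (assumed numbers, not from the panel):
# if a frontier training run costs on the order of $100M today, how many years
# of compounding cost growth would it take to cross a GDP-scale ceiling?
cost_today = 100e6   # assumed cost of one frontier training run today, in USD
ceiling = 27e12      # rough US GDP, in USD

for growth in (2.0, 3.0, 10.0):  # assumed annual cost multipliers
    cost, years = cost_today, 0
    while cost < ceiling:
        cost *= growth
        years += 1
    print(f"{growth:>4.1f}x per year -> ceiling crossed in ~{years} years")
# The exact year depends entirely on the assumed growth rate; the point is how
# few doublings separate today's budgets from a hard cap.
```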

The role of corporate responsibility

Some companies are taking responsibility for this looming environmental disaster while working to mitigate the looming financial one. When it comes to carbon footprint, AWS has charted a course toward more responsible stewardship and sustainability, which today includes deploying the latest liquid cooling solutions from Nvidia, among other measures.

“We're looking at improvements in steel and concrete to reduce our carbon use,” Roberts explained. “We're also looking at alternative fuels. Instead of just using traditional diesel fuel in our generators, we're looking at using hydrotreated vegetable oil and other alternative sources there.”

AWS is also pushing alternative chips. For example, it has launched its own silicon, Trainium, which it says can be several times more efficient than comparable options. And to reduce the cost of inference, it has announced Inferentia, which it says delivers a performance-per-watt improvement of over 50% compared with existing options.

The company's second-generation UltraCluster network, which supports training and pre-training, scales to as many as 20,000 GPUs and delivers network throughput of about 10 petabits per second at sub-10-microsecond latency, a 25% reduction in overall latency. The end result: more models trained in less time and at a lower cost.
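For context, simply dividing the quoted aggregate figures gives the average per-GPU share of that fabric; a quick sketch of the arithmetic (mine, not an AWS specification):

```python
# Per-GPU share of the quoted UltraCluster figures (naive division of the
# numbers cited above; actual per-instance bandwidth will differ).
total_throughput_pbps = 10   # quoted aggregate throughput, petabits per second
gpus = 20_000                # quoted maximum cluster size

per_gpu_gbps = total_throughput_pbps * 1e6 / gpus  # Pbit/s -> Gbit/s, per GPU
print(f"~{per_gpu_gbps:,.0f} Gbit/s of fabric bandwidth per GPU on average")
# ~500 Gbit/s per GPU under this naive division, at sub-10-microsecond latency.
```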

Can quantum computing change the long run?

Garcia's work focuses on how quantum computing and AI interact, and the findings are promising. Quantum computing offers potential resource savings and speed advantages. Quantum machine learning can be applied in three ways, Garcia said: quantum models on classical data, quantum models on quantum data, and classical models on quantum data.

“In each of those categories, there is a lot of theoretical evidence that it is beneficial to use quantum computers,” Garcia said. “For example, if you have limited training data, or very sparse data, or very connected data. One of the areas that we think is very promising is applications in healthcare and life sciences. Anything where there’s something quantum mechanical that you need to address.”
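As a concrete illustration of the first of those categories, a quantum model on classical data, here is a minimal sketch (my example, not IBM's) using Qiskit's circuit library; it only constructs the model circuit, and the classical training loop is left out.

```python
# Minimal sketch of a "quantum model on classical data" (illustrative only).
# Classical features are encoded by a feature map, and a parameterized ansatz
# plays the role of the trainable model. Assumes Qiskit is installed.
from qiskit.circuit.library import ZZFeatureMap, RealAmplitudes

num_features = 2
feature_map = ZZFeatureMap(feature_dimension=num_features, reps=1)  # encodes classical data
ansatz = RealAmplitudes(num_qubits=num_features, reps=2)            # trainable quantum layer
model_circuit = feature_map.compose(ansatz)

# Parameters split into data-encoding inputs and trainable weights; a classical
# optimizer would tune the ansatz weights against a loss, as in any ML loop.
print(model_circuit.num_parameters)
```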

IBM is actively exploring the vast potential of quantum machine learning. There are already numerous applications in life sciences, industrial settings, materials science, and more. IBM researchers are also developing Watson Code Assist, which helps users unfamiliar with quantum computing leverage the advantages of a quantum computer for their applications.

“We use AI to assist with that, to help people optimize circuits and define their problem in a way that makes sense for the quantum computer to solve,” she explained.

The solution, she added, will likely be a combination of bits, neurons and qubits.

“There will be CPUs, plus GPUs, plus QPUs,” she said. “We have to collaborate and differentiate between the different parts of the workflow. We have to push quantum technology to get to a point where we can run the circuits we're talking about, where we think we're going to get that kind of exponential or polynomial speedup. But the potential of the algorithms is really promising for us.”

But before quantum technology becomes the hero of the day, infrastructure requirements are a sticking point. This includes further reducing power consumption and improving component technology.

“There's still a lot of physics research that needs to be done to meet the infrastructure requirements for quantum,” she explained. “For me, that's the real challenge: to realize this vision where all three work together to solve problems in the most resource-efficient way possible.”

Choice and the hard ceiling

“More important than anything is radical transparency, to give decision makers a deep understanding of the sustainability, energy consumption, privacy and security aspects of all these technologies we're deploying, all the way up the supply chain, so we understand the true costs,” Bresniker said. “That gives us the ability to calculate the true return on those investments. Right now, all of our subject matter experts are talking to companies about adoption, but they're not necessarily listing what it takes to actually integrate these technologies successfully, sustainably and equitably.”

And part of it comes down to choice, Roberts said. The horse has left the barn, and more and more companies will use next-generation LLMs and AI. There is an opportunity to choose the performance characteristics that best fit the application, rather than consuming resources indiscriminately.

“From a sustainability and energy perspective, you have to think about what use case you want to achieve with this particular application and model, and then what silicon you're going to use to perform that inference,” he said.

You can also choose the host, and select specific applications and tools that abstract away the underlying use case.

“This is important because it gives you choice and a lot of control, and lets you pick the most cost-effective and optimal deployment for your application,” he said.

“If you include more data, more energy, more water, more people, it becomes a bigger model, but is it actually better for the business? That's the real question of business fitness,” Bresniker added. “If we keep going like this, we're going to hit a hard ceiling. If we start that conversation, have that understanding and begin to push back and say: I want more transparency. I want to know where this data is coming from. How much energy is in this model? Is there another alternative? Maybe a few small models are better than a monolithic monoculture. Even before we hit the ceiling, we're going to have to deal with the monoculture.”
