Will the cost of scaling infrastructure limit the potential of AI?

AI is driving innovation at a pace the world has never seen before. However, there's a caveat: the resources needed to store and compute data in the age of AI may exceed availability.

The challenge of applying AI at scale is one the industry has been grappling with in different ways for some time. As large language models (LLMs) have grown, so have both large-scale training and inference requirements. Added to this are concerns about the availability of GPU AI accelerators, as demand has exceeded expectations.

Now the race is on to scale AI workloads while keeping infrastructure costs under control. Both traditional infrastructure providers and a new wave of alternative infrastructure providers are actively working to increase performance in processing AI workloads while reducing costs, energy consumption and environmental impact to meet the rapidly growing needs of enterprises scaling AI workloads.

"We see a number of complexities that will come with scaling AI," Daniel Newman, CEO of Futurum Group, told VentureBeat. "Some of them with more immediate impact and others that will likely have significant impact down the road."

Newman is concerned about power availability, as well as the real long-term impact on business growth and productivity.

Is quantum computing an answer to scaling AI?

While one solution to the energy problem is to build more power generation capacity, there are numerous other options, including integrating other kinds of non-traditional computing platforms, such as quantum computers.

"Current AI systems are still being explored at a rapid pace and their progress might be limited by factors such as energy consumption, long processing times and high compute power requirements," Jamie Garcia, director of quantum algorithms and partnerships at IBM, told VentureBeat. "As quantum computing advances in scale, quality and speed, opening up new and classically inaccessible computational spaces, it could have the potential to help AI process certain types of data."

Garcia noted that IBM has a clear path to scaling quantum systems that can bring both scientific and business advantages to users. As they scale, quantum computers will increasingly be able to process incredibly complicated data sets, Garcia said.

"This gives them the natural potential to speed up AI applications that require generating complex correlations in data, such as uncovering patterns, which could reduce the training time of LLMs," Garcia said. "This could benefit applications across a variety of industries, including healthcare and life sciences, finance, logistics and materials science."

Scaling AI in the cloud is (for now) under control

Scaling AI, like any other type of technology, depends on infrastructure.

"You can't do anything else unless you get the infrastructure stack sorted out," Paul Roberts, director of strategic accounts at AWS, told VentureBeat.

Roberts noted that there was a huge explosion in AI in late 2022 when ChatGPT first went public. While in 2022 it might not have been clear where the technology was going, he said that in 2024, AWS has a good handle on the issue. AWS in particular has invested significantly in infrastructure, partnerships and development to enable and support AI at scale.

Roberts points out that the scaling of AI is, in some ways, a continuation of the technological advances that enabled the rise of cloud computing.

"I think we have the tools and the infrastructure today, and I don't see this as a hype cycle," Roberts said. "I think this is just an evolution on the path that perhaps began with mobile devices becoming really intelligent, but today we're building these models on the path to AGI, where we're going to augment human capabilities in the future."

Scaling AI isn't just about training, but also about inference

Kirk Bresniker, chief architect and HPE Fellow/VP at Hewlett Packard Labs, has a number of concerns about the current trajectory of AI scaling.

Bresniker sees the potential risk of a "hard cap" on AI progress if these concerns are not addressed. He noted that, given today's requirements to train a leading LLM and assuming current processes remain the same, he expects that by the end of the decade more resources will be needed to train a single model than the IT industry will likely be able to support.

"If we continue on our current course and speed, we're heading toward a very, very hard cap," Bresniker told VentureBeat. "That's scary because we have other computational goals as a species that we need to achieve besides just training a model once."
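To make that trajectory concrete, here is a minimal back-of-envelope sketch of how training requirements compound if frontier-model compute keeps growing at a fixed multiple per year. The base figure and growth rate are illustrative assumptions, not numbers from Bresniker or the article.

```python
# Hedged sketch: extrapolating training compute under an assumed growth rate.
# All numbers are assumptions for illustration only.

BASE_YEAR = 2024
BASE_TRAINING_FLOP = 1e25        # assumed order of magnitude for a leading LLM today
GROWTH_PER_YEAR = 4.0            # assumed year-over-year multiplier in training compute

def projected_training_flop(year: int) -> float:
    """Project the compute needed to train one frontier model in a given year."""
    return BASE_TRAINING_FLOP * GROWTH_PER_YEAR ** (year - BASE_YEAR)

for year in range(BASE_YEAR, 2031):
    print(f"{year}: ~{projected_training_flop(year):.1e} FLOP for a single training run")
```

Under these assumed numbers, a single training run at the end of the decade would need roughly three orders of magnitude more compute than today, which is the kind of compounding that motivates Bresniker's "hard cap" warning.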

The resources required to train ever larger LLMs are not the only problem. Bresniker noted that once an LLM is built, inference against it runs continuously, and when it runs 24 hours a day, seven days a week, the energy consumption is enormous.

"That inference will kill the polar bears," Bresniker said.
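For a rough sense of the scale Bresniker is pointing at, the hedged sketch below estimates annual inference energy for a continuously serving model. Every figure in it (query volume, energy per query, the household comparison) is an assumption for illustration, not data from HPE or the article.

```python
# Illustrative back-of-envelope only: the figures below are assumptions,
# not measurements cited by Bresniker or VentureBeat.

QUERIES_PER_SECOND = 1_000     # assumed sustained load on a deployed LLM
ENERGY_PER_QUERY_WH = 3.0      # assumed watt-hours per generated response
HOURS_PER_YEAR = 24 * 365

annual_energy_kwh = (
    QUERIES_PER_SECOND * 3_600 * HOURS_PER_YEAR * ENERGY_PER_QUERY_WH / 1_000
)

# Rough comparison point: an average US household uses ~10,500 kWh per year.
households_equivalent = annual_energy_kwh / 10_500

print(f"Annual inference energy: {annual_energy_kwh:,.0f} kWh")
print(f"Equivalent to ~{households_equivalent:,.0f} US households")
```

Even with these modest assumed numbers, a single always-on deployment lands in the tens of millions of kilowatt-hours per year, which is why inference, not just training, dominates the long-run energy picture.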

How deductive reasoning can help scale AI

According to Bresniker, one possible way to improve AI scaling is to incorporate deductive reasoning alongside the current focus on inductive reasoning.

Bresniker argues that deductive reasoning could potentially be more energy efficient than current inductive approaches, which require gathering huge amounts of data and then analyzing it to find patterns. Deductive reasoning, in contrast, uses a logic-based approach to draw conclusions. Bresniker noted that deductive reasoning is another human skill that is not yet really present in AI. He does not believe deductive reasoning should replace inductive reasoning entirely, but rather that it should be used as a complementary approach.

"Adding this second capability means we're approaching a problem the right way," Bresniker said. "It's as simple as having the right tool for the right job."
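To illustrate the distinction in the simplest possible terms, the sketch below contrasts inducing a decision rule from labeled examples with deducing decisions from a rule that is already stated. The credit-approval scenario, data points and thresholds are invented for illustration and are not from Bresniker.

```python
# Minimal illustration of the inductive/deductive contrast.
# Inductive: learn a decision rule from observed examples.
# Deductive: apply a rule that is already known, no data gathering required.

# Inductive: infer an approval threshold from (income, approved) observations.
observations = [(20_000, False), (35_000, False), (60_000, True), (90_000, True)]

def induce_threshold(data):
    """Pick the midpoint between the highest rejected and lowest approved income."""
    rejected = max(income for income, ok in data if not ok)
    approved = min(income for income, ok in data if ok)
    return (rejected + approved) / 2

learned_threshold = induce_threshold(observations)   # pattern extracted from data

# Deductive: start from a stated rule and apply it directly.
STATED_RULE_THRESHOLD = 50_000

def deduce_approval(income: float) -> bool:
    """Apply the known rule: approve if income clears the stated threshold."""
    return income >= STATED_RULE_THRESHOLD

print(f"Induced threshold from data: {learned_threshold:,.0f}")
print(f"Deduced decision for 55,000: {deduce_approval(55_000)}")
```

The inductive path needs enough data to recover the pattern; the deductive path spends no compute on pattern discovery because the rule is supplied up front, which is the efficiency argument Bresniker is making.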

Learn more about the challenges and opportunities of scaling AI at VentureBeat Transform next week. Speakers addressing this topic at VB Transform include Kirk Bresniker, Hewlett Packard Labs Chief Architect, HPE Fellow/VP; Jamie Garcia, Director of Quantum Algorithms and Partnerships, IBM; and Paul Roberts, Director of Strategic Accounts, AWS.
