Google Cloud is moving aggressively to consolidate its position in the increasingly competitive artificial intelligence landscape, announcing a comprehensive collection of new technologies targeting "thinking models," agent ecosystems, and specialized infrastructure built for large-scale AI deployments.
At its annual Cloud Next conference in Las Vegas today, Google unveiled Ironwood, its seventh-generation Tensor Processing Unit (TPU). The company claims it delivers more than 24 times the computing power of the world's fastest supercomputer, El Capitan.
"The opportunity with AI is as big as it gets," said Amin Vahdat, Google's vice president and general manager of ML Systems and Cloud AI, during a press conference ahead of the event. "Together with our customers, we're powering a new golden age of innovation."
The conference comes at a pivotal moment for Google, which has built significant momentum in its cloud business. In January, the company reported that fourth-quarter 2024 cloud revenue reached $12 billion, a 30% increase year over year. Google executives say active users in AI Studio and the Gemini API have grown by 80% in the past month.
How Google's new Ironwood TPUs transform AI computing with power efficiency
Google is positioning itself as the only major cloud provider with a "fully AI-optimized platform" built from the ground up for what it calls the "age of inference," in which the focus shifts from training models to deploying AI systems that solve real-world problems.
The star of Google's infrastructure announcements is Ironwood, which represents a fundamental shift in chip design philosophy. Unlike previous generations, which balanced training and inference workloads, Ironwood was built specifically to run complex AI models after they have been trained.
"It's no longer about the data put into the model, but what the model can do with data after it has been trained," said Vahdat.
Each Ironwood pod contains more than 9,000 chips and delivers twice the performance per watt of the previous generation. This focus on efficiency addresses one of the most pressing concerns about generative AI: its enormous energy consumption.
Alongside the new chips, Google is opening its massive global network infrastructure to enterprise customers through Cloud WAN (Wide Area Network). The service makes the two-million-mile fiber network that underpins consumer services such as YouTube and Gmail available to businesses.
According to Google, Cloud WAN improves network performance by up to 40% while reducing total cost of ownership by the same percentage compared with customer-managed networks. It is an unusual step for a hyperscaler: effectively turning its internal infrastructure into a product.
Inside Gemini 2.5: How Google's "thinking models" improve enterprise AI applications
On the software side, Google is expanding its Gemini model family with Gemini 2.5 Flash, a cost-efficient version of its flagship AI system that includes what the company describes as "thinking capabilities."
Unlike standard large language models that generate responses directly, these "thinking models" work through complex problems using multi-step reasoning and even self-reflection. Gemini 2.5 Pro, launched two weeks ago, is positioned for high-complexity use cases such as drug discovery and financial modeling, while the newly announced Flash variant adjusts its reasoning depth based on prompt complexity to balance performance and cost.
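The announcement itself does not spell out the developer-facing controls, but as a rough illustration of how that reasoning-versus-cost trade-off surfaces in practice, the sketch below uses Google's google-genai Python SDK, which exposes a token budget for the model's internal "thinking." The model name, prompt, and budget value are assumptions for illustration rather than details drawn from the article.

```python
# pip install google-genai  (a minimal sketch, not drawn from the announcement)
from google import genai
from google.genai import types

client = genai.Client()  # reads the API key from the environment

response = client.models.generate_content(
    model="gemini-2.5-flash",  # assumed model identifier
    contents=(
        "A supplier offers 3% off orders over 500 units at a $12 list price. "
        "Is 480 units at list price or 510 units with the discount cheaper per unit?"
    ),
    config=types.GenerateContentConfig(
        # Cap the model's internal reasoning: 0 disables thinking, larger
        # budgets allow deeper multi-step reasoning at higher latency and cost.
        thinking_config=types.ThinkingConfig(thinking_budget=512),
    ),
)

print(response.text)
```

Raising or lowering the budget is the lever that trades answer quality on hard prompts against speed and spend on simple ones, which is the balance Google says the Flash variant is designed to manage.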
Google is also significantly expanding its generative media capabilities, with updates to Imagen (image generation), Veo (video), and Chirp (audio), and the introduction of Lyria, a text-to-music model. In a demonstration during the press conference, Nenshad Bardoliwalla, Director of Product Management at Vertex AI, showed how these tools could work together to create a promotional video for a concert, complete with custom music and sophisticated editing capabilities such as removing unwanted elements from video clips.
"Only Vertex AI brings all of these models together with third-party models on a single platform," said Bardoliwalla.
Beyond single AI systems: How Google's multi-agent ecosystem aims to improve enterprise workflows
Perhaps the most forward-looking announcements focused on creating what Google calls a "multi-agent ecosystem": an environment in which multiple AI systems can work together across different platforms and providers.
Google is introducing an Agent Development Kit (ADK) that lets developers build multi-agent systems in fewer than 100 lines of code. The company is also proposing a new open protocol called Agent2Agent (A2A) that allows AI agents from different providers to communicate with one another.
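Google's own samples will be the definitive reference, but to give a sense of how compactly such a system can be expressed, here is a minimal sketch using the open-source google-adk Python package; the agent names, instructions, and model string are invented for illustration.

```python
# pip install google-adk  (Google's open-source Agent Development Kit for Python)
from google.adk.agents import Agent

# Specialist agent for billing questions (name and instruction are illustrative).
billing_agent = Agent(
    name="billing_agent",
    model="gemini-2.0-flash",
    description="Handles questions about invoices, charges and refunds.",
    instruction="Answer billing questions concisely and reference the invoice ID when given.",
)

# Specialist agent for technical troubleshooting.
support_agent = Agent(
    name="support_agent",
    model="gemini-2.0-flash",
    description="Diagnoses and resolves technical product issues.",
    instruction="Walk the user through troubleshooting steps one at a time.",
)

# Coordinator that delegates each incoming request to the appropriate specialist.
root_agent = Agent(
    name="coordinator",
    model="gemini-2.0-flash",
    instruction="Route billing questions to billing_agent and technical issues to support_agent.",
    sub_agents=[billing_agent, support_agent],
)
```

Packaged as an ADK project, a hierarchy like this can be exercised locally with the kit's developer tooling, and in principle the same agents could then be exposed to agents from other vendors over the A2A protocol.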
"2025 will be a transition year in which generative AI shifts from answering single questions to solving complex problems through agentic systems," said Vahdat.
More than 50 partners, including major enterprise software providers such as Salesforce, ServiceNow, and SAP, have signed on to support the protocol, suggesting a potential industry shift toward interoperable AI systems.
For non-technical users, Google is enhancing its Agentspace platform with features such as Agent Gallery (a single view of available agents) and Agent Designer (a no-code interface for building custom agents). In a demonstration, Google showed how a banking account manager could use these tools to analyze client portfolios, forecast cash-flow problems, and automatically initiate communications with clients, all without writing code.
From document summaries to drive-thru orders: How Google's specialized AI agents are affecting industries
Google is also integrating AI across its Workspace productivity suite, with new features such as "Help me Analyze" in Sheets, which automatically surfaces insights from data without explicit formulas or pivot tables, and Audio Overviews in Docs, which create human-sounding audio versions of documents.
The company highlighted five categories of specialized agents seeing significant adoption: customer service, creative work, data analysis, coding, and security.
In customer service, Google pointed to Wendy's AI drive-through system, which now handles 60,000 orders daily, and The Home Depot's "Magic Apron" agent, which offers home-improvement guidance to do-it-yourselfers. For creative teams, companies like WPP are using Google's AI to design and produce marketing campaigns at scale.
The cloud AI competition intensifies: How Google's comprehensive approach challenges Microsoft and Amazon
Google's announcements come amid intensifying competition in cloud AI. Microsoft has deeply integrated OpenAI's technology into its Azure platform, while Amazon has built out its own Anthropic-powered offerings and specialized chips.
Thomas Kurian, CEO of Google Cloud, emphasized the company's commitment to delivering world-class infrastructure, models, platforms, and agents on an open, multi-cloud platform that offers flexibility and choice.
This full-stack approach appears designed to distinguish Google from competitors that may have strengths in particular areas but not across the entire stack, from chips to applications.
The future of enterprise AI: Why Google's "thinking models" and interoperability matter for business technology
What makes Google's announcements particularly significant is the comprehensive nature of its AI strategy, spanning custom silicon, global networking, model development, agent frameworks, and application integration.
The focus on inference optimization rather than just training capabilities reflects a maturing AI market. While training ever-larger models has dominated the headlines, deploying those models efficiently at scale is becoming the more pressing challenge for enterprises.
Google's emphasis on interoperability, enabling systems from different providers to work together, may signal a departure from the walled-garden approaches that characterized earlier phases of cloud computing. By proposing open protocols such as Agent2Agent, Google positions itself as connective tissue in a heterogeneous AI ecosystem rather than demanding all-or-nothing adoption.
For enterprise technical decision-makers, these announcements present both opportunities and challenges. The efficiency gains promised by specialized infrastructure such as Ironwood TPUs and Cloud WAN could significantly reduce the cost of running AI at scale, but navigating the rapidly evolving landscape of models, agents, and tools will require careful strategic planning.
As these more sophisticated AI systems evolve, the ability to orchestrate multiple specialized agents working in concert may become a key differentiator for enterprise AI implementations. By building both the components and the connections between them, Google is betting that the future of AI is not just about smarter machines, but about machines that can talk to each other effectively.