Chatter about artificial general intelligence (AGI) may dominate the headlines coming out of Silicon Valley companies like OpenAI, Meta and xAI, but for enterprise leaders, the focus is on practical applications and measurable results. At VentureBeat's recent Transform 2025 event in San Francisco, a clear picture emerged: the era of real, deployed agentic AI is here, it is accelerating, and it is already reshaping how companies work.
Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, tackling concrete problems and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.
1. AI agents are moving into production faster than anyone realized
Companies are now deploying AI agents in customer-facing applications, and the trend is accelerating at a remarkable pace. A recent VentureBeat survey of roughly 2,000 industry professionals, conducted shortly before VB Transform, found that 68% of enterprise companies (those with more than 1,000 employees) had already adopted agentic AI — a number that seemed high at the time. (In fact, I worried it was too high to be credible. When I presented the survey results on stage at the event, I cautioned that the high adoption figure might be a reflection of VentureBeat's particular readership.)
But new data validates this rapid shift. A KPMG survey published on June 26, one day after our event, shows that 33% of organizations are now deploying AI agents — a surprising threefold increase from just 11% in the previous two quarters. This market shift confirms the trend VentureBeat identified only weeks earlier in its pre-Transform survey.
This acceleration is driven by tangible results. Ashan Willy, CEO of New Relic, noted a remarkable 30% quarter-over-quarter growth in monitoring of AI applications by its customers, largely a consequence of those customers' move to adopt agents. Companies are using AI agents to help customers automate workflows where they need assistance. Intuit, for example, has deployed invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature get paid five days faster, and their payment rates are 10% higher.
Even non-developers are feeling the shift. Scott White, product manager for Anthropic's Claude AI product, described how he, despite not being a professional programmer, now builds production-ready software features himself. "That wasn't possible six months ago," he said, highlighting the power of tools like Claude Code. Similarly, OpenAI's product lead for its API platform, Olivier Godement, detailed how customers like Stripe and Box use its Agents SDK to build multi-agent systems.
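To make that concrete, here is a minimal sketch of what a multi-agent handoff built on the OpenAI Agents SDK (the Python openai-agents package) can look like. The agent names, instructions and routing here are purely illustrative assumptions, not a description of how Stripe or Box actually use the SDK.

```python
# Minimal multi-agent handoff sketch using the OpenAI Agents SDK
# (pip install openai-agents). Agent names and instructions are illustrative.
from agents import Agent, Runner

billing_agent = Agent(
    name="Billing agent",
    instructions="Answer questions about invoices and payments.",
)

support_agent = Agent(
    name="Support agent",
    instructions="Help with general product questions.",
)

# A triage agent decides which specialist agent should handle each request
# and hands the conversation off to it.
triage_agent = Agent(
    name="Triage agent",
    instructions="Route the user to the billing or support agent.",
    handoffs=[billing_agent, support_agent],
)

result = Runner.run_sync(triage_agent, "Why was my invoice charged twice?")
print(result.final_output)
```

The appeal of this pattern for enterprises is that each specialist agent stays small and auditable, while the triage layer handles orchestration.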
2. The hyperscaler race has no clear winner as multi-cloud, multi-model reigns
The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move toward a multi-model, multi-cloud strategy. Enterprises want the flexibility to choose the best tool for the job, whether that is a powerful proprietary model or a fine-tuned open-source alternative.
As Armand Ruiz, VP of AI platform at IBM, explained, the company's development of a model gateway — which routes applications to the most appropriate LLM for each use case — was a direct response to customer demand. IBM started by offering enterprise customers its own models, then added open-source support, and ultimately realized it needed to support all models. This desire for flexibility was echoed by XD Huang, CTO of Zoom, who described his company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model, and enabling customers to create their own fine-tuned versions.
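As a rough illustration of the gateway pattern Ruiz described — not IBM's actual implementation — the core idea is a thin routing layer that picks a model and provider per request based on the task. The model names and routing rules below are hypothetical.

```python
# Simplified sketch of a model-gateway routing layer. Model names,
# providers and task categories are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class Route:
    model: str
    provider: str


ROUTES = {
    "code_generation": Route(model="proprietary-code-model", provider="vendor-a"),
    "summarization": Route(model="open-source-8b", provider="self-hosted"),
    "claims_review": Route(model="fine-tuned-domain-model", provider="private-cloud"),
}

DEFAULT_ROUTE = Route(model="general-purpose-model", provider="vendor-b")


def route_request(task_type: str) -> Route:
    """Return the model/provider for a task, falling back to a general model."""
    return ROUTES.get(task_type, DEFAULT_ROUTE)


if __name__ == "__main__":
    route = route_request("summarization")
    print(f"Routing to {route.model} on {route.provider}")
```

The point of the pattern is that applications call the gateway, not a specific vendor, so swapping or adding models is a configuration change rather than a rewrite.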
This trend is creating a powerful but constrained ecosystem, in which GPUs and the power needed to generate tokens remain in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, the profitability of many companies is being squeezed because they simply buy more tokens as they become available instead of locking in gains, even as the cost of those tokens continues to drop. Enterprises are getting smarter about using different models for different tasks to optimize both cost and performance — which can often mean turning to companies like Solidigm, which is developing memory and storage solutions tailored to AI.
3. Enterprises are focused on solving real problems, not chasing AGI
While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman talk about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business challenges. The conversations at Transform were refreshingly grounded in reality.
Take Highmark Health, the country's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications such as multilingual communication to better serve its diverse customer base and to streamline medical claims. In other words, using the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the functions of the company, with specific agents for tasks like risk evaluation and auditing, including agents that support its auto-dealer clients by connecting customers with the right loans.
The travel industry is embracing the same pragmatic shift. The CTOs of Expedia and Kayak discussed how they are adapting to new search paradigms enabled by LLMs. Users can now ask a chatbot to find a hotel with an "infinity pool," and travel platforms must support this kind of natural-language discovery to remain competitive. The focus is on the customer, not on the technology for its own sake.
4. The future of AI teams is small, nimble and empowered
The age of AI agents is also changing how teams are structured. The consensus is that small, agile teams of three to four engineers work best. Varun Mohan, CEO of Windsurf, the fast-growing agentic IDE, kicked off the event by arguing that this small-team structure allows for rapid testing of product hypotheses and avoids the slowdown that plagues larger groups.
This shift means that "everyone is a builder" and, increasingly, "everyone is a manager" of AI agents. As leaders from GitHub and Atlassian noted, engineers are now learning to manage fleets of agents. The necessary skills are evolving, with an emphasis on clear communication and strategic thinking to guide these autonomous systems.
This agility is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave security, governance and observability to the end of the development cycle. That may seem counterintuitive for large enterprises, but the idea is to encourage rapid innovation in a controlled environment so that value can be proven quickly. The sentiment was echoed in our survey, which found that 10% of organizations adopting AI have no dedicated AI safety team, suggesting a willingness to prioritize speed at this early stage.
Taken together, these takeaways paint a clear picture of an enterprise AI landscape that is maturing quickly, moving from broad experimentation to focused, value-driven execution. The conversations at Transform 2025 showed that companies are deploying AI agents today, even if they have had to learn hard lessons along the way. Many have already gone through one or two major pivots since first trying generative AI a year ago — so it pays to start early.
For a deeper dive into these themes and other analysis from the event, you can listen to my full discussion with independent AI developer Sam Witteveen in our latest podcast below. We have just uploaded the main-stage talks from VB Transform here, and our full article coverage of the event is here.
Listen to the VB Transform takeaways podcast with Matt Marshall and Sam Witteveen:

