Nvidia is acquiring Run:ai, a Tel Aviv-based company that makes it easier for developers and operations teams to manage and optimize their AI hardware infrastructure, for an undisclosed amount.
Earlier this morning, it was reported that the companies were in "advanced negotiations" that could see Nvidia pay upwards of $1 billion for Run:ai. Evidently, those negotiations went smoothly.
Nvidia says it will continue to offer Run:ai's products "under the same business model" for the immediate future, and will invest in Run:ai's product roadmap as part of Nvidia's DGX Cloud AI platform.
"Run:ai has been working closely with Nvidia since 2020 and we share a passion for helping our customers get the most out of their infrastructure," Run:ai CEO Omri Geller said in a statement. "We are thrilled to join Nvidia and look forward to continuing our journey together."
Geller co-founded Run:ai with Ronen Dar several years ago after the two studied together at Tel Aviv University under Professor Meir Feder, Run:ai's third co-founder. Geller, Dar and Feder wanted to build a platform that could "decompose" AI models into fragments that run in parallel across hardware, whether on-premises, in the cloud or at the edge.
While Run:ai has relatively few direct competitors, other startups are applying the concept of dynamic hardware allocation to AI workloads. For example, Grid.ai offers software that enables data scientists to train AI models in parallel across GPUs, CPUs and more.
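For readers unfamiliar with the idea, here is a minimal sketch of what "training in parallel across GPUs" can look like in practice, using PyTorch's built-in DataParallel wrapper. This is a generic illustration of the concept, not Run:ai's or Grid.ai's actual software; the model, batch and hyperparameters are made up for the example.

```python
# Illustrative only: a generic PyTorch data-parallel training step.
# Not Run:ai's or Grid.ai's API; the model and data here are invented.
import torch
import torch.nn as nn

# A toy model standing in for any neural network.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Replicate the model across all visible GPUs; each GPU processes a slice
# of every batch and gradients are combined on the primary device.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a random batch, split automatically across GPUs.
inputs = torch.randn(256, 128, device=device)
targets = torch.randint(0, 10, (256,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```

Products in this space go further, scheduling and slicing GPU capacity across many such jobs and teams rather than within a single training script.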
But relatively early on, Run:ai managed to build a large customer base of Fortune 500 companies, which in turn attracted VC investment. Prior to the acquisition, Run:ai had raised $118 million in capital from backers including Insight Partners, Tiger Global, S Capital and TLV Partners.