
The CEOs of Dropbox and Figma are backing Lamini, a startup building a generative AI platform for businesses

Lamini, a Palo Alto-based startup building a platform to help companies adopt generative AI technology, has raised $25 million from investors including Stanford computer science professor Andrew Ng.

Lamini, co-founded a few years ago by Sharon Zhou and Greg Diamos, has an interesting selling point.

Many generative AI platforms are far too general-purpose, Zhou and Diamos argue, and don’t offer solutions and infrastructure tailored to the needs of corporations. In contrast, Lamini was built from the ground up for enterprises and is focused on delivering generative AI with high accuracy and scalability.

“The top priority of just about every CEO, CIO and CTO is to leverage generative AI in their business with maximum ROI,” Zhou, CEO of Lamini, told TechCrunch. “But while it’s easy for a single developer to get a working demo on a laptop, the path to production is riddled with failures left and right.”

Zhou was summing up the frustration many companies have expressed about the hurdles to meaningfully adopting generative AI in their business functions.

According to a March poll from MIT Insights, only 9% of companies have adopted generative AI at scale, although 75% have experimented with it. The biggest hurdles range from a lack of IT infrastructure and capabilities to poor governance structures, insufficient skills and high implementation costs. Security is also a crucial factor; in a recent poll from Insight Enterprises, 38% of companies said security concerns affect their ability to use generative AI technology.

So what’s Lamini’s answer?

Zhou says that “every piece” of Lamini’s technology stack is optimized for enterprise-scale generative AI workloads, from hardware to software, including the engines used to support model orchestration, fine-tuning, running and training. “Optimized” is admittedly a vague word, but one technique Lamini is pioneering is what Zhou calls “memory tuning”: a way of training a model on data such that it recalls parts of that data exactly.

Memory tuning can potentially reduce hallucinations, Zhou claims, or instances in which a model makes up facts in response to a query.

“Memory tuning is a training paradigm, as efficient as fine-tuning but going beyond it, that trains a model on proprietary data containing key facts, figures and numbers so that the model has high precision,” Nina Wei, an AI designer at Lamini, told me via email, “and can memorize and recall the exact match of any key information rather than generalizing or hallucinating.”
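Lamini hasn’t published how memory tuning actually works, but the description above, keep training on proprietary facts until the model retrieves them verbatim rather than approximately, can be caricatured with a toy sketch. The sketch below is entirely our own illustration (the facts, questions and the tiny softmax model are made up for the example, not Lamini’s method): train until every stored fact is recalled exactly, instead of stopping at a conventional loss target.

```python
import numpy as np

# Made-up "proprietary facts" the toy model must recall exactly.
qa = {
    "what was revenue in 2023": "$4.2M",
    "where is headquarters": "Palo Alto",
    "who is the ceo": "Jane Doe",
}
questions = list(qa)
answers = list(qa.values())

# Bag-of-words features for each question.
vocab = sorted({w for q in questions for w in q.split()})

def embed(q):
    v = np.zeros(len(vocab))
    for w in q.split():
        if w in vocab:
            v[vocab.index(w)] += 1.0
    return v

X = np.stack([embed(q) for q in questions])
Y = np.eye(len(answers))  # each question maps to exactly one stored answer

# Softmax classifier over the stored answers, trained with the
# "memorize, don't generalize" stopping rule: keep going until every
# fact is recalled exactly and with high confidence.
W = np.zeros((len(vocab), len(answers)))
for step in range(2000):
    Z = X @ W
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    W -= 0.5 * X.T @ (P - Y)  # gradient step on cross-entropy
    exact = (P.argmax(axis=1) == Y.argmax(axis=1)).all()
    if exact and P.max(axis=1).min() > 0.99:
        break

def recall(question):
    scores = embed(question) @ W
    return answers[int(np.argmax(scores))]
```

After training, `recall("where is headquarters")` returns the stored string `"Palo Alto"` verbatim; the point of the caricature is the stopping criterion, which demands exact recall of every fact rather than a low average loss.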

I’m not sure I buy that. “Memory tuning” appears to be more of a marketing term than an academic one; there’s no research on it, at least none that I’ve been able to find. I’ll leave it to Lamini to provide evidence that its “memory tuning” works better than the other hallucination-reducing techniques that are being, or have been, tried.

Luckily for Lamini, memory tuning isn’t its only differentiator.

According to Zhou, the platform can operate in highly secure environments, including air-gapped ones. Lamini lets organizations run, fine-tune and train models across a range of configurations, from on-premises data centers to public and private clouds. And it scales workloads “elastically,” spinning up over 1,000 GPUs when the application or use case demands it, Zhou says.

“Incentives are currently misaligned in the market with closed-source models,” Zhou said. “We aim to put control back into the hands of more people, not just a few, starting with the companies that care most about control and have the most to lose from their proprietary data being owned by someone else.”

For what it’s worth, Lamini’s co-founders are well versed in the AI space. They’ve also separately crossed paths with Ng, which no doubt explains his investment.

Previously, Zhou was a lecturer at Stanford University, where she led a group focused on generative AI. Before earning her PhD in computer science under Ng, she was a machine learning product manager at Google Cloud.

For his part, Diamos co-founded MLCommons, the engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as the MLCommons benchmarking suite MLPerf. He also led AI research at Baidu, where he worked with Ng during the latter’s time as chief scientist there. Diamos was also a software architect on Nvidia’s CUDA team.

The co-founders' industry contacts appear to have given Lamini a head start in fundraising. In addition to Ng, Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy and, oddly enough, Bernard Arnault, the CEO of luxury goods giant LVMH, have all invested in Lamini.

AMD Ventures is also an investor (somewhat ironic given Diamos’ Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early, supplying Lamini with data center hardware, and today Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.

Lamini claims that its models’ training and running performance keeps pace with that of comparable Nvidia GPUs, depending on the workload. Since we’re not equipped to verify that claim, we’ll leave it to third parties.

To date, Lamini has raised $25 million across seed and Series A rounds (Amplify led the Series A). Zhou says the money will go toward tripling the company’s 10-person team, expanding its compute infrastructure and kicking off development of “deeper technical optimizations.”

There are a number of enterprise-focused generative AI vendors that could compete with aspects of Lamini’s platform, including tech giants such as Google, AWS and Microsoft (via its OpenAI partnership). Google, AWS and OpenAI in particular have been aggressively courting the enterprise in recent months, introducing features like streamlined fine-tuning, fine-tuning on private data and more.

I asked Zhou about Lamini’s customers, revenue and overall go-to-market momentum. She wasn’t willing to reveal much at this somewhat early stage, but did say that AMD (via the AMD Ventures tie-in), AngelList and NordicTrack are among Lamini’s early (paying) users, along with several unnamed government agencies.

“We’re growing quickly,” she added. “The number one challenge is serving customers. We’ve only handled inbound demand because we’ve been inundated. Given the interest in generative AI, we’re not representative of the broader tech slowdown; unlike our competitors in the hyped AI world, we have gross margins and revenue that look more like a regular tech company.”

Mike Dauber, a general partner at Amplify, said: “We believe there’s a massive opportunity for generative AI in the enterprise. While there are a number of AI infrastructure companies, Lamini is the first one I’ve seen that takes the problems of the enterprise seriously and creates a solution that helps enterprises unlock the tremendous value of their private data while satisfying even the most stringent compliance and security requirements.”
