The funding climate for AI chip startups, once as sunny as a day in mid-July, is beginning to darken as Nvidia asserts its dominance.
According to a recent report, US chip firms raised just $881 million from January 2023 to September 2023 – down from $1.79 billion in the first three quarters of 2022. AI chip company Mythic ran out of cash in 2022 and was nearly forced to halt operations, while Graphcore, a once well-capitalized rival, now faces mounting losses.
But one startup appears to have found success in the highly competitive — and increasingly crowded — AI chip space.
Hailo, co-founded in 2017 by Orr Danon and Avi Baum, previously CTO for wireless connectivity at chipmaker Texas Instruments, develops specialized chips to run AI workloads on edge devices. Hailo's chips perform AI tasks with lower memory and power consumption than a typical processor, making them a strong candidate for compact, offline and battery-powered devices such as cars, smart cameras and robots.
“I co-founded Hailo with the mission of making powerful AI available at scale outside the realm of data centers,” Danon told TechCrunch. “Our processors are used for tasks such as object detection, semantic segmentation and so on, as well as AI-powered image and video enhancement. More recently, they’re being used to run large language models (LLMs) on edge devices such as personal computers, infotainment electronic control units and more.”
Many AI chip startups have yet to receive a single major order, let alone dozens or hundreds. But Hailo now has over 300 customers in industries such as automotive, security, retail, industrial automation, medical devices and defense, according to Danon.
In a bet on Hailo's future prospects, a cohort of backers including Israeli businessman Alfred Akirov, automobile importer Delek Motors and VC platform OurCrowd invested $120 million in Hailo this week, an extension of the company's Series C. Danon said the new capital will “enable Hailo to capitalize on all opportunities in the pipeline” while “creating the conditions for long-term growth.”
“We are strategically positioned to bring AI to edge devices in a way that will significantly expand the reach and impact of this remarkable new technology,” Danon said.
Now, you might be wondering whether a startup like Hailo stands a chance against chip giants like Nvidia and, to a lesser extent, Intel and AMD. One expert, Christos Kozyrakis, a Stanford professor of electrical engineering and computer science, thinks so – he believes accelerator chips like Hailo's will become “absolutely vital” as AI spreads.
“The energy efficiency gap between CPUs and accelerators is just too big to ignore,” Kozyrakis told TechCrunch. “They use the accelerators for efficiency in key tasks (e.g. AI) and have one or two processors on the side for programmability.”
Kozyrakis sees longevity as a challenge for Hailo's leadership – for instance, if the AI model architectures its chips are designed to run efficiently go out of fashion. Software support is also an issue, says Kozyrakis, if a critical mass of developers isn’t willing to learn how to use the tools built around Hailo's chips.
“Most of the challenges with custom chips are in the software ecosystem,” Kozyrakis said. “For example, Nvidia has a vast advantage over other companies in the realm of AI because they’ve been investing in software for their architectures for over 15 years.”
But with $340 million in the bank and a workforce of around 250 employees, Danon is confident about Hailo's path forward – at least in the short term. He believes the startup's technology addresses many of the challenges companies face with cloud-based AI inference, particularly around latency, cost and scalability.
“Traditional AI models depend on cloud-based infrastructure and often suffer from latency issues and other challenges,” Danon said. “They lack the ability to deliver real-time insights and alerts, and their reliance on networks compromises reliability, while integration with the cloud raises privacy concerns. Hailo addresses these challenges by offering solutions that operate independently of the cloud and are therefore able to handle significantly higher volumes of AI processing.”
Curious about Danon's perspective, I asked him about generative AI and its heavy reliance on the cloud and remote data centers. Surely Hailo sees the current top-down, cloud-centric model (e.g. OpenAI's approach) as an existential threat?
Danon said that, on the contrary, generative AI is driving new demand for Hailo's hardware.
“In recent years, we have seen a surge in demand for edge AI applications across most industries, from airport security to food packaging,” he said. “The recent surge in generative AI is further driving this demand, as we see requests to process LLMs locally not only from customers in the computing and automotive industries, but also from industrial automation, security and other sectors.”
How about that?