Elon Musk's startup xAI has unveiled its latest creation: a supercomputer called Colossus.
This massive AI training system currently comprises 100,000 Nvidia Hopper H100 GPUs.
Colossus, based in Memphis, Tennessee, is set to expand with 50,000 of Nvidia's newer, more advanced H200-series chips, which are roughly twice as powerful as the H100.
That would almost certainly make Colossus the most powerful AI computer on the planet, if it isn't already.
For context, Meta has announced plans to amass 350,000 H100 GPUs by the end of 2024, while OpenAI is thought to have used only about 25,000 GPUs to train GPT-4, though the true figure could be much higher.
Musk himself announced the launch of Colossus on X over Labor Day weekend, stating, “From start to finish, it took 122 days. Colossus is the most powerful AI training system in the world. Moreover, it will double in size to 200,000 GPUs (50,000 H200s) in a few months.”
This weekend, the @xAI Team brought our Colossus 100k H100 training cluster online. From start to finish, it took 122 days.
Colossus is the most powerful AI training system in the world. Moreover, it will double in size to 200k (50k H200s) in a few months.
Excellent…
– Elon Musk (@elonmusk) 2 September 2024
Driving the future of Grok
The immediate purpose of Colossus is to train xAI's large language model (LLM), known as Grok.
Currently, Grok is only available to paying subscribers of Musk's social media platform X. The company has already released an early beta version of Grok-2, which was trained on around 15,000 Nvidia H100s.
Despite this relatively modest training cluster, Grok-2 already ranks among the most powerful large language models, according to competitive chatbot leaderboards.
It is also open source, bucking the trend toward closed-source model releases from competitors such as OpenAI, Anthropic, and Google.
Musk's ambitions for Grok are extravagant. He is already looking forward to Grok-3, scheduled for release in December 2024.
In an interview with Jordan Peterson in July, Musk boldly claimed, “Grok-3 should be the most powerful AI in the world at that point.”
The massive increase in GPU count for training Grok-3 suggests that Musk is serious about this claim.
What about xAI’s other goals?
Founded in July 2023, xAI ultimately wants to find out “What the hell is actually happening here?” In Musk's words, this means asking fundamental questions about reality, dark matter, the Fermi Paradox, and other cosmic mysteries.
The founding team of xAI was chosen to pursue this vision. Igor Babushkin, who previously worked on the Large Hadron Collider at CERN, said the company wants to “really advance our understanding of the universe.”
Jimmy Ba, an AI researcher at the University of Toronto, described the goal as building a “general problem-solving machine” to tackle humanity's biggest problems.
To advance its research and development, xAI secured $6 billion in a Series B funding round in May 2024, backed by venture capital firms such as Andreessen Horowitz and Sequoia Capital, as well as deep-pocketed investors such as Fidelity and Saudi Prince Alwaleed bin Talal's Kingdom Holding.
Many Tesla experts believe that Grok could eventually power the AI behind Tesla's humanoid robot Optimus. Musk has claimed that Optimus could bring Tesla a trillion dollars in annual profits, and he has also hinted that Tesla may invest $5 billion in xAI, a move some shareholders have welcomed.
Now that Colossus is online, could xAI go beyond building more LLMs and try something bold and new?
Sure, Grok-3 will probably be very powerful. But it may be only slightly more powerful than its competitors, and only for a short while, until something else overtakes it.
This is where you have to question the fundamental long-term goals. AI companies compete fiercely with similar models trained on nearly identical hardware stacks. That is expensive, and innovation risks stagnating into small, incremental gains.
This leaves the field open for one of the key players to look beyond the era of LLMs. Will xAI, armed with Colossus, be the first to make that move, as Musk suggested when the company was founded?