Elon Musk's xAI has released the code and weights of its Grok-1 AI model, taking a jab at OpenAI in the process.
The release, via GitHub and BitTorrent, enables researchers and developers worldwide to build on and experiment with its 314 billion parameters – around 140 billion more than GPT-3's 175 billion.
xAI aims to democratize access to advanced LLM technology by providing a raw, unrefined version of Grok-1 that is open to experimentation of every kind, including commercial use.
░W░E░I░G░H░T░S░I░N░B░I░O░
– Grok (@grok) March 17, 2024
Of course, Musk couldn't resist a little (un)friendly banter about Grok's open-source offering, replying to ChatGPT's account:
Tell us more about the "Open" part of OpenAI…
— Elon Musk (@elonmusk) March 17, 2024
Musk and OpenAI founders Sam Altman and Greg Brockman are embroiled in a legal battle and debate over OpenAI's dramatic evolution from a nonprofit open-source research company to a profit-making arm of Microsoft.
Grok is another thorn in the side of OpenAI, which is under pressure from several quarters following the recent release of Anthropic's impressive Claude 3 Opus and Google's Gemini. Even Apple has joined the LLM fray with its newly released MM1.
However, Grok-1 isn't immediately ready for use in conversational AI applications.
First, the model has not been fine-tuned with specific instructions or datasets to operate optimally in dialog systems. This means additional effort and resources will be required to leverage Grok-1's capabilities for such tasks, presenting a challenge for those interested in developing conversational AI.
Additionally, the sheer size of the model's weights – a whopping 296GB – means that running the model requires significant computing resources, including high-end data center-class hardware.
However, the AI community anticipates efforts to optimize Grok-1 through quantization, which could reduce the model's size and computational burden, making it more accessible to those with generative AI-friendly rigs.
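As a back-of-the-envelope illustration of why quantization matters here, checkpoint size scales linearly with bits per parameter. This is only a rough sketch (the function and constant names are illustrative, and the real checkpoint's layout and per-tensor precisions may differ), but it shows how 314 billion parameters translate into storage at common precisions:

```python
def checkpoint_size_gb(num_params: int, bits_per_param: int) -> float:
    """Rough checkpoint size in decimal gigabytes: params * bits / 8 bits-per-byte."""
    return num_params * bits_per_param / 8 / 1e9

# Illustrative constant: 314B parameters, per xAI's announcement.
GROK1_PARAMS = 314_000_000_000

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: ~{checkpoint_size_gb(GROK1_PARAMS, bits):,.0f} GB")
```

At 8 bits per parameter this lands near the ~300GB figure reported for the released weights, while a hypothetical 4-bit quantization would roughly halve that again – the kind of reduction that brings a model within reach of smaller multi-GPU rigs.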
Grok-1 is actually open source
One of the most important aspects of the Grok-1 release is xAI's decision to use the Apache 2.0 license, the same license Mistral chose for its 7B model.
Unlike licenses that impose more restrictive terms on the use and distribution of software, the Apache 2.0 license provides extensive freedom to use, modify, and distribute it.
Grok-1 weights are out under Apache 2.0: https://t.co/9K4IfarqXK
It is more open-source than other open-weight models, which often include usage restrictions.
It is less open source than Pythia, Bloom and OLMo, which include training code and reproducible datasets. https://t.co/kxu2anrNiP pic.twitter.com/UeNew30Lzn
— Sebastian Raschka (@rasbt) March 17, 2024
This includes commercial use, making Grok-1 an attractive foundation for companies and individuals looking to build on top of it or integrate the model into their own products and services.
By making Grok-1's weights and architecture freely available, xAI advances Musk's vision of open AI and confronts the AI community at large.
Any viable open source model threatens to erode the revenues of closed source developers like OpenAI and Anthropic.
OpenAI is likely to be rocked by the recent developments from Anthropic, Google, and now xAI.
The community is bracing for some form of GPT-5 or Sora release that could put it back on top.