We've known it for a while, but now it's official: the generative AI race is competitive for both developers and end users.
Case in point: today, Elon Musk's xAI, the startup spun off from the social network X that uses its data to train new large language models (LLMs) such as the Grok family, announced that its application programming interface (API) is now open to the general public and includes $25 in free API credits per month through the end of the year.
Since it's already November, that's just two months' worth of free credits, or a total of $50.
Musk had already announced three weeks ago that the xAI API was available in beta, but apparently adoption wasn't strong enough for his liking, hence the added incentive of free developer credits.
Is $25 a month with two months left really that much of a carrot?
That doesn't sound like much coming from the world's richest man and multi-billionaire, and it won't go far for individual users, but it might be enough to entice some developers to at least check out xAI's platform and tools and start building apps on the Grok models.
Specifically, xAI's API is priced at $5 per million input tokens and $15 per million output tokens, compared with $2.50/$10 for OpenAI's GPT-4o model and $3/$15 for Anthropic's Claude 3.5 Sonnet model. In practice, that means xAI's $25 credit won't get a developer very far – only about two million tokens per month. For reference, one million tokens corresponds to the word count of roughly 7-8 novels.
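To make that concrete, here is a rough back-of-the-envelope sketch of how far $25 goes at those published prices. The 50/50 split between input and output tokens is an illustrative assumption; real workloads will vary.

```python
# Rough estimate of how many tokens $25 in monthly xAI credit buys,
# using the prices cited in the article. The 50/50 input/output mix
# is an assumption for illustration only.
INPUT_PRICE_PER_M = 5.00    # USD per million input tokens
OUTPUT_PRICE_PER_M = 15.00  # USD per million output tokens
MONTHLY_CREDIT = 25.00      # USD of free credit per month

# Blended cost per million tokens if half the tokens are input, half output.
blended_per_m = 0.5 * INPUT_PRICE_PER_M + 0.5 * OUTPUT_PRICE_PER_M  # $10 per million
tokens_in_millions = MONTHLY_CREDIT / blended_per_m

print(f"~{tokens_in_millions:.1f} million tokens per month at a 50/50 mix")
# Roughly 2.5 million tokens, in the same ballpark as the article's
# "about two million" figure (which assumes a more output-heavy mix).
```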
The context limit, i.e. how many tokens can be fed in or generated in a single interaction via the API, is around 128,000 – similar to OpenAI's GPT-4o, below Anthropic's 200,000-token window, and well below Google Gemini 1.5 Flash's 1-million-token context window.
Additionally, in my brief testing of the xAI API, I was only able to access grok-beta, and text only, with none of the image generation capabilities of Grok 2 (powered by Black Forest Labs' Flux.1 model).
New Grok models will follow soon
According to xAI's blog post, grok-beta is actually "a preview of a new Grok model currently in the final stages of development," and a new Grok "vision model" will be available next week.
Additionally, xAI points out that grok-beta supports "function calling," i.e. the model can be given descriptions of external tools or services and respond with structured calls that allow such access.
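For developers wondering what that looks like in practice, here is a minimal, hypothetical sketch of function calling against grok-beta using the OpenAI-style request shape the API reportedly mirrors. The endpoint URL, API key placeholder, and the `get_weather` tool are assumptions for illustration; check xAI's documentation for the actual values.

```python
# Hypothetical function-calling sketch against grok-beta, assuming an
# OpenAI-compatible chat completions API. URL, key, and tool are placeholders.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_XAI_API_KEY",       # placeholder credential
    base_url="https://api.x.ai/v1",   # assumed xAI endpoint
)

# Describe a tool the model is allowed to call (hypothetical example).
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="grok-beta",
    messages=[{"role": "user", "content": "What's the weather in Memphis?"}],
    tools=tools,
)

# If the model decides to use the tool, it returns the function name and
# JSON-encoded arguments instead of a plain text answer.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    print(tool_calls[0].function.name, tool_calls[0].function.arguments)
```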
Compatible with the competition
Furthermore, the xAI account on the social network X posted that the xAI API is "compatible with OpenAI and Anthropic SDKs," the software development kits developers use to build on those platforms, meaning it should be relatively easy to swap those companies' models out for grok-beta or other models on the xAI platform.
Musk's xAI recently turned on its "Colossus" supercluster of 100,000 Nvidia H100 GPUs in Memphis, Tennessee, where its latest models are trained – the largest, or one of the largest, such clusters in the world – so it appears that hard work is already underway at this facility.
What do you think? Is it enough to lure developers in VentureBeat's audience to try building on xAI? Let me know: carl.franzen@venturebeat.com.
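In practice, that "drop-in" claim means existing OpenAI SDK code should only need a different base URL and model name to target xAI instead. The sketch below assumes the same endpoint and key placeholders as above and is not an official migration guide.

```python
# Minimal sketch of swapping GPT-4o for grok-beta in code that already uses
# the OpenAI Python SDK. The base URL is an assumption; consult xAI's docs.
from openai import OpenAI

# Before: client = OpenAI(api_key="OPENAI_KEY") with model="gpt-4o"
client = OpenAI(api_key="YOUR_XAI_API_KEY", base_url="https://api.x.ai/v1")

reply = client.chat.completions.create(
    model="grok-beta",  # swapped in for "gpt-4o"
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(reply.choices[0].message.content)
```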