
Stardog's Karaoke offers a local, hallucination-free LLM solution for businesses

Enterprise data management and knowledge graph company Stardog, headquartered in Arlington, Virginia, has stayed one step ahead since its founding in 2006: even then, founder and CEO Kendall Clark knew that twenty-first-century corporations would be defined by their data, and that digitizing it and making it quickly accessible on demand could be an enormous market.

The company has raised $40 million in funding to date and counts U.S. government agencies such as NASA and the Department of Defense, as well as major corporations like Raytheon and Bosch, among its customers. And with the age of generative AI in full swing, demand for Stardog's services has only grown.

A new device for a new era

The company is now launching “Karaoke,” an innovative local server developed with partners Nvidia and Supermicro that hosts Stardog's Voicebox large language model (LLM) platform. Voicebox, a custom, fine-tuned enterprise-grade Llama 2 variant publicly unveiled in October 2023, lets users without technical training run natural language queries from their own computers against Stardog Cloud and have their questions answered from their own company's structured data.

“We're talking about a large bank, a manufacturer, a pharmaceutical company that's regulated and therefore can't easily, or perhaps ever, move all of its data to the cloud,” Clark said in an interview with VentureBeat. “All of those corporations need Gen AI, but nearly all Gen AI is in the cloud. Karaoke is designed to bridge that gap and effectively bring the cloud to you, put it alongside your data, and then give you the benefits of this democratized, self-service data access.”

Karaoke is available in multiple sizes and configurations, from Micro, a single “pizza box”-sized server, to the Enterprise version with 2,304 CPU cores, with a range of sizes in between. The smallest model supports 500 concurrent users, while the largest supports 20,000.

LLM as a data science translator

The Voicebox LLM layer effectively acts as a translator: it takes a user's question, for instance, in the case of a government agency, “Which euro trades violated sanctions in the last quarter?”, and translates it into the language of data science and programming: a relevant JavaScript, Python, or SQL query that retrieves the information from the company's Stardog knowledge graph.
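To make the idea concrete, here is a minimal, hypothetical sketch of that translation step, assuming a simple SQL store; the function names, the sample question, and the table schema are illustrative stand-ins, not Stardog's actual Voicebox API:

```python
# Hypothetical sketch of the "LLM as translator" pattern described above.
# Nothing here is Stardog's real API; it only illustrates the flow.

import sqlite3

def translate_to_sql(question: str) -> str | None:
    """Stand-in for the LLM layer: map a natural-language question to SQL.

    A production system would prompt a fine-tuned model with the knowledge
    graph's schema; a lookup table keeps this example self-contained.
    """
    canned = {
        "which euro trades violated sanctions in the last quarter?":
            "SELECT trade_id, counterparty, amount_eur FROM trades "
            "WHERE sanction_flag = 1 AND trade_date >= date('now', '-3 months');"
    }
    return canned.get(question.strip().lower())

def fetch_answer(question: str, conn: sqlite3.Connection):
    sql = translate_to_sql(question)
    if sql is None:
        return None  # the question could not be translated into a query
    # The rows shown to the user come from the company's own data,
    # not from text generated by the model.
    return conn.execute(sql).fetchall()
```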

How to avoid hallucinations? Don't show LLM output

Additionally, by leveraging Stardog's advanced knowledge graph and its newly developed Safety RAG (retrieval-augmented generation) design pattern, Voicebox guarantees a 100% hallucination-free experience. Clark told VentureBeat that thanks to this approach, enterprise customers no longer have to trade accuracy for advanced technology.

“It's pretty easy to build a hallucination-free AI system,” he said. “You only have to do one thing: never show the user anything that comes exclusively from the large language model. We never show an end user a fact that comes from the large language model. We only show them facts that come from their data.”

Instead of relying on the LLM to collect, summarize, or even retrieve a company's data, the Voicebox LLM layer simply converts the user's natural language questions into the kind of programmatic queries a trained data scientist would write.

The “response” the user receives is not an LLM (token prediction) output. Instead, it is simply what comes back when the translated, programmatic version of the query runs against the company's database. If the LLM layer doesn't understand the user's question, or what data would help answer it, it simply tells them: “I don't know how to answer your query.”

Additionally, Voicebox provides the user with a digital trail showing where the information was retrieved from, including citations and links.

As Clark explained, it shows “the answer to your query, a link that you can click to open the source. Our platform has traceability and lineage so you can check for yourself where these answers come from, eliminating the possibility of doubting or not trusting the data.”
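A rough sketch of the guardrail and traceability pattern just described might look like the following; the Fact record, its field names, and the run_query callback are assumptions made for illustration, not Stardog's implementation:

```python
# Hypothetical illustration of the "never show raw LLM text" guardrail plus
# the traceability Clark describes: every fact returned carries a source
# link, and an untranslatable question yields a refusal, not generated text.

from dataclasses import dataclass

@dataclass
class Fact:
    value: str       # a value read from the company's own data
    source_uri: str  # link the user can click to open the source record

def respond(translated_sql: str | None, run_query) -> dict:
    if translated_sql is None:
        # Nothing produced by the model itself is ever surfaced as an answer.
        return {"answer": "I don't know how to answer your query.", "facts": []}
    facts: list[Fact] = run_query(translated_sql)
    return {
        "answer": None,  # no free-form model-generated text
        "facts": [{"value": f.value, "source": f.source_uri} for f in facts],
    }
```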

Pricing and availability

Clark said Stardog plans to offer its Voicebox LLM tier for $39 per user per month, and it can be used in clouds or virtual private clouds without the Karaoke box.

The price of the Karaoke box depends on the number of users and the hardware, per the sizes mentioned above. However, Clark said the box is leased to customers for a period of three to five years.

For corporations in regulated industries that want to leverage the power of generative AI while meeting compliance requirements, Stardog's Karaoke represents a potential solution.

As these tools become increasingly essential to business operations across sectors, Stardog is well positioned to take the lead in safely and effectively deploying GenAI applications on-premises. Those interested can find further details about Stardog Karaoke and Voicebox on the Stardog website at www.stardog.com.

With Karaoke, Stardog not only addresses a significant market need but also sets a new standard for safe, effective, and compliant AI technologies in the enterprise sector.
