MongoDB announced that Atlas Vector Search's integration with Amazon Bedrock is now generally available. First previewed at AWS re:Invent last year, the integration lets developers ground foundation models and AI agents in proprietary data stored in MongoDB, producing more relevant, accurate, and personalized answers through Retrieval-Augmented Generation (RAG).
"Many companies remain concerned about ensuring the accuracy of outputs from AI-powered systems while protecting their proprietary data," MongoDB Chief Product Officer Sahir Azam said in a press release. "We're making it easier for joint MongoDB and AWS customers to use a variety of foundation models hosted in their AWS environments to build generative AI applications that can securely use their proprietary data within MongoDB Atlas to improve accuracy and provide enhanced end-user experiences."
Amazon Bedrock is AWS's managed service for generative AI, giving enterprise customers a central place to build AI applications. Its rapidly growing catalog includes models from Amazon, Anthropic, Cohere, Meta, Mistral AI, and Stability AI. While models trained by external parties can be helpful, companies may prefer to draw on their own databases, which give them better context about their customers than publicly available data can.
This is where the MongoDB integration matters. Developers can privately customize the foundation model of their choice using their own data, and applications can then be built around the newly tuned LLMs without manual intervention. "You can build these gen AI applications, but if you can't put your own real-time operational data into the models, you're going to get generic answers," said Scott Sanchez, vice president of product marketing and strategy at MongoDB, during a press briefing.
"This integration with MongoDB makes this really easy for people," he continued. "Customers can also privately customize their large language models… with their proprietary data by converting it into vector embeddings stored in MongoDB for those LLMs. For example, a retailer could develop a gen AI application that uses autonomous agents to perform tasks such as processing inventory requests or handling customer returns in real time."
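To make the retrieval step concrete, here is a minimal sketch of how an application might query embeddings stored in MongoDB using Atlas Vector Search's `$vectorSearch` aggregation stage. The index name, field names, and the example embedding are illustrative assumptions, not details from the announcement; in a real RAG pipeline the query vector would come from an embedding model (e.g., one hosted on Amazon Bedrock).

```python
def build_vector_search_pipeline(query_embedding, index="vector_index",
                                 path="embedding", k=5, num_candidates=100):
    """Build an aggregation pipeline for Atlas's $vectorSearch stage.

    index/path are hypothetical names for this sketch; they must match
    the vector search index defined on the Atlas collection.
    """
    return [
        {"$vectorSearch": {
            "index": index,              # name of the Atlas vector index
            "path": path,                # document field holding embeddings
            "queryVector": query_embedding,
            "numCandidates": num_candidates,  # ANN candidates to consider
            "limit": k,                  # top-k results returned
        }},
        # Keep only the source text plus a relevance score to feed the LLM.
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

# Example: a (toy) 3-dimensional query embedding.
pipeline = build_vector_search_pipeline([0.1, 0.2, 0.3])
```

The resulting pipeline would be passed to `collection.aggregate(pipeline)` via PyMongo against an Atlas cluster; the retrieved `text` fields are then injected into the model prompt as grounding context.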
This is not the first collaboration between MongoDB and AWS: MongoDB Vector Search is available on Amazon SageMaker, and Atlas is supported by Amazon CodeWhisperer. Today's announcement comes as MongoDB unveils additional efforts to help enterprise customers build AI applications, including its MongoDB AI Applications Program (MAAP).