
PVML combines an AI-centric data access and analytics platform with granular data protection

Companies are hoarding more data than ever to advance their AI ambitions, but at the same time they are concerned about who can access this data, which is often very private in nature. PVML offers an interesting solution by combining a ChatGPT-like data analysis tool with the security guarantees of differential privacy. Using retrieval-augmented generation (RAG), PVML can access an organization's data without moving it, eliminating another security concern.

The Tel Aviv-based company recently announced that it has closed an $8 million seed round led by NFX with participation from FJ Labs and Gefen Capital.

Photo credit: PVML

The company was founded by married couple Shachar Schnapp (CEO) and Rina Galperin (CTO). Schnapp earned a doctorate in computer science with a specialization in differential privacy and then worked on computer vision at General Motors, while Galperin earned her master's degree in computer science with a focus on AI and natural language processing and worked on machine learning projects at Microsoft.

“A lot of our experience in this area comes from our work in big corporations and large companies, where we saw that things were not as efficient as we had hoped as naive students,” Galperin said. “The main value we want to bring to organizations as PVML is the democratization of data. This can only happen if, on the one hand, you protect this very sensitive data, but, on the other hand, you allow easy access to it, which today is synonymous with AI. Everyone wants to analyze data using free text. It’s much easier, faster and more efficient, and our secret sauce, differential privacy, makes this integration easy.”

Differential privacy is anything but a new concept. The core idea is to ensure the privacy of individual users in large datasets and to provide mathematical guarantees for this. One of the most common ways to achieve this is to introduce a degree of randomness into the results, but in a way that doesn’t meaningfully alter the data analysis.
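As a rough illustration of that idea (this is a generic textbook sketch, not PVML's own, unpublished implementation), a simple counting query can be protected with the classic Laplace mechanism: add noise scaled to the query's sensitivity and the privacy budget epsilon before releasing the result.

```python
import numpy as np

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the count by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: how many salaries exceed 100k, with a privacy budget of 0.5
salaries = [87_000, 104_500, 99_000, 132_000, 76_000, 118_000]
print(dp_count(salaries, lambda s: s > 100_000, epsilon=0.5))
```

The analyst still gets a usable answer, but no single individual's presence in the dataset can be inferred from it with confidence.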

The team argues that today's data access solutions are ineffective and create a lot of overhead. Often, for example, a lot of data has to be redacted before employees can access it securely, but that can be counterproductive because the redacted data may no longer be usable for some tasks (and the added lead time to access the data means real-time use cases are often impossible).

Photo credit: PVML

Thanks to the guarantees of differential privacy, PVML users don’t have to make any changes to the original data. This avoids almost all of that overhead and safely unlocks the data for AI use cases.

Virtually every large technology company now uses differential privacy in one form or another and makes its tools and libraries available to developers. The PVML team argues, however, that it has not yet been truly put into practice by most of the data community.

“Current knowledge about differential privacy is more academic than practical,” Schnapp said. “We decided to take it from theory to practice. And that’s exactly what we did: we developed practical algorithms that work best on data in real-world scenarios.”

None of this data protection work would matter, of course, if PVML's actual data analysis tools and platform weren’t useful. The most obvious use case here is the ability to chat with your data, all with the guarantee that no sensitive information can leak into the chat. Using RAG, PVML says it can reduce hallucinations to almost zero, and the overhead is minimal because the data stays in place.
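In broad strokes (PVML hasn't published its architecture, so the record set, scoring method and prompt below are purely illustrative), a RAG flow over in-place data retrieves only the handful of records relevant to a question and passes those, rather than the whole dataset, to the language model as context.

```python
from collections import Counter

def score(query: str, doc: str) -> int:
    """Crude relevance score: word overlap between query and record."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, records: list[str], k: int = 2) -> list[str]:
    """Return the k most relevant records; the full dataset never leaves its store."""
    return sorted(records, key=lambda r: score(query, r), reverse=True)[:k]

records = [
    "Q3 revenue in the EMEA region grew 12% year over year.",
    "Employee satisfaction survey results for the Berlin office.",
    "Q3 revenue in APAC was flat compared to Q2.",
]
question = "How did Q3 revenue develop?"
context = retrieve(question, records)
prompt = "Answer using only this context:\n" + "\n".join(context) + "\n\nQuestion: " + question
# Only the prompt (retrieved context + question) goes to the LLM; in PVML's case the
# released answers would additionally be covered by the differential-privacy layer.
print(prompt)
```

Grounding the model in retrieved records rather than its own parametric memory is what keeps hallucinations low, and because retrieval runs where the data already lives, nothing has to be copied out first.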

But there are other use cases, too. Schnapp and Galperin noted that differential privacy also allows companies to share data between business units. In addition, it may allow some companies to monetize access to their data for third parties, for example.

“In today’s stock market, 70% of transactions are carried out by AI,” said Gigi Levy-Weiss, NFX general partner and co-founder. “This is a taste of the future, and companies that adopt AI today will be one step ahead tomorrow. But companies are afraid to connect their data to AI because they fear exposure, and for good reasons. PVML’s unique technology creates an invisible layer of protection and democratizes access to data, enabling monetization use cases today and paving the way for tomorrow.”
