Integrating AI-Focused Data Management and Differential Privacy: The Power of PVML

Enterprises are accumulating an unprecedented amount of data to fuel their AI pursuits. However, with this increased data collection comes the concern of safeguarding it, especially since much of it is highly confidential. PVML offers a compelling solution to this issue by combining a ChatGPT-esque tool for data analysis with the protective measures of differential privacy. By utilizing retrieval-augmented generation (RAG), PVML can access an organization’s data without physically moving it, effectively removing yet another security consideration.

The Israel-based company has recently announced a successful seed round of $8 million, led by NFX, with additional funding from FJ Labs and Gefen Capital.

The startup was founded by husband-and-wife duo Shachar Schnapp (CEO) and Rina Galperin (CTO). Schnapp earned his doctorate in computer science with a specialization in differential privacy and went on to work on computer vision projects at General Motors. Galperin, meanwhile, obtained her master’s degree in computer science with a focus on artificial intelligence and natural language processing, and worked on various machine learning initiatives at Microsoft.

“Our experience in this field stems from our time working in large corporations and organizations, where we witnessed the inefficiencies that, as naive students, we had hoped would not exist,” said Galperin. “At PVML, our main objective is to democratize data, which can only be achieved by both protecting this sensitive information and allowing easy access to it. In today’s landscape, easy access means AI capabilities: everyone wants to analyze data in free text because it is quicker and more efficient, and our secret weapon, differential privacy, makes that integration simple.”

Differential privacy is not a novel concept. Its core premise is a mathematical guarantee that the presence or absence of any single individual in a large data set cannot be inferred from query results. One common technique for achieving this is to inject carefully calibrated random noise into query results, so that individual records stay protected while aggregate analysis remains accurate.
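The noise-injection idea can be sketched with the Laplace mechanism, the textbook way to make a counting query ε-differentially private. This is a generic illustration, not PVML's implementation; `dp_count` and its parameters are hypothetical names.

```python
import math
import random

def dp_count(values, threshold, epsilon):
    """Count items above `threshold`, with Laplace noise for epsilon-DP.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the noise scale is 1/epsilon:
    smaller epsilon means stronger privacy and noisier answers.
    """
    true_count = sum(1 for v in values if v > threshold)
    # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF method.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise
```

An analyst sees only the noisy count, so the answer is useful in aggregate, but no single record can be confidently reverse-engineered from it.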

The PVML team argues that current data access solutions are not only ineffective, but they also create a considerable amount of overhead. This often results in a significant portion of data being removed in the process of granting secure access to employees. However, this approach can do more harm than good, as the redacted data may not be viable for certain tasks. Additionally, the extended lead time required to access this data often renders it impractical for real-time use cases.

By utilizing differential privacy, PVML’s users do not need to modify the original data, effectively minimizing overhead and safely unlocking this information for AI applications.

Many of the prominent tech giants currently implement differential privacy in some form and make their tools and libraries available to developers. However, PVML argues that it has yet to be adopted by the majority of the data community.

“The understanding of differential privacy remains more theoretical than practical,” said Schnapp. “We have made it our mission to take it from concept to execution. Through the development of practical algorithms specifically tailored for data in real-world scenarios, we have made this a reality.”

Of course, none of PVML’s work in differential privacy would matter if its actual data analysis tools and platform were not useful. One of the most obvious use cases is the ability to chat with data while ensuring no sensitive information leaks into the conversation. By utilizing RAG, PVML can significantly reduce hallucinated answers, and the performance impact is minimal because the data remains in its original location.
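The chat-with-data flow described above can be sketched as a minimal RAG loop: retrieve the most relevant records for a question, then assemble them into a prompt for a language model. Everything here is a hypothetical stand-in; a real system would use embedding similarity and an actual LLM, and this is not PVML's pipeline.

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by a toy relevance score: shared lowercase words.

    Real RAG systems score by embedding similarity; word overlap keeps
    this sketch self-contained.
    """
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Ground the model in retrieved context instead of moving the data."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Because retrieval happens at query time against data in place, adding a privacy layer means filtering or noising what `retrieve` returns, rather than redacting the source data set up front.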

However, there are other use cases as well. Schnapp and Galperin explain how differential privacy enables companies to share data between departments, and even monetize access to their data for third parties.

“Currently, 70% of stock market transactions are made by AI,” stated Gigi Levy-Weiss, general partner and co-founder of NFX. “This is just a glimpse into the future, where organizations that embrace AI will be ahead of the curve. Unfortunately, many companies are hesitant to connect their data to AI, fearing security breaches, and with good reason. PVML’s innovative technology creates an invisible layer of protection, making data accessible to all and enabling monetization opportunities today, while paving the path for future advancements.”

Kira Kim

Kira Kim is a science journalist with a background in biology and a passion for environmental issues. She is known for her clear and concise writing, as well as her ability to bring complex scientific concepts to life for a general audience.
