Enterprise data stacks are notoriously diverse, chaotic and fragmented. As data flows from multiple sources into complex multi-cloud platforms and is then distributed across various AI, BI and chatbot applications, managing these ecosystems has become a daunting and time-consuming challenge. Today, Connecty AI, a San Francisco-based startup, has emerged from stealth with $1.8 million in funding to tame this complexity through a context-aware approach.
Connecty's core innovation is a context engine that spans an enterprise's entire horizontal data pipeline, actively analyzing and connecting disparate data sources. By linking data points, the platform builds a differentiated understanding of what is happening in the business in real time. This "contextual awareness" enables automated data tasks and, ultimately, accurate, actionable business insights.
Although still in its infancy, Connecty is already streamlining data tasks for several companies. The platform reduces the workload of data teams by up to 80% and executes projects that once took weeks in a matter of minutes.
Connecty brings order to “data chaos”
Even before the age of large language models, data chaos was a grim reality.
As structured and unstructured information grows at an unprecedented rate, teams constantly struggle to keep their fragmented data architectures in order. As a result, essential business context remains scattered and data schemas become outdated, leading to poor performance in downstream applications: think of AI chatbots plagued by hallucinations or BI dashboards serving up inaccurate business insights.
Connecty AI founders Aish Agarwal and Peter Wisniewski saw these challenges firsthand in their respective roles across the data value chain and found that it all boils down to one big problem: capturing the nuances of business data distributed across pipelines. In essence, teams had to do a great deal of manual work for data preparation, mapping, exploratory data analysis and data model preparation.
To fix this problem, the duo began working on the startup and the context engine that lies at its heart.
“The core of our solution is the proprietary context engine that extracts, connects, updates and enriches data from multiple sources in real time (via no-code integrations), including human-in-the-loop feedback to fine-tune custom definitions. We do this with a combination of vector databases, graph databases and structured data, creating a ‘context graph’ that captures and maintains a nuanced, connected view of all information,” Agarwal told VentureBeat.
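Connecty's engine is proprietary and its internals are not public, but the general idea of a "context graph" can be sketched: metadata nodes from different source systems, linked both by explicit pipeline relationships and by embedding similarity. The following toy sketch is purely illustrative; every class, table name and threshold here is an assumption, not Connecty's actual design.

```python
# Illustrative sketch only: a toy "context graph" that links metadata nodes
# from different data sources by explicit edges (e.g. lineage) and by
# embedding similarity. All names and values are hypothetical.
import math


class ContextGraph:
    def __init__(self):
        self.nodes = {}   # node_id -> {"source": ..., "embedding": [...]}
        self.edges = {}   # node_id -> set of explicitly linked node_ids

    def add_node(self, node_id, source, embedding):
        self.nodes[node_id] = {"source": source, "embedding": embedding}
        self.edges.setdefault(node_id, set())

    def link(self, a, b):
        # An explicit relationship, e.g. a warehouse table feeding a dashboard.
        self.edges[a].add(b)
        self.edges[b].add(a)

    @staticmethod
    def _cosine(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
        return dot / norm if norm else 0.0

    def related(self, node_id, threshold=0.9):
        """Return explicitly linked nodes plus semantically similar ones."""
        target = self.nodes[node_id]["embedding"]
        similar = {
            other for other, data in self.nodes.items()
            if other != node_id
            and self._cosine(target, data["embedding"]) >= threshold
        }
        return self.edges[node_id] | similar


graph = ContextGraph()
graph.add_node("warehouse.orders", "snowflake", [0.9, 0.1, 0.0])
graph.add_node("crm.deals", "salesforce", [0.88, 0.15, 0.02])
graph.add_node("bi.revenue_dashboard", "looker", [0.2, 0.9, 0.1])
graph.link("warehouse.orders", "bi.revenue_dashboard")

print(sorted(graph.related("warehouse.orders")))
# → ['bi.revenue_dashboard', 'crm.deals']
```

The point of the sketch is the combination Agarwal describes: graph structure for known relationships, vector similarity for discovered ones, with both feeding a single connected view.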
Once this company-specific context graph covering all data pipelines is in place, the platform automatically generates a dynamic, personalized semantic layer for each user persona. The layer runs in the background, proactively generating recommendations within data pipelines, updating documentation and delivering contextually relevant insights instantly tailored to the needs of different stakeholders.
“Connecty AI applies deep context learning of disparate data sets and their connections to each object to create comprehensive documentation and identify business metrics based on business intent. In the data preparation phase, Connecty AI generates a dynamic semantic layer that helps automate data model generation while highlighting inconsistencies and resolving them with human feedback, further enriching contextual learning. Additionally, self-service data exploration capabilities will empower product managers to conduct ad hoc analysis independently, minimizing their reliance on technical teams and enabling more agile, data-driven decision-making,” Agarwal explained.
Insights are delivered through “data agents” that interact with users in natural language, taking into account their technical expertise, information access level and permissions. In essence, the founder explains, each user persona receives a tailored experience that matches their role and skills, making it easier to interact effectively with data, boosting productivity and reducing the need for extensive training.
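The permission-aware side of such agents can be illustrated with a minimal sketch: before answering, the agent filters what each persona is allowed to see. The roles, metric names and values below are invented for illustration and do not reflect Connecty's actual implementation.

```python
# Hypothetical sketch: filter a metrics payload by role before an agent
# answers. Role names, metrics and numbers are all invented examples.
ROLE_ACCESS = {
    "analyst": {"revenue", "orders", "churn_rate"},
    "product_manager": {"orders", "churn_rate"},
    "intern": {"orders"},
}


def answer_query(role, metrics):
    """Return only the metrics the given role is permitted to see."""
    allowed = ROLE_ACCESS.get(role, set())
    return {name: value for name, value in metrics.items() if name in allowed}


metrics = {"revenue": 1_200_000, "orders": 5400, "churn_rate": 0.031}
print(answer_query("product_manager", metrics))
# → {'orders': 5400, 'churn_rate': 0.031}
```

A real system would enforce this at the data-access layer rather than after retrieval, but the principle is the same: the answer is shaped by the persona asking, not just by the question.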
Significant results for early partners
While many companies, including startups like DataGPT and multi-billion-dollar giants like Snowflake, promise faster access to accurate insights via large-language-model-based interfaces, Connecty claims to stand out with its context-graph-based approach, which covers the entire stack rather than just one or two platforms.
According to the company, other vendors automate data workflows by interpreting static schemas. That approach falls short in production environments, which require an ever-evolving, coherent understanding of data across systems and teams.
Currently, Connecty AI is in the pre-revenue phase but is working with several partner companies to further improve its product's performance on real-world data and workflows. These include Kittl, Fiege, Mindtickle and Dept. All four organizations are running Connecty proofs of concept in their environments and have been able to streamline data projects, reduce their teams' workload by up to 80% and accelerate time to insight.
“Our data complexity is increasing rapidly, and data preparation and metrics analysis are taking longer. We used to wait an average of two to three weeks to prepare data, extract actionable insights from our product usage data, and merge it with transactional and marketing data. With Connecty AI, it’s now a matter of minutes,” said Nicolas Heymann, CEO of Kittl.
As a next step, Connecty plans to expand its context engine's understanding by supporting additional data sources. The product will also be made available to a broader range of companies as an API service, with per-seat or usage-based pricing.