ChatGPT Integration With Analytic Database Enables Conversational Queries
Developers and business users can perform fast, ad-hoc analysis of databases, using natural-language queries to solve business problems and drive more value.
Kinetica has just introduced the first analytics database to integrate with ChatGPT. This gives users, including “citizen data scientists” and business owners, the opportunity to ask any question about their proprietary data.
I was able to ask Nima Negahban, co-founder and CEO at Kinetica, some questions leading up to the release of their new offering.
Explain How the Conversational Querying of Databases Works
Conversational querying pairs a generative AI interface with a real-time analytic database. Generative AI converts a natural-language question into SQL, the SQL is executed, and an answer is returned. What's key is that the database must answer ad-hoc questions quickly, not just questions that are known in advance and optimized ahead of time through tedious data engineering.
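To make that loop concrete, here is a minimal sketch in Python of how a natural-language question could be translated into SQL and executed. It is illustrative only: the OpenAI client, the model name, and the sqlite3 connection are assumed stand-ins, not Kinetica's actual interface.

```python
import sqlite3

from openai import OpenAI  # assumed client; any NL-to-SQL model would do

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A toy schema the model is told about; a real system would pass the live schema.
SCHEMA = """
CREATE TABLE inventory (
    sku TEXT,
    warehouse TEXT,
    quantity INTEGER,
    updated_at TEXT
);
"""

def question_to_sql(question: str) -> str:
    """Translate a natural-language question into a single SQL statement."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute your own
        messages=[
            {"role": "system",
             "content": "Translate the user's question into one SQL query for "
                        f"this schema:\n{SCHEMA}\nReturn only the SQL, no prose."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content.strip()

def ask(conn: sqlite3.Connection, question: str) -> list:
    """The conversational-query loop: NL question -> generated SQL -> result rows."""
    sql = question_to_sql(question)
    return conn.execute(sql).fetchall()
```

In a production system the prompt would carry the live schema, and the generated SQL would be validated before it is run against the database.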
How Will Conversational Querying Change How We Query Databases Going Forward?
Soon, the predominant way to query will be through natural language, not Structured Query Language (SQL). SQL is a skill that most people do not have, so many more people will now be able to ask questions about their data. People will come to expect that they can ask any question about their data and get immediate responses.
What Are Some of the Business Problems That the Conversational Querying of Databases Will Solve?
Conversational querying can solve business problems by enabling intuitive, dynamic data exploration, expanding user access, and improving decision-making. By ingesting large amounts of streaming data, we are able to ensure answers include the most up-to-date information for questions like, “What is the real-time status of our inventory, and should we reroute delivery vehicles to reduce out-of-stocks?”
Do You Have Some Specific Use Cases?
Users can interact with data generated by IoT devices, such as sensor data, telemetry data, or device logs, in a conversational manner to monitor performance, identify anomalies, or trigger actions for remote operations or maintenance.
Within the supply chain, users can ask questions about inventory levels, supplier performance, or demand forecasts in a conversational manner to optimize supply chain operations, identify bottlenecks, or even re-route delivery fleets in real time.
For fraud detection, users can leverage conversational queries to analyze complex data from diverse sources, such as transaction and log data, and apply graph analytics techniques to detect anomalies, patterns, or suspicious connections in real time.
What’s the Biggest Challenge a User Will Need To Overcome To Be Successful Querying Their Databases Conversationally?
Putting a generative AI front end on a database doesn't make it conversational. There will be some disillusionment when users learn they can only ask canned questions or have to wait hours for an answer to come back. Organizations will need to overcome the technical debt associated with siloed data and analytics, batch architectures, and complex data pipelines that restrict analytic agility.
How Will This Make Developers’ Lives Simpler and Easier?
Conversational querying will let developers interact with data sources using natural language, eliminating the need for complex query languages or hand-written code. This simplifies the querying process, makes data more accessible and intuitive for non-experts, and removes the burden of building and maintaining tedious data pipelines, allowing developers to spend less time on the plumbing of an analytic platform.
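As a hypothetical illustration of what this looks like from the developer's side, the ask() helper sketched earlier (with its imports) replaces a hand-written query or a dedicated reporting pipeline:

```python
# Hypothetical usage of the ask() sketch above: the developer poses a
# plain-English question instead of hand-writing SQL.
conn = sqlite3.connect("warehouse.db")  # assumed local demo database
for row in ask(conn, "Which warehouses are below 100 units for any SKU?"):
    print(row)
```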
Is There Anything Else Developers Need To Know About Conversational Querying of Databases?
The future is here. Anyone can start trying it at kinetica.com/sqlgpt.