Hi there! I'm Shrijith Venkatrama, the founder of Hexmos. Right now, I’m building LiveAPI, a super-convenient tool that simplifies engineering workflows by generating awesome API docs from your code in minutes.
As a team, we have embraced AI tools such as the Cursor editor and GitHub Copilot, and we have experienced firsthand how quickly they let us iterate on products and ship improvements for our customers.
One area where I feel existing tools fall short is in dealing with databases.
While we can manually generate SQL queries from textual descriptions, I believe automating that process would be far better. Imagine if we could:
- Query tables directly within Cursor/VSCode using natural language
- Chat with an LLM based on the database schema to generate complex SQL queries
- Seamlessly insert queries into the editor from the chat tool
- Provide in-place explanations of how SQL queries work
- Automatically discover database connections from the codebase, if possible
Essentially, I believe interacting with databases could be much more enjoyable with AI-enabled features.
In this series of posts, I will break down this vision into focused tasks and work on them to enhance our internal development practices. I'd love to take you along on this journey as I think through these challenges.
## Step 1 - Foundations for a Language Server Protocol (LSP) Extension
At Hexmos, we have experience building VSCode extensions, such as for our API tool Lama2. We are also developing an exciting extension for LiveAPI. This experience provides a strong foundation for creating DBChat’s first component.
You can think of the LSP as a backend server that provides endpoints for the extension frontend. The first challenge is to ship a backend server with every copy of the extension to our customers.
Since we want the backend server to run everywhere, it’s better to use a compiled language rather than an interpreted one.
I think Go is a good choice here, so I’ll use it. Rust is also viable, but I find it slower for prototyping, and I don’t have much time for this now. Python is not ideal because distributing Python-based products can be more challenging than using Go, even within our own team.
Another advantage of an LSP structure is that the core code can support multiple frontends. For instance, we could easily create a CLI or TUI experience based on the Go core.
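To make that concrete, here is a minimal sketch (not DBChat's actual code) of what the Go core's wire handling could look like. The LSP transport is just JSON-RPC 2.0 messages framed with a `Content-Length` header over stdin/stdout, so any frontend that can spawn a process can drive the same core. The struct fields and the method dispatch shown here are illustrative only.

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"io"
	"os"
	"strconv"
	"strings"
)

// request mirrors the JSON-RPC 2.0 envelope that LSP messages use.
type request struct {
	JSONRPC string           `json:"jsonrpc"`
	ID      *json.RawMessage `json:"id,omitempty"`
	Method  string           `json:"method"`
	Params  json.RawMessage  `json:"params,omitempty"`
}

func main() {
	reader := bufio.NewReader(os.Stdin)
	for {
		// Each LSP message starts with headers like "Content-Length: 123",
		// followed by a blank line and then the JSON body.
		contentLen := 0
		for {
			line, err := reader.ReadString('\n')
			if err != nil {
				return // the editor closed the pipe; shut down
			}
			line = strings.TrimSpace(line)
			if line == "" {
				break // blank line marks the end of the headers
			}
			if strings.HasPrefix(line, "Content-Length:") {
				v := strings.TrimSpace(strings.TrimPrefix(line, "Content-Length:"))
				contentLen, _ = strconv.Atoi(v)
			}
		}

		body := make([]byte, contentLen)
		if _, err := io.ReadFull(reader, body); err != nil {
			return
		}

		var req request
		if err := json.Unmarshal(body, &req); err != nil {
			continue
		}
		// Real handlers (initialize, plus custom dbchat/* methods) would go here.
		fmt.Fprintln(os.Stderr, "received method:", req.Method)
	}
}
```

Responses travel back the same way: a JSON body prefixed with its own `Content-Length` header.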
Speaking of a CLI, a good milestone might be to first ship a functional shell experience using the Go core. The extension UI can come later.
## DBChat Shell - A Rough Specification of What We Want to Accomplish
The first step is to build a command-line tool called `dbc` that includes a shell mode.
The initial goal is to launch a Python-shell-like terminal with:
```
dbc shell
```
This will involve building a robust Go core with a solid foundation of features.
The tricky part is getting the shell experience right. However, if we prioritize the VSCode/Cursor extension as the primary goal, we could simplify the shell experience initially to ensure painless testing of the core functionality.
## The DBChat Shell Experience
Within the shell, we could gradually implement commands for the following (a rough sketch follows the list):
- Specifying database connections
- Connecting to a database (handling success/failure)
- Extracting the database schema as text
- Accepting user chat queries
- Connecting to Gemini, sending the query and schema to generate SQL
- Executing SQL (optionally after user confirmation)
- Neatly displaying results
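To give a feel for where this is headed, here is a rough sketch of that shell loop in Go. The command names (`\connect`, `\schema`) and the `generateSQL` helper are placeholders I'm making up for illustration, and the Gemini call uses the public `generateContent` REST endpoint with a request shape that should be double-checked against the current API docs.

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"strings"
)

// Request/response shapes for Gemini's generateContent endpoint;
// treat the exact field names as an assumption to verify.
type part struct {
	Text string `json:"text"`
}
type content struct {
	Parts []part `json:"parts"`
}
type geminiRequest struct {
	Contents []content `json:"contents"`
}
type geminiResponse struct {
	Candidates []struct {
		Content content `json:"content"`
	} `json:"candidates"`
}

// generateSQL sends the schema plus the user's question to Gemini and
// returns whatever text the model produces (hopefully a SQL query).
func generateSQL(schema, question string) (string, error) {
	prompt := fmt.Sprintf("Schema:\n%s\n\nWrite a SQL query for: %s", schema, question)
	body, err := json.Marshal(geminiRequest{
		Contents: []content{{Parts: []part{{Text: prompt}}}},
	})
	if err != nil {
		return "", err
	}
	url := "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=" +
		os.Getenv("GEMINI_API_KEY")
	resp, err := http.Post(url, "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out geminiResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	if len(out.Candidates) == 0 || len(out.Candidates[0].Content.Parts) == 0 {
		return "", fmt.Errorf("empty response from model")
	}
	return out.Candidates[0].Content.Parts[0].Text, nil
}

func main() {
	scanner := bufio.NewScanner(os.Stdin)
	schema := "" // would be filled in by \schema against the live database
	fmt.Println(`dbc shell (type \quit to exit)`)
	for {
		fmt.Print("dbc> ")
		if !scanner.Scan() {
			break
		}
		line := strings.TrimSpace(scanner.Text())
		switch {
		case line == `\quit`:
			return
		case strings.HasPrefix(line, `\connect`):
			fmt.Println("TODO: parse the connection string and connect")
		case line == `\schema`:
			fmt.Println("TODO: extract the schema from the connected database")
		case line == "":
			continue
		default:
			// Anything else is treated as a natural-language question.
			sql, err := generateSQL(schema, line)
			if err != nil {
				fmt.Println("error:", err)
				continue
			}
			fmt.Println("Suggested SQL:\n" + sql)
			// Execution and result display would happen here, after confirmation.
		}
	}
}
```

The key design choice is that the LLM only ever proposes SQL; executing it stays behind an explicit confirmation step.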
## About the Upcoming Post
In the next post, I will cover:
- Creating a user-friendly REPL loop in Go (supporting up/down arrows, history, etc.)
- Dynamically accepting database connection inputs in the REPL (focusing initially on PostgreSQL, as it’s what we use internally)
- Automatically extracting the schema from a PostgreSQL database (previewed in the sketch below)
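As a small preview of that last item, a first cut of schema extraction could simply read PostgreSQL's `information_schema` and dump it as plain text for the LLM. This is a sketch using the `github.com/lib/pq` driver; the connection string and the exact query are placeholders, not the final design.

```go
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // PostgreSQL driver, registers itself as "postgres"
)

func main() {
	// Illustrative connection string; in dbc this would come from user input.
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/mydb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// information_schema.columns lists every column in the public schema.
	rows, err := db.Query(`
		SELECT table_name, column_name, data_type
		FROM information_schema.columns
		WHERE table_schema = 'public'
		ORDER BY table_name, ordinal_position`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var table, column, dtype string
		if err := rows.Scan(&table, &column, &dtype); err != nil {
			log.Fatal(err)
		}
		// This plain-text dump is the kind of schema context we would hand to the LLM.
		fmt.Printf("%s.%s %s\n", table, column, dtype)
	}
}
```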