DEV Community

Madza

Posted on • Originally published at madza.hashnode.dev

Meet StackQL: The Future of AI Multi-Cloud Management with SQL 🤖⚡️

In today's rapid software development lifecycle, managing and integrating different cloud services, APIs, and data sources can quickly become a complicated and time-consuming task.

Developers must juggle various tools, SDKs, and APIs simultaneously to automate workflows, query cloud infrastructure, or integrate AI capabilities into their applications.

StackQL is an innovative solution that provides a unified SQL-like interface for interacting seamlessly with a wide range of cloud providers and APIs, so you can focus on your product instead.


In this article, we will explore how StackQL can be used in conjunction with the OpenAI and GitHub providers, and how AI-driven querying can streamline repository analysis.

Through direct instructions and practical code snippets, you will learn how to install StackQL, set up the required providers, launch the MCP server, and run queries to fetch the data you need.

This hands-on experience will equip you to implement AI-powered cloud query automation, increasing efficiency and the capacity for innovation in your future projects.

Thanks to the StackQL team for sponsoring this article!


Chapter 1: What is StackQL and Why Is It Useful?

StackQL is an open-source cloud-service query engine that gives developers a straightforward way to use SQL to interact with diverse cloud APIs and AI providers.


Developers don't have to study the SDKs of each provider, handle complicated authentication, or struggle with poorly documented processes anymore.

Users can simply use StackQL to query, automate and orchestrate cloud and AI resources via the familiar SQL syntax from their local machines, CI pipelines, or custom applications.

The core value proposition of StackQL is to bring efficiency and simplicity to managing cloud infrastructure.

Some of the most useful features include:

Unified Access: Query multiple cloud providers (AWS, Azure, Google Cloud) and AI services (OpenAI, Anthropic) through the same toolset and language.

SQL Querying for APIs: Use familiar SQL syntax to interact with RESTful service APIs, eliminating the need to write lengthy HTTP client code.

AI-Powered Automation: Leverage AI providers like OpenAI directly in your queries to generate text, summarize content, detect sentiment, or create code snippets.

Local and CI Integration: Execute queries locally or within continuous integration/deployment (CI/CD) pipelines, enhancing DevOps automation.


Chapter 2: How StackQL Works and Key Features

At its core, StackQL is a declarative query engine that converts SQL-like queries into API calls across multiple cloud providers and software services.

This design allows developers to abstract complex API specifications and instead interact with their infrastructure and services through a familiar and expressive query language.


StackQL is built around the concept of providers: each provider represents a cloud platform, a SaaS service, or an API ecosystem, and bundles a set of resources and actions that map to the underlying APIs.

When you execute a query, StackQL breaks it down and generates the relevant API calls to the target provider, executing them sequentially or in parallel depending on the query logic.

The core working principles include:

Resource Hierarchy and Namespace: Cloud and API resources are represented in a single hierarchical structure, making them easy to discover, access, and query.

SQL-to-API Transpilation: SQL queries are automatically translated into REST API calls, with authentication and response handling performed seamlessly and securely.

Provider Registry and OpenAPI Definitions: Provider definitions are maintained in an up-to-date OpenAPI registry, keeping the SQL schema mappings in sync as the underlying APIs change.

Multi-Cloud and Multi-Provider Access: Users can query any combination of cloud and SaaS providers such as AWS, GCP, Azure, GitHub, and AI services in a single session.
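To make the provider/resource hierarchy concrete, here is a minimal sketch of how you might explore it from the StackQL shell. The meta-queries below are taken from StackQL's discovery syntax; the exact services and resources returned depend on which providers you have pulled:

```shell
# Start an interactive StackQL shell
stackql shell

# Inside the shell (or one-off via `stackql exec "..."`),
# you can walk the hierarchy with meta-queries:
#   SHOW PROVIDERS;                  -- providers installed locally
#   SHOW SERVICES IN github;         -- services exposed by a provider
#   SHOW RESOURCES IN github.repos;  -- resources within a service
```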


Chapter 3: Installing StackQL on Your Machine

Getting started with StackQL begins with a straightforward installation process.


StackQL supports a variety of platforms, including macOS, Windows, Linux, and Docker, making it accessible for developers regardless of their preferred environment.

Installation on macOS

For macOS users, StackQL can be installed easily via Homebrew:

brew install stackql

Alternatively, you can tap the StackQL repository and then install:

brew tap stackql/tap && brew install stackql/tap/stackql

You can also download the signed PKG installer directly from their releases page and follow the installation prompts. Both ARM (Apple Silicon) and Intel architectures are supported.

Installation on Linux

There are multiple installation options for Linux, including precompiled binaries and ZIP archives for various architectures (amd64 and arm64), which you can fetch with curl:

curl -L https://bit.ly/stackql-zip -O
unzip stackql-zip

Ensure the binary is executable and accessible in your system PATH.
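For example, assuming the archive was extracted into the current directory, a typical way to do this looks like the following (the destination path is an assumption; adjust it for your system):

```shell
# Make the extracted binary executable
chmod +x ./stackql

# Move it to a directory that is already on your PATH
sudo mv ./stackql /usr/local/bin/stackql

# Confirm it now resolves from anywhere
command -v stackql
```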

Installation on Windows

On Windows, you can install StackQL via Chocolatey:

choco install stackql

Alternatively, download the MSI installer or ZIP archive from the official StackQL releases and run the installer or extract the binaries.

Verifying the Installation

After installation, you can verify StackQL is installed correctly by running:

stackql --version

This should output the StackQL version string, confirming a successful setup.


This installation process sets the foundation for interacting with StackQL’s powerful SQL interface and AI integrations, which we will configure next.


Chapter 4: Setting Up Keys and Pulling Providers

StackQL supports over 25 commonly used providers in various areas.

Those include cloud, data & analytics, identity & security, monitoring & observability, DevOps, and AI & data science.


For this tutorial, we will need to configure OpenAI and GitHub providers.

Step 1: Set Your OpenAI API Key

StackQL recognizes the OPENAI_API_KEY environment variable to authenticate API requests to OpenAI.

You need to obtain your API key from the OpenAI dashboard and set it in your local environment.

For example, on macOS or Linux:

export OPENAI_API_KEY="your_openai_api_key_here"

Or on Windows PowerShell:

setx OPENAI_API_KEY "your_openai_api_key_here"

Once set, StackQL will be able to make authenticated requests to OpenAI using the exported token.
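To avoid re-exporting the key in every new terminal session, you can persist it in your shell profile. A minimal sketch (the profile file name is an assumption; use ~/.bashrc for bash, ~/.zshrc for zsh):

```shell
# Persist the key for future sessions (zsh shown as an example)
echo 'export OPENAI_API_KEY="your_openai_api_key_here"' >> ~/.zshrc
source ~/.zshrc
```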


Step 2: Set Your GitHub API Token

StackQL uses the GITHUB_TOKEN environment variable for authenticating API requests to GitHub.

You need to generate a personal access token (PAT) from your GitHub account with the necessary permissions and set it in your local environment.

For macOS or Linux:

export GITHUB_TOKEN="your_personal_access_token_here"

Or on Windows PowerShell:

setx GITHUB_TOKEN "your_personal_access_token_here"

Once set, StackQL will be able to make authenticated requests to GitHub using the exported token.


Step 3: Pull the OpenAI Provider Registry

Start by pulling the OpenAI provider from StackQL’s registry:

REGISTRY PULL openai;

This command imports the latest available version of the OpenAI provider, which we will use to perform repository data analysis.


The OpenAI provider includes support for over 50 resources, enabling interactions with chat, completions, embeddings, fine-tuning, and many more.
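If you want to confirm exactly what your pulled version exposes, you can ask StackQL itself. This uses StackQL's built-in discovery syntax, so no OpenAI API calls are made:

```shell
# List the services exposed by the pulled OpenAI provider
stackql exec "SHOW SERVICES IN openai;"
```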

Step 4: Pull the GitHub Provider Registry

Next, pull the GitHub provider from StackQL’s registry:

REGISTRY PULL github;

This command imports the latest GitHub provider, which enables you to interact with GitHub services such as repositories, releases, issues, pull requests, and more.


The GitHub provider supports various resources and API endpoints that we will be using when fetching data from the GitHub repositories.
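Before writing queries against an unfamiliar resource, it helps to inspect its available fields with DESCRIBE. For example, for the repository details resource used later in this article:

```shell
# Show the fields available on a GitHub resource
stackql exec "DESCRIBE github.repos.details;"
```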


Chapter 5: Integrating the StackQL MCP server

We will be integrating the StackQL MCP server inside GitHub Copilot.

To achieve this, we will first need to configure both the Copilot chat and the MCP server itself.

Step 1: Open the GitHub Copilot chat

Click on the Copilot icon in the top bar of VS Code and select the “Open Chat” option.


This should open the GitHub Copilot chat window on the right side of the screen by default.


Step 2: Configure the Copilot

At the bottom of the chat window, make sure Copilot is in Agent mode.


Click on the dropdown next to the chat mode to select which of the LLM models you want to use.


Depending on whether you are on a paid plan, you can pick from dozens of models, with the latest ones providing more advanced and precise outputs.

Step 3: Integrating the StackQL MCP

Click on the configure icon at the bottom of the chat window, and it should bring up the menu with all the built-in agents that are available.


We will integrate a new server, so click on “Add MCP server” to start the setup guide and pick the Command (stdio) server type.


In the next step, provide the command you want to execute to start the server.


Give the server a name like "StackQL MCP" or any other name that helps you identify it.
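If you prefer configuring the server in a file rather than through the wizard, recent VS Code releases can also read MCP servers from a .vscode/mcp.json file. A minimal sketch is shown below; the file location and exact schema are assumptions based on current VS Code behavior and may vary between releases:

```json
{
  "servers": {
    "StackQL MCP": {
      "type": "stdio",
      "command": "stackql",
      "args": ["mcp", "--mcp.server.type=stdio"]
    }
  }
}
```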


Finally, open the tools list via the configuration settings again and notice the StackQL MCP server has been successfully installed.


Step 4: Test the configuration and MCP server

First, open the terminal and run the server via the following command:

stackql mcp --mcp.server.type=stdio

This will launch StackQL as an MCP server using the standard input/output (stdio) for communication.


This enables AI agents and assistants to interact with cloud infrastructure by sending queries and commands to StackQL through the MCP interface.

Let’s check if OpenAI and GitHub providers are configured and the server recognizes them.

In the Copilot input field, ask it to show the providers. Copilot will execute a SQL-like statement in StackQL that lists all the cloud service providers configured in the StackQL environment.


This indicates that the server is working as expected and allows you to see which providers are accessible for querying or managing through StackQL.

From here, you can start composing queries against your configured cloud and AI providers, and you can now integrate advanced AI capabilities directly into your StackQL-powered workflows.


Chapter 6: Fetch specific data via StackQL queries

In this section, we will test a few StackQL queries to see how you can fetch the data from GitHub.

Pulling data with explicit StackQL queries is very useful when you want specific information and want to be sure no extra data fields are returned, reducing API usage and improving response speed.

We will use the TypeScript repository as a test data source for this tutorial, but you can use whatever repository you want, including your own.

Also, we will maximize the Copilot window so the outputs are easier to read and fit in the viewport.
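As a side note, queries like the ones below don't strictly require Copilot: you can also run them non-interactively with `stackql exec`, which is handy in scripts and CI. A sketch (assuming GITHUB_TOKEN is exported and jq is installed for post-processing):

```shell
# Run a single StackQL query from the terminal and emit JSON
stackql exec --output json \
  "SELECT name, stargazers_count FROM github.repos.details WHERE owner = 'microsoft' AND repo = 'TypeScript';" \
  | jq '.'
```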

1. Repository Health Metrics

This query fetches key health indicators, including stars, forks, and open issues count, giving crucial insights into the repository's vitality.

These metrics help developers decide whether to adopt TypeScript for their projects and assess the responsiveness of the maintainer team.

SELECT name, full_name, description, stargazers_count, forks_count, open_issues_count,
       watchers_count, created_at, updated_at, default_branch
FROM   github.repos.details
WHERE  owner = 'microsoft' AND repo = 'TypeScript'; 

2. Recent Open Issues Analysis

This query retrieves the eight most recently opened issues in the TypeScript repository, helping developers quickly understand current problems and feature requests.

The labels field provides additional context about issue categorization, making it easy for contributors to find issues that match their expertise.

SELECT number, title, state, created_at, labels
FROM   github.issues.issues
WHERE  owner = 'microsoft' AND repo = 'TypeScript' AND state = 'open'
ORDER  BY created_at DESC LIMIT  8; 

3. Pull Request Activity Analysis

This query retrieves the most recent open pull requests, showing active development work and pending contributions.

It helps developers understand current development priorities and identify stale PRs that might need attention.

SELECT number, title, state, created_at, updated_at, user
FROM   github.pulls.pull_requests
WHERE  owner = 'microsoft' AND repo = 'TypeScript' AND state = 'open'
ORDER  BY created_at DESC LIMIT  5;

4. Repository Contributors Analysis

This query identifies the top contributors based on commit count, helping developers know whose code reviews to value most and who shapes the project's architecture.

A diverse and active contributor base indicates a healthy open-source project with a lower risk of abandonment, which is crucial for long-term sustainability.

SELECT login, contributions
FROM   github.repos.contributors
WHERE  owner = 'microsoft' AND repo = 'TypeScript'
ORDER  BY contributions DESC LIMIT  10;

5. Recent Repository Releases

This query fetches the most recent TypeScript releases, including version tags and detailed changelog information.

Tracking releases helps teams plan upgrade cycles and assess whether new versions address their specific pain points.

SELECT name, tag_name, created_at, published_at, body
FROM   github.repos.releases
WHERE  owner = 'microsoft' AND repo = 'TypeScript'
ORDER  BY published_at DESC LIMIT  7; 

6. Repo Status and Management Structure

This query returns the repository's status and project-management settings, such as its license, visibility, and which collaboration features are enabled.

It helps developers understand how the project is managed and evaluate compatibility or integration priorities when extending or maintaining it.

SELECT name, full_name, license, visibility, private, 
       has_issues, has_projects, has_wiki, has_pages, has_discussions,
       archived, disabled, created_at, updated_at
FROM github.repos.details 
WHERE owner = 'microsoft' AND repo = 'TypeScript';


Chapter 7: Use natural prompts for advanced analysis

Now, let’s test how the StackQL MCP server allows us to work with natural prompts.

This approach is very useful if you want to fetch data and perform analysis on it, since you can describe your desired result in more detail, compared to using just bare StackQL queries.

Keep in mind that this approach often requires Copilot to run multiple queries, which increases API usage and takes longer. The output, however, is richer and includes analysis.

We will be using the React and Next.js repositories as test data sources for this section, but you can use whatever repositories you want, including your own.

Also, the Copilot window will stay maximized so the outputs are easier to read and fit in the viewport.

1. Issue Resolution Time Comparison

This prompt compares the average time it takes to close issues in React vs Next.js repositories, helping teams understand which project has more efficient issue resolution workflows.

It provides insights into team responsiveness and project maintenance velocity.

Compare the issue resolution efficiency between facebook/react and vercel/next.js 
by analyzing the last 20 closed issues from each repository. 
Calculate the average time between creation and closure, 
identify the fastest and slowest resolved issues, 
and determine which repository has better issue turnaround times.
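Under the hood, the resolution-time comparison Copilot performs boils down to simple timestamp arithmetic over the created_at and closed_at fields. A minimal sketch of the same idea in the shell, with hypothetical timestamps (GNU date assumed):

```shell
# Days between an issue's creation and closure (example timestamps)
created="2024-01-10T08:00:00Z"
closed="2024-01-14T20:00:00Z"

# Convert both to Unix epochs and subtract
secs=$(( $(date -u -d "$closed" +%s) - $(date -u -d "$created" +%s) ))

# Integer division by seconds-per-day
echo "$(( secs / 86400 )) days"   # prints: 4 days
```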

2. Fork and Star Growth Momentum

This prompt reveals which project is gaining more community traction and developer interest by comparing engagement metrics.

Understanding momentum helps developers decide which ecosystem is growing faster and may have better long-term support.

Compare the community engagement metrics between facebook/react and vercel/next.js 
by examining their current star counts, fork counts, watcher counts, and open issues count. 
Calculate engagement ratios and determine which repository shows stronger community momentum.

3. Issue Label Distribution and Organization

This prompt examines how each project categorizes and organizes its issues through labeling systems, revealing the project management sophistication.

Well-organized labels indicate mature issue triage processes and make it easier for contributors to find relevant work.

Analyze the issue labeling strategies of facebook/react and vercel/next.js 
by examining the labels used across their open issues. 
Identify the most common label categories, compare how each project organizes their issues, 
and determine which repository has a more comprehensive labeling system for issue management.

4. Repository Update Recency and Activity

This prompt compares how recently each repository was updated and pushed to, indicating current maintenance activity levels.

Recent activity signals active development and suggests the project is being well-maintained and responsive to community needs.

Compare the recent activity and maintenance patterns between facebook/react and vercel/next.js
by analyzing their last push dates, last update timestamps, 
creation dates, and default branches. 
Determine which repository shows more recent development activity 
and calculate how many days since the last update for each.

5. Issue Creation Velocity Trends

This prompt analyzes the rate at which new issues are being created in both repositories to understand user engagement and potential problem discovery rates.

High issue creation velocity can indicate either active usage with bugs being found, or growing adoption with feature requests.

Examine the issue creation velocity for facebook/react and vercel/next.js 
by analyzing the 20 most recently created issues in each repository. 
Calculate the average daily issue creation rate, 
identify peak activity periods based on creation timestamps, 
and determine which project has higher user engagement through issue reporting.

6. Issue Title Sentiment and Complexity Analysis

This prompt uses AI to analyze the sentiment and complexity of issue titles, revealing whether issues tend to be bug reports, feature requests, or questions.

Understanding issue sentiment helps gauge user satisfaction and the types of challenges each project faces.

Analyze the sentiment and complexity of issue titles from facebook/react and vercel/next.js 
by fetching 20 recent open issues from each repository. 
Use AI to categorize each issue title by sentiment (positive, negative, neutral), 
identify common themes or patterns, 
and determine which repository has more complex or challenging issues based on title analysis.

7. Cross-Repository Issue Topic Clustering

This prompt identifies common topics and themes across issues in both repositories to find overlapping concerns or unique challenges.

Topic clustering reveals what problems are universal to React frameworks versus specific to each implementation.

Perform topic clustering analysis on issues from facebook/react and vercel/next.js 
by examining 25 recent issues from each repository. 
Group issues by common themes or topics mentioned in their titles, 
identify overlapping concerns between both repositories, 
and highlight unique issue categories specific to each project.


Conclusion and final thoughts

Based on the practical review in this article, it's safe to say that StackQL changes developer workflows by bringing cloud and AI interfaces together, letting developers use natural-language-driven solutions through providers like OpenAI to get their work done faster and more efficiently.

By integrating AI models directly into SQL queries, StackQL transforms routine operations into intelligent, context-aware actions that simplify day-to-day work and have the potential to accelerate it significantly.

The AI providers in StackQL fully abstract complex HTTP calls (so you don't have to write them yourself), request signing, and response parsing behind simple SQL queries, making AI and cloud operations easy to understand and accessible from one platform.

This setup lets developers seamlessly blend AI capabilities, such as resource monitoring, management, and analysis for future optimization, into their cloud and development workflows.

As AI becomes increasingly central to modern workflows, mastering StackQL's AI integrations will position you at the forefront of efficient, intelligent software engineering.

Hopefully, this article was useful to you and you learned a few new ideas that you can apply in your future projects. Thanks a lot for reading, and best of luck with building cool stuff!


Did you like the article? Here is more 👇

Join 6,000+ others to receive the best DEV resources, tools, productivity tips, and career growth advice I discover by subscribing to my newsletter!

The Developer Toolbox

Also, connect with me on Twitter, LinkedIn, and GitHub!

Writing has always been my passion, and it gives me pleasure to help and inspire people. If you want to get featured or partner up, feel free to get in touch!
