Misha Zaslavskiy

Using LLMs for EDA license analysis

Why?

If your company has an EDA license monitoring system, you have a lot of useful data available for analysis.

Now what do we do with this "big data"? There will most likely be time gaps caused by infrastructure failures and maintenance, as well as changes in suppliers and available licenses over time, all of which complicate manual analysis.

With the advancement of AI/ML tools, you can now connect your Prometheus database to an AI assistant via the Model Context Protocol (MCP). This allows the LLM to query your database and automatically load relevant, up-to-date information into its context window, so it can accurately answer questions about usage, trends, or whatever else you are interested in.
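To make the flow concrete, here is a minimal sketch in Python (using the requests library) of the kind of call the MCP server issues against the Prometheus HTTP API on the LLM's behalf. The flexlm_feature_used_licenses metric name is only a placeholder for illustration; substitute whatever metric your license exporter actually publishes.

import requests

# Hypothetical instant query against the Prometheus HTTP API. This is
# essentially the kind of request the MCP server executes for the LLM.
# "flexlm_feature_used_licenses" is a placeholder metric name.
PROMETHEUS_URL = "http://prometheus:9090"

resp = requests.get(
    f"{PROMETHEUS_URL}/api/v1/query",
    params={"query": "sum by (feature) (flexlm_feature_used_licenses)"},
    timeout=10,
)
resp.raise_for_status()

for result in resp.json()["data"]["result"]:
    feature = result["metric"].get("feature", "unknown")
    _timestamp, value = result["value"]
    print(f"{feature}: {value} licenses in use")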

How?

Here is an example of how to connect a Prometheus DB to the Cursor AI code editor.

Go to Cursor Settings -> Tools & Integration -> New MCP Server and add this code to the .cursor/mcp.json file:

{
  "mcpServers": {
    "prometheus": {
      "command": "docker",
      "args": [
        "run",
        "--network",
        "cad",
        "--name",
        "prometheus-mcp-server",  
        "-p",
        "8000:8000",
        "-i",
        "--rm",
        "-e",
        "PROMETHEUS_URL",
        "ghcr.io/pab1it0/prometheus-mcp-server:latest"
      ],
      "env": {
        "PROMETHEUS_URL": "http://prometheus:9090"
      }
    }
  }
}

Note that this example uses a Docker deployment of the MCP server. You don't need to run this container manually; Cursor will start it automatically. For that option you'll obviously need access to Docker, and the Prometheus DB should run on the same cad Docker network. If your setup is different, you'll need to modify this JSON file accordingly; see the source reference at the bottom of this post.

If your MCP tool shows a green status and 5 enabled tools, as in this snapshot, you are good to go.

Otherwise, check the Docker container status and logs. You could also ask Cursor itself to assist with debugging; it gave me good hints after I renamed the Docker network and accidentally broke my setup. If you decide to go this route, make sure to add the mcp.json file to the context and be specific that your issue is with the MCP tools.
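Another quick sanity check before digging into the MCP wiring itself is to confirm that Prometheus is reachable and healthy at all. Here is a minimal Python sketch, assuming you run it on the host where port 9090 is exposed (inside the cad network the URL would be http://prometheus:9090 instead):

import requests

# Confirm Prometheus itself is up before suspecting the MCP server.
# From the host the DB is typically exposed at localhost:9090; the name
# http://prometheus:9090 only resolves inside the "cad" Docker network.
PROMETHEUS_URL = "http://localhost:9090"

health = requests.get(f"{PROMETHEUS_URL}/-/healthy", timeout=5)
print("health check:", health.status_code, health.text.strip())

# List a few metric names to confirm the license exporter is being scraped.
names = requests.get(f"{PROMETHEUS_URL}/api/v1/label/__name__/values", timeout=5)
print("sample metrics:", names.json()["data"][:10])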

Now that our MCP tool is up, let's test it. Here is an example Cursor chat prompt:

query available licenses through MCP

And the result:

This table gives an accurate summary of all licenses available in my system right now, their current usage, and an overview of the infrastructure. Not a bad summary for a start, right? Now you could ask about historical usage, deep-dive into specific license utilization, find out who the most active users are, or search for anomalies and outliers. And all of this can now be done in natural language instead of learning PromQL, Prometheus's specialized query language. The LLM also shows in the chat window the PromQL queries it used to obtain the necessary information, which can help speed up writing code for new dashboards.
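For example, a PromQL query the assistant surfaces for a historical question can be replayed directly against the range-query API while prototyping a dashboard. Here is a sketch in Python, with the same placeholder metric as above and a hypothetical feature label:

import datetime as dt
import requests

# Hypothetical range query for "show me last week's peak usage of feature X".
# Metric and label names are placeholders; reuse the PromQL the LLM surfaced.
PROMETHEUS_URL = "http://prometheus:9090"
now = dt.datetime.now(dt.timezone.utc)

resp = requests.get(
    f"{PROMETHEUS_URL}/api/v1/query_range",
    params={
        "query": 'max_over_time(flexlm_feature_used_licenses{feature="some_tool"}[1h])',
        "start": (now - dt.timedelta(days=7)).timestamp(),
        "end": now.timestamp(),
        "step": "1h",
    },
    timeout=30,
)
resp.raise_for_status()

for series in resp.json()["data"]["result"]:
    for timestamp, value in series["values"]:
        print(dt.datetime.fromtimestamp(timestamp, dt.timezone.utc), value)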

Please keep in mind that LLM output may not always be accurate, so it's a good idea to double-check findings independently before acting on them.

Sources

What Is the Model Context Protocol (MCP) and How It Works

Prometheus MCP server

Top comments (1)

Charles Brown

I used to think LLMs were too hand-wavy for license analytics, but your MCP + Prometheus example showed a clear, workable path that even surfaces the PromQL it uses. It shifted my view: LLMs can be a practical front end for EDA usage insights, with the right wiring and a healthy double-check.