Hello everyone!
Today I'm writing about one of the hottest topics in the developer world right now: the Model Context Protocol (MCP).
What are MCPs?
MCP (Model Context Protocol) servers - often just called "MCPs" - are specialized extensions that enhance AI assistants like Amazon Q with specific capabilities. Think of MCPs as "super-powered tool belts" for AI assistants: they give these assistants specialized knowledge and abilities to perform specific tasks far better than they could with general training alone.
Think of it this way: if Amazon Q is a smartphone out of the box, MCPs are the specialized apps you install to transform it from a general-purpose device into a professional-grade tool for specific tasks. Just as you might install Photoshop for image editing or Final Cut Pro for video production, MCPs add specialized capabilities to your AI assistant.
My Experience with Amazon Q MCPs
For the past month, I've been using the Amazon Q chat service for my day-to-day DevOps and CloudOps tasks. It's been an incredibly helpful tool that streamlines my workflow. Since I'm running Arch Linux on my personal laptop, I decided to set up several specialized MCPs to enhance my productivity.
The Challenge
As a cloud engineer, I frequently encounter several time-consuming tasks:
- Creating AWS architecture diagrams has always been critical but manual and time-consuming for developers and cloud architects
- Searching through AWS documentation efficiently can be challenging
- Following Terraform best practices and security-first development workflows requires constant reference checking
The Solution: Custom MCPs
To address these challenges, I've set up the following MCPs:
- awslabs.aws-diagram-mcp-server - For generating AWS architecture diagrams automatically
- awslabs.aws-documentation-mcp-server - For searching AWS documentation using the official AWS search API, getting content recommendations, and converting documentation to markdown format
- awslabs.terraform-mcp-server - For AWS Terraform best practices, security-first development workflows, Checkov integration, AWS provider documentation, AWS-IA GenAI modules, and Terraform workflow execution
Setup Guide for Arch Linux
Prerequisites
Before beginning, make sure you have:
- Python 3.10 or newer
- python-uv (Python packaging tool)
- AWS CLI configured locally with an AWS Builder ID for accessing Amazon Q
- GraphViz installed (https://www.graphviz.org/)
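On Arch, the prerequisites above can usually be pulled in with pacman. The package names below are my best guess for the current repositories (uv may still be packaged as python-uv, and you may prefer aws-cli-v2 over aws-cli), so verify with pacman -Ss before installing:
sudo pacman -S python uv graphviz aws-cli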
Installation Steps
Install Amazon Q on Arch Linux
curl --proto '=https' --tlsv1.2 -sSf "https://desktop-release.q.us-east-1.amazonaws.com/latest/q-x86_64-linux.zip" -o "q.zip"
unzip q.zip
./q/install.sh
After installation, you should see an "amazonq" folder under your .aws folder.
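At this point it's worth confirming the CLI is on your PATH and signing in with your Builder ID. The commands below reflect the Amazon Q Developer CLI as I've used it; run q --help if your version differs:
q --version
q login
q doctor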
Create an MCP configuration file:
vim ~/.aws/amazonq/mcp.json
Add the following content to configure all three MCP servers:
{
  "mcpServers": {
    "awslabs.terraform-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.terraform-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    },
    "awslabs.aws-documentation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-documentation-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    },
    "awslabs.aws-diagram-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-diagram-mcp-server"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "autoApprove": [],
      "disabled": false
    }
  }
}
Note: If you don't need all three MCPs, you can remove the ones you don't want from your mcp.json file.
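For example, if you only want the documentation server, a trimmed mcp.json keeps the same structure with a single entry under mcpServers:
{
  "mcpServers": {
    "awslabs.aws-documentation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-documentation-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}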
Using Your MCPs
- Open Amazon Q - you should see it initializing the MCP servers
- Enter /tools in the chat to check what tools you have access to via Amazon Q MCPs
- If you encounter any issues, double-check your Python version, the uv package installation, and that GraphViz is properly installed
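In a terminal, the flow looks like this (assuming the installer placed the q binary on your PATH): start a session with q chat, then type /tools at the prompt.
q chat
/tools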
Testing Your Setup
To verify everything is working correctly, try a sample prompt like:
generate an aws architecture diagram for sample s3 bucket with cloudfront for hosting static website
This should trigger the diagram MCP to create a visualization of the S3 and CloudFront architecture.
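For context on why GraphViz is a prerequisite: the diagram server is built around the Python diagrams package, which renders its output through GraphViz. The code generated behind a prompt like the one above would look roughly like this sketch (names and labels are illustrative, not the server's exact output):
from diagrams import Diagram
from diagrams.aws.network import CloudFront
from diagrams.aws.storage import S3

# Static website: CloudFront distribution serving content from an S3 bucket
with Diagram("Static Website Hosting", show=False):
    CloudFront("CDN") >> S3("static-site-bucket")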
That's the end of my post - thank you for reading. If you found it useful, please consider sharing it with your colleagues.