Connecting Claude Tools to Hashnode using MCP Server

Introduction

In the rapidly evolving landscape of AI tools and integrations, the ability to extend AI capabilities through custom interfaces has become increasingly valuable. Today, I'm excited to share a project I've been working on: the Hashnode MCP Server. This tool bridges the gap between AI assistants and the Hashnode blogging platform, enabling seamless content creation, management, and retrieval directly through AI interactions.

In this article, I'll walk you through what the Model Context Protocol (MCP) is, how my Hashnode MCP server works, and how you can set it up to enhance your own content workflow.

Demo First? (😁)

  • Create Article

  • Update Article

What is the Model Context Protocol (MCP)?

The Model Context Protocol (MCP) is a framework that allows AI models to interact with external tools and data sources. It provides a standardized way for AI assistants to access additional capabilities beyond their built-in functions.

MCP servers act as intermediaries between AI models and external services, exposing a set of tools and resources that the AI can use to perform specific tasks. This extends what AI assistants can do without requiring them to have direct API access to every possible service.


Introducing the Hashnode MCP Server

The Hashnode MCP Server is a Python-based implementation that connects AI assistants to the Hashnode API. It allows AI models to perform various operations on Hashnode blogs, including:

  • Creating and publishing new articles

  • Updating existing articles

  • Searching for articles by keywords

  • Retrieving article details

  • Getting user information

  • Fetching the latest articles from a publication


This means that with the Hashnode MCP Server, you can ask an AI assistant to draft a blog post, publish it to your Hashnode blog, search for related content, or update existing articles—all without leaving your conversation with the AI.

How the Hashnode MCP Server Works

At its core, the Hashnode MCP Server is a bridge between AI assistants and the Hashnode GraphQL API. Here's how it works:

  1. Connection: The MCP server establishes a connection with both the AI assistant and the Hashnode API.

  2. Tool Exposure: It exposes a set of tools that represent different Hashnode operations.

  3. Request Handling: When the AI assistant wants to perform an action, it sends a request to the MCP server.

  4. API Interaction: The server translates this request into the appropriate GraphQL query or mutation for the Hashnode API.

  5. Response Formatting: After receiving a response from Hashnode, the server formats it in a way that's easy for the AI to understand and present to the user.

The server is built using the FastMCP framework, which simplifies the process of creating MCP servers by handling the communication protocol details.
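To make steps 2–5 concrete, here's a hedged sketch of what a single tool might look like: a FastMCP tool that translates a simple request into a Hashnode GraphQL query with httpx and formats the response for the AI. The tool name, query fields, and formatting are illustrative, not copied from the repository:

    # Illustrative sketch: one MCP tool backed by the Hashnode GraphQL API.
    # Assumes the official MCP Python SDK (FastMCP) and httpx; names are my own.
    import os

    import httpx
    from mcp.server.fastmcp import FastMCP

    HASHNODE_API_URL = os.getenv("HASHNODE_API_URL", "https://gql.hashnode.com")
    TOKEN = os.getenv("HASHNODE_PERSONAL_ACCESS_TOKEN", "")

    mcp = FastMCP("hashnode")

    @mcp.tool()
    async def get_user_info(username: str) -> str:
        """Fetch basic profile information for a Hashnode user."""
        query = """
        query GetUser($username: String!) {
          user(username: $username) { name username }
        }
        """
        async with httpx.AsyncClient(timeout=30.0) as client:
            response = await client.post(
                HASHNODE_API_URL,
                json={"query": query, "variables": {"username": username}},
                headers={"Authorization": TOKEN},
            )
            response.raise_for_status()
            data = response.json()
        # Format the raw GraphQL payload into something the AI can read easily.
        user = data.get("data", {}).get("user") or {}
        return f"{user.get('name', 'Unknown')} (@{user.get('username', '?')})"

The @mcp.tool() decorator is what exposes the function to the assistant; everything else is ordinary async Python talking to a GraphQL endpoint.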

Setting Up the Hashnode MCP Server

Prerequisites

  • Python 3.8 or higher

  • A Hashnode account with a personal access token

  • Basic familiarity with command-line operations

Installation Steps

  1. Clone the repository:

    git clone https://github.com/sbmagar13/hashnode-mcp-server.git
    cd hashnode-mcp-server
    
  2. Create a virtual environment:

    python -m venv .venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    
  3. Install dependencies:

    pip install -r requirements.txt
    
  4. Set up environment variables: Create a .env file in the project root with the following content:

    HASHNODE_API_URL=https://gql.hashnode.com
    HASHNODE_PERSONAL_ACCESS_TOKEN=your_personal_access_token
    

    Replace your_personal_access_token with your actual Hashnode personal access token, which you can generate in your Hashnode account settings. (A short sketch after these installation steps shows how the server reads these values at startup.)

  5. Run the server:

    You have two options for running the server:

    Option 1: Run the server manually

    python run_server.py
    

    Or run the root mcp_server.py file directly:

    python mcp_server.py
    

    The server will start and listen for connections from AI assistants. By default, it runs on localhost:8000.

    Option 2: Let the MCP integration handle it automatically (I’ll be using this)

    When properly configured in Claude Desktop or Cline VSCode extension, the MCP integration will automatically start and manage the server process for you.
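As noted in step 4, the values in your .env file are read when the server starts. Here's a minimal sketch of how that loading typically looks, assuming python-dotenv is used (the repository's actual startup code may differ slightly):

    # Sketch: reading the Hashnode credentials from .env (assumes python-dotenv)
    import os

    from dotenv import load_dotenv

    load_dotenv()  # picks up the .env file in the project root

    HASHNODE_API_URL = os.getenv("HASHNODE_API_URL", "https://gql.hashnode.com")
    HASHNODE_PERSONAL_ACCESS_TOKEN = os.getenv("HASHNODE_PERSONAL_ACCESS_TOKEN")

    if not HASHNODE_PERSONAL_ACCESS_TOKEN:
        raise RuntimeError("HASHNODE_PERSONAL_ACCESS_TOKEN is not set")

If the server complains about a missing token, this is usually the first place to look.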

Important Note on File Structure

When configuring your MCP server in Claude Desktop or Cline VSCode extension, you should use the root mcp_server.py file directly rather than the files in the hashnode_mcp directory. The hashnode_mcp directory is primarily for packaging purposes.

For example, in your configuration, point to:

/path/to/your/hashnode-mcp-server/mcp_server.py

And not:

/path/to/your/hashnode-mcp-server/hashnode_mcp/mcp_server.py

This ensures you're using the most up-to-date version of the server with all features enabled. The root mcp_server.py file contains all the necessary functionality and doesn't require the package structure to operate correctly.

Using the Hashnode MCP Server with AI Assistants

Once your server is configured, you can connect compatible AI assistants to it. Unlike traditional API integrations that use URLs, MCP servers are typically configured directly in the AI assistant's configuration files, as we'll see in the next section.

The connection process generally involves:

  1. Setting up the configuration file for your AI assistant (Claude Desktop or Cline VSCode extension)

  2. Specifying the path to your Python interpreter and the MCP server script

  3. Providing necessary environment variables like your Hashnode personal access token

After configuring the connection, you can start giving the AI commands related to your Hashnode blog. For example:

  • "Create a new article about Python programming tips"

  • "Update my article with ID 12345 to fix the code examples"

  • "Get the latest articles from my blog"

  • "Search for articles about machine learning"

The AI will use the MCP server to execute these commands and return the results.

Configuring MCP on Claude Desktop and Cline VSCode Extension

To use your Hashnode MCP Server with Claude AI, you'll need to configure it in either Claude Desktop or the Cline VSCode extension. Here's how to set it up in both environments:

Configuring MCP on Cline VSCode Extension

  1. Open VS Code with the Cline extension installed.

  2. Navigate to the Cline MCP settings file located at:

* Windows: `%APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`

* macOS: `~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`

* Linux: Claude Desktop is not available for Linux at the time of writing, so the Cline VS Code extension is the way to go on Linux.

  3. Add your Hashnode MCP server configuration to the file:

    {
      "mcpServers": {
        "hashnode": {
          "command": "/path/to/your/venv/bin/python",
          "args": [
            "/path/to/your/hashnode-mcp-server/mcp_server.py"  // Use the root mcp_server.py file
          ],
          "env": {
            "HASHNODE_PERSONAL_ACCESS_TOKEN": "your-personal-access-token"
          }
        }
      }
    }
    

    Note that the configuration points to the root mcp_server.py file, not the one in the hashnode_mcp directory.

  4. Replace the paths and token with your actual values. For example:

    {
      "mcpServers": {
        "hashnode": {
          "command": "/Users/sagar/my_personal/hashnode-mcp-server/.venv/bin/python",
          "args": [
            "/Users/sagar/my_personal/hashnode-mcp-server/mcp_server.py"
          ],
          "env": {
            "HASHNODE_PERSONAL_ACCESS_TOKEN": "your-personal-access-token"
          }
        }
      }
    }
    
  5. Save the file and restart VS Code or reload the window.

  6. Open a new Cline conversation and test the connection by asking it to interact with your Hashnode blog.

Configuring MCP on Claude Desktop

  1. Open Claude Desktop and navigate to the configuration file:
* Windows: `%APPDATA%\Claude\claude_desktop_config.json`

* macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

* Linux: Claude Desktop is not available for Linux at the time of writing, so use the Cline VSCode extension instead.

  2. Add your Hashnode MCP server configuration to the file, using the same format as for the Cline VSCode extension. Make sure to point to the root mcp_server.py file:

    {
      "mcpServers": {
        "hashnode": {
          "command": "/path/to/your/venv/bin/python",
          "args": [
            "/path/to/your/hashnode-mcp-server/mcp_server.py"  // Use the root mcp_server.py file
          ],
          "env": {
            "HASHNODE_PERSONAL_ACCESS_TOKEN": "your-personal-access-token"
          }
        }
      }
    }
    
  3. Save the file and restart Claude Desktop.

  4. Test the connection by asking Claude to perform a simple operation like "Get the latest articles from my Hashnode blog."

Troubleshooting Connection Issues

If you encounter connection issues:

  1. Verify the server is running by checking the terminal where you started the MCP server (if you let the MCP integration start it automatically, check your assistant's MCP server status instead).

  2. Check the paths in your configuration are correct and point to the right Python interpreter and script.

  3. Ensure your environment variables are properly set, especially the Hashnode personal access token.

  4. Check the server logs for any error messages.

  5. Try restarting both the MCP server and the Claude application.
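For step 3 in particular, a quick standalone check of your token can save a lot of back and forth. The snippet below sends a minimal me query to the Hashnode GraphQL endpoint; it assumes the public schema exposes me { username }, so adjust it if the schema has changed:

    # Quick sanity check: does the personal access token work against gql.hashnode.com?
    import os

    import httpx

    token = os.getenv("HASHNODE_PERSONAL_ACCESS_TOKEN", "")
    response = httpx.post(
        "https://gql.hashnode.com",
        json={"query": "query { me { username } }"},
        headers={"Authorization": token},
        timeout=15.0,
    )
    print(response.status_code)
    print(response.json())  # expect {"data": {"me": {"username": "..."}}} on success

If this prints an authentication error, regenerate the token in your Hashnode account settings and update your .env file or MCP configuration.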

Example: Creating a New Article

Let's walk through a practical example of using the Hashnode MCP Server to create and publish a new article:

  1. Start the server as described above.

  2. Connect your AI assistant to the MCP server.

  3. Ask the AI to create an article:

    Create a new article titled "Getting Started with Python" with the following content:
    
    # Getting Started with Python
    
    Python is one of the most popular programming languages today. In this article, we'll explore the basics of Python and how to get started.
    
    ## Installation
    
    First, you need to install Python...
    
    [rest of the article content]
    
    Tags: python, programming, beginners
    
  4. The AI will use the MCP server to:

    • Format the request for the Hashnode API

    • Send the creation request

    • Return the result, including the article ID and URL

  5. You can then ask the AI to:

    • Publish the article immediately

    • Save it as a draft

    • Make further edits
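For the curious, the creation request in step 4 ultimately becomes a GraphQL mutation sent to gql.hashnode.com. The sketch below shows roughly what that looks like; the publishPost input fields and the placeholder publication ID reflect my reading of Hashnode's public schema and may differ from what the server actually sends:

    # Rough sketch of the publish mutation behind article creation (field names may need adjusting).
    import os

    import httpx

    PUBLISH_POST_MUTATION = """
    mutation PublishPost($input: PublishPostInput!) {
      publishPost(input: $input) {
        post { id url }
      }
    }
    """

    variables = {
        "input": {
            "title": "Getting Started with Python",
            "contentMarkdown": "# Getting Started with Python\n\nPython is one of ...",
            "publicationId": "your-publication-id",  # hypothetical placeholder
            "tags": [{"slug": "python", "name": "python"}],
        }
    }

    response = httpx.post(
        "https://gql.hashnode.com",
        json={"query": PUBLISH_POST_MUTATION, "variables": variables},
        headers={"Authorization": os.getenv("HASHNODE_PERSONAL_ACCESS_TOKEN", "")},
        timeout=60.0,
    )
    print(response.json())  # on success, data.publishPost.post holds the new article's id and url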




Advanced Features

Timeout Handling

The Hashnode MCP Server includes robust timeout handling for API requests. This is particularly important for operations like article creation and updates, which might take longer to process. If a request times out, the server provides helpful error messages and suggestions.

Error Management

The server includes comprehensive error handling to provide clear feedback when issues occur. This makes troubleshooting easier and improves the user experience.

Pagination Support

For operations that might return large amounts of data, like searching for articles, the server supports pagination to manage the response size and improve performance.
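To make the timeout and error-handling points concrete, here's a hedged sketch of how a Hashnode API call might be wrapped with httpx. The timeout value, error messages, and function name are illustrative rather than taken from the repository:

    # Sketch: wrapping a Hashnode GraphQL call with a timeout and friendly errors (httpx).
    import httpx

    HASHNODE_API_URL = "https://gql.hashnode.com"

    async def execute_graphql(query: str, variables: dict, token: str) -> dict:
        """Send a GraphQL request and return either the data or a readable error."""
        try:
            async with httpx.AsyncClient(timeout=httpx.Timeout(30.0)) as client:
                response = await client.post(
                    HASHNODE_API_URL,
                    json={"query": query, "variables": variables},
                    headers={"Authorization": token},
                )
                response.raise_for_status()
                payload = response.json()
        except httpx.TimeoutException:
            return {"error": "The request to Hashnode timed out; long operations "
                             "like publishing may simply need a retry."}
        except httpx.HTTPStatusError as exc:
            return {"error": f"Hashnode returned HTTP {exc.response.status_code}."}

        # GraphQL often reports failures in the body with a 200 status code.
        if payload.get("errors"):
            return {"error": payload["errors"][0].get("message", "Unknown GraphQL error")}
        return payload["data"]

Pagination works in a similar spirit: tools that can return many results accept limits or cursors so a single response stays manageable.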

Potential Use Cases

The Hashnode MCP Server opens up numerous possibilities for content creators:

  1. Automated Content Creation: Generate draft articles based on outlines or topics.

  2. Content Management: Update, organize, and manage your blog without leaving your AI assistant.

  3. Research Assistance: Search your existing content to find relevant articles or avoid duplication.

  4. Batch Operations: Perform bulk updates or content audits across your blog.

  5. Integration with Workflows: Incorporate blog publishing into broader AI-assisted workflows.

Technical Architecture

The project is organized with a clean, modular structure:

  • mcp_server.py: Root server implementation that can be run directly

  • hashnode_mcp/: Core package containing the modular functionality

    • mcp_server.py: Package version of the server implementation
    • utils.py: Utility functions for formatting responses and GraphQL queries
  • examples/: Example usage scripts

  • tests/: Test suite for verifying functionality

  • run_server.py: Entry point for running the server using the package version

While the project includes a package structure (hashnode_mcp/) for organization and potential distribution, users can simply run the root mcp_server.py file directly without needing to use the package. This provides flexibility in how you choose to deploy the server.

The server uses asynchronous programming with Python's asyncio and httpx libraries for efficient API communication. GraphQL queries and mutations are defined as constants, making them easy to maintain and update.
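As an illustration of that "queries as constants" pattern (not the repository's literal contents), a query constant and a small formatting helper might look like this:

    # Illustrative pattern: a GraphQL document as a module-level constant plus a formatter.
    GET_LATEST_POSTS_QUERY = """
    query Publication($host: String!, $first: Int!) {
      publication(host: $host) {
        posts(first: $first) {
          edges { node { title url } }
        }
      }
    }
    """

    def format_posts(payload: dict) -> str:
        """Turn the raw GraphQL payload into a short, readable summary for the AI."""
        edges = payload.get("publication", {}).get("posts", {}).get("edges", [])
        lines = [f"- {edge['node']['title']}: {edge['node']['url']}" for edge in edges]
        return "\n".join(lines) if lines else "No posts found."

Keeping the documents in one place means a schema change only touches the constants, while the tools themselves stay small.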

Future Enhancements

There are several exciting possibilities for future development:

  1. Additional Hashnode Features: Support for more Hashnode API capabilities like managing comments, series, and newsletters.

  2. Analytics Integration: Retrieving and analyzing blog performance metrics.

  3. Content Optimization: AI-assisted SEO optimization for articles.

  4. Multi-User Support: Enhanced capabilities for team publications.

  5. Webhook Support: Responding to events from your Hashnode blog.

Conclusion

The Hashnode MCP Server represents a powerful bridge between AI assistants and content creation on Hashnode. By enabling AI models to interact directly with your blog, it streamlines the writing and publishing process, making content creation more efficient and accessible.

Whether you're a solo blogger looking to optimize your workflow or part of a content team seeking to scale your production, this tool offers valuable capabilities for integrating AI into your content strategy.

I'm excited to see how others in the community will use and extend this project. The code is open-source and available on GitHub, so feel free to fork it, contribute, or adapt it to your specific needs.

Resources

  • Source code on GitHub: https://github.com/sbmagar13/hashnode-mcp-server

  • Hashnode GraphQL API endpoint: https://gql.hashnode.com

Thanks!


Have you integrated AI tools into your content workflow? Share your experiences in the comments below!
