Arindam Majumder (CopilotKit)

End-to-End GitHub Workflow: Agents (CrewAI), UI (CopilotKit), Automation (Composio)

TL;DR

In this article, you will learn how to build a production-ready MCP server that automatically analyses GitHub repositories, creates Notion databases, and schedules bug review meetings.

We will be covering the following:

  • Understanding the MCP Protocol and its ecosystem
  • Building a multi-agent system with CrewAI for GitHub analysis
  • Integrating the GitHub, Notion, and Google Calendar tools through Composio's platform
  • Creating an MCP server that exposes AI workflows as standardised tools
  • Connecting and testing your MCP server with CopilotKit AG-UI - the complete frontend interface for MCP server interaction and workflow execution

We will be using CrewAI for multi-agent orchestration, Composio for tool integrations, and Nebius LLMs for intelligent analysis.

You will then connect your MCP server to CopilotKit AG-UI, a powerful CopilotKit frontend for testing and interacting with MCP servers in real time.

Model Context Protocol (MCP)

MCP is an open-source, lightweight protocol that standardises how AI applications connect to external data sources, tools, and services. Unlike traditional API integrations that require custom implementations for each service, MCP provides a unified interface for AI agents to interact with any MCP-compatible tool or data source.

The MCP protocol enables three core interaction types:

  • Tools: AI agents can execute actions (like creating calendar events or fetching GitHub data)
  • Resources: AI agents can access dynamic content (like file systems or databases)
  • Prompts: AI agents can use templated interactions for consistent behaviour
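Under the hood, every MCP interaction is a JSON-RPC 2.0 message. As a rough sketch, a client invoking a tool sends a `tools/call` request like the one below (the method name comes from the MCP spec; the tool name and arguments here are illustrative):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, as MCP clients send it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Illustrative invocation: ask a server to analyse a repository
request = make_tool_call(1, "analyze_github_repository", {"repository": "vercel/next-learn"})
print(request)
```

The server replies with a matching JSON-RPC response whose `result` carries the tool output; `resources/read` and `prompts/get` follow the same request/response shape.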

Here's a preview of how we will be using MCP.


What is AG-UI

AG-UI (Agent-User Interaction Protocol) is an open, lightweight protocol developed by CopilotKit that streams a single sequence of JSON events over standard HTTP or WebSocket connections. These events (messages, tool calls, state patches, and lifecycle signals) flow seamlessly between your agent backend and frontend interface, maintaining real-time synchronisation.


While the AI ecosystem has largely focused on backend automation with limited user interaction, AG-UI bridges this gap by establishing a consistent contract between agents and interfaces, eliminating custom WebSocket formats and text parsing hacks.

With AG-UI, components become interchangeable; you can use CopilotKit's React components with any AG-UI source, switch between cloud and local models without UI changes, and orchestrate specialised agents through a single interface. This makes building collaborative AI experiences faster and more reliable.
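To make the "single sequence of JSON events" concrete, here's a small sketch of the kind of stream AG-UI emits during one agent run. The event type names follow the AG-UI protocol; the payload fields are illustrative:

```python
import json

# Illustrative lifecycle of a single agent run as an ordered event stream
events = [
    {"type": "RUN_STARTED", "threadId": "t1", "runId": "r1"},
    {"type": "TEXT_MESSAGE_START", "messageId": "m1", "role": "assistant"},
    {"type": "TEXT_MESSAGE_CONTENT", "messageId": "m1", "delta": "Analysing repository..."},
    {"type": "TEXT_MESSAGE_END", "messageId": "m1"},
    {"type": "RUN_FINISHED", "threadId": "t1", "runId": "r1"},
]

# Streamed as newline-delimited JSON over HTTP; the frontend replays them in order
stream = "\n".join(json.dumps(e) for e in events)
print(stream)
```

Because every UI concern (text, tool calls, state) arrives through this one ordered stream, the frontend never needs bespoke parsing per agent backend.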

Check out CopilotKit's comprehensive guide on how to add a frontend to any AG2 agent using AG-UI protocol for hands-on implementation examples.

Star AG-UI ⭐️

Why MCP + CrewAI + Composio + CopilotKit?

Building AI agents that interact with real-world tools has traditionally been complex and fragmented. Each integration required custom code, authentication handling, and error management. Our stack solves this by combining:

  1. MCP Protocol: Standardised AI-tool communication
  2. CrewAI: Multi-agent coordination and workflow orchestration
  3. Composio: Production-ready integrations for the three tools we use (GitHub, Notion, and Google Calendar)
  4. Nebius LLMs: Intelligent decision-making and analysis
  5. CopilotKit AG-UI: A production-ready Agent-User Interaction Protocol that enables real-time collaboration between AI agents and users through unified event streams, complete with tool orchestration, shared state management, and intuitive chat-based interactions. Read more about it here.

With this combination, we will build an AI workflow that can analyse GitHub repositories, manage Notion databases, and create calendar meetings when necessary. All of these will be done through CopilotKit AG-UI with real-time progress streaming and standardised MCP protocol communication, showcasing how CopilotKit's Agent-User Interaction Protocol makes complex AI workflows accessible through elegant user interfaces.

Building the MCP Server Backend

In this section, we will build our GitHub analysis backend, where we will have a system that will use three specialised AI agents working in sequence to analyse repositories and automate bug triage workflows.

Step 1: Project Structure and Dependencies

First, let's set up our project structure to ensure it is modular and maintainable.

Quick Start: Clone the complete implementation from our GitHub repository, which includes all the code files, configuration templates, and setup instructions.

Clone the repository and navigate to the project directory:

git clone https://github.com/Taofiqq/crewai-composio-mcp-server.git
cd crewai-composio-mcp-server

Install the dependencies using pip:

pip install -r requirements.txt

Copy the environment template and configure your API keys:

cp .env.example .env

Then, configure your .env file with the required API keys:

COMPOSIO_API_KEY=your-composio-api-key
NEBIUS_API_KEY=your-nebius-api-key
TARGET_REPOSITORY=vercel/next-learn
DEFAULT_ATTENDEE_EMAIL=your-email@gmail.com

Your final directory structure should look like this:

crewai-composio-mcp-server/
├── agents/                
│   ├── __init__.py
│   ├── github_agent.py   
│   ├── notion_agent.py    
│   └── calendar_agent.py  
├── tasks/                  
│   ├── __init__.py
│   ├── github_tasks.py    
│   ├── notion_tasks.py   
│   └── calendar_tasks.py 
├── workflow/              
│   ├── __init__.py
│   ├── github_workflow.py      
│   ├── notion_workflow.py       
│   ├── calendar_workflow.py    
│   └── full_orchestrator.py    
├── config/                 
│   ├── __init__.py
│   └── settings.py        
├── tools/                  
│   ├── __init__.py
│   └── composio_setup.py  
├── llm/                    
│   ├── __init__.py
│   └── nebius_client.py   
├── mcp_server_wrapper.py   
├── main.py                
├── requirements.txt       
├── .env.example           
├── .gitignore            
└── README.md             

Step 2: Configuration and Environment Setup

In this section, we will create a config file, settings.py, that centralises all API keys, model settings, and workflow parameters. This approach keeps credentials out of source control and makes the system easy to configure.

# config/settings.py
import os
from dotenv import load_dotenv

load_dotenv()

class Settings:
    # API Keys
    COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
    NEBIUS_API_KEY = os.getenv("NEBIUS_API_KEY")

    # Nebius LLM Configuration
    NEBIUS_MODEL = os.getenv("NEBIUS_MODEL", "meta-llama/Meta-Llama-3.1-8B-Instruct")
    NEBIUS_BASE_URL = os.getenv("NEBIUS_BASE_URL", "https://api.studio.nebius.ai/v1/")

    # Project Configuration
    TARGET_REPOSITORY = os.getenv("TARGET_REPOSITORY", "vercel/next-learn")
    DEFAULT_ATTENDEE_EMAIL = os.getenv("DEFAULT_ATTENDEE_EMAIL", "abumahfuz21@gmail.com")

    # Workflow Settings
    BUG_LABEL = "bug"
    MEETING_DURATION_MINUTES = 30
    ENTITY_ID = "default"

settings = Settings()

Create your .env file with the required API keys:

# .env
COMPOSIO_API_KEY=your_composio_api_key_here
NEBIUS_API_KEY=your_nebius_api_key_here
NEBIUS_MODEL=meta-llama/Meta-Llama-3.1-8B-Instruct
TARGET_REPOSITORY=vercel/next-learn
DEFAULT_ATTENDEE_EMAIL=your-email@gmail.com

Step 3: Composio Tool Integration Setup

Composio provides production-ready integrations for GitHub, Notion, Google Calendar, and 250+ other tools. For this use case, we will be using GitHub, Notion, and Google Calendar tools. Let's set up our tool management system:

# tools/composio_setup.py
from composio_crewai import ComposioToolSet, App, Action
from config.settings import settings

class ComposioTools:
    def __init__(self):
        self.toolset = ComposioToolSet()
        self.entity_id = settings.ENTITY_ID

    def get_github_tools(self):
        """Get GitHub-specific tools for repository analysis."""
        return self.toolset.get_tools(apps=[App.GITHUB], entity_id=self.entity_id)

    def get_notion_tools(self):
        """Get Notion-specific tools for database management."""
        return self.toolset.get_tools(apps=[App.NOTION], entity_id=self.entity_id)

    def get_calendar_tools(self):
        """Get Google Calendar tools for meeting scheduling."""
        return self.toolset.get_tools(apps=[App.GOOGLECALENDAR], entity_id=self.entity_id)

    def get_specific_actions(self, actions):
        """Get specific Composio actions for targeted functionality."""
        return self.toolset.get_tools(actions=actions, entity_id=self.entity_id)

# Initialize global tools instance
composio_tools = ComposioTools()

Run the test_composio_setup.py file in the cloned repository to verify the setup is working.


Step 4: Nebius LLM Client Setup

In this section, we will configure our LLM client for intelligent analysis and decision-making:

# llm/nebius_client.py
import json
import openai
from typing import Dict, Any
import logging
from config.settings import settings

class NebiusClient:
    """Wrapper class for Nebius LLM integration with CrewAI"""

    def __init__(self):
        self.client = openai.OpenAI(
            api_key=settings.NEBIUS_API_KEY, 
            base_url=settings.NEBIUS_BASE_URL
        )
        self.model = settings.NEBIUS_MODEL

    def analyze_labels(self, labels: list) -> Dict[str, Any]:
        """Analyze GitHub issue/PR labels to determine if action is needed"""
        labels_str = ", ".join(labels) if labels else "no labels"

        prompt = f"""
        Analyze these GitHub issue/PR labels: {labels_str}

        Determine:
        1. Is this a bug? (yes/no)
        2. Priority level (low/medium/high/critical)  
        3. Should we schedule a meeting? (yes/no)
        4. Meeting urgency (within 24h/within week/no urgency)

        Respond with ONLY a JSON object.
        """

        response = self.client.chat.completions.create(
            model=self.model,
            messages=[
                {"role": "system", "content": "You are a GitHub issue analyzer. Respond only with valid JSON."},
                {"role": "user", "content": prompt}
            ],
            max_tokens=200,
            temperature=0.1
        )

        return json.loads(response.choices[0].message.content.strip())

# Global instance
nebius_client = NebiusClient()
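Because the model is only asked, not forced, to return JSON, it's worth validating the reply's shape before acting on it. A minimal sketch (the key names here are assumptions mirroring the four questions in the prompt):

```python
import json

# Conservative fallbacks for each field the prompt asks about (assumed key names)
DEFAULTS = {"is_bug": "no", "priority": "low", "schedule_meeting": "no", "urgency": "no urgency"}

def parse_label_analysis(raw: str) -> dict:
    """Parse the LLM reply, keeping only known keys and defaulting the rest."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        data = {}  # malformed reply: fall back entirely to defaults
    return {**DEFAULTS, **{k: v for k, v in data.items() if k in DEFAULTS}}

print(parse_label_analysis('{"is_bug": "yes", "priority": "high"}'))
```

This keeps a single malformed model reply from crashing the whole workflow.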

Run the test_nebius_client.py file in the cloned repo to test this out.


Step 5: Building Specialised CrewAI Agents

Now let's create our three specialised agents. Each agent has a specific role and set of tools:

GitHub Agent - Repository Analysis Tool

The GitHub Agent connects to the provided GitHub repository and fetches all the issues and pull requests. It reads through each item, extracts important details like titles, labels, and who's assigned to work on them. Think of it as an intelligent data collector that gathers everything happening in your GitHub project and organises it for analysis.

# agents/github_agent.py

from crewai import Agent, LLM
from composio_crewai import Action
from tools.composio_setup import composio_tools
from config.settings import settings

class GitHubAgentBuilder:
    def __init__(self):
        self.tools = self._get_github_tools()

    def _get_github_tools(self):
        """Get specific GitHub tools needed for our workflow"""
        github_actions = [
            Action.GITHUB_ISSUES_LIST_FOR_REPO,
            Action.GITHUB_LIST_PULL_REQUESTS,
        ]
        return composio_tools.get_specific_actions(github_actions)

Notion Agent - Database Management Tool

The Notion Agent takes the GitHub data and creates an organised database in your Notion workspace. It automatically sets up tables with the right columns (like "Title", "Bug Status", "Assignee") and fills them with all the GitHub information.

# agents/notion_agent.py
from crewai import Agent, LLM
from composio_crewai import Action
from tools.composio_setup import composio_tools
from config.settings import settings

class NotionAgentBuilder:
    def __init__(self):
        self.tools = self._get_notion_tools()

    def _get_notion_tools(self):
        """Get specific Notion tools for database operations"""
        notion_actions = [
            Action.NOTION_SEARCH_NOTION_PAGE,
            Action.NOTION_CREATE_DATABASE,
            Action.NOTION_INSERT_ROW_DATABASE,
        ]
        return composio_tools.get_specific_actions(notion_actions)


Calendar Agent - Meeting Coordination Tool

The Calendar Agent is the smart meeting scheduler. It looks through all the GitHub data, finds items labeled as "bugs" or critical issues, and automatically creates calendar meetings to discuss them. It sends meeting invitations with all the relevant GitHub context, so your team knows exactly what needs to be discussed and when.

# agents/calendar_agent.py
from crewai import Agent, LLM
from composio_crewai import Action
from tools.composio_setup import composio_tools
from config.settings import settings

class CalendarAgentBuilder:
    def __init__(self):
        self.tools = self._get_calendar_tools()

    def _get_calendar_tools(self):
        """Get specific Google Calendar tools for meeting scheduling"""
        calendar_actions = [Action.GOOGLECALENDAR_CREATE_EVENT]
        return composio_tools.get_specific_actions(calendar_actions)
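All three agent builders share the same shape: request a narrow set of Composio actions, then assemble a CrewAI agent around them. Here's a library-free sketch of that pattern, using a plain dataclass as a stand-in for crewai.Agent (the role and goal strings are illustrative, not the repository's actual values):

```python
from dataclasses import dataclass, field

@dataclass
class AgentSpec:
    """Stand-in for crewai.Agent, for illustration only."""
    role: str
    goal: str
    tools: list = field(default_factory=list)

def build_agent(role: str, goal: str, actions: list) -> AgentSpec:
    # Mirrors the builder pattern: narrow action list in, configured agent out
    return AgentSpec(role=role, goal=goal, tools=list(actions))

calendar_agent = build_agent(
    "Calendar Coordinator",
    "Schedule bug review meetings from GitHub analysis results",
    ["GOOGLECALENDAR_CREATE_EVENT"],  # the single action the real builder requests
)
print(calendar_agent.role, calendar_agent.tools)
```

Keeping each agent's tool list narrow is deliberate: an agent that only sees one calendar action cannot accidentally call a GitHub or Notion tool.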

Step 6: Defining Agent Tasks and Workflows

Each agent that we have created needs specific tasks that define their responsibilities.

Let's create the GitHub tasks first. These tasks tell the GitHub Agent exactly what to do step-by-step. First, it fetches all the recent issues from the repository - like bug reports and feature requests. Then it takes all the pull requests - code changes that developers want to merge. Finally, it analyses everything together to identify which items are marked as bugs or need urgent attention:

# tasks/github_tasks.py
from crewai import Task
from agents.github_agent import github_agent
from config.settings import settings

class GitHubTasks:
    @staticmethod
    def fetch_issues_task() -> Task:
        return Task(
            description=f"""
            Fetch the most recent GitHub issues from the {settings.TARGET_REPOSITORY} repository.

            Requirements:
            1. Get at least 10 recent issues (open or closed)
            2. Extract: title, number, state, labels, assignees, created date, author
            3. Focus on issues that might need attention (bugs, critical issues)
            4. Return data in structured format
            """,
Enter fullscreen mode Exit fullscreen mode

Up next, we will create the Notion tasks.

These tasks guide the Notion Agent through creating and filling a database. First, it searches your Notion workspace to find a suitable page where it can create the database. Then it builds a new database called "GitHub Issues & PRs" with the right columns and structure. Finally, it takes all the GitHub data and neatly organises it into rows in the database:

# tasks/notion_tasks.py

from crewai import Task
from agents.notion_agent import notion_agent
from config.settings import settings
import logging

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class NotionTasks:
    """Container class for Notion-related tasks"""

    @staticmethod
    def search_parent_pages_task() -> Task:
        """Task to search for available parent pages in Notion workspace"""

        task = Task(
            description=f"""
            Search for available pages in the Notion workspace to find a parent page for database creation.

            Requirements:
            1. Use NOTION_SEARCH_NOTION_PAGE to search for pages in the workspace
            2. Find any available page that can serve as a parent for database creation
            3. Extract the page ID from the search results
            4. Choose the first available page from the results

            IMPORTANT: Use the exact action name NOTION_SEARCH_NOTION_PAGE

            Focus on finding any page that can be used as a parent for the GitHub database.
            """,
Enter fullscreen mode Exit fullscreen mode

Lastly, we will create the Calendar Tasks.

These tasks help the Calendar Agent schedule bug meetings automatically. First, it scans through all the GitHub data to detect which items are labeled as bugs or critical issues. Then it creates calendar meetings for each bug, scheduling them 24 hours in advance. Finally, it sends meeting invitations with all the bug details so everyone knows what will be discussed:

# tasks/calendar_tasks.py
"""
Calendar-specific tasks for bug meeting scheduling
Sends meeting invitations ONLY to the configured DEFAULT_ATTENDEE_EMAIL when bugs are detected
"""

from crewai import Task
from agents.calendar_agent import calendar_agent
from config.settings import settings
import logging

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class CalendarTasks:
    """Container class for Calendar-related bug meeting tasks"""

    @staticmethod
    def detect_bugs_task() -> Task:
        """Task to analyze GitHub data and identify bug-labeled items"""

        task = Task(
            description=f"""
            Analyze the GitHub issues and pull requests data to identify items labeled with '{settings.BUG_LABEL}' or similar critical labels.

            Requirements:
            1. Review all GitHub issues and PRs provided in the context
            2. Identify items specifically labeled with '{settings.BUG_LABEL}', 'critical', 'urgent', or 'security'
            3. Extract key information for each bug:
            ...
            """,
            agent=calendar_agent,
            # remaining description and arguments truncated (see tasks/calendar_tasks.py in the repo)
        )
        return task
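The bug-detection step in this task reduces to a label-intersection check. Below is a minimal, library-free sketch of that logic; the item dictionaries are assumed shapes based on the GitHub fields listed earlier:

```python
# Labels that should trigger a meeting, per the task description
CRITICAL_LABELS = {"bug", "critical", "urgent", "security"}

def detect_bugs(items: list[dict]) -> list[dict]:
    """Return items whose labels intersect the critical set (case-insensitive)."""
    return [
        item for item in items
        if CRITICAL_LABELS & {label.lower() for label in item.get("labels", [])}
    ]

# Assumed item shape: number, title, labels
issues = [
    {"number": 101, "title": "Crash on start", "labels": ["Bug"]},
    {"number": 102, "title": "Add docs", "labels": ["documentation"]},
]
print(detect_bugs(issues))
```

In the real workflow the LLM performs this classification with more nuance, but the deterministic filter above is the baseline behaviour the task asks for.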

Step 7: Multi-Agent Workflow Orchestration

Now let's create the full workflow orchestrator that coordinates all three agents:

# workflow/full_orchestrator.py
from crewai import Crew, Process
from agents.github_agent import github_agent
from agents.notion_agent import notion_agent  
from agents.calendar_agent import calendar_agent
from tasks.github_tasks import GitHubTasks
from tasks.notion_tasks import NotionTasks
from tasks.calendar_tasks import CalendarTasks
from config.settings import settings
import logging
from datetime import datetime

class FullGitHubNotionCalendarOrchestrator:
    """Complete GitHub + Notion + Calendar workflow orchestrator"""

    def __init__(self):
        self.github_agent = github_agent
        self.notion_agent = notion_agent
        self.calendar_agent = calendar_agent
        self.tasks = self._setup_tasks()
        self.crew = self._create_crew()

    def _setup_tasks(self):
        """Setup tasks with proper context linking for complete data flow"""

        # Create all task instances
        github_tasks = GitHubTasks()
        notion_tasks = NotionTasks()
        calendar_tasks = CalendarTasks()
        # ... task wiring with context links continues (see workflow/full_orchestrator.py in the repo)

This is where the agents and tasks get connected. It coordinates all three agents to work together in perfect sequence. This is like a project manager that assigns tasks to different team members and makes sure each person's work gets passed to the next person who needs it. The orchestrator ensures the GitHub Agent finishes collecting data before the Notion Agent starts organising it, and the Calendar Agent only schedules meetings after it has all the bug information from the previous steps.
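Stripped of CrewAI specifics, the orchestration is a sequential pipeline in which each stage consumes the previous stage's output. Here's a toy sketch of that data flow, with plain functions standing in for the real agents:

```python
def github_stage(repo: str) -> dict:
    # Stand-in for the GitHub Agent: would fetch issues/PRs via Composio
    return {"repo": repo, "items": [{"number": 1, "labels": ["bug"]}]}

def notion_stage(data: dict) -> dict:
    # Stand-in for the Notion Agent: would insert one database row per item
    return {**data, "notion_rows": len(data["items"])}

def calendar_stage(data: dict) -> dict:
    # Stand-in for the Calendar Agent: schedules a meeting per bug-labelled item
    bugs = [item for item in data["items"] if "bug" in item["labels"]]
    return {**data, "meetings_scheduled": len(bugs)}

def run_pipeline(repo: str) -> dict:
    result = github_stage(repo)        # must finish before Notion starts
    result = notion_stage(result)      # consumes GitHub data
    return calendar_stage(result)      # consumes everything upstream

print(run_pipeline("vercel/next-learn"))
```

CrewAI's `Process.sequential` with task `context` links gives you this same guarantee: downstream tasks only start once their upstream outputs exist.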

Run the test_orchestrator.py script to test the orchestrator workflow.


Step 8: Creating the MCP Server Wrapper

Now we'll create the MCP server that exposes our multi-agent workflow as standardised tools that any MCP client can use. The MCP server acts as a bridge, taking our CrewAI workflows and making them available through the Model Context Protocol.

Main CLI Entry Point:

# main.py
import argparse
import sys
from datetime import datetime
from workflow.full_orchestrator import run_complete_workflow
from config.settings import settings
import logging

def main():
    parser = argparse.ArgumentParser(description="Analyze GitHub repository and create Notion database")
    parser.add_argument("--repo", type=str, help='GitHub repository in format "owner/repo"', default=None)
    args = parser.parse_args()
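Since --repo must be in "owner/repo" format, it's worth failing fast on malformed input before kicking off a long workflow. This small validator is an illustrative addition, not part of the repository's main.py:

```python
import re

# Accepts GitHub-style "owner/repo" slugs (letters, digits, dot, dash, underscore)
REPO_PATTERN = re.compile(r"^[A-Za-z0-9_.-]+/[A-Za-z0-9_.-]+$")

def validate_repo(repo: str) -> str:
    """Raise early if the repository isn't in 'owner/repo' form."""
    if not REPO_PATTERN.match(repo):
        raise ValueError(f"Expected 'owner/repo', got: {repo!r}")
    return repo

print(validate_repo("vercel/next-learn"))
```

Wiring this into argparse is a one-liner: pass `type=validate_repo` to `add_argument("--repo", ...)` so bad input fails at parse time.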

MCP Server Implementation

Now create the MCP server wrapper that exposes our workflows as standardised tools:

# mcp_server_wrapper.py
"""
MCP Server Wrapper for CrewAI + Composio + Nebius Backend
Exposes GitHub analysis workflows as MCP tools via SSE transport
"""

import asyncio
import logging
from typing import Any, Dict, List
from datetime import datetime
import json

# MCP imports - using FastMCP for easier setup
from mcp.server.fastmcp import FastMCP
from mcp.types import Tool, TextContent

# Your existing workflow imports
from workflow.full_orchestrator import run_complete_workflow
from workflow.github_workflow import run_github_workflow
from workflow.notion_workflow import run_notion_workflow_with_data
from workflow.calendar_workflow import run_calendar_bug_workflow_with_data
from config.settings import settings

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("github-analysis-mcp-server")

# Create FastMCP Server
mcp = FastMCP("github-analysis-backend")

@mcp.tool()
async def analyze_github_repository(
    repository: str, meeting_recipient: str = settings.DEFAULT_ATTENDEE_EMAIL
) -> str:
    """
    Complete GitHub repository analysis with Notion database creation and bug meeting scheduling.
Enter fullscreen mode Exit fullscreen mode

We can test this by running python mcp_server_wrapper.py. This starts the server on port 8000.


You can also test the MCP server by running npx @modelcontextprotocol/inspector to see all the available tools and workflows:

(venv) npx @modelcontextprotocol/inspector
Starting MCP inspector...
⚙️ Proxy server listening on port 6277
🔍 MCP Inspector is up and running at http://127.0.0.1:6274 🚀
New SSE connection. NOTE: The sse transport is deprecated and has been replaced by streamable-http
Query parameters: [Object: null prototype] {
  url: 'http://localhost:8000/sse',
  transportType: 'sse'
}
SSE transport: url=http://localhost:8000/sse, headers=Accept
Connected to SSE transport
Connected MCP client to backing server transport
Created client transport
Created server transport
Set up MCP proxy


Open http://127.0.0.1:6274 in your browser, and you should be able to explore and test the tools and workflows.

Integrating with CopilotKit AG-UI

AG-UI is an open protocol that enables real-time, collaborative interactions between AI agents and users through a unified event stream over standard HTTP. In our project, AG-UI allows seamless collaboration between users and our MCP-powered GitHub analysis agent. Instead of just triggering automated repository analysis, users can watch progress in real time as the agent fetches GitHub data, creates Notion databases, and schedules meetings.

Step 1: Creating the AG-UI Application

We'll use CopilotKit's AG-UI CLI to bootstrap our GitHub analysis frontend:

npx create-ag-ui-app my-github-analyzer

This starts the AG-UI interactive setup process.


Configuration Prompts

During the setup, you'll see several configuration prompts:

1. Select Client Framework

When prompted "What client do you want to use?", select:

> CopilotKit/Next.js

2. Choose AI Integration Method

When asked "How will you be interacting with AI?", select:


> MCP

This is because our backend uses Model Context Protocol.

3. Deployment Options

When prompted "Deploy with Copilot Cloud?", choose:


> No

We'll use local API keys for development.

4. API Key Configuration

The CLI will prompt for your OpenAI API key:

You can either:

  • Enter your API key now, or
  • Leave empty and configure it later in .env.local


Then it will prompt you with a couple more questions, and that's it: you're up and ready.


Step 2: Starting Your MCP Server Backend

In a separate terminal, start your MCP server backend with this command:


python mcp_server_wrapper.py

Your MCP server should now be running at http://localhost:8000/sse and ready to accept connections.

Step 3: Configuring the MCP Server in CopilotKit UI

Start your Frontend server using the command:

npm run dev

This starts the frontend at http://localhost:3000.


Now add the MCP server URL, http://localhost:8000/sse, in the interface.

Step 4: Exploring Available Tools

Once you have added the MCP server, CopilotKit AG-UI will list the tools it exposes, and you can test the complete GitHub analysis pipeline. Try these commands to see your tools in action:

  • Check workflow status - Verify server health and configuration
  • Analyze facebook/react - Run the complete analysis workflow (2-5 minutes)
  • Get GitHub data for microsoft/vscode - Fetch repository data only
  • Schedule bug meetings for the analysis results - Create calendar events

The AG-UI interface will show real-time progress as your MCP tools execute, with visual indicators for each phase: GitHub data collection, Notion database creation, and meeting scheduling.

Conclusion

In this guide, we've built a complete GitHub analysis pipeline that combines MCP servers, CrewAI multi-agent workflows, and Composio's tool integrations. The system automatically analyses repositories, creates Notion databases, and schedules bug review meetings, all accessible through CopilotKit AG-UI's frontend interface.

The best part of this is that everything runs through standard protocols, so you can easily adapt this system to work with different tools or extend it for your specific needs. Clone the repository, set up your API keys, and start experimenting.

Hopefully, this guide makes it easier for you to integrate AI-powered agents into your existing application.

Follow CopilotKit on Twitter and say hi, and if you'd like to build something cool, join the Discord community.
