Parul Malhotra
DevRel Empire: The Autonomous Notion-to-Everywhere Publishing Engine

Notion MCP Challenge Submission 🧠

This is a submission for the Notion MCP Challenge

What I Built

🚀 Omnichannel DevRel Empire — an autonomous, two-phase content publishing engine that turns your Notion workspace into a headless, zero-click publishing command center.

The idea is simple: you write rough bullet-point notes inside a Notion page and flip a status dropdown. The rest happens automatically.

Phase 1 — AI Drafting (Generate AI Content status):
The system picks up your notes, feeds them to Gemini 2.5 Flash, and generates a full-length expanded article, a Twitter/X thread, and a LinkedIn post. It then uses Imagen 3 to create a bespoke cover image, hosts it on Google Cloud Storage, and writes all the AI-generated content directly back into your Notion page as structured blocks — cover image and all.

Phase 2 — Publishing (Publish Now status):
You review the draft in Notion, tweak it if needed, then flip the status to Publish Now. The engine cross-posts the article to DEV.to and Hashnode simultaneously, then leaves a 🚀 comment on your Notion page with the live published URLs.

Infrastructure:

  • Backend: TypeScript + Express, containerised with Docker, deployed on Google Cloud Run
  • Scheduler: Google Cloud Scheduler polls the /execute endpoint every 60 seconds — fully autonomous
  • Dashboard: A Vite + TypeScript glassmorphism-style management UI to input API keys and monitor the system

The whole thing is status-driven, built around Notion as the single source of truth and control plane for a developer's content workflow.
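The status-driven core of the two phases can be sketched as a tiny pure dispatch function that a poll cycle would call per page. This is an illustrative sketch, not code from the repo — the function name and Phase type are assumptions; only the status names come from the post:

```typescript
// Hypothetical sketch of the status-driven dispatch (identifiers are
// illustrative, not from the repo): each poll maps a page's Status to a phase.
type Phase = 'draft' | 'publish' | 'skip';

function phaseForStatus(status: string): Phase {
  switch (status) {
    case 'Generate AI Content':
      return 'draft';   // Phase 1: generate article, social drafts, cover image
    case 'Publish Now':
      return 'publish'; // Phase 2: cross-post to DEV.to and Hashnode
    default:
      return 'skip';    // "Pending Review", "Published", etc. are left alone
  }
}
```

Keeping the decision logic pure like this makes the 60-second poll loop trivial to unit-test without touching Notion.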


Video Demo


Show us the code

parulmalhotraiitk / devrel

An AI-powered automation system that turns Notion into an autonomous publishing command center using the Model Context Protocol (MCP) and Google Gemini.

🚀 Omnichannel DevRel Empire

Autonomous Content Publishing & Imagery Command Center built for the Notion MCP Challenge.

DevRel Empire turns Notion into a headless, autonomous publishing engine. By leveraging the official Notion MCP Server, it transforms simple bullet points into comprehensive multi-platform articles with AI-generated cover images, social drafts (Twitter/LinkedIn), and zero-click distribution.

🧩 Notion MCP Challenge (Official Integration)

This project is built to demonstrate the power of the Model Context Protocol (MCP) in a cloud-native, serverless environment.

  • Native Tool Usage: Interacts with Notion using API-post-search for intelligent page discovery, API-patch-page for status-driven state management, and API-create-a-comment for cross-platform link reporting.
  • Zero-Trust Auth: Runs the @notionhq/notion-mcp-server package natively inside a Google Cloud Run container, dynamically injecting credentials per-request over a secure Stdio transport.
  • Protocol Depth: Implements a hybrid architecture using full JSON-RPC 2.0 to the MCP server for logic, and the standard REST SDK for granular…

Project structure:

src/
├── mcp/
│   └── notion-client.ts   # All Notion MCP interactions — the heart of the system
├── agent/
│   └── gemini.ts          # Gemini 2.5 Flash + Imagen 3 content generation
├── platforms/
│   └── publishers.ts      # DEV.to & Hashnode publishing logic
└── index.ts               # Express app + /execute endpoint

Dockerfile (Cloud Run deployment):

# Stage 1: Build
FROM node:20 AS builder
WORKDIR /usr/src/app

# Copy manifests and install ALL dependencies (including devDependencies for tsc)
COPY package*.json ./
RUN npm install

# Copy source and compile TypeScript
COPY . .
RUN npm run build

# Stage 2: Production image
FROM node:20-slim
WORKDIR /usr/src/app

# Copy manifests and install ONLY production dependencies
COPY package*.json ./
RUN npm install --omit=dev

# Copy compiled output from the build stage
COPY --from=builder /usr/src/app/dist ./dist

# Run the web service
CMD [ "node", "dist/index.js" ]


How I Used Notion MCP

Notion MCP is the nerve centre of the entire workflow — every meaningful action the system takes on Notion goes through the official @notionhq/notion-mcp-server package (v2.2.1) over a JSON-RPC 2.0 / Stdio transport.

Step 1 — Booting the MCP Server and Connecting

The backend spawns the official Notion MCP server as a child process on every workflow run and connects to it as an MCP client:

import { Client as MCPClient } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

async function connectToMCP(): Promise<MCPClient> {
  const transport = new StdioClientTransport({
    command: 'node',
    args: ['node_modules/@notionhq/notion-mcp-server/bin/cli.mjs'],
    env: {
      ...process.env,
      NOTION_TOKEN: process.env.NOTION_API_TOKEN as string,
    },
  });

  const mcpClient = new MCPClient(
    { name: 'devrel-empire-client', version: '1.0.0' },
    { capabilities: {} }
  );

  await mcpClient.connect(transport);

  const tools = await mcpClient.listTools();
  console.log(`[MCP] ${tools.tools.length} tools available: ${tools.tools.map(t => t.name).join(', ')}`);

  return mcpClient;
}

This means zero hardcoded Notion API calls in the orchestration layer — all commands flow through the MCP protocol over stdin/stdout.


Step 2 — API-post-search: Workspace-Wide Page Discovery

On every poll cycle, the agent uses this MCP tool to perform a workspace-wide search ordered by most recently edited pages:

const searchResult = await mcpClient.callTool({
  name: 'API-post-search',
  arguments: {
    filter: { value: 'page', property: 'object' },
    sort: { direction: 'descending', timestamp: 'last_edited_time' },
  },
}) as any;

if (searchResult?.isError) {
  throw new Error(searchResult?.content?.[0]?.text || 'Unknown MCP Error');
}

console.log(`[MCP] API-post-search OK — ${searchResult?.content?.[0]?.text?.length} chars returned`);
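The MCP tool returns the underlying Notion REST response serialized as text in content[0].text, so the next step is parsing it and filtering for actionable pages. A hedged sketch — the helper name and interface shapes below are assumptions, not code from the repo:

```typescript
// Hypothetical parsing step: the search tool returns the Notion REST response
// as a JSON string; parse it and keep pages whose Status matches a target.
interface SearchPage {
  id: string;
  properties?: { Status?: { status?: { name?: string } } };
}

function pagesWithStatus(rawText: string, statusName: string): SearchPage[] {
  const parsed = JSON.parse(rawText) as { results?: SearchPage[] };
  return (parsed.results ?? []).filter(
    (p) => p.properties?.Status?.status?.name === statusName
  );
}
```
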

Step 3 — API-patch-page: Status-Driven State Machine

The two-phase pipeline is orchestrated entirely by flipping Notion page statuses via MCP. The same call also attaches the AI-generated Imagen 3 cover image to the page:

async function mcpUpdatePageStatus(
  mcpClient: MCPClient,
  pageId: string,
  statusName: string,
  coverUrl?: string
): Promise<void> {
  const body: any = {
    page_id: pageId,
    properties: {
      Status: { status: { name: statusName } },
    },
  };

  if (coverUrl) {
    body.cover = { type: 'external', external: { url: coverUrl } };
  }

  const result = await mcpClient.callTool({
    name: 'API-patch-page',
    arguments: body,
  }) as any;

  if (result?.isError) throw new Error(result?.content?.[0]?.text || 'Unknown MCP Error');
  console.log(`[MCP] Status set to "${statusName}" on page ${pageId}`);
}

// Phase 1 end:  status → "Pending Review"  (+ GCS cover image URL)
// Phase 2 end:  status → "Published"

Step 4 — API-patch-block-children: Writing AI Content Back Into Notion

After Gemini 2.5 Flash generates the article and social drafts, the system appends them directly into the Notion page as structured blocks. Text is chunked at 1900 chars to respect Notion's 2000-character per-block limit:

const result = await mcpClient.callTool({
  name: 'API-patch-block-children',
  arguments: {
    block_id: pageId,
    children: [
      { object: 'block', type: 'divider', divider: {} },
      {
        object: 'block', type: 'heading_2',
        heading_2: {
          rich_text: [{ type: 'text', text: { content: '🤖 AI Generated Drafts (Review & Edit)' } }],
        },
      },
      {
        object: 'block', type: 'image',
        image: { type: 'external', external: { url: coverImageUrl } },
      },
      ...articleChunks.map(chunk => ({
        object: 'block', type: 'paragraph',
        paragraph: { rich_text: [{ type: 'text', text: { content: chunk } }] },
      })),
    ],
  },
}) as any;

if (result?.isError) throw new Error(result?.content?.[0]?.text || 'Unknown MCP Error');
console.log('[MCP] AI content blocks appended to Notion page.');
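The 1900-character chunking mentioned above can be sketched like this (the helper is illustrative; the repo's actual splitter may differ, e.g. by breaking on word boundaries):

```typescript
// Split long AI output into slices of at most 1900 characters so each
// paragraph block stays safely under Notion's 2000-character rich_text limit.
function chunkText(text: string, size = 1900): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += size) {
    chunks.push(text.slice(i, i + size));
  }
  return chunks;
}
```

The 100-character margin leaves headroom below the hard limit rather than cutting it exactly at 2000.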

Step 5 — API-create-a-comment: Closing the Loop

Once the article is live on DEV.to and Hashnode, the agent posts the published URLs back to the original Notion page as a comment:

async function mcpCreateComment(
  mcpClient: MCPClient,
  pageId: string,
  links: string[]
): Promise<void> {
  if (links.length === 0) return;

  const commentBody = '🚀 Published successfully!\n\n' + links.join('\n');

  const result = await mcpClient.callTool({
    name: 'API-create-a-comment',
    arguments: {
      parent: { page_id: pageId },
      rich_text: [{ type: 'text', text: { content: commentBody } }],
    },
  }) as any;

  if (result?.isError) throw new Error(result?.content?.[0]?.text || 'Unknown MCP Error');
  console.log('[MCP] Published links posted as Notion comment.');
}

The loop is complete: Notion triggered the workflow, Notion received the results.


What Notion MCP Unlocks

| Without MCP | With MCP |
| --- | --- |
| Scattered REST calls hardcoded per operation | Single uniform callTool() abstraction for all operations |
| Credentials embedded in every HTTP client | Zero-trust Stdio transport — token injected once at server boot |
| Tools must be manually mapped to AI agents | listTools() auto-exposes the full Notion API surface |
| Adding a new Notion operation = new REST client code | Adding a new operation = one callTool() call |
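The "one callTool() per operation" point can be made concrete: every MCP call in this post repeats the same call-then-check-isError pattern, so it factors into a single shared wrapper. A sketch, where the structural ToolClient/ToolResult types are stand-ins I'm assuming for the SDK's MCPClient, not types from the repo:

```typescript
// Sketch of a uniform wrapper: every Notion operation shares the same
// call-and-check pattern. "ToolClient" is a hypothetical structural type
// standing in for the MCP SDK's client.
interface ToolResult { isError?: boolean; content?: Array<{ text?: string }>; }
interface ToolClient {
  callTool(req: { name: string; arguments: Record<string, unknown> }): Promise<ToolResult>;
}

async function callNotionTool(
  client: ToolClient,
  name: string,
  args: Record<string, unknown>
): Promise<ToolResult> {
  const result = await client.callTool({ name, arguments: args });
  if (result?.isError) {
    throw new Error(result?.content?.[0]?.text || 'Unknown MCP Error');
  }
  return result;
}
```

With this in place, adding a new Notion operation really is one line: `callNotionTool(client, 'API-retrieve-a-page', { page_id })`.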

The protocol-first architecture means any AI agent — Claude, Gemini, GPT-4 — can be dropped into this system and interact with Notion through the exact same interface. No glue code changes needed. That's the real unlock: Notion as a universally accessible, agent-native control plane for your entire content workflow.
