Stop writing LinkedIn posts manually: Build an Auto-Repurposing Agent with n8n, Ghost & OpenAI

The Struggle: "Distribution is King"

We all know the saying: "Content is King, but Distribution is Queen."

You spend 5 hours writing a deep-dive technical article on your Ghost blog. You hit publish. And then... you have to spend another hour staring at a blank screen, trying to figure out how to summarize it for LinkedIn without sounding like a robot.

I got tired of this. I wanted a system that:

  1. Watches my Ghost blog for new posts.
  2. Reads the article content and parses it into clean text.
  3. Writes a viral-style LinkedIn post (Hook + Value + CTA).
  4. Saves it to Google Sheets for my final review.

So, I built it with n8n. Here is the breakdown of the logic.

The Architecture

Full Workflow Graph Overview

The workflow follows a linear logic:
Extract -> Clean -> Generate -> Record.

You can interact with the full node graph and download the JSON here:
👉 View this Workflow on n8nworkflows.world

Step 1: Extracting from Ghost (Headless CMS)

First, we use the Ghost Node to fetch all posts.

  • Operation: Get All
  • Limit: I set it to fetch the latest few posts for batch processing, but you can swap this step for a trigger if you want real-time automation.
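
If you prefer not to use the dedicated Ghost node, the same step can be done against the Ghost Content API from a Code node. This is only a sketch, not what my workflow ships with: GHOST_URL and CONTENT_API_KEY are placeholders, and it assumes your Code node runtime exposes fetch (Node 18+).

// Hypothetical alternative to the Ghost node: call the Ghost Content API directly.
// GHOST_URL and CONTENT_API_KEY are placeholders (the key comes from a Ghost
// "Custom Integration" Content API key).
const GHOST_URL = 'https://your-blog.example.com';
const CONTENT_API_KEY = 'your-content-api-key';

const response = await fetch(
  `${GHOST_URL}/ghost/api/content/posts/?key=${CONTENT_API_KEY}&limit=5`,
  { headers: { 'Accept-Version': 'v5.0' } }
);
const { posts } = await response.json();

// Emit one n8n item per post so downstream nodes run once per article.
return posts.map((post) => ({
  json: { title: post.title, url: post.url, content: post.html },
}));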

Step 2: The "Secret Sauce" - Cleaning the HTML

This is the most critical step that many beginners miss.

Ghost (and most CMSs) returns the article content as HTML. If you feed raw HTML (with all the <div>, <p>, <img> tags) into OpenAI, two things happen:

  1. You waste tokens (money).
  2. The AI gets confused by the formatting code.

I used a simple Code Node (JavaScript) to strip the tags and clean the text before sending it to the AI.

// The logic inside the "Clean HTML" node
const post = $input.first().json;
const htmlContent = post.content; // depending on your Ghost node output, this field may be named `html`

const cleanText = htmlContent
  .replace(/<[^>]*>/g, ' ')  // strip HTML tags (replace with a space so words don't stick together)
  .replace(/&nbsp;/g, ' ')   // decode the most common entity before normalizing whitespace
  .replace(/\s+/g, ' ')      // normalize whitespace
  .trim();

// Pass the title and link through so the Google Sheets step can fill its columns later
return [{ json: { title: post.title, url: post.url, clean_content: cleanText } }];

Step 3: The AI Agent (The Marketer)

Now we have clean text. I connected an AI Agent Node with gpt-4o-mini (it's cheap and perfect for summarization).

The key here is the System Prompt. I didn't just say "Summarize this." I gave the AI a specific role and structure.

The Prompt Strategy:

"You are a content marketing assistant... Start with a hook that grabs attention... Briefly summarize the article’s value... Include a clear Call-To-Action... End with this author signature..."

By defining the output structure, the AI produces usable content, not just generic summaries.

AI Agent Node Settings
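
If you'd rather skip the AI Agent node and call the model from a plain Code node, the same idea maps onto a direct Chat Completions request. The snippet below is a rough sketch, not the exact prompt from my workflow: the system prompt is paraphrased from the strategy above, and the API key is a placeholder (use an n8n credential in practice).

// Rough sketch of the AI step as a direct OpenAI Chat Completions call.
const OPENAI_API_KEY = 'your-api-key'; // placeholder; never hard-code keys in real workflows

const systemPrompt = `You are a content marketing assistant.
Write a LinkedIn post for the article you are given.
Start with a hook, summarize the article's value, and end with a clear call-to-action.`;

const article = $input.first().json;

const response = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'gpt-4o-mini',
    messages: [
      { role: 'system', content: systemPrompt },
      { role: 'user', content: `Title: ${article.title}\n\n${article.clean_content}` },
    ],
  }),
});

const data = await response.json();
// Keep title and url on the item so the Google Sheets step can use them.
return [{ json: { ...article, linkedin_post: data.choices[0].message.content } }];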

Step 4: Human-in-the-Loop (Google Sheets)

I don't let the AI post directly to LinkedIn. Never trust AI 100%.
Instead, the workflow appends the generated post to a Google Sheet.

  • Column A: Blog Title
  • Column B: Original Link
  • Column C: AI Drafted LinkedIn Post
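
A small Code (or Set) node right before the Google Sheets node can shape the data to match those columns. A minimal sketch, where the key names are illustrative and should match your sheet's header row:

// Map every incoming item onto the three sheet columns.
// Key names are illustrative; they must match the header row of your Google Sheet.
return $input.all().map((item) => ({
  json: {
    'Blog Title': item.json.title,
    'Original Link': item.json.url,
    'AI Drafted LinkedIn Post': item.json.linkedin_post,
  },
}));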

This creates a "Content Backlog" for me. I just open the sheet on Monday morning, tweak the hook slightly, and hit post.

Google Sheet Result Example

Why use n8n for this?

You could use Zapier, but the HTML cleaning (Step 2) and the looping over multiple posts at once are much harder and more expensive on other platforms. n8n handles custom JavaScript and data transformation natively.
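
For context on the looping point: every n8n node receives and returns a list of items, so "processing multiple posts at once" just means returning one item per post from a Code node and letting the downstream nodes run once per item. A tiny sketch of that pattern, using the cleaning step as the example:

// Items in, items out: transform every incoming post, not just the first one.
return $input.all().map((item) => ({
  json: {
    ...item.json,
    clean_content: (item.json.content || '')
      .replace(/<[^>]*>/g, ' ')
      .replace(/\s+/g, ' ')
      .trim(),
  },
}));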

Get the Workflow

I've shared this workflow (including the regex code and the prompt) on my workflow search engine, n8nworkflows.world. You can preview the nodes visually to understand how the data flows.

🚀 Download the JSON here: https://n8nworkflows.world

Do you automate your content distribution? Let me know what your stack looks like in the comments! 👇
