roberto degani

How I Automated My Entire Content Pipeline with AI APIs

Content creation is one of the most time-consuming aspects of digital marketing. As a developer, I was tired of spending hours writing, editing, and optimizing articles. So I built an automated content pipeline that combines AI-powered generation with intelligent analysis to produce publication-ready content in minutes.

Here's exactly how I did it, step by step.

The Problem: Content Takes Too Long

My typical content workflow looked like this:

  1. Research topics (30 min)
  2. Write first draft (2-3 hours)
  3. Edit and refine (1 hour)
  4. SEO optimize (30 min)
  5. Format for publication (30 min)

Total: 4-5 hours per article. That's not sustainable when you need to publish regularly.

The Solution: AI-Powered Pipeline

I built a Node.js pipeline that chains together multiple APIs to handle each stage automatically:

  1. AI Content Generator — Creates the initial draft from a topic/keyword
  2. AI Text Analyzer — Analyzes readability, sentiment, and tone
  3. Instant SEO Audit — Checks SEO optimization of the output
  4. Web Scraper — Researches competitor content for inspiration

Step 1: Generate the Initial Draft

const axios = require('axios');

const RAPIDAPI_KEY = 'YOUR_KEY';

async function generateContent(topic, keywords, type = 'blog_post', tone = 'professional') {
  const response = await axios.post(
    'https://ai-content-generator.p.rapidapi.com/api/generate',
    {
      topic,
      keywords,
      content_type: type,
      tone,
      length: 'long'
    },
    {
      headers: {
        'x-rapidapi-key': RAPIDAPI_KEY,
        'x-rapidapi-host': 'ai-content-generator.p.rapidapi.com',
        'Content-Type': 'application/json'
      }
    }
  );
  return response.data;
}

// Usage
const draft = await generateContent(
  'web scraping best practices 2026',
  ['web scraping', 'API', 'data extraction', 'automation']
);
console.log('Draft generated:', draft.content.length, 'characters');

Step 2: Analyze Content Quality

async function analyzeContent(text) {
  const response = await axios.post(
    'https://ai-text-analyzer.p.rapidapi.com/api/analyze',
    {
      text,
      analyses: ['readability', 'sentiment', 'keywords', 'summary']
    },
    {
      headers: {
        'x-rapidapi-key': RAPIDAPI_KEY,
        'x-rapidapi-host': 'ai-text-analyzer.p.rapidapi.com',
        'Content-Type': 'application/json'
      }
    }
  );
  return response.data;
}

const analysis = await analyzeContent(draft.content);
console.log('Readability Score:', analysis.readability_score);
console.log('Sentiment:', analysis.sentiment);
console.log('Key Topics:', analysis.keywords.join(', '));

Step 3: Research Competitors

async function scrapeCompetitor(url) {
  const response = await axios.post(
    'https://web-scraper-extractor.p.rapidapi.com/api/scrape',
    { url, extract: ['title', 'headings', 'meta_description'] },
    {
      headers: {
        'x-rapidapi-key': RAPIDAPI_KEY,
        'x-rapidapi-host': 'web-scraper-extractor.p.rapidapi.com',
        'Content-Type': 'application/json'
      }
    }
  );
  return response.data;
}

// Scrape top-ranking articles for inspiration
const competitor = await scrapeCompetitor('https://competitor-blog.com/web-scraping-guide');
console.log('Competitor headings:', competitor.headings);
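The fourth component from the list at the top, Instant SEO Audit, never appears in the code samples. Here is a hedged sketch of how it could slot in between analysis and formatting. The endpoint path, host, and request fields below are my assumptions, modeled on the other three calls in this post (it also reuses `axios` and `RAPIDAPI_KEY` from Step 1); check the API's RapidAPI listing for the real contract.

```javascript
// NOTE: endpoint, host, and field names are assumptions modeled on the other
// API calls in this post; verify them against the RapidAPI listing.
// Assumes `axios` and `RAPIDAPI_KEY` are already in scope (see Step 1).

// Pure helper that builds the request, so the shape is easy to inspect.
function buildSeoAuditRequest(content, targetKeyword, apiKey) {
  return {
    url: 'https://instant-seo-audit.p.rapidapi.com/api/audit', // assumed path
    body: { content, target_keyword: targetKeyword },          // assumed fields
    headers: {
      'x-rapidapi-key': apiKey,
      'x-rapidapi-host': 'instant-seo-audit.p.rapidapi.com',   // assumed host
      'Content-Type': 'application/json'
    }
  };
}

async function auditSeo(content, targetKeyword) {
  const req = buildSeoAuditRequest(content, targetKeyword, RAPIDAPI_KEY);
  const response = await axios.post(req.url, req.body, { headers: req.headers });
  return response.data;
}
```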

Step 4: The Complete Pipeline

async function contentPipeline(topic, keywords) {
  console.log('Starting content pipeline...');

  // Step 1: Generate
  console.log('Generating draft...');
  const draft = await generateContent(topic, keywords);

  // Step 2: Analyze
  console.log('Analyzing quality...');
  const analysis = await analyzeContent(draft.content);

  // Step 3: Quality gate: regenerate with a simpler tone if readability is low
  if (analysis.readability_score < 60) {
    console.log('Readability too low, regenerating with simpler tone...');
    const simpleDraft = await generateContent(topic, keywords, 'blog_post', 'casual');
    draft.content = simpleDraft.content;
  }

  // Step 4: Format output
  const output = {
    title: draft.title,
    content: draft.content,
    meta_description: analysis.summary,
    keywords: analysis.keywords,
    readability: analysis.readability_score,
    sentiment: analysis.sentiment,
    word_count: draft.content.split(/\s+/).filter(Boolean).length,
    generated_at: new Date().toISOString()
  };

  // Save to file
  const fs = require('fs');
  const filename = `content-${Date.now()}.json`;
  fs.writeFileSync(filename, JSON.stringify(output, null, 2));

  console.log('Pipeline complete!');
  console.log(`  Title: ${output.title}`);
  console.log(`  Words: ${output.word_count}`);
  console.log(`  Readability: ${output.readability}/100`);
  console.log(`  Saved to: ${filename}`);

  return output;
}

// Run it
contentPipeline('web scraping best practices', ['scraping', 'API', 'data']);
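One thing the pipeline above doesn't handle is transient API failures: a single 429 or 5xx in any stage wastes the whole run. A minimal retry wrapper (my addition, not part of the original pipeline) could look like this:

```javascript
// Exponential backoff delay: 500ms, 1000ms, 2000ms, ...
const backoffMs = (attempt, baseMs = 500) => baseMs * 2 ** attempt;

// Retry a stage on transient failures (HTTP 429 or 5xx); rethrow anything else.
async function withRetry(fn, maxAttempts = 3) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const status = err.response && err.response.status;
      const retryable = status === 429 || (status >= 500 && status < 600);
      if (!retryable || attempt === maxAttempts - 1) throw err;
      await new Promise((resolve) => setTimeout(resolve, backoffMs(attempt)));
    }
  }
}
```

Wrap any stage with it, e.g. `const draft = await withRetry(() => generateContent(topic, keywords));`.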

Results After 30 Days

After running this pipeline for a month:

Metric            | Before    | After
------------------|-----------|-----------
Time per article  | 4-5 hours | 20 minutes
Articles per week | 1-2       | 5-7
Readability score | 65 avg    | 78 avg
SEO optimization  | Manual    | Automated

The APIs That Power This

All four APIs are available on RapidAPI with free tiers: AI Content Generator, AI Text Analyzer, Instant SEO Audit, and Web Scraper.

Key Takeaways

  1. AI doesn't replace writers — it handles the grunt work so you can focus on strategy
  2. Quality gates matter — always analyze before publishing
  3. Automation compounds — saving 4 hours per article adds up fast
  4. APIs make it simple — no need to build NLP models from scratch
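To put takeaway 3 in numbers, here is a back-of-the-envelope sketch using the midpoints of the ranges from the results table above:

```javascript
// Midpoints of the ranges reported in the results table.
const before = { articlesPerWeek: 1.5, hoursPerArticle: 4.5 };
const after = { articlesPerWeek: 6, hoursPerArticle: 20 / 60 };

const weeklyHoursBefore = before.articlesPerWeek * before.hoursPerArticle; // 6.75
const weeklyHoursAfter = after.articlesPerWeek * after.hoursPerArticle;    // ≈ 2
const savedPerArticle = before.hoursPerArticle - after.hoursPerArticle;    // ≈ 4.17
// Roughly 4x the output for less than a third of the weekly hours.
```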

The entire pipeline runs on standard Node.js with no ML infrastructure needed. Just API calls.

Try it yourself — start with the AI Content Generator and build from there!
