Jasmeet Singh

Originally published at Medium

I Built an AI Bot That Writes and Posts to LinkedIn While I Sleep — Here's How

How I automated my LinkedIn presence using Python, Groq AI, and GitHub Actions (and why you should too)

The Problem: LinkedIn Consistency is Hard

Let's be honest — maintaining a consistent LinkedIn presence is exhausting.

As an Android developer building healthcare apps during the day, the last thing I want to do at night is brainstorm "thought leadership" content. But I knew LinkedIn was crucial for:

  • Building my professional network
  • Showcasing my expertise
  • Getting noticed by recruiters
  • Long-term career growth

The irony? I'm a developer. If something is repetitive, I should automate it.

So I built a bot that:

  1. ✅ Finds trending Android topics automatically
  2. ✅ Uses AI to generate engaging posts
  3. ✅ Posts to my LinkedIn profile
  4. ✅ Runs weekly on autopilot via GitHub Actions

Result: I wake up every Wednesday to a fresh LinkedIn post that I never wrote.

Let me show you how it works.

The Architecture: How It Actually Works

Flowchart of the automation

Simple, right? But the devil is in the details. Let me break down each component.


Part 1: Finding Trending Topics (DuckDuckGo API)

The first challenge: How do you know what's trending?

I tried multiple approaches:

  • ❌ Google Trends API (requires API key, rate limited)
  • ❌ Twitter API (expensive, and it's X now)
  • ❌ Reddit API (too noisy, hard to filter)
  • ✅ DuckDuckGo Search (free, no API key, instant results)

The Code

from duckduckgo_search import DDGS

def search_trending_topics(queries, max_results=5):
    """Search for trending topics across multiple queries"""
    all_results = []

    with DDGS() as ddgs:
        for query in queries:
            try:
                results = list(ddgs.text(
                    query, 
                    max_results=max_results,
                    region='wt-wt',  # Worldwide
                    safesearch='off',
                    timelimit='w'  # Last week only
                ))
                all_results.extend(results)
            except Exception as e:
                print(f"Search error for '{query}': {e}")
                continue

    return all_results

What I Search For

SEARCH_QUERIES = [
    "Android development trends 2025",
    "Kotlin new features latest", 
    "Jetpack Compose updates",
    "Android Studio tips",
    "Mobile app development best practices"
]

Why this works:

  • timelimit='w' ensures only recent content (last week)
  • Multiple queries cast a wider net
  • region='wt-wt' gets global trends, not just local

Pro tip: Customize these queries for your niche. If you're in React, search for "React 19 features", "Next.js updates", etc.
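To make that concrete, a React-flavored version of the query list might look like this (hypothetical queries — swap in whatever your niche actually talks about):

```python
# Example niche customization -- these queries are illustrative, not prescriptive
SEARCH_QUERIES = [
    "React 19 features",
    "Next.js updates",
    "TypeScript release notes",
    "React Server Components best practices",
    "frontend performance optimization",
]
```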


Part 2: AI Content Generation (Groq + LLaMA 70B)

Now comes the magic: Getting AI to write like a human.

I experimented with multiple AI providers:

  • ❌ OpenAI GPT-4 (expensive at scale)
  • ❌ Anthropic Claude (also pricey)
  • ❌ Google Gemini (inconsistent quality)
  • ✅ Groq + LLaMA 70B (fast, free tier, excellent quality)

Why Groq?

Groq's LPU (Language Processing Unit) is insanely fast. My posts generate in under 2 seconds. Compare that to 10-15 seconds on other platforms.

Plus: Their free tier is generous (14,400 requests/day with LLaMA 70B).

The Prompt Engineering

This is where 90% of the magic happens. Here's my system prompt:

SYSTEM_PROMPT = """You are a professional Android developer who writes 
engaging LinkedIn posts. Your writing style is:

- Clear and conversational (not corporate jargon)
- Technical but accessible
- Uses real examples from production work
- Includes practical takeaways
- 3-5 sentences max (LinkedIn attention spans are short)
- Ends with a relevant question to drive engagement
- Uses 2-3 hashtags (no more!)

You specialize in:
- Android SDK and Kotlin
- Jetpack Compose
- Mobile architecture patterns
- Performance optimization
- Developer productivity

Never use:
- Emojis (except strategically, max 1-2)
- Buzzwords like "game-changer", "revolutionary"
- Generic advice everyone already knows
- Anything that sounds like it was written by AI

Write as if you're sharing a quick insight with fellow developers 
over coffee."""

The Content Generation Code

from groq import Groq

def generate_post_with_ai(topics, api_key, dry_run=False):
    """Use Groq AI to pick best topic and generate post"""

    # Format topics for the prompt
    topics_text = "\n".join([
        f"{i+1}. {t['title']}\n   {t['body'][:150]}..."
        for i, t in enumerate(topics[:5])
    ])

    prompt = f"""From these trending topics:

{topics_text}

Pick the MOST interesting one for Android developers and write a 
LinkedIn post about it. Remember: technical but conversational, 
3-5 sentences, end with a question."""

    client = Groq(api_key=api_key)

    completion = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": prompt}
        ],
        temperature=0.8,  # More creative
        max_tokens=500
    )

    return completion.choices[0].message.content

Key parameters:

  • temperature=0.8 — Higher = more creative, lower = more predictable
  • max_tokens=500 — Keeps posts concise (LinkedIn sweet spot is 150-250 words)
  • System prompt defines personality/style

Part 3: Posting to LinkedIn (The API Nightmare)

Oh boy. LinkedIn's API is... let's just say it's an adventure.

The Authentication Dance

LinkedIn uses OAuth 2.0, which requires:

  1. Creating a LinkedIn App (Developer Portal)
  2. Getting client ID + secret
  3. Redirecting user for authorization
  4. Exchanging code for access token
  5. Storing token securely

I created a separate script (linkedin_post.py) to handle this one-time setup.
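For reference, step 4 (exchanging the authorization code for a token) boils down to one POST. This is a minimal sketch assuming LinkedIn's standard OAuth 2.0 token endpoint; the parameter names follow the OAuth spec:

```python
import requests

TOKEN_URL = "https://www.linkedin.com/oauth/v2/accessToken"

def build_token_request(code, client_id, client_secret, redirect_uri):
    """Build the form payload for the authorization-code exchange."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }

def exchange_code_for_token(code, client_id, client_secret, redirect_uri):
    """Trade the one-time authorization code for a ~60-day access token."""
    payload = build_token_request(code, client_id, client_secret, redirect_uri)
    resp = requests.post(TOKEN_URL, data=payload)
    resp.raise_for_status()
    return resp.json()["access_token"]
```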

The Posting Code

import requests

def post_to_linkedin(content, access_token, person_urn):
    """Post content to LinkedIn"""

    url = "https://api.linkedin.com/v2/ugcPosts"

    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
        "X-Restli-Protocol-Version": "2.0.0"
    }

    post_data = {
        "author": person_urn,
        "lifecycleState": "PUBLISHED",
        "specificContent": {
            "com.linkedin.ugc.ShareContent": {
                "shareCommentary": {
                    "text": content
                },
                "shareMediaCategory": "NONE"
            }
        },
        "visibility": {
            "com.linkedin.ugc.MemberNetworkVisibility": "PUBLIC"
        }
    }

    response = requests.post(url, headers=headers, json=post_data)

    if response.status_code == 201:
        return True, "Post successful!"
    else:
        return False, f"Error: {response.status_code} - {response.text}"

The Gotchas I Discovered

Problem #1: Token Expiration
LinkedIn access tokens expire after 60 days. Solution: Store expiry date and refresh proactively.

Problem #2: Rate Limits
LinkedIn allows ~100 posts/day. For weekly automation, this is fine. But test carefully.

Problem #3: Person URN
You need your LinkedIn person_urn (not just your profile URL). Get it via:

https://api.linkedin.com/v2/me
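A small sketch of that lookup, assuming the `/v2/me` response carries an `id` field (which is what you build the URN from — the ugcPosts `author` field wants `urn:li:person:<id>`, not a profile URL):

```python
import requests

def build_person_urn(me_response):
    """Turn a /v2/me response dict into the URN the posts API expects."""
    return f"urn:li:person:{me_response['id']}"

def fetch_person_urn(access_token):
    """Fetch your member id from /v2/me and return the full person URN."""
    resp = requests.get(
        "https://api.linkedin.com/v2/me",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    return build_person_urn(resp.json())
```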

Part 4: Automation (GitHub Actions)

The final piece: Running this automatically every week.

Why GitHub Actions?

  • ✅ Free (2,000 minutes/month on free tier)
  • ✅ Cron scheduling built-in
  • ✅ Secure secrets management
  • ✅ No server maintenance

The Workflow File

Create .github/workflows/weekly-post.yml:

name: Weekly LinkedIn Post

on:
  schedule:
    - cron: '0 3 * * 3'  # Wednesdays at 03:00 UTC
  workflow_dispatch:  # Manual trigger for testing

jobs:
  post-to-linkedin:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Run LinkedIn poster
        env:
          GROQ_API_KEY: ${{ secrets.GROQ_API_KEY }}
          LINKEDIN_ACCESS_TOKEN: ${{ secrets.LINKEDIN_ACCESS_TOKEN }}
          LINKEDIN_PERSON_URN: ${{ secrets.LINKEDIN_PERSON_URN }}
        run: python linkedin_ai_poster.py

      - name: Commit post history
        run: |
          git config user.name "GitHub Actions Bot"
          git config user.email "actions@github.com"
          git add post_history.json
          git commit -m "📝 New LinkedIn post published" || echo "No changes"
          git push

Cron Syntax Explained

'0 3 * * 3'
 │ │ │ │ │
 │ │ │ │ └── Day of week (3 = Wednesday)
 │ │ │ └──── Month (any)
 │ │ └────── Day of month (any)
 │ └──────── Hour (03 UTC)
 └────────── Minute (0)

My schedule: every Wednesday at 8:30 AM IST (03:00 UTC)

Storing Secrets

In your GitHub repo:

  1. Go to Settings → Secrets and variables → Actions
  2. Add these secrets:
    • GROQ_API_KEY
    • LINKEDIN_ACCESS_TOKEN
    • LINKEDIN_PERSON_URN

Never commit these to your code!
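Inside the script, those secrets arrive as environment variables. A small loader that fails fast when one is missing (a sketch — the variable names match the workflow above, the function name is mine):

```python
import os

REQUIRED_VARS = ["GROQ_API_KEY", "LINKEDIN_ACCESS_TOKEN", "LINKEDIN_PERSON_URN"]

def load_config():
    """Read required secrets from the environment, failing loudly if any is absent."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing required secrets: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Failing at startup beats discovering a half-run job that searched topics, burned AI tokens, and then died at the posting step.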


The Safety Features

I added several safeguards to avoid looking like a spam bot:

1. Dry Run Mode

Test without actually posting:

python linkedin_ai_poster.py --dry-run

This shows you:

  • What topics were found
  • What post would be generated
  • NO actual posting
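The flag itself is just a bit of argparse wiring (a sketch — the real script's CLI may differ):

```python
import argparse

def parse_args(argv=None):
    """Parse CLI flags; --dry-run generates a post without publishing it."""
    parser = argparse.ArgumentParser(description="AI-powered LinkedIn poster")
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Search topics and generate a post, but skip the LinkedIn API call",
    )
    return parser.parse_args(argv)
```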

2. Post History Tracking

import hashlib
import json
import os
from datetime import datetime

def load_history():
    """Load prior posts, or start fresh if no history file exists yet"""
    if os.path.exists("post_history.json"):
        with open("post_history.json") as f:
            return json.load(f)
    return []

def save_post_history(post_content):
    """Track what we've posted to avoid duplicates"""
    history = load_history()

    history.append({
        "timestamp": datetime.now().isoformat(),
        "content": post_content[:100],  # First 100 chars
        "hash": hashlib.md5(post_content.encode()).hexdigest()
    })

    with open("post_history.json", "w") as f:
        json.dump(history, f, indent=2)

This history file:

  • Prevents duplicate content
  • Avoids repeating the same topics
  • Makes it easy to audit what's been posted

3. Content Validation

Before posting, I check:

  • ✅ Post isn't empty
  • ✅ Post is 50-2000 characters
  • ✅ Post doesn't match recent history
  • ✅ Post contains at least one hashtag
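Those checks translate to a few lines (a sketch — `recent_hashes` is assumed to come from the post history file, using the same MD5 hashing as above):

```python
import hashlib

def validate_post(content, recent_hashes):
    """Run the pre-posting sanity checks; returns (ok, reason)."""
    text = (content or "").strip()
    if not text:
        return False, "post is empty"
    if not 50 <= len(text) <= 2000:
        return False, f"post length {len(text)} outside 50-2000 chars"
    if hashlib.md5(text.encode()).hexdigest() in recent_hashes:
        return False, "post matches recent history"
    if "#" not in text:
        return False, "post has no hashtag"
    return True, "ok"
```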

The Results

Sample generated posts:

  • Post #1: JetBrains previewing new features for Ktor
  • Post #2: Moving from Java to Kotlin: An Android Developer's Perspective

Total time invested: 0 minutes (after initial setup)

What Surprised Me

  1. AI quality is REALLY good — Most posts needed zero editing
  2. Engagement is comparable to manual posts — Sometimes even better
  3. People ask questions — The ending question drives real conversations
  4. No one realizes it's automated — (Until now, I guess! 😄)

The Challenges I Faced

Challenge #1: AI Generated Generic Content

Problem: Early versions sounded too much like AI — lots of "game-changing" and "revolutionary" language.

Solution: Heavily refined the system prompt with explicit "never use" lists and examples of good posts.

Challenge #2: DuckDuckGo Rate Limiting

Problem: Occasionally got blocked for too many searches.

Solution: Added exponential backoff and reduced search frequency (5 queries, not 20).

import time
from random import uniform

from duckduckgo_search import DDGS

def search_with_retry(query, max_retries=3):
    """Retry a search with exponential backoff on transient failures"""
    for attempt in range(max_retries):
        try:
            with DDGS() as ddgs:
                return list(ddgs.text(query, max_results=5))
        except Exception:
            if attempt < max_retries - 1:
                wait = uniform(2, 5) * (2 ** attempt)  # Exponential backoff
                time.sleep(wait)
            else:
                raise

Challenge #3: LinkedIn Token Expiration (Upcoming)

Problem: Tokens expire every 60 days. Automation breaks silently.

Solution: Added expiry tracking + email notification (via GitHub Actions) when token is <7 days from expiration.
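The expiry check itself is simple date arithmetic (a sketch — storing the expiry as an ISO `YYYY-MM-DD` string is my assumption, as are the function names):

```python
from datetime import date

def days_until_expiry(expiry_iso, today=None):
    """Days left on the LinkedIn token, given its expiry date as 'YYYY-MM-DD'."""
    today = today or date.today()
    return (date.fromisoformat(expiry_iso) - today).days

def needs_refresh_warning(expiry_iso, today=None, threshold=7):
    """True when fewer than `threshold` days remain -- time to fire the email."""
    return days_until_expiry(expiry_iso, today) < threshold
```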


Should You Build This?

✅ Build this if:

  • You struggle with LinkedIn consistency
  • You're comfortable with Python
  • You have 2-3 hours for initial setup
  • You want to experiment with AI automation

❌ Skip this if:

  • You prefer fully manual, authentic posting
  • You're uncomfortable with automation ethics
  • You don't have time to monitor it initially
  • Your industry requires very personalized content

The Ethics Question

"Isn't this inauthentic?"

I wrestled with this. Here's my thinking:

What the bot does:

  • ✅ Finds relevant, real topics
  • ✅ Generates posts in MY voice (via prompt engineering)
  • ✅ Posts to MY profile with MY name

What the bot doesn't do:

  • ❌ Impersonate someone else
  • ❌ Spam or mislead
  • ❌ Post irrelevant content
  • ❌ Engage with others (I still do that manually)

My take: It's a tool, like a spell-checker or Grammarly. It helps me maintain presence, but I'm still responsible for the content.

I also:

  • Review posts weekly (dry run first)
  • Respond to all comments personally
  • Occasionally edit AI suggestions
  • Stay transparent (this article!)

How to Build Your Own

Step 1: Clone the Repo

git clone https://github.com/jasi381/LinkedInPostGenerator.git
cd LinkedInPostGenerator
pip install -r requirements.txt

Step 2: Get API Keys

Groq (Free):

  1. Visit https://console.groq.com
  2. Sign up
  3. Generate API key

LinkedIn:

  1. Create LinkedIn App (https://developer.linkedin.com)
  2. Get client ID + secret
  3. Run linkedin_post.py for OAuth flow
  4. Save tokens

Step 3: Test Locally

# Dry run (no posting)
python linkedin_ai_poster.py --dry-run

# Review the generated post
# If it looks good, run for real:
python linkedin_ai_poster.py

Step 4: Set Up Automation

  1. Fork the repo
  2. Add secrets to GitHub Actions
  3. Customize workflow schedule
  4. Enable Actions in repo settings

That's it! Your bot is live.


The Bigger Picture

This project taught me something important: Automation isn't about replacing authenticity — it's about enabling consistency.

As developers, we have skills that most professionals don't:

  • API integration
  • AI prompt engineering
  • CI/CD workflows
  • System automation

Why not use these skills for career growth?

This LinkedIn bot:

  • Saves me 2+ hours weekly
  • Maintains my professional presence
  • Showcases my automation skills
  • Generates opportunities

And now, it's a portfolio piece too.


Try It Yourself

Full code is on GitHub: https://github.com/jasi381/LinkedInPostGenerator

What would you automate next?

I'm thinking:

  • Cover letter generator (for job applications)
  • GitHub contribution tracker (auto-posts weekly stats)
  • Dev.to article sync (cross-platform publishing)

Drop your ideas in the comments! 👇

Top comments (1)

Josue Chambilla • Edited

I found this idea very interesting and necessary for people who want to be active on LinkedIn rather than just have another idle profile; I'm going to try it.

I liked it, and if I want to configure the prompt, it's just a matter of changing a few lines. Now I'll test it by automating it with GitHub Actions.