How I automated my LinkedIn presence using Python, Groq AI, and GitHub Actions (and why you should too)
The Problem: LinkedIn Consistency is Hard
Let's be honest — maintaining a consistent LinkedIn presence is exhausting.
As an Android developer building healthcare apps during the day, the last thing I want to do at night is brainstorm "thought leadership" content. But I knew LinkedIn was crucial for:
- Building my professional network
- Showcasing my expertise
- Getting noticed by recruiters
- Long-term career growth
The irony? I'm a developer. If something is repetitive, I should automate it.
So I built a bot that:
- ✅ Finds trending Android topics automatically
- ✅ Uses AI to generate engaging posts
- ✅ Posts to my LinkedIn profile
- ✅ Runs weekly on autopilot via GitHub Actions
Result: I wake up every Wednesday to a fresh LinkedIn post that I never wrote.
Let me show you how it works.
The Architecture: How It Actually Works
The pipeline has four stages: find trending topics, generate a post with AI, publish it via the LinkedIn API, and schedule the whole thing with GitHub Actions. Simple, right? But the devil is in the details. Let me break down each component.
Part 1: Finding Trending Topics (DuckDuckGo API)
The first challenge: How do you know what's trending?
I tried multiple approaches:
- ❌ Google Trends API (requires API key, rate limited)
- ❌ Twitter API (expensive, and it's X now)
- ❌ Reddit API (too noisy, hard to filter)
- ✅ DuckDuckGo Search (free, no API key, instant results)
The Code
```python
from duckduckgo_search import DDGS

def search_trending_topics(queries, max_results=5):
    """Search for trending topics across multiple queries"""
    all_results = []
    with DDGS() as ddgs:
        for query in queries:
            try:
                results = list(ddgs.text(
                    query,
                    max_results=max_results,
                    region='wt-wt',   # Worldwide
                    safesearch='off',
                    timelimit='w'     # Last week only
                ))
                all_results.extend(results)
            except Exception as e:
                print(f"Search error for '{query}': {e}")
                continue
    return all_results
```
What I Search For
```python
SEARCH_QUERIES = [
    "Android development trends 2025",
    "Kotlin new features latest",
    "Jetpack Compose updates",
    "Android Studio tips",
    "Mobile app development best practices"
]
```
Why this works:
- `timelimit='w'` ensures only recent content (last week)
- Multiple queries cast a wider net
- `region='wt-wt'` gets global trends, not just local
Pro tip: Customize these queries for your niche. If you're in React, search for "React 19 features", "Next.js updates", etc.
Part 2: AI Content Generation (Groq + LLaMA 70B)
Now comes the magic: Getting AI to write like a human.
I experimented with multiple AI providers:
- ❌ OpenAI GPT-4 (expensive at scale)
- ❌ Anthropic Claude (also pricey)
- ❌ Google Gemini (inconsistent quality)
- ✅ Groq + LLaMA 70B (fast, free tier, excellent quality)
Why Groq?
Groq's LPU (Language Processing Unit) is insanely fast. My posts generate in under 2 seconds. Compare that to 10-15 seconds on other platforms.
Plus: Their free tier is generous (14,400 requests/day with LLaMA 70B).
The Prompt Engineering
This is where 90% of the magic happens. Here's my system prompt:
```python
SYSTEM_PROMPT = """You are a professional Android developer who writes
engaging LinkedIn posts. Your writing style is:

- Clear and conversational (not corporate jargon)
- Technical but accessible
- Uses real examples from production work
- Includes practical takeaways
- 3-5 sentences max (LinkedIn attention spans are short)
- Ends with a relevant question to drive engagement
- Uses 2-3 hashtags (no more!)

You specialize in:
- Android SDK and Kotlin
- Jetpack Compose
- Mobile architecture patterns
- Performance optimization
- Developer productivity

Never use:
- Emojis (except strategically, max 1-2)
- Buzzwords like "game-changer", "revolutionary"
- Generic advice everyone already knows
- Anything that sounds like it was written by AI

Write as if you're sharing a quick insight with fellow developers
over coffee."""
```
The Content Generation Code
```python
from groq import Groq

def generate_post_with_ai(topics, api_key, dry_run=False):
    """Use Groq AI to pick best topic and generate post"""
    # Format topics for the prompt
    topics_text = "\n".join([
        f"{i+1}. {t['title']}\n   {t['body'][:150]}..."
        for i, t in enumerate(topics[:5])
    ])

    prompt = f"""From these trending topics:

{topics_text}

Pick the MOST interesting one for Android developers and write a
LinkedIn post about it. Remember: technical but conversational,
3-5 sentences, end with a question."""

    client = Groq(api_key=api_key)
    completion = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": prompt}
        ],
        temperature=0.8,  # More creative
        max_tokens=500
    )
    return completion.choices[0].message.content
```
Key parameters:
- `temperature=0.8` — higher = more creative, lower = more predictable
- `max_tokens=500` — keeps posts concise (LinkedIn sweet spot is 150-250 words)
- The system prompt defines the personality and style
Part 3: Posting to LinkedIn (The API Nightmare)
Oh boy. LinkedIn's API is... let's just say it's an adventure.
The Authentication Dance
LinkedIn uses OAuth 2.0, which requires:
- Creating a LinkedIn App (Developer Portal)
- Getting client ID + secret
- Redirecting user for authorization
- Exchanging code for access token
- Storing token securely
I created a separate script (linkedin_post.py) to handle this one-time setup.
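The core of that one-time setup is step 4, exchanging the authorization code for an access token. Here's a minimal sketch of what that exchange looks like — the endpoint follows LinkedIn's OAuth 2.0 docs, but the function names are mine, not necessarily what `linkedin_post.py` uses:

```python
LINKEDIN_TOKEN_URL = "https://www.linkedin.com/oauth/v2/accessToken"

def build_token_request(code, client_id, client_secret, redirect_uri):
    """Build the form payload LinkedIn expects when exchanging
    an authorization code for an access token."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }

def exchange_code_for_token(code, client_id, client_secret, redirect_uri):
    """POST the payload; the JSON response carries access_token
    and expires_in (seconds)."""
    import requests
    resp = requests.post(
        LINKEDIN_TOKEN_URL,
        data=build_token_request(code, client_id, client_secret, redirect_uri),
    )
    resp.raise_for_status()
    return resp.json()
```

Store the returned `access_token` (and note `expires_in` — more on that below, since tokens don't last forever).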
The Posting Code
```python
import requests

def post_to_linkedin(content, access_token, person_urn):
    """Post content to LinkedIn"""
    url = "https://api.linkedin.com/v2/ugcPosts"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
        "X-Restli-Protocol-Version": "2.0.0"
    }
    post_data = {
        "author": person_urn,
        "lifecycleState": "PUBLISHED",
        "specificContent": {
            "com.linkedin.ugc.ShareContent": {
                "shareCommentary": {
                    "text": content
                },
                "shareMediaCategory": "NONE"
            }
        },
        "visibility": {
            "com.linkedin.ugc.MemberNetworkVisibility": "PUBLIC"
        }
    }
    response = requests.post(url, headers=headers, json=post_data)
    if response.status_code == 201:
        return True, "Post successful!"
    else:
        return False, f"Error: {response.status_code} - {response.text}"
```
The Gotchas I Discovered
Problem #1: Token Expiration
LinkedIn access tokens expire after 60 days. Solution: Store expiry date and refresh proactively.
Problem #2: Rate Limits
LinkedIn allows ~100 posts/day. For weekly automation, this is fine. But test carefully.
Problem #3: Person URN
You need your LinkedIn person_urn (not just your profile URL). Get it via:
https://api.linkedin.com/v2/me
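A quick sketch of fetching it — the `/v2/me` call and the `urn:li:person:{id}` format follow LinkedIn's docs, though the helper names here are my own:

```python
def to_person_urn(member_id):
    """LinkedIn's author field wants a URN, not a profile URL."""
    return f"urn:li:person:{member_id}"

def get_person_urn(access_token):
    """Look up your member id via /v2/me and build the URN from it."""
    import requests
    resp = requests.get(
        "https://api.linkedin.com/v2/me",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    return to_person_urn(resp.json()["id"])
```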
Part 4: Automation (GitHub Actions)
The final piece: Running this automatically every week.
Why GitHub Actions?
- ✅ Free (2,000 minutes/month on free tier)
- ✅ Cron scheduling built-in
- ✅ Secure secrets management
- ✅ No server maintenance
The Workflow File
Create .github/workflows/weekly-post.yml:
```yaml
name: Weekly LinkedIn Post

on:
  schedule:
    - cron: '0 3 * * 3'
  workflow_dispatch:  # Manual trigger for testing

jobs:
  post-to-linkedin:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Run LinkedIn poster
        env:
          GROQ_API_KEY: ${{ secrets.GROQ_API_KEY }}
          LINKEDIN_ACCESS_TOKEN: ${{ secrets.LINKEDIN_ACCESS_TOKEN }}
          LINKEDIN_PERSON_URN: ${{ secrets.LINKEDIN_PERSON_URN }}
        run: python linkedin_ai_poster.py

      - name: Commit post history
        run: |
          git config user.name "GitHub Actions Bot"
          git config user.email "actions@github.com"
          git add post_history.json
          git commit -m "📝 New LinkedIn post published" || echo "No changes"
          git push
```
Cron Syntax Explained
```
'0 3 * * 3'
 │ │ │ │ │
 │ │ │ │ └── Day of week (3 = Wednesday)
 │ │ │ └──── Month (any)
 │ │ └────── Day of month (any)
 │ └──────── Hour (03 UTC)
 └────────── Minute (0)
```
My schedule: every Wednesday at 8:30 AM IST (03:00 UTC).
Storing Secrets
In your GitHub repo:
- Go to Settings → Secrets and variables → Actions
- Add these secrets:
  - `GROQ_API_KEY`
  - `LINKEDIN_ACCESS_TOKEN`
  - `LINKEDIN_PERSON_URN`
Never commit these to your code!
The Safety Features
I added several safeguards to avoid looking like a spam bot:
1. Dry Run Mode
Test without actually posting:
```bash
python linkedin_ai_poster.py --dry-run
```
This shows you:
- What topics were found
- What post would be generated
- NO actual posting
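Wiring up that flag takes only a few lines of argparse. This is a sketch of how the entry point might parse it — the function and flag names match my description above, but treat the structure as illustrative:

```python
import argparse

def parse_args(argv=None):
    """Parse CLI flags; --dry-run skips the actual LinkedIn API call."""
    parser = argparse.ArgumentParser(description="AI LinkedIn poster")
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Generate and print the post without publishing it",
    )
    return parser.parse_args(argv)

# Later, in main():
#   args = parse_args()
#   if args.dry_run:
#       print(post)              # show topics + generated post only
#   else:
#       post_to_linkedin(post, token, urn)  # actually publish
```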
2. Post History Tracking
```python
import hashlib
import json
from datetime import datetime

def save_post_history(post_content):
    """Track what we've posted to avoid duplicates"""
    history = load_history()  # reads post_history.json (or returns [])
    history.append({
        "timestamp": datetime.now().isoformat(),
        "content": post_content[:100],  # First 100 chars
        "hash": hashlib.md5(post_content.encode()).hexdigest()
    })
    with open("post_history.json", "w") as f:
        json.dump(history, f, indent=2)
```
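The `load_history` helper that this calls isn't shown above; a minimal version, plus the duplicate check that uses the stored hashes, might look like this (names are mine):

```python
import hashlib
import json
import os

def load_history(path="post_history.json"):
    """Return the saved post history, or an empty list on first run."""
    if not os.path.exists(path):
        return []
    with open(path) as f:
        return json.load(f)

def is_duplicate(post_content, history):
    """True if an identical post (by MD5 hash) was already published."""
    digest = hashlib.md5(post_content.encode()).hexdigest()
    return any(entry.get("hash") == digest for entry in history)
```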
This history serves three purposes:
- Prevents duplicate content
- Avoids repeating the same topics
- Gives me an audit trail of everything posted
3. Content Validation
Before posting, I check:
- ✅ Post isn't empty
- ✅ Post is 50-2000 characters
- ✅ Post doesn't match recent history
- ✅ Post contains at least one hashtag
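As a sketch, those four checks condense into one small function — the name `validate_post` and the exact thresholds mirror the list above, but the implementation is illustrative:

```python
def validate_post(post, recent_posts):
    """Return (ok, reason); mirrors the four pre-posting checks."""
    post = post.strip()
    if not post:
        return False, "Post is empty"
    if not 50 <= len(post) <= 2000:
        return False, f"Bad length: {len(post)} chars"
    if post in recent_posts:
        return False, "Matches a recent post"
    if "#" not in post:
        return False, "No hashtag found"
    return True, "OK"
```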
The Results
Sample Generated Posts:
Post #1 (JetBrains previewing new features for Ktor)
Post #2 (Moving from Java to Kotlin: An Android Developer’s Perspective)
Total time invested: 0 minutes (after initial setup)
What Surprised Me
- AI quality is REALLY good — Most posts needed zero editing
- Engagement is comparable to manual posts — Sometimes even better
- People ask questions — The ending question drives real conversations
- No one realizes it's automated — (Until now, I guess! 😄)
The Challenges I Faced
Challenge #1: AI Generated Generic Content
Problem: Early versions sounded too much like AI — lots of "game-changing" and "revolutionary" language.
Solution: Heavily refined the system prompt with explicit "never use" lists and examples of good posts.
Challenge #2: DuckDuckGo Rate Limiting
Problem: Occasionally got blocked for too many searches.
Solution: Added exponential backoff and reduced search frequency (5 queries, not 20).
```python
import time
from random import uniform

from duckduckgo_search import DDGS

def search_with_retry(query, max_retries=3):
    """Retry a search with exponential backoff between attempts."""
    for attempt in range(max_retries):
        try:
            with DDGS() as ddgs:
                return ddgs.text(query, max_results=5)
        except Exception:
            if attempt < max_retries - 1:
                wait = uniform(2, 5) * (2 ** attempt)  # Exponential backoff
                time.sleep(wait)
            else:
                raise
```
Challenge #3: LinkedIn Token Expiration (Upcoming)
Problem: Tokens expire every 60 days. Automation breaks silently.
Solution: Added expiry tracking + email notification (via GitHub Actions) when token is <7 days from expiration.
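A minimal version of that expiry check might look like this — the 60-day lifetime matches LinkedIn's token behavior described above, while the 7-day warning threshold and function names are my own choices:

```python
from datetime import datetime, timedelta

def days_until_expiry(issued_at, lifetime_days=60):
    """LinkedIn tokens last ~60 days; return how many days remain."""
    expires_at = issued_at + timedelta(days=lifetime_days)
    return (expires_at - datetime.now()).days

def needs_refresh(issued_at, warn_days=7):
    """True when the token is within warn_days of expiring."""
    return days_until_expiry(issued_at) <= warn_days
```

The workflow can run this check before posting and fail loudly (or fire a notification step) instead of letting the automation break silently.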
Should You Build This?
✅ Build this if:
- You struggle with LinkedIn consistency
- You're comfortable with Python
- You have 2-3 hours for initial setup
- You want to experiment with AI automation
❌ Skip this if:
- You prefer fully manual, authentic posting
- You're uncomfortable with automation ethics
- You don't have time to monitor it initially
- Your industry requires very personalized content
The Ethics Question
"Isn't this inauthentic?"
I wrestled with this. Here's my thinking:
What the bot does:
- ✅ Finds relevant, real topics
- ✅ Generates posts in MY voice (via prompt engineering)
- ✅ Posts to MY profile with MY name
What the bot doesn't do:
- ❌ Impersonate someone else
- ❌ Spam or mislead
- ❌ Post irrelevant content
- ❌ Engage with others (I still do that manually)
My take: It's a tool, like a spell-checker or Grammarly. It helps me maintain presence, but I'm still responsible for the content.
I also:
- Review posts weekly (dry run first)
- Respond to all comments personally
- Occasionally edit AI suggestions
- Stay transparent (this article!)
How to Build Your Own
Step 1: Clone the Repo
```bash
git clone https://github.com/jasi381/LinkedInPostGenerator.git
cd LinkedInPostGenerator
pip install -r requirements.txt
```
Step 2: Get API Keys
Groq (Free):
- Visit https://console.groq.com
- Sign up
- Generate API key
LinkedIn:
- Create LinkedIn App (https://developer.linkedin.com)
- Get client ID + secret
- Run `linkedin_post.py` for the OAuth flow
- Save tokens
Step 3: Test Locally
```bash
# Dry run (no posting)
python linkedin_ai_poster.py --dry-run

# Review the generated post
# If it looks good, run for real:
python linkedin_ai_poster.py
```
Step 4: Set Up Automation
- Fork the repo
- Add secrets to GitHub Actions
- Customize workflow schedule
- Enable Actions in repo settings
That's it! Your bot is live.
The Bigger Picture
This project taught me something important: Automation isn't about replacing authenticity — it's about enabling consistency.
As developers, we have skills that most professionals don't:
- API integration
- AI prompt engineering
- CI/CD workflows
- System automation
Why not use these skills for career growth?
This LinkedIn bot:
- Saves me 2+ hours weekly
- Maintains my professional presence
- Showcases my automation skills
- Generates opportunities
And now, it's a portfolio piece too.
Try It Yourself
Full code is on GitHub: https://github.com/jasi381/LinkedInPostGenerator
What would you automate next?
I'm thinking:
- Cover letter generator (for job applications)
- GitHub contribution tracker (auto-posts weekly stats)
- Dev.to article sync (cross-platform publishing)
Drop your ideas in the comments! 👇