How I built 9 AI automation agents in Python to run my business 24/7 (full breakdown)

Tags: python, ai, automation, productivity


I'm a French freelance developer. Three months ago I was spending 4-5 hours per week on repetitive tasks: writing invoices, posting on LinkedIn, responding to basic support emails, monitoring competitors.

Today, 9 Python agents do all of that automatically. Here's the full architecture, what I learned, and the actual code patterns I use.


The 9 agents and what they do

| Agent | Task | Stack | Schedule |
| --- | --- | --- | --- |
| SAMBA | Writes + publishes 1 SEO blog article/week | Python + Gemini API + Cloudflare deploy | Every Monday 8am |
| MARIAMA | Posts on LinkedIn company page | Python + Make.com API | Every Sunday (prepares Monday 9am post) |
| LAMINE | Reads emails, auto-responds to FAQ | Python + IMAP + Gmail SMTP | Every 4 hours |
| NDEYE | Daily revenue report by email | Python + Gumroad API | Every day 8:30am |
| KOFI | AI news digest | Python + RSS + Gemini | Every day 8:15am |
| KOUMAN | Scans Reddit for qualified leads | Python + Reddit JSON API + Gemini | Every day 9am |
| FAKTOUR | Generates PDF invoices per sale | Python + fpdf2 | On demand |
| IBRAHIMA | Creates visual assets (covers, QR codes) | Python + Pillow + qrcode | On demand |
| ROKHAYA | CRM follow-up emails | Python + Gmail SMTP | Disabled (safety) |

Total monthly cost: ~0€ (all free tiers)


Architecture pattern: every agent looks the same

The key insight that made this scalable: every agent follows the same 5-step pattern.

# base_agent.py -- shared skeleton
from datetime import datetime
from pathlib import Path

from dotenv import load_dotenv

class BaseAgent:
    def __init__(self, name: str):
        self.name = name
        self.log_file = Path(f"agents/logs/{name.lower()}.log")
        self.log_file.parent.mkdir(parents=True, exist_ok=True)
        load_dotenv()

    def log(self, msg: str):
        ts = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        line = f"[{ts}] {self.name} | {msg}"
        print(line)
        with open(self.log_file, "a", encoding="utf-8") as f:
            f.write(line + "\n")

    def run(self):
        raise NotImplementedError

Every agent:

  1. Loads config from .env
  2. Fetches data from its source (API, email, RSS, Reddit)
  3. Processes with Gemini API (or rule-based logic)
  4. Acts (sends email, posts content, saves file)
  5. Logs everything for debugging
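In code, a subclass just fills in those steps. Here is an illustrative sketch (the agent name and methods are hypothetical; `BaseAgent` is restated in simplified form so the example runs standalone):

```python
# Illustrative subclass showing the 5-step pattern (names are made up).
# BaseAgent is stubbed here in simplified form so the sketch is self-contained.
class BaseAgent:
    def __init__(self, name: str):
        self.name = name

    def log(self, msg: str):
        print(f"{self.name} | {msg}")

    def run(self):
        raise NotImplementedError


class DigestAgent(BaseAgent):
    """Step 1 (config) happens in __init__; steps 2-5 live in run()."""

    def __init__(self):
        super().__init__("DIGEST")
        self.source_url = "https://example.com/feed"  # step 1: load config

    def fetch(self):
        # step 2: pull raw items from the source (stubbed here)
        return ["item one", "item two"]

    def process(self, items):
        # step 3: summarise (rule-based stand-in for a Gemini call)
        return f"{len(items)} new items"

    def act(self, summary):
        # step 4: act on the result (send, post, save); here we only log
        self.log(f"digest ready: {summary}")  # step 5: audit trail

    def run(self):
        self.act(self.process(self.fetch()))
```

The scheduler only ever calls `run()`, so every agent is interchangeable from the outside.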

Agent #1: NDEYE - Daily revenue report (simplest, build this first)

import os
from datetime import date

import httpx
from dotenv import load_dotenv

load_dotenv()
GUMROAD_TOKEN = os.getenv("GUMROAD_ACCESS_TOKEN")

def get_sales_today():
    """Fetch today's Gumroad sales."""
    r = httpx.get(
        "https://api.gumroad.com/v2/sales",
        params={"access_token": GUMROAD_TOKEN, "after": str(date.today())}
    )
    sales = r.json().get("sales", [])
    total = sum(float(s["price"]) / 100 for s in sales)
    return len(sales), total

def send_report():
    count, revenue = get_sales_today()
    body = f"""
    📊 WULIX Daily Report - {date.today()}

    Sales today: {count}
    Revenue today: {revenue:.2f}€

    - Agent NDEYE
    """
    # Send via Gmail SMTP (see full code below)
    send_email("Daily Revenue Report", body)

This took me 20 minutes to build and now I wake up every day with a revenue summary in my inbox. Start here.
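The full `send_email` helper didn't make it into this excerpt, but a minimal Gmail SMTP version might look like this (the `GMAIL_USER` / `GMAIL_APP_PASSWORD` variable names are my assumption; Gmail requires an app password for SMTP, not your account password):

```python
import os
import smtplib
from email.mime.text import MIMEText

def build_message(subject: str, body: str, user: str) -> MIMEText:
    """Build a plain-text message addressed to yourself."""
    msg = MIMEText(body, "plain", "utf-8")
    msg["Subject"] = subject
    msg["From"] = user
    msg["To"] = user  # the report goes to your own inbox
    return msg

def send_email(subject: str, body: str):
    """Send via Gmail SMTP over SSL.

    Assumes GMAIL_USER and GMAIL_APP_PASSWORD are set in .env
    (an app password, generated in your Google account settings).
    """
    user = os.getenv("GMAIL_USER")
    password = os.getenv("GMAIL_APP_PASSWORD")
    msg = build_message(subject, body, user)
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login(user, password)
        server.send_message(msg)
```

Splitting message construction from sending also makes the dry-run mode described later trivial to test.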


Agent #2: KOUMAN - Reddit lead prospector (most valuable)

The idea: scan relevant subreddits for people actively asking about automation, then use Gemini to qualify the lead and generate a response message.

import urllib.request
import json
import time

SUBREDDITS = ["r/n8n", "r/nocode", "r/zapier", "r/freelance_fr"]
KEYWORDS = ["automation", "n8n", "workflow", "need developer", "python script"]

def fetch_hot_posts(subreddit, limit=25):
    url = f"https://www.reddit.com/{subreddit}/hot.json?limit={limit}"
    req = urllib.request.Request(url, headers={"User-Agent": "MyBot/1.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        data = json.loads(resp.read())
    return [child["data"] for child in data["data"]["children"]]

def score_post(post):
    """Simple keyword scoring before calling Gemini (saves API quota)."""
    text = (post["title"] + " " + post.get("selftext", "")).lower()
    score = sum(1 for kw in KEYWORDS if kw in text)
    score -= 3 * any(bad in text for bad in ["free", "unpaid", "volunteer"])
    return max(0, score)

def qualify_with_gemini(post):
    """Only called for posts scoring >= 2."""
    time.sleep(4)  # Stay under 15 req/min free tier limit

    prompt = f"""Analyze this Reddit post. Is it a qualified lead for an AI automation service?

Title: {post['title']}
Text: {post.get('selftext', '')[:300]}

Return JSON: {{"qualified": bool, "score": 0-10, "response_message": "2-3 line natural reply"}}"""

    # Call Gemini API...
    result = call_gemini(prompt)
    return result

# Run daily: fetch → filter → qualify → email best leads

Results so far: This finds 3-7 genuinely qualified leads per week in r/n8n and r/nocode. I respond to the posts manually using the generated messages.
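The daily fetch → filter → qualify pipeline hinted at in the comment above can be sketched like this (`score_post` is restated from earlier so the sketch runs standalone; the `qualify` parameter stands in for `qualify_with_gemini` so the expensive call is easy to mock):

```python
KEYWORDS = ["automation", "n8n", "workflow", "need developer", "python script"]

def score_post(post):
    """Cheap keyword pre-filter (same logic as above)."""
    text = (post["title"] + " " + post.get("selftext", "")).lower()
    score = sum(1 for kw in KEYWORDS if kw in text)
    score -= 3 * any(bad in text for bad in ["free", "unpaid", "volunteer"])
    return max(0, score)

def run_daily_prospecting(posts, qualify):
    """filter -> qualify -> rank. `qualify` is e.g. qualify_with_gemini."""
    leads = []
    for post in posts:
        if score_post(post) < 2:   # skip weak posts before spending quota
            continue
        result = qualify(post)     # only now call Gemini
        if result.get("qualified"):
            leads.append((result["score"], post["title"],
                          result["response_message"]))
    leads.sort(reverse=True)       # best Gemini score first
    return leads[:5]               # top few leads to email yourself
```

Injecting the qualifier as a parameter also lets the whole pipeline run in dry-run mode with a stub instead of the real API.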


Agent #3: SAMBA - Autonomous SEO blog writer

Every Monday at 8am, SAMBA:

  1. Picks the next topic from a predefined list
  2. Generates a full article with Gemini
  3. Inserts it into blog.html at a marker comment
  4. Deploys to Cloudflare Pages via Wrangler

import os
from pathlib import Path

ARTICLE_TOPICS = [
    ("automatiser-emails-python", "Comment automatiser ses emails avec Python"),
    ("n8n-vs-make-2026", "n8n vs Make.com en 2026 : lequel choisir ?"),
    ("agent-ia-gemini-gratuit", "CrΓ©er un agent IA gratuit avec l'API Gemini"),
    # ... 20 more topics
]

def generate_article(slug, title):
    prompt = f"""Write a 600-word SEO article for a French AI automation blog.
    Title: {title}
    Include: practical tips, code example if relevant, CTA to wulix.fr
    Format: HTML with h2/h3 tags, no outer html/body tags."""

    return call_gemini(prompt)

def insert_into_blog(html_content, slug, title):
    """Insert at <!-- SAMBA_INSERT_POINT --> marker."""
    blog_path = Path("ui/blog.html")
    current = blog_path.read_text(encoding="utf-8")

    article_block = f"""
    <article id="{slug}">
        <h2>{title}</h2>
        {html_content}
    </article>
    """

    updated = current.replace(
        "<!-- SAMBA_INSERT_POINT -->",
        article_block + "\n<!-- SAMBA_INSERT_POINT -->"
    )
    blog_path.write_text(updated, encoding="utf-8")

def deploy_cloudflare():
    os.system("wrangler pages deploy ui --project-name wulix-pages")

6 articles already published. Google Search Console is indexing them.
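Step 1 ("picks the next topic") can be as simple as tracking published slugs in a small state file. A sketch, assuming a `published.json` path of my own choosing:

```python
import json
from pathlib import Path

ARTICLE_TOPICS = [
    ("automatiser-emails-python", "Comment automatiser ses emails avec Python"),
    ("n8n-vs-make-2026", "n8n vs Make.com en 2026 : lequel choisir ?"),
]

STATE_FILE = Path("agents/state/published.json")  # assumed location

def _load_done() -> set:
    if STATE_FILE.exists():
        return set(json.loads(STATE_FILE.read_text(encoding="utf-8")))
    return set()

def next_topic():
    """Return the first (slug, title) not yet published, else None."""
    done = _load_done()
    for slug, title in ARTICLE_TOPICS:
        if slug not in done:
            return slug, title
    return None  # topic list exhausted: time to add 20 more

def mark_published(slug: str):
    """Record a slug so Monday's run moves on to the next topic."""
    done = _load_done()
    done.add(slug)
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps(sorted(done)), encoding="utf-8")
```

The state file survives reboots, so a missed Monday simply shifts the queue rather than skipping a topic.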


Agent #4: MARIAMA - LinkedIn auto-poster (the tricky one)

Direct LinkedIn API posting requires w_organization_social scope, which is hard to get approved. My workaround: use the Make.com REST API to update an existing scenario's blueprint.

import os
import json

import httpx
from dotenv import load_dotenv

load_dotenv()
MAKE_API_KEY = os.getenv("MAKE_API_KEY")
SCENARIO_ID = 5342533  # your Make.com scenario ID
BASE_URL = "https://eu1.make.com/api/v2"

def get_blueprint():
    r = httpx.get(
        f"{BASE_URL}/scenarios/{SCENARIO_ID}/blueprint",
        headers={"Authorization": f"Token {MAKE_API_KEY}"}
    )
    return r.json()["response"]["blueprint"]

def update_linkedin_content(blueprint, new_text):
    """Find the LinkedIn module and update its content."""
    for module in blueprint.get("flow", []):
        if "mapper" in module and "content" in module["mapper"]:
            module["mapper"]["content"] = new_text
    return blueprint

def push_and_activate(blueprint):
    payload = {
        "blueprint": json.dumps(blueprint, ensure_ascii=False),
        "scheduling": json.dumps({"days": [1], "time": "09:00", "type": "weekly"})
    }
    # Update blueprint
    httpx.patch(
        f"{BASE_URL}/scenarios/{SCENARIO_ID}",
        headers={"Authorization": f"Token {MAKE_API_KEY}"},
        json=payload
    )
    # Activate (separate call required)
    httpx.post(
        f"{BASE_URL}/scenarios/{SCENARIO_ID}/start",
        headers={"Authorization": f"Token {MAKE_API_KEY}"}
    )

This runs every Sunday evening. Make.com then fires the LinkedIn post Monday at 9am. No LinkedIn API approval needed.


Infrastructure: Windows Task Scheduler (no server needed)

Everything runs locally on Windows using schtasks:

# Register daily agent
schtasks /Create /TN "WULIX_NDEYE_Daily" `
    /TR "python C:\path\to\agents\ndeye_agent.py" `
    /SC DAILY /ST 08:30 /F

# Register weekly agent
schtasks /Create /TN "WULIX_SAMBA_Weekly" `
    /TR "python C:\path\to\agents\samba_agent.py" `
    /SC WEEKLY /D MON /ST 08:00 /F

No cloud server, no cron job. Your local machine runs everything.


What I learned building this

1. Start with the simplest agent (NDEYE/revenue report). It's 40 lines and immediately useful.

2. Gemini free tier is enough: around 15 requests/minute and 1,500/day on the flash model, which is more than sufficient for 9 agents that each make a handful of calls.

3. Separate "dry run" from "real run" from day one:

DRY_RUN = os.getenv("ROKHAYA_DRY_RUN", "true").lower() == "true"
if DRY_RUN:
    print(f"[DRY RUN] Would send email to: {recipient}")
else:
    send_email(recipient, subject, body)

This saved me from sending 200 spam emails during testing.

4. One .env file for all agents - all API keys in one place, all agents import from it.

5. Rate limit everything - add time.sleep(4) between Gemini calls; you'll thank yourself later.
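Rather than sprinkling `time.sleep(4)` by hand, the gap can be enforced once with a small decorator (a sketch; `call_gemini` stands in for whatever Gemini client function you use):

```python
import time
import functools

def rate_limited(min_interval: float):
    """Guarantee at least `min_interval` seconds between calls."""
    def decorator(func):
        last_call = [0.0]  # mutable closure state shared across calls

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wait = min_interval - (time.monotonic() - last_call[0])
            if wait > 0:
                time.sleep(wait)  # still too soon: pause the difference
            last_call[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator

@rate_limited(4.0)  # ~15 calls/min, matching the free-tier pacing above
def call_gemini_limited(prompt):
    return call_gemini(prompt)  # your Gemini client function goes here
```

Every agent then imports `call_gemini_limited` and no call site can forget the pause.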


The results so far

  • 0€/month in infrastructure (all free tiers)
  • ~3h/week saved on repetitive tasks
  • 6 SEO articles published automatically
  • 20 LinkedIn posts queued (5 months of content)
  • 4 digital products on Gumroad: 5€ to 29€

What's next

I packaged the most reusable scripts into digital products at wulix.gumroad.com:

  • Pack Scripts Python (29€) - the 5 most useful agents as ready-to-deploy scripts
  • Pipeline LinkedIn n8n (19€) - the complete Make.com + LinkedIn workflow
  • Guide PDF (9€) - build 5 automations in 1 weekend
  • 50 AI Prompts (5€) - tested prompts for ChatGPT/Claude/Gemini

Custom automation services: wulix.fr

Happy to answer questions about any specific agent in the comments.


Omar Sylla - WULIX | AI Automation for Freelancers & SMBs
