Igor Ganapolsky

I built a $0/month automation stack using GitHub Actions free tier

After getting rejected from 21 job applications, I decided to build passive income instead. Three months later, I have 6 automation workflows running 24/7 that cost me exactly $0/month.

Here's the complete technical breakdown.

The Problem

I needed automation for:

  • Market scanning (checking prices across platforms)
  • Deal alerts (instant notifications when opportunities appear)
  • Lead monitoring (24/7 inbox watching)
  • Data aggregation (combining info from multiple sources)

Traditional solutions:

  • Zapier: $19.99/month ($240/year)
  • n8n Cloud: $20/month ($240/year)
  • AWS Lambda + EventBridge: ~$20/month

My solution: GitHub Actions free tier = $0/month

What GitHub Actions Gives You (Free)

  • 2,000 minutes/month for private repos (public repos get unlimited minutes on standard runners)
  • Ubuntu Linux runners with Python and Node.js preinstalled
  • Cron scheduling (minimum 5-minute intervals)
  • Built-in secret management (encrypted API keys)
  • Artifact storage for outputs
  • Full Git integration

My 6 Workflows

1. Market Scanner

Scans multiple platforms for pricing opportunities every 6 hours.

name: Market Scanner
on:
  schedule:
    - cron: '0 */6 * * *'
  workflow_dispatch:

jobs:
  scan:
    runs-on: ubuntu-latest
    permissions:
      contents: write  # lets the Commit Results step push back to the repo
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Run Scanner
        env:
          NTFY_TOPIC: ${{ secrets.NTFY_TOPIC }}
        run: |
          python scripts/market_scanner.py \
            --notify \
            --output-csv data/results.csv

      - name: Commit Results
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add data/
          git commit -m "Scan: $(date +%Y-%m-%d)" || true
          git push

Key insight: Using Git as your database. Every scan commits results to CSV. No external database needed.
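A minimal sketch of that pattern (the field names here are illustrative, not an exact schema): the scanner appends rows to a CSV, and the Commit Results step above versions the file.

# csv_store.py -- "Git as database": append rows to a CSV, let the workflow commit it
import csv
from datetime import datetime, timezone
from pathlib import Path

def append_results(rows, path="data/results.csv"):
    """Append scan rows to the CSV that the Commit Results step pushes back."""
    out = Path(path)
    out.parent.mkdir(parents=True, exist_ok=True)
    write_header = not out.exists()
    with out.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["scanned_at", "platform", "item", "price"])
        if write_header:
            writer.writeheader()
        for row in rows:
            writer.writerow({"scanned_at": datetime.now(timezone.utc).isoformat(), **row})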

2. Deal Alerts

Sends push notifications via ntfy.sh (also free) when high-value opportunities are detected.

# notification_helper.py
import os
import urllib.request

# Topic name comes from the NTFY_TOPIC secret the workflow exports as an env var
NTFY_TOPIC = os.environ["NTFY_TOPIC"]

def send_alert(title: str, message: str, priority: str = "default"):
    """Publish a push notification to ntfy.sh."""
    req = urllib.request.Request(
        f"https://ntfy.sh/{NTFY_TOPIC}",
        data=message.encode(),
        headers={
            "Title": title,
            "Priority": priority,
            "Tags": "moneybag,bell"
        }
    )
    urllib.request.urlopen(req, timeout=10)

Why ntfy.sh? Free, no auth required, works on iOS/Android, supports priority levels.
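To sanity-check the wiring before the first scheduled run, call the helper directly from a local shell with NTFY_TOPIC exported (the message text is just an example):

# quick manual test of notification_helper.py -- run with NTFY_TOPIC set in your shell
from notification_helper import send_alert

send_alert("Test alert", "ntfy.sh wiring works", priority="high")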

3. SAGE Feedback Loop

This is where it gets interesting. I built a Thompson Sampling-based system that learns from conversions. The excerpt below shows just the conversion-boost step; a sketch of the sampling side follows further down.

# sage_feedback_loop.py (excerpt -- the weights dict, normalize_weights, and
# save_weights are defined elsewhere in the same module)
def record_conversion(listing_id, platform, revenue):
    """Boost weights for successful conversions."""
    # Bigger payouts earn a bigger multiplicative boost
    boost_factor = 1.1 + (revenue / 100)

    if platform in weights["platform_weights"]:
        weights["platform_weights"][platform] *= boost_factor

    normalize_weights()  # keep the platform weights summing to 1.0
    save_weights()       # persist to JSON so the next run picks up the update

The system tracks:

  • Which platforms convert best
  • What price points work
  • Which times of day perform best
  • Category performance

Over time, it gets smarter about where to focus effort.
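If you haven't met Thompson Sampling before: keep a Beta posterior per platform built from conversion counts, draw a sample from each, and act on the best draw. Here's a minimal, self-contained sketch with made-up platform names and counts; the production SAGE code is more involved than this.

# thompson_sketch.py -- illustrative only, not the production SAGE code
import random

# Hypothetical per-platform (conversions, misses) counts
stats = {
    "facebook": (8, 40),
    "craigslist": (3, 12),
    "offerup": (5, 60),
}

def pick_platform(stats):
    """Draw a conversion-rate sample from each platform's Beta posterior, pick the max."""
    best, best_sample = None, -1.0
    for platform, (wins, losses) in stats.items():
        sample = random.betavariate(wins + 1, losses + 1)  # Beta(wins+1, losses+1) draw
        if sample > best_sample:
            best, best_sample = platform, sample
    return best

print(pick_platform(stats))  # the randomness balances exploration and exploitation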

4-6. Lead Monitor, Scout, Dashboard

All three follow the same pattern - scheduled Python scripts (see the skeleton after this list) that:

  1. Fetch data from APIs
  2. Process and filter
  3. Commit to Git (state persistence)
  4. Send notifications if needed
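A stripped-down skeleton of that shared pattern; the endpoint, field names, and score threshold below are placeholders:

# generic_job.py -- fetch -> filter -> persist -> notify skeleton (placeholders throughout)
import json
import urllib.request
from pathlib import Path

def fetch(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def main():
    items = fetch("https://example.com/api/leads")                 # 1. fetch data from an API
    hot = [i for i in items if i.get("score", 0) >= 80]            # 2. process and filter
    Path("data").mkdir(exist_ok=True)
    Path("data/leads.json").write_text(json.dumps(hot, indent=2))  # 3. the workflow commits this file
    if hot:                                                        # 4. notify only when needed
        from notification_helper import send_alert                 # helper from the Deal Alerts section
        send_alert("Hot leads", f"{len(hot)} new leads above threshold", priority="high")

if __name__ == "__main__":
    main()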

Usage Stats (3 Months)

Workflow         Frequency   Minutes/Month
Market Scanner   Every 6h    600
Deal Alerts      Every 4h    360
Lead Monitor     Every 2h    720
Scout            Daily       150
Dashboard        On push     80
SAGE             Every 4h    540

Total: ~2,450 minutes theoretical, ~1,400 actual (jobs exit early when nothing has changed)

That's roughly 70% of the 2,000-minute private-repo allowance, with headroom to spare.
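The early exits are just a cheap freshness check at the top of each script: if nothing has changed since the last committed state, the job ends in seconds. A minimal sketch, where the freshness check stands in for a single cheap API call:

# early_exit.py -- bail out before the expensive work when there's nothing new
import json
import sys
from pathlib import Path

STATE = Path("data/last_seen.json")

def latest_remote_timestamp():
    # placeholder for one cheap API call that returns the newest item's timestamp
    return "2025-01-01T00:00:00Z"

def main():
    last_seen = json.loads(STATE.read_text())["timestamp"] if STATE.exists() else ""
    newest = latest_remote_timestamp()
    if newest <= last_seen:
        print("Nothing new -- exiting early to save Actions minutes")
        sys.exit(0)
    # ...expensive scanning work would run below this line...
    STATE.parent.mkdir(exist_ok=True)
    STATE.write_text(json.dumps({"timestamp": newest}))

if __name__ == "__main__":
    main()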

Architecture

GitHub Actions (FREE)
    ↓
Python scripts (market_scanner.py, etc.)
    ↓
Data stored in Git repo (CSV + JSON)
    ↓
Triggers on update → Alert workflow
    ↓
ntfy.sh push notification (FREE)
    ↓
Mobile alert (iOS/Android)

No servers. No databases. No infrastructure costs.

Limitations (Be Honest)

  1. 2,000 min/month cap for private repos - GitHub Pro ($4/month) raises it to 3,000
  2. 5-minute minimum cron interval - no sub-minute scheduling
  3. 6-hour max job runtime - fine for most automation
  4. Public repos get unlimited minutes, but the code and committed data are public - the minute cap only matters if you go private
  5. ~15 min scheduling variance - cron triggers are not precise to the second

When NOT to Use This

  • Real-time applications (<5 minute latency)
  • High-frequency operations (1000s/day)
  • Sensitive data processing (use private repos or self-hosted)
  • Video/image processing (burns minutes fast)

Results

After 3 months:

  • Storage arbitrage: Found 12 opportunities with $120-180/month spreads
  • Tool rental (JIT model): $800 profit first month
  • Lead response time: 35% faster → higher conversion

The workflows paid for themselves immediately.

Get Started

I packaged everything into downloadable templates:

FREE tier (no email required): Download on Gumroad

  • 1 starter workflow
  • Setup guide
  • ntfy.sh integration

Full bundle ($79): Get All 6 Workflows

  • All 6 production workflows
  • SAGE feedback loop system
  • 20+ pages of documentation

Questions?

Drop a comment below. Happy to share:

  • Specific workflow patterns
  • Error handling strategies
  • Scaling approaches
  • Real performance data

This is part of my "Building in Public" series. I got rejected from 21 jobs, so I'm documenting my journey to $30/day passive income instead.

Follow for updates: @IgorGanapolsky
