After getting laid off last year, I spent the first week applying to jobs manually. Copy-paste resume, customize cover letter, fill out forms. 8 hours a day, maybe 10 applications.
Then I wrote three Python scripts. My output went from 10 to 50+ targeted applications per day — and I got 4 interviews in the first week.
## Script 1: Job Aggregator
Instead of checking LinkedIn, Indeed, AngelList, and Hacker News separately, I built a script that pulls from all of them:
```python
import httpx
from bs4 import BeautifulSoup

def search_hn_jobs(keywords):
    """Search the latest Hacker News 'Who is Hiring' thread."""
    # HN has a free API — no scraping needed
    url = "https://hacker-news.firebaseio.com/v0/item/{}.json"

    # Find the latest "Who is Hiring" thread via the Algolia HN search API
    search = httpx.get(
        "https://hn.algolia.com/api/v1/search?query=who+is+hiring&tags=story"
    ).json()
    thread_id = search['hits'][0]['objectID']
    thread = httpx.get(url.format(thread_id)).json()

    jobs = []
    for kid_id in thread.get('kids', [])[:100]:
        # Deleted comments come back as null, so fall back to an empty dict
        comment = httpx.get(url.format(kid_id)).json() or {}
        # Comment text is HTML, so strip the tags before matching
        text = BeautifulSoup(comment.get('text', ''), 'html.parser').get_text()
        if any(kw.lower() in text.lower() for kw in keywords):
            jobs.append({
                'text': text[:500],
                'url': f'https://news.ycombinator.com/item?id={kid_id}',
            })
    return jobs

# Find Python + remote jobs
results = search_hn_jobs(['python', 'remote'])
print(f"Found {len(results)} matching jobs")
```
The key insight: HN "Who is Hiring" threads got me the highest response rate, because companies post there directly instead of going through recruiters.
## Script 2: Resume Keyword Matcher
Most companies use ATS (Applicant Tracking Systems) that filter by keywords. This script compares your resume against the job description:
```python
from collections import Counter
import re

def match_keywords(resume_text, job_description):
    """Find keywords in the job description that are missing from the resume."""
    # Extract meaningful words (skip common stop words and very short tokens)
    stop_words = {'the', 'a', 'an', 'is', 'are', 'was', 'and', 'or', 'to',
                  'in', 'for', 'of', 'with', 'on', 'at'}
    job_words = Counter(
        w.lower() for w in re.findall(r'\b\w+\b', job_description)
        if w.lower() not in stop_words and len(w) > 2
    )
    resume_words = set(w.lower() for w in re.findall(r'\b\w+\b', resume_text))
    missing = {word: count for word, count in job_words.most_common(30)
               if word not in resume_words}
    return missing

# Usage — job_posting is the job description you've saved as plain text
missing = match_keywords(open('resume.txt').read(), job_posting)
print("Keywords to add:", list(missing.keys())[:10])
```
After adding the top 5-10 missing keywords to my resume, my callback rate went from ~5% to ~15%.
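A natural extension of the same idea (my addition, not one of the three scripts): collapse the word lists into a single coverage score, so you can track per application how much of the posting's vocabulary your resume already hits.

```python
from collections import Counter
import re

def match_score(resume_text, job_description):
    """Percent of the job description's top keywords covered by the resume.

    Hypothetical helper built on the same stop-word filtering as above.
    """
    stop_words = {'the', 'a', 'an', 'is', 'are', 'was', 'and', 'or', 'to',
                  'in', 'for', 'of', 'with', 'on', 'at'}
    job_words = Counter(
        w.lower() for w in re.findall(r'\b\w+\b', job_description)
        if w.lower() not in stop_words and len(w) > 2
    )
    resume_words = set(w.lower() for w in re.findall(r'\b\w+\b', resume_text))
    top = [w for w, _ in job_words.most_common(30)]
    if not top:
        return 100.0
    covered = sum(1 for w in top if w in resume_words)
    return covered / len(top) * 100
```

One number per posting makes it easy to decide which tailored resume version to send.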
## Script 3: Application Tracker
I track every application in a simple SQLite database:
```python
import sqlite3
from datetime import datetime

def init_db():
    conn = sqlite3.connect('jobs.db')
    conn.execute('''CREATE TABLE IF NOT EXISTS applications (
        id INTEGER PRIMARY KEY,
        company TEXT, role TEXT, url TEXT,
        applied_date TEXT, status TEXT DEFAULT 'applied',
        response_date TEXT, notes TEXT
    )''')
    return conn

def add_application(company, role, url, notes=''):
    conn = init_db()
    conn.execute(
        'INSERT INTO applications (company, role, url, applied_date, notes) '
        'VALUES (?, ?, ?, ?, ?)',
        (company, role, url, datetime.now().isoformat(), notes)
    )
    conn.commit()
    print(f"✅ Applied to {company} — {role}")

def get_stats():
    conn = init_db()
    total = conn.execute('SELECT COUNT(*) FROM applications').fetchone()[0]
    by_status = conn.execute(
        'SELECT status, COUNT(*) FROM applications GROUP BY status'
    ).fetchall()
    print(f"Total: {total} applications")
    for status, count in by_status:
        print(f"  {status}: {count}")
```
Simple but effective. At a glance I know how many applications I've sent, my response rate, and which companies ghosted me.
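The response rate and the ghost list fall out of two more queries against the same `jobs.db` schema. A sketch — the `response_stats` helper and the 14-day ghost cutoff are my assumptions, not part of the original tracker:

```python
import sqlite3
from datetime import datetime, timedelta

def response_stats(db='jobs.db'):
    """Response rate plus companies that never replied (hypothetical helper)."""
    conn = sqlite3.connect(db)
    total = conn.execute('SELECT COUNT(*) FROM applications').fetchone()[0]
    responded = conn.execute(
        'SELECT COUNT(*) FROM applications WHERE response_date IS NOT NULL'
    ).fetchone()[0]
    # Treat anything still unanswered after 14 days as ghosted (my cutoff).
    # ISO-8601 strings compare correctly as plain text, so < works here.
    cutoff = (datetime.now() - timedelta(days=14)).isoformat()
    ghosted = [c for (c,) in conn.execute(
        'SELECT company FROM applications '
        'WHERE response_date IS NULL AND applied_date < ?', (cutoff,)
    ).fetchall()]
    rate = responded / total * 100 if total else 0.0
    return rate, ghosted
```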
## Results After 30 Days
- Applications sent: 312
- Responses: 47 (15% rate)
- Phone screens: 18
- Technical interviews: 8
- Offers: 2
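The same numbers read as a funnel, where each stage converts from the one before it. A quick sketch of that arithmetic (the `funnel_rates` helper is mine, not one of the three scripts):

```python
def funnel_rates(stages):
    """Conversion rate from each stage of the funnel to the next."""
    return [(name, n, n / prev)
            for (name, n), (_, prev) in zip(stages[1:], stages)]

funnel = [('applications', 312), ('responses', 47), ('phone screens', 18),
          ('technical interviews', 8), ('offers', 2)]
for name, n, rate in funnel_rates(funnel):
    print(f"{name}: {n} ({rate:.0%} of previous stage)")
```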
The two offers came from HN "Who is Hiring" and a direct GitHub connection. Not LinkedIn. Not Indeed.
## What Would You Automate?
Has anyone else automated their job search? What tools or scripts did you build?
I build developer tools and write about automation. Check out my 130+ open source repos and free API collection.