Mindy Jen
Building LinkedIn Job Application Agents - Part 1

The Problem: Job Applications Are a Time Sink

Job searching is broken. Candidates spend hours crafting resumes for each position, writing personalized cover letters, and manually filling out repetitive application forms. What if we could automate this entire process using AI?

That’s exactly what I set out to build: HunterAgent — an AI-powered system that automatically discovers jobs, optimizes resumes, generates cover letters, and submits applications on behalf of job seekers.

My Plan (Mar. 2026): Full Automation Stack

The vision was ambitious:

  • Job Discovery Agent: Scrape job sites using web automation
  • Resume Optimization Agent: AI-powered resume customization for each job
  • Cover Letter Agent: Generate personalized cover letters
  • Application Submission Agent: Automate form filling and submission
  • Email Notification Agent: Keep users informed of all activities

A technical deep-dive into integrating AI agents with a Streamlit UI and Supabase for job application automation.

The Journey So Far...

Today marks a significant milestone in the HunterAgent project — I’ve successfully transformed three standalone AI agents into a fully integrated web application with persistent data storage. This blog post chronicles the technical challenges, architectural decisions, and breakthrough moments that brought our job application automation system to life.

What Was Built Today:

Core Achievement: Complete System Integration

  • 3 AI Agents fully integrated with Streamlit UI
  • Supabase Database with 11 production tables
  • Real-time Processing with async/await patterns
  • Persistent Data Storage for all agent results
  • Production-Ready Application running at localhost:8501

Three AI Agents

Resume Optimization Agent

Purpose: AI-powered resume customization for specific job applications

Key Features:

  • Resume parsing and content extraction
  • AI-powered optimization with company research
  • Quality scoring and match analysis
  • Multiple optimization approaches (keyword-focused, skills-based, experience-focused)
  • Downloadable optimized resumes
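
To make the "match analysis" feature concrete, here is a deliberately naive sketch of one way a keyword-focused match score could be computed. The real agent uses AI-powered analysis; the function below is an illustration of the idea only, and its name and scoring rule are my own.

```python
import re

def keyword_match_score(resume_text: str, job_description: str) -> int:
    """Illustrative match score: the share of distinct job-description
    keywords (4+ letters) that also appear in the resume, scaled 0-100."""
    def words(text: str) -> set[str]:
        return set(re.findall(r"[a-zA-Z]{4,}", text.lower()))

    job_keywords = words(job_description)
    if not job_keywords:
        return 0
    overlap = job_keywords & words(resume_text)
    return round(100 * len(overlap) / len(job_keywords))
```

A keyword-focused optimization pass can then target the job keywords that are missing from the overlap set.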

Cover Letter Generation Agent

Purpose: Personalized cover letter creation with company research

Key Features:

  • Multiple letter tones (professional, enthusiastic, confident, creative)
  • Company research integration
  • Candidate information processing
  • Cover letter library with search and filtering
  • Quality scoring and personalization metrics

Job Discovery Agent

Purpose: Intelligent job search with market analysis

Key Features:

  • Multi-criteria job search
  • Market trend analysis
  • Company research capabilities
  • Saved jobs management
  • Real-time filtering and sorting
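
The saved-jobs filtering and sorting above can be sketched as a plain list operation over job dicts. The keys (`title`, `location`, `match_score`) are hypothetical stand-ins for whatever shape the discovery agent actually returns.

```python
def filter_and_sort_jobs(jobs, min_score=0, location=None, sort_key="match_score"):
    """Filter saved-job dicts by score and location, best match first."""
    results = [
        job for job in jobs
        if job.get("match_score", 0) >= min_score
        and (location is None
             or location.lower() in job.get("location", "").lower())
    ]
    return sorted(results, key=lambda j: j.get(sort_key, 0), reverse=True)
```

In the Streamlit UI, the filter values would come straight from sidebar widgets and the function re-runs on every interaction, which is what makes the filtering feel real-time.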

Technical Architecture

OpenAI Responses API Integration

We chose OpenAI’s new Responses API for its structured output capabilities and web search integration. This allows our agents to:

  1. Access real-time company information
  2. Generate structured JSON responses
  3. Handle complex multi-step reasoning
  4. Integrate web search seamlessly
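
As a rough sketch, a company-research call with web search enabled looks like this. The prompt, function name, and JSON keys are illustrative; the SDK surface shown (`responses.create`, the `web_search_preview` tool, `output_text`) reflects the OpenAI Python SDK at the time of writing, so check the current docs before relying on it.

```python
def build_research_request(company: str) -> dict:
    """Assemble kwargs for an illustrative company-research call."""
    return {
        "model": "gpt-4o-mini",
        "tools": [{"type": "web_search_preview"}],  # let the model search the web
        "input": (
            f"Research {company} and summarize, as JSON with keys "
            "'mission', 'recent_news', and 'culture', what a job "
            "applicant should know."
        ),
    }

if __name__ == "__main__":
    # Requires the `openai` package and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()
    response = client.responses.create(**build_research_request("Acme Corp"))
    print(response.output_text)
```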

Streamlit UI Framework

Streamlit provided the perfect balance of simplicity and functionality:

  1. Rapid Development: Built complete UI in hours, not days
  2. Session State Management: Persistent data across user interactions
  3. Async Support: Native support for async/await patterns
  4. Component Ecosystem: Rich set of UI components out of the box
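
The session-state and async points above combine into a pattern like the following. `run_agent` is a stand-in for one of the real agent entry points, not the project's actual code.

```python
import asyncio

async def run_agent(task: str) -> dict:
    """Stand-in for one of the real async agent entry points."""
    await asyncio.sleep(0)  # placeholder for real agent work
    return {"task": task, "success": True}

def main():
    import streamlit as st  # imported here so the helper above stays testable

    # st.session_state persists across reruns, so results survive interactions.
    if "results" not in st.session_state:
        st.session_state.results = []

    task = st.text_input("Job title to research")
    if st.button("Run agent") and task:
        # Bridge Streamlit's synchronous script model to the async agents.
        result = asyncio.run(run_agent(task))
        st.session_state.results.append(result)

    for item in st.session_state.results:
        st.json(item)

if __name__ == "__main__":
    main()
```

Each button click triggers a rerun of the whole script, and the accumulated results survive because they live in `st.session_state` rather than in local variables.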

Supabase Database Integration

Supabase offered the ideal backend solution:

  1. PostgreSQL: Full SQL capabilities with JSON support
  2. Real-time Features: Live updates and subscriptions
  3. REST API: Easy integration with Python
  4. Row Level Security: Built-in security features

Database Schema Design

Core Tables
-- Users
CREATE TABLE public.users (
    id UUID PRIMARY KEY,
    email TEXT UNIQUE NOT NULL,
    full_name TEXT,
    avatar_url TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Resume templates
CREATE TABLE public.resume_templates (
    id UUID PRIMARY KEY,
    user_id UUID REFERENCES public.users(id),
    name TEXT NOT NULL,
    content TEXT NOT NULL,
    file_url TEXT,
    is_default BOOLEAN DEFAULT FALSE,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

Agent-Specific Tables
-- Agent results
CREATE TABLE public.agent_results (
    id UUID PRIMARY KEY,
    user_id UUID REFERENCES public.users(id),
    agent_type TEXT NOT NULL,
    task_type TEXT NOT NULL,
    input_data JSONB NOT NULL,
    output_data JSONB NOT NULL,
    success BOOLEAN NOT NULL,
    error_message TEXT,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Cover letters
CREATE TABLE public.cover_letters (
    id UUID PRIMARY KEY,
    user_id UUID REFERENCES public.users(id),
    job_title TEXT NOT NULL,
    company_name TEXT NOT NULL,
    cover_letter_content TEXT NOT NULL,
    quality_score INT,
    generation_type TEXT,
    agent_result_id UUID REFERENCES public.agent_results(id),
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Optimized resumes
CREATE TABLE public.optimized_resumes (
    id UUID PRIMARY KEY,
    user_id UUID REFERENCES public.users(id),
    original_resume_id UUID REFERENCES public.resume_templates(id),
    job_title TEXT NOT NULL,
    company_name TEXT NOT NULL,
    optimized_content TEXT NOT NULL,
    job_match_score INT,
    optimization_type TEXT,
    agent_result_id UUID REFERENCES public.agent_results(id),
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);
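
From Python, persisting an agent run into the agent_results table is a short call with supabase-py. The row-building helper and its field values are illustrative, but the columns match the schema above, and the dicts serialize directly into the JSONB columns.

```python
from datetime import datetime, timezone

def build_agent_result_row(user_id, agent_type, task_type,
                           input_data, output_data, success,
                           error_message=None):
    """Shape a row for the agent_results table defined above."""
    return {
        "user_id": user_id,
        "agent_type": agent_type,
        "task_type": task_type,
        "input_data": input_data,    # JSONB column: dicts serialize directly
        "output_data": output_data,
        "success": success,
        "error_message": error_message,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # Requires the `supabase` package plus SUPABASE_URL / SUPABASE_KEY env vars.
    import os
    from supabase import create_client

    client = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])
    row = build_agent_result_row("some-user-uuid", "cover_letter", "generate",
                                 {"job_title": "Engineer"}, {"letter": "..."}, True)
    client.table("agent_results").insert(row).execute()
```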

Technical Challenges Overcome

  1. LinkedIn scraping via Windows MCP server (repost + 100-applicant filter active)
  2. Job scoring with GPT-4o-mini (ATS fit score 0–100)
  3. Resume tailoring + cover letter generation with GPT-4o-mini
  4. LinkedIn Easy Apply automation (Playwright)
  5. Email digest
  6. Dashboard with a full-pipeline button

Before each session, start the MCP server in Windows PowerShell:

uvx linkedin-scraper-mcp --transport streamable-http --host 0.0.0.0 --port 8765
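A connectivity check like the one below is one way to back a "connected" status indicator for the MCP server; it only tests that something is listening on the port, not that the MCP protocol itself is healthy, and it is a sketch rather than the project's actual check.

```python
import socket

def mcp_server_reachable(host="127.0.0.1", port=8765, timeout=2.0):
    """Return True if something accepts TCP connections on the MCP port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```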

HunterAgent — Step-by-Step Guide

First-Time Setup (do once)

  • Start the LinkedIn MCP server — open Windows PowerShell and run:
uvx linkedin-scraper-mcp --transport streamable-http --host 0.0.0.0 --port 8765

Leave this window open in the background.

  • Start the app — in your terminal:
streamlit run streamlit_app.py
  • Check the dashboard — confirm the green "LinkedIn MCP server: connected" status

  • Configure your profile — open the app and go to Settings:
    • Keywords: software engineer, python developer (comma-separated)
    • Location: Remote or New York, NY
    • Resume Text: paste your full resume
    • Companies Not Included: any companies to skip
    • Notification Email: your Gmail address
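
Normalizing those raw Settings fields into a profile dict might look like this. The field names mirror the form above, but the function and dict shape are illustrative, not the app's actual code.

```python
def parse_profile_settings(keywords, location, resume_text,
                           excluded_companies="", email=""):
    """Turn raw Settings form values into a clean profile dict."""
    def split_csv(value):
        # Comma-separated fields become stripped, non-empty lists.
        return [item.strip() for item in value.split(",") if item.strip()]

    return {
        "keywords": split_csv(keywords),
        "location": location.strip(),
        "resume_text": resume_text.strip(),
        "excluded_companies": split_csv(excluded_companies),
        "notification_email": email.strip(),
    }
```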
