Vhub Systems

How to Scrape LinkedIn Job Listings and Track Hiring Trends (Using Apify)

If you work in recruiting, HR analytics, or B2B sales, you already know the problem: LinkedIn has the world's best hiring data — and almost no affordable way to access it programmatically.

LinkedIn Premium starts at $40/month. LinkedIn Recruiter seats run $170/month per user. Sales Navigator is $80–$130/month per seat. And none of these give you a clean, exportable data feed. You get a search UI, a handful of filters, and a manual copy-paste loop that eats 10+ hours a week.

This article shows how to solve that with linkedin-job-scraper — an Apify actor that pulls structured job listing data directly, without a seat license, for a few dollars per run.


The Problem: LinkedIn Job Data Is Locked Behind Expensive Walls

Here's what a typical workflow looks like for a recruiting agency tracking hiring demand across 6 industry verticals:

  • Search LinkedIn manually for each job title + location combo
  • Copy 30–50 listings into a spreadsheet
  • Note company, role, date, location
  • Repeat across 6 verticals
  • Do it again next week

That's 10+ hours per week before any analysis happens. And it's not just agencies — HR analysts, labor market researchers, sales teams prospecting companies actively hiring, and no-code builders automating recruiting pipelines all face the same wall.

LinkedIn's API is effectively closed unless you're an approved enterprise partner. Browser extensions risk account bans. DIY Python scrapers work until LinkedIn updates their layout (usually within 2–4 weeks).


The Actor: linkedin-job-scraper

The linkedin-job-scraper actor runs on Apify's managed infrastructure, so there's no account risk on your side and no maintenance when LinkedIn's layout changes.

What it does:

  • Accepts job search parameters (title, location, result limit)
  • Runs headless scraping on Apify's cloud
  • Returns structured JSON with job listing fields
  • Outputs directly to storage, ready for CSV export or API integration

Input schema (simple):

{
  "jobTitle": "Data Engineer",
  "location": "New York, NY",
  "limit": 100
}

Key output fields per listing:

| Field | Description |
| --- | --- |
| jobTitle | Role name as listed |
| company | Hiring company name |
| location | City/state/remote tag |
| postedDate | When the listing went live |
| jobDescription | Full description text |
| skills | Skills listed in the posting (when available) |

Cost model: Pay-per-result. At approximately $5 per 1,000 job listings, a 100-listing run costs around $0.50. Compare that to $170/month for a LinkedIn Recruiter seat — and the Recruiter seat doesn't give you a data export.
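The pay-per-result math is easy to sanity-check before a run. A tiny helper like this (just for budgeting, using the ~$5 per 1,000 rate quoted above):

```python
def run_cost(listings, price_per_1k=5.0):
    """Estimate pay-per-result cost for a single run, in dollars."""
    return listings / 1000 * price_per_1k

# A 100-listing run at ~$5 per 1,000 results
print(run_cost(100))       # 0.5
# Four weekly 500-listing runs in a month
print(run_cost(500) * 4)   # 10.0
```

Even at the monthly scale of a weekly tracker, that stays an order of magnitude below a single Recruiter seat.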


Quick Start: 3 Steps

Step 1: Create a free Apify account

Go to apify.com and sign up. New accounts get a free monthly usage credit.

Step 2: Run the actor

Navigate to linkedin-job-scraper, click Try for free, and fill in your input:

{
  "jobTitle": "Software Engineer",
  "location": "San Francisco, CA",
  "limit": 50
}

Click Start. The run completes in a few minutes depending on limit size.

Step 3: Export your data

When the run finishes, click Export and choose CSV or JSON. You'll get a structured table with one row per job listing, ready to drop into Google Sheets, Excel, or any data pipeline.
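If you'd rather consume the CSV in code than in a spreadsheet, Python's stdlib `csv` module is enough. The column names here assume the output fields listed earlier; the inline sample stands in for the real `linkedin_jobs.csv`:

```python
import csv
import io

# Stand-in for open("linkedin_jobs.csv") -- same columns as the export
sample = io.StringIO(
    "jobTitle,company,location,postedDate\n"
    'Senior Software Engineer,Acme Corp,"San Francisco, CA",2026-03-25\n'
)

# DictReader gives one dict per listing, keyed by the header row
rows = list(csv.DictReader(sample))
print(rows[0]["company"])   # Acme Corp
print(rows[0]["location"])  # San Francisco, CA
```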

Sample output (JSON):

[
  {
    "jobTitle": "Senior Software Engineer",
    "company": "Acme Corp",
    "location": "San Francisco, CA",
    "postedDate": "2026-03-25",
    "jobDescription": "We are looking for a senior engineer...",
    "skills": ["Python", "AWS", "Kubernetes"]
  },
  {
    "jobTitle": "Software Engineer II",
    "company": "TechStartup Inc",
    "location": "Remote",
    "postedDate": "2026-03-26",
    "jobDescription": "Join our growing engineering team...",
    "skills": ["React", "Node.js", "PostgreSQL"]
  }
]
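Records like the above flatten cleanly for spreadsheets. The only wrinkle is the `skills` array; one common approach is to join it into a single cell before writing a row:

```python
import json

# Trimmed version of the sample output above
raw = """[
  {"jobTitle": "Senior Software Engineer", "company": "Acme Corp",
   "location": "San Francisco, CA", "postedDate": "2026-03-25",
   "skills": ["Python", "AWS", "Kubernetes"]}
]"""

# Join the skills list so each listing becomes one flat row
rows = [
    {**job, "skills": ", ".join(job.get("skills", []))}
    for job in json.loads(raw)
]
print(rows[0]["skills"])  # Python, AWS, Kubernetes
```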

Real Use Case: Weekly Hiring Trend Tracker in Google Sheets

Here's the workflow a small recruiting agency uses to track which roles are growing week-over-week:

Setup (one-time):

  1. Create a Google Sheet with columns: Week, Role, Company, Location, Posted Date
  2. Connect Apify to Google Sheets via Zapier integration or the Apify Google Sheets integration directly
  3. Schedule a weekly actor run (Sunday night, so data is ready Monday morning)
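The schedule in step 3 can also be created via Apify's API rather than the UI. This sketch only builds the request payload; the field names (`cronExpression`, `isEnabled`, `timezone`) are assumptions based on Apify's schedules API, so verify them against the API docs before posting. `0 23 * * 0` is standard cron for 11 PM every Sunday:

```python
# Hypothetical payload for POST https://api.apify.com/v2/schedules
# (field names assumed from Apify's schedules API -- verify before use)
schedule = {
    "name": "weekly-linkedin-jobs",
    "cronExpression": "0 23 * * 0",  # 11 PM every Sunday
    "isEnabled": True,
    "timezone": "America/New_York",
}
print(schedule["cronExpression"])
```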

The weekly output:

Each week you get a new batch of listings per role/location combo. After 4–6 weeks, patterns emerge:

  • Which companies opened 5+ new roles this week (scaling signal)
  • Which roles show week-over-week growth in your target vertical
  • Which locations are heating up for specific skills

Chart it in Google Sheets:

Create a pivot table: Role on rows, Week on columns, COUNT(company) as values. You now have a hiring trend chart showing which roles are growing, which are flat, and which have dropped off.
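The same pivot takes a few lines of stdlib Python if you'd rather compute it outside Sheets. The sample data is illustrative; in practice you'd derive `(week, role)` pairs from your exported listings:

```python
from collections import Counter

# (week, role) pairs derived from exported listings (illustrative sample)
listings = [
    ("2026-W13", "Data Engineer"),
    ("2026-W13", "Data Engineer"),
    ("2026-W14", "Data Engineer"),
    ("2026-W13", "Software Engineer"),
]

# COUNT of listings per Role x Week, same as the Sheets pivot
pivot = Counter(listings)
print(pivot[("2026-W13", "Data Engineer")])  # 2
print(pivot[("2026-W14", "Data Engineer")])  # 1
```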

This is the kind of analysis LinkedIn Recruiter is priced to prevent: it's available in their UI only as search results, never as a structured dataset you can analyze over time.


Cost vs. Manual: The Real Numbers

| Method | Monthly Cost | Hours/Week | Data Export |
| --- | --- | --- | --- |
| Manual LinkedIn search | $0 (or $40+ Premium) | 10+ hours | No |
| LinkedIn Recruiter | $170/seat/month | ~2 hours | Limited |
| LinkedIn Sales Navigator | $80–$130/seat/month | ~3 hours | No |
| linkedin-job-scraper | ~$5–$20/month | < 1 hour | Yes (CSV/JSON) |

The math is straightforward. If your time is worth $25/hour, 10 hours/week of manual searching costs $1,000/month in opportunity cost — before the Premium subscription. A weekly automated scrape run costs under $20/month and runs while you sleep.


Next Steps: Automate It

Once you've run the actor manually a few times:

1. Schedule weekly runs

In Apify, go to your actor → Schedules → Create schedule. Set it to run every Sunday at 11 PM. Your data will be waiting Monday morning.

2. Export to CSV automatically

Use the Apify API to pull results programmatically:

import requests

# Replace with your API token and the run's default dataset ID
# (each run writes its results to a dataset; use that dataset's ID here)
token = "YOUR_APIFY_TOKEN"
dataset_id = "YOUR_DATASET_ID"

url = f"https://api.apify.com/v2/datasets/{dataset_id}/items?format=csv&clean=true"
response = requests.get(url, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()

with open("linkedin_jobs.csv", "wb") as f:
    f.write(response.content)

print("Downloaded", len(response.content), "bytes")

3. Build a hiring trend dashboard

Once you have 4+ weeks of data, pipe it into Looker Studio (free), Metabase, or any BI tool. Track role growth rates, company hiring velocity, and skill demand — the metrics that previously required a $170/month seat per analyst.
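Once the weekly counts exist, role growth rate is a one-line calculation. A sketch with illustrative numbers (guarding against the no-listings-last-week case):

```python
def wow_growth(prev_count, curr_count):
    """Week-over-week growth rate; None when the previous week had no listings."""
    if prev_count == 0:
        return None
    return (curr_count - prev_count) / prev_count

# 40 listings for a role last week, 50 this week -> 25% growth
print(wow_growth(40, 50))  # 0.25
```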


Bottom Line

LinkedIn job data is valuable, and LinkedIn's pricing reflects that. But for recruiters, HR analysts, and anyone building hiring intelligence pipelines, paying $170/month per seat for a search UI with no export is a difficult case to make.

The linkedin-job-scraper actor costs a fraction of that, returns structured data, and runs automatically on a schedule. The 90.9% success rate across 33+ runs in the last 30 days means it's production-stable.

If you're currently doing this manually, the setup takes under 30 minutes. Your first run is free on Apify's trial credit.


Have questions about setting up the Google Sheets pipeline or the scheduling workflow? Drop them in the comments.
