agenthustler

Building a Political Donor Tracker with FEC Campaign Finance Data

Campaign finance data is public by law. The FEC discloses every itemized contribution to federal candidates and committees, meaning individual donations totaling over $200 per election cycle. This data powers investigative journalism, civic tech, and political research. Here's how to build a donor tracking system.

FEC Data Sources

  • Bulk data: Downloadable CSV files updated nightly
  • API: REST API at api.open.fec.gov (free API key required)
  • Website: fec.gov with searchable database
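If you go the bulk-data route, the individual-contribution ("indiv") files are pipe-delimited with no inline header row; the header ships as a separate file. A minimal parser might look like the sketch below. The field list follows the FEC's published header for that dataset, but verify it against the current header file before relying on it, and note the sample record is made up.

```python
# Field layout for the FEC "indiv" (individual contributions) bulk file.
# NOTE: names/order taken from the FEC's published header file for this
# dataset -- check the current header before trusting it.
INDIV_FIELDS = [
    "CMTE_ID", "AMNDT_IND", "RPT_TP", "TRANSACTION_PGI", "IMAGE_NUM",
    "TRANSACTION_TP", "ENTITY_TP", "NAME", "CITY", "STATE", "ZIP_CODE",
    "EMPLOYER", "OCCUPATION", "TRANSACTION_DT", "TRANSACTION_AMT",
    "OTHER_ID", "TRAN_ID", "FILE_NUM", "MEMO_CD", "MEMO_TEXT", "SUB_ID",
]

def parse_indiv_line(line):
    """Parse one pipe-delimited record into a dict; amount becomes a float."""
    values = line.rstrip("\n").split("|")
    record = dict(zip(INDIV_FIELDS, values))
    if record.get("TRANSACTION_AMT"):
        record["TRANSACTION_AMT"] = float(record["TRANSACTION_AMT"])
    return record

# Demo with a fabricated record (not real donor data):
sample = "|".join([
    "C00123456", "N", "Q1", "P", "202401159000000001", "15", "IND",
    "DOE, JANE", "DENVER", "CO", "80202", "ACME CORP", "ENGINEER",
    "01152024", "500", "", "", "123456", "", "", "4011520241000000001",
])
rec = parse_indiv_line(sample)
print(rec["NAME"], rec["TRANSACTION_AMT"])
```

The bulk files are the better choice when you need a whole cycle's worth of contributions; the API is better for targeted queries.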

Using the FEC API

import requests
import time
from collections import defaultdict

class FECTracker:
    """Thin wrapper around the OpenFEC API (api.open.fec.gov)."""

    BASE_URL = "https://api.open.fec.gov/v1"

    def __init__(self, api_key):
        self.api_key = api_key
        self.session = requests.Session()
        # Session-level params are merged into every request
        self.session.params = {"api_key": api_key}

    def search_donors(self, name=None, employer=None, state=None,
                      min_amount=None, cycle=2024):
        """Search itemized contributions (Schedule A). Returns the first page."""
        params = {
            "two_year_transaction_period": cycle,
            "per_page": 100,  # the API's maximum page size
            "sort": "-contribution_receipt_amount",
        }
        if name:
            params["contributor_name"] = name
        if employer:
            params["contributor_employer"] = employer
        if state:
            params["contributor_state"] = state
        if min_amount:
            params["min_amount"] = min_amount
        resp = self.session.get(f"{self.BASE_URL}/schedules/schedule_a/", params=params)
        resp.raise_for_status()
        return resp.json().get("results", [])

    def get_candidate_totals(self, candidate_id):
        """Aggregate financial totals for a candidate."""
        resp = self.session.get(f"{self.BASE_URL}/candidate/{candidate_id}/totals/")
        resp.raise_for_status()
        return resp.json().get("results", [])
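Note that search_donors returns only the first 100 rows. The schedule_a endpoint paginates with a keyset cursor: each response carries a pagination.last_indexes object that you pass back into the next request. Here's a generic page-walker sketched with a pluggable fetch function, so the loop itself can be exercised without hitting the network:

```python
def iter_pages(fetch, max_pages=10):
    """Yield rows across pages.

    `fetch(cursor)` must return (rows, next_cursor), with next_cursor None
    on the last page. For schedule_a, the cursor would be the
    `pagination.last_indexes` dict from each response.
    """
    cursor, pages = None, 0
    while pages < max_pages:
        rows, cursor = fetch(cursor)
        if not rows:
            break
        yield from rows
        pages += 1
        if cursor is None:
            break

# Offline demo with a fake three-page data source:
def fake_fetch(cursor):
    pages = {None: ([1, 2], "a"), "a": ([3, 4], "b"), "b": ([5], None)}
    return pages[cursor]

print(list(iter_pages(fake_fetch)))  # -> [1, 2, 3, 4, 5]
```

The max_pages cap matters in practice: popular committees can have millions of itemized rows, and an unbounded loop will burn through your rate limit.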

Analyzing Donor Patterns by Industry

def analyze_industry_donations(tracker, employer_keywords, cycle=2024):
    industry_totals = defaultdict(lambda: {"total": 0, "count": 0, "donors": set()})
    for keyword in employer_keywords:
        donations = tracker.search_donors(employer=keyword, cycle=cycle)
        for d in donations:
            amount = d.get("contribution_receipt_amount") or 0  # field can be None
            donor = d.get("contributor_name", "Unknown")
            industry_totals[keyword]["total"] += amount
            industry_totals[keyword]["count"] += 1
            industry_totals[keyword]["donors"].add(donor)
        time.sleep(0.5)  # stay well under the API rate limit
    # Replace raw donor sets with counts so results are serializable
    for k in industry_totals:
        industry_totals[k]["unique_donors"] = len(industry_totals[k]["donors"])
        del industry_totals[k]["donors"]
    return dict(industry_totals)

tech_companies = ["Google", "Meta", "Amazon", "Microsoft", "Apple", "OpenAI"]
results = analyze_industry_donations(FECTracker("YOUR_FEC_KEY"), tech_companies)
for company, data in sorted(results.items(), key=lambda x: x[1]["total"], reverse=True):
    print(f"{company}: ${data['total']:,.0f} from {data['unique_donors']} donors")
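One caveat: the employer field in FEC filings is free text, so the same company shows up as "GOOGLE LLC", "Google, Inc.", "GOOGLE", and so on, and keyword searches will split or double-count them. A quick normalizer helps before grouping; the suffix list here is illustrative, not exhaustive:

```python
import re

# Common corporate suffixes to strip; extend for your dataset.
_SUFFIXES = re.compile(r"\b(INC|LLC|LLP|CORP|CO|LTD|COMPANY|CORPORATION)\b\.?")

def normalize_employer(raw):
    """Uppercase, drop punctuation and corporate suffixes, collapse spaces."""
    if not raw:
        return ""
    s = raw.upper()
    s = re.sub(r"[.,]", " ", s)
    s = _SUFFIXES.sub(" ", s)
    return re.sub(r"\s+", " ", s).strip()

print(normalize_employer("Google, Inc."))  # -> GOOGLE
print(normalize_employer("GOOGLE LLC"))    # -> GOOGLE
```

Run every contributor_employer value through this (or something stricter) before keying your totals dict on it.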

Building a Donor Network Graph

def build_donor_network(tracker, committee_ids):
    """Find donors shared between committees (e.g. candidates' principal
    campaign committees). Schedule A filters by committee_id, so query the
    endpoint directly for each committee rather than reusing search_donors,
    which has no committee filter."""
    committee_donors = {}
    for cid in committee_ids:
        resp = tracker.session.get(
            f"{tracker.BASE_URL}/schedules/schedule_a/",
            params={"committee_id": cid, "min_amount": 1000, "per_page": 100},
        )
        resp.raise_for_status()
        donors = set()
        for d in resp.json().get("results", []):
            # Normalize names so the same donor matches across committees
            donors.add(d.get("contributor_name", "").upper().strip())
        committee_donors[cid] = donors
        time.sleep(1)

    overlaps = {}
    committees = list(committee_donors.keys())
    for i in range(len(committees)):
        for j in range(i + 1, len(committees)):
            shared = committee_donors[committees[i]] & committee_donors[committees[j]]
            if shared:
                pair = f"{committees[i]} <-> {committees[j]}"
                overlaps[pair] = {"shared_donors": len(shared), "sample": list(shared)[:10]}
    return overlaps
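Raw shared-donor counts favor committees with huge donor bases. If you want overlap scores that are comparable across pairs, Jaccard similarity (shared donors as a fraction of the combined donor pool) drops straight into the pairwise loop above:

```python
def jaccard(a, b):
    """Overlap between two donor sets as a fraction of their union (0..1)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Toy donor sets (fabricated names):
donors_x = {"SMITH, JOHN", "LEE, ANA", "PATEL, RAJ"}
donors_y = {"LEE, ANA", "PATEL, RAJ", "KIM, SOO", "CRUZ, MIA"}
print(jaccard(donors_x, donors_y))  # -> 0.4
```

Storing both the raw count and the Jaccard score in each overlaps entry gives you an absolute and a relative view of the same relationship.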

Scraping State-Level Data

FEC only covers federal elections. For state races, scrape state commission websites:

import requests
from bs4 import BeautifulSoup
import re

def scrape_state_filings(state_url, api_key):
    # Route the request through ScraperAPI with JS rendering enabled,
    # since many state disclosure sites are JavaScript-heavy
    params = {"api_key": api_key, "url": state_url, "render": "true"}
    resp = requests.get("https://api.scraperapi.com", params=params, timeout=60)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    filings = []
    # Look for a table whose class hints at filing/contribution data
    table = soup.find("table", class_=re.compile(r"filing|contribution|donor"))
    if table:
        headers = [th.get_text(strip=True) for th in table.find_all("th")]
        for row in table.find_all("tr")[1:]:  # skip the header row
            cols = [td.get_text(strip=True) for td in row.find_all("td")]
            if len(cols) == len(headers):
                filings.append(dict(zip(headers, cols)))
    return filings
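Scraped table cells arrive as strings, and state sites format amounts inconsistently: dollar signs, thousands separators, sometimes parentheses for refunds. A small cleaner (parentheses-as-negative is a common accounting convention, so treat that assumption as site-specific) turns them into numbers you can aggregate:

```python
import re

def parse_money(cell):
    """Turn a scraped cell like '$1,234.56' or '(500.00)' into a float.

    Parentheses are read as negatives (common accounting convention);
    returns None when the cell has no numeric content.
    """
    if not cell:
        return None
    negative = "(" in cell
    cleaned = re.sub(r"[^0-9.]", "", cell)
    if not cleaned:
        return None
    value = float(cleaned)
    return -value if negative else value

print(parse_money("$1,234.56"))  # -> 1234.56
print(parse_money("(500.00)"))   # -> -500.0
```

Apply it to the amount column of each dict scrape_state_filings returns before summing anything.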

ScraperAPI handles the varying bot protection across state election websites; ThorData can supply geo-targeted residential proxies when a site restricts access by location, and ScrapeOps helps you monitor request success rates.

Monitoring Large Donations

def monitor_large_donations(tracker, threshold=50000, cycle=2024):
    donations = tracker.search_donors(min_amount=threshold, cycle=cycle)
    return [{
        "donor": d.get("contributor_name"),
        "amount": d.get("contribution_receipt_amount") or 0,   # can be None
        "recipient": (d.get("committee") or {}).get("name"),   # nested object
        "date": d.get("contribution_receipt_date"),
    } for d in donations]

large = monitor_large_donations(FECTracker("YOUR_KEY"))
for d in large[:20]:
    print(f"${d['amount']:,.0f} from {d['donor']} to {d['recipient']}")
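As written, this is a one-shot query; run it on a schedule and the same contributions come back every poll. Schedule A rows include a sub_id record identifier, which makes deduplication easy. A sketch, assuming you persist the seen-ID set between runs (a file or small database in practice):

```python
def filter_new_donations(donations, seen_ids):
    """Return only donations not seen before; updates seen_ids in place.

    Keys on the row's `sub_id` record identifier; rows missing it are
    passed through untouched rather than silently dropped.
    """
    fresh = []
    for d in donations:
        sub_id = d.get("sub_id")
        if sub_id is None:
            fresh.append(d)
        elif sub_id not in seen_ids:
            seen_ids.add(sub_id)
            fresh.append(d)
    return fresh

# Demo with fabricated rows across two polls:
seen = set()
batch1 = [{"sub_id": 1, "amount": 60000}, {"sub_id": 2, "amount": 75000}]
batch2 = [{"sub_id": 2, "amount": 75000}, {"sub_id": 3, "amount": 90000}]
print(len(filter_new_donations(batch1, seen)))  # -> 2
print(len(filter_new_donations(batch2, seen)))  # -> 1
```

Wrap monitor_large_donations with this and you only alert on genuinely new filings.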

Ethical Guidelines

FEC data is public record — using it is legal and encouraged. Still:

  • Don't use donor data to harass or target individuals
  • Attribute the data to the FEC
  • Note that small donors (under $200 per cycle) aren't individually disclosed


Campaign finance data is democracy's audit trail. With Python and the FEC API, you can build tools that make this data accessible for researchers, journalists, and citizens.

Happy scraping!
