
Vhub Systems

Posted on • Originally published at apify.com

How to Scrape Government Tenders From SAM.gov and TED EU

Build a Public Procurement Scraper for SAM.gov and TED EU in 2026

Government tenders are a goldmine for developers and businesses, but manually tracking opportunities across platforms like SAM.gov (U.S. federal contracts), TED EU (European public procurement), and Find a Tender UK is a nightmare. In 2026, with global public spending projected to top $10 trillion annually, automating data extraction with a public procurement scraper for SAM.gov and TED EU isn't just a nice-to-have; it's a competitive edge. Whether you're building tools for clients, feeding data pipelines, or hunting contracts, scraping these portals saves hours of grunt work. This guide walks you through scraping government tenders efficiently, using real-world tools and code.

Why Government Procurement Data is Valuable in 2026

Public procurement data is a treasure trove for businesses and developers. Governments worldwide are expected to spend over $10 trillion annually by 2026 on goods, services, and infrastructure. Tapping into this via platforms like SAM.gov and TED EU offers massive ROI. For example, a small consultancy could land a $500,000 contract by spotting a niche tender early, while a data firm might charge $50,000 annually for curated procurement insights. Use cases are endless: lead generation for sales teams, market trend analysis for strategists, or compliance monitoring for legal firms. Developers can build APIs or dashboards that clients pay premiums for, turning raw data into actionable intelligence. With automation, you’re not just saving time—you’re unlocking revenue streams. In a world of tightening budgets and fierce competition, being first to bid or analyze trends can make or break a business. Scraping this data programmatically ensures you stay ahead of manual competitors stuck in spreadsheets.

What Data Can You Extract?

When scraping government procurement portals like SAM.gov and TED EU, you can pull structured, actionable fields. Here’s what you might get, with realistic example values:

  • Tender Title: "Construction of Public Library in Austin, TX"
  • Contracting Authority: "City of Austin"
  • Publication Date: "2026-01-15"
  • Deadline for Submission: "2026-02-15"
  • Contract Value: "$1,200,000 USD"
  • Location: "Austin, TX, USA" or "Brussels, Belgium"
  • Category/CPV Code: "Construction Services / 45000000"
  • Description: "Seeking contractors for a 20,000 sq ft library build."
  • Document Links: "https://sam.gov/opp/12345/docs"
  • Status: "Open" or "Closed"

These fields let you filter, analyze, or integrate data into your systems for real-time decision-making.

Step-by-Step Guide

Let’s build a public procurement scraper for SAM.gov and TED EU. We’ll use Apify, a platform that simplifies web scraping with pre-built actors and scalable infrastructure. Here’s how to get started in three steps, assuming basic familiarity with Node.js or APIs.

Step 1: Set Up Your Apify Account and Actor

First, sign up at Apify and access the Public Procurement Hub actor. This pre-built tool targets SAM.gov, TED EU, and other portals, handling authentication, pagination, and data structuring for you. Install the Apify CLI or use the web interface. Configure the actor with your target portal (e.g., SAM.gov) and set parameters like date range or category. No need to write complex selectors—Apify abstracts most of the DOM parsing. Allocate enough compute units (CUs) for larger datasets; a basic scrape might cost $0.50 in CUs.

Step 2: Define Input Parameters

Customize your scrape by defining input fields. For SAM.gov, you might filter by "small business set-asides" or location. For TED EU, specify CPV codes for specific industries like IT services. Here’s a sample JSON input for the actor:

```json
{
  "portals": ["sam.gov", "ted.europa.eu"],
  "maxItems": 100,
  "startDate": "2026-01-01",
  "endDate": "2026-12-31",
  "categories": ["IT Services", "Construction"],
  "locations": ["USA", "EU"],
  "proxy": {
    "useApifyProxy": true
  }
}
```

This tells the scraper to pull up to 100 tenders from 2026, focusing on IT and construction in the US and EU. Adjust maxItems based on your needs—SAM.gov alone lists thousands of opportunities monthly.
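Instead of the web UI, you can pass the same input programmatically with Apify's Python client (`pip install apify-client`). This is a sketch under assumptions: the actor ID `"vhub/public-procurement-hub"` is a placeholder (use the ID shown on the actor's Apify page), and your API token is read from an environment variable:

```python
import os

def build_run_input(max_items=100, start="2026-01-01", end="2026-12-31"):
    """Assemble the actor input; mirrors the JSON sample above."""
    return {
        "portals": ["sam.gov", "ted.europa.eu"],
        "maxItems": max_items,
        "startDate": start,
        "endDate": end,
        "categories": ["IT Services", "Construction"],
        "locations": ["USA", "EU"],
        "proxy": {"useApifyProxy": True},
    }

def run_scrape(token: str, actor_id: str = "vhub/public-procurement-hub"):
    """Start the actor, wait for it to finish, and fetch its dataset.
    actor_id is a placeholder -- copy the real ID from the actor's page."""
    from apify_client import ApifyClient  # pip install apify-client
    client = ApifyClient(token)
    run = client.actor(actor_id).call(run_input=build_run_input())
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

if __name__ == "__main__":
    token = os.environ.get("APIFY_TOKEN")
    if token:
        for item in run_scrape(token)[:5]:
            print(item.get("title"), "-", item.get("deadline"))
```

Keeping `build_run_input` as a separate function lets you vary date ranges or categories per run without touching the client code.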

Step 3: Run and Export Data

Hit "Run" on Apify’s dashboard. The actor scrapes data, handles rate limits, and stores results in a dataset. Export as JSON, CSV, or integrate via API into your app. Monitor logs for errors like CAPTCHA blocks (rare with Apify’s proxy rotation). A typical run for 100 items takes 5-10 minutes. Post-scrape, you’ll get structured data matching the fields listed earlier. Pipe this into a database like PostgreSQL or a BI tool like Tableau for analysis. If you’re a developer, automate runs via Apify’s scheduler to get daily tender updates without lifting a finger.

This approach minimizes coding overhead while delivering clean, usable data. Scale up by tweaking inputs or integrating with webhooks for real-time alerts on new tenders.
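For the real-time alert idea, the key piece is deduplication: each scheduled run should only notify on tenders you haven't seen before. A minimal sketch, assuming each item carries a stable notice ID (the key name `noticeId` is illustrative):

```python
def new_tenders(current_run, seen_ids):
    """Return tenders not seen in earlier runs and update the seen set.
    Assumes a stable 'noticeId' field -- the key name is illustrative."""
    fresh = [t for t in current_run if t.get("noticeId") not in seen_ids]
    seen_ids.update(t["noticeId"] for t in fresh)
    return fresh

# IDs from previous runs would normally live in a database or key-value store
seen = {"SAM-001"}
run = [
    {"noticeId": "SAM-001", "title": "Library build"},
    {"noticeId": "TED-042", "title": "IT helpdesk"},
]
alerts = new_tenders(run, seen)
print([t["title"] for t in alerts])  # only the unseen tender
```

Persist the seen-ID set between scheduled runs (a database table or Apify's key-value store both work), and wire the `alerts` list to your email or webhook sender.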

5 Business Use Cases

Government procurement data scraped from SAM.gov and TED EU can power multiple revenue-generating or efficiency-boosting applications. Here are five concrete examples:

  1. Lead Generation for SMEs: A marketing agency scrapes SAM.gov for federal contracts under $100,000, targeting small business set-asides. They sell leads to local contractors for $500 per qualified opportunity, netting $10,000 monthly.
  2. Market Intelligence Dashboards: A data firm builds a subscription-based dashboard with TED EU data, showing procurement trends in renewable energy. Clients pay $2,000/year for access, with 50 subscribers yielding $100,000 annually.
  3. Compliance Monitoring: A legal consultancy scrapes tenders to track if government entities follow EU procurement laws, charging clients $5,000 per audit report.
  4. Bid Automation Tools: A developer creates a SaaS tool that alerts companies to relevant SAM.gov tenders via email, charging $99/month per user. With 200 users, that’s $19,800 monthly.
  5. Supply Chain Optimization: A logistics company uses scraped data to predict government demand for transport services in the EU, adjusting fleet allocation to save $50,000 in operational costs yearly.

These use cases show how scraped data translates to direct business value, whether through sales, savings, or strategic insights.

Conclusion

Scraping government tenders from SAM.gov and TED EU in 2026 is a game-changer for developers and businesses. With trillions in public spending at stake, a public procurement scraper unlocks valuable data for leads, analysis, and automation. Using tools like Apify’s Public Procurement Hub, you can extract structured data effortlessly and turn it into actionable outcomes. Ready to build or integrate your own solution? Start now and stay ahead of the competition. Try Public Procurement Hub free on Apify →


Take the next step

Skip the setup. Production-ready tools for government tender scraping:

Apify Scrapers Bundle — $29 one-time

Instant download. Documented. Ready to deploy.
