Alex Spinov
PubMed Has a Free API — Search 35M+ Medical Papers Without Scraping (No Key)

If you need biomedical research data, stop scraping Google Scholar. PubMed's E-utilities API gives you direct access to 35 million medical papers — completely free, no API key required.

I discovered this API while building research paper scrapers, and it blew my mind how much data is available for free.

Why PubMed API?

  • 35M+ papers — the largest biomedical literature database
  • No API key required — NCBI asks you to identify your tool and email as a courtesy
  • Generous rate limits — 3 requests/second without a key, 10/second with a free key
  • Structured XML/JSON — clean, parseable output
  • Full abstracts — complete text, not just titles

Search papers in a few lines of Python

import requests
import xml.etree.ElementTree as ET

# Step 1: Search for paper IDs (passing params lets requests URL-encode the query)
query = "COVID-19 vaccine efficacy 2024"
search = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={"db": "pubmed", "term": query, "retmax": 5, "retmode": "json"},
).json()
ids = search["esearchresult"]["idlist"]
print(f"Found {search['esearchresult']['count']} papers, showing first {len(ids)}")

# Step 2: Fetch full records for those IDs as XML
ids_str = ",".join(ids)
response = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi",
    params={"db": "pubmed", "id": ids_str, "rettype": "abstract", "retmode": "xml"},
)
root = ET.fromstring(response.text)

for article in root.findall(".//PubmedArticle"):
    title = article.findtext(".//ArticleTitle") or "(no title)"  # guard against missing fields
    year = article.findtext(".//PubDate/Year") or "N/A"
    journal = article.findtext(".//Journal/Title")
    print(f"[{year}] {title[:70]}")
    print(f"  Journal: {journal}")
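For result sets bigger than a few hundred papers, you don't have to ship ID lists back and forth: pass usehistory=y to ESearch and the server stores the full result set, returning a WebEnv token and query_key that EFetch accepts alongside retstart/retmax for paging. A minimal sketch under those assumptions — the fetch_page_params helper and the CRISPR query are mine, not part of the API:

```python
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def fetch_page_params(webenv, query_key, retstart, retmax=200):
    """Build EFetch parameters for one page of a stored (history-server) search."""
    return {"db": "pubmed", "WebEnv": webenv, "query_key": query_key,
            "retstart": retstart, "retmax": retmax,
            "rettype": "abstract", "retmode": "xml"}

if __name__ == "__main__":
    # Step 1: usehistory=y stores the full ID list server-side
    result = requests.get(
        f"{EUTILS}/esearch.fcgi",
        params={"db": "pubmed", "term": "CRISPR gene editing",
                "usehistory": "y", "retmode": "json"},
    ).json()["esearchresult"]

    # Step 2: page through the stored set 200 records at a time
    total = min(int(result["count"]), 600)  # cap for the demo
    for start in range(0, total, 200):
        xml = requests.get(
            f"{EUTILS}/efetch.fcgi",
            params=fetch_page_params(result["webenv"], result["querykey"], start),
        ).text
        print(f"Fetched records {start}-{start + 199}")
```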

Get detailed paper metadata

# Use ESummary for structured metadata (reuses ids_str from the search above)
summary = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi",
    params={"db": "pubmed", "id": ids_str, "retmode": "json"},
).json()

for pmid, paper in summary.get("result", {}).items():
    if pmid == "uids":  # the "uids" key lists IDs, not a paper record
        continue
    print(f"PMID: {pmid}")
    print(f"Title: {paper.get('title', '')[:80]}")
    print(f"Authors: {', '.join(a['name'] for a in paper.get('authors', [])[:3])}")
    print(f"Journal: {paper.get('fulljournalname', '')}")
    print(f"Date: {paper.get('pubdate', '')}")
    print(f"DOI: {paper.get('elocationid', '')}")  # usually the DOI, sometimes a PII
    print()

Advanced: Citation analysis

# Find papers that cite a specific paper
pmid = "33264556"  # Example: BNT162b2 vaccine paper
cited = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi",
    params={"dbfrom": "pubmed", "db": "pubmed", "id": pmid,
            "linkname": "pubmed_pubmed_citedin", "retmode": "json"},
).json()

link_sets = cited.get("linksets", [{}])
if link_sets and link_sets[0].get("linksetdbs"):
    # In JSON mode, "links" is a flat list of PMID strings
    citing_ids = link_sets[0]["linksetdbs"][0].get("links", [])
    print(f"Papers citing PMID {pmid}: {len(citing_ids)}")

Real use cases

  1. Drug research — Track all papers about a specific compound
  2. Clinical trial monitoring — Find latest trial results by condition
  3. Literature reviews — Automate systematic review paper collection
  4. Trend analysis — Track publication volume over time for any medical topic
  5. Competitive pharma intelligence — Monitor competitor research output
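Use case 4 (trend analysis) needs nothing beyond ESearch's count field: filter by year with the [pdat] publication-date field tag and set retmax=0 so no IDs come back. A sketch — the papers_per_year helper and the example topic are mine, not an API endpoint:

```python
import requests

def year_term(topic, year):
    """Combine a topic with PubMed's [pdat] publication-date field tag."""
    return f"{topic} AND {year}[pdat]"

def papers_per_year(topic, years):
    """Return {year: publication count} using ESearch with retmax=0 (counts only)."""
    counts = {}
    for year in years:
        result = requests.get(
            "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
            params={"db": "pubmed", "term": year_term(topic, year),
                    "retmax": 0, "retmode": "json"},
        ).json()
        counts[year] = int(result["esearchresult"]["count"])
    return counts

if __name__ == "__main__":
    print(papers_per_year("mRNA vaccine", range(2019, 2024)))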

E-utilities endpoints cheat sheet

Endpoint   Use                Example
esearch    Search for IDs     Find papers matching a query
efetch     Get full records   Retrieve titles, abstracts, metadata
esummary   Get summaries      Quick metadata for multiple papers
elink      Find related       Citations, similar papers, cross-DB links
einfo      Database info      Available fields, counts
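All five endpoints share the same base URL and query-string conventions, so a tiny helper covers the whole table. The eutils_url wrapper, tool name, and email below are my placeholders, not an official client:

```python
from urllib.parse import urlencode

EUTILS_BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def eutils_url(endpoint, **params):
    """Build a request URL for any E-utilities endpoint in the table above."""
    # Identifying your tool and email is the courtesy NCBI asks for
    params.setdefault("tool", "my-script")        # hypothetical tool name
    params.setdefault("email", "me@example.com")  # placeholder address
    return f"{EUTILS_BASE}/{endpoint}.fcgi?{urlencode(params)}"

print(eutils_url("esearch", db="pubmed", term="sepsis", retmode="json"))
print(eutils_url("einfo", db="pubmed", retmode="json"))
```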

Rate limits

Access          Limit        How
No key          3 req/sec    Just make requests
Free API key    10 req/sec   Register at NCBI
Bulk download   Unlimited    FTP access available
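To stay under the keyless 3 req/sec ceiling in a scraping loop, a simple client-side throttle is enough. This Throttle class is a minimal sketch of mine, not an NCBI utility:

```python
import time

class Throttle:
    """Space out calls to stay under E-utilities' 3 requests/second keyless limit."""

    def __init__(self, per_second=3):
        self.min_interval = 1.0 / per_second
        self.last_call = 0.0

    def wait(self):
        """Sleep just long enough to keep calls at least min_interval apart."""
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

throttle = Throttle(per_second=3)
for _ in range(3):
    throttle.wait()  # call this before every requests.get(...)
```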

Compare with other research APIs

API                Papers   Specialty
PubMed             35M+     Biomedical
arXiv              2M+      Physics, CS, Math
Semantic Scholar   200M+    All fields, AI-ranked
OpenAlex           250M+    Bibliometrics

Full list: Awesome Research APIs


I write about free APIs developers should know. Follow for more — I've documented 100+ free APIs so far.

Need data extraction? Check my Apify scrapers and GitHub repos.
