
Alex Spinov
GitHub Has a Free API — Search Repos, List Issues, and Automate Your Workflow Without Paying

GitHub's REST API is free for everyone — authenticated or not. You can search repositories, read issues, list commits, explore users, and automate your entire GitHub workflow. The free tier gives you 5,000 requests per hour with a token.

Here's how to use it.

Authentication (Optional but Recommended)

Without a token: 60 requests/hour. With a free personal access token: 5,000 requests/hour.

Create a token at github.com/settings/tokens → "Generate new token (classic)" → select the scopes you need. For read-only access to public data, no scopes are required at all.
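To confirm your token works (and see how much quota you have left), you can hit the `/rate_limit` endpoint — it doesn't count against your quota. A minimal sketch with `requests`; the helper names are mine, not from any SDK:

```python
import requests

def auth_headers(token=None):
    """Build request headers; anonymous calls simply omit the header."""
    return {"Authorization": f"Bearer {token}"} if token else {}

def check_rate_limit(token=None):
    """Return (limit, remaining) for the core REST API quota."""
    resp = requests.get("https://api.github.com/rate_limit",
                        headers=auth_headers(token))
    resp.raise_for_status()
    core = resp.json()["resources"]["core"]
    return core["limit"], core["remaining"]
```

Call it with no argument and you should see a limit of 60; with a valid token, 5,000.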

1. Search Repositories

Find the most popular repos for any topic.

# Top Python web scraping repos
curl -s "https://api.github.com/search/repositories?q=web+scraping+language:python&sort=stars&per_page=5" \
  -H "Authorization: Bearer ghp_YOUR_TOKEN"

Response includes: name, description, stars, forks, language, last updated, and more.
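If you'd rather do this from Python than curl, here's a hedged sketch of the same search call (the function names are mine):

```python
import requests

def summarize(items):
    """Reduce raw search items to (full_name, stars) pairs."""
    return [(r["full_name"], r["stargazers_count"]) for r in items]

def top_repos(query, n=5, token=None):
    """Search repositories, most-starred first."""
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    resp = requests.get(
        "https://api.github.com/search/repositories",
        params={"q": query, "sort": "stars", "per_page": n},
        headers=headers,
    )
    resp.raise_for_status()
    return summarize(resp.json()["items"])
```

For example, `top_repos("web scraping language:python")` mirrors the curl command above.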

2. List Issues

# Open issues for any public repo
curl -s "https://api.github.com/repos/facebook/react/issues?state=open&per_page=5"
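One gotcha: the issues endpoint also returns pull requests (GitHub models PRs as issues, and they carry a `pull_request` key). A small Python sketch that filters them out — the helper names are mine:

```python
import requests

def drop_pull_requests(items):
    """The /issues endpoint returns PRs too; real issues lack 'pull_request'."""
    return [i for i in items if "pull_request" not in i]

def open_issues(repo, n=5, token=None):
    """List open issues for a repo like 'facebook/react', excluding PRs."""
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    resp = requests.get(f"https://api.github.com/repos/{repo}/issues",
                        params={"state": "open", "per_page": n},
                        headers=headers)
    resp.raise_for_status()
    return drop_pull_requests(resp.json())
```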

3. Get User Profile and Repos

# User info
curl -s "https://api.github.com/users/torvalds"

# Their repos sorted by stars
curl -s "https://api.github.com/users/torvalds/repos?sort=stars&per_page=5"

4. Get Repository Stats

# Repo details (stars, forks, watchers, language)
curl -s "https://api.github.com/repos/microsoft/vscode"

# Contributors
curl -s "https://api.github.com/repos/microsoft/vscode/contributors?per_page=5"

# Commit activity (last year, weekly)
curl -s "https://api.github.com/repos/microsoft/vscode/stats/commit_activity"
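Heads up: the `stats/*` endpoints return HTTP 202 (with an empty body) the first time you ask, while GitHub computes the statistics in the background — you're expected to retry. A minimal retry sketch; the function names and retry delay are my own choices:

```python
import time
import requests

def total_commits(weeks):
    """Sum the 'total' field across the weekly buckets the endpoint returns."""
    return sum(w["total"] for w in weeks)

def commit_activity(repo, token=None, retries=3):
    """Fetch weekly commit counts, retrying while GitHub returns 202."""
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    url = f"https://api.github.com/repos/{repo}/stats/commit_activity"
    for _ in range(retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code == 202:  # stats still being generated
            time.sleep(2)
            continue
        resp.raise_for_status()
        return resp.json()
    return []
```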

5. Python — Trending Repos Tracker

import requests
from datetime import datetime, timedelta

TOKEN = ""  # Optional but recommended — paste your ghp_... token here
HEADERS = {"Authorization": f"Bearer {TOKEN}"} if TOKEN else {}

def search_trending(language="python", days=7):
    date = (datetime.now() - timedelta(days=days)).strftime("%Y-%m-%d")

    url = "https://api.github.com/search/repositories"
    params = {
        "q": f"language:{language} created:>{date}",
        "sort": "stars",
        "per_page": 10,
    }
    response = requests.get(url, headers=HEADERS, params=params)
    response.raise_for_status()
    repos = response.json().get("items", [])

    for repo in repos:
        print(f"{repo['stargazers_count']:>5} | {repo['full_name']}")
        print(f"       {(repo['description'] or 'No description')[:80]}")
        print()

search_trending("python", 7)

6. Node.js — Issue Monitor

const TOKEN = "";  // Optional — paste your ghp_... token here

async function getNewIssues(repo, since = "2026-03-01") {
  const res = await fetch(
    `https://api.github.com/repos/${repo}/issues?since=${since}&state=open&per_page=10`,
    { headers: TOKEN ? { Authorization: `Bearer ${TOKEN}` } : {} }
  );
  const issues = await res.json();

  for (const issue of issues) {
    if (!issue.pull_request) {
      console.log(`#${issue.number}: ${issue.title}`);
      console.log(`  Labels: ${issue.labels.map(l => l.name).join(", ")}`);
      console.log(`  Created: ${issue.created_at}\n`);
    }
  }
}

getNewIssues("vercel/next.js");

Rate Limits

Auth method              Limit
No token                 60 requests/hour
Personal access token    5,000 requests/hour
GitHub App               15,000 requests/hour
GraphQL API              5,000 points/hour
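You don't have to call `/rate_limit` to track your usage — every REST response carries `X-RateLimit-Limit`, `X-RateLimit-Remaining`, and `X-RateLimit-Reset` headers. A small sketch for reading them (the function name is mine):

```python
import requests

def remaining_quota(resp):
    """Read the rate-limit headers GitHub attaches to every response."""
    return (int(resp.headers["X-RateLimit-Remaining"]),
            int(resp.headers["X-RateLimit-Limit"]))

# Example: check quota after any call
# resp = requests.get("https://api.github.com/users/torvalds")
# remaining, limit = remaining_quota(resp)
```

Checking `X-RateLimit-Remaining` before firing off a batch of requests is an easy way to avoid 403s mid-run.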

What You Can Build

  • Repo analytics dashboard — track stars, forks, and contributors over time
  • Dependency scanner — check if repos use vulnerable packages
  • Hiring tool — find developers by language, contributions, and location
  • Release monitor — get notified when a project publishes a new release
  • Portfolio generator — auto-build a dev portfolio from GitHub profile
  • Open source leaderboard — rank contributors by commits, PRs, reviews
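As a starting point for the release-monitor idea, the `/releases/latest` endpoint gives you the newest published release in one call. A hedged sketch — the function names and the "compare to last seen tag" approach are my own:

```python
import requests

def newer_than(tag, last_seen):
    """True when the fetched tag differs from the one we last notified about."""
    return tag is not None and tag != last_seen

def latest_release(repo, token=None):
    """Return (tag_name, published_at) for a repo's latest release, or None."""
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    resp = requests.get(f"https://api.github.com/repos/{repo}/releases/latest",
                        headers=headers)
    if resp.status_code == 404:  # repo has no releases
        return None
    resp.raise_for_status()
    data = resp.json()
    return data["tag_name"], data["published_at"]
```

Run it on a schedule (cron, GitHub Actions), store the last tag you saw, and fire a notification when `newer_than` returns True.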

Need Web Data? Try These Tools

If you're building apps that need web scraping or data extraction, check out my ready-made tools on Apify Store — scrapers for Reddit, YouTube, Google News, Trustpilot, and 80+ more. No coding needed, just run and get your data.

Need a custom scraping solution? Email me at spinov001@gmail.com

