Every major tech trend — from the AI boom to the layoff wave — shows up on Hacker News days before mainstream media picks it up. But scrolling HN manually is inefficient.
Hacker News has a completely free, no-auth Firebase API. Here's how to use it to spot trends programmatically.
The API
Base URL: https://hacker-news.firebaseio.com/v0/
Endpoints:
- /topstories.json — top 500 story IDs
- /newstories.json — newest 500 story IDs
- /beststories.json — best 500 story IDs
- /item/{id}.json — any item (story, comment, job)
- /user/{username}.json — user profile
No API key. No rate limit. No auth. Just fetch.
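Since the whole API is just those GET endpoints, a two-line wrapper covers all of it. A minimal sketch (the helper names are mine, not part of the API):

```python
import requests

BASE = 'https://hacker-news.firebaseio.com/v0'

def build_url(path):
    # Every endpoint is just BASE plus a path ending in .json
    return f'{BASE}/{path}'

def fetch(path):
    # GET an endpoint and decode the JSON body
    return requests.get(build_url(path), timeout=10).json()

# Usage (hits the live API):
# top_ids = fetch('topstories.json')        # list of up to 500 ints
# story = fetch(f'item/{top_ids[0]}.json')  # dict: title, score, by, kids, ...
```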
1. Track What's Hot Right Now
```python
import requests

def get_top_stories(limit=10):
    # The endpoint returns only IDs; each item needs its own request
    ids = requests.get('https://hacker-news.firebaseio.com/v0/topstories.json').json()
    stories = []
    for story_id in ids[:limit]:
        item = requests.get(f'https://hacker-news.firebaseio.com/v0/item/{story_id}.json').json()
        stories.append({
            'title': item.get('title'),
            'score': item.get('score', 0),
            'comments': item.get('descendants', 0),  # 'descendants' = total comment count
            'url': item.get('url', ''),
            'by': item.get('by')
        })
    return stories

for s in get_top_stories(5):
    print(f" [{s['score']}pts] {s['title'][:60]}")
```
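One caveat: the Firebase API has no batch endpoint, so every item is a separate request, and fetching sequentially gets slow beyond a handful of stories. A sketch of parallel fetching with a thread pool (the helper names are mine):

```python
import requests
from concurrent.futures import ThreadPoolExecutor

ITEM_URL = 'https://hacker-news.firebaseio.com/v0/item/{}.json'

def fetch_item(story_id):
    # One GET per item; the API has no batch endpoint
    return requests.get(ITEM_URL.format(story_id), timeout=10).json()

def fetch_items(ids, workers=8):
    # Fetch many items concurrently; results come back in input order
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_item, ids))

# Usage (live API):
# top = requests.get('https://hacker-news.firebaseio.com/v0/topstories.json').json()
# items = fetch_items(top[:30])
```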
2. Find Trending Topics by Keyword
Want to know when HN talks about a specific technology?
```python
def search_hn(keyword, limit=20):
    # Use the Algolia HN Search API (also free, no auth)
    resp = requests.get(
        'https://hn.algolia.com/api/v1/search',
        params={'query': keyword, 'hitsPerPage': limit, 'tags': 'story'}
    ).json()
    for hit in resp['hits']:
        print(f" [{hit.get('points',0):>4}pts | {hit.get('num_comments',0):>3}c] {hit['title'][:55]}")
    return resp['hits']

# Example: track AI agent mentions
search_hn('AI agent')
```
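Algolia also supports time windows, which is what turns keyword search into trend tracking: run the same query over successive windows and compare counts. A sketch using the `search_by_date` endpoint with a `numericFilters` clause on the `created_at_i` timestamp (the helper names are mine):

```python
import time
import requests

def cutoff_ts(days, now=None):
    # Unix timestamp for "N days ago"
    now = int(time.time()) if now is None else int(now)
    return now - days * 86400

def mentions_last_days(keyword, days=7):
    # Count matching stories newer than the cutoff; hitsPerPage=0
    # because only the nbHits total is needed
    resp = requests.get(
        'https://hn.algolia.com/api/v1/search_by_date',
        params={
            'query': keyword,
            'tags': 'story',
            'numericFilters': f'created_at_i>{cutoff_ts(days)}',
            'hitsPerPage': 0,
        },
        timeout=10,
    ).json()
    return resp.get('nbHits', 0)

# Usage (live API):
# print(mentions_last_days('rust'), 'stories in the past week')
```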
3. Analyze Comment Sentiment
```python
def get_story_comments(story_id, depth=0, max_depth=2):
    # Recursively walk the comment tree via each item's 'kids' list,
    # keeping at most 5 children per level down to max_depth
    item = requests.get(f'https://hacker-news.firebaseio.com/v0/item/{story_id}.json').json()
    comments = []
    if item.get('text'):
        comments.append({'by': item.get('by'), 'text': item['text'][:200], 'depth': depth})
    if depth < max_depth:
        for kid_id in item.get('kids', [])[:5]:
            comments.extend(get_story_comments(kid_id, depth + 1, max_depth))
    return comments
```
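The function above only collects comment text; scoring it is a separate step. As a placeholder, here is a deliberately naive word-list scorer. The lexicons are made up for illustration, and a real analysis would use a proper tool such as NLTK's VADER:

```python
# Hypothetical tiny lexicons, for illustration only
POSITIVE = {'great', 'love', 'excellent', 'impressive', 'useful', 'fast'}
NEGATIVE = {'bad', 'broken', 'slow', 'hate', 'terrible', 'useless'}

def naive_sentiment(text):
    # Score in [-1, 1]: (positive hits - negative hits) / total hits
    words = [w.strip('.,!?') for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total
```

Feeding each comment's `text` field through this gives a rough per-story mood; averaging over a story's comments hints at how the thread received it.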
4. Build a Daily Tech Digest
```python
from collections import Counter
import re

def daily_digest():
    ids = requests.get('https://hacker-news.firebaseio.com/v0/topstories.json').json()[:30]
    titles = []
    domains = Counter()
    for sid in ids:
        item = requests.get(f'https://hacker-news.firebaseio.com/v0/item/{sid}.json').json()
        titles.append(item.get('title', ''))
        url = item.get('url', '')
        if url:
            domain = re.findall(r'https?://(?:www\.)?(.*?)/', url + '/')
            if domain:
                domains[domain[0]] += 1
    # Find trending words
    words = Counter()
    stop = {'the','a','an','is','to','for','and','in','of','on','with','how','why','from','your','you','are'}
    for t in titles:
        for w in t.lower().split():
            if len(w) > 3 and w not in stop:
                words[w] += 1
    print('Top trending words today:')
    for word, count in words.most_common(10):
        print(f'  {word}: {count} mentions')
    print('\nTop domains:')
    for domain, count in domains.most_common(5):
        print(f'  {domain}: {count} stories')

daily_digest()
```
5. Monitor Job Postings
```python
def get_hiring_posts(limit=5):
    resp = requests.get(
        'https://hn.algolia.com/api/v1/search',
        params={'query': 'hiring', 'hitsPerPage': limit, 'tags': 'story'}
    ).json()
    for hit in resp['hits']:
        print(f" {hit['title'][:70]}")
        print(f"   {hit.get('num_comments',0)} comments | {hit.get('created_at','')[:10]}")

get_hiring_posts()
```
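A plain keyword search for "hiring" also catches unrelated stories. The monthly "Ask HN: Who is hiring?" threads are posted by the whoishiring account, and Algolia's author_<username> tag can target that account directly; the individual job ads are then the thread's comments. A sketch, assuming that account and tag behave as described:

```python
import requests

ALGOLIA = 'https://hn.algolia.com/api/v1/search_by_date'

def comment_tags(story_id):
    # Tag filter matching comments that belong to one story
    return f'comment,story_{story_id}'

def latest_hiring_thread():
    # Most recent story posted by the 'whoishiring' account
    resp = requests.get(ALGOLIA, params={
        'query': 'who is hiring',
        'tags': 'story,author_whoishiring',
        'hitsPerPage': 1,
    }, timeout=10).json()
    return resp['hits'][0] if resp['hits'] else None

def job_postings(story_id, limit=50):
    # Each top-level comment in the thread is one job ad
    resp = requests.get(ALGOLIA, params={
        'tags': comment_tags(story_id),
        'hitsPerPage': limit,
    }, timeout=10).json()
    return resp['hits']
```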
Rate Limits
- Firebase API: no official limit, but be reasonable (~1 request/second)
- Algolia HN API: 10,000 requests/hour — very generous
- Both return plain JSON — no scraping or HTML parsing needed
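"Be reasonable" is easy to encode. A minimal sketch of a client that spaces requests out (the class name is mine):

```python
import time
import requests

class Throttled:
    """Space requests at least `interval` seconds apart."""

    def __init__(self, interval=1.0):
        self.interval = interval
        self._last = 0.0

    def _pause(self):
        # Sleep only if the previous request was too recent
        wait = self.interval - (time.monotonic() - self._last)
        if wait > 0:
            time.sleep(wait)
        self._last = time.monotonic()

    def get(self, url, **kwargs):
        self._pause()
        return requests.get(url, timeout=10, **kwargs)

# Usage (live API):
# client = Throttled(interval=1.0)
# ids = client.get('https://hacker-news.firebaseio.com/v0/topstories.json').json()
```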
Use Cases
- Investors: Spot emerging tech before it peaks
- Content creators: Find trending topics for articles
- Job seekers: Monitor who's hiring in real time
- Product managers: Track competitor mentions
- Researchers: Analyze tech community sentiment
I explore free APIs and build tools with them. More projects: GitHub | Writing inquiries: Spinov001@gmail.com