TL;DR: I built a free Polymarket prediction-market screener with 2,001 individual market pages — top movers, highest volume, crash signals, by category. Live at luciferforge.github.io/polyscope. No login, no account, no rate-limited API calls. Built from the same 10.8M-row dataset I use for my own bot.
The problem with Polymarket's UI
Polymarket has thousands of active markets. Their official UI shows a curated front page (politics + sports trending), but if you want to see:
- The 50 highest-volume markets right now
- The biggest probability moves in the last 24h
- Markets in a specific category (geopolitics, crypto, weather, etc.)
- Markets that just crashed below their 7-day high
…you have to click into each market individually, or use their JSON API and write your own dashboard.
I wrote my own dashboard 6 months ago for personal use. Today I'm making it public.
What's at luciferforge.github.io/polyscope
A static-site screener with 2,001 individual market pages + 10 category indexes:
- By category: politics, sports, crypto, geopolitics, economics, entertainment, weather, science_tech, pop_culture, other
- Per market: question, current Yes/No price, 7-day price chart, total dollar volume, current liquidity, end date, resolution status
- Sortable indexes: by volume, by liquidity, by 24h price change
Static HTML, served from GitHub Pages, regenerated weekly from the underlying dataset. Loads instantly, no JS framework, no tracking pixels (analytics via privacy-friendly SimpleAnalytics).
Why static pages instead of a real-time app?
Three reasons:
- SEO. Each market gets its own URL like /markets/will-the-fed-cut-rates-in-march.html. Google indexes them. People searching for specific markets land directly on the screener page, not on Polymarket's app.
- Cost. Zero servers. GitHub Pages' CDN serves the lot. I can scale to 50K markets without paying for compute.
- Honesty. A "real-time" prediction market screener implies sub-second precision that the underlying data doesn't have anyway. Most markets don't have a fill in any given minute. Snapshots every 15 minutes are honest about that.
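The per-market URLs are just slugified questions. A minimal sketch of how such a slug can be derived (the exact rules PolyScope uses are an assumption; this is the common lowercase-and-hyphenate pattern):

```python
import re

def market_slug(question: str) -> str:
    """Turn a market question into a stable URL slug, e.g.
    'Will the Fed cut rates in March?' -> 'will-the-fed-cut-rates-in-march'.
    Illustrative only; the real generator may normalize differently."""
    slug = question.lower()
    # Collapse any run of non-alphanumeric characters into a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")
```

Stable slugs matter here: if the slug changed between weekly regenerations, indexed URLs and citations would break.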
The 60K files behind it
The site is generated from a 10.8M-snapshot Polymarket price database I've been collecting since March 2026. Every 15 minutes the collector pulls Gamma + CLOB APIs, deduplicates, and inserts into SQLite. The screener generation script reads the DB, picks the 2,000 most-active markets, and renders the static pages.
If you want to build something like this yourself, the dataset is for sale on Gumroad: 10.8M+ price snapshots, 13,963 markets, 43+ days, $9. Same data the screener runs on.
Use cases
- Day-trading prediction markets: open the volume-sorted index every morning, see what's moved overnight
- Research: paste a market URL into a paper or report — the screener pages are stable, citable
- Building agent flows: each market page has structured data (volume, liquidity, end date) that an LLM can parse — the agent doesn't need to talk to Polymarket's API
- Backtesting setup: pick a market, click through to its category page, see related markets that resolved similarly
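For the agent-flow case, a sketch of pulling the structured data out of a market page. The embedded-JSON markup (`<script id="market-data">`) is an assumption about how the pages might expose their data; the actual markup may differ:

```python
import json
import re

def parse_market_page(html: str) -> dict:
    """Extract the machine-readable blob from a market page.
    Assumes (hypothetically) a <script id="market-data"> JSON block."""
    m = re.search(r'<script id="market-data"[^>]*>(.*?)</script>',
                  html, re.DOTALL)
    if not m:
        raise ValueError("no market-data block found")
    return json.loads(m.group(1))

# Toy page matching the assumed markup, not actual PolyScope output.
page = ('<script id="market-data" type="application/json">'
        '{"volume": 125000, "liquidity": 8400, "end_date": "2026-03-31"}'
        '</script>')
data = parse_market_page(page)
```

The point of the design: an agent fetches a static CDN page and regexes or parses out numbers, with no API keys and no rate limits to negotiate.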
What's not in this version (yet)
- No real-time streaming (15-min snapshots only)
- No order placement (read-only, by design)
- No user accounts / saved watchlists (it's static)
If you want any of those, two adjacent products:
- Free CLI to audit your own Polymarket trading P&L: pnl-truthteller on PyPI. 253 downloads last month. Open source.
- Buyable bot template (the one I trade with): Polymarket Crash Recovery Bot — 280 trades, 80% win rate. $39 on Gumroad.
Source / contributing
Everything's open source: github.com/LuciferForge/polyscope. Discussions just got enabled — if you've used Polymarket data and want better screener views, file a request. Pull requests welcome.
The screener regenerates from a daily DB snapshot. Found a market that should be on the trending page but isn't? Open an issue with the market_id.
Built by @manja316. I run a Polymarket trading desk + a prediction market dataset business. Free tools for the same audience that buys the data.