After months of side-project work, I just released v1.0.0 of Anime Tracker — a self-hosted desktop app to manage your anime list. Here's the story of how I built it and the technical decisions behind it.
Why I built this
I wasn't happy with existing anime trackers:
- MyAnimeList: ugly UI, cloud-only, full of ads
- AniList: better UX but still cloud-dependent and limited customization
- Spreadsheets: zero features (no notifications, no recommendations, no search)
- Existing self-hosted alternatives: either abandoned or too complex
So I decided to build my own. Local-first, no ads, clean UI, with features I actually wanted.
Tech stack decisions
Why FastAPI
I considered Flask, Django, and FastAPI. Picked FastAPI because:
- Async support out of the box (critical for calling 8 external APIs concurrently)
- Automatic OpenAPI docs at `/docs`, useful for debugging
- Pydantic models for request/response validation
- Performance comparable to Node.js
- Type hints make the code self-documenting
Why SQLite (not PostgreSQL)
For a single-user desktop app, SQLite is perfect:
- Zero config: no database server to install
- One file: the user's entire anime list lives in `anime_tracker.db`. Easy to back up, easy to move between PCs.
- Fast enough: even with thousands of anime entries, queries are sub-millisecond
- Embedded migrations: I wrote a tiny auto-migration system that runs at startup
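A tiny startup migration system like the one described can be built on SQLite's `PRAGMA user_version`. This is a sketch of the idea, not the app's actual code; the table schema is illustrative:

```python
import sqlite3

# Each entry bumps PRAGMA user_version by one; only unapplied migrations run.
MIGRATIONS = [
    "CREATE TABLE anime (id INTEGER PRIMARY KEY, title TEXT NOT NULL)",
    "ALTER TABLE anime ADD COLUMN score INTEGER",
]

def migrate(conn: sqlite3.Connection) -> None:
    """Apply any pending migrations at startup; safe to call every launch."""
    current = conn.execute("PRAGMA user_version").fetchone()[0]
    for version, sql in enumerate(MIGRATIONS[current:], start=current + 1):
        conn.execute(sql)
        # PRAGMA does not accept bound parameters; version is a trusted int
        conn.execute(f"PRAGMA user_version = {version}")
    conn.commit()
```

Calling `migrate()` on every launch is idempotent: a fresh database runs all migrations, an up-to-date one runs none.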
Why vanilla JS (not React)
The frontend is ~3000 lines of HTML/CSS/JS without any framework. Why:
- No build step: easier for contributors, easier to debug
- Smaller: total app size is ~500KB
- Faster initial load: no React runtime to download
- Trade-off: managing state by hand is more code, but I kept things simple
Interesting technical bits
1. Pluggable scrapers (8 sources)
I built a base class that each scraper implements:
```python
class BaseScraper:
    def buscar(self, query: str) -> AnimeData | None:
        raise NotImplementedError
```
This way I can search across:
- AniList (GraphQL API)
- Jikan v4 (REST API for MyAnimeList)
- Kitsu (JSON:API)
- Crunchyroll (v2 API)
- 4 HTML scrapers for specific sites
If one source fails, the next takes over. The user selects priority order.
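The fallback logic can be sketched in a few lines. The orchestration function below is illustrative, not the app's actual code; `buscar` is the method from the base class above:

```python
def search_with_fallback(scrapers, query):
    """Try each scraper in the user's priority order; first hit wins."""
    for scraper in scrapers:
        try:
            result = scraper.buscar(query)
        except Exception:
            continue  # network error, parse error: move on to the next source
        if result is not None:
            return result
    return None  # every source failed or returned nothing
```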
2. Image fallback chain
A common problem: some scrapers return entries without cover images. Empty anime cards look terrible.
So I built a 3-step fallback:
```python
if not anime_data.imagen:
    fallback = _find_anime_image(anime_data.nombre)
    if fallback:
        anime_data.imagen = fallback
    # Otherwise the frontend shows a gradient + 🎌 placeholder
```
The `_find_anime_image` function queries AniList's GraphQL API by name. It works about 95% of the time; the remaining 5% gets a clean gradient placeholder.
3. Async pattern with run_in_executor
`requests` is synchronous; FastAPI is async. If I called `requests.get()` directly in an async route, I'd block the event loop.
The solution:
```python
loop = asyncio.get_running_loop()
result = await loop.run_in_executor(executor, scraper.buscar, query)
```
This runs the blocking call in a thread pool while the event loop stays free. Concurrent requests to 8 scrapers happen truly in parallel.
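Fanning out to all sources at once then reduces to one `asyncio.gather`. A sketch under my own naming (the app's actual orchestration may differ):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=8)  # one worker per source

async def search_all(scrapers, query):
    """Run every scraper's blocking buscar() in the thread pool, concurrently."""
    loop = asyncio.get_running_loop()
    tasks = [loop.run_in_executor(executor, s.buscar, query) for s in scrapers]
    # return_exceptions=True: one failing source doesn't sink the rest
    results = await asyncio.gather(*tasks, return_exceptions=True)
    return [r for r in results
            if r is not None and not isinstance(r, BaseException)]
```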
4. Mobile access via local WiFi
This was the trickiest feature. When the user enables "Mobile mode" the app:
- Binds to `0.0.0.0:8765` instead of `127.0.0.1`
- Generates a random PIN + secure token
- Renders a QR code with the URL `http://<LOCAL_IP>:8765/m?token=...`
- Phone scans the QR → opens the mobile route → enters PIN → gets a cookie session
- Mobile UI is a PWA with service worker for offline read access
All while keeping the main desktop UI bound to `127.0.0.1` only. When the user disables mobile mode, the network port closes immediately.
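The PIN + token generation needs nothing beyond the stdlib `secrets` module. A sketch (helper name and lengths are mine, not the app's):

```python
import secrets
import string

def make_mobile_credentials() -> tuple[str, str]:
    """Generate a short human-typeable PIN plus a long token for the QR URL."""
    pin = "".join(secrets.choice(string.digits) for _ in range(6))
    token = secrets.token_urlsafe(32)  # embedded in http://<LOCAL_IP>:8765/m?token=...
    return pin, token
```

When checking the PIN the user types, compare with `secrets.compare_digest()` rather than `==` to avoid leaking information through timing.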
5. PyInstaller distribution
Packaging Python as a `.exe` is famously painful. PyInstaller with `--onedir` gives you a folder with `python.exe`, your dependencies, and your code, ~12MB total.
I built a custom installer in Python (with tkinter UI) that:
- Copies the onedir to `Program Files`
- Creates desktop and start menu shortcuts (via PowerShell and `WScript.Shell`)
- Registers the uninstaller in the Windows Registry
- Sets file associations
No NSIS, no Inno Setup. Just Python.
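The shortcut step can be sketched as building a PowerShell one-liner that drives the `WScript.Shell` COM object. The helper below is illustrative, not the installer's actual code:

```python
def shortcut_script(target: str, link: str) -> str:
    """Build the PowerShell snippet that creates a .lnk via WScript.Shell."""
    return (
        "$ws = New-Object -ComObject WScript.Shell; "
        f"$s = $ws.CreateShortcut('{link}'); "
        f"$s.TargetPath = '{target}'; "
        "$s.Save()"
    )
```

On Windows you would hand the result to `subprocess.run(["powershell", "-NoProfile", "-Command", script])`; keeping the command a pure string builder makes it easy to inspect in tests.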
What I learned
Scope creep is real. I started thinking "just a list manager" and ended with 70+ API routes. Every time I thought I was done, I'd find one more thing to polish.
Documentation is harder than code. Writing the README took me 3 hours. Explaining features so users actually discover them is its own skill.
Notifications are tricky. Web Notifications need:
- User permission (browser API)
- Service worker for background polling
- State sync between server and client
- Anti-spam (don't notify the same episode twice)

Multi-language support shapes architecture. I added `i18n.py` with a dictionary per language and a `T(key)` function. Translating the UI is now a 5-minute task per language.

CI matters. GitHub Actions running `python -m py_compile` on every push has caught more bugs than I'd admit. Free safety net.
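The anti-spam part of notifications can be as small as a seen-set keyed by anime and episode. A minimal sketch (the class name is mine, not the app's, and a real version would persist the set):

```python
class NotificationDeduper:
    """Remembers which (anime_id, episode) pairs have already been announced."""

    def __init__(self) -> None:
        self._seen: set[tuple[int, int]] = set()

    def should_notify(self, anime_id: int, episode: int) -> bool:
        key = (anime_id, episode)
        if key in self._seen:
            return False  # already notified for this episode
        self._seen.add(key)
        return True
```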
What's next
- Web series / movies (the same architecture works for any media type; a Reddit user actually suggested this)
- Auto-detection from VLC/MPV like Taiga does
- Discord rich presence
- More languages
The repo
Code, screenshots, installer for Windows:
github.com/lucasusamentiaga/anime-tracker
MIT licensed. Stars and feedback appreciated.
If you want to follow more of my projects:
- Twitter: @T0ff3_x
- Instagram: @lucasuazz_
Happy to answer technical questions in the comments.