Introduction
If you're a developer, you probably know roadmap.sh. It's the gold standard for figuring out what to learn, packed with community-curated articles and videos.
Then there's NotebookLM. For me, this is the "holy grail" of studying. You feed it a few links, and it generates these incredibly fun, informative "Deep Dive" podcasts. I started listening to them on my daily commute, and honestly, I've never absorbed complex tech topics faster.
The Problem: The Backend roadmap has over 130 nodes. Each node has about 7 resources.
Doing the math? That's nearly 1,000 links to manually copy-paste.
I tried doing it by hand for about ten minutes before realizing I'd be there all week. So, I did what any developer would do: I automated it.
WaseemAldemeri/roadmap-to-notebooklm
Populate Google NotebookLM with learning resources from any roadmap.sh roadmap: one notebook per topic, ready for AI-generated podcasts, flashcards, and quizzes.
The pipeline extracts every topic from a roadmap, finds the community-curated YouTube videos and articles linked to each one, creates a dedicated NotebookLM notebook per topic, and uploads all the sources into it. Once your notebooks are loaded, a small CLI lets you fuzzy-search through them and fire off study material generation (podcasts, flashcards, quizzes, etc.) in bulk — without copy-pasting prompts everywhere.
Notebook creation and source ingestion are automated using notebooklm-py, a community-built Python package for the NotebookLM API.
Defaults to the Backend roadmap. To use it with any other roadmap.sh roadmap, see the Configuration section.
How it works
roadmap.sh GitHub repo
│
▼
get_resources.py ← scrapes topics + resource links → <roadmap>_resources.json
│
▼
create_notebooks.py ← creates one NotebookLM notebook per topic, uploads all sources
│
▼
generate_study_material_for_notebook.py ← interactive CLI to generate study materials

Seeing it in Action
Example of a generated notebook about gRPC
How I Built the Bridge
I wanted a simple, three-stage pipeline that didn't over-engineer the problem. I used a community-built wrapper called notebooklm-py to handle the heavy lifting with Google's internal API.
1. The Scraper (get_resources.py)
First, I needed the data. I targeted the open-source repo behind roadmap.sh. The script recursively digs through their JSON structures and markdown files to find every curated YouTube video and article link.
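The core of that extraction step is just pattern-matching markdown links. Here's a minimal sketch of the idea (my own simplified version, not the repo's actual code); it assumes roadmap.sh content files list curated resources as standard markdown links:

```python
import re

# Matches markdown links like "[Some Title](https://example.com/page)".
MD_LINK = re.compile(r"\[(?P<title>[^\]]+)\]\((?P<url>https?://[^)\s]+)\)")

def extract_resources(markdown: str) -> list[dict]:
    """Pull every (title, url) pair out of a topic's markdown file."""
    return [
        {"title": m.group("title"), "url": m.group("url")}
        for m in MD_LINK.finditer(markdown)
    ]

sample = """# gRPC
- [gRPC Crash Course](https://www.youtube.com/watch?v=abc123)
- [Introduction to gRPC](https://grpc.io/docs/what-is-grpc/introduction/)
"""
print(extract_resources(sample))  # two resources: one video, one article
```

The real script also walks the roadmap's JSON node tree to pair each content file with its topic, but the link extraction boils down to this.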
2. The Heavy Lifting (create_notebooks.py)
This is where the magic happens. The script logs into NotebookLM (reusing a local session) and:
- Creates a dedicated notebook for every single topic.
- Generates a "context" markdown file so the AI knows why it's looking at these links.
- Uploads the resources automatically.
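The "context" file can be as simple as a short markdown blurb rendered per topic. A minimal sketch of what I mean (function and field names here are illustrative, not the repo's actual API):

```python
def build_context_md(topic: str, roadmap: str, urls: list[str]) -> str:
    """Render a small markdown file that tells NotebookLM what this
    notebook is about before the raw links get uploaded."""
    lines = [
        f"# {topic}",
        "",
        f"This notebook collects community-curated resources for the "
        f"'{topic}' topic from the roadmap.sh {roadmap} roadmap.",
        "",
        "## Sources",
    ]
    lines += [f"- {url}" for url in urls]
    return "\n".join(lines)

print(build_context_md("gRPC", "Backend", ["https://grpc.io/docs/"]))
```

Uploading this alongside the links gives the AI a framing document, so the generated podcast opens with "today we're covering gRPC in a backend context" instead of guessing from a pile of URLs.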
Note: Google doesn't love it when you create 130 notebooks in 2 seconds, so I added some deliberate asyncio.sleep pauses to keep things civil.
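The pacing logic is nothing fancy: create notebooks sequentially and sleep between requests. A stripped-down sketch (the `create_notebook` stub stands in for the real notebooklm-py call, whose exact signature I'm not reproducing here):

```python
import asyncio

async def create_notebook(topic: str) -> str:
    # Stand-in for the real notebooklm-py call; pretend this is a
    # network round-trip that returns a notebook identifier.
    await asyncio.sleep(0)
    return f"notebook:{topic}"

async def create_all(topics: list[str], pause_s: float = 2.0) -> list[str]:
    """Create notebooks one at a time, pausing between requests so we
    don't hammer the unofficial NotebookLM endpoint."""
    created = []
    for topic in topics:
        created.append(await create_notebook(topic))
        await asyncio.sleep(pause_s)  # deliberate pause, keeps things civil
    return created

print(asyncio.run(create_all(["gRPC", "Redis"], pause_s=0.0)))
```

Sequential-with-sleep is slower than firing everything concurrently, but for a one-time setup job, finishing in twenty minutes beats getting rate-limited ten notebooks in.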
3. The Interactive CLI (generate_study_material.py)
Since you can't (and shouldn't) generate 130 podcasts at once, I built a snappy CLI. It uses InquirerPy for fuzzy searching.
When I want to study "Redis" or "gRPC," I just type a few letters, select the topic, and hit Enter. The script triggers the generation on Google's servers, and by the time I've grabbed a coffee, my study materials are ready.
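InquirerPy handles the actual prompt UI, but the idea behind fuzzy filtering is easy to show in isolation: a candidate matches if the query's characters appear in it, in order. A toy stand-in (not what InquirerPy does internally, just the classic subsequence test fuzzy finders start from):

```python
def fuzzy_match(query: str, candidate: str) -> bool:
    """True if every character of `query` appears in `candidate`
    in the same order (case-insensitive subsequence test)."""
    it = iter(candidate.lower())
    return all(ch in it for ch in query.lower())

topics = ["Redis", "gRPC", "Message Brokers", "GraphQL"]
print([t for t in topics if fuzzy_match("grpc", t)])  # → ['gRPC']
```

Typing three or four characters is usually enough to narrow 130 topics down to one, which is why the CLI feels instant.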
The Takeaway
I've learned more in the last few weeks using this setup than I have in months of "doom-scrolling" documentation.
If you want to set this up for yourself (it works for Frontend, DevOps, or any other roadmap too), check out the GitHub repo!
Originally published at demeri.dev.