In recent years, as AI has transformed how we code, I've noticed the same questions popping up everywhere:
"What will you do if AI disappears?"
"C...
I had the same problems. Last summer my internet was constantly disconnecting, and on top of that it was quite slow because I was living far from the city. I coped with a light local LLM and a couple of GB of bandwidth; it was rough, but I could still solve simple problems or just chat. I also had a lot of games installed, so my daily routine depended on internet access.
Hello, fellow survivor!! You said your daily routine depended on internet access. How did you check when you were connected? Because for us, the disconnection had 8 layers of access: one day we had GitHub, but the next day we didn't even have Google. What I mean to ask is: did you have a script constantly running to check pings for the sites you need? And then would you have to suddenly jump up and go to work in the middle of the night?
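Something like this rough Python sketch is what I have in mind when I say "a script constantly running" (the site list and port are just examples, swap in whatever layers you need to watch):

```python
import socket

# Hostnames here are examples -- list the sites your workflow actually needs.
SITES = ["github.com", "google.com", "pypi.org"]

def is_reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Try a plain TCP connect; True means the site answered at all."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, refused, timed out -- all count as "down"
        return False

# Usage (run this in a loop / cron job):
#   for site in SITES:
#       print(site, "UP" if is_reachable(site) else "DOWN")
```

A TCP connect is cruder than a real HTTP check, but it distinguishes "GitHub is up, Google isn't" without depending on ICMP ping being allowed.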
Access to the internet was simply binary: either it was there or it wasn't, a full "shutdown". Slow website loading was a typical problem you just get used to. I had to work in the evening for several reasons: there was internet, and it wasn't so hot (it was about 30°C during the day)!
I just suddenly thought that coding in a 30-degree environment would be literal HELL, even more so than living without internet. lol
So I close all the doors and windows and lock myself in a dimly lit room. The weather is generally nice: +30 in the summer, -20 in the winter (I live on the very top floor, so coding with the draft and the howling wind outside makes for excellent UX!).
Don't you guys have coolers over there? Our cooler runs full blast in summer. I mean, we're kind of in a desert too, but a cooler doesn't cost much, so it's like a nice breeze in summer.
P.S. A dimly lit room on top of the world, coding with the wind outside... that sounds like a coding dream, lol.
Running the cooler and the computer at the same time means double the costs, lol!
Wait, so you had to live without a cooler? I mean like an air conditioner. I just remembered the word lol.
Yeah!
My 30GB survival kit:
Local documentation (the real bottleneck)
Language docs: Python, JS/TS, Go, Rust, Bash
Runtime docs: Node.js, Python stdlib, Docker CLI
Framework docs: React, Vue, Django, Flask, Express
Database docs: SQLite, Postgres, MySQL
OS docs: Linux man pages, systemd, networking basics
Why: When the internet dies, my memory becomes my API. Local docs restore my working memory.
Local package mirrors (small, curated, essential)
A frozen snapshot of the packages I actually use
Not the whole npm or PyPI—just the 200–500 packages my projects depend on
Include lockfiles and wheels/tarballs
Why: My builds shouldn’t depend on a global registry to exist.
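A snapshot like that can be frozen with a small Python wrapper around `pip download` (the file and folder names here are just examples, not a fixed layout):

```python
import subprocess
from pathlib import Path

def snapshot_cmd(requirements: str, dest: str) -> list[str]:
    """Build the `pip download` command that saves every wheel/tarball
    listed in a requirements file into a local directory."""
    return [
        "python", "-m", "pip", "download",
        "-r", requirements,
        "-d", dest,
    ]

def take_snapshot(requirements: str = "requirements.txt",
                  dest: str = "local-mirror") -> None:
    """Freeze the exact packages my projects depend on, while I still can."""
    Path(dest).mkdir(exist_ok=True)
    subprocess.run(snapshot_cmd(requirements, dest), check=True)

# Later, when the registry is gone, installs come from the folder instead:
#   pip install --no-index --find-links local-mirror -r requirements.txt
```

The `--no-index --find-links` pair is what makes the build survive: pip never even tries to reach PyPI.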
Local runtimes and installers
Python installers for 3–4 versions
Node LTS installers
JDK
SQLite binaries
Git installers
VS Code offline installer
Why: If my machine dies, I can rebuild my environment from scratch.
Local templates and scaffolds
My preferred project skeletons
Dockerfile templates
CI/CD configs
HTML/CSS/JS starter kits
My own “Bootstrap offline” folder
Why: I shouldn’t need the internet to start a new project or fix an old one.
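As a sketch, starting a project from those local skeletons is just a directory copy; the `templates/` layout below is an assumption, not a convention:

```python
import shutil
from pathlib import Path

# Assumed layout: templates/flask-api, templates/static-site, etc.
TEMPLATES = Path("templates")

def scaffold(template: str, project: str) -> Path:
    """Copy a local project skeleton into a new directory -- no network."""
    src = TEMPLATES / template
    dest = Path(project)
    shutil.copytree(src, dest)  # fails loudly if dest already exists
    return dest

# Usage:
#   scaffold("flask-api", "my-new-service")
```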
Local reference books (PDF)
Algorithms
Networking
Operating systems
Database design
Security fundamentals
Git fundamentals
One or two “cookbook” style books for my main languages
Why: Books are the highest‑density offline cognition I can store.
Local tools that don’t need cloud identity
A code editor
A diff/merge tool
A local API testing tool
A local Markdown previewer
A local SQLite browser
Why: Tools that require cloud login are not tools—they’re outages waiting to happen.
Local backups of my own work
My repos
My notes
My scripts
My configs
My dotfiles
My documentation
My diagrams
Why: My own code is the only code I truly own.
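A rough Python sketch of how all of that can be bundled into one dated archive (the folder names are examples; point them at your real repos and dotfiles):

```python
import tarfile
import time
from pathlib import Path

# Example folders -- substitute whatever you actually own.
WHAT_I_OWN = ["repos", "notes", "scripts", "configs", "dotfiles"]

def backup(folders: list[str], dest_dir: str = "backups") -> Path:
    """Bundle everything I actually own into one dated tar.gz."""
    Path(dest_dir).mkdir(exist_ok=True)
    archive = Path(dest_dir) / time.strftime("work-%Y%m%d.tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        for folder in folders:
            if Path(folder).exists():  # skip folders that don't apply
                tar.add(folder)
    return archive

# Usage:
#   backup(WHAT_I_OWN)
```

Copy the result to an external drive; a backup that lives on the same disk dies with it.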
Local “offline Stack Overflow”
Curated Q&A dumps for the languages I use
Offline copies of the most common error explanations
My own troubleshooting notes
Why: Most debugging is pattern recognition. Store the patterns.
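Storing the patterns can be as low-tech as grepping your own notes. A minimal sketch, assuming the notes are Markdown files under a `notes/` folder:

```python
from pathlib import Path

def search_notes(error: str, notes_dir: str = "notes") -> list[str]:
    """Find every past sighting of an error message in my own notes."""
    hits = []
    for path in Path(notes_dir).rglob("*.md"):
        lines = path.read_text(errors="ignore").splitlines()
        for lineno, line in enumerate(lines, 1):
            if error.lower() in line.lower():
                hits.append(f"{path}:{lineno}: {line.strip()}")
    return hits

# Usage:
#   for hit in search_notes("ECONNREFUSED"):
#       print(hit)
```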
Local databases and datasets
SQLite
Sample datasets
JSON fixtures
Mock API responses
Why: I need something to test against when the real APIs are gone.
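Those JSON fixtures can stand in for the real APIs with nothing but the standard library. A sketch, assuming a `fixtures/` folder where `fixtures/users.json` is served at `GET /users` (the names are illustrative):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from pathlib import Path

# Assumed layout: fixtures/users.json -> GET /users, etc.
FIXTURES = Path("fixtures")

class MockAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        fixture = FIXTURES / (self.path.strip("/") + ".json")
        if fixture.exists():
            body = fixture.read_bytes()
            self.send_response(200)
        else:
            body = json.dumps({"error": "no fixture"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run:
#   HTTPServer(("localhost", 8000), MockAPI).serve_forever()
```

Point the app's base URL at `localhost:8000` and development carries on as if the real API were still there.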
Local system‑rebuild kit
OS ISO
Drivers
Firmware
Partitioning tools
Backup/restore scripts
Why: If my machine dies during an outage, I need to resurrect it without the cloud.
WELL DAMN! That is a complete survival kit. I'm sorry to say it, but this kit has been STOLEN. Mate, this was the most complete kit I have seen. I'll save it for myself, lol. Especially "Python installers for 3–4 versions": it actually didn't cross my mind. Thanks for your time!
lol!
This is a really good angle. I'm almost sure that most SaaS companies don't have plans for this kind of situation, and most of the developers working at those companies couldn't work if it happened.
Thanks for the compliment! You would not believe how many startups and small companies vanished without a trace in a matter of days!! It was horrifying. Big companies like Digikala (think of it as the Iranian Amazon) already had everything saved in large data centers, but that is a luxury small companies cannot afford. That's why it's important to have a survivor's kit. Because I think the dark truth is, no one cares about or understands the devs more than the devs themselves.
This is a great take and congrats on the first post!
For my ASL game, I had to download everything locally because of a "can it run offline" requirement. It was a pain, since sometimes one library calls another library, so I had to track each one down.
I agree that we shouldn't rely on AI, or the internet in general. We have to learn to do things the hard way before learning the easy way; otherwise, we may never know the worst-case scenario. Great work! Hope your journey goes well :D
Thanks for the encouragement!! I'm asking simply out of curiosity: how much storage did the finished ASL game take? Constantly downloading and tracking libraries simply feels like too much.
I used Tauri to export it as a desktop app. I checked, and it is 22.4 MB, which is impressive considering the fact that it uses ML MediaPipe. I tend to download the files that usually end with [name].min.js, so that kind of contributed to it. If you are thinking of creating desktop apps using web dev, use Tauri. It is good!
Thanks! I'll make sure to remember it!