DEV Community

Mr. 0x1

How I Found $300,000 Worth of Secrets in a Download Button

A tale of curiosity, incompetence, and why you should never trust a software engineer who makes more than you.


In Which I Click a Button Like a Normal Person

It started, as most disasters do, with mild curiosity and a free afternoon.

I downloaded an application. Not because I'm a hacker. Not because I'm conducting corporate espionage. Not because I have any idea what I'm doing. I downloaded it because I wanted to use it.

Revolutionary concept, I know.

The installer was a .exe file. For the uninitiated, this is the software equivalent of a wrapped gift. And like any gift from a stranger on the internet, I decided to unwrap it.

"What's inside?" I wondered, the way a child wonders what's inside a clock before destroying it with a hammer.


The Electron Inside

Every modern desktop application, it turns out, is just a website pretending to be software.

It's like finding out your "homemade" meal came from a freezer bag—technically real, philosophically disappointing.

This particular application was built with Electron, which means somewhere inside was a file called app.asar. Think of it as a zip file that really, really wants you to think it's not a zip file.

I extracted it:

```shell
npx asar extract app.asar ./unpacked
```

Inside was JavaScript. Thousands of lines of minified, obfuscated JavaScript that looked like someone had sneezed on a keyboard and called it architecture.

And there, sitting in the open like a wallet on a park bench, was a .env file.


The .env File, or: How to Fail at Security 101

For those blissfully unaware, a .env file is where developers store secrets. API keys. Database credentials. The sort of things you absolutely, positively, under no circumstances should ship to production.
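For illustration, a `.env` that ships to production tends to look something like this. Every value below is invented; it is a sketch of the genre, not the actual file:

```
# entirely hypothetical — the real one was worse
API_KEY=sk_live_xxxxxxxxxxxxxxxx
DATABASE_URL=postgres://admin:hunter2@db.internal.example.com:5432/prod
ANALYTICS_TOKEN=a1b2c3d4e5f6
INFERENCE_ENDPOINT=https://gpu.internal.example.com/v1/infer
```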

It's Security 101. Literally. It's the first thing they teach you:

🚨 Rule #1 of Software Development: Don't commit your .env file.

This is not advanced knowledge. This is not arcane wisdom passed down through generations of security researchers.

This is the "wash your hands after using the bathroom" of software development.

And yet.

There it was. Gleaming. Unencrypted. Full of credentials.


What I Found

I won't name names. I won't point fingers. I'll simply describe what I found, in the same way a nature documentary describes a lion eating a gazelle: with clinical detachment and mild horror.

| Discovery | Severity | My Reaction |
| --- | --- | --- |
| API Keys | 🔴 Critical | Multiple. Active. Expensive. |
| Infrastructure URLs | 🔴 Critical | Internal endpoints. Very not public. |
| Service Credentials | 🟠 High | Analytics logging everything. |
| ML Inference Endpoints | 🟠 High | Cloud GPUs go brrrr on their dime. |

The total potential exposure?

Let's just say it was significant enough that I briefly considered a career change.


The $300,000 Question 💸

Now, here's where it gets personal.

The engineer who shipped this? Based on industry averages, location, and the general state of the tech job market, they're probably making around $300,000 a year.

Three. Hundred. Thousand. Dollars.

To do the software equivalent of leaving your house keys under the doormat, except the doormat is see-through and you've put up a sign that says "KEYS UNDER HERE."

I'm not bitter. I'm not bitter at all.

I am simply noting, for the record, that I—a person of humble curiosity—managed to find this in approximately forty-five minutes of casual investigation while eating leftover pizza.

Meanwhile, somewhere, a senior software engineer is collecting stock options.

📍 Plot twist: The pizza was cold. The credentials were not.


The Investigation Continues

Having found the obvious vulnerabilities, I did what any responsible researcher would do: I kept looking.

The JavaScript bundle was minified, but minification is obfuscation in the same way a trench coat is a disguise. It technically conceals things, but anyone who looks for more than five seconds can see what's underneath.

I found:

  • 🗂️ Source map hints pointing to internal repositories
  • 🐛 Debug symbols that should have been stripped
  • 📋 Hardcoded configuration copy-pasted from dev
  • 📡 gRPC definitions outlining the entire API structure
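If you want to reproduce the hunt on your own bundle, here's a rough sketch. The directory name follows the earlier `asar extract`; the patterns are starting points, not an exhaustive list:

```shell
# Sketch: poke an extracted bundle for the leftovers listed above.
check_leftovers() {
  bundle="$1"
  grep -rn "sourceMappingURL" "$bundle" 2>/dev/null | head -5   # source-map breadcrumbs
  find "$bundle" -name "*.map" 2>/dev/null                      # shipped source maps
  grep -rln "localhost\|\.internal\." "$bundle" 2>/dev/null     # dev/internal endpoints
}

# check_leftovers ./unpacked   # any output is a conversation with your team
```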

Each discovery was like opening a nesting doll, except instead of smaller dolls, it was smaller failures.


The Moral of the Story

If you've made it this far, you might be expecting a dramatic conclusion. A confrontation with the company. A bug bounty payout. A heartfelt apology from a CEO.

Instead, I'll offer you something more valuable: a lesson.

To the Developers 👩‍💻

Your build pipeline is not a security feature. Electron apps are zip files with extra steps. Minification is not encryption.

And for the love of all that is holy, check what you're shipping before you ship it.

```shell
# Before you ship, maybe run:
npx asar extract your-app.asar ./check-this
grep -r "API_KEY\|SECRET\|PASSWORD" ./check-this
```

To the Companies 🏢

That engineer you're paying $300,000?

Maybe budget $50 for a security audit. I'll do it. I'm available. I have pizza.

To the Users 👤

Every application you download is a mystery box.

The mystery is usually "how badly is my data being handled?"

The answer is usually "badly."


Responsible Disclosure

I want to be clear: I didn't exploit anything.

I didn't access systems I wasn't supposed to. I looked at what was shipped to me, as a user, in an application I downloaded from their official website.

Everything I found was sitting in a package that anyone with fifteen minutes and a search engine could have extracted. The only sophisticated tools I used were npx and a vague sense of disbelief.

This article names no names. Points no fingers that haven't already been pointed by the act of shipping credentials in a desktop application.


Epilogue: The Download Button

I still use the application. It's actually quite good.

I just use it with the quiet knowledge that somewhere, in a data center, there's a server running endpoints I wasn't supposed to know about, processing requests through an API I could technically call, protected by credentials that are sitting in my Downloads folder.

The download button that started all this sits innocently on their website, cheerfully inviting users to install their app.

Beneath it, there should probably be a disclaimer:

"By downloading this software, you agree to receive a free education in application security."


TL;DR Checklist

If you ship Electron apps, please check:

  • [ ] No .env files in your build
  • [ ] No hardcoded API keys
  • [ ] No internal URLs exposed
  • [ ] No debug symbols in production
  • [ ] Source maps are NOT included
  • [ ] You've actually looked inside your .asar file
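For the literal-minded, the checklist collapses into a few lines of shell. Treat it as a sketch: the bundle path is whatever your build produces, and the grep patterns should match your own naming conventions:

```shell
# Sketch of a pre-ship audit. Assumes you've already run something like:
#   npx asar extract dist/app.asar /tmp/audit
audit_bundle() {
  dir="$1"
  # .env files, shipped source maps, and anything that smells like a secret
  find "$dir" \( -name ".env*" -o -name "*.map" \) 2>/dev/null
  grep -rl "API_KEY\|SECRET\|PASSWORD" "$dir" 2>/dev/null
}

# audit_bundle /tmp/audit   # any output at all means: do not ship
```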

If you found credentials in your own app while reading this, you're welcome.

If you're the $300k engineer who shipped this... we should talk.


The author is a security researcher in the same way that someone who finds a wallet on the ground is a "detective." DMs are open. Pizza recommendations welcome.
