DEV Community

BloodAndCode

I almost leaked an API key into ChatGPT, so I built a Chrome extension

I use AI chats a lot when coding and analyzing logs.

A few days ago I almost pasted a real API key into ChatGPT while sharing some logs.

I noticed it just before sending.

But it made me realize something:

It's extremely easy to accidentally leak sensitive data when using AI chats.

So I built a small Chrome extension called PasteSafe.


What it does

When you paste text into an AI chat, the extension scans the content and detects things like:

  • API keys
  • emails
  • phone numbers
  • IBANs
  • UUIDs
  • URLs

If something sensitive is detected, it automatically masks the values before sending.

Example:
API_KEY → [API_KEY#1]
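Under the hood, this kind of detection can be done with plain regexes plus numbered-placeholder substitution. Here's a minimal sketch of the idea; the patterns and the `maskText` helper are simplified stand-ins of my own, not PasteSafe's actual rules:

```javascript
// Simplified detector patterns -- illustrative only; real rules are stricter.
const PATTERNS = {
  API_KEY: /\bsk-[A-Za-z0-9]{16,}\b/g, // e.g. OpenAI-style secret keys
  EMAIL:   /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g,
  UUID:    /\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\b/gi,
};

// Replace each match with a numbered placeholder like [API_KEY#1],
// so distinct values stay distinguishable in the conversation.
function maskText(text) {
  const counters = {};
  let out = text;
  for (const [label, re] of Object.entries(PATTERNS)) {
    out = out.replace(re, () => {
      counters[label] = (counters[label] || 0) + 1;
      return `[${label}#${counters[label]}]`;
    });
  }
  return out;
}
```

Numbering the placeholders matters: if a log contains two different keys, the model can still tell them apart without ever seeing the real values.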


Works with

  • ChatGPT
  • Claude
  • Gemini

Privacy first

Everything runs locally in the browser.

  • no servers
  • no tracking
  • no data collection
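To make "runs locally" concrete: a content script can intercept the paste event and rewrite it entirely in the page, with no network call anywhere. A hypothetical sketch, assuming a `maskText` helper like the one described above (the names are mine, not the extension's actual code):

```javascript
// Intercept a paste event locally. Returns the masked text if anything
// sensitive was found (so the caller can insert it instead), else null.
function interceptPaste(event, maskText) {
  const raw = event.clipboardData.getData("text/plain");
  const masked = maskText(raw);
  if (masked === raw) return null; // nothing sensitive: let the paste through
  event.preventDefault();          // block the raw paste
  return masked;
}

// Wiring it up in the browser (skipped outside a DOM environment):
if (typeof document !== "undefined") {
  document.addEventListener("paste", (e) => {
    const masked = interceptPaste(e, (t) => t.replace(/\bsk-\w{16,}\b/g, "[API_KEY]"));
    if (masked !== null) document.execCommand("insertText", false, masked);
  }, true); // capture phase, so this runs before the chat UI's own handler
}
```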

Try it

https://pastsafe-ext.github.io/pastesafe/


GitHub repo:

https://github.com/pastsafe-ext/pastesafe


Curious:

Have you ever accidentally pasted something sensitive into an AI chat?

Top comments (1)

BloodAndCode
One thing that surprised me while testing this:
emails and IDs appear in pasted logs much more often than API keys.
Curious what others accidentally paste into AI chats.