If you've built AI agents that interact with websites, you've hit this wall: login screens.
Your agent needs to check LinkedIn notifications, scrape a dashboard, or post to a platform. But the site demands authentication. So you do what every developer does at first — you open Chrome DevTools, copy the Cookie header, and paste it into your script.
It works. For about 24 hours. Then the session expires, your automation breaks at 3am, and you wake up to angry alerts.
I got tired of this cycle. Chrome already has my authenticated sessions stored locally. I'm logged into LinkedIn right now. What if my agent could just... use that?
Turns out it can. But Chrome doesn't make it easy.
## Where Chrome Stores Cookies
Chrome stores cookies in a SQLite database. The location depends on your OS:
**Mac:**

```
~/Library/Application Support/Google/Chrome/Default/Cookies
```

**Linux:**

```
~/.config/google-chrome/Default/Cookies
```

**Windows:**

```
%LOCALAPPDATA%\Google\Chrome\User Data\Default\Network\Cookies
```
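If you want to resolve the right path in code, a small stdlib-only helper does it. Note the profile name `Default` is an assumption — if you use multiple Chrome profiles, yours may be `Profile 1`, `Profile 2`, and so on:

```python
import os
import sys

def chrome_cookie_path(profile="Default"):
    """Return the default Chrome cookie DB path for the current OS."""
    if sys.platform == "darwin":
        base = "~/Library/Application Support/Google/Chrome"
        return os.path.expanduser(f"{base}/{profile}/Cookies")
    if sys.platform.startswith("linux"):
        return os.path.expanduser(f"~/.config/google-chrome/{profile}/Cookies")
    if sys.platform == "win32":
        base = os.environ.get("LOCALAPPDATA", "")
        return os.path.join(base, "Google", "Chrome", "User Data",
                            profile, "Network", "Cookies")
    raise RuntimeError(f"Unsupported platform: {sys.platform}")
```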
You can open it with any SQLite client:
```bash
sqlite3 ~/Library/Application\ Support/Google/Chrome/Default/Cookies
```
The schema looks like this:
```sql
sqlite> .schema cookies
CREATE TABLE cookies(
    creation_utc INTEGER NOT NULL,
    host_key TEXT NOT NULL,
    name TEXT NOT NULL,
    value TEXT NOT NULL,
    encrypted_value BLOB NOT NULL,
    path TEXT NOT NULL,
    expires_utc INTEGER NOT NULL,
    is_secure INTEGER NOT NULL,
    is_httponly INTEGER NOT NULL,
    ...
);
```
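One quirk worth knowing before you query: `creation_utc` and `expires_utc` are not Unix timestamps. They're microseconds since 1601-01-01 (the Windows epoch). A small converter:

```python
from datetime import datetime, timezone

# Seconds between the Windows epoch (1601-01-01) and the Unix epoch (1970-01-01)
WINDOWS_TO_UNIX_EPOCH_SECONDS = 11644473600

def chrome_time_to_datetime(chrome_utc_micros):
    """Convert a Chrome creation_utc/expires_utc value to a UTC datetime."""
    unix_seconds = chrome_utc_micros / 1_000_000 - WINDOWS_TO_UNIX_EPOCH_SECONDS
    return datetime.fromtimestamp(unix_seconds, tz=timezone.utc)

# The epoch offset itself maps exactly to the Unix epoch
print(chrome_time_to_datetime(11644473600 * 1_000_000))  # 1970-01-01 00:00:00+00:00
```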
Let's query LinkedIn's cookies:
```sql
SELECT name, value, encrypted_value FROM cookies WHERE host_key LIKE '%linkedin%';
```
You'll see something like:
```
name: li_at
value: (empty)
encrypted_value: v10[blob of binary garbage]
```
The value column is empty. Everything interesting is in encrypted_value. And that blob? It's encrypted.
## How Chrome Encrypts Cookies
Chrome started encrypting cookies around 2014 to prevent malware from trivially stealing sessions. The encryption varies by platform.
### Mac: Keychain + AES-128-CBC
On macOS, Chrome:

- Stores a master key in the macOS Keychain under "Chrome Safe Storage"
- Derives an encryption key using PBKDF2 with:
  - Password: the Keychain-stored key
  - Salt: `saltysalt` (yes, literally the string "saltysalt")
  - Iterations: 1003
  - Key length: 16 bytes (AES-128)
- Encrypts each cookie value with AES-128-CBC
- Prepends `v10` to indicate the encryption version
### Linux: libsecret + AES-128-CBC
Linux uses a similar approach but stores the key in GNOME Keyring or KWallet via libsecret. If no keyring is available, Chrome falls back to a hardcoded key: `peanuts` (again, literally).
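A sketch of the Linux lookup, assuming the `secret-tool` CLI (part of libsecret) is installed and that Chrome's keyring entry carries the attribute pair `application chrome` — both assumptions, so verify against your distro. Any failure falls back to the hardcoded `peanuts`:

```python
import subprocess

def get_chrome_key_linux(cmd=('secret-tool', 'lookup', 'application', 'chrome')):
    """Fetch Chrome's master key from the keyring, falling back to 'peanuts'."""
    try:
        result = subprocess.run(list(cmd), capture_output=True, text=True)
        key = result.stdout.strip()
        if key:
            return key
    except OSError:
        pass  # secret-tool not installed, or no D-Bus session available
    return 'peanuts'  # Chrome's hardcoded fallback when no keyring exists
```

From there, key derivation and decryption work the same as on macOS.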
### Windows: DPAPI
Windows uses the Data Protection API (DPAPI), which ties encryption to the Windows user account. This is actually more secure since there's no extractable key — decryption only works for the logged-in user.
## Decrypting Cookies with Python
Let's write the decryption. I'll focus on macOS since that's what I use, but the approach is similar for Linux.
### Step 1: Get the encryption key from Keychain
```python
import subprocess

def get_chrome_key_mac():
    cmd = [
        'security', 'find-generic-password',
        '-s', 'Chrome Safe Storage',
        '-w'
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout.strip()
```
### Step 2: Derive the AES key
```python
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.backends import default_backend

def derive_key(chrome_key):
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA1(),
        length=16,
        salt=b'saltysalt',
        iterations=1003,
        backend=default_backend()
    )
    return kdf.derive(chrome_key.encode())
```
### Step 3: Decrypt the cookie
```python
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def decrypt_cookie(encrypted_value, key):
    # v10 prefix indicates the encryption version
    if encrypted_value[:3] != b'v10':
        return encrypted_value.decode()  # Not encrypted
    encrypted_value = encrypted_value[3:]  # Remove prefix

    # AES-128-CBC with a 16-byte IV of spaces
    iv = b' ' * 16
    cipher = Cipher(
        algorithms.AES(key),
        modes.CBC(iv),
        backend=default_backend()
    )
    decryptor = cipher.decryptor()
    decrypted = decryptor.update(encrypted_value) + decryptor.finalize()

    # Remove PKCS7 padding (last byte encodes the padding length)
    padding_len = decrypted[-1]
    return decrypted[:-padding_len].decode()
```
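To sanity-check this logic without touching a real cookie database, you can round-trip a value through the same scheme: PKCS7-pad a test string, encrypt it with AES-128-CBC and the space IV, prepend `v10`, then decrypt. The key and token below are made up for the test:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.padding import PKCS7

def encrypt_like_chrome(plaintext, key):
    """Mimic Chrome's scheme: PKCS7 pad, AES-128-CBC, space IV, 'v10' prefix."""
    padder = PKCS7(128).padder()
    padded = padder.update(plaintext.encode()) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(b' ' * 16)).encryptor()
    return b'v10' + encryptor.update(padded) + encryptor.finalize()

def decrypt_like_chrome(encrypted_value, key):
    """Inverse of the above; same logic as decrypt_cookie."""
    if encrypted_value[:3] != b'v10':
        return encrypted_value.decode()
    decryptor = Cipher(algorithms.AES(key), modes.CBC(b' ' * 16)).decryptor()
    decrypted = decryptor.update(encrypted_value[3:]) + decryptor.finalize()
    return decrypted[:-decrypted[-1]].decode()

key = os.urandom(16)  # stand-in for the PBKDF2-derived key
token = encrypt_like_chrome('AQEDAT-fake-session-token', key)
assert decrypt_like_chrome(token, key) == 'AQEDAT-fake-session-token'
```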
### Putting it together
```python
import os
import sqlite3
import shutil
import tempfile

def get_linkedin_cookies():
    # Copy the database (Chrome locks the original)
    cookie_path = os.path.expanduser(
        '~/Library/Application Support/Google/Chrome/Default/Cookies'
    )
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        shutil.copy2(cookie_path, tmp.name)

    conn = sqlite3.connect(tmp.name)
    cursor = conn.cursor()
    cursor.execute('''
        SELECT name, encrypted_value
        FROM cookies
        WHERE host_key LIKE '%linkedin%'
    ''')

    chrome_key = get_chrome_key_mac()
    aes_key = derive_key(chrome_key)

    cookies = {}
    for name, encrypted_value in cursor.fetchall():
        cookies[name] = decrypt_cookie(encrypted_value, aes_key)

    conn.close()
    os.unlink(tmp.name)
    return cookies
```
```python
# Usage
cookies = get_linkedin_cookies()
print(cookies)
# {'li_at': 'AQEDAT...', 'JSESSIONID': 'ajax:123...', ...}
```
It works. You now have decrypted session cookies that you can use in requests:
```python
import requests

response = requests.get(
    'https://www.linkedin.com/feed/',
    cookies=cookies
)
# You're authenticated
```
## Why a Script Isn't Enough
So we can decrypt cookies. Problem solved?
Not quite. This script has issues:
- **Security:** The decrypted cookies are now in plaintext in your script's memory, your logs, potentially your git history. Session tokens are as sensitive as passwords.
- **No scoping:** Any script can access any cookie. Your "LinkedIn agent" can also read your bank cookies. That's a security nightmare.
- **No audit trail:** When something goes wrong (and it will), you have no idea which agent accessed what, when.
- **Session management:** Cookies expire. Sites rotate tokens. You need to track freshness and know when to re-authenticate.
- **Multi-agent chaos:** When you have 5 agents accessing 10 sites, the cookie management becomes its own project.
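For the session-management point, a minimal freshness check against the raw `expires_utc` value can at least tell you when re-authentication is due. This sketch treats 0 as a session cookie, which Chrome keeps for the life of the browser session:

```python
import time

# Seconds between the Windows epoch (1601-01-01) and the Unix epoch (1970-01-01)
WINDOWS_TO_UNIX_EPOCH_SECONDS = 11644473600

def is_expired(expires_utc, now=None):
    """True if a Chrome expires_utc timestamp (microseconds) is in the past.

    expires_utc == 0 means a session cookie, which has no fixed expiry.
    """
    if expires_utc == 0:
        return False
    now = time.time() if now is None else now
    return expires_utc / 1_000_000 - WINDOWS_TO_UNIX_EPOCH_SECONDS < now
```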
## From Script to Tool
This is why I built AgentAuth.
The architecture:
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│ Chrome Extension│────▶│ Encrypted Vault │────▶│   Your Agent    │
│  (export once)  │     │    (AES-256)    │     │ (scoped access) │
└─────────────────┘     └─────────────────┘     └─────────────────┘
```
**Chrome Extension:** You click "Export" while logged into a site. The extension captures the cookies and sends them to the local vault. No more DevTools copy-paste.

**Encrypted Vault:** Cookies are stored in a local SQLite database, encrypted with AES-256 using a password you control. Not scattered across scripts.

**Scoped Agents:** You create named agents with specific domain access. Your `linkedin-agent` can only access `linkedin.com` cookies. It can't touch your bank.

**Audit Logging:** Every access is logged with timestamp, agent name, and domain.
The code becomes:
```python
from agent_auth import Vault

vault = Vault()
vault.unlock("password")
cookies = vault.get_session("linkedin.com")

# Use with requests
response = requests.get('https://linkedin.com/feed', cookies=cookies)

# Or with Playwright
context.add_cookies(cookies)
```
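One caveat on the Playwright line: `requests` takes a name→value dict, but Playwright's `add_cookies` expects a list of dicts with explicit `domain` and `path` fields. If your cookies arrive as a plain mapping (an assumption about the vault's return shape), a small adapter bridges the two:

```python
def to_playwright_cookies(cookies, domain, path='/'):
    """Convert a name->value cookie dict to Playwright's add_cookies format."""
    return [
        {'name': name, 'value': value, 'domain': domain, 'path': path}
        for name, value in cookies.items()
    ]

# Example with a made-up token; pass the result to context.add_cookies(...)
pw_cookies = to_playwright_cookies({'li_at': 'AQEDAT...'}, domain='.linkedin.com')
```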
One line to get authenticated cookies. No decryption code. No hardcoded tokens. No security nightmares.
## The Bigger Picture
Here's what I realized while building this: session management for AI agents is unsolved infrastructure.
We have OAuth for user-facing apps. We have API keys for server-to-server. But AI agents? They're stuck with the same hacks we used in 2010 for web scraping.
The industry is building increasingly sophisticated agents — agents that can browse, fill forms, make purchases. But we're still copy-pasting cookies from DevTools.
AgentAuth is my attempt to fix this. It's open source, works today on any site (no OAuth adoption required), and integrates with LangChain, Playwright, and n8n.
**Links:**

- GitHub: github.com/jacobgadek/agent-auth
- PyPI: `pip install agentauth-py`
- n8n node: `npm install n8n-nodes-agentauth`
If you're building agents that need authentication, give it a try. And if you want to contribute — especially Windows DPAPI support or a Firefox extension — PRs are welcome.
If you found this useful, consider starring the repo. It helps others discover the project.