We all use databases, whether it's PostgreSQL, Firebase, or a NoSQL store like MongoDB. They're the backbone for storing and accessing data in a structured way.
But here’s the thing: your database lives on a separate server. And when your web server (maybe running on Render or some cloud platform) tries to talk to it, things can go wrong — connectivity issues, timeouts, or even temporary downtime.
So what if a critical operation fails? Like logging a suspicious login attempt?
🔁 That’s where fallback logs come in.
I was working on strengthening login security to prevent brute force attacks. You’d think brute force isn’t easy to pull off — but once you understand how attackers operate, you realize it's a real threat.
To stop it, I used the express-rate-limit middleware in Node.js — setting request limits per IP on the login route. Pretty standard.
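The setup looked roughly like this. Treat it as a sketch: the window size and attempt limit below are illustrative values, not necessarily the ones from my project.

```js
const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();

// Limit each IP to a handful of login attempts per window.
// windowMs and max here are illustrative, tune them for your own app.
const loginLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15-minute window
  max: 5,                   // block the 6th request from the same IP within the window
  standardHeaders: true,    // send RateLimit-* response headers
  legacyHeaders: false,     // disable the old X-RateLimit-* headers
});

// Apply the limiter only to the login route
app.post('/login', loginLimiter, (req, res) => {
  // ...authentication logic...
  res.send('ok');
});
```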
Now, I was logging each login attempt's IP address and userID into MongoDB.
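The logging itself was a plain insert, roughly like this (the LoginAttempt model and its field names are illustrative, and it assumes mongoose.connect() has already been called elsewhere):

```js
const mongoose = require('mongoose');

// Illustrative schema: all the log really needs is the IP and the user ID
const LoginAttempt = mongoose.model('LoginAttempt', new mongoose.Schema({
  ip: String,
  userId: String,
  createdAt: { type: Date, default: Date.now },
}));

async function logLoginAttempt(ip, userId) {
  await LoginAttempt.create({ ip, userId });
}
```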
But then a few problems came up:
🔸 Problem 1: MongoDB Fails? Data Loss.
So I added a fallback using Node’s fs module. If MongoDB logging fails, the data (IP + userID) goes into a local file. Simple but effective — no loss of important info.
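Here is a sketch of that fallback, reusing the hypothetical logLoginAttempt helper from above and an arbitrary file name:

```js
const fs = require('fs/promises');

async function logWithFallback(ip, userId) {
  try {
    // Normal path: write the attempt to MongoDB
    await logLoginAttempt(ip, userId);
  } catch (err) {
    // MongoDB is unreachable or the write failed:
    // append the entry to a local file so nothing is lost.
    const line = JSON.stringify({ ip, userId, time: new Date().toISOString() }) + '\n';
    await fs.appendFile('failed-login-logs.jsonl', line);
  }
}
```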
🔸 Problem 2: IP Address was always ::1 or 127.0.0.1 on Render
Render (like many cloud platforms) puts your app behind a reverse proxy, so Express only sees the proxy's connection, not the client's. The real client IP arrives in the X-Forwarded-For header, which Express ignores by default, so every attempt was logged as localhost.
This cripples the brute force protection: every request, whether from an attacker or a legitimate user, appears to come from the same IP, so they all share a single rate-limit bucket.
✅ Fix: I told the Express app to trust the proxy by adding:
```js
app.set('trust proxy', true);
```
After this, the correct IPs started showing up via req.ip. Now the rate limiter and logging actually work as intended.
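One quick way to confirm the fix is a throwaway route that echoes back what Express now sees (the route path here is just for illustration):

```js
// Temporary route to verify that proxy headers are being honoured.
// req.ip should now show the real client address instead of 127.0.0.1 or ::1.
app.get('/whoami', (req, res) => {
  res.json({
    ip: req.ip,                                   // what express-rate-limit keys on
    forwardedFor: req.headers['x-forwarded-for'], // raw header set by the platform's proxy
  });
});
```

Note that `trust proxy` also accepts a hop count or a list of addresses if you'd rather trust only the platform's proxy instead of every hop in the chain.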
🧠 Takeaway: You can’t fully rely on external systems, whether it’s a DB or the platform’s default behavior.
👉 Always have fallbacks.
👉 Always verify what data you’re actually capturing.
👉 And from a cybersecurity point of view: don’t just rate-limit blindly. Make sure it’s working accurately and intelligently.