Dipankar Sethi
From PostgreSQL to Redis: How We Reduced Authentication Latency by 98% in Production

Introduction
When building authentication systems for our startup from scratch, we made what seemed like a logical choice: store everything in PostgreSQL. After all, it's reliable, we were already using it, and it kept our architecture simple. However, as our user base grew, this decision nearly brought our authentication system to its knees.

This is the story of how storing JWT tokens and OTPs in PostgreSQL created a cascading performance problem, and how migrating to Redis transformed our authentication from a bottleneck into one of our fastest services.

The Beginning: A Simple Startup Architecture
We were building our product from scratch with a small team of just two people, using Spring Boot. For user authentication, we implemented email verification using OTPs (One-Time Passwords). The approach seemed straightforward and worked perfectly during development.

Our Initial OTP Implementation
When a user requested an OTP for email verification, login, or password reset, we stored it in PostgreSQL:

-- OTP Storage Schema
CREATE TABLE otp_codes (
    id BIGSERIAL PRIMARY KEY,
    user_email VARCHAR(255) NOT NULL,
    otp_code VARCHAR(6) NOT NULL,
    purpose VARCHAR(50) NOT NULL, -- 'EMAIL_VERIFICATION', 'LOGIN', 'PASSWORD_RESET'
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    expires_at TIMESTAMP NOT NULL,
    is_used BOOLEAN DEFAULT FALSE
);

CREATE INDEX idx_otp_lookup ON otp_codes(user_email, purpose, expires_at);

The JPA Entity for OTP:

@Entity
@Table(name = "otp_codes")
@Data
@NoArgsConstructor
@AllArgsConstructor
public class OtpCode {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(name = "user_email", nullable = false)
    private String userEmail;

    @Column(name = "otp_code", nullable = false, length = 6)
    private String otpCode;

    @Enumerated(EnumType.STRING)
    @Column(nullable = false)
    private OtpPurpose purpose;

    @Column(name = "created_at")
    private LocalDateTime createdAt;

    @Column(name = "expires_at", nullable = false)
    private LocalDateTime expiresAt;

    @Column(name = "is_used")
    private Boolean isUsed = false;
}

The Repository interface:

@Repository
public interface OtpRepository extends JpaRepository<OtpCode, Long> {

    Optional<OtpCode> findByUserEmailAndPurposeAndOtpCodeAndExpiresAtAfterAndIsUsedFalse(
        String userEmail, 
        OtpPurpose purpose, 
        String otpCode, 
        LocalDateTime currentTime
    );

    @Modifying
    @Query("DELETE FROM OtpCode o WHERE o.expiresAt < :currentTime")
    int deleteExpiredOtps(@Param("currentTime") LocalDateTime currentTime);
}

The Service for generating and storing OTPs:

@Service
@RequiredArgsConstructor
public class OtpService {

    private final OtpRepository otpRepository;
    private final EmailService emailService;

    // Generate and store OTP in PostgreSQL
    public String generateOTP(String email, OtpPurpose purpose) {
        // Generate 6-digit OTP
        String otp = String.format("%06d", new SecureRandom().nextInt(1_000_000)); // 0-999999 inclusive; SecureRandom, since OTPs must be unpredictable

        OtpCode otpCode = new OtpCode();
        otpCode.setUserEmail(email);
        otpCode.setOtpCode(otp);
        otpCode.setPurpose(purpose);
        otpCode.setCreatedAt(LocalDateTime.now());
        otpCode.setExpiresAt(LocalDateTime.now().plusMinutes(10)); // 10 minutes expiry
        otpCode.setIsUsed(false);

        // First database call - INSERT
        otpRepository.save(otpCode);

        // Send OTP via email
        emailService.sendOtpEmail(email, otp);

        return otp;
    }

    // Verify OTP
    public boolean verifyOTP(String email, OtpPurpose purpose, String inputOtp) {
        Optional<OtpCode> otpOptional = otpRepository
            .findByUserEmailAndPurposeAndOtpCodeAndExpiresAtAfterAndIsUsedFalse(
                email, purpose, inputOtp, LocalDateTime.now()
            );

        if (otpOptional.isEmpty()) {
            return false;
        }

        // Mark as used - Second database call - UPDATE
        OtpCode otp = otpOptional.get();
        otp.setIsUsed(true);
        otpRepository.save(otp);

        return true;
    }
}

At first, this approach looked good, and it worked fine during early testing with a small user base.

The Cleanup Scheduler for OTPs
Since OTPs expire after 10 minutes, we needed a way to clean up old records. We implemented a scheduled task:

@Component
@RequiredArgsConstructor
@Slf4j
public class OtpCleanupScheduler {

    private final OtpRepository otpRepository;

    // Running every 5 minutes
    @Scheduled(fixedRate = 300000) // 5 minutes = 300,000 milliseconds
    @Transactional
    public void cleanupExpiredOtps() {
        log.info("Running OTP cleanup scheduler...");

        int deletedCount = otpRepository.deleteExpiredOtps(LocalDateTime.now());

        log.info("Deleted {} expired OTPs", deletedCount);
    }
}

Scheduler Configuration:

Parameter        Value
Scheduler Name   OTP Cleanup
Frequency        Every 5 minutes
Operation        DELETE
Database Calls   1 bulk DELETE query for expired OTPs

Problem #1: The OTP Bottleneck

As traffic increased, we started noticing serious issues with our OTP system.

The Growing Data Problem
Every OTP generation created a new database row. With users requesting OTPs for:

  • Email verification (new signups)
  • Login attempts (2FA)
  • Password resets
  • Resend requests (when users didn't receive the first OTP)

Our otp_codes table exploded:

Time Period   Total OTPs Generated   Expired Records   Database Size
Week 1        11,200                 950               15 MB
Month 1       45,000                 38,000            180 MB
Month 3       380,000                325,000           1.2 GB
Month 6       1,200,000              1,050,000         3.8 GB

The Scheduler Became a Monster
Our 5-minute cleanup scheduler that initially deleted 10-20 records was now handling thousands:

[2025-10-14 10:05:00] Deleted 3,847 expired OTPs - Execution time: 4.2s
[2025-10-15 10:10:00] Deleted 4,123 expired OTPs - Execution time: 5.1s
[2025-10-16 10:15:00] Deleted 5,691 expired OTPs - Execution time: 6.8s

The DELETE query became very heavy. Each execution:

  • Scanned thousands of rows
  • Acquired table locks
  • Caused query timeouts during peak hours
  • Impacted other database operations

Unnecessary Database Calls
For every single OTP flow, we were making three database calls:

User requests OTP → INSERT into database (Call #1)
                 ↓
User verifies OTP → UPDATE database (Call #2)
                 ↓
Scheduler runs   → DELETE from database (Call #3)

With thousands of OTP requests per hour, this translated to:

  • 6,000+ INSERT queries per hour during peak
  • 4,500+ UPDATE queries per hour (75% verification rate)
  • 12 DELETE queries per hour (scheduler) each deleting thousands of rows

Problem #2: Managing JWT Logout Tokens

The second major challenge came from managing user logouts. We were using JWT tokens for authentication, which are stateless by design.
The JWT Stateless Problem
When a user logs out, their JWT doesn't automatically become invalid; it remains valid until it expires. This created a security concern:

User logs in  → JWT issued (expires in 60 minutes)
User logs out → JWT still works for remaining time 

We needed a way to blacklist tokens when users logged out.
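
The blacklist design below keys on the token's jti claim, which is extracted by a JwtUtil helper (validateToken, getJtiFromToken, getExpirationFromToken) that I don't reproduce in full in this post. For reference, here's a minimal sketch of what such a helper can look like, assuming the jjwt library and HMAC-signed tokens that carry a jti claim; the secret is a placeholder:

import io.jsonwebtoken.Claims;
import io.jsonwebtoken.JwtException;
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.security.Keys;
import org.springframework.stereotype.Component;

import java.nio.charset.StandardCharsets;
import java.security.Key;
import java.util.Date;

@Component
public class JwtUtil {

    // Placeholder secret; in production inject it from configuration (HS256 needs at least 32 bytes)
    private final Key signingKey = Keys.hmacShaKeyFor(
            "change-me-this-secret-must-be-32-bytes-long".getBytes(StandardCharsets.UTF_8));

    // True if the signature is valid and the token has not expired
    public boolean validateToken(String token) {
        try {
            parseClaims(token);
            return true;
        } catch (JwtException | IllegalArgumentException e) {
            return false;
        }
    }

    // The jti (JWT ID) claim uniquely identifies a token, which is what makes blacklisting possible
    public String getJtiFromToken(String token) {
        return parseClaims(token).getId();
    }

    public Date getExpirationFromToken(String token) {
        return parseClaims(token).getExpiration();
    }

    private Claims parseClaims(String token) {
        return Jwts.parserBuilder()
                .setSigningKey(signingKey)
                .build()
                .parseClaimsJws(token)
                .getBody();
    }
}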

Our PostgreSQL Solution for Token Blacklist
We created another table to store invalidated tokens:

-- Token Blacklist Schema
CREATE TABLE invalidated_tokens (
    id BIGSERIAL PRIMARY KEY,
    token_jti VARCHAR(255) UNIQUE NOT NULL, -- JWT ID claim
    user_id BIGINT NOT NULL,
    invalidated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    expires_at TIMESTAMP NOT NULL
);

CREATE INDEX idx_token_jti ON invalidated_tokens(token_jti);
CREATE INDEX idx_token_expires ON invalidated_tokens(expires_at);

The JPA Entity:

@Entity
@Table(name = "invalidated_tokens")
@Data
@NoArgsConstructor
@AllArgsConstructor
public class InvalidatedToken {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @Column(name = "token_jti", unique = true, nullable = false)
    private String tokenJti;

    @Column(name = "user_id", nullable = false)
    private Long userId;

    @Column(name = "invalidated_at")
    private LocalDateTime invalidatedAt;

    @Column(name = "expires_at", nullable = false)
    private LocalDateTime expiresAt;
}

The Repository:

@Repository
public interface InvalidatedTokenRepository extends JpaRepository<InvalidatedToken, Long> {

    boolean existsByTokenJtiAndExpiresAtAfter(String tokenJti, LocalDateTime currentTime);

    @Modifying
    @Query("DELETE FROM InvalidatedToken t WHERE t.expiresAt < :currentTime")
    int deleteExpiredTokens(@Param("currentTime") LocalDateTime currentTime);
}

The Token Blacklist Service:

@Service
@RequiredArgsConstructor
public class TokenBlacklistService {

    private final InvalidatedTokenRepository tokenRepository;

    // Blacklist token on logout
    public void blacklistToken(String tokenJti, Long userId, LocalDateTime expiresAt) {
        InvalidatedToken token = new InvalidatedToken();
        token.setTokenJti(tokenJti);
        token.setUserId(userId);
        token.setInvalidatedAt(LocalDateTime.now());
        token.setExpiresAt(expiresAt);

        // Database call - INSERT
        tokenRepository.save(token);
    }

    // Check if token is blacklisted (called on every authenticated request)
    public boolean isTokenBlacklisted(String tokenJti) {
        // Database call - SELECT
        return tokenRepository.existsByTokenJtiAndExpiresAtAfter(
            tokenJti, LocalDateTime.now()
        );
    }
}

JWT Authentication Filter with blacklist check:

@Component
@RequiredArgsConstructor
public class JwtAuthenticationFilter extends OncePerRequestFilter {

    private final JwtUtil jwtUtil;
    private final TokenBlacklistService blacklistService;

    @Override
    protected void doFilterInternal(HttpServletRequest request, 
                                    HttpServletResponse response, 
                                    FilterChain filterChain) 
            throws ServletException, IOException {

        String token = extractToken(request);

        if (token != null && jwtUtil.validateToken(token)) {
            String jti = jwtUtil.getJtiFromToken(token);

            // Database call on EVERY authenticated request
            if (blacklistService.isTokenBlacklisted(jti)) {
                response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
                return;
            }

            // Continue with authentication
            Authentication auth = getAuthentication(token);
            SecurityContextHolder.getContext().setAuthentication(auth);
        }

        filterChain.doFilter(request, response);
    }
}
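
Two helpers in this filter (extractToken and getAuthentication) are omitted for brevity. extractToken is the standard Bearer-header parse; a quick sketch, matching the "Bearer " handling in the logout controller later in this post:

    private String extractToken(HttpServletRequest request) {
        // Expects an "Authorization: Bearer <token>" header
        String header = request.getHeader("Authorization");
        if (header != null && header.startsWith("Bearer ")) {
            return header.substring(7);
        }
        return null;
    }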

The Token Cleanup Scheduler
Similar to OTPs, we needed to clean up expired blacklisted tokens:

@Component
@RequiredArgsConstructor
@Slf4j
public class TokenCleanupScheduler {

    private final InvalidatedTokenRepository tokenRepository;

    // Running every 60 minutes
    @Scheduled(fixedRate = 3600000) // 60 minutes = 3,600,000 milliseconds
    @Transactional
    public void cleanupExpiredTokens() {
        log.info("Running token blacklist cleanup scheduler...");

        int deletedCount = tokenRepository.deleteExpiredTokens(LocalDateTime.now());

        log.info("Deleted {} expired tokens", deletedCount);
    }
}

Updated Scheduler Summary:

Scheduler Name   Frequency          Operation   Database Calls
OTP Cleanup      Every 5 minutes    DELETE      1 bulk DELETE query
Token Cleanup    Every 60 minutes   DELETE      1 bulk DELETE query

The Problems with Token Blacklisting
At first this looked fine, but as the user base grew, critical issues emerged:

1. Unnecessary Data Retention:

User logs out at 10:01 AM → Token stored in database
Token expires at 11:00 AM → Token sits idle for 59 minutes
Cleanup runs at 11:00 AM → Token finally deleted

A large number of dead tokens sat in the database doing nothing but consuming space.

2. Database Call on Every Request:
The biggest problem was checking the blacklist on every authenticated API request:

GET /api/user/profile     → Check blacklist (DB call)
GET /api/posts            → Check blacklist (DB call)
POST /api/comment         → Check blacklist (DB call)
GET /api/notifications    → Check blacklist (DB call)

With 10,000 authenticated requests per minute:

  • 10,000 SELECT queries per minute to check blacklist
  • Added 50-100ms latency to every request
  • Database connection pool frequently exhausted
  • Queries queued during peak traffic

3. Growing Table Size:

Time Period   Logged Out Users   Blacklisted Tokens   Table Size
Week 1        450                450                  8 MB
Month 1       18,000             18,000               95 MB
Month 3       125,000            125,000              680 MB
Month 6       480,000            480,000              2.1 GB

4. Scheduler Overhead:

[2025-10-20 11:00:00] Deleted 8,234 expired tokens - Execution time: 7.3s
[2025-10-21 12:00:00] Deleted 9,127 expired tokens - Execution time: 8.9s
[2025-10-21 13:00:00] Deleted 11,456 expired tokens - Execution time: 12.1s

The Complete Problem Summary

  1. Unnecessary database calls for OTPs:

    • INSERT on generation
    • UPDATE on verification
    • DELETE every 5 minutes (bulk)

  2. Unnecessary database calls for the token blacklist:

    • INSERT on logout
    • SELECT on every authenticated request
    • DELETE every 60 minutes (bulk)

The Impact:

  • Database connection pool exhaustion (18-20 of 20 connections constantly in use)
  • Increased infrastructure costs (larger database instances needed)
  • Slow response times (an extra 100-150 ms on authentication flows)
  • Schedulers consuming resources and locking tables
  • Growing storage costs for temporary data

The Realization: We Needed a Better Solution
After monitoring our production metrics, we realized:

  1. OTPs are temporary by nature (10-minute lifespan)
  2. Blacklisted tokens are temporary (lifespan = remaining token TTL)
  3. These are simple key-value lookups (no complex queries or joins needed)
  4. High read frequency (especially token blacklist checks)
  5. Auto-expiry would eliminate schedulers (no manual cleanup needed)

We needed a data store optimized for:

  1. Temporary data with automatic expiration (TTL)
  2. Fast read/write operations (microseconds, not milliseconds)
  3. Simple key-value storage
  4. No manual cleanup overhead

My friend suggested Redis. Previously, I had only heard of Redis being used for caching, so this was new territory for me. But the more we researched, the more Redis seemed like the perfect solution for our authentication challenges.

The Solution: Migrating to Redis
After analyzing our requirements, Redis checked all the boxes. It's an in-memory data store designed for exactly this use case—temporary data that needs to be accessed quickly and expire automatically.
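
Before wiring Redis into Spring, here's the core idea in a few lines of raw Jedis (the host, port, key, and values are illustrative):

import redis.clients.jedis.Jedis;

public class RedisTtlDemo {
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            // Write an OTP and its 600-second TTL in a single command
            jedis.setex("otp:user@example.com:login", 600, "482913");

            // The TTL counts down on its own; this prints a value close to 600
            System.out.println(jedis.ttl("otp:user@example.com:login"));

            // After 10 minutes Redis removes the key itself: no scheduler, no DELETE query
        }
    }
}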

Setting Up Redis with Spring Boot

<!-- pom.xml -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
</dependency>

Redis Configuration:

@Configuration
@EnableCaching
public class RedisConfig {

    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        RedisStandaloneConfiguration config = new RedisStandaloneConfiguration();
        config.setHostName("localhost");
        config.setPort(6379);
        return new JedisConnectionFactory(config);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(redisConnectionFactory());
        template.setKeySerializer(new StringRedisSerializer());
        template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }

    @Bean
    public StringRedisTemplate stringRedisTemplate() {
        return new StringRedisTemplate(redisConnectionFactory());
    }
}

OTP Implementation with Redis

@Service
@RequiredArgsConstructor
@Slf4j
public class RedisOtpService {

    private final StringRedisTemplate redisTemplate;
    private final EmailService emailService;

    private static final long OTP_EXPIRY_MINUTES = 10;
    private static final int MAX_ATTEMPTS = 3;

    // Generate and store OTP in Redis
    public String generateOTP(String email, OtpPurpose purpose) {
        String otp = String.format("%06d", new SecureRandom().nextInt(1_000_000)); // 0-999999 inclusive; SecureRandom, since OTPs must be unpredictable
        String redisKey = buildOtpKey(email, purpose);

        // Store OTP with metadata as hash
        Map<String, String> otpData = new HashMap<>();
        otpData.put("code", otp);
        otpData.put("attempts", "0");
        otpData.put("createdAt", String.valueOf(System.currentTimeMillis()));

        redisTemplate.opsForHash().putAll(redisKey, otpData);

        // Set expiry - Auto-deletion after 10 minutes (No scheduler needed!)
        redisTemplate.expire(redisKey, OTP_EXPIRY_MINUTES, TimeUnit.MINUTES);

        // Send OTP via email
        emailService.sendOtpEmail(email, otp);

        log.info("OTP generated for {}: {}", email, redisKey);
        return otp;
    }

    // Verify OTP
    public boolean verifyOTP(String email, OtpPurpose purpose, String inputOtp) {
        String redisKey = buildOtpKey(email, purpose);

        // Check if OTP exists (will be null if expired)
        String storedOtp = (String) redisTemplate.opsForHash().get(redisKey, "code");

        if (storedOtp == null) {
            log.warn("OTP not found or expired for {}", email);
            return false;
        }

        // Increment attempts
        Long attempts = redisTemplate.opsForHash().increment(redisKey, "attempts", 1);

        if (attempts > MAX_ATTEMPTS) {
            // Delete OTP after max attempts
            redisTemplate.delete(redisKey);
            log.warn("Max attempts exceeded for {}", email);
            return false;
        }

        // Verify OTP
        if (storedOtp.equals(inputOtp)) {
            // Delete OTP immediately after successful verification
            redisTemplate.delete(redisKey);
            log.info("OTP verified successfully for {}", email);
            return true;
        }

        log.warn("Invalid OTP for {}", email);
        return false;
    }

    private String buildOtpKey(String email, OtpPurpose purpose) {
        return String.format("otp:%s:%s", email, purpose.name().toLowerCase());
    }
}
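
One caveat: in verifyOTP, the HGET and the HINCRBY are separate round trips, so two concurrent attempts against the same OTP can race. If that matters at your scale, the whole check can run atomically as a Lua script. A sketch (the method and script are mine, not from our production code):

    // (imports: java.util.Collections, org.springframework.data.redis.core.script.DefaultRedisScript)
    // Returns 1 on a correct code (key deleted), 0 on a wrong code,
    // -1 when the OTP is missing/expired or the attempt limit is hit (key deleted)
    private static final DefaultRedisScript<Long> VERIFY_SCRIPT = new DefaultRedisScript<>(
            "local code = redis.call('HGET', KEYS[1], 'code') " +
            "if not code then return -1 end " +
            "local attempts = redis.call('HINCRBY', KEYS[1], 'attempts', 1) " +
            "if attempts > tonumber(ARGV[2]) then redis.call('DEL', KEYS[1]) return -1 end " +
            "if code == ARGV[1] then redis.call('DEL', KEYS[1]) return 1 end " +
            "return 0",
            Long.class);

    public boolean verifyOtpAtomically(String email, OtpPurpose purpose, String inputOtp) {
        Long result = redisTemplate.execute(
                VERIFY_SCRIPT,
                Collections.singletonList(buildOtpKey(email, purpose)),
                inputOtp, String.valueOf(MAX_ATTEMPTS));
        return result != null && result == 1L;
    }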

Key Benefits of Redis for OTPs:

  1. No INSERT query - Directly set key-value
  2. No UPDATE query - Atomic operations
  3. No DELETE query - Auto-expires with TTL
  4. No scheduler needed - Redis handles cleanup automatically

JWT Token Blacklist with Redis

@Service
@RequiredArgsConstructor
@Slf4j
public class RedisTokenBlacklistService {

    private final StringRedisTemplate redisTemplate;
    private final JwtUtil jwtUtil;

    // Blacklist token on logout
    public void blacklistToken(String token) {
        try {
            String jti = jwtUtil.getJtiFromToken(token);
            Date expiration = jwtUtil.getExpirationFromToken(token);

            // Calculate remaining TTL
            long ttlSeconds = (expiration.getTime() - System.currentTimeMillis()) / 1000;

            if (ttlSeconds > 0) {
                String redisKey = buildTokenKey(jti);

                // Store with exact remaining TTL
                redisTemplate.opsForValue().set(redisKey, "blacklisted", ttlSeconds, TimeUnit.SECONDS);

                log.info("Token blacklisted: {} with TTL: {}s", jti, ttlSeconds);
            }
        } catch (Exception e) {
            log.error("Error blacklisting token", e);
            throw new RuntimeException("Failed to blacklist token");
        }
    }

    // Check if token is blacklisted (called on every request)
    public boolean isTokenBlacklisted(String token) {
        try {
            String jti = jwtUtil.getJtiFromToken(token);
            String redisKey = buildTokenKey(jti);

            // Single Redis check - incredibly fast!
            Boolean exists = redisTemplate.hasKey(redisKey);

            return Boolean.TRUE.equals(exists);
        } catch (Exception e) {
            log.error("Error checking token blacklist", e);
            return false;
        }
    }

    private String buildTokenKey(String jti) {
        return String.format("blacklist:%s", jti);
    }
}

Updated Authentication Filter:

@Component
@RequiredArgsConstructor
public class JwtAuthenticationFilter extends OncePerRequestFilter {

    private final JwtUtil jwtUtil;
    private final RedisTokenBlacklistService redisBlacklistService; // Changed to Redis

    @Override
    protected void doFilterInternal(HttpServletRequest request, 
                                    HttpServletResponse response, 
                                    FilterChain filterChain) 
            throws ServletException, IOException {

        String token = extractToken(request);

        if (token != null && jwtUtil.validateToken(token)) {
            // Redis check - 1-2ms instead of 50-100ms!
            if (redisBlacklistService.isTokenBlacklisted(token)) {
                response.setStatus(HttpServletResponse.SC_UNAUTHORIZED);
                return;
            }

            Authentication auth = getAuthentication(token);
            SecurityContextHolder.getContext().setAuthentication(auth);
        }

        filterChain.doFilter(request, response);
    }
}

Logout Controller:

@RestController
@RequestMapping("/api/auth")
@RequiredArgsConstructor
public class AuthController {

    private final RedisTokenBlacklistService tokenBlacklistService;

    @PostMapping("/logout")
    public ResponseEntity<?> logout(@RequestHeader("Authorization") String authHeader) {
        String token = authHeader.substring(7); // Remove "Bearer " prefix

        // Blacklist token in Redis
        tokenBlacklistService.blacklistToken(token);

        return ResponseEntity.ok(Map.of("message", "Logged out successfully"));
    }
}

Key Benefits of Redis for Token Blacklist:

  • 1-2ms lookup time (vs 50-100ms with PostgreSQL)
  • Auto-expiry - Token removed exactly when it expires naturally
  • No scheduler needed - Redis manages cleanup
  • No storage overhead - Memory freed automatically

Scheduler Comparison
Before Redis (PostgreSQL):

Scheduler Name   Frequency          Purpose                 Impact
OTP Cleanup      Every 5 minutes    Delete expired OTPs     Heavy DELETEs, table locks
Token Cleanup    Every 60 minutes   Delete expired tokens   Heavy DELETEs, table locks

After Redis:

Scheduler Name   Frequency   Purpose                        Impact
NONE             N/A         Redis TTL handles everything   Zero overhead

The Results: Production Performance Improvements
After migrating to Redis, the improvements were dramatic and immediate.
Response Time Improvements

Operation               PostgreSQL Latency   Redis Latency   Improvement
OTP Generation          80–120 ms            2–3 ms          97% faster
OTP Verification        150–200 ms           2–4 ms          98% faster
Token Blacklist Check   50–100 ms            1–2 ms          98% faster
Logout Operation        60–90 ms             2–3 ms          97% faster

Storage and Cost Savings

  • Connection pool utilization dropped from 90–100% to 40–60%, resulting in a 50% reduction in database load.
  • Eliminated 612,000+ database queries per hour during peak traffic (the 10,000 blacklist checks per minute alone accounted for 600,000 of those).
  • Freed 6.2 GB of PostgreSQL storage while Redis used only 145 MB of in-memory data.
  • Simplified infrastructure by removing schedulers and cleanup jobs, and reducing backup and maintenance overhead.
  • Improved API performance drastically: OTP operations fell from 150–180 ms to <10 ms, and authenticated request overhead dropped to 1–2 ms.

Key Lessons Learned

1. Not All Data Belongs in Your Primary Database
PostgreSQL is excellent for persistent, relational data. But temporary, high-frequency data like OTPs and session tokens are better served by specialized tools like Redis.

2. Automatic Expiry > Manual Cleanup
Redis TTL eliminated our schedulers entirely. No more:

  • Cron jobs to maintain
  • Bulk DELETE operations
  • Table locks during cleanup
  • Monitoring cleanup job failures

3. In-Memory Speed Matters for Authentication
Authentication happens on nearly every request. Reducing latency from 100ms to 2ms means:

  • Better user experience
  • Lower infrastructure costs
  • Higher throughput capacity

4. Choose the Right Tool for the Job
We were using PostgreSQL as a:

  • Temporary data store (wrong tool)
  • High-frequency cache (wrong tool)
  • Persistent data store (right tool)

Redis excels at:

  • Temporary data with TTL
  • High-frequency reads/writes
  • Simple key-value operations

Conclusion
Migrating OTPs and JWT token blacklists from PostgreSQL to Redis reduced our authentication latency by 98%, cut database load by 50%, and eliminated over 600K queries per hour, while also simplifying infrastructure and lowering costs. The takeaway is clear: use PostgreSQL for persistent data and Redis for fast, temporary data; don't force one tool to do everything.

Have you faced similar authentication challenges? Share your experiences in the comments below!
