# Building a URL Shortener with Go and Redis
Ever wondered how services like bit.ly or TinyURL work? I recently built my own URL shortener as part of the Coding Challenges series, and I'm excited to share what I learned!
## The Challenge
Build a service that:
- Takes long URLs and generates short, memorable codes
- Redirects users from short URLs to original destinations
- Handles millions of URLs efficiently
- Deals with hash collisions gracefully
My solution: A Go-based service with Redis for blazing-fast lookups, all containerized with Docker.
Full source code is available on GitHub.
## Architecture Overview
The system has three main components:
User Request → Go API Server → Redis Database
**Why Go?**

- Fast compilation and execution
- Built-in concurrency
- Excellent standard library for HTTP

**Why Redis?**

- In-memory storage = microsecond lookups
- Simple key-value model perfect for this use case
- Persistence options for data durability
## The Core: Hash Generation
The heart of any URL shortener is converting long URLs into short codes. Here's my approach:
```go
// ShortHash produces a 64-bit xxHash of the URL (e.g. via the
// github.com/cespare/xxhash/v2 package) and encodes it in base62.
func ShortHash(url string) string {
	hash := xxhash.Sum64String(url)
	return EncodeBase62(hash)
}

func EncodeBase62(num uint64) string {
	const base62Chars = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
	if num == 0 {
		return "0"
	}
	encoded := ""
	base := uint64(len(base62Chars))
	for num > 0 {
		remainder := num % base
		encoded = string(base62Chars[remainder]) + encoded
		num = num / base
	}
	return encoded
}
```
**Why xxHash?**

- Extremely fast (GB/s throughput)
- Good distribution (fewer collisions)
- Non-cryptographic (we don't need security here)

**Why Base62?**

- Uses `0-9`, `A-Z`, `a-z` (URL-safe)
- Shorter than Base16 or Base10
- More readable than Base64
Result: `https://example.com/page` → `2Hs4pQx7` (8 characters)
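As a quick sanity check, the encoder can be exercised on its own. This sketch re-declares `EncodeBase62` so it runs standalone, and uses the standard library's FNV-1a hash as a stand-in for xxhash so the example needs no third-party dependency:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

const base62Chars = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

// EncodeBase62 converts a uint64 into a base62 string (same logic as above).
func EncodeBase62(num uint64) string {
	if num == 0 {
		return "0"
	}
	encoded := ""
	base := uint64(len(base62Chars))
	for num > 0 {
		encoded = string(base62Chars[num%base]) + encoded
		num /= base
	}
	return encoded
}

func main() {
	// FNV-1a stands in for xxhash purely to keep this snippet dependency-free;
	// the pipeline is the same: hash the URL, then base62-encode the result.
	h := fnv.New64a()
	h.Write([]byte("https://example.com/page"))
	fmt.Println(EncodeBase62(h.Sum64())) // a short, URL-safe code
}
```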
## Handling Collisions
Hash collisions are inevitable. Two different URLs might generate the same short code. Here's how I handle it:
```go
func (a *API) ShortenHandler(w http.ResponseWriter, r *http.Request) {
	// Bound the Redis calls so a slow dependency can't hang the handler.
	ctx, cancel := context.WithTimeout(r.Context(), 2*time.Second)
	defer cancel()

	var payload struct {
		URL string `json:"url"`
	}
	if err := json.NewDecoder(r.Body).Decode(&payload); err != nil {
		http.Error(w, "invalid request body", http.StatusBadRequest)
		return
	}

	normalized := Normalize(payload.URL)
	hash := ShortHash(normalized)
	finalHash := hash
	const maxRetries = 10

	for attempts := 0; attempts < maxRetries; attempts++ {
		existing, err := a.Redis.Get(ctx, finalHash).Result()
		if err == redis.Nil {
			// Key doesn't exist - we can use it!
			a.Redis.Set(ctx, finalHash, normalized, 0)
			writeShortURL(w, finalHash, normalized) // helper (not shown) writes the JSON response
			return
		}
		if err != nil {
			http.Error(w, "storage error", http.StatusInternalServerError)
			return
		}
		if existing == normalized {
			// Key already maps to the same URL - return the existing short code.
			writeShortURL(w, finalHash, normalized)
			return
		}
		// Collision! A different URL owns this code - add a random suffix and retry.
		finalHash = hash + generateRandomSuffix(attempts + 1)
	}
	http.Error(w, "could not allocate a unique short code", http.StatusInternalServerError)
}
```
Strategy:
- Check if hash exists
- If it's the same URL, return existing short code
- If different URL (collision), append random suffix
- Retry up to 10 times
- Fail gracefully if all retries exhausted
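The handler references a `generateRandomSuffix` helper that the post doesn't show. A minimal sketch of what it might look like, drawing suffix characters from the same base62 alphabet (the modulo step introduces a slight bias, which is harmless here since the suffix only needs to break ties, not be cryptographically uniform):

```go
package main

import (
	"crypto/rand"
	"fmt"
)

const base62Chars = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

// generateRandomSuffix returns n random base62 characters; passing a larger n
// on later attempts shrinks the chance of colliding again.
func generateRandomSuffix(n int) string {
	buf := make([]byte, n)
	if _, err := rand.Read(buf); err != nil {
		panic(err) // crypto/rand failing is unrecoverable here
	}
	for i, b := range buf {
		// Map each random byte onto the base62 alphabet.
		buf[i] = base62Chars[int(b)%len(base62Chars)]
	}
	return string(buf)
}

func main() {
	fmt.Println(generateRandomSuffix(3)) // e.g. a 3-character suffix
}
```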
## URL Normalization
To prevent the same URL from getting multiple short codes, I normalize URLs:
```go
func Normalize(raw string) string {
	u, err := url.Parse(raw)
	if err != nil {
		return raw
	}
	u.Host = strings.ToLower(u.Host)
	return u.String()
}
```
This ensures:

- `EXAMPLE.COM` → `example.com`
- `Example.com` → `example.com`

All variations of the same domain get the same short code!
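The effect is easy to verify end to end. Here is the same `Normalize` function in a self-contained, runnable form; note that only the host is lowercased, so path casing (which is significant on most servers) is preserved:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// Normalize lowercases the host so casing variants of a domain map to one key.
func Normalize(raw string) string {
	u, err := url.Parse(raw)
	if err != nil {
		return raw
	}
	u.Host = strings.ToLower(u.Host)
	return u.String()
}

func main() {
	fmt.Println(Normalize("https://EXAMPLE.COM/Page")) // prints https://example.com/Page
}
```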
## Docker Setup
The entire stack runs in Docker with a single command:
```yaml
# docker-compose.yml
version: '3.8'

services:
  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]

  app:
    build: .
    ports:
      - "8080:8080"
    depends_on:
      redis:
        condition: service_healthy
    environment:
      - REDIS_HOST=redis
      - SHORT_URL_BASE=http://localhost:8080

# Named volumes must be declared at the top level for the
# redis_data mount above to work.
volumes:
  redis_data:
```
Key features:
- Health checks ensure Redis is ready before app starts
- Persistent volumes keep data safe
- Environment variables for easy configuration
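The `build: .` line implies a Dockerfile that the post doesn't show. A plausible multi-stage build producing a small static image might look like this (the Go version, binary name `shortener`, and base images are assumptions, not taken from the repo):

```dockerfile
# Hypothetical Dockerfile for the `build: .` step above
FROM golang:1.22-alpine AS build
WORKDIR /src
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN CGO_ENABLED=0 go build -o /bin/shortener .

FROM alpine:3.19
COPY --from=build /bin/shortener /bin/shortener
EXPOSE 8080
ENTRYPOINT ["/bin/shortener"]
```

The multi-stage split keeps the Go toolchain out of the final image, so the runtime container is only a few megabytes plus the binary.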
## Performance Considerations
Current performance:
- Hash generation: ~0.1ms
- Redis lookup: <1ms
- Total response time: ~2ms
Scalability improvements I'd add:
- Rate limiting to prevent abuse
- Analytics tracking (click counts, referrers)
- Custom short codes for vanity URLs
- Expiration times for temporary links
- Caching layer for hot URLs
## Testing It Out
```shell
# Start the service
docker-compose up --build

# Shorten a URL
curl -X POST http://localhost:8080/api/v1/shorten \
  -H "Content-Type: application/json" \
  -d '{"url":"https://github.com/Kryko7/URLShortner"}'
```

The response:

```json
{
  "key": "xY9pQ2",
  "long_url": "https://github.com/Kryko7/URLShortner",
  "short_url": "http://localhost:8080/xY9pQ2"
}
```

```shell
# Use it - this follows the redirect to GitHub
curl -L http://localhost:8080/xY9pQ2
```
## What I Learned
- xxHash is incredibly fast - Perfect for non-cryptographic hashing
- Redis is simple but powerful - Key-value stores are underrated
- Collision handling matters - Even with good hash functions
- Docker makes deployment trivial - One command to run everything
- Context timeouts prevent hangs - Always set timeouts for external calls
## Try It Yourself
The full project is open source and ready to run:
GitHub: https://github.com/Kryko7/URLShortner
Challenge: Coding Challenges - URL Shortener
Clone it, run `docker-compose up`, and you'll have a working URL shortener in 30 seconds!
## Future Improvements
Some ideas I'm considering:
- Analytics dashboard - Track clicks, geographic data
- Custom aliases - Let users choose their short codes
- QR code generation - Generate QR codes for short URLs
- API authentication - Protect the service from abuse
- Link expiration - Auto-delete old URLs
- Batch API - Shorten multiple URLs at once
## What Would You Add?
Have ideas for improvements? Found a bug? Want to discuss the architecture?
Drop a comment below or open an issue on GitHub! I'd love to hear your thoughts.
Happy coding!

If you found this helpful, give the GitHub repo a star and follow me for more coding challenges!