🧠 Java Memory Optimization with Guava Cache: A Real-World Guide
Java is a powerful language, but if you're not careful, it can also be a memory hog. In this post, I'll walk you through a real-world scenario where a Java service was running into memory issues, and how we solved it using Guava's Cache.
TL;DR: Use bounded caches, close your streams, and profile before you optimize.
🚨 The Problem: OutOfMemoryError in Production
We had a Spring Boot service that:
- Accepted PDF uploads
- Parsed file content
- Cached parsed data for reuse
Under heavy load, the app began throwing this:
java.lang.OutOfMemoryError: Java heap space
Let's dive into what went wrong and how we fixed it.
🔬 Step 1: Profile Before You Panic
We used tools like:
- VisualVM
- Eclipse Memory Analyzer (MAT)
These revealed:
- Many unclosed FileInputStreams
- A massive HashMap<String, byte[]>
- Large byte[] objects retained indefinitely
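As a side note, you can also capture a heap dump from inside the running service and open it in MAT or VisualVM. Here's a minimal sketch using the HotSpot diagnostic MXBean; the HeapDumper class name and output path are just placeholders, not part of the original service:

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.io.IOException;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    // Writes an .hprof file you can open in Eclipse MAT or VisualVM.
    // live = true dumps only reachable objects, keeping the file smaller.
    public static void dumpHeap(String outputPath, boolean live) throws IOException {
        HotSpotDiagnosticMXBean diagnosticBean =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        diagnosticBean.dumpHeap(outputPath, live);
    }

    public static void main(String[] args) throws IOException {
        dumpHeap("/tmp/service-heap.hprof", true);
    }
}
```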
🐛 The Code That Broke Things
```java
Map<String, byte[]> documentCache = new HashMap<>();

public byte[] processFile(String fileName) throws IOException {
    if (documentCache.containsKey(fileName)) {
        return documentCache.get(fileName);
    }
    File file = new File("/docs/" + fileName);
    FileInputStream fis = new FileInputStream(file); // 🚨 Leaking stream
    byte[] data = fis.readAllBytes();                // 🚨 No close()
    documentCache.put(fileName, data);               // 🚨 Unbounded map
    return data;
}
```
What's wrong?
- No stream cleanup
- No eviction policy in the cache
- Large byte arrays living forever
✅ The Solution: Guava Cache + Stream Safety
🧪 Step 1: Close Your Streams!
```java
try (FileInputStream fis = new FileInputStream(file)) {
    byte[] data = fis.readAllBytes();
    // ...
}
```
Using try-with-resources guarantees the stream is closed even if an exception is thrown, which avoids file descriptor leaks.
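If you're on a recent JDK (11+), an even simpler option is to let java.nio.file.Files handle the stream for you. This isn't the fix we shipped, just an alternative sketch:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileBytes {
    // Files.readAllBytes opens and closes the underlying stream internally,
    // so there is no FileInputStream left around to leak.
    static byte[] readDocument(String fileName) throws IOException {
        return Files.readAllBytes(Path.of("/docs/" + fileName));
    }
}
```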
🧪 Step 2: Add a Bounded Cache
```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.util.concurrent.TimeUnit;

Cache<String, byte[]> documentCache = CacheBuilder.newBuilder()
        .maximumSize(100)                       // Max 100 files cached
        .expireAfterWrite(10, TimeUnit.MINUTES) // Evict after 10 mins
        .build();
```
This creates a smart cache:
- 🔁 Automatically evicts stale and excess entries
- 🧠 Prevents memory bloat
- ⚡️ Still fast (in-memory)
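If you want to confirm the cache is actually earning its keep, Guava can also record hit/miss statistics. A small sketch; the recordStats() call and sample keys are illustrative, not from our production code:

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheStats;
import java.util.concurrent.TimeUnit;

public class CacheStatsDemo {
    public static void main(String[] args) {
        // Same bounded cache as above, with statistics recording enabled.
        Cache<String, byte[]> documentCache = CacheBuilder.newBuilder()
                .maximumSize(100)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .recordStats() // track hits, misses, and evictions
                .build();

        documentCache.put("report.pdf", new byte[]{1, 2, 3});
        documentCache.getIfPresent("report.pdf");  // hit
        documentCache.getIfPresent("missing.pdf"); // miss

        CacheStats stats = documentCache.stats();
        System.out.printf("hit rate: %.2f, evictions: %d%n",
                stats.hitRate(), stats.evictionCount());
    }
}
```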
🧪 Step 3: Use It Safely
```java
public byte[] processFile(String fileName) throws IOException {
    byte[] cached = documentCache.getIfPresent(fileName);
    if (cached != null) return cached;

    File file = new File("/docs/" + fileName);
    try (FileInputStream fis = new FileInputStream(file)) {
        byte[] data = fis.readAllBytes();
        documentCache.put(fileName, data);
        return data;
    }
}
```
Now your code:
- ✅ Frees up memory
- ✅ Avoids leaks
- ✅ Stays performant
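One further refinement we could have made (a sketch, not what the original fix used): Cache.get(key, loader) folds the lookup and the load into a single atomic call, so concurrent requests for the same file won't all hit the disk:

```java
import com.google.common.cache.Cache;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.ExecutionException;

public class DocumentService {
    private final Cache<String, byte[]> documentCache;

    public DocumentService(Cache<String, byte[]> documentCache) {
        this.documentCache = documentCache;
    }

    // get(key, loader) computes the value at most once per key, even when
    // several threads request the same file at the same time.
    public byte[] processFile(String fileName) throws IOException {
        try {
            return documentCache.get(fileName,
                    () -> Files.readAllBytes(Path.of("/docs/" + fileName)));
        } catch (ExecutionException e) {
            Throwable cause = e.getCause();
            if (cause instanceof IOException) {
                throw (IOException) cause;
            }
            throw new IllegalStateException(cause);
        }
    }
}
```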
🔧 JVM Tuning (Bonus)
We also added these JVM flags in Docker:
```
-Xms512m -Xmx1024m -XX:+UseG1GC -XX:+ExitOnOutOfMemoryError
```
These flags cap the heap, switch to the G1 collector, and make the JVM exit immediately on OOM so the container platform can restart the service cleanly.
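To double-check that those flags actually reach the JVM inside the container, here's a quick sketch using standard management APIs (the class name is just for illustration):

```java
import java.lang.management.ManagementFactory;

public class JvmSettingsCheck {
    public static void main(String[] args) {
        // Print the arguments the JVM actually started with...
        System.out.println("JVM args: "
                + ManagementFactory.getRuntimeMXBean().getInputArguments());
        // ...and the effective heap ceiling (roughly -Xmx).
        System.out.printf("Max heap: %d MB%n",
                Runtime.getRuntime().maxMemory() / (1024 * 1024));
    }
}
```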
📊 Results
| Metric | Before | After |
|---|---|---|
| Memory Usage | ~1.4GB | ~850MB |
| Crashes/Day | 3–4 | 0 |
| GC Pause Time | High | Low |
🧠 Key Takeaways
- Profile first, don't blindly optimize.
- Always close your I/O resources.
- Use Guava's Cache to avoid writing your own LRU logic.
- Tune your JVM based on the workload.
💬 What's Your Java Memory War Story?
Have you ever battled memory leaks or slow GCs? Share your experience in the comments!
If you found this useful, follow me for more real-world Java, Spring Boot, and performance engineering tips!