Developing production-quality video recording in Android is deceptively tricky. You might get your app to work perfectly on a Pixel device, only to see green/purple stripes, crashes, or corrupted files on MediaTek or Exynos phones. The root cause? Subtle MediaCodec and YUV layout inconsistencies across OEMs.
Here’s what I learned after testing dozens of devices and fixing real-world production failures.
The Problem
Many developers take a naive approach: convert a Bitmap to YUV manually and push it to the encoder. It seems simple — but in production, it’s a minefield:
- Stride assumptions differ per device (MediaTek often pads by 16–64 bytes)
- Chroma plane ordering varies (Planar vs SemiPlanar)
- Surface locking can race, causing IllegalArgumentException
- Encoder output threads can block indefinitely, hanging your app
The result? Crashes, corrupted files, and a video pipeline that’s unreliable across devices.
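To make the stride problem concrete, here is a minimal pure-Kotlin sketch (the function name and values are illustrative, not from any Android API) of copying a tightly packed Y plane into the stride-padded layout some encoders expect:

```kotlin
// Copy a tightly packed Y plane into a stride-padded buffer.
// Some encoders (notably on MediaTek) report a stride aligned to
// 16/32/64 bytes, so each destination row is `stride` bytes long
// even though only the first `width` bytes carry pixel data.
fun packYPlane(src: ByteArray, width: Int, height: Int, stride: Int): ByteArray {
    require(stride >= width) { "stride must be at least width" }
    val dst = ByteArray(stride * height)
    for (row in 0 until height) {
        // The trailing `stride - width` bytes of each row are padding
        // the encoder skips over; writing pixels there corrupts colors.
        System.arraycopy(src, row * width, dst, row * stride, width)
    }
    return dst
}
```

Code that assumes `stride == width` effectively skips this re-packing step, which is exactly why it works on one chipset and tints or shears the image on another.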
Why Naive Approaches Fail
Manual YUV Conversion
fun bitmapToYuv420(bitmap: Bitmap, width: Int, height: Int): ByteArray {
    val yuv = ByteArray(width * height * 3 / 2)
    // Naive RGB → YUV conversion without stride handling
    return yuv
}
What goes wrong:
- Assumes stride == width → fails on MediaTek
- Ignores planar/interleaved layout → color corruption
- Misses hardware alignment → encoder rejects frames
Even if it works on your test phone, it’s likely to fail on other devices.
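To see why plane ordering matters, here is a small pure-Kotlin sketch (hypothetical helper names, not an Android API) of where the chroma bytes land under the two common YUV420 layouts:

```kotlin
// For YUV420, the Y plane occupies stride * sliceHeight bytes.
// What follows differs by layout:
//   Planar (I420/YV12):     a full U plane, then a full V plane
//   SemiPlanar (NV12/NV21): one interleaved UV plane
// A converter hardcoded for one layout writes chroma into the wrong
// offsets on devices using the other — the classic green/purple tint.
fun uOffset(stride: Int, sliceHeight: Int): Int =
    stride * sliceHeight

// Planar: V starts after the full U plane (quarter the Y plane's size).
fun vOffsetPlanar(stride: Int, sliceHeight: Int): Int =
    stride * sliceHeight + (stride / 2) * (sliceHeight / 2)

// SemiPlanar: U and V alternate byte-by-byte starting at uOffset,
// so the first V byte sits one position after the first U byte.
fun vOffsetSemiPlanar(stride: Int, sliceHeight: Int): Int =
    stride * sliceHeight + 1
```

Note that both offsets depend on the stride and slice height the device reports, not on the nominal width and height, which compounds the two failure modes above.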
Naive Surface Handling
fun recordFrame(bitmap: Bitmap) {
    val canvas = surface.lockCanvas(null)
    canvas.drawBitmap(bitmap, ...)
    surface.unlockCanvasAndPost(canvas)
}
Issues:
- No concurrency control → crashes if frame arrives while previous is drawn
- Blocking main thread → dropped frames or camera freezes
- No error recovery → entire recording fails
The Production Solution: Surface-Based Encoding
The only vendor-agnostic, production-ready approach is Surface-based encoding (COLOR_FormatSurface). The hardware handles stride, color conversion, and alignment internally.
Key Patterns:
- Frame dropping: Skip a frame if encoder is busy to prevent Surface lock conflicts
- Short timeout: Use ~100ms (passed as 100,000µs — dequeueOutputBuffer() takes microseconds) for responsive shutdown
- Resource cleanup order: MediaMuxer → MediaCodec → Surface
- Thread safety: Use a dedicated encoder thread for all operations
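The dedicated-thread and frame-dropping patterns above can be sketched together with plain JVM concurrency primitives (the `EncoderThread` class and its names are illustrative, not from the original pipeline):

```kotlin
import java.util.concurrent.ArrayBlockingQueue
import java.util.concurrent.ThreadPoolExecutor
import java.util.concurrent.TimeUnit

// A single dedicated encoder thread with a bounded queue of 1:
// if a new frame arrives while the previous one is still encoding,
// DiscardPolicy silently drops it instead of blocking the caller —
// the "slightly choppy beats crashing" trade-off.
class EncoderThread(private val encodeFrame: (ByteArray) -> Unit) {
    private val executor = ThreadPoolExecutor(
        1, 1, 0L, TimeUnit.MILLISECONDS,
        ArrayBlockingQueue(1),
        ThreadPoolExecutor.DiscardPolicy() // drop frames when busy
    )

    fun submit(frame: ByteArray) {
        executor.execute { encodeFrame(frame) }
    }

    fun shutdown() {
        executor.shutdown()
        executor.awaitTermination(1, TimeUnit.SECONDS)
    }
}
```

Because all encoding runs on one thread, Surface lock/unlock calls can never race each other, and the bounded queue gives you back-pressure for free.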
val isEncodingFrame = AtomicBoolean(false)

fun recordFrame(bitmap: Bitmap) {
    // Drop this frame if the previous one is still being drawn
    if (!isEncodingFrame.compareAndSet(false, true)) return
    try {
        val canvas = encoderSurface.lockCanvas(null)
        try {
            canvas.drawBitmap(bitmap, null, dstRect, paint)
        } finally {
            // Always unlock, even if drawing throws, or the Surface stays locked
            encoderSurface.unlockCanvasAndPost(canvas)
        }
    } finally {
        isEncodingFrame.set(false)
    }
}
This approach works across Qualcomm, MediaTek, and Exynos devices on Android 10–15, and holds up under long recordings and high frame rates.
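On the output side, the short-timeout pattern looks roughly like this. This is a hedged sketch, not the original pipeline's code: `EncoderDrain` and its fields are illustrative names, and real code also needs EOS signaling and error handling.

```kotlin
import android.media.MediaCodec
import android.media.MediaMuxer

// Sketch of an output drain loop with a short dequeue timeout.
class EncoderDrain(private val codec: MediaCodec, private val muxer: MediaMuxer) {
    private var trackIndex = -1
    private var muxerStarted = false

    fun drain(endOfStream: Boolean) {
        val info = MediaCodec.BufferInfo()
        while (true) {
            // Timeout is in microseconds: 100_000 µs = ~100 ms,
            // short enough that shutdown stays responsive.
            val index = codec.dequeueOutputBuffer(info, 100_000L)
            when {
                index == MediaCodec.INFO_TRY_AGAIN_LATER -> {
                    // No output yet; on end-of-stream keep looping until
                    // BUFFER_FLAG_END_OF_STREAM arrives.
                    if (!endOfStream) return
                }
                index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED -> {
                    // The muxer track must come from the codec's actual
                    // output format, reported only after encoding starts.
                    trackIndex = muxer.addTrack(codec.outputFormat)
                    muxer.start()
                    muxerStarted = true
                }
                index >= 0 -> {
                    val buffer = codec.getOutputBuffer(index)
                    if (buffer != null && muxerStarted && info.size > 0) {
                        muxer.writeSampleData(trackIndex, buffer, info)
                    }
                    codec.releaseOutputBuffer(index, false)
                    if (info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM != 0) return
                }
            }
        }
    }
}
```

Keeping this loop on the same dedicated encoder thread as the input side avoids the indefinite-blocking hangs described earlier.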
Trade-offs and Lessons
- Frame dropping vs quality: Slightly choppy video is better than a crash
- Memory & CPU: Surface encoding offloads conversion to GPU, reducing memory footprint
- Background processing: Android 12+ may kill encoder threads if the app is backgrounded — use a foreground service
- Testing: Focus on MediaTek devices (Vivo, Oppo, Xiaomi) and high-FPS scenarios
Key Takeaways
- Avoid manual YUV conversion — it’s fragile and device-specific
- Use Surface-based encoding — hardware handles quirks automatically
- Test on multiple chipsets and Android versions — high signal beats theory
With this approach, your video pipeline will be production-ready, reliable, and maintainable — no device-specific hacks required.