DEV Community

Abdullah Taş

How I Eliminated Video Jank in Flutter Feeds — Instance Reuse Instead of Dispose/Create

Every Flutter developer who's built a video feed has hit the same wall. You scroll through a feed on a Redmi Note 8 Pro: by video 15, frames are dropping and the phone is warm; by video 30, you're hitting OOM crashes on budget devices.

The root cause isn't Flutter. It's the standard approach to video lifecycle.

The Problem: Dispose/Create on Every Scroll

Here's what most video feed implementations do:

```text
User scrolls from video 4 to video 5
→ controller_4.dispose()       // tears down decoder pipeline, releases texture
→ controller_5 = VideoPlayerController.network(url)  // allocates new decoder
→ controller_5.initialize()    // GPU texture allocation, ~15-30ms
→ controller_5.play()
```

Each cycle triggers:

  • Decoder teardown: The hardware decoder pipeline is torn down and rebuilt
  • Texture deallocation/reallocation: GPU memory freed, then re-acquired
  • GC pressure: Dart's garbage collector runs to clean up the old controller
  • Jank spike: 15-30ms pause visible as a frame drop during scroll

Over 100 scrolls, that's 100 allocation cycles. The jank compounds: GC pauses lengthen as the heap fragments, rising device temperature triggers thermal throttling that cuts clock speeds, and available memory shrinks.
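In video_player terms, the pattern above looks roughly like this. This is a sketch of the naive approach, not code from the package; the widget names are hypothetical:

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

/// The naive approach: one controller at a time, disposed and
/// recreated on every page change — all on the UI-critical path.
class NaiveFeed extends StatefulWidget {
  const NaiveFeed({super.key, required this.urls});
  final List<String> urls;

  @override
  State<NaiveFeed> createState() => _NaiveFeedState();
}

class _NaiveFeedState extends State<NaiveFeed> {
  VideoPlayerController? _controller;

  Future<void> _onPageChanged(int index) async {
    await _controller?.dispose(); // decoder teardown, texture freed
    final next =
        VideoPlayerController.networkUrl(Uri.parse(widget.urls[index]));
    await next.initialize();      // new decoder + GPU texture allocation
    await next.play();
    setState(() => _controller = next);
  }

  @override
  Widget build(BuildContext context) => PageView.builder(
        scrollDirection: Axis.vertical,
        itemCount: widget.urls.length,
        onPageChanged: _onPageChanged,
        itemBuilder: (_, i) => _controller?.value.isInitialized == true
            ? VideoPlayer(_controller!)
            : const Center(child: CircularProgressIndicator()),
      );
}
```

Every swipe pays the full dispose/allocate/initialize cost before the first new frame can render.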

The Solution: Never Dispose During Scroll

What if the players just... never went away?

That's the core idea behind video_pool. Create N player instances at startup (typically 3), and reuse them by swapping the video source:

```text
Pool init:  create Player-0, Player-1, Player-2
Scroll 1→2: Player-0 keeps video 1 (paused),  Player-1 plays video 2
Scroll 2→3: Player-0.swapSource(video 4),     Player-2 plays video 3
Scroll 3→4: Player-1.swapSource(video 5),     Player-0 plays video 4
...
Result: 3 players handle infinite scroll. Zero GC pressure.
```

swapSource() replaces the media URI without destroying the decoder pipeline or texture surface. The player stays allocated in GPU memory — only the bitstream changes.
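With a media_kit backend, for instance, this maps naturally onto `Player.open()`, which points an existing `Player` at new media without recreating it. A minimal sketch of the idea — the `PooledPlayer` wrapper and its method names are illustrative, not video_pool's actual internals:

```dart
import 'package:media_kit/media_kit.dart';

/// Sketch: one long-lived Player whose source is swapped in place.
/// Assumes MediaKit.ensureInitialized() was called at app startup.
class PooledPlayer {
  PooledPlayer() : _player = Player();
  final Player _player;

  /// Replaces the bitstream; the decoder pipeline and the texture
  /// surface behind the player stay allocated.
  Future<void> swapSource(String url, {bool autoPlay = false}) =>
      _player.open(Media(url), play: autoPlay);

  Future<void> play() => _player.play();
  Future<void> pause() => _player.pause();

  /// Only called at app shutdown — never during scroll.
  Future<void> dispose() => _player.dispose();
}
```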

The Architecture

The pool doesn't just reuse players. It's a full orchestration engine:

```text
User scrolls → VisibilityTracker (intersection ratios)
            → VideoPool.onVisibilityChanged(primary: 5, ratios: {...})
            → LifecycleOrchestrator.reconcile()
                → queries DeviceMonitor (thermal + memory state)
                → computes effective limits
                → produces ReconciliationPlan
            → VideoPool executes: release → preload → play → pause
```

Key design decisions:

1. Reconciliation plans, not direct mutation

The orchestrator never touches player state directly. It returns an immutable plan:

```dart
ReconciliationPlan(
  toRelease: {2}, // return to idle pool
  toPreload: {6}, // swap source for upcoming video
  toPlay:    {5}, // play (instant if preloaded)
  toPause:   {4}, // keep decoder, stop playback
)
```

The pool executes in strict order: release → preload → play → pause. This prevents race conditions and makes the behavior deterministic.
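The ordering can be sketched as a plain sequential executor. Types and names here mirror the article, not the package's real internals:

```dart
/// Operations the pool exposes to the executor (illustrative).
abstract class PoolOps {
  Future<void> release(int index);
  Future<void> preload(int index);
  Future<void> play(int index);
  Future<void> pause(int index);
}

class ReconciliationPlan {
  const ReconciliationPlan({
    this.toRelease = const <int>{},
    this.toPreload = const <int>{},
    this.toPlay = const <int>{},
    this.toPause = const <int>{},
  });
  final Set<int> toRelease, toPreload, toPlay, toPause;
}

/// Release runs first so freed players are available for preload;
/// pause runs last so the outgoing video keeps rendering until its
/// successor is already playing.
Future<void> execute(ReconciliationPlan plan, PoolOps pool) async {
  for (final i in plan.toRelease) await pool.release(i);
  for (final i in plan.toPreload) await pool.preload(i);
  for (final i in plan.toPlay) await pool.play(i);
  for (final i in plan.toPause) await pool.pause(i);
}
```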

2. Device adaptation via native monitoring

Native iOS/Android code (Swift/Kotlin) streams thermal state and memory pressure to Dart every 2 seconds:

| Condition | Pool response |
| --- | --- |
| Thermal nominal | Full capacity (3 players) |
| Thermal serious | Reduced to 2 players, preloading disabled |
| Thermal critical | 1 player only |
| Memory terminal | Emergency flush — dispose all non-playing players |

This happens automatically. No configuration needed.
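The table above boils down to a pure function from device state to effective limits. A sketch, with assumed enum values and signature:

```dart
/// Illustrative mapping from thermal state to pool limits; the enum
/// values and return shape are assumptions, not the package's API.
enum ThermalState { nominal, fair, serious, critical }

({int maxPlayers, bool preloadEnabled}) effectiveLimits(
    ThermalState thermal, int configuredMax) {
  switch (thermal) {
    case ThermalState.nominal:
    case ThermalState.fair:
      return (maxPlayers: configuredMax, preloadEnabled: true);
    case ThermalState.serious:
      return (
        maxPlayers: configuredMax < 2 ? configuredMax : 2,
        preloadEnabled: false,
      );
    case ThermalState.critical:
      return (maxPlayers: 1, preloadEnabled: false);
  }
}
```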

3. Disk pre-fetching in Isolate

A separate Dart Isolate downloads the first 2MB of upcoming videos to a 500MB LRU disk cache. When the user scrolls to a preloaded video, playback starts from the local file — near-instant first frame.
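The mechanism can be approximated with `Isolate.run` and an HTTP Range request — a sketch under assumed cache-path handling, not the package's actual prefetcher:

```dart
import 'dart:io';
import 'dart:isolate';

/// Illustrative prefetch: fetch the first 2 MB of a video with an
/// HTTP Range request and write it to the disk cache, off the main
/// isolate so the UI thread never blocks on network or disk I/O.
Future<void> prefetch(String url, String cachePath) =>
    Isolate.run(() async {
      final client = HttpClient();
      try {
        final req = await client.getUrl(Uri.parse(url));
        req.headers
            .set(HttpHeaders.rangeHeader, 'bytes=0-${2 * 1024 * 1024 - 1}');
        final res = await req.close();
        // pipe() drains the response into the file and closes the sink.
        await res.pipe(File(cachePath).openWrite());
      } finally {
        client.close();
      }
    });
```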

Usage

```dart
VideoPoolScope(
  config: const VideoPoolConfig(
    maxConcurrent: 3,
    preloadCount: 1,
  ),
  adapterFactory: (_) => MediaKitAdapter(),
  sourceResolver: (index) => videos[index],
  child: VideoFeedView(sources: videos),
)
```

That's a full TikTok-style feed. The VideoFeedView handles PageView snapping, visibility tracking, and lifecycle callbacks internally.

For Instagram-style mixed feeds:

```dart
VideoPoolScope(
  config: const VideoPoolConfig(maxConcurrent: 2, preloadCount: 1),
  adapterFactory: (_) => MediaKitAdapter(),
  sourceResolver: (index) => getVideoSource(index),
  child: VideoListView(
    itemCount: feedItems.length,
    itemBuilder: (context, index) {
      if (feedItems[index].isVideo) {
        return VideoCard(index: index, source: feedItems[index].source);
      }
      return TextPost(feedItems[index]);
    },
  ),
)
```

What's in v0.3.1

  • Event-sourced observability: All pool operations emit typed events (SwapEvent, ThrottleEvent, CacheEvent, etc.) via a stream
  • Bandwidth intelligence: EMA-based bandwidth estimation adjusts preload behavior based on network speed
  • Predictive scroll engine: Uses Flutter's scroll physics to predict where the user will stop, pre-loading that video
  • Cooperative multi-pool: Share hardware decoder budget across multiple pool instances (e.g., Feed tab + Discover tab)
  • 227 unit tests
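For a rough picture of the EMA idea behind the bandwidth estimation, here is an illustrative estimator (not the package's implementation):

```dart
/// Exponential moving average of observed throughput. New samples are
/// weighted by [alpha]; older history decays, so the estimate tracks
/// network shifts without overreacting to a single slow chunk.
class BandwidthEstimator {
  BandwidthEstimator({this.alpha = 0.3});
  final double alpha;
  double? _emaBytesPerSec;

  void addSample(int bytes, Duration elapsed) {
    if (elapsed == Duration.zero) return;
    final sample = bytes / (elapsed.inMicroseconds / 1e6);
    _emaBytesPerSec = _emaBytesPerSec == null
        ? sample
        : alpha * sample + (1 - alpha) * _emaBytesPerSec!;
  }

  /// Example policy: skip preloading below ~500 KB/s (threshold assumed).
  bool get fastEnoughToPreload => (_emaBytesPerSec ?? 0) > 500 * 1024;
}
```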

Try It

```yaml
dependencies:
  video_pool: ^0.3.1
  media_kit: ^1.1.11
  media_kit_video: ^1.2.5
  media_kit_libs_video: ^1.0.5
```

I'd love feedback — especially on API design, default values, and what features you'd need before using this in production. Open an issue or comment here.
