DEV Community

Ross Kim


Understanding the EIS Pipeline Through Gyroflow — A Look Inside Image Stabilization via Open Source

Introduction

In the previous article, I covered the differences between OIS, EIS, and HIS. For EIS, I explained it roughly as "correcting frames based on gyro data," but what actually happens inside is more complex than that suggests.

EIS is not simply "cropping a shaky video to stabilize it." It is a pipeline that reads data from gyro sensors, estimates camera motion, decides which movements to keep and which to remove, and finally transforms each frame.

In this article, I will reference an open-source project called Gyroflow to examine what each stage of the EIS pipeline actually does. While commercial EIS implementations are proprietary, Gyroflow implements the same principles as open source, making it an excellent learning resource.


What Is Gyroflow?

Gyroflow is an open-source video stabilization software that uses gyroscope data. It was originally built for post-processing stabilization of action camera and drone footage.

Here are its key features:

  • Gyro data-based: Estimates motion vectors from gyro sensor data rather than image analysis (optical flow)
  • Wide camera support: Can directly read gyro data from major action cameras including GoPro, DJI, Sony, and Insta360
  • Lens profiles: Includes built-in distortion models for various lenses, including fisheye
  • GPU acceleration: Supports near-real-time processing speeds via OpenCL/wgpu
  • Rust-based: Core logic is written in Rust, making the code relatively readable

The source code is available on GitHub, and the community is active. Beyond practical use, it holds significant value as a reference for understanding how EIS works internally.


EIS Pipeline Overview

The EIS pipeline can be broken down into five stages: gyro data acquisition → motion estimation → smoothing → frame warping → output.

Let's walk through each stage.


Stage 1: Gyro Data Acquisition

A gyroscope sensor measures the camera's angular velocity. In simple terms, it records in which direction and how fast the camera is rotating across three axes (pitch, yaw, roll).

This data is typically recorded alongside video frames at the camera firmware level. Action cameras like GoPro embed gyro data in the video file's metadata. Gyroflow parses this metadata to extract the gyro data.

There are some critical points here:

Timestamp Synchronization

The timestamps of gyro data and video frames must match precisely. Since the gyro sensor's sampling rate (typically 200Hz–8kHz) and the video frame rate (30fps, 60fps, etc.) differ, time synchronization is essential to correctly match gyro data to specific frames.

To illustrate how sensitive this synchronization is: an offset of just 10ms can cause the correction direction to reverse relative to the actual shake, effectively doubling the shake amplitude. At 30fps, one frame is approximately 33ms, so 10ms is only one-third of a frame — yet it is fatal to the result. Gyroflow allows adjusting this offset either automatically or manually, while commercial products typically handle synchronization at the hardware level.
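To make the matching concrete, here is a minimal sketch (not Gyroflow's actual code; `gyro_at` and its parameters are illustrative names) of sampling a gyro stream at a frame timestamp via linear interpolation, with an adjustable clock offset:

```python
import bisect

def gyro_at(timestamps, samples, t, offset_s=0.0):
    """Linearly interpolate a one-axis gyro stream at frame time t (s).

    offset_s models the gyro-to-video clock offset that must be
    estimated; getting it wrong by a few milliseconds makes the
    correction fight the real motion instead of cancelling it.
    """
    t += offset_s
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return samples[0]          # before first sample: clamp
    if i >= len(timestamps):
        return samples[-1]         # after last sample: clamp
    t0, t1 = timestamps[i - 1], timestamps[i]
    w = (t - t0) / (t1 - t0)
    return samples[i - 1] + w * (samples[i] - samples[i - 1])

# 200 Hz gyro stream matched to the timestamp of a 30 fps frame
ts = [k / 200.0 for k in range(200)]   # 1 s of sample times
vals = [0.1 * k for k in range(200)]   # synthetic yaw-rate ramp
rate = gyro_at(ts, vals, 1.0 / 30.0)   # falls between two gyro samples
```

Real implementations interpolate the quaternion rotation rather than raw rates, but the clock-alignment problem is the same.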

Gyro Data Noise

Gyro sensor data always contains noise. There is drift — a slow accumulation of error — as well as high-frequency noise. Using this noisy data directly results in jittery or unstable corrections. Therefore, preprocessing of the gyro data itself (such as a low-pass filter) is necessary.
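As a sketch of that preprocessing (assuming a simple first-order IIR filter, not Gyroflow's exact filter design), a low-pass over one gyro axis looks like this:

```python
import math

def lowpass(samples, dt, cutoff_hz):
    """First-order IIR low-pass over a gyro stream: high-frequency
    jitter is attenuated while slow motion passes through."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out

# 200 Hz stream: a constant 0.5 rad/s pan plus 60 Hz jitter
dt = 1.0 / 200.0
noisy = [0.5 + 0.3 * math.sin(2 * math.pi * 60 * k * dt) for k in range(400)]
smooth = lowpass(noisy, dt, cutoff_hz=5.0)
```

After filtering, the 60 Hz jitter is strongly attenuated while the underlying 0.5 rad/s pan survives, which is exactly the behavior wanted from this preprocessing step.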


Stage 2: Motion Estimation — Camera Orientation Calculation

Integrating gyro data (angular velocity) over time yields the camera's orientation. The direction the camera was pointing at each frame is expressed as a quaternion or rotation matrix.

Angular velocity data → Time integration → Per-frame camera orientation (quaternion)
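The integration step can be sketched as follows (a minimal pure-Python version, not Gyroflow's implementation): each gyro sample becomes a small rotation quaternion that is composed onto the running orientation.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(omega_samples, dt):
    """Integrate body-frame angular velocity (rad/s) into an
    orientation quaternion, one small rotation per sample."""
    q = (1.0, 0.0, 0.0, 0.0)  # identity: no rotation yet
    for wx, wy, wz in omega_samples:
        n = math.sqrt(wx*wx + wy*wy + wz*wz)
        angle = n * dt
        if angle > 0.0:
            s = math.sin(angle / 2.0) / n
            dq = (math.cos(angle / 2.0), wx*s, wy*s, wz*s)
            q = quat_mul(q, dq)
    return q

# 1 s of constant yaw at 90 deg/s should yield a 90-degree rotation
dt = 1.0 / 200.0
samples = [(0.0, 0.0, math.radians(90))] * 200
q = integrate_gyro(samples, dt)
half = math.radians(90) / 2.0  # expected half-angle of the result
```

Because every sample's error feeds into all later orientations, small biases in the raw rates accumulate, which is exactly the drift problem described next.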

Several practical issues arise in this process:

Integration Drift

Integrating angular velocity accumulates error. This is gyro drift. It is negligible over short intervals, but over tens of seconds to several minutes of footage, it builds up to significant levels. Gyroflow offers an option to use optical flow-based auxiliary estimation alongside gyro data to correct this drift.

Coordinate System Transformation

The gyro sensor's coordinate system does not always match the camera (image) coordinate system. The axis mapping varies depending on how the sensor is mounted on the circuit board. This is why Gyroflow has an IMU orientation setting. In commercial products, this mapping is fixed in firmware.
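Conceptually, this remapping is a signed permutation of the axes. The sketch below is illustrative only; the string format does not match Gyroflow's actual IMU orientation setting syntax:

```python
def remap_imu(sample, mapping):
    """Remap a gyro sample (x, y, z) from sensor axes to camera axes.

    mapping is a signed-permutation string such as "y -x z",
    read as: camera x = sensor y, camera y = -sensor x,
    camera z = sensor z. Hypothetical format, for illustration.
    """
    axes = {"x": 0, "y": 1, "z": 2}
    out = []
    for token in mapping.split():
        sign = -1.0 if token.startswith("-") else 1.0
        out.append(sign * sample[axes[token.lstrip("-")]])
    return tuple(out)

# Sensor mounted rotated 90 degrees on the PCB
remapped = remap_imu((1.0, 2.0, 3.0), "y -x z")
```

If this mapping is wrong, every downstream stage receives rotations about the wrong axes and the "correction" amplifies shake instead of removing it.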


Stage 3: Smoothing — Which Movements to Keep, Which to Remove

This stage is the core of EIS and has the greatest impact on output quality.

Camera movements fall into two categories:

  • Intended movements: Smooth camera motion the operator deliberately performs, such as panning and tilting
  • Unintended movements: Unwanted shake from hand tremor, walking vibration, etc.

The goal of EIS is to remove the latter while preserving the former. The challenge is that perfectly distinguishing between the two automatically is impossible.

Limitations of a Simple Low-Pass Filter

The most intuitive approach is to apply a low-pass filter (LPF) to the camera orientation data — remove high-frequency shake while passing through low-frequency motion.

However, this method has a fundamental problem:

  • Set the cutoff frequency low → Hand shake is well suppressed, but fast panning introduces lag (the video feels like it is trailing behind)
  • Set the cutoff frequency high → Panning response is good, but hand shake remains

Ultimately, a fixed LPF cannot simultaneously achieve "stable yet responsive" results.

Gyroflow's Smoothing Algorithms

Gyroflow provides several smoothing algorithms, notably:

  • Plain 3D: Gaussian smoothing in quaternion space. Simple but produces basic results.
  • Horizon lock: A special mode that locks the horizon. Frequently used for drone footage.
  • Fixed camera: Removes all movement as if the camera were fixed. Tripod effect.
  • Velocity dampened: Dynamically adjusts smoothing intensity based on camera movement speed. Strong smoothing when stationary, weak smoothing during panning.

Among these, velocity dampened is the most practical. It takes an adaptive approach to overcome the limitations of a simple LPF: "reduce smoothing when the camera is moving fast, strengthen smoothing when stationary."
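The idea behind velocity-dampened smoothing can be sketched in one dimension (all constants here are illustrative, not Gyroflow's): the filter coefficient grows with angular speed, so slow motion is smoothed hard while fast pans track closely.

```python
def adaptive_alpha(angular_speed, slow_alpha=0.02, fast_alpha=0.5, knee=0.5):
    """Blend smoothing strength by motion speed (rad/s):
    near-stationary -> small alpha (heavy smoothing),
    fast pan -> large alpha (light smoothing, little lag)."""
    t = min(angular_speed / knee, 1.0)
    return slow_alpha + t * (fast_alpha - slow_alpha)

def smooth_path(angles, speeds):
    """1-D sketch of velocity-dampened smoothing on a yaw path."""
    out, y = [], angles[0]
    for a, v in zip(angles, speeds):
        alpha = adaptive_alpha(v)
        y = y + alpha * (a - y)
        out.append(y)
    return out
```

A fixed-cutoff LPF corresponds to `adaptive_alpha` returning a constant; making it speed-dependent is what resolves the "stable yet responsive" conflict described above.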

Commercial EIS products also use similar adaptive smoothing in principle, but the specific implementations vary by manufacturer and are not publicly disclosed.


Stage 4: Frame Warping

Once a "corrected camera orientation" has been obtained through smoothing, the actual video frames need to be transformed.

The frame is rotated/transformed by the difference between the original camera orientation and the corrected camera orientation. This process is called frame warping.

How Warping Works

Simplified, the flow looks like this:

  1. Original frame's camera orientation: Q_original
  2. Smoothed target orientation: Q_smoothed
  3. Correction rotation: Q_correction = Q_smoothed × Q_original⁻¹
  4. Apply this correction rotation to the image

Multiplying by Q_original⁻¹ (the inverse of the quaternion) means "reversing the original frame's rotation." It is an operation that cancels out the original shaky orientation and moves the frame to the desired smooth orientation.
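The correction step from the list above can be written directly in quaternion arithmetic (a minimal sketch; for unit quaternions the inverse is simply the conjugate):

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_conj(q):
    """Conjugate == inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def correction(q_original, q_smoothed):
    """Q_correction = Q_smoothed x Q_original^-1: undo the shaky
    orientation, then apply the smooth target orientation."""
    return quat_mul(q_smoothed, quat_conj(q_original))

def rot_z(deg):
    """Rotation about the optical axis, for the check below."""
    h = math.radians(deg) / 2.0
    return (math.cos(h), 0.0, 0.0, math.sin(h))

q_o = rot_z(20.0)   # shaky original orientation
q_s = rot_z(10.0)   # smoothed target orientation
q_c = correction(q_o, q_s)
```

Applying `q_c` on top of `q_o` lands exactly on `q_s`, which is the whole point: the warp rotates each frame from where the camera actually pointed to where the smoothed path says it should have pointed.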

In practice, lens distortion correction is also performed alongside the rotation. If the image is rotated while wide-angle fisheye distortion remains, the edges exhibit a warping artifact. This is why Gyroflow requires a lens profile — the distortion model must be known for accurate warping.

FOV Loss

Rotating/transforming a frame creates empty areas at the edges. These empty areas must be hidden by cropping the video, which reduces the field of view (FOV).

This is the most fundamental trade-off in EIS:

  • Stronger correction → more cropping → greater FOV loss
  • Preserving FOV → correction intensity must be reduced → shake remains

Gyroflow allows users to directly adjust this crop ratio and offers a Dynamic Zoom feature that automatically calculates the optimal crop per frame. Commercial products typically use a fixed crop (around 10–15%) or resolution oversampling (4K capture → 1080p stabilized output) to make FOV loss imperceptible.
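How much zoom a given correction costs can be estimated geometrically. This sketch handles pure in-plane roll only (ignoring pitch/yaw and lens distortion): it computes the minimum zoom that keeps a rolled frame covering the original canvas.

```python
import math

def roll_zoom(theta_rad, width, height):
    """Minimum zoom factor so a frame rolled by theta still covers
    the original width x height canvas. Pure in-plane rotation only;
    real warping also involves pitch/yaw and lens distortion."""
    t = abs(theta_rad)
    ratio = max(width / height, height / width)
    return math.cos(t) + ratio * math.sin(t)

# Even a 2-degree roll correction on a 16:9 frame costs roughly
# 6% of the FOV, which is why correction strength and crop trade off
z = roll_zoom(math.radians(2.0), 1920, 1080)
```

Dynamic Zoom generalizes this: instead of a fixed worst-case crop, the required zoom is computed per frame (and smoothed over time) from the actual correction applied.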


Stage 5: Output

Encoding the warped frames produces the final stabilized video. Since Gyroflow is a post-processing tool, this occurs as a separate rendering step, whereas commercial real-time EIS executes this entire pipeline simultaneously with each frame capture.

Real-time processing introduces additional constraints:

  • Look-ahead limitation: Future frames available for smoothing are limited
  • Computational resources: Processing must complete within GPU/DSP resource budgets
  • Thermal/power: Must operate within the mobile device's thermal and power budget

Because of these constraints, even when using the same principles, there is a quality gap between post-processing (Gyroflow) and real-time (commercial EIS) output.


Differences Between Gyroflow and Commercial EIS

While Gyroflow helps us understand EIS principles, there are several important differences from commercial EIS:

| Aspect | Gyroflow (post-processing) | Commercial EIS (real-time) |
| --- | --- | --- |
| Processing timing | After recording | Simultaneous with recording |
| Smoothing look-ahead | Entire video available | A few to tens of frames |
| Lens distortion correction | Separate lens profile | Built into ISP |
| Gyro synchronization | Manual/auto estimation | Hardware synchronization |
| Computational resources | PC GPU (effectively unlimited) | Mobile DSP (limited) |
| FOV handling | Dynamic crop available | Typically fixed crop |

In commercial products, this entire pipeline runs with hardware acceleration inside the ISP (Image Signal Processor), often operating in hybrid mode coordinated with OIS.


Summary

On the surface, EIS can be summarized as "cropping to suppress shake," but internally, a complex pipeline exists:

  1. Gyro sensor data acquisition and time synchronization
  2. Camera orientation estimation through angular velocity integration
  3. Smoothing to separate intended movements from shake
  4. Frame warping that accounts for lens distortion
  5. Managing the trade-off between FOV loss and correction intensity

Gyroflow implements this pipeline transparently as open source, making it an excellent reference for understanding how EIS works. For those interested in reading the actual code, I recommend checking out the GitHub repository.

In the next article, I plan to cover the relationship between Rolling Shutter and EIS — why rolling shutter correction is additionally required in EIS.


This article was written based on hands-on experience in camera module development. The descriptions of Gyroflow reference its publicly available source code and documentation, and do not describe the internal implementation of any specific product. If you find any errors or have feedback, please leave a comment.
