Live streaming is everywhere. But a plain camera feed feels boring in 2026. Viewers expect filters, effects, branding — the kind of stuff you see on TikTok and Instagram Live.
In this post I'll show you how to build exactly that: an Android app that applies real-time DeepAR AR filters (think viking helmets, neon devil horns, cat ears) and custom canvas overlays (your logo, watermark, live badge) — and streams the whole thing live via Ant Media Server over WebRTC with ~0.5s latency.
The full sample project is on GitHub — AntMedia-DeepAR-And-Overlay
What you'll build
Two Activities, both streaming live:
- DeepARActivity — real-time AR face filters powered by the DeepAR SDK
- CustomCanvasActivity — custom bitmap/text overlays drawn with Android's Canvas API
Prerequisites
Before you start, make sure you have:
- Android Studio (Hedgehog or newer)
- A physical Android device (API 21+) — emulators don't do face tracking justice
- A DeepAR account and license key (free tier available)
- Git installed — Android Studio does not include Git, so install it separately from git-scm.com first
Step 1 — Clone the project
git clone https://github.com/USAMAWIZARD/AntMedia-DeepAR-And-Overlay.git
Open Android Studio → File → Open → select the cloned folder. Wait for Gradle sync to finish.
💡 Good news: the repo already has settings.gradle, build.gradle, and the assets folder pre-configured. You don't need to add dependencies manually — they're already there.
Step 2 — Get your DeepAR license key
- Sign up at developer.deepar.ai
- Create a new project → Add Android App
- Enter your exact package name: com.example.antmediacustomcanvasstreaming
- Copy the generated license key
Open res/values/strings.xml and paste it:
<resources>
<string name="app_name">AntMediaDeepAR</string>
<string name="deepar_license_key">YOUR_LICENSE_KEY_HERE</string>
</resources>
⚠️ The license key is tied to your exact package name. If the app crashes on launch, this is almost certainly why — double-check that the package name matches exactly.
Step 3 — Add the DeepAR effect files
Here's a gotcha that isn't obvious from the code: DeepARActivity references 17 specific effect files by name. If any of them are missing from your assets folder, the app crashes with no clear error message.
Go to developer.deepar.ai/downloads and download the DeepAR Filter Pack (Legacy) — this zip contains all the effects the code needs. Unzip it and copy the .deepar files into:
app/src/main/assets/
The code expects these files specifically:
viking_helmet.deepar
MakeupLook.deepar
Emotions_Exaggerator.deepar
Neon_Devil_Horns.deepar
Elephant_Trunk.deepar
flower_face.deepar
Humanoid.deepar
... (and more)
✅ Don't just download a couple — copy everything from the zip.
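If you'd rather fail fast than hit the opaque EGL crash described later, you can check for the required files up front. This is a plain-Java sketch of that idea (my own helper, not from the repo) — on Android you would pass in the result of `getAssets().list("")`. The list below covers only the files named above, not the full set of 17.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class EffectAssetCheck {
    // The effect files named above; the repo expects more (17 in total).
    static final List<String> REQUIRED = Arrays.asList(
            "viking_helmet.deepar", "MakeupLook.deepar", "Emotions_Exaggerator.deepar",
            "Neon_Devil_Horns.deepar", "Elephant_Trunk.deepar", "flower_face.deepar",
            "Humanoid.deepar");

    // Returns the required effect files absent from the given asset listing.
    static List<String> missing(List<String> assetNames) {
        List<String> absent = new ArrayList<>();
        for (String name : REQUIRED) {
            if (!assetNames.contains(name)) {
                absent.add(name);
            }
        }
        return absent;
    }
}
```

Calling something like this in onCreate and logging the result turns the silent crash into an actionable error message.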
Step 4 — How the DeepAR integration works
This is the interesting part. The key to making DeepAR and Ant Media work together is one line:
.setVideoSource(IWebRTCClient.StreamSource.CUSTOM)
By default IWebRTCClient opens the camera itself. Setting CUSTOM tells it: "I'll push frames to you manually." That's what lets you insert DeepAR (or Canvas drawing) between the camera and the WebRTC encoder.
Here's the flow:
CameraX → feedDeepAR() → DeepARRenderer (GLSurfaceView) → IWebRTCClient → Ant Media Server
The IWebRTCClient is set up like this:
webRTCClient = IWebRTCClient.builder()
.setServerUrl("wss://test.antmedia.io/LiveApp/websocket")
.setActivity(this)
.setVideoSource(IWebRTCClient.StreamSource.CUSTOM)
.setWebRTCListener(createWebRTCListener())
.setInitiateBeforeStream(true)
.build();
CameraX feeds each frame into DeepAR via receiveFrame():
deepAR.receiveFrame(
buffers[currentBuffer],
width,
height,
image.getImageInfo().getRotationDegrees(),
lensFacing == CameraSelector.LENS_FACING_FRONT, // mirror for selfie
DeepARImageFormat.YUV_420_888,
uPixelStride
);
DeepAR processes the frame (applies the AR effect using GPU), then DeepARRenderer pushes the result to the WebRTC client.
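The buffers[currentBuffer] passed to receiveFrame() above is a plain byte buffer sized for one YUV_420_888 frame, and an undersized buffer fails deep in native code. The sizing math is worth spelling out; here's a tiny sanity-check helper (my own, not from the repo):

```java
public class YuvMath {
    // YUV_420_888: one full-resolution Y plane plus two chroma planes at
    // quarter resolution, i.e. width*height * (1 + 1/4 + 1/4) = w*h*3/2 bytes.
    static int frameBytes(int width, int height) {
        return width * height * 3 / 2;
    }
}
```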
Switching effects mid-stream is one line and doesn't interrupt the broadcast at all:
deepAR.switchEffect("effect", "file:///android_asset/viking_helmet.deepar");
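A filter carousel on top of this is just index bookkeeping. Here's a minimal, hypothetical cycler (names are mine, not the repo's) that produces the android_asset URI in the form switchEffect expects:

```java
public class EffectCycler {
    private final String[] effectFiles;
    private int index = 0;

    EffectCycler(String[] effectFiles) {
        this.effectFiles = effectFiles;
    }

    // Advances to the next effect (wrapping around) and returns its asset URI.
    String next() {
        index = (index + 1) % effectFiles.length;
        return "file:///android_asset/" + effectFiles[index];
    }
}
```

Wire next() to a swipe gesture or button and pass the result straight to deepAR.switchEffect("effect", ...).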
Step 5 — Custom Canvas overlays
CustomCanvasActivity takes a different approach — no DeepAR needed. It grabs each camera frame as a Bitmap and paints your graphics on top using Android's Canvas API before pushing to IWebRTCClient.
private Bitmap addOverlay(Bitmap frame) {
Bitmap mutable = frame.copy(Bitmap.Config.ARGB_8888, true);
Canvas canvas = new Canvas(mutable);
// Logo — bottom-right corner
int lx = mutable.getWidth() - overlayBitmap.getWidth() - 20;
int ly = mutable.getHeight() - overlayBitmap.getHeight() - 20;
canvas.drawBitmap(overlayBitmap, lx, ly, null);
// Text watermark — top-left with drop shadow
Paint paint = new Paint();
paint.setColor(Color.WHITE);
paint.setTextSize(40f);
paint.setAntiAlias(true);
paint.setShadowLayer(4f, 2f, 2f, Color.BLACK);
canvas.drawText("@YourBrand", 24, 60, paint);
return mutable;
}
The same StreamSource.CUSTOM pattern applies — same builder, same sendFrameForProcessing() call.
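The only fiddly part of addOverlay() is the placement arithmetic. Factoring it out makes it easy to test and to re-anchor the logo to other corners later; a small helper of my own (not in the repo) that mirrors the math above:

```java
public class OverlayPlacement {
    // Top-left coordinates that anchor an overlay to the frame's bottom-right
    // corner with a uniform margin, matching the lx/ly math in addOverlay().
    static int[] bottomRight(int frameW, int frameH,
                             int overlayW, int overlayH, int margin) {
        return new int[]{frameW - overlayW - margin, frameH - overlayH - margin};
    }
}
```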
Step 6 — Run it
Connect your Android phone via USB, enable USB Debugging, and hit ▶ Run in Android Studio.
Grant camera and microphone permissions when prompted.
The repo already points to Ant Media's public test server — you don't need your own server to test this. Just tap Start in the app, then open this URL in your browser:
https://test.antmedia.io/LiveApp/player.html?id=test1
You should see your AR-filtered live stream playing in the browser within a second. 🎉
When you're ready to use your own server, just swap this line in setupStreamingAndPreview():
.setServerUrl("wss://YOUR-SERVER/LiveApp/websocket")
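If you parameterize the server instead of hardcoding the string, a tiny helper keeps the URL format consistent across activities (a hypothetical helper — the repo hardcodes the string):

```java
public class AntMediaUrls {
    // Builds the WebSocket endpoint in the form the SDK expects:
    // wss://HOST/APP_NAME/websocket
    static String websocketUrl(String host, String appName) {
        return "wss://" + host + "/" + appName + "/websocket";
    }
}
```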
Things that tripped me up (so they don't trip you up)
After going through this end-to-end, here's what I wish I'd known:
1. Install Git before you do anything else.
Android Studio doesn't come with Git. If you run git clone and get 'git' is not recognized, go install it from git-scm.com first, then reopen your terminal.
2. The Filter Pack zip is not obvious.
The downloads page shows "DeepAR Android SDK" which is the SDK itself. The effect files are in a separate "Filter Pack (Legacy)" download. Easy to miss, causes a silent crash.
3. Missing effect files = instant crash, no clear error.
The crash log shows an EGL surface error which looks completely unrelated to missing files. If your DeepAR activity crashes immediately, check your assets folder first.
4. The test server is genuinely useful.
wss://test.antmedia.io is Ant Media's public test server — it's real, it works, and it means you can verify your stream end-to-end without any server setup.
5. Use a physical device for DeepAR.
Emulator cameras are simulated and face tracking won't work meaningfully. The custom canvas overlay works fine on emulators though.
What to build next
Once you have this running, some natural next steps:
- Filter picker UI — a horizontal RecyclerView that calls deepAR.switchEffect() on tap
- Local recording — deepAR.startVideoRecording() saves an AR copy to the gallery while you stream
- Custom effects — build your own branded AR filter in DeepAR Creator Studio
- Your own server — sign up for Ant Media's free 14-day trial to get a cloud server with your own stream URLs
Links
- Sample repo: github.com/USAMAWIZARD/AntMedia-DeepAR-And-Overlay
- Full tutorial: antmedia.io/deepar-and-custom-overlay-with-ant-media-android-sdk
- DeepAR developer portal: developer.deepar.ai
- Ant Media docs: antmedia.io/docs
Have questions or hit a different error? Drop them in the comments — happy to help.