Live streaming has evolved beyond simple camera-to-viewer broadcasts. Today’s audiences expect interactive, engaging content with visual effects, branding elements, and augmented reality features.
In this blog post, we’ll explore two powerful approaches to enhancing your Android live streams with Ant Media Server: DeepAR integration for stunning AR effects, and Custom Canvas overlays for adding logos or other custom graphics to the video on the client side. Both effects can be applied and streamed using the Ant Media Server Android SDK.
Both approaches leverage Ant Media Server’s WebRTC capabilities to deliver low-latency, high-quality streams with custom visual enhancements applied in real time.
GitHub Repository: https://github.com/USAMAWIZARD/AntMedia-DeepAR-And-Overlay
Table of Contents
- Getting Started
- Section 1: DeepAR Activity – Augmented Reality for Live Streaming
- What is DeepAR?
- The DeepAR Streaming Experience
- How It Works (High Level)
- Use Cases
- Code Overview
- Section 2: Custom Canvas Activity – Branded Overlays for Professional Streams
- Why Custom Overlays?
- The Custom Overlay Experience
- Built-in Overlay Features
- How It Works (High Level)
- Use Cases
- Code Overview
- The Ant Media Server Advantage
- Conclusion
Getting Started
There are two sample activities: DeepARActivity.java and CustomCanvasActivity.java.
To use DeepAR, you need a license key: sign up on the DeepAR website, create a new project, select Android as the platform, and set the generated license key in the code.
- Clone the repository: `git clone https://github.com/USAMAWIZARD/AntMedia-DeepAR-And-Overlay`
- Build and run on your Android device
- Start streaming with effects or overlays!
- Play the stream with WebRTC
Section 1: DeepAR Activity – Augmented Reality for Live Streaming
View Source Code: DeepARActivity.java
What is DeepAR?
DeepAR is an augmented reality SDK that enables real-time face tracking, filters, and effects on mobile devices. When combined with Ant Media Server, you can stream AR-enhanced video to your audience, creating engaging and fun live experiences.
The DeepAR Streaming Experience
The DeepARActivity brings the power of augmented reality to your live streams. Imagine going live with a Viking helmet, neon devil horns, or even an elephant trunk – all rendered in real-time and streamed to your viewers.
How It Works (High Level)
- The camera captures your video feed
- Each frame passes through the DeepAR SDK, which applies face tracking and the selected AR effect
- The processed frame is sent to Ant Media Server via WebRTC
- Your viewers see the AR-enhanced stream in real-time with ultra-low latency
DeepAR Viking Helmet Effect: The Viking Helmet and neon horn effects in action – two of many AR filters available.
Use Cases
- Entertainment Streaming: Add fun filters to engage your audience during live shows
- Gaming Streams: React to gameplay with expressive emotion effects
- Virtual Events: Create memorable virtual appearances with unique AR effects
- Social Streaming: Stand out with creative filters on your live broadcasts
Code Overview
Let’s take a brief look at how the DeepARActivity is structured:
- Initialization
The activity initializes DeepAR with a license key and sets up the camera and streaming components:
```java
deepAR = new DeepAR(this);
deepAR.setLicenseKey("your-license-key");
deepAR.initialize(this, this);
initializeEffects();
setupCamera();
setupStreamingAndPreview();
```
- WebRTC Client Setup
The Ant Media WebRTC client is configured to use a custom video source, which allows us to feed AR-processed frames:
```java
webRTCClient = IWebRTCClient.builder()
        .setServerUrl("wss://test.antmedia.io/LiveApp/websocket")
        .setActivity(this)
        .setVideoSource(IWebRTCClient.StreamSource.CUSTOM)
        .setWebRTCListener(createWebRTCListener())
        .setInitiateBeforeStream(true)
        .build();
```
- Camera Frame Processing
Each camera frame is captured via CameraX’s ImageAnalysis and fed to DeepAR for AR processing:
```java
imageAnalysis.setAnalyzer(ContextCompat.getMainExecutor(this), image -> {
    try {
        feedDeepAR(image); // Send frame to DeepAR SDK
    } finally {
        image.close();
    }
});
```
- Effect Switching
Users can cycle through effects using simple navigation methods:
```java
public void nextEffect(View v) {
    currentEffect = (currentEffect + 1) % effects.size();
    deepAR.switchEffect("effect", getFilterPath(effects.get(currentEffect)));
}

public void previousEffect(View v) {
    currentEffect = (currentEffect - 1 + effects.size()) % effects.size();
    deepAR.switchEffect("effect", getFilterPath(effects.get(currentEffect)));
}
```
- Stream Control
Starting and stopping the stream is handled with a simple toggle:
```java
public void startStopStream(View v, String streamId) {
    if (!webRTCClient.isStreaming(streamId)) {
        ((Button) v).setText("Stop");
        webRTCClient.publish(streamId);
    } else {
        ((Button) v).setText("Start");
        webRTCClient.stop(streamId);
    }
}
```
The DeepARRenderer handles the OpenGL rendering and sends processed frames to the WebRTC client for streaming.
Section 2: Custom Canvas Activity – Branded Overlays for Professional Streams
View Source Code: CustomCanvasActivity.java
Why Custom Overlays?
While AR effects are fun, sometimes you need professional branding elements on your stream – your logo, text announcements, watermarks, or promotional graphics. The CustomCanvasActivity provides exactly this capability.
The Custom Overlay Experience
This activity demonstrates how to add static visual elements on top of your camera feed before streaming. Think of it as having your own broadcast graphics system built right into your app.
Built-in Overlay Features
The sample implementation includes two types of overlays:
Image Overlay
- Displays a custom image (logo or branding graphic)
- Set a custom image position
- Set a custom image size
- Perfect for watermarks and brand logos

Text Overlay
- Renders custom text directly on the video feed
- Customizable font size (64pt in the sample)
- Custom colors (red text in the sample)
- Set a custom position
- Great for titles, announcements, or hashtags
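The normalized positions used by the sample (0.8f, 0.8f for the logo, 0f, -0.3f for the text) suggest a center-origin coordinate system. As a minimal sketch, assuming coordinates in [-1, 1] with (0, 0) at the frame center and +Y pointing up (the OpenGL convention – an assumption, since the renderer’s exact convention isn’t shown here), converting them to pixel positions looks like this:

```java
// Hypothetical helper: maps center-origin normalized coordinates in [-1, 1]
// (OpenGL-style, +Y up) to pixel positions with (0, 0) at the top-left.
// The actual Overlay class may use a different convention.
public class OverlayCoords {

    // x = -1 is the left edge, x = +1 the right edge.
    public static int toPixelX(float x, int frameWidth) {
        return Math.round((x + 1f) / 2f * frameWidth);
    }

    // y = +1 is the top edge, y = -1 the bottom edge, so flip the axis.
    public static int toPixelY(float y, int frameHeight) {
        return Math.round((1f - y) / 2f * frameHeight);
    }

    public static void main(String[] args) {
        // The logo at (0.8, 0.8) on a 1080x1920 frame lands near the top-right.
        System.out.println(toPixelX(0.8f, 1080) + ", " + toPixelY(0.8f, 1920));
    }
}
```

Under this convention the sample’s text overlay at (0f, -0.3f) sits horizontally centered and slightly below the middle of the frame.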
Custom Overlay Demo: Camera feed with logo overlay and text overlay applied.
How It Works (High Level)
- CameraX captures your camera feed
- An OpenGL ES renderer processes each frame
- Custom overlays (images and text) are composited onto the video
- The final composited frame streams to Ant Media Server via WebRTC
- Viewers receive your branded stream in real-time
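The compositing step in the pipeline above boils down to “source-over” alpha blending: each overlay pixel is mixed with the camera pixel underneath it in proportion to the overlay’s alpha. The real renderer does this on the GPU; the following is only a CPU sketch of the math, not the actual ImageProxyRenderer code:

```java
// Minimal CPU sketch of "source-over" alpha blending for one ARGB pixel.
// Shown only to illustrate how an overlay is composited onto a camera frame;
// the sample app performs the equivalent operation in OpenGL.
public class AlphaBlend {

    // src is the overlay pixel (ARGB), dst the camera pixel underneath it.
    public static int blendOver(int src, int dst) {
        int sa = (src >>> 24) & 0xFF;                 // overlay alpha
        int r = blendChannel(src >>> 16, dst >>> 16, sa);
        int g = blendChannel(src >>> 8, dst >>> 8, sa);
        int b = blendChannel(src, dst, sa);
        return 0xFF000000 | (r << 16) | (g << 8) | b; // camera frame stays opaque
    }

    private static int blendChannel(int src, int dst, int alpha) {
        src &= 0xFF;
        dst &= 0xFF;
        return (src * alpha + dst * (255 - alpha)) / 255;
    }

    public static void main(String[] args) {
        // A fully opaque overlay pixel replaces the camera pixel entirely.
        System.out.printf("%08X%n", blendOver(0xFF112233, 0xFF445566));
    }
}
```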
Use Cases
- Corporate Streaming: Add company logos and branding to internal broadcasts
- Educational Content: Display titles, chapter names, or key points
- News & Media: Show channel branding and lower-thirds
- Product Launches: Overlay promotional text and graphics
- Influencer Streams: Watermark your content with your brand
Code Overview
Let’s explore how the CustomCanvasActivity brings overlays to your stream:
- WebRTC Client Setup
Similar to DeepAR, we configure the WebRTC client to use a custom video source:
```java
webRTCClient = IWebRTCClient.builder()
        .setServerUrl("wss://test.antmedia.io/LiveApp/websocket")
        .setActivity(this)
        .setVideoSource(IWebRTCClient.StreamSource.CUSTOM)
        .setWebRTCListener(createWebRTCListener())
        .setInitiateBeforeStream(true)
        .build();
```
- Creating Overlays
Overlays are created when the OpenGL surface is initialized. You can add image overlays and text overlays with custom positioning:
```java
imageProxyRenderer = new ImageProxyRenderer(webRTCClient, this, surfaceView, new CanvasListener() {
    @Override
    public void onSurfaceInitialized() {
        // Image overlay: positioned at 80% X, 80% Y with 20% size
        logoOverlay = new Overlay(getApplicationContext(), R.drawable.test, 0.8f, 0.8f);
        logoOverlay.setSize(0.2f);

        // Text overlay: "Hello" in red, 64pt font, positioned at center X, -30% Y
        textOverlay = new Overlay(getApplicationContext(), "Hello", 64, Color.RED, 0f, -0.3f);
        textOverlay.setSize(0.12f);
    }
});
```
- Camera Frame Processing
Each camera frame is submitted to the renderer for overlay compositing:
```java
imageAnalysis.setAnalyzer(ContextCompat.getMainExecutor(this), new ImageAnalysis.Analyzer() {
    @Override
    public void analyze(@NonNull ImageProxy image) {
        imageProxyRenderer.submitImage(image); // Send frame to renderer
        if (surfaceView != null) {
            surfaceView.requestRender(); // Trigger OpenGL render
        }
        image.close();
    }
});
```
- Camera Switching
Switching between front and back cameras is handled by the CameraProviderHelper:
```java
switchCam.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        cameraProviderHelper.switchCamera(imageAnalysis);
    }
});
```
The ImageProxyRenderer handles the OpenGL compositing of overlays onto the camera frames before sending them to the WebRTC client.
The Ant Media Server Advantage
Both activities connect to Ant Media Server using WebRTC, which provides:
- Ultra-Low Latency: Sub-second delay for real-time interaction
- Scalability: Handle thousands of concurrent viewers
- Cross-Platform Playback: Viewers can watch on any device or browser
- Adaptive Bitrate: Automatic quality adjustment based on network conditions
Conclusion
Whether you’re looking to add fun AR effects to engage your audience or professional branding to your streams, Ant Media Server provides the foundation for high-quality, low-latency broadcasts. The DeepARActivity and CustomCanvasActivity demonstrate just how easy it is to elevate your live streaming experience on Android.
The best part? Both approaches can be customized and extended to match your specific needs. Add your own AR effects, design custom overlays, or combine both techniques for the ultimate streaming experience.
Ready to take your live streams to the next level? Clone the project and start experimenting today!