Hey folks! Today I'll show you how to implement a casting feature in your Flutter app. It's especially useful for video apps, courses, streaming platforms like Netflix or YouTube, or even fitness apps.
## What is Cast?
"Cast" is the ability to send media (like video or audio) from your app to another device,
such as a Chromecast-enabled TV or an AirPlay-compatible receiver. The content plays
on the external screen, while your app becomes a remote control.
## How does it work?
Casting protocols work mainly in two ways:
- Chromecast (Google Cast): The receiver device (usually a smart TV or dongle) runs a player and receives control commands over the network.
- AirPlay (Apple): Native on Apple TVs and compatible smart TVs. iOS can also mirror the screen when casting isn't supported directly.

## Documentation and Useful Links

- Cast package on pub.dev: https://pub.dev/packages/cast
- Google Cast SDK Developer Docs: https://developers.google.com/cast/docs/developers
- Apple AirPlay Overview: https://developer.apple.com/airplay/
- flutter_to_airplay package: https://pub.dev/packages/flutter_to_airplay
## How to implement it?
First, add the necessary dependencies to your `pubspec.yaml`:

```yaml
dependencies:
  cast: ^2.1.0
  flutter_to_airplay: ^2.0.5
```

Then run:

```bash
flutter pub get
```
⚠️ Important: read the documentation for the `cast` package carefully; you must configure AndroidManifest.xml, permissions, and iOS protocols as required.
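As a rough sketch of what that configuration usually involves (these are typical entries for mDNS-based Cast discovery, not copied from the package README, so verify them against the docs for your package version):

```xml
<!-- AndroidManifest.xml: typical network permissions for Cast device discovery -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE" />
```

On iOS 14+, local network discovery also requires usage descriptions in `ios/Runner/Info.plist`:

```xml
<!-- Info.plist: local network + Bonjour entries for Cast discovery -->
<key>NSLocalNetworkUsageDescription</key>
<string>Used to discover Cast devices on your network.</string>
<key>NSBonjourServices</key>
<array>
  <string>_googlecast._tcp</string>
</array>
```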
## Creating the Cast Service
To follow Clean Architecture, let’s encapsulate the cast logic in a service:
```dart
import 'package:cast/cast.dart';
// Either / Left / Right come from the dartz package (fpdart works too).
import 'package:dartz/dartz.dart';

class VideoCastService {
  CastSession? currentSession;

  // Required by media commands; update it from MEDIA_STATUS messages
  // received on the message stream.
  int mediaSessionId = 0;

  Future<Either<Exception, List<CastDevice>>> searchDevices() async {
    try {
      final devices = await CastDiscoveryService().search();
      return Right(devices);
    } on Exception catch (error) {
      return Left(error);
    }
  }

  Future<void> close({required CastSession session}) async {
    await session.close();
  }

  void forward10seconds(double currentTime) {
    currentSession?.sendMessage(CastSession.kNamespaceMedia, {
      "type": "SEEK",
      "mediaSessionId": mediaSessionId,
      "currentTime": currentTime + 10,
      "resumeState": "PLAY",
    });
  }

  void backwards10seconds(double currentTime) {
    currentSession?.sendMessage(CastSession.kNamespaceMedia, {
      "type": "SEEK",
      "mediaSessionId": mediaSessionId,
      "currentTime": currentTime - 10,
      "resumeState": "PLAY",
    });
  }

  void stopSession() {
    currentSession?.sendMessage(
      CastSession.kNamespaceMedia,
      {"type": "STOP", "mediaSessionId": mediaSessionId},
    );
    currentSession = null;
  }

  void playMedia() {
    currentSession?.sendMessage(CastSession.kNamespaceMedia, {
      'type': "PLAY",
      'mediaSessionId': mediaSessionId,
    });
  }

  void setVolume({required bool isMuted}) {
    currentSession?.sendMessage(CastSession.kNamespaceReceiver, {
      'type': "SET_VOLUME",
      'volume': {"muted": isMuted},
    });
  }

  void pauseMedia() {
    currentSession?.sendMessage(CastSession.kNamespaceMedia, {
      'type': 'PAUSE',
      'mediaSessionId': mediaSessionId,
    });
  }

  Future<void> connect({required CastDevice device}) async {
    currentSession = await CastSessionManager().startSession(
      device,
      const Duration(minutes: 30),
    );
    launchDefaultMediaApp();
  }

  Stream<CastSessionState>? listenToStateStream() =>
      currentSession?.stateStream;

  Stream<Map<String, dynamic>>? listenToMessageStream() =>
      currentSession?.messageStream;

  void launchDefaultMediaApp() {
    currentSession?.sendMessage(CastSession.kNamespaceReceiver, {
      'type': 'LAUNCH',
      'appId': 'CC1AD845', // Google's Default Media Receiver
    });
  }

  void getMediaStatus() {
    currentSession?.sendMessage(
      CastSession.kNamespaceMedia,
      {"type": "GET_STATUS"},
    );
  }

  void getReceiverStatus() {
    // Receiver status queries go through the receiver namespace,
    // not the media namespace.
    currentSession?.sendMessage(
      CastSession.kNamespaceReceiver,
      {"type": "GET_STATUS"},
    );
  }

  Future<void> openMedia({required VideoCastParams params}) async {
    final media = {
      'contentId': params.url,
      'contentType': 'video/dash', // use the MIME type of your stream
      'streamType': 'BUFFERED',
      'metadata': {
        'type': 0,
        'metadataType': 0,
        'title': params.title,
        'images': [
          {'url': params.thumb},
        ],
      },
    };
    if (currentSession == null) return;
    // Give the receiver app a moment to launch before sending LOAD.
    await Future.delayed(const Duration(seconds: 5));
    currentSession?.sendMessage(CastSession.kNamespaceMedia, {
      'type': 'LOAD',
      'autoPlay': true,
      'currentTime': 0,
      'media': media,
    });
  }
}

/// Simple parameter object for [VideoCastService.openMedia].
class VideoCastParams {
  const VideoCastParams({
    required this.url,
    required this.title,
    required this.thumb,
  });

  final String url;
  final String title;
  final String thumb;
}
```
You can inject this service with GetIt, Riverpod, or any other DI solution, and create a Cubit or Bloc to consume it; an enum for the connection flow keeps the state handling simple.
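As a minimal sketch of that idea (assuming `flutter_bloc`; `CastStatus`, `CastState`, and `CastCubit` are illustrative names, not part of the `cast` package, and `searchDevices` is assumed to return a dartz `Either`):

```dart
import 'package:cast/cast.dart';
import 'package:flutter_bloc/flutter_bloc.dart';

// Illustrative connection-flow states for the UI.
enum CastStatus { idle, searching, connecting, connected, error }

class CastState {
  const CastState(this.status, {this.devices = const []});
  final CastStatus status;
  final List<CastDevice> devices;
}

class CastCubit extends Cubit<CastState> {
  CastCubit(this._service) : super(const CastState(CastStatus.idle));

  final VideoCastService _service;

  Future<void> search() async {
    emit(const CastState(CastStatus.searching));
    final result = await _service.searchDevices();
    result.fold(
      (error) => emit(const CastState(CastStatus.error)),
      (devices) => emit(CastState(CastStatus.idle, devices: devices)),
    );
  }

  Future<void> connect(CastDevice device) async {
    emit(const CastState(CastStatus.connecting));
    await _service.connect(device: device);
    emit(const CastState(CastStatus.connected));
  }
}
```

The UI then only has to switch on `state.status`, which keeps the widget layer free of any Cast protocol details.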
## AirPlay support (iOS)
As a fallback for iOS, when Google Cast is not supported, we can use `flutter_to_airplay`.
Displaying the AirPlay button:
```dart
AirPlayRoutePickerView(
  tintColor: Colors.white,
  activeTintColor: Colors.blue,
  backgroundColor: Colors.transparent,
)
```
This widget displays the native system button, opening the AirPlay or screen mirroring
interface.
## Fallback: Native screen mirroring (iOS)

If the device doesn't support AirPlay directly, the AirPlayRoutePickerView lets users enable screen mirroring instead, ensuring your video is still shown.
✅ Tip: You can listen to the CastService stream to detect when casting starts or stops, and adapt your UI accordingly (e.g., hiding local controls or changing the player view).
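For example, a small sketch of that (the `CastSessionState` values are assumed from the `cast` package; verify them against your package version):

```dart
import 'package:cast/cast.dart';

void watchCastState(VideoCastService service) {
  service.listenToStateStream()?.listen((state) {
    if (state == CastSessionState.connected) {
      // Hide the local player controls and show a "casting" view.
    } else if (state == CastSessionState.closed) {
      // Restore local playback.
    }
  });
}
```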
## Conclusion
With the `cast` and `flutter_to_airplay` packages, you can implement a complete casting experience in Flutter, supporting both Google Cast and AirPlay on iOS. Even though Flutter doesn't natively handle all media controls, this solution gives you great flexibility.
To make your app more robust, consider:
- Showing visual feedback when connected.
- Centralizing media controls (play, pause, seek).
- Displaying a casting icon in your app bar.

If you want the full example or have questions, feel free to reach out on Instagram @kiustudios or comment below!