Prince Tomar

Building Face Detection and Emotion Recognition in Flutter Using Google ML Kit

Face detection used to sound futuristic. Now it's just part of how tech sees and responds to people. Whether it’s unlocking your phone, putting cat ears on your selfie, or tracking how users feel about a product — facial analysis is everywhere.

Let’s walk through how to build this in Flutter using Google’s ML Kit. We'll detect faces in a live camera feed, grab facial landmarks, and even infer simple emotions like happy, sleepy, or neutral.

Whether you’re just starting with Flutter or already shipping apps, this is worth adding to your toolkit.


Why Even Care About Face Detection?

It’s not just about knowing there’s a face. You can:

  • Track multiple faces in real time
  • Get details like smile intensity, eye openness, head rotation
  • Build smarter features: filters, auto-focus, or even mood-based UI
  • Add a touch of fun or utility — or both

🔧 Setup: What You’ll Need

Step 1: Add Dependencies

In pubspec.yaml, toss in:

dependencies:
  google_mlkit_face_detection: ^0.10.0
  camera: ^0.10.5
  permission_handler: ^11.0.0

Run:

flutter pub get

Step 2: Android & iOS Permissions

Android

  • Set minSdkVersion to 21 in android/app/build.gradle
  • Add this to AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />

iOS

  • Open Info.plist and add:
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for face detection.</string>
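
Since permission_handler is already in pubspec.yaml, you can also request camera access at runtime before opening the feed. A minimal sketch (when to ask and how to handle denial is up to your app):

import 'package:permission_handler/permission_handler.dart';

// Ask for camera access before starting the camera feed.
Future<bool> ensureCameraPermission() async {
  final status = await Permission.camera.request();
  return status.isGranted;
}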

📸 Open the Camera Feed

Let’s stream live images from the camera:

final controller = CameraController(camera, ResolutionPreset.medium);
await controller.initialize();
await controller.startImageStream(onImageAvailable);

Your onImageAvailable function will handle incoming frames for face detection.
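
Here, camera is a CameraDescription. One way to pick the front-facing camera (a sketch that falls back to the first available camera):

import 'package:camera/camera.dart';

Future<CameraDescription> getFrontCamera() async {
  final cameras = await availableCameras();
  // Prefer the front camera for selfie-style face detection.
  return cameras.firstWhere(
    (c) => c.lensDirection == CameraLensDirection.front,
    orElse: () => cameras.first,
  );
}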


🧠 Face Detection in Action

Set up the face detector like this:

final options = FaceDetectorOptions(
  enableContours: true,
  enableClassification: true, // for smile & eye open detection
  performanceMode: FaceDetectorMode.fast,
);

final faceDetector = FaceDetector(options: options);

Process incoming images from the stream:

Future<void> onImageAvailable(CameraImage image) async {
  final inputImage = getInputImageFromCameraImage(image);
  final faces = await faceDetector.processImage(inputImage);

  for (Face face in faces) {
    final smile = face.smilingProbability;
    final leftEye = face.leftEyeOpenProbability;
    print('Smile: $smile | Left Eye Open: $leftEye');
  }
}

These probabilities (from 0.0 to 1.0) give you real-time signals to work with.
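
One thing to note: getInputImageFromCameraImage isn't part of ML Kit — you write it yourself. Here's a simplified sketch that assumes the controller was created with imageFormatGroup: ImageFormatGroup.nv21 on Android (or bgra8888 on iOS), so each frame arrives as a single plane, and that rotation is hard-coded:

import 'dart:ui' show Size;
import 'package:camera/camera.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

InputImage getInputImageFromCameraImage(CameraImage image) {
  // Assumes a single-plane frame (NV21 on Android, BGRA8888 on iOS).
  final plane = image.planes.first;
  return InputImage.fromBytes(
    bytes: plane.bytes,
    metadata: InputImageMetadata(
      size: Size(image.width.toDouble(), image.height.toDouble()),
      // Hard-coded for simplicity; in a real app derive this from the
      // device orientation and the camera's sensorOrientation.
      rotation: InputImageRotation.rotation0deg,
      format: InputImageFormatValue.fromRawValue(image.format.raw) ??
          InputImageFormat.nv21,
      bytesPerRow: plane.bytesPerRow,
    ),
  );
}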


😄 Emotion Detection (Basic)

ML Kit doesn’t return labels like "happy" or "sad", but we can infer a rough mood from the classification probabilities:

String analyzeEmotion(Face face) {
  final smile = face.smilingProbability ?? 0;
  final eye = face.leftEyeOpenProbability ?? 0;

  if (smile > 0.8) return "Happy";
  if (eye < 0.3) return "Sleepy";
  return "Neutral";
}

You can tweak thresholds and combine features depending on your use case.
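
To surface this in your UI, you could call analyzeEmotion inside the stream callback and push the result into state. A rough sketch (assuming a StatefulWidget with a hypothetical _mood field driving the UI):

// Inside your State class.
String _mood = "Neutral";

void updateMood(List<Face> faces) {
  if (faces.isEmpty) return;
  final mood = analyzeEmotion(faces.first);
  if (mood != _mood) {
    setState(() => _mood = mood);
  }
}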

Want deeper emotion classification? You’ll need a custom model via TensorFlow Lite. But that’s a different beast — and worth its own article.


⚡ Where You Can Use This

  • Real-time selfie filters (like Snap lenses, but yours)
  • Smile-to-unlock actions
  • Mood-based UI tweaks or chatbot reactions
  • Attendance systems that detect faces, not just IDs
  • Face-aware focus tracking in camera apps

🔐 Privacy Matters

Even though ML Kit processes everything on-device, be upfront:

  • Ask for camera permissions clearly
  • Don’t store facial data unless absolutely necessary
  • Mention any facial analysis in your privacy policy

People care about how they’re being watched. Respect that.


🧪 Test Smart

Face detection is tricky to test on emulators. Use a real device, preferably with decent lighting. Try different angles, faces, and distances.

If you’re using a live camera feed, latency and device performance will also affect results, especially if you’re tracking multiple faces.
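
A simple way to keep up with the stream is to drop frames while a previous one is still being processed. A sketch using a plain boolean flag (this would replace the earlier onImageAvailable):

bool _isProcessing = false;

Future<void> onImageAvailable(CameraImage image) async {
  if (_isProcessing) return; // skip frames while the detector is busy
  _isProcessing = true;
  try {
    final inputImage = getInputImageFromCameraImage(image);
    final faces = await faceDetector.processImage(inputImage);
    // ...use the faces here...
  } finally {
    _isProcessing = false;
  }
}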


🚀 Want to Go Beyond?

If you’re feeling ambitious:

  • Build filters using facial contours
  • Animate avatars using facial landmarks
  • Combine with voice recognition for AI assistants
  • Integrate TensorFlow Lite models for emotion labels like “angry”, “excited”, etc.

The face is just the start. Combine it with other signals and your app starts to feel… aware.


✅ Wrap-Up

Face detection in Flutter is no longer a hacky side project — it’s production-ready, thanks to Google ML Kit.

You can build apps that see, analyze, and respond — in real time, offline, and without sending anything to the cloud.

Give it a shot. Add some eyes to your app.


If this helped, consider clapping, sharing, or reaching out if you build something cool with it. I'm always up for seeing how people push Flutter and ML in the real world.

