Bishop Uzochukwu

Building a Camera App with Flutter and the Camera Package

Introduction

The camera package is a Dart package that provides an interface for accessing the device's camera. It's a popular package for building camera apps and is maintained by the Flutter team. The camera package provides a lot of functionality, including displaying the camera preview, capturing photos and videos, adjusting camera settings, and more.

In this article, we will explore the Camera package in Flutter to build a camera app. We will cover the basics of capturing photos and videos using the camera package.

Advantages of Using the Camera Package

There are several advantages to using the camera package for building camera apps with Flutter. Here are some of the main benefits:

  • Ease of use: The camera package provides a simple and easy-to-use interface for accessing the device's camera, making it easy for developers to build camera apps quickly.
  • Cross-platform compatibility: Since Flutter is a cross-platform framework, apps built with the camera package can run on both Android and iOS platforms.
  • Performance: The camera package is optimized for performance, providing a smooth and responsive camera experience for users.
  • Customization: The camera package provides a lot of customization options, allowing developers to adjust camera settings, add filters and effects, and more.

Prerequisites

Before we begin, make sure you have the following prerequisites:

  • Flutter installed on your machine
  • An Android or iOS device connected to your machine

Haven't installed Flutter yet? Check out my Getting Started with Flutter article; everything you need to install Flutter is explained there.

Let's refocus on the subject of this article and start developing our camera app.

Building the Camera App

Let's begin building our camera app with the help of Flutter and the camera package. Here are the steps to get you started.

Setup

Create a new Flutter app by running the following command:



flutter create flutter_camera_app



This will create a new Flutter project named flutter_camera_app.

Add the camera and video_player packages to your project

The video_player package is responsible for rendering captured videos on the screen. We will be using it to display recorded videos from the camera plugin.

To use these packages in your Flutter project, you need to add them to your project's pubspec.yaml file. Here's an example:



dependencies:
  camera: ^0.10.0
  video_player: ^2.5.2



Once you have added these entries to your pubspec.yaml file, open a terminal in VS Code or your code editor of choice and run the following command from the project's root directory:
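
flutter pub get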

This will download and install the respective packages and their dependencies.

Import the packages

After adding the camera and video_player packages to your project, you need to import them into your Flutter code. Add the following import statements to your code:



import 'package:camera/camera.dart';
import 'package:video_player/video_player.dart';



Implement the root widget of our app

To configure the home screen of our app, we set the root widget to display the CameraPage widget and apply a dark theme for a more visually appealing experience.



void main() => runApp(const MyApp());

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Camera App',
      themeMode: ThemeMode.dark,
      theme: ThemeData.dark(),
      debugShowCheckedModeBanner: false,
      home: const CameraPage(),
    );
  }
}



Implement the CameraPage widget class

The CameraPage widget class is responsible for handling critical tasks like previewing the live camera feed, capturing photos, and recording videos in our application. To configure this widget class to effectively fulfill these tasks, we'll convert it into a stateful widget and set up the required parameters for the camera package within the state class to ensure seamless operations.



class CameraPage extends StatefulWidget {
  const CameraPage({super.key});

  @override
  CameraPageState createState() => CameraPageState();
}

class CameraPageState extends State<CameraPage> with WidgetsBindingObserver {
  CameraController? _controller;
  bool _isCameraInitialized = false;
  late final List<CameraDescription> _cameras;
  bool _isRecording = false;

  @override
  Widget build(BuildContext context) {
    // Placeholder build method; we will replace this with the camera UI later
    return Container();
  }
}



You may be curious why we added the WidgetsBindingObserver mixin to the state class of the CameraPage widget; we will explain this later in the article.

In our implementation, the _controller property of type CameraController serves as the controller for the camera, exposing all camera-specific operations such as previewing the camera feed, taking photos, and recording videos.

We also utilize two boolean flags, _isCameraInitialized and _isRecording, to manage state changes and update the UI in response to events from the _controller property.

Lastly, the List<CameraDescription> _cameras property is a crucial component of this widget class, holding the list of cameras available on the device where the app is running.

Initializing the camera

To gain access to the camera preview and initialize the camera, the next step involves calling the availableCameras() method from the camera package. This method returns a list of available cameras, which you can utilize to enable camera functionality within your application.



Future<void> initCamera() async {
    _cameras = await availableCameras();
    // Initialize the camera with the first camera in the list
    await onNewCameraSelected(_cameras.first);
  }



This method gets the list of available cameras from the device and passes the first camera in the list (typically the back camera) to onNewCameraSelected, which is responsible for initializing that camera through the _controller property. The implementation of the onNewCameraSelected method is shown below.



Future<void> onNewCameraSelected(CameraDescription description) async {
    final previousCameraController = _controller;

    // Instantiating the camera controller
    final CameraController cameraController = CameraController(
      description,
      ResolutionPreset.high,
      imageFormatGroup: ImageFormatGroup.jpeg,
    );

    // Initialize controller
    try {
      await cameraController.initialize();
    } on CameraException catch (e) {
      debugPrint('Error initializing camera: $e');
    }
    // Dispose the previous controller
    await previousCameraController?.dispose();

    // Replace with the new controller
    if (mounted) {
      setState(() {
        _controller = cameraController;
      });
    }

    // Update UI if controller updated
    cameraController.addListener(() {
      if (mounted) setState(() {});
    });

    // Update the Boolean
    if (mounted) {
      setState(() {
        _isCameraInitialized = _controller!.value.isInitialized;
      });
    }
  }



In the above code, previousCameraController is related to the AppLifecycleState changes and the WidgetsBindingObserver mixin, which we will discuss in more detail shortly. Moving to the next code block, we instantiate a cameraController from the CameraDescription parameter and set the camera's resolution to ResolutionPreset.high to get clear feedback in the camera preview.

Afterwards, we proceed to initialize the cameraController, wrapping it in a try-catch block to handle any possible exceptions that may arise. Upon successful initialization, we then reassign the _controller property to this new cameraController object, using the setState method and ensuring that the widget class is mounted before doing so.

The cameraController.addListener(() {}) call listens for updates to the controller object; setState is called inside the callback so the build method re-runs and the camera preview renders seamlessly. This listener is closely linked with the AppLifecycleStates and WidgetsBindingObserver. Finally, in the dispose method, we make sure to dispose of the _controller so the camera is released properly whenever the widget is removed from the tree.



@override
  void dispose() {
    _controller?.dispose();
    WidgetsBinding.instance.removeObserver(this);
    super.dispose();
  }



To prevent the camera plugin from throwing an exception when returning to the app with a disposed controller, we need to handle the AppLifecycleStates. We can achieve this by adding the WidgetsBindingObserver mixin to our CameraPage state class, as shown below.



class CameraPage extends StatefulWidget {
  const CameraPage({super.key});

  @override
  CameraPageState createState() => CameraPageState();
}

class CameraPageState extends State<CameraPage> with WidgetsBindingObserver {

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    final CameraController? cameraController = _controller;

    if (cameraController == null || !cameraController.value.isInitialized) {
      return;
    }

    if (state == AppLifecycleState.inactive) {
      // Free up memory when camera not active
      cameraController.dispose();
    } else if (state == AppLifecycleState.resumed) {
      // Reinitialize the camera with same properties
      onNewCameraSelected(cameraController.description);
    }
  }



By adding the WidgetsBindingObserver mixin to the CameraPage state class, we gain access to the didChangeAppLifecycleState method to handle the AppLifecycleStates. We check if the cameraController is null or has not been initialized, and if so, we exit the function. When the app is inactive, we dispose of the non-null camera controller to free up memory, and when the app is resumed, we call the onNewCameraSelected method to handle the disposal and reinitialization of the previous and new controllers.
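
Note that didChangeAppLifecycleState is only invoked if this state object is registered as an observer, and initCamera() has to be called once when the page loads; neither call appears in the snippets above. A minimal initState sketch, assuming the field and method names already defined in this class, could look like this:


@override
  void initState() {
    super.initState();
    // Register this object so didChangeAppLifecycleState gets called
    WidgetsBinding.instance.addObserver(this);
    // Discover the available cameras and initialize the first one
    initCamera();
  }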

Displaying the camera preview

After successfully initializing the camera, the next step is to display the camera preview in the application's UI. This is an important step as it allows the user to interact with the camera and view the live feed.



@override
  Widget build(BuildContext context) {
    if (_isCameraInitialized) {
      return SafeArea(
        child: Scaffold(
          body: Column(
            children: [
              CameraPreview(_controller!),
              ...
            ],
          ),
        ),
      );
    } else {
      return const Center(
        child: CircularProgressIndicator(),
      );
    }
  }



The camera plugin provides a useful CameraPreview widget that takes in our controller as an argument and renders the visual feedback coming in from our initialized camera. The _isCameraInitialized boolean flag, which is set to true or false in the onNewCameraSelected method, determines whether the camera has been initialized successfully or not. If the camera is not initialized, our UI renders a CircularProgressIndicator until the camera has been initialized.

Here's an example of how the camera preview appears on a mobile device:

Camera Preview
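
Since onNewCameraSelected accepts any CameraDescription from the _cameras list, the preview screen is also a convenient place to add a front/back camera toggle. The handler below is a hypothetical sketch (the _onSwitchCameraPressed name is not part of the original code) that simply swaps between the first two cameras reported by the device:


// Hypothetical helper: switch between the first two cameras in _cameras
void _onSwitchCameraPressed() {
  if (_cameras.length < 2) return; // only one camera available, nothing to switch to
  final current = _controller?.description;
  final next = current == _cameras.first ? _cameras[1] : _cameras.first;
  onNewCameraSelected(next);
}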

Capture photos and videos

With the camera preview successfully displayed, we can now move on to implementing the photo capture feature in our application. This involves utilizing the camera plugin's CameraController to capture images from the camera.



Future<XFile?> capturePhoto() async {
    final CameraController? cameraController = _controller;
    if (cameraController == null || cameraController.value.isTakingPicture) {
      // Controller not ready or a capture is already pending, do nothing.
      return null;
    }
    try {
      await cameraController.setFlashMode(FlashMode.off); // optional
      final XFile file = await cameraController.takePicture();
      return file;
    } on CameraException catch (e) {
      debugPrint('Error occurred while taking picture: $e');
      return null;
    }
  }

  void _onTakePhotoPressed() async {
    final navigator = Navigator.of(context);
    final xFile = await capturePhoto();
    if (xFile != null && xFile.path.isNotEmpty) {
      navigator.push(
        MaterialPageRoute(
          builder: (context) => PreviewPage(
            imagePath: xFile.path,
          ),
        ),
      );
    }
  }



In order to capture photos in our application, we will make use of two helper methods. The first method will be responsible for taking the photo and saving it to a file, while the second will handle displaying the captured photo on a screen.

The capturePhoto() method starts by copying the current _controller into a local cameraController variable. We then check whether the controller is ready and not already taking a picture, exiting the method if a capture is pending. After this check, we call cameraController.takePicture() and return the resulting XFile. The entire process is wrapped in a try-catch block to handle potential exceptions.

When _onTakePhotoPressed() is called from the UI, we await the capturePhoto() method which returns an XFile. If the file is not null and not empty, we push our new screen, PreviewPage, onto the navigation stack and pass the photo as a parameter.

The code snippet below shows an example of how this method is called in the UI:



@override
  Widget build(BuildContext context) {
    if (_isCameraInitialized) {
      return SafeArea(
        child: Scaffold(
          body: Column(
            children: [
                ....
              Row(
                mainAxisAlignment: MainAxisAlignment.center,
                children: [
                  if (!_isRecording)
                    ElevatedButton(
                      onPressed: _onTakePhotoPressed,
                      style: ElevatedButton.styleFrom(
                          fixedSize: const Size(70, 70),
                          shape: const CircleBorder(),
                          backgroundColor: Colors.white),
                      child: const Icon(
                        Icons.camera_alt,
                        color: Colors.black,
                        size: 30,
                      ),
                       ....
                ],
              ),
            ],
          ),
        ),
      );
    } else {
      return const Center(
        child: CircularProgressIndicator(),
      );
    }
  }



The image below shows a sample photograph taken by the app.

Sample Photo

Now to record videos in our application, we make use of these helper methods.



Future<XFile?> captureVideo() async {
    final CameraController? cameraController = _controller;
    try {
      setState(() {
        _isRecording = true;
      });
      await cameraController?.startVideoRecording();
      // Record for a fixed 5 seconds, then stop
      await Future.delayed(const Duration(seconds: 5));
      final video = await cameraController?.stopVideoRecording();
      setState(() {
        _isRecording = false;
      });
      return video;
    } on CameraException catch (e) {
      debugPrint('Error occurred while recording video: $e');
      return null;
    }
  }

void _onRecordVideoPressed() async {
    final navigator = Navigator.of(context);
    final xFile = await captureVideo();
    if (xFile != null) {
      if (xFile.path.isNotEmpty) {
        navigator.push(
          MaterialPageRoute(
            builder: (context) => PreviewPage(
              videoPath: xFile.path,
            ),
          ),
        );
      }
    }
  }



These methods follow the same logic as the capturePhoto() function: we call cameraController?.startVideoRecording(), wait for 5 seconds, and then call cameraController?.stopVideoRecording(), saving the resulting 5-second video to an XFile and returning it.

The _onRecordVideoPressed() method takes the recorded file and passes it to our PreviewPage through the videoPath parameter.
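
The fixed five-second delay keeps the example simple, but in a real app you would more likely start and stop the recording from the UI. Here is a hedged sketch of that variant, using the same controller and flag (the _startRecording and _stopRecording names are not part of the original code):


// Hypothetical variant: explicit start/stop instead of a fixed 5-second delay
Future<void> _startRecording() async {
  final cameraController = _controller;
  if (cameraController == null || cameraController.value.isRecordingVideo) return;
  await cameraController.startVideoRecording();
  setState(() => _isRecording = true);
}

Future<XFile?> _stopRecording() async {
  final cameraController = _controller;
  if (cameraController == null || !cameraController.value.isRecordingVideo) return null;
  final video = await cameraController.stopVideoRecording();
  setState(() => _isRecording = false);
  return video;
}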

The code below shows how we call the _onRecordVideoPressed() method in the UI.



@override
  Widget build(BuildContext context) {
    if (_isCameraInitialized) {
      return SafeArea(
        child: Scaffold(
          body: Column(
            children: [
                ....
              Row(
                mainAxisAlignment: MainAxisAlignment.center,
                children: [
                  ....
                  ElevatedButton(
                    onPressed: _isRecording ? null : _onRecordVideoPressed,
                    style: ElevatedButton.styleFrom(
                        fixedSize: const Size(70, 70),
                        shape: const CircleBorder(),
                        backgroundColor: Colors.white),
                    child: Icon(
                      _isRecording ? Icons.stop : Icons.videocam,
                      color: Colors.red,
                    ),
                  ),
                ],
              ),
            ],
          ),
        ),
      );
    } else {
      return const Center(
        child: CircularProgressIndicator(),
      );
    }
  }



The link below shows a successfully recorded video.

Video example

The PreviewPage Widget Class

The PreviewPage widget class is responsible for rendering the captured photos and recorded videos. Let's take a closer look at how this widget handles the display of these media assets.



import 'dart:io'; // needed for the File class used below

class PreviewPage extends StatefulWidget {
  final String? imagePath;
  final String? videoPath;

  const PreviewPage({Key? key, this.imagePath, this.videoPath})
      : super(key: key);

  @override
  State<PreviewPage> createState() => _PreviewPageState();
}

class _PreviewPageState extends State<PreviewPage> {
  VideoPlayerController? controller;

  Future<void> _startVideoPlayer() async {
    if (widget.videoPath != null) {
      controller = VideoPlayerController.file(File(widget.videoPath!));
      await controller!.initialize().then((_) {
        // Ensure the first frame is shown after the video is initialized
        setState(() {});
      });
      await controller!.setLooping(true);
      await controller!.play();
    }
  }

  @override
  void initState() {
    super.initState();
    if (widget.videoPath != null) {
      _startVideoPlayer();
    }
  }

  @override
  void dispose() {
    controller?.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(
        child: widget.imagePath != null
            ? Image.file(
                File(widget.imagePath ?? ""),
                fit: BoxFit.cover,
              )
            : AspectRatio(
                aspectRatio: controller!.value.aspectRatio,
                child: VideoPlayer(controller!),
              ),
      ),
    );
  }
}



To display images in the PreviewPage widget, the imagePath parameter is passed to an Image.file widget. However, displaying recorded videos requires the use of the video_player package that was added to the pubspec.yaml file.

This package provides access to the VideoPlayerController and VideoPlayer classes, with the VideoPlayerController class being responsible for initializing the video file.

The _startVideoPlayer() method checks that videoPath is not null, assigns a new VideoPlayerController.file(File(widget.videoPath!)) to the controller, and then initializes it.

Upon successful initialization of the video file, playback is started by calling the controller.play() function, and a continuous playback loop is enabled by calling controller.setLooping(true).

With all these steps complete, the camera app is fully functional.

Conclusion

In this article, we've explored how to build a camera app with Flutter using the camera package. We've seen how to initialize the camera, display the camera preview, and capture photos and videos. The camera package provides a lot of functionality and customization options, making it a great choice for building camera apps with Flutter.

The camera package provides a lot of other functionality, such as adjusting camera settings and exposure levels, so be sure to check out the documentation and this article by Souvik Biswas for more information.
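
For illustration, here is a hedged sketch of what a few of those controls look like on the same _controller used throughout this article (the _applyCameraSettings helper is hypothetical, and the values must stay within the ranges the device reports):


// Hypothetical example of adjusting a few camera settings at runtime
Future<void> _applyCameraSettings() async {
  final cameraController = _controller;
  if (cameraController == null || !cameraController.value.isInitialized) return;

  // Let the device decide when to fire the flash
  await cameraController.setFlashMode(FlashMode.auto);

  // Zoom in, but never beyond what the hardware supports
  final maxZoom = await cameraController.getMaxZoomLevel();
  await cameraController.setZoomLevel(maxZoom < 2.0 ? maxZoom : 2.0);

  // Brighten the exposure slightly, clamped to the device's maximum offset
  final maxExposure = await cameraController.getMaxExposureOffset();
  await cameraController.setExposureOffset(maxExposure < 0.5 ? maxExposure : 0.5);
}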

Here is the GitHub repo with the full code from this article.
