A last-mile delivery app is a software application used to manage, coordinate, and track the delivery of goods from a transportation hub to their final destination, which is usually a personal residence. This is often the most complex part of the entire delivery process because it involves navigating residential areas, dealing with traffic, ensuring the safe and timely delivery of the product, and confirming successful delivery. The demand for last-mile delivery apps has increased significantly with the rise of e-commerce and the growing expectation of fast and efficient home deliveries. In this article, we'll show you how to build a prototype for a last-mile delivery app using Flutter and Dynamsoft Vision SDKs (Dynamsoft Barcode Reader, Dynamsoft Label Recognizer, and Dynamsoft Document Normalizer). With this prototype, you can experiment with the app's design and features and use it as a starting point for developing your own last-mile delivery app.
Why Flutter and Dynamsoft Vision SDKs?
- Flutter: Our goal is to build an app for desktop, mobile, and web. Flutter is a cross-platform UI toolkit that lets you build apps for multiple platforms from a single codebase, so only Dart code is needed to implement the UI everywhere. Flutter also has a large developer community and a wide range of third-party packages that can add extra functionality to your app.
- Dynamsoft Vision SDKs: Dynamsoft Vision SDKs are a set of software development kits that provide APIs for barcode scanning, MRZ recognition, and document processing. They are available for Windows, Linux, macOS, Android, iOS, and web platforms. The Flutter plugins for Dynamsoft Vision SDKs include flutter_barcode_sdk, flutter_ocr_sdk, and flutter_document_scan_sdk. They allow you to easily integrate Dynamsoft Vision SDKs into your Flutter apps.
App Design and Workflow
Check out the app design by clicking the link below.
https://xd.adobe.com/view/7bbceea3-74e8-4013-80bc-0565dea8cc52-2eef/
The basic workflow of the app is as follows:
Launch the App: Start the application on your device. This will take you directly to the sign-up page.
Sign Up or Sign In: If you are a new user, create a new account by signing up. If you are an existing user, sign in to your account.
Profile Verification: Once you have signed up or signed in, you will be directed to the profile page. At this point, your profile is not yet verified. To verify your profile, click a button to open the camera.
Scan License or Passport: Use the camera to scan your driver's license or passport. This will provide the necessary personal information for profile verification.
Profile Verification Process: After scanning your license or passport, your profile will go through a verification process.
Navigate to Order Page: Once your profile is verified, you will be directed to the order page. Here, you can see the orders assigned to you.
Scan Order Barcode: To get the information of an order, scan the barcode of the order.
Scan Document and Deliver Order: Scan the necessary document for the order and click a button to deliver the order.
Return to Order Page: After delivering the order, you will be directed back to the order page where you can continue with the next order.
Developing the Core Features
In the subsequent sections, we will discuss how to develop the core features of the app, including camera integration, barcode scanning, MRZ recognition, document scanning, and data storage management.
How to Fetch Camera Streaming Images and Construct Camera Preview Widgets
We use camera plugins to obtain camera stream images, which are essential for barcode scanning, MRZ recognition, and document scanning. The official camera plugin offers a startImageStream() method that fetches the camera stream on both Android and iOS. For web applications, its takePicture() method can be utilized to continuously capture images, which are of blob type. The camera_windows plugin is currently under development and does not yet support image streaming. However, a modified version of the Windows camera plugin that does support image streaming is available at https://github.com/yushulx/flutter_camera_windows.git. Consequently, the pubspec.yaml file should be updated as follows:
dependencies:
  camera: ^0.10.5+2
  camera_windows:
    git:
      url: https://github.com/yushulx/flutter_camera_windows.git
The camera code applicable to various platforms can be consolidated into a single file:
@override
void initState() {
  super.initState();
  initCamera();
}

Future<void> initCamera() async {
  try {
    WidgetsFlutterBinding.ensureInitialized();
    // Enumerate the available cameras and start the first one.
    _cameras = await availableCameras();
    if (_cameras.isEmpty) return;
    toggleCamera(0);
  } on CameraException catch (e) {
    print(e);
  }
}
Future<void> toggleCamera(int index) async {
  if (controller != null) controller!.dispose();
  controller = CameraController(_cameras[index], ResolutionPreset.medium);
  controller!.initialize().then((_) {
    if (!cbIsMounted()) {
      return;
    }
    previewSize = controller!.value.previewSize;
    startVideo();
  }).catchError((Object e) {
    if (e is CameraException) {
      switch (e.code) {
        case 'CameraAccessDenied':
          break;
        default:
          break;
      }
    }
  });
}
Future<void> startVideo() async {
  if (kIsWeb) {
    webCamera();
  } else if (Platform.isAndroid || Platform.isIOS) {
    mobileCamera();
  } else if (Platform.isWindows) {
    _frameAvailableStreamSubscription?.cancel();
    _frameAvailableStreamSubscription =
        (CameraPlatform.instance as CameraWindows)
            .onFrameAvailable(controller!.cameraId)
            .listen(_onFrameAvailable);
  }
}
// Web: repeatedly capture still images with takePicture().
Future<void> webCamera() async {
  if (controller == null || isFinished) return;
  XFile file = await controller!.takePicture();
  // TODO: process the captured image (see the SDK calls in the next section)
  if (!isFinished) {
    webCamera();
  }
}

// Mobile: receive frames continuously via startImageStream().
Future<void> mobileCamera() async {
  await controller!.startImageStream((CameraImage availableImage) async {
    // TODO: process the camera frame (see the SDK calls in the next section)
  });
}

// Windows: receive frames via the onFrameAvailable event stream.
void _onFrameAvailable(FrameAvailabledEvent event) {
  // TODO: process the frame buffer (see the SDK calls in the next section)
}
When constructing the camera preview widget, if the camera preview appears mirrored, the Transform widget can be employed to horizontally flip the preview.
Widget getPreview() {
  if (kIsWeb) {
    return Transform(
      alignment: Alignment.center,
      transform: Matrix4.identity()..scale(-1.0, 1.0), // Flip horizontally
      child: CameraPreview(controller!),
    );
  }

  return CameraPreview(controller!);
}
In order to render the camera preview and overlay in full screen, we utilize a combination of Stack, Positioned, FittedBox, and SizedBox widgets.
Stack(
  children: <Widget>[
    if (_mobileCamera.controller != null &&
        _mobileCamera.previewSize != null)
      Positioned(
        top: 0,
        right: 0,
        left: 0,
        bottom: 0,
        child: FittedBox(
          fit: BoxFit.cover,
          child: Stack(
            children: createCameraPreview(),
          ),
        ),
      ),
  ],
),
List<Widget> createCameraPreview() {
  if (_mobileCamera.controller != null && _mobileCamera.previewSize != null) {
    return [
      SizedBox(
          width: MediaQuery.of(context).size.width <
                  MediaQuery.of(context).size.height
              ? _mobileCamera.previewSize!.height
              : _mobileCamera.previewSize!.width,
          height: MediaQuery.of(context).size.width <
                  MediaQuery.of(context).size.height
              ? _mobileCamera.previewSize!.width
              : _mobileCamera.previewSize!.height,
          child: _mobileCamera.getPreview()),
      Positioned(
        top: 0.0,
        right: 0.0,
        bottom: 0,
        left: 0.0,
        child: createOverlay(_mobileCamera.barcodeResults,
            _mobileCamera.mrzLines, _mobileCamera.documentResults),
      )
    ];
  } else {
    return [const CircularProgressIndicator()];
  }
}
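The createOverlay() function referenced above is not included in this snippet. Below is a minimal sketch of what it could look like, assuming package:flutter/material.dart is imported and that BarcodeResult from flutter_barcode_sdk exposes the decoded text plus corner coordinates x1..y4 (as in the plugin's sample code); the MRZ lines and document corners could be drawn in the same fashion.

// A minimal sketch of an overlay, not the article's exact implementation.
// Assumes BarcodeResult exposes `text` and integer corners x1..y4.
Widget createOverlay(List<BarcodeResult>? barcodeResults, dynamic mrzLines,
    dynamic documentResults) {
  return CustomPaint(
    painter: OverlayPainter(barcodeResults),
  );
}

class OverlayPainter extends CustomPainter {
  OverlayPainter(this.barcodeResults);

  final List<BarcodeResult>? barcodeResults;

  @override
  void paint(Canvas canvas, Size size) {
    final Paint paint = Paint()
      ..style = PaintingStyle.stroke
      ..strokeWidth = 2
      ..color = Colors.green;

    for (final BarcodeResult result in barcodeResults ?? []) {
      // Outline the detected barcode with its quadrilateral.
      canvas.drawPath(
          Path()
            ..moveTo(result.x1.toDouble(), result.y1.toDouble())
            ..lineTo(result.x2.toDouble(), result.y2.toDouble())
            ..lineTo(result.x3.toDouble(), result.y3.toDouble())
            ..lineTo(result.x4.toDouble(), result.y4.toDouble())
            ..close(),
          paint);

      // Label it with the decoded text near the first corner.
      final TextPainter textPainter = TextPainter(
        text: TextSpan(
            text: result.text,
            style: const TextStyle(color: Colors.green, fontSize: 14)),
        textDirection: TextDirection.ltr,
      )..layout();
      textPainter.paint(
          canvas, Offset(result.x1.toDouble(), result.y1.toDouble() - 20));
    }
  }

  @override
  bool shouldRepaint(OverlayPainter oldDelegate) => true;
}

The painter simply outlines each detected barcode and labels it with the decoded text; hooking up the MRZ and document results follows the same pattern.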
How to Integrate Dynamsoft Vision SDKs into Flutter Apps
- Add the following dependencies to the pubspec.yaml file:
flutter_barcode_sdk: ^2.2.2
flutter_document_scan_sdk: ^1.0.2
flutter_ocr_sdk: ^1.1.0
- Apply for a trial license for Dynamsoft Vision SDKs at https://www.dynamsoft.com/customer/license/trialLicense.
- Initialize the SDKs with the license keys:
FlutterBarcodeSdk barcodeReader = FlutterBarcodeSdk();
FlutterOcrSdk mrzDetector = FlutterOcrSdk();
FlutterDocumentScanSdk docScanner = FlutterDocumentScanSdk();

Future<void> initBarcodeSDK() async {
  await barcodeReader.setLicense('LICENSE-KEY');
  await barcodeReader.init();
  await barcodeReader.setBarcodeFormats(BarcodeFormat.ALL);
}

Future<void> initMRZSDK() async {
  await mrzDetector.init("LICENSE-KEY");
  await mrzDetector.loadModel();
}

Future<void> initDocumentSDK() async {
  await docScanner.init('LICENSE-KEY');
  await docScanner.setParameters(Template.color);
}
- Invoke the methods for barcode scanning, MRZ recognition, and document scanning across web, mobile, and Windows platforms.
Web
XFile file = await controller!.takePicture();

// Barcode Scanning
var results = await barcodeReader.decodeFile(file.path);

// MRZ Recognition
var results = await mrzDetector.recognizeByFile(file.path);

// Document Scanning
var results = await docScanner.detectFile(file.path);
Mobile
int format = ImagePixelFormat.IPF_NV21.index;

switch (availableImage.format.group) {
  case ImageFormatGroup.yuv420:
    format = ImagePixelFormat.IPF_NV21.index;
    break;
  case ImageFormatGroup.bgra8888:
    format = ImagePixelFormat.IPF_ARGB_8888.index;
    break;
  default:
    format = ImagePixelFormat.IPF_RGB_888.index;
}

// Barcode Scanning
var results = await barcodeReader.decodeImageBuffer(
    availableImage.planes[0].bytes,
    availableImage.width,
    availableImage.height,
    availableImage.planes[0].bytesPerRow,
    format);

// MRZ Recognition
var results = await mrzDetector.recognizeByBuffer(
    availableImage.planes[0].bytes,
    availableImage.width,
    availableImage.height,
    availableImage.planes[0].bytesPerRow,
    format);

// Document Scanning
var results = await docScanner.detectBuffer(
    availableImage.planes[0].bytes,
    availableImage.width,
    availableImage.height,
    availableImage.planes[0].bytesPerRow,
    format);
Windows
Map<String, dynamic> map = event.toJson();
final Uint8List? data = map['bytes'] as Uint8List?;
if (data != null) {
  int width = previewSize!.width.toInt();
  int height = previewSize!.height.toInt();

  // Barcode Scanning
  var results = await barcodeReader.decodeImageBuffer(
      data, width, height, width * 4, ImagePixelFormat.IPF_ARGB_8888.index);

  // MRZ Recognition
  var results = await mrzDetector.recognizeByBuffer(
      data, width, height, width * 4, ImagePixelFormat.IPF_ARGB_8888.index);

  // Document Scanning
  var results = await docScanner.detectBuffer(
      data, width, height, width * 4, ImagePixelFormat.IPF_ARGB_8888.index);
}
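Whichever platform path produces the results, they ultimately need to reach the overlay built in createCameraPreview(). A minimal sketch, assuming the camera helper holds the barcodeResults field read by createOverlay() and exposes a cbRefreshUi callback (an illustrative name, not from the article's code) that triggers a rebuild:

// Illustrative sketch: cache the latest results and ask the page to repaint
// so the overlay reflects them. cbRefreshUi is a hypothetical callback that
// ends up calling setState(() {}) in the hosting widget.
void onBarcodeResults(List<BarcodeResult>? results) {
  barcodeResults = results; // consumed by createOverlay()
  cbRefreshUi();
}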
How to Write and Read Data to Local Storage in Flutter
To emulate the sign-up and sign-in process, we use the shared_preferences plugin, which stores simple data as key-value pairs on the device and supports Android, iOS, macOS, Linux, Windows, and web. The following code snippet shows how to store and retrieve user information:
class ProfileData {
  String? firstName;
  String? lastName;
  String? email;
  String? password;
  bool? verified;
  String? nationality;
  String? idNumber;

  ProfileData({
    this.firstName,
    this.lastName,
    this.email,
    this.password,
    this.verified,
    this.nationality,
    this.idNumber,
  });
}
// Retrieve user information
SharedPreferences prefs = await SharedPreferences.getInstance();
bool verified = prefs.getBool('verified') ?? false;
String email = prefs.getString('email') ?? '';
ProfileData data = ProfileData(
    email: email,
    firstName: prefs.getString('firstName') ?? '',
    lastName: prefs.getString('lastName') ?? '',
    password: prefs.getString('password') ?? '',
    verified: verified);

// Route the user based on the stored state: verified users go straight to
// the order page, unknown users to the sign-up page, and unverified users
// to the profile page.
if (verified) {
  route = MaterialPageRoute(builder: (context) => const OrderPage());
} else {
  if (email.isEmpty) {
    route = MaterialPageRoute(builder: (context) => const MyHomePage());
  } else {
    route = MaterialPageRoute(builder: (context) => const ProfilePage());
  }
}
// Write user information
Future<void> saveData() async {
  SharedPreferences prefs = await SharedPreferences.getInstance();
  await prefs.setString('firstName', data.firstName ?? '');
  await prefs.setString('lastName', data.lastName ?? '');
  await prefs.setString('email', data.email ?? '');
  await prefs.setString('password', data.password ?? '');
}

MaterialButton(
  onPressed: () {
    saveData();
  },
  color: Colors.black,
  child: const Text(
    'Sign Up',
    style: TextStyle(
      color: Colors.white,
    ),
  ),
)
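Profile verification can be persisted with the same plugin. Here is a minimal sketch, assuming the nationality and document number have already been extracted from the MRZ recognition result (the helper below is illustrative, not part of the article's code):

// Illustrative sketch: after a successful MRZ scan, mark the profile as
// verified and persist the extracted values so the app routes to OrderPage
// on the next launch.
Future<void> markProfileVerified(String nationality, String idNumber) async {
  SharedPreferences prefs = await SharedPreferences.getInstance();
  await prefs.setBool('verified', true);
  await prefs.setString('nationality', nationality);
  await prefs.setString('idNumber', idNumber);
}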
Try Online Demo
https://yushulx.me/flutter-last-mile-delivery/