Ryan Miller

Google Launches ML Kit for Android and iOS Developers


With the recent launch of Google's ML Kit beta, the software development kit is expected to make it far easier to deploy machine learning in mobile apps through Firebase. The kit is available for both Android and iOS, and its APIs can run either on-device or in the cloud.

Adding machine learning to a mobile app enables a range of features, such as extracting nutrition information from a product label or applying style transfers, masks, and effects to a photo. The kit is designed to be usable by beginners as well as advanced developers. Out of the box it provides APIs for text recognition, barcode scanning, image labelling, landmark recognition, and face detection, all available through the Firebase console, and the on-device APIs do not require a network connection to work. Many of the top Android and iOS app developers are convinced that this will give a real boost to app performance.
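To give a feel for how small the integration surface is, here is a minimal Kotlin sketch of on-device text recognition using the Firebase ML Kit Vision SDK (`firebase-ml-vision`). The dependency version and the exact class and method names changed slightly across SDK releases, so treat them as assumptions rather than a definitive implementation.

```kotlin
// build.gradle (app module), version number assumed from the launch-era SDK:
// implementation 'com.google.firebase:firebase-ml-vision:16.0.0'
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun recognizeTextOnDevice(bitmap: Bitmap) {
    // Wrap the bitmap in the container type ML Kit expects.
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // The on-device recognizer works without a network connection.
    val detector = FirebaseVision.getInstance().onDeviceTextRecognizer

    detector.processImage(image)
        .addOnSuccessListener { result ->
            // result.text holds the full recognized string.
            println("Recognized: ${result.text}")
        }
        .addOnFailureListener { e ->
            println("Text recognition failed: $e")
        }
}
```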

One of the most notable points is that developers can use the on-device APIs offline, free of charge. The models that run on the device are smaller, however, and come with lower accuracy. In the cloud, neither model size nor available compute power is a constraint, so the cloud-based models are larger and more accurate.
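In practice the trade-off comes down to which detector you ask for. The hypothetical helper below shows the same text recognition task pointed at either backend, assuming the `FirebaseVision` entry point from the `firebase-ml-vision` SDK; the cloud recognizer needs a network connection but is generally more accurate.

```kotlin
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.text.FirebaseVisionTextRecognizer

// Hypothetical helper: pick between the smaller on-device model and the
// larger, more accurate cloud-based one at call time.
fun textRecognizer(useCloud: Boolean): FirebaseVisionTextRecognizer =
    if (useCloud) {
        // Requires a network connection; model size and compute are not a constraint.
        FirebaseVision.getInstance().cloudTextRecognizer
    } else {
        // Works offline and free of charge, but with lower accuracy.
        FirebaseVision.getInstance().onDeviceTextRecognizer
    }

fun recognize(image: FirebaseVisionImage, online: Boolean) {
    textRecognizer(useCloud = online)
        .processImage(image)
        .addOnSuccessListener { println(it.text) }
        .addOnFailureListener { println("Recognition failed: $it") }
}
```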

Machine learning demands significant processing power, and Google is ready to take the plunge. The newly launched interface, used with Firebase, is meant to make machine learning easier for mobile developers. Google Cloud already offers a large number of pre-trained and customisable APIs, but those do not work offline and are not tightly integrated with Firebase or the Firebase console, which is now Google's main platform for mobile development.

If the ML Kit APIs do not cover your use case, you can always bring an existing TensorFlow Lite model and use it with the kit. You simply upload the model to Firebase, and hosting and serving it to your app is taken care of for you. ML Kit also acts as an API layer to the custom model, making it easier to run and use on the device.
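The custom-model flow is roughly: register the Firebase-hosted model, create an interpreter for it, describe the input and output tensors, and run inference. The sketch below follows the launch-era custom model API (`FirebaseModelInterpreter` and related classes); the model name, tensor shapes, package paths, and builder methods are assumptions, and this API surface was renamed in later SDK releases.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelDataType
import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions
import com.google.firebase.ml.custom.FirebaseModelInputs
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource

fun runCustomModel(input: Array<Array<Array<FloatArray>>>) {
    // 1. Point ML Kit at the TensorFlow Lite model uploaded to Firebase.
    //    "my_classifier" is a placeholder for the name chosen in the console.
    val cloudSource = FirebaseCloudModelSource.Builder("my_classifier")
        .enableModelUpdates(true)
        .build()
    FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

    // 2. Create an interpreter bound to that hosted model.
    val options = FirebaseModelOptions.Builder()
        .setCloudModelName("my_classifier")
        .build()
    val interpreter = FirebaseModelInterpreter.getInstance(options)

    // 3. Describe the model's tensors (example shapes for a 224x224 RGB classifier).
    val ioOptions = FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 1000))
        .build()

    // 4. Run inference; the result arrives asynchronously as a Task.
    val inputs = FirebaseModelInputs.Builder().add(input).build()
    interpreter?.run(inputs, ioOptions)
        ?.addOnSuccessListener { result ->
            val scores = result.getOutput<Array<FloatArray>>(0)
            println("Top score: ${scores[0].maxOrNull()}")
        }
        ?.addOnFailureListener { e -> println("Inference failed: $e") }
}
```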

Because ML Kit is part of the Firebase platform, you can use it to build high-quality, functional apps, grow your user base, and increase your revenue. Each Firebase feature works independently, and they work even better together.

Google has also announced that in the coming months the current set of APIs will be extended, adding the same kind of smart replies found in applications like Gmail as well as a high-density face contour feature for the face detection API.

Conclusion

Overall, the ML Kit launched by Google opens up new possibilities for app functionality and usage. Google is trying to change how people approach mobile apps, giving developers tools that deliver better performance and higher-quality features.
