Check out the YouTube video for this post 😊
Using PyTorch on mobile is a great idea because it lets users run the models we create without connecting to a server.
We need to be careful though, because our models can sometimes be really big, which is bad for the user, so doing the processing on a server can still be the better choice.
That said, if the model runs fast enough, having it locally is great for the user.
So here's the overview of how to use PyTorch in Flutter: we use a method channel to run native code, in this case Java or Kotlin.
- Create a method channel to run native Java code.
- Send the data over the method channel; for an image, a `Uint8List`.
- Receive the data on the native side and parse it; for an image we use
`BitmapFactory.decodeByteArray(byteList, boffset, blength);`
- Run the forward method of the module. Keep in mind that the model needs to output a tensor, or something that you really understand and can parse easily in Java. Sometimes the model outputs a map; in that case the output is under the "out" key of the map.
- Send the data back to the frontend and display it. You do this with
`result.success(data_to_send);`
Then keep in mind the type conversion from Java to Dart (or whichever language your frontend uses).
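Putting the steps above together, the native side can look roughly like this. This is a sketch, not code from the video: the channel name `ramgendeploy/model`, the method name `runModel`, the model filename, and the torchvision normalization constants are all my assumptions.

```java
// MainActivity.java — minimal sketch of the method-channel pipeline.
// Assumed: channel "ramgendeploy/model", method "runModel", model file "model.pt".
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;
import org.pytorch.IValue;
import org.pytorch.Module;
import org.pytorch.Tensor;
import org.pytorch.torchvision.TensorImageUtils;

public class MainActivity extends FlutterActivity {
    private static final String CHANNEL = "ramgendeploy/model"; // assumed name
    private Module module;

    @Override
    public void configureFlutterEngine(FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);
        // Load the TorchScript model once; the path is an assumption.
        module = Module.load(getFilesDir() + "/model.pt");

        new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
            .setMethodCallHandler((call, result) -> {
                if (call.method.equals("runModel")) {
                    // The Uint8List sent from Dart arrives here as a byte[].
                    byte[] bytes = call.argument("image");
                    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);

                    // Convert the bitmap to a float tensor with torchvision normalization.
                    Tensor input = TensorImageUtils.bitmapToFloat32Tensor(bitmap,
                            TensorImageUtils.TORCHVISION_NORM_MEAN_RGB,
                            TensorImageUtils.TORCHVISION_NORM_STD_RGB);

                    IValue output = module.forward(IValue.from(input));
                    // Some models return a dict; in that case take the "out" entry.
                    Tensor outTensor = output.isDictStringKey()
                            ? output.toDictStringKey().get("out").toTensor()
                            : output.toTensor();

                    // Type conversion: StandardMessageCodec maps double[] to Dart's
                    // Float64List, so widen the float output before sending it back.
                    float[] scores = outTensor.getDataAsFloatArray();
                    double[] payload = new double[scores.length];
                    for (int i = 0; i < scores.length; i++) payload[i] = scores[i];
                    result.success(payload);
                } else {
                    result.notImplemented();
                }
            });
    }
}
```

From Dart you would then call something like `MethodChannel('ramgendeploy/model').invokeMethod('runModel', {'image': imageBytes})` and await the result.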
If you liked this, follow for more, and consider subscribing to the YT channel ramgendeploy 😁
I'm going to do a more in-depth video about this, exploring how to run a segmentation model.
One drawback of using Flutter with the PyTorch Java implementation: since Flutter runs in a single thread, the app halts while the model runs in the background, and since you can't use method channels from other isolates, this becomes a big problem that I haven't found a good solution for yet.
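This doesn't fix the isolate limitation, but one partial mitigation worth trying is to at least keep the platform main thread free: method-call handlers run on the Android main thread, so heavy work there blocks the UI, while a Java-side worker thread can run the model and deliver the result through a callback. A minimal pure-Java sketch of the pattern, with the model call simulated by a placeholder (on Android you'd call `module.forward(...)` inside `runModel` and post the result back to the main thread before calling `result.success`):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Consumer;

public class BackgroundInference {
    // One worker thread dedicated to inference, so calls run in order.
    private static final ExecutorService executor = Executors.newSingleThreadExecutor();

    // Placeholder standing in for module.forward(...): returns a single float
    // equal to the number of input bytes, just so the pattern is runnable.
    static float[] runModel(byte[] imageBytes) {
        return new float[] { imageBytes.length };
    }

    // Hand the heavy work to the worker thread; the callback fires when
    // inference finishes, so the calling thread is never blocked.
    static void infer(byte[] imageBytes, Consumer<float[]> onResult) {
        executor.submit(() -> onResult.accept(runModel(imageBytes)));
    }
}
```

The calling thread returns immediately, which is the point: the expensive call never runs on the thread that services the UI.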
So in production you either go fully native Android with Java/Kotlin, or suffer the consequences 😂