Training an Emotion Recognition Model in a Jupyter Notebook and Converting It into a Flask API Project Without Additional Code Using Cuttle
If you use Jupyter notebooks for development and testing, you're probably used to copying and pasting code around all the time.
Deleting test snippets, switching to a different IDE to compile and test your code again, and building an API are just a few of the time-consuming tasks an ML developer is used to.
Wouldn’t it be great if we could automate the drudgery of boilerplate code needed to build an API project? Here we will train an emotion recognition model in a Jupyter Notebook and convert it into a Flask API project with absolutely no extra code!
Dependencies
```bash
pip install tensorflow==2.5.0 Keras==2.4.3 numpy==1.19.5 opencv-python==4.4.0.44
```
We also need to install Cuttle to convert our ML notebook into a Flask API project.
```bash
pip install cuttle
```
Dataset
You can use any emotion recognition dataset or create your own. In this article, we’re using the emotion recognition dataset available on Kaggle. This dataset contains images of 7 classes: angry, disgust, fear, happy, neutral, sad, and surprise. If you want to add another class, just create a new directory and add images belonging to that class.
Create the notebook in the same directory as the downloaded dataset, which contains the ‘train’ and ‘validation’ folders.
Imports
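Based on the dependencies installed above, the imports needed for the rest of the notebook look roughly like this (TensorFlow/Keras for the model, OpenCV and NumPy for image handling):

```python
import numpy as np
import cv2
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from tensorflow.keras.preprocessing.image import ImageDataGenerator
```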
Dataset Augmentation
The ImageDataGenerator lets you fit models with image augmentation, applying transformations to each training image, and also accepts any pre-processing functions you want to pass.
We rescale the images because pixel values range from 0 to 255, and rescaling maps each pixel into the range [0, 1]. Here, we define the target size for our model as 48x48 and the batch_size as 64.
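A minimal sketch of the two generators, assuming the ‘train’ and ‘validation’ folders sit next to the notebook and the images are read in grayscale:

```python
# Rescale pixel values from [0, 255] down to [0, 1]; more augmentations can be added here.
train_datagen = ImageDataGenerator(rescale=1./255)
val_datagen = ImageDataGenerator(rescale=1./255)

# One sub-directory per emotion class; images resized to 48x48 and batched in 64s.
train_generator = train_datagen.flow_from_directory(
    'train',
    target_size=(48, 48),
    color_mode='grayscale',
    batch_size=64,
    class_mode='categorical')

validation_generator = val_datagen.flow_from_directory(
    'validation',
    target_size=(48, 48),
    color_mode='grayscale',
    batch_size=64,
    class_mode='categorical')
```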
Model Training
We’re defining a CNN here but you can choose to use any of the pre-trained models available like VGG16 or ResNet, or make use of transfer learning as well.
The model is saved to the given path once training completes.
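As a rough illustration only (your architecture can differ), a small CNN for the 7 classes could be built, trained, and saved like this; the file name ‘emotion_model.h5’ is just a placeholder:

```python
# A small CNN sketch for 7 emotion classes on 48x48 grayscale inputs.
model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(48, 48, 1)),
    MaxPooling2D(2, 2),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D(2, 2),
    Flatten(),
    Dense(128, activation='relu'),
    Dropout(0.5),
    Dense(7, activation='softmax')   # one output per emotion class
])

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

model.fit(train_generator, validation_data=validation_generator, epochs=30)

# Save the trained weights so later cells (and the API) can load them without re-training.
model.save('emotion_model.h5')
```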
Emotion Recognition
We now load the model, give it an image, and test it. In this step, let’s introduce Cuttle and turn the resulting prediction cell into an API that any frontend application can hit.
Let’s use a test image ‘test.jpeg’.
Source: Unsplash
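A minimal inference sketch, assuming the model was saved as ‘emotion_model.h5’ and the class order matches the alphabetical ordering that flow_from_directory assigns:

```python
# Class indices as assigned alphabetically by flow_from_directory.
class_labels = ['angry', 'disgust', 'fear', 'happy', 'neutral', 'sad', 'surprise']

model = load_model('emotion_model.h5')

# Read the test image in grayscale, resize to 48x48 and scale to [0, 1].
file = 'test.jpeg'
img = cv2.imread(file, cv2.IMREAD_GRAYSCALE)
img = cv2.resize(img, (48, 48)) / 255.0
img = img.reshape(1, 48, 48, 1)

prediction = model.predict(img)
output = class_labels[int(np.argmax(prediction))]
print(output)
```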
The output we get for this from the model prediction is ‘fear’. Continue testing with a few more images till you’re satisfied with the model’s accuracy and results. Let’s look at how we can convert this into an API project.
Initialize Cuttle
Initialize cuttle with ‘cuttle init’ as follows and enter the name of the notebook being used.
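Run this from the notebook’s directory; Cuttle then prompts for the notebook’s file name:

```bash
cuttle init
```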
Create Cuttle environment
In this step, you name the environment, specify the platform you’re using, and pick the transformer you want; we want the flask transformer in this scenario. You can choose any environment name you like; I’ve used ‘emotion-rec’ as an example.
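Assuming the environment is created with the cuttle create subcommand (check cuttle --help if your version differs), it prompts for exactly these three things:

```bash
# Prompts for the environment name (emotion-rec here), the platform, and the transformer (flask).
cuttle create
```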
At this point, your cuttle.json should look something like this:
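As an illustration only (the exact schema may vary across Cuttle versions, and the notebook name below is a placeholder), the file ties the notebook to an ‘emotion-rec’ environment that uses the flask transformer:

```json
{
  "name": "emotion-recognition.ipynb",
  "environments": {
    "emotion-rec": {
      "platform": "jupyter",
      "transformer": "flask"
    }
  }
}
```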
After you’re done creating the config, it’s time to edit your Jupyter notebook to include cuttle config. We only need to edit the last portion of the code to define the API route and set the output config.
Adding Config to cells:
We use the Cuttle config to perform two main actions here:
- Disable the training cells and load the saved model file, so we don’t re-train every time we want to run the script.
- Specify the cell to be executed every time the API is called, along with the required parameters.
To disable cells, add this config at the beginning of the cell.
```python
#cuttle-environment-disable emotion-rec
```
Now add config to transform your script into an API as follows:
We’re setting two configs: cuttle-environment-set-config and cuttle-environment-assign, which are cell-scoped and line-scoped respectively. The cell-scoped config sets the configuration a cell needs during transformation; the line-scoped config lets us configure individual variables.
In the first line, we set the cell-scoped config, specifying the environment name, the HTTP method, and the route of our choice. We also configure the ‘file’ variable, which gets its value from the incoming request, and the ‘output’ variable, which becomes our response, as sketched below.
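Putting that together, the prediction cell might look roughly like this. Treat the config lines as a sketch and check the exact syntax (argument order, request field names) against the Cuttle docs; the /predict route is just an example:

```python
#cuttle-environment-set-config emotion-rec method=POST route=/predict

# In the notebook, 'file' is a local path; per the config, the transformed API reads it from the request instead.
file = 'test.jpeg'  #cuttle-environment-get-config emotion-rec request.files['file']

model = load_model('emotion_model.h5')
img = cv2.imread(file, cv2.IMREAD_GRAYSCALE)
img = cv2.resize(img, (48, 48)) / 255.0
img = img.reshape(1, 48, 48, 1)

prediction = model.predict(img)
output = class_labels[int(np.argmax(prediction))]  #cuttle-environment-assign emotion-rec response
```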
Transform
Our last step is to run cuttle transform with the environment name we’ve been using:
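Assuming the transform subcommand takes the environment name directly:

```bash
cuttle transform emotion-rec
```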
After this step, you should see an output directory with a sub-directory named after the environment. Inside it you’ll find your code transformed into a Flask API. Run this file as follows:
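The generated file is named after your notebook, so the exact name will differ; assuming a notebook called emotion-recognition.ipynb, it would look like:

```bash
cd output/emotion-rec
python emotion-recognition.py
```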
Our code now runs on localhost port 5000 by default. Test it by sending images via Postman and checking the response.
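If you prefer the command line to Postman, an equivalent request (using the example /predict route from the earlier sketch) is:

```bash
curl -X POST -F "file=@test.jpeg" http://localhost:5000/predict
```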
test.jpeg sent via a POST request to our Flask API. Source: Author
Your API project is now ready to deploy! Cuttle also allows you to convert your notebook into a script or a pipeline.
Resources
Cuttle Website: cuttle.it
Github: Emotion recognition API source
Cuttle Source: Cuttle CLI Source
You can also find us on Twitter @ https://twitter.com/cuttlehq