
Aditya Raman

Seasons of Serverless, Lovely Ladoos

This article is part of #SeasonsOfServerless.

Each week we will publish a challenge created by Azure Advocates with amazing Student Ambassadors around the world. Discover popular festive recipes and learn how Microsoft Azure empowers you to do more with Serverless Functions! 🍽 😍.
Explore our serverless Resources here and learn how you can contribute solutions here.

Check out the original repository at https://github.com/microsoft/Seasons-of-Serverless
Serverless Solution for Lovely Ladoos: https://github.com/ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos

Challenge 2: Lovely Ladoos

Featured Region: India
Your Chefs: Jasmine Greenaway, Cloud Advocate (Microsoft) with Soumya Narapaju and Aditya Raman, Microsoft Student Ambassadors

It's Diwali season in India! A very popular delicacy that Indians eat during Diwali is the ladoo, a ball of flour dipped in sugar syrup. Can you create a machine learning model to detect a proper ladoo? Learn more about this challenge.

Teaser

Solution

  1. The solution creates an image classifier using the Custom Vision API.
  2. The solution also creates serverless endpoints which are used to:
    • List all the images stored in Azure Blob Storage
    • Upload a new image to Azure Blob Storage
    • Run a prediction on an already uploaded image using the Custom Vision API

Serverless Compute


Serverless computing enables developers to build applications faster by eliminating the need for them to manage infrastructure. With serverless applications, the cloud service provider automatically provisions, scales and manages the infrastructure required to run the code.

Altogether, serverless can be summarized as:

  • Function as a Service: a modular way of writing business logic as separate functions that are triggered by events, e.g., Azure Functions (a minimal sketch follows this list).
  • Stateless Compute Containers: the service does not need to read or store its previous state between runs.
  • Event-triggered and Ephemeral: functions run only in response to events and exist only for the duration of an invocation. If a function is not triggered for a while, its instances are shut down so other services can use the resources.
  • Dynamically Allocated: because instances are torn down when idle and re-provisioned behind the scenes on demand, resources are allocated dynamically.
  • Pay as you go: you pay only for the compute that has actually been used.
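
To make the Function-as-a-Service idea concrete, here is a minimal sketch of an HTTP-triggered Azure Function in Node.js. It uses the same handler shape that the Core Tools generate later in this post; the greeting logic itself is purely illustrative.

// index.js of an HTTP-triggered Azure Function: the platform invokes this
// handler only when a matching HTTP request (the event) arrives, and the
// instance may be torn down once the invocation completes.
module.exports = async function (context, req) {
  const name = (req.query && req.query.name) || "world";
  context.res = {
    status: 200,
    body: `Hello, ${name}!`
  };
};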

Advantages and Disadvantages of using Serverless

| Advantages | Disadvantages |
| --- | --- |
| Lower operational cost | Complex architecture |
| Easily scalable | Execution time |
| Billing is based upon usage | Execution frequency |
| Easy deployment | |
| Low cost | |

Azure CLI

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation.

Azure CLI Commands

Log in to Azure: This command opens a tab in your browser, letting you log in to the portal.

az login

To check all the available locations

az account list-locations \
    --query "[].{Region:name}" \
    --out table

Setting up the subscription

If you perform this step, you can skip the subscription argument when creating the resource group, storage account, and applications. This sets the default subscription on your local system.

# To see all the available subscriptions
az account list 
# To set the subscription
az account set --subscription <Subscription-ID or Subscription-Name>

Create Azure Resource Group

An Azure resource group is a container that holds resources, much like a directory on your system. It helps you manage resources that belong to the same or related solutions.

az group create --name <Your-Resource-Group-Name> --location eastus

Blob Storage

Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data.

az storage account create \
    --name <Blob Storage Name> \
    --resource-group <Resource Group Name> \
    --location eastus \
    --sku Standard_ZRS \
    --encryption-services blob

Create Container

A container organizes a set of blobs, similar to a directory in a file system. A storage account can include an unlimited number of containers, and a container can store an unlimited number of blobs.

# Assign yourself the "Storage Blob Data Contributor" role on the storage account
# so that the container can be created with --auth-mode login
az ad signed-in-user show --query objectId -o tsv | az role assignment create \
    --role "Storage Blob Data Contributor" \
    --assignee @- \
    --scope "/subscriptions/<Subscription>/resourceGroups/<Resource Group Name>/providers/Microsoft.Storage/storageAccounts/<Blob Storage Name>"

# Create the container inside the storage account
az storage container create \
    --account-name <Storage Account> \
    --name <Container Name> \
    --auth-mode login
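The function code later in this post reads the storage account's access key and connection string from its settings. These are not part of the original walkthrough, but assuming you want to fetch them with the Azure CLI, the following commands return them:

# Connection string (used later as AzureWebJobsStorage)
az storage account show-connection-string \
    --name <Blob Storage Name> \
    --resource-group <Resource Group Name>

# Account access keys (used later as AZURE_STORAGE_ACCOUNT_ACCESS_KEY)
az storage account keys list \
    --account-name <Blob Storage Name> \
    --resource-group <Resource Group Name>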

Azure Functions

Azure Functions is a serverless compute service that lets you run event-triggered code without having to explicitly provision or manage infrastructure.

Azure Functions Core Tools: used to develop and test the functions locally from a terminal or command prompt. They are required and can be downloaded from here.

Creating Functions Project

This command initializes the project and creates a project directory; --worker-runtime denotes the language you want to use in the application.

# This will generate a folder containing two files host.json and local.settings.json
func init Seasons-of-Serverless-Solution-Lovely-Ladoos --worker-runtime node

cd Seasons-of-Serverless-Solution-Lovely-Ladoos

Note: The values in these files can be changed to match the connection strings you have (or will have) for your database, storage, etc.
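
For reference, the generated local.settings.json looks roughly like this (values are placeholders; the storage connection string you retrieved earlier goes into AzureWebJobsStorage):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<Storage connection string>",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  }
}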

Creating Functions Template

To create a JavaScript HTTP trigger, use the built-in HTTP Trigger template:

# This creates a pre-built template for the HTTP Trigger
# with two files, function.json and index.js
# We are going to write our logic in index.js
# To create blob endpoint
func new --template "Http Trigger" --name blobs

# To create prediction endpoint
func new --template "Http Trigger" --name predict

Initialize your Node.js project with npm

npm init
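
The snippets in the following sections require a few npm packages (they are the ones imported in the code below; assuming the same dependencies as the repository, install them with):

npm install @azure/storage-blob @azure/ms-rest-js @azure/cognitiveservices-customvision-prediction streamifier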

Refer to the GitHub Repository for the complete code: ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos

List all the blobs in a Blob Storage Container

// blobs/index.js (excerpt)
const { BlobServiceClient, StorageSharedKeyCredential, newPipeline } = require('@azure/storage-blob');

// Authenticate against the storage account with its name and access key
const sharedKeyCredential = new StorageSharedKeyCredential(
  process.env.AZURE_STORAGE_ACCOUNT_NAME,
  process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY
);
const pipeline = newPipeline(sharedKeyCredential);
const containerName = process.env.CONTAINER_NAME;

const blobServiceClient = new BlobServiceClient(
  `https://${process.env.AZURE_STORAGE_ACCOUNT_NAME}.blob.core.windows.net`,
  pipeline
);

const containerClient = blobServiceClient.getContainerClient(containerName);

// Inside the async function handler: fetch a segment listing the container's blobs
const listBlobsResponse = await containerClient.listBlobFlatSegment();
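
If listBlobFlatSegment is not exposed in your version of @azure/storage-blob, the same listing can be done with the public iterator API; a minimal sketch (again inside the async function handler):

// Collect blob names with the paged iterator exposed by the v12 SDK
const blobNames = [];
for await (const blob of containerClient.listBlobsFlat()) {
  blobNames.push(blob.name);
}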

Uploading the image to the Blob Storage

// blobs/index.js (excerpt): uploading a new image
const { BlobServiceClient } = require('@azure/storage-blob');
const streamifier = require("streamifier");

const containerName = process.env.CONTAINER_NAME;

// Authenticate with the storage connection string from the function app settings
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage);

const containerClient = blobServiceClient.getContainerClient(containerName);

// fileName and fileData come from the parsed request body
const blockBlobClient = containerClient.getBlockBlobClient(fileName);

// Stream the uploaded bytes into the block blob
const result = await blockBlobClient.uploadStream(streamifier.createReadStream(Buffer.from(fileData)), fileData.length);

Predict the image with an image URL

// predict/index.js (excerpt)
// Import dependencies
const msRest = require("@azure/ms-rest-js");
const { PredictionAPIClient } = require("@azure/cognitiveservices-customvision-prediction");

// Get the environment variables
const projectId = process.env.PROJECT_ID;
const publishedName = process.env.PUBLISHED_NAME;
const predictionKey = process.env.PREDICTION_KEY;
const endPoint = process.env.PREDICTION_ENDPOINT;
// Loaded from settings; not used directly in the classify call below
const predictionResourceId = process.env.PREDICTION_RESOURCE_ID;

// Authenticate against the Custom Vision prediction endpoint
const predictorCredentials = new msRest.ApiKeyCredentials({ inHeader: { "Prediction-key": predictionKey } });

const predictor = new PredictionAPIClient(predictorCredentials, endPoint);

// Inside the async function handler: classify the image at the given URL
const results = await predictor.classifyImageUrl(projectId, publishedName, { url: imageUrl });
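
Assuming the usual shape of the Custom Vision prediction result, each entry in results.predictions carries a tag name and a probability; a small sketch of logging them inside the function handler:

// Log every predicted tag with its confidence
results.predictions.forEach((prediction) => {
  context.log(`${prediction.tagName}: ${(prediction.probability * 100).toFixed(2)}%`);
});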

Custom Vision API

Custom Vision lets you build, deploy, and improve your own image classifiers. An image classifier is an AI service that applies labels (which represent classes) to images, based on their visual characteristics. 

Note: This is not the only way to create the resource; you could also use a serverless function or the Azure CLI. We just went with the simpler method of using the Azure Portal.

  • Log in to the Azure Portal, search for Cognitive Services → Custom Vision (or search directly for Custom Vision), and click Create.

  • Select the subscription and the resource group you created above.
    Select the pricing tier best suited for your purpose; here I went with the Free F0 pricing tier.

  • It takes a few minutes to deploy

  • Navigate to 'Go to prediction resource', i.e., the customvision.ai portal.

  • Create a New Project

Name: <Any Name>
Description: <Any Description>
Resources: <The Resource just created>
Project Types: Classification
Classification Types: Multiclass (Single tag per image)
Domains: Food
  • On the Custom Vision Portal, click on 'Add Images', upload the images of ladoos, and tag them as "ladoo". At the end, upload all the files.

For simplicity, I have attached the dataset: ramanaditya/Seasons-of-Serverless-Solution-Lovely-Ladoos/tree/main/datasets


  • Similarly, upload the images of doughnuts and sesame buns.
  • Once all the images are uploaded, click on Train. You can select any training option; here I selected "Quick Training".
  • Once the training is completed, you will see the statistics for the trained iteration.

  • Click on "Publish" to publish your prediction model to be used in the Azure Functions.

  • While publishing, save the "Published Name" to your .env file.
    Then navigate to Settings and store the Project ID, Prediction Endpoint, Prediction Key, and Prediction Resource ID in the .env file as well (a sketch of the resulting .env follows this list).
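
Putting it together, the .env ends up holding the values referenced by the function code above (variable names are the ones used in the snippets; the values are placeholders). AzureWebJobsStorage lives in local.settings.json, as shown earlier.

AZURE_STORAGE_ACCOUNT_NAME=<Storage Account Name>
AZURE_STORAGE_ACCOUNT_ACCESS_KEY=<Storage Account Access Key>
CONTAINER_NAME=<Container Name>
PROJECT_ID=<Custom Vision Project ID>
PUBLISHED_NAME=<Published Iteration Name>
PREDICTION_KEY=<Prediction Key>
PREDICTION_ENDPOINT=<Prediction Endpoint>
PREDICTION_RESOURCE_ID=<Prediction Resource ID>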

Run locally

func start

Check that the API endpoints work

GET: /api/blobs 

List all the images stored with their image URL

POST: /api/blobs

To upload the image to the blob storage

GET: /api/predict?imageurl=<imageUrl>

To predict the image
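
Assuming func start is serving on its default port 7071, you can exercise the endpoints with curl. The file path and form field name below are illustrative; match whatever the upload function expects.

# List the uploaded images
curl http://localhost:7071/api/blobs

# Upload an image (form field name and file are illustrative)
curl -X POST -F "image=@ladoo.jpg" http://localhost:7071/api/blobs

# Classify an uploaded image by its URL
curl "http://localhost:7071/api/predict?imageurl=<imageUrl>"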

Chefs for the Challenge
