This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
AI Shopping Cart is a sample application that supercharges your shopping experience with the power of AI. It leverages Azure OpenAI and Azure Spring Apps to build a recommendation engine that is not only scalable, resilient, and secure, but also personalized to your needs. Taking advantage of Azure OpenAI, the application performs nutrition analysis on the items in your cart and generates the top 3 recipes using those ingredients. With Azure Developer CLI (azd), you’re just a few commands away from having this fully functional sample application up and running in Azure. Let's get started!
This sample application takes inspiration from this original work: https://github.com/lopezleandro03/ai-assisted-groceries-cart
Refer to the App Templates repository Readme for more samples that are compatible with azd.
AI Shopping Cart
Pre-requisites
- Install the Azure Developer CLI
- An Azure account with an active subscription. Create one for free.
- OpenJDK 17
- Node.js 20.5.0+
- Docker
- Azure OpenAI with `gpt-4` or `gpt-35-turbo` [Note]
Review the architecture diagram and the resources you'll deploy, as well as the Azure OpenAI section.
Quickstart
To learn how to get started with any template, follow this quickstart. For this template Azure-Samples/app-templates-java-openai-springapps, you need to execute a few additional steps as described below.
This quickstart will show you how to authenticate on Azure, enable Spring Apps alpha feature for azd, initialize using a template, set the environment variables for Azure OpenAI, provision the infrastructure, and deploy the code to Azure:
```bash
# Log in to azd if you haven't already
azd auth login

# Enable the Azure Spring Apps alpha feature for azd
azd config set alpha.springapp on

# First-time project setup: initialize a project in the current directory using this template
azd init --template Azure-Samples/app-templates-java-openai-springapps

# Set the environment variables for Azure OpenAI
azd env set azureOpenAiApiKey <replace-with-Azure-OpenAi-API-key>
azd env set azureOpenAiEndpoint <replace-with-Azure-OpenAi-endpoint>
azd env set azureOpenAiDeploymentId <replace-with-Azure-OpenAi-deployment-id/name>

# To use the GPT-3.5 Turbo model, set this environment variable to false
azd env set isAzureOpenAiGpt4Model true

# Provision and deploy to Azure
azd up
```
Notes
Replace the placeholders with the values from your Azure OpenAI resource.
If you are using the gpt-35-turbo model, you need to set `isAzureOpenAiGpt4Model` to `false` before provisioning the resource and deploying the sample application to Azure:

```bash
azd env set isAzureOpenAiGpt4Model false
```
At the end of the deployment, you will see the URL of the front-end. Open the URL in a browser to see the application in action.
Application Architecture
This sample application uses the following Azure resources:
- Azure Container Apps (Environment) to host the frontend as a Container App and Azure Spring Apps in the Standard consumption and dedicated plan
- Azure Spring Apps to host the AI Shopping Cart Service as a Spring App
- Azure Container Registry to host the Docker image for the frontend
- Azure Database for PostgreSQL (Flexible Server) to store the data for the AI Shopping Cart Service
- Azure Monitor for monitoring and logging
- Azure OpenAI to perform nutrition analysis and generate the top 3 recipes. It is not deployed with the sample app [Note].
Here's a high-level architecture diagram that illustrates these components. Except for Azure OpenAI, all the other resources are provisioned in a single resource group that is created when you create your resources using `azd up`.
Architecture diagram
This template provisions resources to an Azure subscription that you will select upon provisioning them. Please refer to the Pricing calculator for Microsoft Azure and, if needed, update the included Azure resource definitions found in infra/main.bicep to suit your needs.
Azure OpenAI
This sample application uses Azure OpenAI. It is not part of the automated deployment process. You will need to create an Azure OpenAI resource and configure the application to use it. Please follow the instructions in the Azure OpenAI documentation to get access to Azure OpenAI. Do not forget to read the overview of the Responsible AI practices for Azure OpenAI models before you start using Azure OpenAI and request access.
The current version of the sample app requires a publicly accessible Azure OpenAI resource (i.e. Allow access from all networks). This sample is not intended to be used in production. To know more about networking and security for Azure OpenAI, please refer to the Azure OpenAI documentation.
This sample app was developed to be used with the gpt-4 model. It also supports gpt-35-turbo. To use gpt-35-turbo, you need to set `isAzureOpenAiGpt4Model` to `false` (cf. Quickstart). By default, this parameter/environment variable is set to `true`. To complete the setup of the application, you need to set the following information from the Azure OpenAI resource:
- `azureOpenAiApiKey` - Azure OpenAI API key
- `azureOpenAiEndpoint` - Azure OpenAI endpoint
- `azureOpenAiDeploymentId` - Azure OpenAI deployment ID of the `gpt-4` or `gpt-35-turbo` model
The API key and the endpoint can be found in the Azure Portal. You can follow these instructions: Retrieve key and endpoint. The deployment ID corresponds to the deployment name in this guide.
Prompt engineering is important to get the best results from Azure OpenAI. Text prompts are how users interact with GPT models. As with all generative large language models (LLMs), GPT models try to produce the next series of words that are the most likely to follow the previous text. It is a bit like asking the AI model: What is the first thing that comes to mind when I say `<prompt>`?
With the Chat Completion API, there are distinct sections of the prompt that are sent to the API, each associated with a specific role: system, user, and assistant. The system message is included at the beginning of the prompt and is used to provide the initial instructions to the model: description of the assistant, personality traits, instructions/rules it will follow, etc.
The AI Shopping Cart Service uses the Azure OpenAI client library for Java. This library is part of the Azure SDK for Java. It is implemented as a chat completion. In the service, we have two system messages in SystemMessageConstants.java: one for AI Nutrition Analysis and one to generate the top 3 recipes. The system message is followed by a user message: The basket is: `<list of items in the basket separated by a comma>`. The assistant message is the response from the model. The service uses the ShoppingCartAiRecommendations class to interact with Azure OpenAI. In this class you will find the code that is responsible for generating the prompt and calling the Azure OpenAI API: getChatCompletion. To learn more about the temperature and topP settings used in this class, please refer to the documentation.
For the gpt-35-turbo model, more context is added at the end of the user message. It provides more information on the format of the JSON that the OpenAI model needs to return and asks the model to return only the JSON, without additional text. This additional context is available in UserMessageConstants.java.
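As an illustration of how such a user message can be assembled, here is a minimal, self-contained Java sketch. The class and method names here are hypothetical (the real logic lives in ShoppingCartAiRecommendations and UserMessageConstants.java), and the wording of the extra gpt-35-turbo instruction is an assumption:

```java
import java.util.List;

// Hypothetical helper mirroring how the service could assemble the user message.
public class UserMessageBuilder {

    // Extra instructions appended for gpt-35-turbo. The wording here is an
    // assumption; the real text lives in UserMessageConstants.java.
    static final String GPT35_EXTRA_CONTEXT =
            " Return only the JSON described above, without any additional text.";

    // Builds the user message "The basket is: <item1>, <item2>, ..." and, when
    // the gpt-35-turbo model is used, appends the extra formatting instructions.
    public static String buildUserMessage(List<String> items, boolean isGpt4Model) {
        String message = "The basket is: " + String.join(", ", items);
        if (!isGpt4Model) {
            message += GPT35_EXTRA_CONTEXT;
        }
        return message;
    }

    public static void main(String[] args) {
        System.out.println(buildUserMessage(List.of("apples", "oatmeal", "milk"), true));
    }
}
```

With `isGpt4Model` set to `true` the message is sent as-is; with `false`, the JSON-only instructions are appended before the message goes to the chat completion API.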
Pre-requisites ⤴️
Application Architecture ⤴️
Application Code
This template is structured to follow the Azure Developer CLI template conventions. You can learn more about azd architecture in the official documentation.
Next Steps
At this point, you have a complete application deployed on Azure.
Enterprise Scenarios
For enterprise needs such as polyglot application deployment, Tanzu component support, and SLA assurance, we recommend using Azure Spring Apps Enterprise. Check the Azure Spring Apps landing zone accelerator, which provides architectural guidance designed to streamline the production-ready infrastructure provisioning and deployment of Spring Boot and Spring Cloud applications to Azure Spring Apps. As the workload owner, use the architectural guidance provided in the landing zone accelerator to achieve your target technical state with confidence.
Azure Developer CLI
You have deployed the sample application using the Azure Developer CLI; however, there is much more that the Azure Developer CLI can do. These next steps will introduce you to additional commands that will make creating applications on Azure much easier. Using the Azure Developer CLI, you can set up your pipelines, monitor your application, and test and debug locally.
- `azd down` - to delete all the Azure resources created with this template
- `azd pipeline config` - to configure a CI/CD pipeline (using GitHub Actions or Azure DevOps) to deploy your application whenever code is pushed to the main branch.
  - Several environment variables / secrets need to be set for the Azure OpenAI resource:
    - `AZURE_OPENAI_API_KEY`: API key for the Azure OpenAI resource
      - For GitHub workflows, you should use GitHub Secrets
      - For Azure DevOps pipelines, check 'Keep this value secret' when creating the variable
    - `AZURE_OPENAI_ENDPOINT`: Endpoint for the Azure OpenAI resource
    - `AZURE_OPENAI_DEPLOYMENT_ID`: Deployment ID/name for the Azure OpenAI resource
    - `IS_AZURE_OPENAI_GPT4_MODEL`: Set to `true` if you are using the GPT-4 model and to `false` if you are using the GPT-3.5 Turbo model
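For illustration, once a GitHub Actions pipeline is configured, the secrets named above would typically be surfaced to a deployment step as environment variables along these lines (this step is a hypothetical sketch, not the workflow file that `azd pipeline config` actually generates):

```yaml
# Hypothetical GitHub Actions step: map repository secrets to the
# environment variables the deployment expects.
env:
  AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
  AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
  AZURE_OPENAI_DEPLOYMENT_ID: ${{ secrets.AZURE_OPENAI_DEPLOYMENT_ID }}
  IS_AZURE_OPENAI_GPT4_MODEL: "true"
```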
- `azd monitor` - to monitor the application and quickly navigate to the various Application Insights dashboards (e.g. overview, live metrics, logs)
- Run and Debug Locally - using Visual Studio Code and the Azure Developer CLI extension
Additional azd commands
The Azure Developer CLI includes many other commands to help with your Azure development experience. You can view these commands at the terminal by running azd help. You can also view the full list of commands on our Azure Developer CLI command page.
Resources
These are additional resources that you can use to learn more about the sample application and its underlying technologies.
Start from zero and scale to zero – Azure Spring Apps consumption plan
Azure Spring Apps Consumption - Networking and Security
https://learn.microsoft.com/en-us/azure/ai-services/openai/encrypt-data-at-rest
How to configure Azure OpenAI Service with managed identities
Data Collection
The software may collect information about you and your use of the software and send it to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft's privacy statement. Our privacy statement is located at https://go.microsoft.com/fwlink/?LinkId=521839. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.
Telemetry Configuration
Telemetry collection is on by default.
To opt out, set the variable `enableTelemetry` to `false` in `infra/main.parameters.json` or in the Bicep template `infra/main.bicep`. It can be set using the following command when the provisioning is done with the Azure Developer CLI:
```bash
azd env set enableTelemetry false
```
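For reference, the corresponding opt-out in `infra/main.parameters.json` would look roughly like this (a sketch following the standard azd parameter-file shape; the exact contents of the file in this template may differ):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-08-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "enableTelemetry": {
      "value": false
    }
  }
}
```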
Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Code of Conduct
Issues and Bugs
Feature Requests
Submission Guidelines
Code of Conduct
Help us keep this project open and inclusive. Please read and follow our Code of Conduct.
Found an Issue?
If you find a bug in the source code or a mistake in the documentation, you can help us by submitting an issue to the GitHub Repository. Even better, you can submit a Pull Request with a fix.
Want a Feature?
You can request a new feature by submitting an issue to the GitHub Repository. If you would like to implement a new feature, please submit an issue with a proposal for your work first, to be sure that we can use it.
Small Features can be crafted and directly submitted as a Pull Request.
Submission Guidelines
Submitting an Issue
Before you submit an issue, search the archive; your question may already have been answered.
If your issue appears to be a bug, and hasn't been reported, open a new issue. Help us to maximize the effort we can spend fixing issues and adding new features, by not reporting duplicate issues. Providing the following information will increase the chances of your issue being dealt with quickly:
Overview of the Issue - if an error is being thrown a non-minified stack trace helps
Version - what version is affected (e.g. 0.1.2)
Motivation for or Use Case - explain what you are trying to do and why the current behavior is a bug for you
Browsers and Operating System - is this a problem with all browsers?
Reproduce the Error - provide a live example or an unambiguous set of steps
Related Issues - has a similar issue been reported before?
Suggest a Fix - if you can't fix the bug yourself, perhaps you can point to what might be causing the problem (line of code or commit)
You can file new issues by providing the above information at the corresponding repository's issues link: https://github.com/Azure-Samples/app-templates-java-openai-springapps/issues/new.
Submitting a Pull Request (PR)
Before you submit your Pull Request (PR) consider the following guidelines:
Search the repository (https://github.com/Azure-Samples/app-templates-java-openai-springapps/pulls) for an open or closed PR that relates to your submission. You don't want to duplicate effort.
Make your changes in a new git fork:
Commit your changes using a descriptive commit message
Push your fork to GitHub:
In GitHub, create a pull request
If we suggest changes then:
Make the required updates.
Rebase your fork and force push to your GitHub repository (this will update your Pull Request):
```bash
git rebase main -i
git push -f
```
That's it! Happy contributing!