<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sabah Shariq</title>
    <description>The latest articles on DEV Community by Sabah Shariq (@sabahshariq).</description>
    <link>https://dev.to/sabahshariq</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F89340%2Fa764b2c6-6430-414e-9c08-b58e3f517b9e.jpg</url>
      <title>DEV Community: Sabah Shariq</title>
      <link>https://dev.to/sabahshariq</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sabahshariq"/>
    <language>en</language>
    <item>
      <title>No Brainer Machine Learning model in Azure</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Mon, 31 Mar 2025 15:39:10 +0000</pubDate>
      <link>https://dev.to/sabahshariq/no-brainer-machine-learning-model-in-azure-1i63</link>
      <guid>https://dev.to/sabahshariq/no-brainer-machine-learning-model-in-azure-1i63</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;Azure ML includes a tool called "Designer" for creating machine learning workflows, making them accessible to both beginners and experienced users. Azure Machine Learning Designer is a drag-and-drop interface within Microsoft's Azure Machine Learning platform that allows users to build, test and deploy machine learning models without requiring extensive coding knowledge.&lt;/p&gt;

&lt;p&gt;Here we will train a linear regression model that predicts car prices, using the following steps.&lt;/p&gt;

&lt;h1&gt;
  
  
  Create a workspace
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Sign in to Azure Machine Learning studio&lt;/li&gt;
&lt;li&gt;Select Create workspace&lt;/li&gt;
&lt;li&gt;Provide the following information to configure your new workspace:&lt;/li&gt;
&lt;li&gt;Workspace name&lt;/li&gt;
&lt;li&gt;Friendly name&lt;/li&gt;
&lt;li&gt;Hub

&lt;ul&gt;
&lt;li&gt;If you did not select a hub, provide the advanced information&lt;/li&gt;
&lt;li&gt;If you selected a hub, these values are taken from the hub.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Subscription&lt;/li&gt;

&lt;li&gt;Resource group&lt;/li&gt;

&lt;li&gt;Region&lt;/li&gt;

&lt;li&gt;Select Create to create the workspace&lt;/li&gt;

&lt;/ul&gt;

&lt;h1&gt;
  
  
  Create a new pipeline
&lt;/h1&gt;

&lt;p&gt;A pipeline is a visual workflow in which a sequence of connected steps processes data, trains a model and produces an output. Pipelines are designed using the drag-and-drop interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6oe0uyo78epua6wz37wx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6oe0uyo78epua6wz37wx.png" alt="pipeline" width="800" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Sign in to ml.azure.com&lt;/li&gt;
&lt;li&gt;Select the workspace you want to work with. If no workspace is available, create one.&lt;/li&gt;
&lt;li&gt;Select Create a new pipeline using classic prebuilt components.&lt;/li&gt;
&lt;li&gt;Click the pencil icon beside the automatically generated pipeline draft name and rename it to Automobile price prediction. The name doesn't need to be unique.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1q4bz70ceqxb82l69x4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1q4bz70ceqxb82l69x4.png" alt="Pipeline_Name" width="379" height="82"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Import data
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ric664uhizk7u9ve0x5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ric664uhizk7u9ve0x5.png" alt="Import_pipeline_data" width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The designer comes with prebuilt sample datasets for users to experiment with. Here we will use Automobile price data (Raw). Use the following steps to select the dataset.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On the left-hand side you will see options such as datasets and components.

&lt;ul&gt;
&lt;li&gt;Select components&lt;/li&gt;
&lt;li&gt;Expand sample data&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Select the dataset Automobile price data (Raw) and drag it onto the canvas.&lt;/li&gt;

&lt;/ul&gt;

&lt;h1&gt;
  
  
  Visualize the data
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcxiuhzsg24rstw1xhwjq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcxiuhzsg24rstw1xhwjq.png" alt="car_price_dataset" width="800" height="326"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Visualizing the data gives a better understanding of the dataset. To do this, follow these steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Right click the Automobile price data (Raw) and select Preview Data.&lt;/li&gt;
&lt;li&gt;Select the different columns in the data window to view information about each one.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Each row represents an automobile, and the variables associated with each automobile appear as columns. There are 205 rows and 26 columns in this dataset.&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Setup Data
&lt;/h1&gt;

&lt;p&gt;Before analyzing a dataset, some processing is usually required. For example, missing values can prevent the model from analyzing the data correctly. &lt;br&gt;
In our dataset the normalized-losses column is missing many values, so it needs to be excluded. Use the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas, click Component and search for the Select Columns in Dataset component.&lt;/li&gt;
&lt;li&gt;Drag the Select Columns in Dataset component onto the canvas. Drop the component below the dataset component.&lt;/li&gt;
&lt;li&gt;Connect the Automobile price data (Raw) dataset to the Select Columns in Dataset component. Drag from the dataset's output port which is the small circle at the bottom of the dataset on the canvas to the input port of Select Columns in Dataset which is the small circle at the top of the component.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyrraj2ugrruc9liqkixc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyrraj2ugrruc9liqkixc.png" alt="first connect" width="653" height="382"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the "Select Columns in Dataset component."&lt;/li&gt;
&lt;li&gt;Click on the arrow icon under Settings to the right of the canvas to open the component details pane. Alternatively, you can double click the "Select Columns" in Dataset component to open the details pane.&lt;/li&gt;
&lt;li&gt;Select Edit column to the right of the pane.&lt;/li&gt;
&lt;li&gt;Expand the Column names drop down next to Include and select All columns.&lt;/li&gt;
&lt;li&gt;Select the + to add a new rule.&lt;/li&gt;
&lt;li&gt;From the drop-down menus, select Exclude and Column names.&lt;/li&gt;
&lt;li&gt;Enter normalized-losses in the text box.&lt;/li&gt;
&lt;li&gt;In the lower right, select Save to close the column selector.&lt;/li&gt;
&lt;li&gt;In the Select Columns in Dataset component details pane expand Node info.&lt;/li&gt;
&lt;li&gt;Select the Comment text box and enter Exclude normalized losses.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Comments will appear on the graph to help you organize your pipeline.&lt;/em&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fod3gveo66dah9vgryx8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fod3gveo66dah9vgryx8t.png" alt="exclude column" width="712" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Clean missing data
&lt;/h1&gt;

&lt;p&gt;After removing the normalized-losses column, there might still be missing values in the dataset. The "Clean Missing Data" component can be used to remove them.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas, click Component and search for the Clean Missing Data component.&lt;/li&gt;
&lt;li&gt;Drag the Clean Missing Data component to the pipeline canvas. Connect it to the Select Columns in Dataset component.&lt;/li&gt;
&lt;li&gt;Select the Clean Missing Data component.&lt;/li&gt;
&lt;li&gt;Click on the arrow icon under Settings to the right of the canvas to open the component details pane. &lt;/li&gt;
&lt;li&gt;Select Edit column to the right of the pane.&lt;/li&gt;
&lt;li&gt;In the Columns to be cleaned window that appears, expand the drop-down menu next to Include. Select All columns&lt;/li&gt;
&lt;li&gt;Select Save&lt;/li&gt;
&lt;li&gt;In the Clean Missing Data component details pane -&amp;gt; under Cleaning mode -&amp;gt; select Remove entire row.&lt;/li&gt;
&lt;li&gt;In the Clean Missing Data component details pane -&amp;gt; expand Node info.&lt;/li&gt;
&lt;li&gt;Select the Comment text box and enter Remove missing value rows.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5gwwv6uujrolzte5bcv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5gwwv6uujrolzte5bcv.png" alt="pipeline clean" width="538" height="442"&gt;&lt;/a&gt;&lt;/p&gt;
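&lt;p&gt;The two preparation steps above, excluding the normalized-losses column and removing rows with missing values, can be sketched in pandas. This is an illustration only, not part of the no-code Designer workflow, and the tiny DataFrame is a made-up stand-in for the sample dataset:&lt;/p&gt;

```python
import pandas as pd
import numpy as np

# A tiny stand-in for the Automobile price data (Raw) sample dataset
df = pd.DataFrame({
    "normalized-losses": [164, np.nan, 158, np.nan],
    "make": ["audi", "audi", "bmw", "bmw"],
    "price": [13950, 17450, np.nan, 16430],
})

# Step 1: exclude the normalized-losses column
df = df.drop(columns=["normalized-losses"])

# Step 2: remove entire rows that still contain missing values
df = df.dropna()
```

&lt;p&gt;After these two steps, only complete rows remain, which mirrors the "Remove entire row" cleaning mode used later in the pipeline.&lt;/p&gt;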

&lt;h1&gt;
  
  
  Setup a machine learning model
&lt;/h1&gt;

&lt;p&gt;Since we want to predict the price, a continuous value, we can use a regression algorithm such as linear regression.&lt;/p&gt;

&lt;h3&gt;
  
  
  Split the data
&lt;/h3&gt;

&lt;p&gt;We will split the data into two separate datasets: one trains the model and the other tests how well the model performs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas, click Component and search for the Split Data component.&lt;/li&gt;
&lt;li&gt;Drag the Split Data component to the pipeline canvas.&lt;/li&gt;
&lt;li&gt;Connect the left port of the Clean Missing Data component to the Split Data component.&lt;/li&gt;
&lt;li&gt;Select the Split Data component.&lt;/li&gt;
&lt;li&gt;Click on the arrow icon under Settings to the right of the canvas to open the component details pane. Alternatively, you can double-click the Split Data component to open the details pane.&lt;/li&gt;
&lt;li&gt;In the Split Data details pane set the Fraction of rows in the first output dataset to 0.7.&lt;/li&gt;
&lt;li&gt;This option splits 70 percent of the data to train the model and 30 percent for testing it. The 70 percent dataset will be accessible through the left output port. The remaining data is available through the right output port.&lt;/li&gt;
&lt;li&gt;In the Split Data details pane, expand Node info.&lt;/li&gt;
&lt;li&gt;Select the Comment text box and enter Split the dataset into training set (0.7) and test set (0.3).&lt;/li&gt;
&lt;/ul&gt;
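&lt;p&gt;The 70/30 split performed by the Split Data component corresponds to scikit-learn's train_test_split. A sketch, for illustration only (the array here is dummy data standing in for the cleaned dataset):&lt;/p&gt;

```python
import numpy as np
from sklearn.model_selection import train_test_split

# 10 example rows standing in for the cleaned automobile data
data = np.arange(20).reshape(10, 2)

# Fraction of rows in the first output dataset = 0.7
train, test = train_test_split(data, train_size=0.7, random_state=0)
# train holds 7 rows (the left output port), test holds 3 (the right)
```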

&lt;h3&gt;
  
  
  Train the model
&lt;/h3&gt;

&lt;p&gt;Train the model by giving it a dataset that includes the price. The algorithm constructs a model that explains the relationship between the features and the price as presented by the training data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas click Component and search for the Linear Regression component.&lt;/li&gt;
&lt;li&gt;Drag the Linear Regression component to the pipeline canvas.&lt;/li&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas click Component and search for the Train Model component.&lt;/li&gt;
&lt;li&gt;Drag the Train Model component to the pipeline canvas.&lt;/li&gt;
&lt;li&gt;Connect the output of the Linear Regression component to the left input of the Train Model component.&lt;/li&gt;
&lt;li&gt;Connect the training data output (left port) of the Split Data component to the right input of the Train Model component.&lt;/li&gt;
&lt;li&gt;Select the Train Model component.&lt;/li&gt;
&lt;li&gt;Click on the arrow icon under Settings to the right of the canvas to open the component details pane. Alternatively, you can double-click the Train Model component to open the details pane.&lt;/li&gt;
&lt;li&gt;Select Edit column to the right of the pane.&lt;/li&gt;
&lt;li&gt;In the Label column window that appears, expand the drop-down menu and select Column names.&lt;/li&gt;
&lt;li&gt;In the text box enter price to specify the value that your model is going to predict.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcqdf1s9v9vwitq6aijh0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcqdf1s9v9vwitq6aijh0.png" alt="Pipeline Graph" width="800" height="602"&gt;&lt;/a&gt;&lt;/p&gt;
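&lt;p&gt;Conceptually, the Linear Regression plus Train Model pair fits a linear model on the training features with price as the label. A scikit-learn sketch of the same idea (illustrative only; the feature values and the exact linear price rule below are made up, and the Designer itself needs no code):&lt;/p&gt;

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy training data: two numeric features per car, price as the label
X_train = np.array([[100.0, 70.0], [130.0, 100.0], [160.0, 120.0], [200.0, 180.0]])
y_train = 50.0 * X_train[:, 0] + 30.0 * X_train[:, 1]  # price follows a linear rule

model = LinearRegression()
model.fit(X_train, y_train)  # this is what Train Model does with the 70% split

pred = model.predict(np.array([[150.0, 110.0]]))  # scoring one unseen row
```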

&lt;h1&gt;
  
  
  Add the Score Model component
&lt;/h1&gt;

&lt;p&gt;After training the model using 70 percent of the data, we can use it to score the remaining 30 percent and see how well the model functions.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas, click Component and search for the Score Model component.&lt;/li&gt;
&lt;li&gt;Drag the Score Model component to the pipeline canvas.&lt;/li&gt;
&lt;li&gt;Connect the output of the Train Model component to the left input port of Score Model. Connect the test data output (right port) of the Split Data component to the right input port of Score Model.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Add the Evaluate Model component
&lt;/h1&gt;

&lt;p&gt;Use the Evaluate Model component to evaluate how well your model scored the test dataset.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the datasets and component palette to the left of the canvas, click Component and search for the Evaluate Model component.&lt;/li&gt;
&lt;li&gt;Drag the Evaluate Model component to the pipeline canvas.&lt;/li&gt;
&lt;li&gt;Connect the output of the Score Model component to the left input of Evaluate Model.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;The final pipeline should look something like this:&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1dsm1wr3x3pzjmgyovb5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1dsm1wr3x3pzjmgyovb5.png" alt="Pipeline FInal" width="800" height="688"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Submit pipeline
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Select Configure &amp;amp; Submit on the right top corner to submit the pipeline.&lt;/li&gt;
&lt;li&gt;A step-by-step wizard appears; follow it to submit the pipeline job.&lt;/li&gt;
&lt;li&gt;In the Basics step, you can configure the experiment, job display name, job description, and so on.&lt;/li&gt;
&lt;li&gt;After submitting the pipeline job, there will be a message on the top with a link to the job detail. You can select this link to review the job details.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  View scored labels
&lt;/h3&gt;

&lt;p&gt;In the job detail page, you can check the pipeline job status, results and logs.&lt;/p&gt;

&lt;p&gt;After the job completes we can view the results of the pipeline job. First, look at the predictions generated by the regression model.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Right-click the Score Model component, and select Preview data -&amp;gt; Scored dataset to view its output.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here we can see the predicted prices and the actual prices from the testing data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F647skodt1awc3i8p0lt3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F647skodt1awc3i8p0lt3.png" alt="Score Result" width="800" height="437"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Model Evaluation
&lt;/h3&gt;

&lt;p&gt;Use the Evaluate Model component to see how well the trained model performed on the test dataset.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Right-click the Evaluate Model component and select Preview data &amp;gt; Evaluation results to view its output.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following statistics are shown for your model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Mean Absolute Error (MAE): The average of absolute errors. An error is the difference between the predicted value and the actual value.&lt;/li&gt;
&lt;li&gt;Root Mean Squared Error (RMSE): The square root of the average of squared errors of predictions made on the test dataset.&lt;/li&gt;
&lt;li&gt;Relative Absolute Error: The average of absolute errors relative to the absolute difference between actual values and the average of all actual values.&lt;/li&gt;
&lt;li&gt;Relative Squared Error: The average of squared errors relative to the squared difference between the actual values and the average of all actual values.&lt;/li&gt;
&lt;li&gt;Coefficient of Determination: Also known as the R squared value, this statistical metric indicates how well a model fits the data.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;For each of the error statistics, smaller is better. A smaller value indicates that the predictions are closer to the actual values. For the coefficient of determination, the closer its value is to one (1.0), the better the predictions.&lt;/em&gt;&lt;/p&gt;
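&lt;p&gt;These statistics can be computed directly from predictions and actual values. The sketch below shows MAE, RMSE, and the coefficient of determination; the four price values are invented for illustration, not output of the pipeline:&lt;/p&gt;

```python
import numpy as np

actual = np.array([13950.0, 16500.0, 6295.0, 10295.0])
predicted = np.array([14200.0, 16000.0, 6500.0, 10000.0])

errors = predicted - actual
mae = np.mean(np.abs(errors))         # Mean Absolute Error
rmse = np.sqrt(np.mean(errors ** 2))  # Root Mean Squared Error

ss_res = np.sum(errors ** 2)
ss_tot = np.sum((actual - actual.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot            # Coefficient of Determination (R squared)
```

&lt;p&gt;With predictions this close to the actual prices, MAE and RMSE are small relative to the price scale and R squared is close to 1.&lt;/p&gt;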

</description>
    </item>
    <item>
      <title>Training a Machine Learning model in Azure</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Mon, 31 Mar 2025 10:17:22 +0000</pubDate>
      <link>https://dev.to/sabahshariq/training-a-machine-learning-model-in-azure-ie6</link>
      <guid>https://dev.to/sabahshariq/training-a-machine-learning-model-in-azure-ie6</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;We will use a credit card dataset to understand how to use machine learning for a classification problem in Azure. Our goal is to identify customers who are at risk of not paying their credit card bills. To get started with Azure Machine Learning, we first need to set up the following two prerequisites:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Workspace&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In order to use Machine Learning in Azure we need to create a workspace. A workspace is a centralized environment where all the machine learning resources are stored.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Compute Instance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A compute instance is a cloud-based virtual machine (VM) used for developing and running ML models. Here we will run Jupyter notebooks and Python scripts and test AI algorithms.&lt;/p&gt;

&lt;h1&gt;
  
  
  Create a workspace
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Sign in to Azure Machine Learning studio&lt;/li&gt;
&lt;li&gt;Select Create workspace&lt;/li&gt;
&lt;li&gt;Provide the following information to configure your new workspace:&lt;/li&gt;
&lt;li&gt;Workspace name&lt;/li&gt;
&lt;li&gt;Friendly name&lt;/li&gt;
&lt;li&gt;Hub

&lt;ul&gt;
&lt;li&gt;If you did not select a hub, provide the advanced information&lt;/li&gt;
&lt;li&gt;If you selected a hub, these values are taken from the hub.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Subscription&lt;/li&gt;

&lt;li&gt;Resource group&lt;/li&gt;

&lt;li&gt;Region&lt;/li&gt;

&lt;li&gt;Select Create to create the workspace&lt;/li&gt;

&lt;/ul&gt;

&lt;h1&gt;
  
  
  Create a compute instance
&lt;/h1&gt;

&lt;p&gt;Create a compute instance with the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select your workspace&lt;/li&gt;
&lt;li&gt;On the top right, select New&lt;/li&gt;
&lt;li&gt;Select Compute instance in the list&lt;/li&gt;
&lt;li&gt;Supply a name&lt;/li&gt;
&lt;li&gt;Keep the default values for the rest of the page, unless your organization policy requires you to change other settings&lt;/li&gt;
&lt;li&gt;Select Review + Create&lt;/li&gt;
&lt;li&gt;Select Create&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Create handle to workspace
&lt;/h1&gt;

&lt;p&gt;We need a way to reference our workspace. The Azure Machine Learning workspace is where all your ML resources live (models, datasets, compute, etc.). To work in the workspace, Python code needs to connect to it, and creating an ml_client does this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

#authenticate
credential = DefaultAzureCredential()

SUBSCRIPTION="&amp;lt;SUBSCRIPTION_ID&amp;gt;"
RESOURCE_GROUP="&amp;lt;RESOURCE_GROUP&amp;gt;"
WS_NAME="&amp;lt;AML_WORKSPACE_NAME&amp;gt;"

#Get a handle to the workspace
ml_client = MLClient(
    credential=credential,
    subscription_id=SUBSCRIPTION,
    resource_group_name=RESOURCE_GROUP,
    workspace_name=WS_NAME,
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code above does the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Import libraries

&lt;ul&gt;
&lt;li&gt;MLClient → Used to interact with Azure Machine Learning.&lt;/li&gt;
&lt;li&gt;DefaultAzureCredential → Handles authentication using Azure credentials.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Authenticate: DefaultAzureCredential automatically retrieves credentials from the user's Azure environment.&lt;/li&gt;

&lt;li&gt;Define Azure subscription &amp;amp; resource details&lt;/li&gt;

&lt;li&gt;Connect to the Azure ML workspace&lt;/li&gt;

&lt;li&gt;Verify the connection
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ws = ml_client.workspaces.get(WS_NAME)
print(ws.location,":", ws.resource_group)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;ml_client.workspaces.get(WS_NAME) → Fetches the workspace details from Azure.&lt;/li&gt;
&lt;li&gt;ws.location → Prints the region where the workspace is deployed (e.g. "eastus").&lt;/li&gt;
&lt;li&gt;ws.resource_group → Prints the resource group name.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;If this prints the correct location and resource group, the connection is successful!&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Environment Setup
&lt;/h1&gt;

&lt;p&gt;To run machine learning jobs in Azure, we need to define an environment. To make sure a job runs correctly, we need to tell Azure what software and libraries are required.&lt;/p&gt;

&lt;p&gt;Create an environment in Azure ML to tell Azure what tools, packages and Python version the code needs to run correctly and reliably.&lt;/p&gt;

&lt;p&gt;In this example, we create a custom conda environment for the jobs using a conda yaml file, then reference that file to create and register the custom environment in the workspace:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import os

dependencies_dir = "./dependencies"
os.makedirs(dependencies_dir, exist_ok=True)

from azure.ai.ml.entities import Environment

custom_env_name = "aml-scikit-learn"

custom_job_env = Environment(
    name=custom_env_name,
    description="Custom environment for Credit Card Defaults job",
    tags={"scikit-learn": "1.0.2"},
    conda_file=os.path.join(dependencies_dir, "conda.yaml"),
    image="mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest",
)
custom_job_env = ml_client.environments.create_or_update(custom_job_env)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;This registers the environment to your Azure ML workspace using ml_client.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;
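&lt;p&gt;The conda.yaml file itself is not shown in this post; it is expected at ./dependencies/conda.yaml. A minimal sketch of what it could contain, assuming the scikit-learn 1.0.2 version tagged on the environment (the exact package list is an assumption, not the author's file):&lt;/p&gt;

```yaml
name: credit-card-env
channels:
  - conda-forge
dependencies:
  - python=3.8
  - pip
  - pip:
      - scikit-learn==1.0.2
      - pandas
      - mlflow
      - azureml-mlflow
```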

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;print(
    f"Environment with name {custom_job_env.name} is registered to workspace, the environment version is {custom_job_env.version}"
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Confirms that the environment has been successfully created or updated.&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Configure a command job to run training script
&lt;/h1&gt;

&lt;p&gt;Now we're setting up a command job in Azure to run a Python script that trains a credit default prediction model using GradientBoostingClassifier. This training script will do the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prepares the data&lt;/li&gt;
&lt;li&gt;Splits the data using train_test_split (into training and testing sets)&lt;/li&gt;
&lt;li&gt;Trains the model&lt;/li&gt;
&lt;li&gt;Registers the trained model so it can be reused or deployed later&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here we will use Azure ML Studio to create and run the command job.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;%%writefile {train_src_dir}/main.py
import os
import argparse
import pandas as pd
import mlflow
import mlflow.sklearn
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

def main():
    """Main function of the script."""

    # input and output arguments
    parser = argparse.ArgumentParser()
    parser.add_argument("--data", type=str, help="path to input data")
    parser.add_argument("--test_train_ratio", type=float, required=False, default=0.25)
    parser.add_argument("--n_estimators", required=False, default=100, type=int)
    parser.add_argument("--learning_rate", required=False, default=0.1, type=float)
    parser.add_argument("--registered_model_name", type=str, help="model name")
    args = parser.parse_args()

    # Start Logging
    mlflow.start_run()

    # enable autologging
    mlflow.sklearn.autolog()

    ###################
    #&amp;lt;prepare the data&amp;gt;
    ###################
    print(" ".join(f"{k}={v}" for k, v in vars(args).items()))

    print("input data:", args.data)

    credit_df = pd.read_csv(args.data, header=1, index_col=0)

    mlflow.log_metric("num_samples", credit_df.shape[0])
    mlflow.log_metric("num_features", credit_df.shape[1] - 1)

    #Split train and test datasets
    train_df, test_df = train_test_split(
        credit_df,
        test_size=args.test_train_ratio,
    )
    ####################
    #&amp;lt;/prepare the data&amp;gt;
    ####################

    ##################
    #&amp;lt;train the model&amp;gt;
    ##################
    # Extracting the label column
    y_train = train_df.pop("default payment next month")

    # convert the dataframe values to array
    X_train = train_df.values

    # Extracting the label column
    y_test = test_df.pop("default payment next month")

    # convert the dataframe values to array
    X_test = test_df.values

    print(f"Training with data of shape {X_train.shape}")

    clf = GradientBoostingClassifier(
        n_estimators=args.n_estimators, learning_rate=args.learning_rate
    )
    clf.fit(X_train, y_train)

    y_pred = clf.predict(X_test)

    print(classification_report(y_test, y_pred))

    ###################
    #&amp;lt;/train the model&amp;gt;
    ###################

    ##########################
    #&amp;lt;save and register model&amp;gt;
    ##########################
    # Registering the model to the workspace
    print("Registering the model via MLFlow")
    mlflow.sklearn.log_model(
        sk_model=clf,
        registered_model_name=args.registered_model_name,
        artifact_path=args.registered_model_name,
    )

    # Saving the model to a file
    mlflow.sklearn.save_model(
        sk_model=clf,
        path=os.path.join(args.registered_model_name, "trained_model"),
    )
    ###########################
    #&amp;lt;/save and register model&amp;gt;
    ###########################

    # Stop Logging
    mlflow.end_run()

if __name__ == "__main__":
    main()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Above code explanation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Set Up the Script:&lt;/em&gt; Uses "argparse" to get inputs (like data path, model name, and training settings).&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Prepare the Data&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;credit_df = pd.read_csv(args.data)
train_df, test_df = train_test_split(credit_df)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Reads the CSV file and splits the data into training and test sets using train_test_split.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Train the Model&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;clf = GradientBoostingClassifier()
clf.fit(X_train, y_train)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Trains a Gradient Boosting Classifier, a tree-based ML model, and then uses it to make predictions.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Log Metrics and Results with MLflow&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mlflow.sklearn.autolog()
mlflow.log_metric("num_samples", ...)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Automatically logs model metrics like accuracy, training time, etc. This helps track and compare different model runs.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Register the Trained Model&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mlflow.sklearn.log_model(...)
mlflow.sklearn.save_model(...)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The model is saved and registered in the Azure ML workspace, which lets you keep track of your trained models.&lt;/p&gt;

&lt;h1&gt;
  
  
  Configure a training job using "Command"
&lt;/h1&gt;

&lt;p&gt;Now we need to configure a training job in Azure Machine Learning using the command function, which will run the main.py training script in Azure with the right data, parameters, and environment.&lt;/p&gt;

&lt;p&gt;Create a variable called "inputs" that will be used for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;data: The CSV file with credit card data.&lt;/li&gt;
&lt;li&gt;test_train_ratio: How much of the data should be used for testing (20%).&lt;/li&gt;
&lt;li&gt;learning_rate: A parameter for the model training.&lt;/li&gt;
&lt;li&gt;registered_model_name: The name to register your trained model under.&lt;/li&gt;
&lt;/ul&gt;
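The inputs above can be wired into a job with the Azure ML SDK v2 command function. The following is a minimal sketch, not the article's exact setup: the data URL, ratio, learning rate, model name, source folder, and environment string are all illustrative assumptions.

```python
# Hedged sketch of configuring the training job with the Azure ML SDK v2
# `command` function. All concrete values (paths, names, environment) are
# illustrative assumptions.
from azure.ai.ml import command, Input

inputs = dict(
    # data: the CSV file with credit card data (example URL, assumed)
    data=Input(type="uri_file", path="https://example.com/credit_cards.csv"),
    test_train_ratio=0.2,  # 20% of the data held out for testing
    learning_rate=0.25,    # model training parameter
    registered_model_name="credit_defaults_model",  # assumed model name
)

job = command(
    inputs=inputs,
    code="./src/",  # folder containing main.py (assumed layout)
    command=(
        "python main.py "
        "--data ${{inputs.data}} "
        "--test_train_ratio ${{inputs.test_train_ratio}} "
        "--learning_rate ${{inputs.learning_rate}} "
        "--registered_model_name ${{inputs.registered_model_name}}"
    ),
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",
    display_name="credit_default_prediction",
)
```

The `${{inputs.…}}` placeholders are expanded by Azure ML when the job runs, so the same script works locally and in the cloud.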

&lt;h1&gt;
  
  
  Submit the job
&lt;/h1&gt;

&lt;p&gt;Submit the job to run in Azure Machine Learning studio. This time we will use create_or_update on ml_client. MLClient is a client class that lets you connect to your Azure subscription from Python and interact with Azure Machine Learning services, including submitting jobs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ml_client.create_or_update(job)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
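For context, here is a minimal sketch of how ml_client is typically constructed with the SDK v2; the subscription, resource group, and workspace names below are placeholders, not real values.

```python
# Hedged sketch: connecting to the workspace and submitting the job.
# The subscription, resource group, and workspace names are placeholders.
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

returned_job = ml_client.create_or_update(job)  # submits the training job
print(returned_job.studio_url)  # link to the job's details page in studio
```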



&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;After running the cell, the notebook output shows a link to the job's details page in Machine Learning studio. Following the link shows you the run's status and result statistics.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>machinelearning</category>
      <category>training</category>
    </item>
    <item>
      <title>Artificial Intelligence: A Breakdown of Must-Know Buzzwords</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Mon, 17 Mar 2025 20:46:10 +0000</pubDate>
      <link>https://dev.to/sabahshariq/artificial-intelligence-a-breakdown-of-must-know-buzzwords-2l10</link>
      <guid>https://dev.to/sabahshariq/artificial-intelligence-a-breakdown-of-must-know-buzzwords-2l10</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;In the field of artificial intelligence, we hear various buzzwords nowadays. In this blog post we will familiarize ourselves with AI subfields and how their techniques and theories are used to solve particular types of problems.&lt;/p&gt;

&lt;p&gt;Here we will familiarize ourselves with Machine Learning (ML), Deep Learning, Neural Networks (NN), Natural Language Processing (NLP), Computer Vision (CV), AI Agents/Agentic AI, RAG AI, CAG AI, and Multi-Modal AI.&lt;/p&gt;

&lt;h1&gt;
  
  
  Artificial Intelligence
&lt;/h1&gt;

&lt;p&gt;AI is making computers think and act like humans. Basically, it's about designing algorithms and systems so that machines can learn from data, adapt to new inputs, and execute functions that mimic human cognitive abilities like language understanding, image interpretation, and logical reasoning. In other words, AI learns from examples to help humans solve problems.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;          .----------------.
          |     [ AI ]     |
          |   Neural Net   |
          |  Connections   |
          '-------||-------'
                  ||
        .---------||---------.
        |  Data   || Learning|
        | Stream  ||  Flow   |
        '---------||---------'
                  ||
           .------||------.
           |   Adaptive   |
           |   Systems    |
           '--------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Following are some of the classic subfields.&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Machine Learning
&lt;/h1&gt;

&lt;p&gt;Focuses on algorithms that allow computers to learn patterns from data to make predictions or decisions. It has many branches such as supervised learning, unsupervised learning and reinforcement learning.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;     .----------------------.
     |     Data Input &amp;amp;     |
     |  Feature Extraction  |
     '----------||----------'
                ||
     .----------||----------.
     |   Model Selection    |
     |     &amp;amp; Evaluation     |
     '----------||----------'
                ||
     .----------||----------.
     | [ Machine Learning ] |
     |    Model Training    |
     |    &amp;amp; Optimization    |
     '----------||----------'
                ||
     .----------||----------.
     |     Predictions      |
     |     &amp;amp; Deployment     |
     '----------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
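As a toy illustration of supervised learning, one of the branches mentioned above, here is a minimal scikit-learn sketch; the dataset (hours studied versus pass/fail) is made up for illustration.

```python
# Minimal supervised-learning sketch with scikit-learn: the model learns a
# pattern from labeled examples and predicts labels for new inputs.
# The toy dataset (hours studied -> pass/fail) is an illustrative assumption.
from sklearn.linear_model import LogisticRegression

X_train = [[1], [2], [3], [8], [9], [10]]  # feature: hours studied
y_train = [0, 0, 0, 1, 1, 1]               # label: 0 = fail, 1 = pass

model = LogisticRegression()
model.fit(X_train, y_train)                # learn the pattern from data

print(model.predict([[2], [9]]))           # predictions for new inputs
```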



&lt;h1&gt;
  
  
  Deep Learning
&lt;/h1&gt;

&lt;p&gt;Deep Learning focuses on training neural networks with multiple layers to learn patterns and make decisions without human intervention. The "deep" in deep learning refers to the multiple layers in these neural networks.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
        Input Layer
    .----------------.
    | Feature Input  |
    '----------------'
            ||
  .------------------.
  | Hidden Layer 1   |   &amp;lt;-- Extracts basic patterns
  '------------------'
            ||
  .------------------.
  | Hidden Layer 2   |   &amp;lt;-- Learns deeper features
  '------------------'
            ||
  .------------------.
  | Hidden Layer 3   |   &amp;lt;-- Identifies complex relationships
  '------------------'
            ||
  .------------------.
  | Output Layer     |   &amp;lt;-- Makes final decision/prediction
  '------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Deep Learning: multiple hidden layers allow the network to learn complex patterns.&lt;/p&gt;

&lt;h1&gt;
  
  
  Neural Network
&lt;/h1&gt;

&lt;p&gt;A Neural Network is a computational system inspired by the structure and function of the human brain, and it is the foundation of Deep Learning. It consists of interconnected neurons/nodes that process and learn from data.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        Input Layer
    .----------------.
    | Feature Input  |
    '----------------'
            ||
  .------------------.
  | Hidden Layer 1   |   &amp;lt;-- Extracts basic patterns
  '------------------'
            ||
  .------------------.
  | Hidden Layer 2   |   &amp;lt;-- Learns deeper features
  '------------------'
            ||
  .------------------.
  | Output Layer     |   &amp;lt;-- Makes final decision/prediction
  '------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Neural Networks are the core of Deep Learning, allowing AI to learn from data and improve over time.&lt;/p&gt;

&lt;h1&gt;
  
  
  Natural Language Processing (NLP)
&lt;/h1&gt;

&lt;p&gt;Focuses on enabling communication between humans and machines. NLP combines linguistics and machine learning to help AI process human language in a way that is meaningful and useful.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        User Input
    .----------------.
    |  "Hello, AI!"  |
    '----------------'
            ||
  .--------------------.
  |    Tokenization    |   &amp;lt;-- Splitting text
  '--------------------'
            ||
  .--------------------.
  |    POS Tagging     |   &amp;lt;-- Identifying grammar
  '--------------------'
            ||
  .--------------------.
  | Sentiment Analysis |   &amp;lt;-- Understanding emotion
  '--------------------'
            ||
  .--------------------.
  |    AI Response     |   &amp;lt;-- Generating reply
  '--------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
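A toy sketch of the pipeline above in plain Python; real systems use libraries such as NLTK or spaCy, and the sentiment word lists here are made-up assumptions.

```python
# Toy NLP pipeline sketch: tokenization followed by a naive sentiment score.
# The positive/negative word lists are illustrative assumptions.
text = "Hello, AI!"

# Tokenization: strip punctuation and split into word tokens
tokens = text.replace(",", " ").replace("!", " ").split()
print(tokens)  # ['Hello', 'AI']

# Naive sentiment analysis: count matches against tiny word lists
positive = {"hello", "great", "good"}
negative = {"bad", "terrible", "awful"}
score = sum(t.lower() in positive for t in tokens) - sum(
    t.lower() in negative for t in tokens
)
print("positive" if score > 0 else "negative" if score < 0 else "neutral")
```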



&lt;h1&gt;
  
  
  Computer Vision (CV)
&lt;/h1&gt;

&lt;p&gt;Enables computers to "see" and interpret visual information from the world, much like humans do; i.e., it helps machines interpret the world visually.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        Image Input
    .-----------------.
    |  Captured Image |
    '-----------------'
            ||
  .--------------------.
  |   Preprocessing    |   &amp;lt;-- Noise reduction, resizing
  '--------------------'
            ||
  .--------------------.
  | Feature Extraction |   &amp;lt;-- Identifying edges, textures
  '--------------------'
            ||
  .--------------------.
  |  Object Detection  |   &amp;lt;-- Locating objects in image
  '--------------------'
            ||
  .--------------------.
  |   Classification   |   &amp;lt;-- Labeling objects
  '--------------------'
            ||
  .--------------------.
  | AI Decision Making |   &amp;lt;-- AI response (e.g., stop sign detected)
  '--------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  AI Agent/ Agentic AI
&lt;/h1&gt;

&lt;p&gt;An &lt;strong&gt;AI agent&lt;/strong&gt; is a computer program that can sense its surroundings, think, and take actions to achieve specific goals, similar to a human making choices based on what they see and hear.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      User or Environment
    .--------------------.
    |   Input Data       |  &amp;lt;-- AI gets information (text, images, sound)
    '--------------------'
              ||
    .--------------------.
    |   AI Thinking      |  &amp;lt;-- AI processes &amp;amp; decides what to do
    '--------------------'
              ||
    .--------------------.
    |   AI Takes Action  |  &amp;lt;-- AI responds or does a task
    '--------------------'
              ||
    .--------------------.
    |   AI Learns        |  &amp;lt;-- AI improves over time
    '--------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;On the other hand, &lt;strong&gt;Agentic AI&lt;/strong&gt; is a type of artificial intelligence that is designed to act as an agent i.e. performing tasks or making decisions to solve problems and achieve goals.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
      User Gives a Goal
    .----------------------.
    |   "Plan my trip"     |  &amp;lt;-- AI gets the goal
    '----------------------'
              ||
    .----------------------.
    |   AI Plans Steps     |  &amp;lt;-- AI figures out what needs to be done
    '----------------------'
              ||
    .----------------------.
    |   AI Takes Action    |  &amp;lt;-- AI books tickets, suggests hotels
    '----------------------'
              ||
    .----------------------.
    | AI Learns &amp;amp; Improves |  &amp;lt;-- AI remembers and refines future plans
    '----------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  RAG AI
&lt;/h1&gt;

&lt;p&gt;Retrieval-Augmented Generation (RAG) is an AI technique where, instead of relying only on what it was trained on, the model searches for relevant facts in external knowledge sources (e.g. databases, documents, or the web) and then generates a more accurate answer.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      User Asks a Question
    .----------------------.
    |  "Latest AI trends?" |  &amp;lt;-- User Input
    '----------------------'
              ||
    .----------------------.
    | AI Searches Sources  |  &amp;lt;-- Retrieves updated info
    '----------------------'
              ||
    .----------------------.
    | AI Generates Answer  |  &amp;lt;-- Uses retrieved data
    '----------------------'
              ||
    .----------------------.
    | AI Gives Response    |  &amp;lt;-- Provides accurate answer
    '----------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
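The retrieve-then-generate flow above can be sketched in a few lines of Python; the documents and the word-overlap scoring are toy illustrations, not a real retrieval system.

```python
# Toy sketch of the RAG idea: retrieve the best-matching snippet from a tiny
# "knowledge base" (a dict here), then build the answer around it. The
# documents and scoring are illustrative assumptions, not a real system.
knowledge_base = {
    "trends": "Latest AI trends include agentic AI and multimodal models.",
    "basics": "Machine learning learns patterns from data.",
}

def retrieve(query):
    # Naive retrieval: score each document by word overlap with the query
    q_words = set(query.lower().replace("?", "").split())
    return max(
        knowledge_base.values(),
        key=lambda doc: len(q_words.intersection(doc.lower().rstrip(".").split())),
    )

def generate_answer(query):
    context = retrieve(query)  # step 1: search sources
    return "Based on retrieved info: " + context  # step 2: generate answer

print(generate_answer("What are the latest AI trends?"))
```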



&lt;h1&gt;
  
  
  CAG AI
&lt;/h1&gt;

&lt;p&gt;Unlike RAG, which performs real-time retrieval of documents, Cache-Augmented Generation (CAG) preloads all relevant knowledge into the model's context. This preloaded data is stored in a key-value (KV) cache, allowing the AI to generate responses instantly without the latency or potential errors introduced by retrieval processes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;+-----------------+
|  Start          |
+-----------------+
         |
         v
+-----------------+
|  Define Dataset |  (e.g., HR policies, manuals)
+-----------------+
         |
         v
+-----------------+
|  Process Data   |  (Encode/compress data into cache)
+-----------------+
         |
         v
+-----------------+
|  Preload Cache  |  (Store in model’s context/memory)
+-----------------+
         |
         v
+-----------------+
|  User Asks      |  (e.g., “What’s the vacation policy?”)
|  Question       |
+-----------------+
         |
         v
+-----------------+
|  Generate       |  (Use preloaded cache, no retrieval)
|  Response       |
+-----------------+
         |
         v
+-----------------+
|  End            |
+-----------------+
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
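The flow above can be sketched as follows; the cached HR policies and the dictionary lookup are toy illustrations, not a real KV cache.

```python
# Toy sketch of the CAG idea: all knowledge is preloaded into a cache up
# front, and answers come straight from it with no retrieval step. The
# policies below are illustrative assumptions, not a real KV cache.
preloaded_cache = {
    "vacation policy": "Employees get 20 paid vacation days per year.",
    "remote work": "Remote work is allowed up to 3 days per week.",
}

def answer(question):
    # Generate directly from the preloaded cache (no external retrieval)
    q = question.lower()
    for topic, fact in preloaded_cache.items():
        if topic in q:
            return fact
    return "Not in preloaded knowledge."

print(answer("What's the vacation policy?"))
```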



&lt;h1&gt;
  
  
  Multi-Modal AI
&lt;/h1&gt;

&lt;p&gt;Multimodal AI refers to artificial intelligence systems that can process, understand, and generate outputs using multiple types of data—called "modalities"—such as text, images, audio, video, or even numerical data. Here,&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"Multi" means multiple (more than one).&lt;/li&gt;
&lt;li&gt;"Modal" refers to modes of data (text, images, audio, video).&lt;/li&gt;
&lt;li&gt;So, Multi-Modal AI can process and combine different types of information.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;      User Provides Input
    .----------------------.
    | Text + Image + Audio |  &amp;lt;-- Different data types
    '----------------------'
              ||
    .----------------------.
    |  AI Processes Data   |  &amp;lt;-- Combines all input types
    '----------------------'
              ||
    .----------------------.
    | AI Links Information |  &amp;lt;-- Connects text, image, or sound
    '----------------------'
              ||
    .----------------------.
    | AI Generates Response|  &amp;lt;-- Intelligent answer using all inputs
    '----------------------'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Microsoft Azure provides various AI solutions that can be used to build intelligent applications. Some of the Azure AI solutions are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure AI Services&lt;/li&gt;
&lt;li&gt;Azure Cognitive Services&lt;/li&gt;
&lt;li&gt;Azure Machine Learning&lt;/li&gt;
&lt;li&gt;Azure OpenAI Service&lt;/li&gt;
&lt;li&gt;AI-Powered Analytics with Azure Synapse&lt;/li&gt;
&lt;li&gt;AI Ethics and Responsible AI on Azure&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;AI holds immense potential to drive innovation and solve complex problems, but its success will depend on our ability to navigate its ethical implications and integrate it responsibly into society. Microsoft Azure is a good place to get started.&lt;/p&gt;

</description>
      <category>generativeai</category>
      <category>ragai</category>
      <category>cagai</category>
      <category>multimodal</category>
    </item>
    <item>
      <title>Generative AI: Novice Guide to Transformers</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Fri, 29 Mar 2024 20:55:02 +0000</pubDate>
      <link>https://dev.to/sabahshariq/generative-ai-novice-guide-to-transformers-2gec</link>
      <guid>https://dev.to/sabahshariq/generative-ai-novice-guide-to-transformers-2gec</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;The Transformer model brought a revolutionary change to Natural Language Processing by overcoming the limitations of Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs). It was first introduced in the 2017 paper "Attention Is All You Need" by Vaswani et al. It is a type of neural network capable of understanding the context of sequential data, such as sentences, by analyzing the relationships between the words.&lt;/p&gt;

&lt;h1&gt;
  
  
  Transformers Structure
&lt;/h1&gt;

&lt;p&gt;The transformer architecture consists of two main parts: an encoder and a decoder. The encoder takes a sequence of input tokens and produces a sequence of hidden states. The decoder then takes these hidden states and produces a sequence of output tokens. The encoder and decoder are both made up of a stack of self-attention layers.&lt;/p&gt;

&lt;h1&gt;
  
  
  Self-Attention
&lt;/h1&gt;

&lt;p&gt;Let us start by revisiting what attention means in the NLP universe.&lt;/p&gt;

&lt;p&gt;Attention allowed us to focus on parts of our input sequence while we predicted our output sequence. In simpler terms, self-attention helps us create similar connections but within the same sentence. Look at the following example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was full.” it =&amp;gt; cup&lt;/li&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was empty.” it=&amp;gt; bottle&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By changing one word “&lt;strong&gt;full&lt;/strong&gt;” to “&lt;strong&gt;empty&lt;/strong&gt;” the reference object for “it” changed. If we are translating such a sentence, we will want to know what the word “&lt;strong&gt;it&lt;/strong&gt;” refers to.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgf1a6urfszp7p30oqkso.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgf1a6urfszp7p30oqkso.png" alt="transformers" width="695" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Transformers Model:
&lt;/h1&gt;

&lt;p&gt;In transformers "token" and "vector" are fundamental concepts of how data is represented and processed within the model.&lt;/p&gt;

&lt;h4&gt;
  
  
  Token
&lt;/h4&gt;

&lt;p&gt;In natural language processing, a token refers to a unit of text that has been segmented for processing. For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;At the word level: Each word in a sentence is treated as a separate token.&lt;/li&gt;
&lt;li&gt;At the subword level: Words are broken down into smaller subword units, such as prefixes, suffixes, or stems. This approach is particularly useful for handling morphologically rich languages or dealing with out-of-vocabulary words.&lt;/li&gt;
&lt;li&gt;At the character level: Each character in a word is treated as a separate token.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tokens serve as the basic building blocks of input sequences in transformers. Before processing input data, it is tokenized into these units to represent the textual information in a format suitable for the model.&lt;/p&gt;
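The three granularities above can be sketched in plain Python; note that real subword tokenizers (such as BPE) learn their splits from data, so the subword split below is a hand-made illustration.

```python
# Tokenizing one sentence at word, subword, and character level.
# The subword split is a hand-made illustration, not a learned BPE merge.
sentence = "transformers learn context"

word_tokens = sentence.split()
subword_tokens = ["transform", "##ers", "learn", "context"]  # assumed split
char_tokens = list("learn")

print(word_tokens)  # ['transformers', 'learn', 'context']
print(char_tokens)  # ['l', 'e', 'a', 'r', 'n']
```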

&lt;h4&gt;
  
  
  Vector
&lt;/h4&gt;

&lt;p&gt;A vector refers to a mathematical representation of a token in a high-dimensional space. Each token is associated with a vector which captures its semantic and syntactic properties learned from the training data.&lt;/p&gt;

&lt;p&gt;During the training process, the model learns to map tokens to vectors in such a way that tokens with similar meanings or contexts are represented by similar vectors, facilitating the model's ability to understand and process language effectively.&lt;/p&gt;
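The idea that similar tokens get similar vectors can be illustrated with cosine similarity; the three-dimensional vectors below are made-up numbers, since real embeddings have hundreds of learned dimensions.

```python
# Toy word vectors (made-up numbers) and cosine similarity, illustrating
# "tokens with similar meanings are represented by similar vectors".
import math

vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.88, 0.82, 0.12],
    "apple": [0.10, 0.05, 0.95],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine(vectors["king"], vectors["queen"]))  # close to 1 (similar)
print(cosine(vectors["king"], vectors["apple"]))  # much smaller (dissimilar)
```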

&lt;p&gt;These token embeddings are the input to the transformer model. The model processes these embeddings through multiple layers of attention mechanisms and feedforward neural networks to generate contextualized representations of the tokens, capturing the relationships between them within the input sequence.&lt;/p&gt;

&lt;h1&gt;
  
  
  How Transformers Work:
&lt;/h1&gt;

&lt;h4&gt;
  
  
  Word embeddings
&lt;/h4&gt;

&lt;p&gt;Word embeddings in transformers are a way to represent words in a numerical form that a machine learning model can understand. Think of it like translating words into a language that computers speak.&lt;/p&gt;

&lt;h4&gt;
  
  
  Position embeddings
&lt;/h4&gt;

&lt;p&gt;Position embeddings in transformers are a crucial component that helps the model understand the sequential order of tokens within an input sequence. In natural language processing tasks, such as text generation or language translation, the order of words or tokens in a sentence carries important semantic and syntactic information.&lt;/p&gt;

&lt;h4&gt;
  
  
  Encoder
&lt;/h4&gt;

&lt;p&gt;The encoder is like a smart detective that reads and understands the input sequence. Its job is to take in a sequence of tokens (words or subwords) and convert each token into a rich representation called a context vector. The encoder does this by processing the tokens through multiple layers of attention mechanisms and feedforward neural networks. These context vectors capture the meaning and context of each token in the input sequence.&lt;/p&gt;

&lt;h4&gt;
  
  
  Decoder
&lt;/h4&gt;

&lt;p&gt;The decoder is like a creative storyteller that uses the context vectors provided by the encoder to generate an output sequence. It takes the context vectors and produces tokens one by one, using them to guide its generation process. At each step, the decoder pays attention to the context vectors produced by the encoder to ensure that the generated tokens are relevant and coherent with the input sequence. This allows the decoder to generate meaningful output sequences based on the understanding provided by the encoder.&lt;/p&gt;

&lt;h4&gt;
  
  
  Attention
&lt;/h4&gt;

&lt;p&gt;Attention helps the model to understand the context of a word by considering words that go before and after it.&lt;/p&gt;
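The mechanism behind this is scaled dot-product attention, which can be sketched in plain Python; the query, key, and value numbers below are toy values chosen for illustration.

```python
# Scaled dot-product attention sketch: a query scores each key, the scores
# become weights via softmax, and the output is a weighted mix of values.
import math

def softmax(xs):
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    d = len(query)
    # Similarity of the query with every key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)  # attention weights sum to 1
    # Output: weighted average of the value vectors
    return [
        sum(w * v[i] for w, v in zip(weights, values))
        for i in range(len(values[0]))
    ]

# One query attending over three tokens' keys and values (toy numbers)
out = attention(
    query=[1.0, 0.0],
    keys=[[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],
    values=[[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],
)
print(out)
```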

&lt;h4&gt;
  
  
  Training
&lt;/h4&gt;

&lt;p&gt;Transformer models are trained using supervised learning, where they learn to minimize a loss function that quantifies the difference between the model's predictions and the ground truth for the given task.&lt;/p&gt;

&lt;h1&gt;
  
  
  Where are transformers used?
&lt;/h1&gt;

&lt;p&gt;Earlier we saw that transformers are used for natural language processing tasks. However, they are now also used in fields like computer vision and speech recognition. Some examples are: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Computer vision&lt;/li&gt;
&lt;li&gt;Speech recognition&lt;/li&gt;
&lt;li&gt;Question answering&lt;/li&gt;
&lt;li&gt;Text classification&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Conclusion:
&lt;/h1&gt;

&lt;p&gt;The Transformers architecture is poised to significantly advance the capabilities and applications of Generative AI, pushing the boundaries of what machines can create and how they assist in the creative process.&lt;/p&gt;

</description>
      <category>generativeai</category>
      <category>transformers</category>
      <category>machinelearning</category>
      <category>nlp</category>
    </item>
    <item>
      <title>A Novice Guide to Azure AI Search</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Thu, 28 Mar 2024 22:30:47 +0000</pubDate>
      <link>https://dev.to/sabahshariq/a-novice-guide-to-azure-ai-search-48dj</link>
      <guid>https://dev.to/sabahshariq/a-novice-guide-to-azure-ai-search-48dj</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;Azure AI Search provides an information retrieval service over a large amount of searchable data, for example document search and data exploration. You can use this service to enable powerful search capabilities within applications and solutions, such as personalization, insight discovery, e-commerce search, and healthcare information retrieval.&lt;/p&gt;

&lt;p&gt;In Azure AI Search, there are two terms you need to know:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Indexing&lt;/li&gt;
&lt;li&gt;Querying&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Indexing:
&lt;/h4&gt;

&lt;p&gt;The purpose of this process is to load your dataset into the Azure AI Search service you created and make it searchable. The data format of an index is JSON, so you need to convert your data to JSON or use an indexer to retrieve and serialize your data into JSON. An index is like a table that contains fields and their data, and each row in the index is a set of values called a document.&lt;/p&gt;
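To make the "row is a JSON document" idea concrete, here is a sketch of what one document might look like; the field names and values are illustrative, loosely modeled on a hotels dataset.

```python
# A sketch of one document (a "row") in JSON form as it might appear in an
# index; field names and values are illustrative assumptions.
import json

document = {
    "HotelId": "1",  # document key (always a string)
    "HotelName": "Stay-Kay City Hotel",
    "Category": "Boutique",
    "Rating": 3.6,
}

print(json.dumps(document, indent=2))
```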

&lt;h4&gt;
  
  
  Querying:
&lt;/h4&gt;

&lt;p&gt;You use a query to search data based on search criteria, such as retrieving searchable content and the documents associated with the search terms.&lt;/p&gt;

&lt;p&gt;Azure AI Search provides full-text search and keyword search, and you can combine both to get the most relevant results. Let's create our first search index and test some queries to learn how Azure AI Search works.&lt;/p&gt;
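For illustration, a query against an index can be sketched with the Azure AI Search Python SDK (azure-search-documents); the endpoint, API key, and printed field are placeholders and assumptions, not values from this walkthrough.

```python
# Hedged sketch of querying an index with the azure-search-documents SDK.
# Endpoint, API key, and index name are placeholders, not real values.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://<SERVICE_NAME>.search.windows.net",
    index_name="hotels-sample-index",
    credential=AzureKeyCredential("<API_KEY>"),
)

# Full-text search over the searchable fields of every document
results = client.search(search_text="wifi")
for doc in results:
    print(doc["HotelName"])  # assumes the index has a HotelName field
```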

&lt;h1&gt;
  
  
  Create a search index:
&lt;/h1&gt;

&lt;p&gt;In order to create a search index, we need to follow these steps. We will be using the free tier to create a service and a data source connection to sample data hosted by Microsoft on Azure Cosmos DB.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 01
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Go to Azure AI Search and create a resource and name it, for example: FirstAISearch&lt;/li&gt;
&lt;li&gt;Create a search service and name it, for example: firstaisearchservice. Please note that the service name should be all lowercase.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once the service is created successfully, you will see the following message.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3jxqfkuqw9vcy7aamnro.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3jxqfkuqw9vcy7aamnro.png" alt="create resource" width="800" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 02
&lt;/h4&gt;

&lt;p&gt;Now go to the overview menu on the left-hand side and select the "Usage" tab to see the status of the service, such as the indexes, indexers, and data sources you already have. Since we are using the free tier, it is limited to three indexes, three data sources, and three indexers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6kauhrvh47qimw6jf34.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6kauhrvh47qimw6jf34.png" alt="resource create1" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vbp0cpzxqvdg9eubsaq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3vbp0cpzxqvdg9eubsaq.png" alt="resource create2" width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 03
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;On the "Overview" page in the top menu there is option call "Import Data"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs28p666ajwaw6scphvui.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs28p666ajwaw6scphvui.png" alt="overview3" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;On Connect to your data, select the data source "Samples" from dropdown list.&lt;/li&gt;
&lt;li&gt;In the list of built-in samples, select hotels-sample.&lt;/li&gt;
&lt;li&gt;Select Next: Add cognitive skills (Optional) to continue.&lt;/li&gt;
&lt;li&gt;For now, select Skip to: Customize target index to continue.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fthzbqv5aozgpxovh4fmk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fthzbqv5aozgpxovh4fmk.png" alt="step31" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ww0mr2ei1t2jmcmd7og.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ww0mr2ei1t2jmcmd7og.png" alt="step32" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft243i95bd9rwu12qrf3w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft243i95bd9rwu12qrf3w.png" alt="step33" width="800" height="586"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 04
&lt;/h4&gt;

&lt;p&gt;The search service infers a schema for the built-in hotels-sample index. Follow these steps to configure the index:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keep the index name "hotels-sample-index" and the key field "HotelId" as they are.&lt;/li&gt;
&lt;li&gt;Keep the values for all field attributes as they are.&lt;/li&gt;
&lt;li&gt;Select Next: Create an indexer to continue.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5er2bcos2an5jnf0gts1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5er2bcos2an5jnf0gts1.png" alt="step4" width="800" height="616"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Creating the index requires an index name and a collection of fields. One field must be marked as the document key to uniquely identify each document, and its value is always a string.&lt;/p&gt;

&lt;p&gt;Each field has a name, data type, and attributes that control how to use the field in the search index. Checkboxes enable or disable the following attributes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Retrievable: Fields returned in a query response.&lt;/li&gt;
&lt;li&gt;Filterable: Fields that accept a filter expression.&lt;/li&gt;
&lt;li&gt;Sortable: Fields that accept an $orderby expression.&lt;/li&gt;
&lt;li&gt;Facetable: Fields used in a faceted navigation structure.&lt;/li&gt;
&lt;li&gt;Searchable: Fields used in full text search. Strings are searchable. Numeric fields and Boolean fields are often marked as not searchable.&lt;/li&gt;
&lt;/ul&gt;
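To make these attributes concrete, here is a small self-contained Python sketch (an illustration only, not the Azure SDK; the field names mirror the hotels-sample index, the ratings are made up) of how the filterable and sortable attributes gate what a query may do:

```python
# Hypothetical in-memory model of index field attributes.
FIELDS = {
    "HotelId":   {"retrievable": True, "filterable": True,  "sortable": False, "searchable": True},
    "HotelName": {"retrievable": True, "filterable": False, "sortable": True,  "searchable": True},
    "Rating":    {"retrievable": True, "filterable": True,  "sortable": True,  "searchable": False},
}

DOCS = [
    {"HotelId": "15", "HotelName": "Peaceful Market Hotel", "Rating": 4.5},
    {"HotelId": "22", "HotelName": "Stay-Kay City Hotel", "Rating": 3.1},
]

def query(docs, filter_field=None, filter_value=None, order_by=None):
    """Reject operations on fields whose attributes do not allow them."""
    if filter_field is not None:
        if not FIELDS[filter_field]["filterable"]:
            raise ValueError(filter_field + " is not filterable")
        docs = [d for d in docs if d[filter_field] == filter_value]
    if order_by is not None:
        if not FIELDS[order_by]["sortable"]:
            raise ValueError(order_by + " is not sortable")
        docs = sorted(docs, key=lambda d: d[order_by])
    return docs

# Filtering on HotelId is allowed; sorting on it is not.
result = query(DOCS, filter_field="HotelId", filter_value="15")
```

Running `query(DOCS, order_by="HotelId")` raises an error, just as the real service rejects an $orderby on a field that is not marked sortable.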

&lt;h4&gt;
  
  
  Step 05
&lt;/h4&gt;

&lt;p&gt;Now we will configure the indexer name and its run schedule. Since we are just getting started, keep the default settings and select "Submit".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjy57wai0d7i4cer3j9u4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjy57wai0d7i4cer3j9u4.png" alt="step51" width="800" height="609"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq668m55g9oe5b8qhu0ze.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq668m55g9oe5b8qhu0ze.png" alt="step52" width="800" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5t9cpoxr8fm9felqbhxd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5t9cpoxr8fm9felqbhxd.png" alt="step53" width="800" height="613"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 06
&lt;/h4&gt;

&lt;p&gt;Go to "Indexers" from the left hand side menu and you will see that index cration status with document count in it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1380vlw0cp05wmdqpdt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj1380vlw0cp05wmdqpdt.png" alt="step6" width="800" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 07
&lt;/h4&gt;

&lt;p&gt;Go to "Indexes" from the left hand side menu and you will see that index document count and storage size.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqx0duzeonj9dybkzd51w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqx0duzeonj9dybkzd51w.png" alt="step7" width="800" height="290"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Querying with Search explorer
&lt;/h1&gt;

&lt;h4&gt;
  
  
  Step 01:
&lt;/h4&gt;

&lt;p&gt;On the Search explorer tab, enter text to search on, for example: "new york hotel near Times Square in the Chelsea neighborhood", then select "Search". You will see the results.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxots348e45qrnmhhbn4j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxots348e45qrnmhhbn4j.png" alt="query search" width="800" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The query output in JSON looks like the example below. Notice that the output contains the search terms.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "@odata.context": "https://firstaisearchservice.search.windows.net/indexes('hotels-sample-index')/$metadata#docs(*)",
  "value": [
    {
      "@search.score": 39.42806,
      "HotelId": "15",
      "HotelName": "Peaceful Market Hotel &amp;amp; Spa",
      "Description": "Book now and Save up to 30%.  Central location. Steps from Empire State Building &amp;amp; Times Square, in Chelsea neighborhood. Brand new rooms. Impeccable service.",
      "Description_fr": "Réservez dès maintenant et économisez jusqu'à 30%.  Emplacement central. A quelques pas de l'Empire State Building &amp;amp; Times Square, dans le quartier de Chelsea. Chambres flambant neuves. Service impeccable.",
      "Category": "Resort and Spa",
      "Tags": [
        "continental breakfast",
        "restaurant",
        "view"
      ]
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
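Once you have the raw JSON response, a few lines of Python (shown here against a trimmed copy of the response above; no Azure SDK involved) are enough to pull out the relevance score and hotel name of each hit:

```python
import json

# A trimmed copy of the Search explorer response shown above.
raw = """
{
  "value": [
    {
      "@search.score": 39.42806,
      "HotelId": "15",
      "HotelName": "Peaceful Market Hotel",
      "Category": "Resort and Spa"
    }
  ]
}
"""

response = json.loads(raw)

# Each hit carries its relevance score under the "@search.score" key.
hits = [(doc["@search.score"], doc["HotelName"]) for doc in response["value"]]
best_score, best_name = max(hits)
```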



&lt;h4&gt;
  
  
  Clean up resources
&lt;/h4&gt;

&lt;p&gt;It's a good idea at the end of a project to identify whether you still need the resources you created. Resources left running can cost you money. You can delete resources individually or delete the resource group to delete the entire set of resources.&lt;/p&gt;

&lt;h4&gt;
  
  
  Conclusion
&lt;/h4&gt;

&lt;p&gt;Overall, the purpose of using Azure AI Search is to deliver intelligent, relevant, and efficient search experiences that drive user engagement, unlock insights, and add value to applications and solutions across various industries and domains.&lt;/p&gt;

&lt;h4&gt;
  
  
  Sample Dataset
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://github.com/Azure-Samples/azure-search-sample-data"&gt;Azure AI Search Sample Data&lt;/a&gt;&lt;/p&gt;

</description>
      <category>generativeai</category>
      <category>index</category>
      <category>machinelearning</category>
      <category>azure</category>
    </item>
    <item>
      <title>A Novice Guide to Large Language Model (LLM)</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Mon, 25 Mar 2024 20:54:49 +0000</pubDate>
      <link>https://dev.to/sabahshariq/a-novice-guide-to-large-language-model-llm-341l</link>
      <guid>https://dev.to/sabahshariq/a-novice-guide-to-large-language-model-llm-341l</guid>
      <description>&lt;h1&gt;
  
  
  What is Large Language Model?
&lt;/h1&gt;

&lt;p&gt;A large language model is an artificial intelligence (AI) model that is trained to understand and generate human language text. For example, as humans, when we want to express something we use a sentence: a collection of words put together one after another. To a computer, however, that sentence is just a string of characters arranged in a particular order.&lt;/p&gt;

&lt;p&gt;In LLM, the term “Large” refers to the number of parameters that were used to train the model. Based on the data it is fed and trained on, it possesses the capability to provide answers and solutions to questions and problems. The larger the model, the more data it can process and the more complex patterns it can learn. Large Language Models are built on the Transformer architecture.&lt;/p&gt;

&lt;h4&gt;
  
  
  Example:
&lt;/h4&gt;

&lt;p&gt;An AI system using large language models can learn from a database of short stories and then use that knowledge to generate new short stories, for example ones that begin with "Once upon a time".&lt;/p&gt;
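A toy sketch of that idea: the bigram (Markov-chain) model below "learns" which word follows which from a few made-up story openings and then generates a new opening. This is nothing like a real LLM in scale or architecture, but it captures the learn-then-generate loop.

```python
import random

# Toy "training data": a few story openings.
stories = [
    "once upon a time there was a princess",
    "once upon a time there lived a dragon",
    "once there was a small village",
]

# Learn which word tends to follow which (a bigram model -- a crude
# stand-in for what an LLM learns at vastly larger scale).
follows = {}
for story in stories:
    words = story.split()
    for prev, nxt in zip(words, words[1:]):
        follows.setdefault(prev, []).append(nxt)

def generate(start, length=8, seed=0):
    """Generate text by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return " ".join(out)

sentence = generate("once")
```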

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapsmezm718l8asumrdat.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapsmezm718l8asumrdat.png" alt="LLM_Model" width="635" height="379"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Pre-trained Transformers:
&lt;/h1&gt;

&lt;p&gt;This means that the model’s wealth of knowledge is limited to the vast amount of data it has been trained with. If some data was not fed to the model, then it is not known to the model and therefore it cannot generate content around it or answer the question.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyvcedhf5u28pgt7jjd44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyvcedhf5u28pgt7jjd44.png" alt="Attention Image" width="695" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Transformers:
&lt;/h1&gt;

&lt;p&gt;Transformers are a type of neural network capable of understanding the context of sequential data, such as sentences, by analyzing the relationships between the words.&lt;/p&gt;

&lt;p&gt;The Transformer in NLP is a novel architecture (ANN) that aims to solve sequence-to-sequence tasks while handling long-range dependencies with ease. It relies entirely on "self-attention" to compute representations of its input and output WITHOUT using sequence-aligned RNNs or convolution.&lt;/p&gt;

&lt;h1&gt;
  
  
  Self-Attention:
&lt;/h1&gt;

&lt;p&gt;Let us start by revisiting what attention means in the NLP universe.&lt;/p&gt;

&lt;p&gt;Attention allowed us to focus on parts of our input sequence while we predicted our output sequence. In simpler terms, self-attention helps us create similar connections but within the same sentence. Look at the following example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was full.” it =&amp;gt; cup&lt;/li&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was empty.” it=&amp;gt; bottle&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By changing one word “&lt;strong&gt;full&lt;/strong&gt;” to “&lt;strong&gt;empty&lt;/strong&gt;” the reference object for “it” changed. If we are translating such a sentence, we will want to know what the word “&lt;strong&gt;it&lt;/strong&gt;” refers to.&lt;/p&gt;
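The intuition can be sketched numerically. The pure-Python example below (hand-picked 2-D vectors purely for illustration, not a trained model) computes scaled dot-product attention weights for the word "it" over candidate words; changing the query vector, as "full" versus "empty" would, changes which word receives the most weight:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Hand-picked 2-D "embeddings" for illustration only.
keys = {"bottle": [1.0, 0.0], "cup": [0.0, 1.0], "water": [0.5, 0.5]}

def attention(query):
    """Scaled dot-product attention of one query over all keys."""
    d = len(query)
    words = list(keys)
    scores = [sum(q * k for q, k in zip(query, keys[w])) / math.sqrt(d)
              for w in words]
    weights = softmax(scores)
    return dict(zip(words, weights))

# "it ... full"  -> a query vector leaning toward "cup"
w_full = attention([0.1, 2.0])
# "it ... empty" -> a query vector leaning toward "bottle"
w_empty = attention([2.0, 0.1])
```

With these made-up vectors, "cup" gets the highest weight in the first case and "bottle" in the second, mirroring how the referent of "it" shifts.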

&lt;p&gt;As discussed above, LLMs provide answers and solutions to questions and problems. How? By using "prompts".&lt;/p&gt;

&lt;h1&gt;
  
  
  Prompts
&lt;/h1&gt;

&lt;p&gt;Prompts are a set of instructions given to the LLM by a human in order to obtain a solution, guide its behavior, or generate desired outputs.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;Large language models are deep learning models that can be used for everything from generating creative content to aiding in language translation, summarization, and much more. LLMs showcase the incredible potential of machine learning and deep neural networks.&lt;/p&gt;

&lt;p&gt;Image Reference:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.wisecube.ai/blog/a-comprehensive-overview-of-large-language-models/"&gt;Transformers&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>generativeai</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>nlp</category>
    </item>
    <item>
      <title>A Novice Guide to Generative AI</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Sun, 03 Sep 2023 13:19:49 +0000</pubDate>
      <link>https://dev.to/sabahshariq/a-novice-guide-to-generative-ai-2270</link>
      <guid>https://dev.to/sabahshariq/a-novice-guide-to-generative-ai-2270</guid>
      <description>&lt;h1&gt;
  
  
  What is Generative AI?
&lt;/h1&gt;

&lt;p&gt;Generative AI is a subfield of artificial intelligence (AI) that involves creating algorithms that can generate new content such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An email response&lt;/li&gt;
&lt;li&gt;A short story&lt;/li&gt;
&lt;li&gt;A simple image&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Example:
&lt;/h4&gt;

&lt;p&gt;Google search has an auto-completion feature. Think of it as a very basic implementation of Generative AI, where query suggestions are generated by the model based on the terms you type in the search bar. This model was trained on the zillions of queries searched by users across the world to generate those suggestions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qjVsu9ts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dla9h7cioc59n70kcq7x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qjVsu9ts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dla9h7cioc59n70kcq7x.png" alt="Generative AI banner" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Generative AI in details:
&lt;/h1&gt;

&lt;p&gt;Generative AI is powered by &lt;strong&gt;GPT&lt;/strong&gt;, or &lt;strong&gt;Generative Pre-trained Transformers&lt;/strong&gt;, which takes this “&lt;strong&gt;generate&lt;/strong&gt;” capability further by several levels. GPTs are a family of "&lt;strong&gt;large-language models&lt;/strong&gt;" trained as "&lt;strong&gt;artificial neural networks&lt;/strong&gt;" using the "&lt;strong&gt;transformer architecture&lt;/strong&gt;" on "&lt;strong&gt;massive un-labelled text data&lt;/strong&gt;" to generate human-like text responses.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3kbk23Ut--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tfzqii4j8agxtezvznei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3kbk23Ut--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tfzqii4j8agxtezvznei.png" alt="GPT flow" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have a very basic understanding of the different jargons let us dig a bit deeper.&lt;/p&gt;

&lt;h1&gt;
  
  
  Large Language Model:
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;What are Large Language Models and how are they defined?&lt;/li&gt;
&lt;li&gt;How large is a Large Language Model?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The term “Large” here refers to the number of parameters that were used to train the model. Here’s a summary of the versions of GPT models released so far along with the parameter count.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yf-zPN8b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8kpa5ossztk25gltae4v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yf-zPN8b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8kpa5ossztk25gltae4v.png" alt="LLM train counts" width="800" height="123"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Pre-trained Transformers:
&lt;/h1&gt;

&lt;p&gt;This means that the model’s wealth of knowledge is limited to the vast amount of data it has been trained with. If some data was not fed to the model, then it is not known to the model and therefore it cannot generate content around it or answer the question.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m-jZNRMD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4vlj11nmvthwtj5wo1hh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m-jZNRMD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4vlj11nmvthwtj5wo1hh.png" alt="Transformer path" width="695" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Transformers:
&lt;/h1&gt;

&lt;p&gt;Transformers are a type of neural network capable of understanding the context of sequential data, such as sentences, by analyzing the relationships between the words.&lt;/p&gt;

&lt;p&gt;The Transformer in NLP is a novel architecture (ANN) that aims to solve sequence-to-sequence tasks while handling long-range dependencies with ease. It relies entirely on "self-attention" to compute representations of its input and output WITHOUT using sequence-aligned RNNs or convolution.&lt;/p&gt;

&lt;h1&gt;
  
  
  Self-Attention:
&lt;/h1&gt;

&lt;p&gt;Let us start by revisiting what attention means in the NLP universe.&lt;/p&gt;

&lt;p&gt;Attention allowed us to focus on parts of our input sequence while we predicted our output sequence. In simpler terms, self-attention helps us create similar connections but within the same sentence. Look at the following example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was full.” it =&amp;gt; cup&lt;/li&gt;
&lt;li&gt;“I poured water from the bottle into the cup until it was empty.” it=&amp;gt; bottle&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By changing one word “&lt;strong&gt;full&lt;/strong&gt;” to “&lt;strong&gt;empty&lt;/strong&gt;” the reference object for “it” changed. If we are translating such a sentence, we will want to know what the word “&lt;strong&gt;it&lt;/strong&gt;” refers to.&lt;/p&gt;

&lt;h1&gt;
  
  
  How does generative AI work?
&lt;/h1&gt;

&lt;p&gt;The most common way to train a generative AI model is to use supervised learning - here the model is given a set of human-created content and corresponding labels. It then learns to generate content that is similar to the human-created content and labeled with the same labels.&lt;/p&gt;
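A minimal sketch of that supervised idea, using a tiny 1-nearest-neighbor "model" over labeled snippets (the snippets and labels below are made up, and real generative models are trained very differently in scale and method):

```python
# Tiny labeled dataset: human-created snippets with labels.
examples = [
    ("once upon a time", "story"),
    ("dear sir or madam", "email"),
    ("in a land far away", "story"),
    ("best regards and thanks", "email"),
]

def features(text):
    """Represent a text by its set of words."""
    return set(text.split())

def predict(text):
    """Label new content by its most similar labeled example
    (1-NN on word overlap) -- learning from labeled content."""
    f = features(text)
    best = max(examples, key=lambda ex: len(f.intersection(features(ex[0]))))
    return best[1]

label = predict("once upon a midnight")
```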

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--T4iKaFXT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nommylxabm0pn8eadikk.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--T4iKaFXT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nommylxabm0pn8eadikk.jpg" alt="AI Brain" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Applications of Generative AI?
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Language: Marketing Content, Code Development, Essay Writing&lt;/li&gt;
&lt;li&gt;Visual: Image/Video Generation, Design, 3D Modelling&lt;/li&gt;
&lt;li&gt;Music: Music/Voice Generation &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--X9iwws5G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/od1eo04eoz90vr1lv6ea.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--X9iwws5G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/od1eo04eoz90vr1lv6ea.png" alt="difference" width="793" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Traditional AI vs. Generative AI:
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Traditional AI is focused on detecting patterns, generating insights, automation, and prediction.&lt;/li&gt;
&lt;li&gt;Generative AI starts with a prompt that lets a user submit a question along with any relevant data to guide content generation. &lt;/li&gt;
&lt;li&gt;Traditional AI algorithms process data and return expected results, such as analyses or predictions.&lt;/li&gt;
&lt;li&gt;Generative AI algorithms produce newly synthesized content, like text or images, based on training from existing data.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Model Setup:
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;One reason Generative AI became so popular so quickly is that it empowers the end user. Right now, anyone can log on to ChatGPT and start using it, which is a first for an AI application. Zero barrier to entry.&lt;/li&gt;
&lt;li&gt;Traditional AI, by contrast, necessitates rigorous data preparation and processes to develop and test a model designed to produce a good outcome.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So, with generative AI, you can just start talking to it, and it will understand what you want to do and offer a response.&lt;/p&gt;

&lt;h1&gt;
  
  
  Example:
&lt;/h1&gt;

&lt;p&gt;ChatGPT, DALL-E, and Bard are examples of generative AI applications that produce text or images based on user-given prompts or dialogue.&lt;/p&gt;

&lt;h1&gt;
  
  
  Applications by Industry:
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Automotive Industry: Synthetic data produced by AI can run simulations and train autonomous vehicles. &lt;/li&gt;
&lt;li&gt;Media and Entertainment: AI can be used to quickly, easily, and more cheaply generate content, or (as a tool) to enhance the work of creatives like writers and designers.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Image Reference:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://echofish.io/traditional-ai-vs-generative-ai/"&gt;Traditional AI vs. Generative AI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.wisecube.ai/blog/a-comprehensive-overview-of-large-language-models/"&gt;Transformers&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/cloudfordummies/a-dummies-introduction-to-generative-ai-2f0c0a5e0e9f"&gt;GPT Tarining data&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/pulse/generative-ai-art-prompt-engineering-jesmary-k-j/"&gt;Prompt Engineering&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://medium.com/javarevisited/how-to-train-a-gpt-model-a-comprehensive-guide-cd77d8db2693"&gt;GPT&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>generativeai</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>nlp</category>
    </item>
    <item>
      <title>Visual Studio: .pdb File</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Sun, 12 Mar 2023 17:48:12 +0000</pubDate>
      <link>https://dev.to/sabahshariq/visual-studio-pdb-file-14n0</link>
      <guid>https://dev.to/sabahshariq/visual-studio-pdb-file-14n0</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q_jsogAB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m01tyeweng9vkh784lmu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q_jsogAB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m01tyeweng9vkh784lmu.png" alt="Visual Studio Logo" width="215" height="182"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A file with the &lt;strong&gt;.pdb&lt;/strong&gt; file extension is a program database file that's used to hold debugging information about a program, like a &lt;em&gt;(dll or .exe)&lt;/em&gt; file. These are also called symbol files.&lt;/p&gt;

&lt;p&gt;When we compile our code in &lt;strong&gt;Visual Studio&lt;/strong&gt;, it creates a &lt;em&gt;.pdb&lt;/em&gt; file alongside the executable file, which contains information about the location of each function and variable in the source code, as well as information about the compiled code, such as instructions and memory layout.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;During debugging, the debugger uses the .pdb file to map the compiled code to the original source code, enabling us to step through code, set breakpoints, inspect variables, and view call stacks. Without a .pdb file, the debugger cannot provide detailed information about our code during debugging.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wyQXpC0r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ndojecdmn26zzr5k5oy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wyQXpC0r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ndojecdmn26zzr5k5oy.png" alt="Debug Image" width="736" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Remember, &lt;strong&gt;.pdb files are as important as source code&lt;/strong&gt;. A build done on our development machine is a private build; a build done on a build machine is a public build. Without the correct PDB files, debugging is nearly impossible because the debugger will not find the information it needs.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;For more information:&lt;/em&gt; &lt;a href="https://learn.microsoft.com/en-us/visualstudio/debugger/specify-symbol-dot-pdb-and-source-files-in-the-visual-studio-debugger?view=vs-2022"&gt;Specify symbol (.pdb) and source files in the Visual Studio debugger (C#, C++, Visual Basic, F#)&lt;/a&gt;&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>visualstudio</category>
      <category>debug</category>
      <category>devops</category>
    </item>
    <item>
      <title>Visual Studio: Build, Rebuild, Clean and Batch Build</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Sun, 12 Mar 2023 16:45:55 +0000</pubDate>
      <link>https://dev.to/sabahshariq/visual-studio-build-rebuild-clean-and-batch-build-2a17</link>
      <guid>https://dev.to/sabahshariq/visual-studio-build-rebuild-clean-and-batch-build-2a17</guid>
      <description>&lt;h4&gt;
  
  
  Introduction
&lt;/h4&gt;

&lt;p&gt;In this blog post we will look into the &lt;strong&gt;build configuration&lt;/strong&gt; of our &lt;em&gt;project&lt;/em&gt; or &lt;em&gt;solution&lt;/em&gt; in Visual Studio application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkpq78szvzjeqn4pngepa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkpq78szvzjeqn4pngepa.png" alt="Visual Studio Logo"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Build Configuration
&lt;/h4&gt;

&lt;p&gt;The Build Solution command always performs an incremental build, i.e. it only builds those files &lt;em&gt;(dll and .exe)&lt;/em&gt; which have changed.&lt;/p&gt;

&lt;p&gt;When we develop our .NET application in Visual Studio we see following build configuration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Debug&lt;/li&gt;
&lt;li&gt;Release
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu2xjlv4vuwmwzwgpeciu.png" alt="VS Build Config"&gt;
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fey86zg3l3uejft6vxul8.png" alt="Debug Release Folder"&gt;
&lt;em&gt;Now, if we look into more details...&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Debug
&lt;/h4&gt;

&lt;p&gt;Debug mode allows the developer to break the execution of the program and step through the code. The compiled &lt;em&gt;dll&lt;/em&gt; in debug mode is not optimized; therefore, the size of the &lt;em&gt;dll&lt;/em&gt; file in debug mode is comparatively larger than in release mode. Additional instructions, such as &lt;em&gt;symbolic information&lt;/em&gt;, are added to enable the developer to set a breakpoint on every source code line. For example, the following code block will only execute in debug mode; in release mode it will be ignored.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#if DEBUG  
Console.WriteLine("Hello! This is Debug Mode");  
#endif 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Release
&lt;/h4&gt;

&lt;p&gt;This mode is for the final deployment of the source code to a production server. A release mode &lt;em&gt;dll&lt;/em&gt; file contains optimized code and is generated for the end user. Debugging symbol information is also omitted from the &lt;em&gt;dll&lt;/em&gt; file in this mode.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faads500ia3n107fp47bn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faads500ia3n107fp47bn.png" alt="Solution Structure"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Rebuild Solution
&lt;/h4&gt;

&lt;p&gt;Rebuild Solution will delete all the compiled files &lt;em&gt;(dll and .exe)&lt;/em&gt; from the &lt;em&gt;bin and obj&lt;/em&gt; folders and perform a build from scratch, irrespective of whether the files were modified. Please note that "Rebuild" cleans and then builds each project, one at a time, rather than cleaning all and then building all projects.&lt;/p&gt;

&lt;h4&gt;
  
  
  Clean Solution
&lt;/h4&gt;

&lt;p&gt;Clean Solution will delete all the compiled files &lt;em&gt;(dll and .exe)&lt;/em&gt; from the &lt;em&gt;bin and obj&lt;/em&gt; directories. You may notice that not all files are actually removed from the project folder; this happens because &lt;em&gt;clean&lt;/em&gt; only removes files that are associated with a build and nothing else.&lt;/p&gt;

&lt;h4&gt;
  
  
  Build vs. Rebuild vs. Clean
&lt;/h4&gt;

&lt;p&gt;The difference lies in what happens in each build configuration for every project.&lt;/p&gt;

&lt;p&gt;For example, if the solution has two projects: Project A &amp;amp; Project B and you perform rebuild operation, it will take Project A, clean the compiled files for Project A, and build it. After that, it will take Project B, clean the files for Project B, and build it.&lt;/p&gt;

&lt;p&gt;On the other hand, if you do a clean and then build separately, it will first clean all compiled files &lt;em&gt;(dlls and .exe)&lt;/em&gt; for both the projects – Project A &amp;amp; Project B and then it will build Project A first then Project B.&lt;/p&gt;
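The incremental-versus-rebuild distinction can be sketched with timestamps: an incremental build compiles only sources that are newer than their outputs, while a rebuild ignores timestamps entirely. The Python below is a simplified model of that check (the file names and times are made up; it is not what MSBuild literally does):

```python
# Simplified model: a source needs building only if its output is
# missing or older than the source.
source_mtime = {"A.cs": 200, "B.cs": 100}
output_mtime = {"A.dll": 250}  # A.dll is up to date; B.dll was never built

def needs_build(src, out):
    return out not in output_mtime or source_mtime[src] > output_mtime[out]

PAIRS = [("A.cs", "A.dll"), ("B.cs", "B.dll")]

def incremental_build():
    """Build only the out-of-date sources (the Build command)."""
    return [s for s, o in PAIRS if needs_build(s, o)]

def rebuild():
    """Clean first, then build everything (the Rebuild command)."""
    output_mtime.clear()
    return [s for s, o in PAIRS]

built = incremental_build()  # only B.cs is out of date
```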

&lt;h4&gt;
  
  
  Batch Build
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fovzuzk31fhyc8yzt1lef.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fovzuzk31fhyc8yzt1lef.png" alt="Batch Build"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When we build our project in Visual Studio, we can only build it in one configuration mode at a time, like &lt;em&gt;Debug&lt;/em&gt; or &lt;em&gt;Release&lt;/em&gt;. So, if we want to &lt;strong&gt;build our project in multiple configurations&lt;/strong&gt;, i.e. &lt;em&gt;Debug and Release&lt;/em&gt;, or &lt;strong&gt;build the project for multiple platforms&lt;/strong&gt;, such as &lt;em&gt;x86 and x64&lt;/em&gt;, &lt;strong&gt;all at once&lt;/strong&gt;, we can do that with the &lt;strong&gt;Batch Build&lt;/strong&gt; operation. &lt;em&gt;Batch build allows you to build multiple configurations with a single command instead of building them individually.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To perform a batch build, you can go to the "Build" menu, select "Batch Build," and then select the configurations and platforms that you want to build.&lt;/p&gt;

&lt;h4&gt;
  
  
  Conclusion
&lt;/h4&gt;

&lt;p&gt;In summary, it is important to keep in mind what happens in each build operation in Visual Studio; based on our requirements, we can select how our project or solution should be built.&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>visualstudio</category>
      <category>debug</category>
      <category>devops</category>
    </item>
    <item>
      <title>ASP.NET Core Identity: Registration, Login and Logout</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Sat, 08 Oct 2022 20:42:42 +0000</pubDate>
      <link>https://dev.to/sabahshariq/aspnet-core-identity-registration-login-and-logout-g3d</link>
      <guid>https://dev.to/sabahshariq/aspnet-core-identity-registration-login-and-logout-g3d</guid>
      <description>&lt;h4&gt;
  
  
  Introduction
&lt;/h4&gt;

&lt;p&gt;In this blog post we will look into how to implement ASP.NET Core Identity in a web app to add authentication and authorization features. Users can create an account and log in with a username and password. This article especially focuses on newcomers and anyone who wants to learn about, or is thinking of using, Identity in their .NET application. This is a series of blog posts, so please look into the others for more information.&lt;/p&gt;

&lt;h4&gt;
  
  
  What is ASP.NET Core Identity?
&lt;/h4&gt;

&lt;p&gt;When we build an application, one of the key questions is "how do we handle user management?" – that is, managing users, login, logout, passwords, profile data, roles, claims, tokens, email confirmation and more. To ease this user management process we can use ASP.NET Core Identity, a membership system.&lt;/p&gt;
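&lt;p&gt;As a rough sketch of where we will end up (assuming the class names "AspNetCoreIdentityUser" and "AspNetCoreIdentityDbContext" that we scaffold later in this post), wiring Identity into Program.cs looks roughly like this:&lt;/p&gt;

```csharp
// Program.cs – minimal Identity wiring (sketch; class names come from this post's scaffolding)
var builder = WebApplication.CreateBuilder(args);

// EF Core context that stores the Identity tables (users, roles, claims, tokens, ...)
builder.Services.AddDbContext<AspNetCoreIdentityDbContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("DefaultConnection")));

// Registers UserManager, SignInManager, cookie authentication and the default Identity UI
builder.Services.AddDefaultIdentity<AspNetCoreIdentityUser>()
    .AddEntityFrameworkStores<AspNetCoreIdentityDbContext>();

var app = builder.Build();
app.UseAuthentication();   // reads the authentication cookie on each request
app.UseAuthorization();
app.Run();
```

&lt;p&gt;The scaffolding steps below generate most of this for us; the sketch just shows which services Identity pulls together.&lt;/p&gt;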

&lt;h4&gt;
  
  
  Development Setup
&lt;/h4&gt;

&lt;p&gt;For this example, we will be creating an ASP.NET Core MVC web app using Visual Studio 2022 and .NET 6.0.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 01:&lt;/strong&gt; When creating the ASP.NET Core Web App (MVC), we will not select an authentication type and will set it to "None", as we will be setting up the Identity feature in the application from scratch. For simplicity, we will also uncheck "Configure for HTTPS" and "Enable Docker".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4OTWXGZ2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gn1az87qsf09wduam42y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4OTWXGZ2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gn1az87qsf09wduam42y.png" alt="Create Project" width="880" height="586"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After creating it, we will have a basic MVC web application without any authentication feature.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tTWiMZo7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l2gevl4s1apw41tqhwgg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tTWiMZo7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l2gevl4s1apw41tqhwgg.png" alt="Uncheck Authentication" width="347" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And running the project will show the following webpage:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pQECjHTX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n79pwya4s73i075y6cqc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pQECjHTX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n79pwya4s73i075y6cqc.png" alt="Default Webpage" width="762" height="531"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 02:&lt;/strong&gt; Now we will add the Identity feature by scaffolding. Right-click on the project, go to "Add" and select "New Scaffolded Item". In the following window, select "Identity" and press "Add". For the scaffolder to run, the NuGet package "Microsoft.VisualStudio.Web.CodeGeneration.Design" needs to be installed; it is installed automatically when adding the scaffolded item.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VTt5Y08V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x9mozv2pttwk8fq7jpi8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VTt5Y08V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x9mozv2pttwk8fq7jpi8.png" alt="Scaffolding" width="644" height="779"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XVRgQKAu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jh2s5tixxv24di27oovf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XVRgQKAu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jh2s5tixxv24di27oovf.png" alt="options" width="880" height="615"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the following list, select the Identity features that we would like to add to our web app. For this example, we will be selecting "Login", "Logout" and "Register". We also need to specify the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Layout page: a layout page for the Identity files. We will select our existing shared "_Layout.cshtml" page that was created by default with the project.&lt;/li&gt;
&lt;li&gt;Data context class: by selecting the + button it will generate a context class name, which we will rename to "AspNetCoreIdentityDbContext".&lt;/li&gt;
&lt;li&gt;User class: by selecting the + button we set the user model class name to "AspNetCoreIdentityUser".&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BVD4AyxV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/434g1ks1wugru6ihg5ph.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BVD4AyxV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/434g1ks1wugru6ihg5ph.png" alt="Identity select" width="800" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click "Add" and our Identity pages are scaffolded, giving us the following structure changes in the project:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--orc3qjQL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f5sroyqo9ynec8r6gmhc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--orc3qjQL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f5sroyqo9ynec8r6gmhc.png" alt="project structure" width="346" height="771"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here we have "AspNetCoreIdentityDbContext" for our database, through which we will access our SQL Server. Then we have "AspNetCoreIdentityUser", which inherits from "IdentityUser", the base user model used by the Login, Logout and Register pages. Now run the app; next we need to add "Register" and "Login" links.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8c3kuPC2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0kvkkmjtjshrubmxeqnn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8c3kuPC2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0kvkkmjtjshrubmxeqnn.png" alt="model class" width="762" height="531"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To add these, go to the header section in "_Layout.cshtml" under "Views" -&amp;gt; "Shared" and make the following changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gFh_rGbh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iuj1dm1kt5uy7w52yoav.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gFh_rGbh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iuj1dm1kt5uy7w52yoav.png" alt="default page" width="880" height="289"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hheaF4m8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0mksavc90vd4cyvmj8cs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hheaF4m8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0mksavc90vd4cyvmj8cs.png" alt="Regisyer Login" width="815" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But wait – clicking the "Register" and "Login" links does not work. This is because Identity in ASP.NET Core MVC is implemented with Razor Pages. So we have an MVC application that needs Razor Pages support, and for this we have to add Razor Pages support to our app by adding "app.MapRazorPages();" in Program.cs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--A_9r583p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fqmp4oqrioo6abflks4r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A_9r583p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fqmp4oqrioo6abflks4r.png" alt="exception" width="467" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now clicking the Register and Login links redirects to the respective pages.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OikSGaVE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1vwr6wql6j8fq0d7iwqy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OikSGaVE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1vwr6wql6j8fq0d7iwqy.png" alt="exception page" width="880" height="501"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9ZSbwdEa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w4qz7hus04u2t5dy3934.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9ZSbwdEa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w4qz7hus04u2t5dy3934.png" alt="login page" width="880" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 03:&lt;/strong&gt; By default, Identity adds only Email, Password and Confirm password fields on the Register page. But we will add "First Name" and "Last Name" fields. To do this, we first need to add "FirstName" and "LastName" properties to the "AspNetCoreIdentityUser" model class.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--51BliMpP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h181w266c1mdpw992296.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--51BliMpP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h181w266c1mdpw992296.png" alt="custom add" width="412" height="202"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we have the entity properties set, we can create our database. For this, go to appsettings.json for our connection string. Here we have a default connection added by the scaffolder, with the server "mssqllocaldb", which is a local db, and the database "AspNetCoreIdentity".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7exjayqgmnwrlbtzip6v.png"&gt;connection string&lt;/a&gt;&lt;/p&gt;
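&lt;p&gt;For reference, the scaffolded connection string has roughly this shape (a sketch; the exact generated connection name and database catalog on your machine may differ):&lt;/p&gt;

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=AspNetCoreIdentity;Trusted_Connection=True;MultipleActiveResultSets=true"
  }
}
```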

&lt;p&gt;&lt;strong&gt;Step 04:&lt;/strong&gt; Let's create the database. First we will add a migration, then we will apply that migration to update the database. This is an EF Core concept; run the following in the Package Manager Console (from the .NET CLI, the equivalents are "dotnet ef migrations add InitialCreate" and "dotnet ef database update"):&lt;br&gt;
&lt;/p&gt;
&lt;center&gt;Add-Migration InitialCreate&lt;br&gt;
Update-Database&lt;/center&gt;

&lt;p&gt;After adding the migration, we will see the "Migrations" folder added with the following file structure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4jpnd-fL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hai16jl9x6dg1dd197br.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4jpnd-fL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hai16jl9x6dg1dd197br.png" alt="command" width="355" height="823"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And in the "InitialCreate" file we can see our "FirstName" and "LastName" columns added to the "AspNetUsers" table.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qClrWHbI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j30v1stn4585dp8ig4gl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qClrWHbI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j30v1stn4585dp8ig4gl.png" alt="migration" width="847" height="388"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rMNpIr8y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nj6l1lpjb9k91ilnbi2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rMNpIr8y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nj6l1lpjb9k91ilnbi2d.png" alt="update" width="880" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 05:&lt;/strong&gt; Now our database is ready. We will now work on our Register page and the two new fields, "First Name" and "Last Name". First we need to change the input model and add definitions for "FirstName" and "LastName". As Register is a Razor page, we will go to the code-behind and make the following changes.&lt;/p&gt;
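&lt;p&gt;The change to make, as a sketch (the validation attributes here are our own choice, not required by Identity):&lt;/p&gt;

```csharp
// Register.cshtml.cs – inside the nested InputModel class
[Required]
[Display(Name = "First Name")]
public string FirstName { get; set; }

[Required]
[Display(Name = "Last Name")]
public string LastName { get; set; }
```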

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q6Cl9GIi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5wmmvfugwjgh4l147sws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q6Cl9GIi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5wmmvfugwjgh4l147sws.png" alt="register update" width="880" height="717"&gt;&lt;/a&gt;&lt;br&gt;
Now go to the page markup and add these two fields.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4CnWG-HY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pxi61gnz01k1e4758ko8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4CnWG-HY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pxi61gnz01k1e4758ko8.png" alt="ui change" width="880" height="518"&gt;&lt;/a&gt;&lt;br&gt;
Let's build the application and go to the Register page, where we now see the new fields.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FjvkGNIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pgwzuabra6bdnedahrzk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FjvkGNIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pgwzuabra6bdnedahrzk.png" alt="field added" width="880" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 06:&lt;/strong&gt; Before creating a new user, we need to modify our backend to populate the FirstName and LastName values. Go to "OnPostAsync" in the Register.cshtml.cs page and add the following code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--v6tQY4zV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ynd41v4udxsu0dsvxiy4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--v6tQY4zV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ynd41v4udxsu0dsvxiy4.png" alt="register newfield" width="766" height="251"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now create a new user:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ea4UHulD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dhwf40tym4x9v1mp2k7v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ea4UHulD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dhwf40tym4x9v1mp2k7v.png" alt="create user" width="880" height="536"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our user is successfully registered, and this is a temporary confirmation page. Click the confirm account link. Later we will disable this email confirmation option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--C_4Qp9Ae--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/24zvbkbugk5pfiil3v2g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--C_4Qp9Ae--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/24zvbkbugk5pfiil3v2g.png" alt="regigtration complete" width="880" height="539"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now go to the Login page, enter our user credentials, click Login, and we are successfully logged in as our user.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OEjvACn---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/84dc1d5ob92yg1m5j0a8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OEjvACn---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/84dc1d5ob92yg1m5j0a8.png" alt="user login" width="880" height="541"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 07:&lt;/strong&gt; Now we will look into how to disable email confirmation. First log out from the application and go to Program.cs. In the AddDefaultIdentity section, we will set RequireConfirmedAccount to false.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ruf8Wm6F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tul0bs2fdccqudkm6l9m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ruf8Wm6F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tul0bs2fdccqudkm6l9m.png" alt="email" width="880" height="47"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now build the application, go to the Register page, and create another new user.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xqCw7ZZ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gi5kp7p1bs2w34q8cbvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xqCw7ZZ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gi5kp7p1bs2w34q8cbvy.png" alt="create" width="880" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After registering, we are redirected directly to the main page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VLpGps6V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bva55hdlnxgb4i8ue8ra.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VLpGps6V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bva55hdlnxgb4i8ue8ra.png" alt="disable" width="880" height="540"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And our data looks like below in our database:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QCBJn_ca--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6ujw4tq21d4tyeomtt2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QCBJn_ca--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6ujw4tq21d4tyeomtt2d.png" alt="data" width="880" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 08:&lt;/strong&gt; Now let's look into our Identity features. Click on the username, which will take you to the manage profile page. Here you will see various options to manage your credentials: Profile, Email, Password, Two-factor authentication, and Personal data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MV1XvkL_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0ge33ngqfml1l5b2c75y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MV1XvkL_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0ge33ngqfml1l5b2c75y.png" alt="user manager" width="880" height="539"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Fix Anonymous Authentication:
&lt;/h4&gt;

&lt;p&gt;As we can see, the user can log in successfully, but our identity is still anonymous, as the following watch window shows for the user identity value from the base class.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fIpnDZ04--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aklzdq3jufkc8akvhuws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fIpnDZ04--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aklzdq3jufkc8akvhuws.png" alt="isauth" width="880" height="616"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For this we need to create a cookie, which the server will inspect to authenticate the user. Go to the Login.cshtml.cs page and add the following claim, identity and claims principal in "OnPostAsync". Then use HttpContext.SignInAsync(), which serializes the claims principal into a string, encrypts that string, and saves it as a cookie on the HttpContext object.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HuGWkO60--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6bzoe8zmplfkr1wvwo4y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HuGWkO60--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6bzoe8zmplfkr1wvwo4y.png" alt="claim principle" width="880" height="516"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now when you try to login you will get the following exception page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Vjv7O2oZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uv3892dwciu4tv3bd9k4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Vjv7O2oZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/uv3892dwciu4tv3bd9k4.png" alt="middleware missing" width="880" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To fix this, go to Program.cs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IOq4JY3p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9r8vekle2ted5xh7r1gh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IOq4JY3p--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9r8vekle2ted5xh7r1gh.png" alt="cookie name" width="608" height="181"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now run the app again and log in; you can now see that you are no longer logged in as anonymous.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--R43vOApG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mtd1acornum2lt92cmky.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--R43vOApG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mtd1acornum2lt92cmky.png" alt="auth true" width="880" height="1042"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Conclusion:
&lt;/h4&gt;

&lt;p&gt;Here we implemented the user verification process. Then we injected the cookie authentication handlers so the user is no longer logged in as anonymous. ASP.NET Core provides many tools and libraries to secure ASP.NET Core apps, such as built-in identity providers and third-party identity services such as Facebook, Twitter, and LinkedIn.&lt;/p&gt;

</description>
      <category>aspnetcore</category>
      <category>identity</category>
      <category>authentication</category>
      <category>authorization</category>
    </item>
    <item>
      <title>A Quick Glimpse Into: Nuget Package Manager</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Wed, 01 Jun 2022 06:40:29 +0000</pubDate>
      <link>https://dev.to/sabahshariq/a-quick-glimpse-into-nuget-package-manager-4poi</link>
      <guid>https://dev.to/sabahshariq/a-quick-glimpse-into-nuget-package-manager-4poi</guid>
      <description>&lt;p&gt;NuGet packages contain reusable code that other developers make available to us for use in our .NET projects. Often such code is bundled into &lt;strong&gt;packages&lt;/strong&gt; that contain compiled code (as DLLs) along with other content needed in the projects that consume these packages.&lt;/p&gt;

&lt;p&gt;Because of this, packages cut down production costs and mean developers don’t have to reinvent the wheel when encountering a problem that’s been solved once before.&lt;/p&gt;

&lt;p&gt;In simple words, a NuGet package is a single ZIP file with the &lt;strong&gt;.nupkg&lt;/strong&gt; extension that contains compiled code (DLLs), other files related to that code, and information such as the package's version number.&lt;/p&gt;
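&lt;p&gt;You can see this for yourself: because a .nupkg is just a ZIP archive, a few lines of C# will list its contents (a sketch; the package file name here is a placeholder, so point it at any package from your NuGet cache):&lt;/p&gt;

```csharp
// Sketch: inspect a NuGet package as the ZIP file it really is
using System;
using System.IO.Compression;

class Program
{
    static void Main()
    {
        // Placeholder path – substitute any .nupkg you have locally
        using var archive = ZipFile.OpenRead("newtonsoft.json.13.0.1.nupkg");
        foreach (var entry in archive.Entries)
            Console.WriteLine(entry.FullName);   // e.g. lib/... DLLs and the .nuspec manifest
    }
}
```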

&lt;p&gt;Because NuGet supports private hosts alongside the public nuget.org host, you can use NuGet packages to share code that's exclusive to an organization or a work group. You can also use NuGet packages as a convenient way to factor your own code for use in nothing but your own projects.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Also, if some DLLs that your project requires to run are missing on Windows, you can package them in NuGet packages and deploy them with your project.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>nuget</category>
      <category>packagemanagement</category>
      <category>aspnetcore</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>Visual Studio 2022 is 64-bit. What does it mean?</title>
      <dc:creator>Sabah Shariq</dc:creator>
      <pubDate>Tue, 31 May 2022 20:21:52 +0000</pubDate>
      <link>https://dev.to/sabahshariq/visual-studio-2022-is-64-bit-what-does-it-mean-183b</link>
      <guid>https://dev.to/sabahshariq/visual-studio-2022-is-64-bit-what-does-it-mean-183b</guid>
      <description>&lt;h2&gt;
  
  
  Overview:
&lt;/h2&gt;

&lt;p&gt;Before we look into what makes Visual Studio 2022 a 64-bit application, let's first look into what 32-bit or 64-bit means.&lt;/p&gt;

&lt;p&gt;When we consider the various Windows versions, we might first think of the Home or Pro editions. While these are indeed different, there's another factor that separates Windows versions: whether the system is 32-bit or 64-bit.&lt;/p&gt;

&lt;p&gt;Information in computers is normally represented in bits, a binary sequence of 1 and 0. The more bits you have, the more information you can represent. Specifically, a sequence of n bits can represent 2^n different pieces of information.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why bits matter?
&lt;/h2&gt;

&lt;p&gt;In computing, there are two common kinds of processors: 32-bit and 64-bit. The CPU registers store memory addresses, which is how the processor accesses data in RAM; each address held in a register refers to an individual byte of memory. So "32-bit" and "64-bit" tell us how much memory a processor can address from a CPU register. For instance,&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A 32-bit system can address a maximum of 4GB (2^32 possible addresses, or 4,294,967,296 bytes) of RAM&lt;/li&gt;
&lt;li&gt;A 64-bit system can theoretically reference 16 exabytes (2^64 possible addresses or 18,446,744,073,709,551,616 bytes, about 18 billion gigabytes) of RAM.&lt;/li&gt;
&lt;/ul&gt;
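&lt;p&gt;The arithmetic behind those two numbers is just 2 raised to the register width, which we can check with a few lines of C#:&lt;/p&gt;

```csharp
// Sketch: address-space sizes as powers of two
using System;

class AddressSpace
{
    static void Main()
    {
        ulong addresses32 = 1UL << 32;          // 2^32 = 4,294,967,296 addresses
        Console.WriteLine(addresses32);
        // One byte per address => 4 GB of addressable RAM on a 32-bit system
        Console.WriteLine(addresses32 / (1024UL * 1024 * 1024));   // 4

        // 2^64 itself overflows a 64-bit integer by one, so show 2^64 - 1 instead
        Console.WriteLine(ulong.MaxValue);      // 18,446,744,073,709,551,615
    }
}
```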

&lt;p&gt;This is several million times more than an average workstation would need to access. With a 64-bit computer, a processor can address 16 exabytes of memory. (An exabyte is 1,000 petabytes, a petabyte is 1,000 terabytes, and a terabyte is 1,000 gigabytes.)&lt;/p&gt;

&lt;h2&gt;
  
  
  Advantages of 64-bit over 32-bit:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Using 64-bit, the user can do a lot more multi-tasking.&lt;/li&gt;
&lt;li&gt;It can easily switch between various applications without windows hanging.&lt;/li&gt;
&lt;li&gt;It allows each process to access far more virtual memory.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Visual Studio 2022 is 64-bit application:
&lt;/h2&gt;

&lt;p&gt;Visual Studio 2022 is a 64-bit application, i.e. it is no longer limited to 4GB of memory in the main &lt;strong&gt;devenv.exe&lt;/strong&gt; process. With a 64-bit Visual Studio on Windows, you can open, edit, run, and debug even the biggest and most complex solutions without running out of memory.&lt;/p&gt;

&lt;p&gt;Previously, even though you could download Visual Studio and use it to build 64-bit applications on your computer, Visual Studio itself remained a 32-bit application. Now, long into the 64-bit computing era, that has finally changed. Up to Visual Studio 2019 the IDE process was 32-bit, although a few internal components such as the diagnostics/debuggers, MSBuild, the compilers, and the designers would take advantage of a 64-bit processor if one was available.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;In short, the main devenv.exe process is now 64-bit. It benefits from more CPU registers and can take advantage of more than 4GB of memory.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How to check if your application is running as 64-bit or 32-bit?
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;devenv.exe&lt;/strong&gt; file installation location for Visual Studio 2019 Enterprise vs. Visual Studio 2022 Enterprise preview clearly highlights this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE&lt;/li&gt;
&lt;li&gt;C:\Program Files\Microsoft Visual Studio\2022\Enterprise\Common7\IDE&lt;/li&gt;
&lt;/ul&gt;
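&lt;p&gt;As an aside, that install-path convention can be turned into a tiny check. The following Python helper is a hypothetical illustration (not from the original article): on 64-bit Windows, 32-bit programs install under “Program Files (x86)” and 64-bit programs under “Program Files”.&lt;/p&gt;

```python
# Infer a program's bitness from its install path on 64-bit Windows,
# where 32-bit software lives under "Program Files (x86)".
def bitness_from_install_path(path: str) -> int:
    return 32 if "Program Files (x86)" in path else 64

print(bitness_from_install_path(r"C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE"))  # 32
print(bitness_from_install_path(r"C:\Program Files\Microsoft Visual Studio\2022\Enterprise\Common7\IDE"))  # 64
```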

&lt;p&gt;Another way to check is via Task Manager. In Windows 11, open Task Manager and move to the “Details” tab. The “Architecture” column shows which architecture each process is running on, so you can see whether &lt;em&gt;devenv.exe&lt;/em&gt; is 64-bit or 32-bit.&lt;/p&gt;
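&lt;p&gt;If you prefer to check programmatically, the same question can be asked of any process from inside it. A small Python sketch (not specific to Visual Studio) reports the bitness of the process it runs in, using the size of a pointer:&lt;/p&gt;

```python
import struct
import sys

# The size of a pointer ("P") in bytes reveals the process bitness:
# 4 bytes -> 32-bit process, 8 bytes -> 64-bit process.
pointer_bits = struct.calcsize("P") * 8
print(f"This process is {pointer_bits}-bit")

# Cross-check: sys.maxsize is 2**31 - 1 on 32-bit builds
# and 2**63 - 1 on 64-bit builds.
assert (sys.maxsize > 2**32) == (pointer_bits == 64)
```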

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PxGRdWlA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mwltrl15bkjoeag4bffo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PxGRdWlA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mwltrl15bkjoeag4bffo.png" alt="VS Process" width="880" height="141"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aspnetcore</category>
      <category>codenewbie</category>
      <category>visualstudio</category>
      <category>visualstudiocode</category>
    </item>
  </channel>
</rss>
