<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sammy Deprez</title>
    <description>The latest articles on DEV Community by Sammy Deprez (@sammydeprez).</description>
    <link>https://dev.to/sammydeprez</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F289267%2F65483d05-b054-4251-a072-5dca397e6836.jpeg</url>
      <title>DEV Community: Sammy Deprez</title>
      <link>https://dev.to/sammydeprez</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sammydeprez"/>
    <language>en</language>
    <item>
      <title>Azure Machine Learning – The Responsible Road</title>
      <dc:creator>Sammy Deprez</dc:creator>
      <pubDate>Wed, 20 May 2020 06:26:18 +0000</pubDate>
      <link>https://dev.to/sammydeprez/azure-machine-learning-the-responsible-road-2j9m</link>
      <guid>https://dev.to/sammydeprez/azure-machine-learning-the-responsible-road-2j9m</guid>
      <description>&lt;p&gt;Artificial Intelligence can change people's lives. It can help people see again, and it can help us to get faster diagnosed for deceases. It is helping me right now to write a blog post that reads easier and with fewer spelling mistakes. Companies use AI to predict their sales, to optimize their production. AI assists us wholly or partially in driving a car. These are just some examples of what it can do.&lt;/p&gt;

&lt;p&gt;But ... Yes, there is always a but to every story. Data can be dangerous if used or interpreted in the wrong way. A very famous example is &lt;a href="https://www.youtube.com/watch?v=wgLUDw8eLB4"&gt;Simpson's Paradox&lt;/a&gt;. It shows that data can tell the truth while still not telling the full truth.&lt;/p&gt;
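&lt;p&gt;For the curious, the paradox is easy to reproduce in a few lines of Python. The sketch below uses the classic kidney-stone treatment numbers: treatment A has the better success rate within each subgroup, yet the worse rate overall.&lt;/p&gt;

```python
# Classic kidney-stone data (successes, trials): treatment A wins inside
# each subgroup, but the subgroup sizes are skewed, so A loses overall.
small = {"A": (81, 87), "B": (234, 270)}    # small stones
large = {"A": (192, 263), "B": (55, 80)}    # large stones

def rate(successes, trials):
    return successes / trials

for group in (small, large):
    assert rate(*group["A"]) > rate(*group["B"])    # A better per subgroup

total = {t: (small[t][0] + large[t][0], small[t][1] + large[t][1]) for t in "AB"}
assert rate(*total["A"]) < rate(*total["B"])        # ...yet worse overall
```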

&lt;p&gt;Now imagine you are working for the Department of Corrections, and you need to build a model that predicts the risk of recidivism. Based on that, they could decide whether or not to grant people parole.&lt;/p&gt;

&lt;p&gt;Or you work for a bank, and you build an algorithm that calculates whether someone will be able to pay back a loan. Based on the result, the bank will grant credit or not.&lt;/p&gt;

&lt;p&gt;These projects can work great, but what if you cannot explain why Ms. Emma Smith cannot get a loan? Or why Mr. Kalifa Abara is a higher risk for recidivism? And should you, as a data scientist, have access to all this very confidential data? (*)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---UK46QSy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.datafish.eu/wp-content/uploads/2020/05/upc-1024x344.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---UK46QSy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.datafish.eu/wp-content/uploads/2020/05/upc-1024x344.png" alt="Undestand - Protect - Control"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Last year's Build (2019) was all about AI for all skill levels, with many new features within the cognitive services and automated ML.&lt;br&gt;
Then came Ignite (2019), where the focus was on bringing AI into production with MLOps, by bringing the techniques of software development cycles to the machine learning industry.&lt;/p&gt;

&lt;p&gt;Today Build 2020 has started, and Microsoft now puts its focus on "Responsible ML." Within this topic, there are three pillars.&lt;/p&gt;

&lt;h3&gt;
  
  
  Understand
&lt;/h3&gt;

&lt;p&gt;"Why did Mr. Carl Jones not get a loan?"\&lt;br&gt;
It has happened many times customers leaves a bank office devastated, because they did not get a loan, and the banker could not give a reason.&lt;/p&gt;

&lt;p&gt;For this, Microsoft is working on three projects, namely:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/interpretml/interpret-text"&gt;Interpret-Text&lt;/a&gt;: a library that incorporates explainers for text-based machine learning models and visualizes the results with a built-in dashboard&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/fairlearn/fairlearn"&gt;FairLearn&lt;/a&gt;: a python package that implements a variety of algorithms that mitigate unfairness in supervised machine learning&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/interpretml/DiCE"&gt;DiCE (Diverse Counterfactual Explanations):&lt;/a&gt; a library that generates "what-if" explanations for a model output&lt;/p&gt;

&lt;h3&gt;
  
  
  Protect
&lt;/h3&gt;

&lt;p&gt;How do we distinguish, in our datasets, between general information and private information, and how can we protect that part of the data?&lt;/p&gt;

&lt;p&gt;Or what if you don't want your data scientists even to see the data? And how can we protect data while it is being processed?&lt;/p&gt;

&lt;p&gt;To fix these problems, Microsoft revealed some other great tools:&lt;/p&gt;

&lt;p&gt;Differential Privacy Whitenoise (&lt;a href="https://github.com/opendifferentialprivacy/whitenoise-core"&gt;Core&lt;/a&gt; &amp;amp; &lt;a href="https://github.com/opendifferentialprivacy/whitenoise-system"&gt;System&lt;/a&gt;): Differential Privacy validator and runtime + tools and services for DP processing of tabular and relational data&lt;/p&gt;
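&lt;p&gt;As a taste of what differential privacy does, here is a minimal sketch of the Laplace mechanism applied to a count query. This is a conceptual illustration only; the function name and parameters are invented and this is not the WhiteNoise API.&lt;/p&gt;

```python
import math
import random

# Invented names for illustration: a count query answered with Laplace noise
# calibrated to the privacy budget epsilon. Not the WhiteNoise API.
def dp_count(records, predicate, epsilon=1.0, rng=random):
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon                  # a count query has sensitivity 1
    u = rng.random() - 0.5                 # inverse-CDF sample of Laplace(0, scale)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

ages = [23, 35, 41, 52, 67, 29, 73]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)  # true answer is 4
```

A smaller epsilon means a larger noise scale: each individual is better hidden, at the cost of a less accurate answer.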

&lt;p&gt;&lt;a href="https://github.com/Microsoft/SEAL"&gt;Seal SDK&lt;/a&gt;: An easy-to-use and powerful homomorphic encryption library, or in other words how can you do mathematical calculations on encrypted data&lt;/p&gt;

&lt;p&gt;&lt;a href="https://openenclave.io/sdk/"&gt;Open Enclave&lt;/a&gt;: New Azure VMs, where data can stay encrypted while getting processed, by making use of specialized hardware.&lt;/p&gt;

&lt;h3&gt;
  
  
  Control
&lt;/h3&gt;

&lt;p&gt;How do we explain that a new training iteration of your Machine Learning model suddenly gives different results?&lt;/p&gt;

&lt;p&gt;Azure Machine Learning Audit Trail enables you to automatically track your experiments and datasets that correspond to your registered ML Model.&lt;/p&gt;

&lt;h2&gt;
  
  
  Stay tuned for more details.
&lt;/h2&gt;

&lt;p&gt;The items above are a lot of excellent new features that will be built into AML; some soon, some might still take some time.&lt;/p&gt;

&lt;p&gt;Three other Microsoft AI MVPs and I were asked by Microsoft to have a look at it. And on Thursday (21st of May 2020) we will kick off with a live stream on Twitch (hosted by &lt;a href="https://twitter.com/hboelman"&gt;Henk Boelman&lt;/a&gt; and &lt;a href="https://twitter.com/TessFerrandez"&gt;Tess Ferrandez&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;If you want to hear more details about these topics, then join us on Twitch:&lt;/p&gt;

&lt;p&gt;&lt;a href="http://twitch.com/microsoftdeveloper"&gt;http://twitch.com/microsoftdeveloper&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each of the four of us will go more in-depth into one of the features.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Eve Pardi (&lt;a href="https://twitter.com/evepardi"&gt;@EvePardi&lt;/a&gt;) will talk about Interpret-Text. You can read her intro blog &lt;a href="https://medium.com/@evepardi/responsible-ai-interpret-text-cabc32ca5857"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  Willem Meints (&lt;a href="https://twitter.com/willem_meints"&gt;@Willem_Meints&lt;/a&gt;) is going to discuss FairLearn. You can find his intro blog &lt;a href="https://fizzylogic.nl/fairlearn-the-secret-to-teaching-your-models-to-play-fair"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  Alicia Moniz (&lt;a href="https://twitter.com/AliciaMoniz"&gt;@AliciaMoniz&lt;/a&gt;) looked into Differential Privacy. Her intro blog can be found [here].&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And I myself will take you into the magical world of Confidential ML with Microsoft SEAL and Open Enclave.&lt;/p&gt;

&lt;p&gt;Stay tuned!&lt;/p&gt;

&lt;p&gt;Initially posted on: &lt;a href="https://www.datafish.eu/article/azure-machine-learning-the-responsible-road/"&gt;https://www.datafish.eu/article/azure-machine-learning-the-responsible-road/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;*&lt;em&gt;: The names used in this article are random, not based on any real person.&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Integrating Qna Maker Into The Power Virtual Agent</title>
      <dc:creator>Sammy Deprez</dc:creator>
      <pubDate>Wed, 22 Jan 2020 15:48:53 +0000</pubDate>
      <link>https://dev.to/sammydeprez/integrating-qna-maker-into-the-power-virtual-agent-27bm</link>
      <guid>https://dev.to/sammydeprez/integrating-qna-maker-into-the-power-virtual-agent-27bm</guid>
      <description>&lt;p&gt;In another post I introduced the Power Virtual Agent. Which is a nice tool for business users to build simple chat bots. One big feature that it is missing is standard integration with Qna Maker.But since we can call Power Automate Flows, there is a work around. Let’s have a look how.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/1PyIjzuP9dM"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Initially posted on &lt;a href="https://www.datafish.eu/media/integrating-qna-maker-into-the-power-virtual-agent"&gt;datafish.eu&lt;/a&gt;&lt;/p&gt;

</description>
      <category>qnamaker</category>
      <category>powerplatform</category>
      <category>ai</category>
      <category>azure</category>
    </item>
    <item>
      <title>Data Gathering For Image Recognition</title>
      <dc:creator>Sammy Deprez</dc:creator>
      <pubDate>Tue, 07 Jan 2020 22:24:46 +0000</pubDate>
      <link>https://dev.to/sammydeprez/data-gathering-for-image-recognition-3g9c</link>
      <guid>https://dev.to/sammydeprez/data-gathering-for-image-recognition-3g9c</guid>
      <description>&lt;p&gt;The other time I was reading an interesting &lt;a href="https://www.henkboelman.com/articles/simpsons-classifier-with-azure-machine-learning-service/"&gt;blog&lt;/a&gt; from Henk Boelman (a Microsoft Advocate). He was describing how you can build an image classifier with Azure Machine Learning Studio. He used a dataset that he had downloaded from Kaggle. But what if you cant  find on Kaggle what you need. This blog post will discuss how you can make use of the Bing Image Search to generate your own dataset.&lt;/p&gt;

&lt;p&gt;So first of all, some explanation. "Bing Image Search API" is part of the Cognitive Services on Azure, and it gives you the possibility to search for images just like you would on bing.com/images. The API has a search function that allows you to search for a specific keyword, but also to filter the results based on size (height/width), file size, license, color, ...&lt;/p&gt;
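&lt;p&gt;If you prefer to skip the SDK, the same search can be expressed as a plain HTTP request. The sketch below only builds the request URL (no call is made); the endpoint and parameter names follow the Bing Image Search v7 REST API, and the header named in the comment is the standard Cognitive Services key header.&lt;/p&gt;

```python
from urllib.parse import urlencode

# Sketch only: builds the request URL without sending it. Endpoint and
# parameter names follow the Bing Image Search v7 REST API.
endpoint = "https://api.bing.microsoft.com/v7.0/images/search"
params = {"q": "apples", "count": 150, "license": "Public", "imageType": "Photo"}
url = endpoint + "?" + urlencode(params)
# Send with the header {"Ocp-Apim-Subscription-Key": "[YourKeyHere]"},
# e.g. via urllib.request or any HTTP client.
```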

&lt;p&gt;More info and free easy try out can be found on the product page of Microsoft.\&lt;br&gt;
&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/bing-image-search-api/"&gt;https://azure.microsoft.com/en-us/services/cognitive-services/bing-image-search-api/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we are talking about machine learning, all the code discussed will be in Python. That way you can copy it straight into your Jupyter/Azure Notebook.&lt;/p&gt;

&lt;h1&gt;
  
  
  Step 1: Install/Import necessary packages
&lt;/h1&gt;

&lt;p&gt;The Bing Image Search has its own Python SDK, which is very handy. But if you want, you can also make use of classic HTTP request methods.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;!pip install azure-cognitiveservices-search-imagesearch&lt;/code&gt;&lt;/pre&gt;

&lt;pre&gt;&lt;code&gt;from azure.cognitiveservices.search.imagesearch import ImageSearchClient
from msrest.authentication import CognitiveServicesCredentials
import pandas as pd&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;
  
  
  Step 2: Configure subscription key + endpoint
&lt;/h3&gt;

&lt;p&gt;The subscription key and endpoint can be found in the Azure Portal after you create a Cognitive Services resource. You don't need to look for a specific Bing Search Cognitive Service: for a while now, Bing Search and many other cognitive services have been combined into a single resource, which means one endpoint and one key to use them all.&lt;/p&gt;

&lt;p&gt;I also define the search term here.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;subscription_key = "[YourKeyHere]"
subscription_endpoint = "[YourEndpointHere]"
search_term = "apples"&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;
  
  
  Step 3: Configure Client
&lt;/h3&gt;

&lt;p&gt;So in this step we create two new objects. One is the 'credentials' object, which contains the subscription key we just configured. The other is the client that we will use to make the call; the latter also needs the endpoint.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;credentials = CognitiveServicesCredentials(subscription_key)
client = ImageSearchClient(endpoint=subscription_endpoint, credentials=credentials)&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;
  
  
  Step 4: Search and gather the results
&lt;/h3&gt;

&lt;p&gt;Searching for items is very easy: just use the client object and its search function, which has a bunch of parameters you can configure (like color, size, ...). In this case we only configure what we are searching for and how many results we want back.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Search for images
image_results = client.images.search(query=search_term, count=150)&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;
  
  
  Step 5: Convert to dataframe
&lt;/h3&gt;

&lt;p&gt;The result we receive from the API is a list of objects. That might be handy in a C# application, but in this case we want to work with tabular data, so with the code below we convert the results to a dataframe. You will notice we get quite a lot of information back. Some of it is still stored as an object, like "thumbnail", but we don't need that data here, so we leave it as it is.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# convert to dataframe
df = pd.DataFrame([x.as_dict() for x in image_results.value])
df.head()&lt;/code&gt;&lt;/pre&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt; &lt;/th&gt;
&lt;th&gt;_type&lt;/th&gt;
&lt;th&gt;accent_color&lt;/th&gt;
&lt;th&gt;content_size&lt;/th&gt;
&lt;th&gt;content_url&lt;/th&gt;
&lt;th&gt;date_published&lt;/th&gt;
&lt;th&gt;encoding_format&lt;/th&gt;
&lt;th&gt;height&lt;/th&gt;
&lt;th&gt;host_page_display_url&lt;/th&gt;
&lt;th&gt;host_page_url&lt;/th&gt;
&lt;th&gt;image_id&lt;/th&gt;
&lt;th&gt;image_insights_token&lt;/th&gt;
&lt;th&gt;insights_metadata&lt;/th&gt;
&lt;th&gt;name&lt;/th&gt;
&lt;th&gt;thumbnail&lt;/th&gt;
&lt;th&gt;thumbnail_url&lt;/th&gt;
&lt;th&gt;web_search_url&lt;/th&gt;
&lt;th&gt;width&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;0&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;9D2E39&lt;/td&gt;
&lt;td&gt;224643 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://upload.wikimedia.org/wikipedia/commons"&gt;https://upload.wikimedia.org/wikipedia/commons&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-09-14T23:40:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Apple"&gt;https://en.wikipedia.org/wiki/Apple&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Apple"&gt;https://en.wikipedia.org/wiki/Apple&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;6AEAF1C3894ED7D563916634549ED00175C970D3&lt;/td&gt;
&lt;td&gt;ccid_pfp3ysAm*mid_6AEAF1C3894ED7D563916634549E...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Apple -- Wikipedia&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse3.mm.bing.net/th?id=OIP.pfp3ysAmXA6"&gt;https://tse3.mm.bing.net/th?id=OIP.pfp3ysAmXA6&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;4F300B&lt;/td&gt;
&lt;td&gt;161831 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://upload.wikimedia.org/wikipedia/commons"&gt;https://upload.wikimedia.org/wikipedia/commons&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-12-21T20:48:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1083&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Honeycrisp"&gt;https://en.wikipedia.org/wiki/Honeycrisp&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Honeycrisp"&gt;https://en.wikipedia.org/wiki/Honeycrisp&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;4531DA369849BECBA2980492B1F021A9EF15BB99&lt;/td&gt;
&lt;td&gt;ccid_wPWDNJmR*mid_4531DA369849BECBA2980492B1F0...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Honeycrisp -- Wikipedia&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse1.mm.bing.net/th?id=OIP.wPWDNJmRmNP"&gt;https://tse1.mm.bing.net/th?id=OIP.wPWDNJmRmNP&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;C47807&lt;/td&gt;
&lt;td&gt;172706 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/wp-content/uploads"&gt;https://www.tasteofhome.com/wp-content/uploads&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2018-08-15T00:12:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/collection/new-typ"&gt;https://www.tasteofhome.com/collection/new-typ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/collection/new-typ"&gt;https://www.tasteofhome.com/collection/new-typ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;EC1EBFE716FD1033C0E0983931FEC881E8A1177D&lt;/td&gt;
&lt;td&gt;ccid_7jSg14s4*mid_EC1EBFE716FD1033C0E0983931FE...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;15 New Types of Apples You Should Be Buying&lt;/td&gt;
&lt;td&gt;...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse2.mm.bing.net/th?id=OIP.7jSg14s46gp"&gt;https://tse2.mm.bing.net/th?id=OIP.7jSg14s46gp&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;7E0F24&lt;/td&gt;
&lt;td&gt;4462721 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.michiganapples.com/portals/0/MAC%20"&gt;http://www.michiganapples.com/portals/0/MAC%20&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-11-15T14:57:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;3738&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.michiganapples.com/About/Varieties"&gt;www.michiganapples.com/About/Varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.michiganapples.com/About/Varieties"&gt;http://www.michiganapples.com/About/Varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;753A53E3C6A184B4C55A074E275F120D20D4914C&lt;/td&gt;
&lt;td&gt;ccid_3X+TRdGR*mid_753A53E3C6A184B4C55A074E275F...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Michigan Apple Varieties&lt;/td&gt;
&lt;td&gt;Michigan Apple Comm...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse2.mm.bing.net/th?id=OIP.3X-TRdGReOQ"&gt;https://tse2.mm.bing.net/th?id=OIP.3X-TRdGReOQ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;4F1E0F&lt;/td&gt;
&lt;td&gt;419131 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.flinchbaughsorchard.com/wp-content/"&gt;http://www.flinchbaughsorchard.com/wp-content/&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-11-01T04:51:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1691&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.flinchbaughsorchard.com/apple-varieties"&gt;www.flinchbaughsorchard.com/apple-varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.flinchbaughsorchard.com/apple-varie"&gt;http://www.flinchbaughsorchard.com/apple-varie&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;7480E3C7DDE4776D891D003001872D954786935C&lt;/td&gt;
&lt;td&gt;ccid_GBWUKwPQ*mid_7480E3C7DDE4776D891D00300187...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Apple Varieties&lt;/td&gt;
&lt;td&gt;Flinchbaugh's Orchard &amp;amp; Farm...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse1.mm.bing.net/th?id=OIP.GBWUKwPQyNo"&gt;https://tse1.mm.bing.net/th?id=OIP.GBWUKwPQyNo&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Step 6: Generate new filenames
&lt;/h3&gt;

&lt;p&gt;Since we get data back from many different websites, there is a chance that some filenames are identical. Also, some files might not have an extension in the URI, because of routing or image generation. To fix this we add a function that guesses the MIME type and, based on that, looks up the extension that belongs to it. Combining this with a generated GUID ensures a unique filename.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import mimetypes
import uuid

def getFileName(contentUrl):
    mt = mimetypes.guess_type(contentUrl)
    if mt[0] is not None:
        ext = mimetypes.guess_extension(mt[0])
        return str(uuid.uuid1()) + ext
    else:
        return ""&lt;/code&gt;&lt;/pre&gt;
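&lt;p&gt;As a quick sanity check, here is the same function applied to two hypothetical URLs (repeated so the snippet runs on its own). Note that the exact extension returned can vary per Python version; for JPEGs you may get ".jpe" rather than ".jpg", for example.&lt;/p&gt;

```python
import mimetypes
import uuid

def getFileName(contentUrl):
    # Guess the MIME type from the URL, then pick a matching extension
    # and prepend a GUID so the filename is unique.
    mt = mimetypes.guess_type(contentUrl)
    if mt[0] is not None:
        return str(uuid.uuid1()) + mimetypes.guess_extension(mt[0])
    return ""

# Hypothetical example URLs:
name = getFileName("https://example.com/images/honeycrisp.png")
assert name.endswith(".png")                       # extension recovered from MIME type
assert getFileName("https://example.com/render?img=42") == ""  # no guessable type
```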

&lt;p&gt;By applying the above function to each row, we can add an extra column to the dataframe that contains the newly generated fileName. NOTE: you might have noticed that the function can return an empty string. This is the case when it can't figure out the MIME type; those results we filter out.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;df['fileName'] = df.apply(lambda x: getFileName(x.content_url), axis=1)
df = df[df['fileName'] != ""]
df.head()&lt;/code&gt;&lt;/pre&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt; &lt;/th&gt;
&lt;th&gt;_type&lt;/th&gt;
&lt;th&gt;accent_color&lt;/th&gt;
&lt;th&gt;content_size&lt;/th&gt;
&lt;th&gt;content_url&lt;/th&gt;
&lt;th&gt;date_published&lt;/th&gt;
&lt;th&gt;encoding_format&lt;/th&gt;
&lt;th&gt;height&lt;/th&gt;
&lt;th&gt;host_page_display_url&lt;/th&gt;
&lt;th&gt;host_page_url&lt;/th&gt;
&lt;th&gt;image_id&lt;/th&gt;
&lt;th&gt;image_insights_token&lt;/th&gt;
&lt;th&gt;insights_metadata&lt;/th&gt;
&lt;th&gt;name&lt;/th&gt;
&lt;th&gt;thumbnail&lt;/th&gt;
&lt;th&gt;thumbnail_url&lt;/th&gt;
&lt;th&gt;web_search_url&lt;/th&gt;
&lt;th&gt;width&lt;/th&gt;
&lt;th&gt;fileName&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;0&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;9D2E39&lt;/td&gt;
&lt;td&gt;224643 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://upload.wikimedia.org/wikipedia/commons"&gt;https://upload.wikimedia.org/wikipedia/commons&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-09-14T23:40:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Apple"&gt;https://en.wikipedia.org/wiki/Apple&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Apple"&gt;https://en.wikipedia.org/wiki/Apple&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;6AEAF1C3894ED7D563916634549ED00175C970D3&lt;/td&gt;
&lt;td&gt;ccid_pfp3ysAm*mid_6AEAF1C3894ED7D563916634549E...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Apple -- Wikipedia&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse3.mm.bing.net/th?id=OIP.pfp3ysAmXA6"&gt;https://tse3.mm.bing.net/th?id=OIP.pfp3ysAmXA6&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;11c6ddea-3197-11ea-87d5-000d3aaa7d6e.jpe&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;4F300B&lt;/td&gt;
&lt;td&gt;161831 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://upload.wikimedia.org/wikipedia/commons"&gt;https://upload.wikimedia.org/wikipedia/commons&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-12-21T20:48:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1083&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Honeycrisp"&gt;https://en.wikipedia.org/wiki/Honeycrisp&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="https://en.wikipedia.org/wiki/Honeycrisp"&gt;https://en.wikipedia.org/wiki/Honeycrisp&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;4531DA369849BECBA2980492B1F021A9EF15BB99&lt;/td&gt;
&lt;td&gt;ccid_wPWDNJmR*mid_4531DA369849BECBA2980492B1F0...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Honeycrisp -- Wikipedia&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse1.mm.bing.net/th?id=OIP.wPWDNJmRmNP"&gt;https://tse1.mm.bing.net/th?id=OIP.wPWDNJmRmNP&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;11c6e2ea-3197-11ea-87d5-000d3aaa7d6e.jpe&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;C47807&lt;/td&gt;
&lt;td&gt;172706 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/wp-content/uploads"&gt;https://www.tasteofhome.com/wp-content/uploads&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2018-08-15T00:12:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/collection/new-typ"&gt;https://www.tasteofhome.com/collection/new-typ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.tasteofhome.com/collection/new-typ"&gt;https://www.tasteofhome.com/collection/new-typ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;EC1EBFE716FD1033C0E0983931FEC881E8A1177D&lt;/td&gt;
&lt;td&gt;ccid_7jSg14s4*mid_EC1EBFE716FD1033C0E0983931FE...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;15 New Types of Apples You Should Be Buying&lt;/td&gt;
&lt;td&gt;...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse2.mm.bing.net/th?id=OIP.7jSg14s46gp"&gt;https://tse2.mm.bing.net/th?id=OIP.7jSg14s46gp&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1200&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;7E0F24&lt;/td&gt;
&lt;td&gt;4462721 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.michiganapples.com/portals/0/MAC%20"&gt;http://www.michiganapples.com/portals/0/MAC%20&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-11-15T14:57:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;3738&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.michiganapples.com/About/Varieties"&gt;www.michiganapples.com/About/Varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.michiganapples.com/About/Varieties"&gt;http://www.michiganapples.com/About/Varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;753A53E3C6A184B4C55A074E275F120D20D4914C&lt;/td&gt;
&lt;td&gt;ccid_3X+TRdGR*mid_753A53E3C6A184B4C55A074E275F...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Michigan Apple Varieties&lt;/td&gt;
&lt;td&gt;Michigan Apple Comm...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse2.mm.bing.net/th?id=OIP.3X-TRdGReOQ"&gt;https://tse2.mm.bing.net/th?id=OIP.3X-TRdGReOQ&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;3738&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;ImageObject&lt;/td&gt;
&lt;td&gt;4F1E0F&lt;/td&gt;
&lt;td&gt;419131 B&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.flinchbaughsorchard.com/wp-content/"&gt;http://www.flinchbaughsorchard.com/wp-content/&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;2019-11-01T04:51:00.0000000Z&lt;/td&gt;
&lt;td&gt;jpeg&lt;/td&gt;
&lt;td&gt;1691&lt;/td&gt;
&lt;td&gt;&lt;a href="http://www.flinchbaughsorchard.com/apple-varieties"&gt;www.flinchbaughsorchard.com/apple-varieties&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;
&lt;a href="http://www.flinchbaughsorchard.com/apple-varie"&gt;http://www.flinchbaughsorchard.com/apple-varie&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;7480E3C7DDE4776D891D003001872D954786935C&lt;/td&gt;
&lt;td&gt;ccid_GBWUKwPQ*mid_7480E3C7DDE4776D891D00300187...&lt;/td&gt;
&lt;td&gt;{}&lt;/td&gt;
&lt;td&gt;Apple Varieties&lt;/td&gt;
&lt;td&gt;Flinchbaugh's Orchard &amp;amp; Farm...&lt;/td&gt;
&lt;td&gt;{'_type': 'ImageObject', 'width': 474, 'height...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://tse1.mm.bing.net/th?id=OIP.GBWUKwPQyNo"&gt;https://tse1.mm.bing.net/th?id=OIP.GBWUKwPQyNo&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;
&lt;a href="https://www.bing.com/images/search?view=detail"&gt;https://www.bing.com/images/search?view=detail&lt;/a&gt;...&lt;/td&gt;
&lt;td&gt;1800&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Step 7: Create folder to save the images
&lt;/h3&gt;

&lt;p&gt;This is a folder on your local or cloud environment.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import os

dirName = "image_gathering"
os.mkdir(dirName)&lt;/code&gt;&lt;/pre&gt;

&lt;h3&gt;
  
  
  Step 8: Download images
&lt;/h3&gt;

&lt;p&gt;First we create a function that tries to download the file to the new destination with the new filename. Bing's index might be outdated, so some downloads may fail (e.g. not found, not authorized, ...). Therefore the function returns True or False depending on whether the download was successful.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;!pip install wget
import wget

def downloadFile(dirName, contentUrl, fileName):
    try:
        wget.download(contentUrl, dirName + "/" + fileName)
        return True
    except Exception:
        return False&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;By applying this function to every row, every result will be downloaded, and we add an extra column that keeps track of whether it was successful or not.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;df['fileDownloaded'] = df.apply(lambda x: downloadFile(dirName, x.content_url, x.fileName), axis=1)&lt;/code&gt;&lt;/pre&gt;

&lt;h2&gt;
  
  
  THE END
&lt;/h2&gt;

&lt;p&gt;It's not that hard, and quite fast, to gather your own data based on results from Bing Search. Depending on what you want to achieve, you will need to run this script multiple times if you want images for different search terms. Or you can convert this script into a function that accepts an array of search strings.&lt;/p&gt;
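&lt;p&gt;One possible shape for that multi-term function (a hedged sketch; the names are mine, not an official API): the search call is injected as a parameter so the flow can be dry-run without the Bing SDK. In practice you would pass client.images.search from Step 3 and feed each result set through Steps 5-8.&lt;/p&gt;

```python
# Sketch: run the gathering flow for several search terms. The search
# function is injected so this can run without the Bing SDK installed.
def gather_images(search_terms, search_fn, count=150):
    return {term: search_fn(query=term, count=count) for term in search_terms}

# Stub standing in for the real client during a dry run:
def fake_search(query, count):
    return [f"{query}_{i}" for i in range(count)]

results = gather_images(["apples", "pears"], fake_search, count=3)
assert results["apples"] == ["apples_0", "apples_1", "apples_2"]
```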

&lt;h3&gt;
  
  
  Extra -- Move data to datastore
&lt;/h3&gt;

&lt;p&gt;In case you are using Azure Machine Learning Studio, it's a good idea to move your images to a datastore. That way other data scientists can also make use of them. The code below uploads the folder of images to a folder with the same name on your datastore.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import azureml.core
from azureml.core import Workspace, Datastore

ws = Workspace.from_config()

datastore_name = 'workspaceblobstore'
datastore = Datastore.get(ws, datastore_name)
datastore.upload(dirName, dirName)&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;The full Jupyter Notebook can be found on my GitHub&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/sammydeprez/PythonExamples/blob/master/BingImageSearch/BingImageSearch.ipynb"&gt;https://github.com/sammydeprez/PythonExamples/blob/master/BingImageSearch/BingImageSearch.ipynb&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enjoy gathering new data!&lt;/p&gt;

&lt;p&gt;Originally posted on &lt;a href="https://www.datafish.eu/article/data-gathering-for-image-recognition/"&gt;https://www.datafish.eu/article/data-gathering-for-image-recognition/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>imagerecognition</category>
      <category>cognitiveservices</category>
      <category>machinelearning</category>
      <category>bing</category>
    </item>
    <item>
      <title>Alterations - A hidden gem in QnaMaker</title>
      <dc:creator>Sammy Deprez</dc:creator>
      <pubDate>Mon, 16 Dec 2019 22:22:07 +0000</pubDate>
      <link>https://dev.to/sammydeprez/alterations-a-hidden-gem-in-qnamaker-42b7</link>
      <guid>https://dev.to/sammydeprez/alterations-a-hidden-gem-in-qnamaker-42b7</guid>
      <description>&lt;p&gt;Qna Maker is a great and easy tool. It can help you to build, in a matter of minutes, a mini chatbot. A chatbot that will replace your boring FAQ page.&lt;/p&gt;

&lt;p&gt;With the latest releases, a lot of cool features have been added.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://www.qnamaker.ai/old/Documentation/ActiveLearning"&gt;Active Learning&lt;/a&gt; -- a built-in function that lets your users help you by suggesting the correct answer to their question.&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://docs.microsoft.com/en-us/azure/cognitive-services/QnAMaker/how-to/multiturn-conversation"&gt;Multi Turn Questions&lt;/a&gt; -- in some cases you need to ask the user multiple questions to get to the correct answer. E.g. 'I am looking for a school.' You might first ask the user 'What school are you looking for?' with some suggestions like 'Music School', 'Primary School', ... With multi-turn this is possible without any extra coding in your bot.&lt;/li&gt;
&lt;li&gt;  Publish bot -- before this option you had to build your own chatbot with the SDK that calls the QnA Maker service. Now you can create a new chatbot within the QnA Maker portal.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But there is one feature that has been there from the beginning, hidden within the QnA Maker API.&lt;/p&gt;

&lt;h2&gt;
  
  
  Alterations - less training with synonyms
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XTYqMlYn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/gift_present-1024x405.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XTYqMlYn--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/gift_present-1024x405.png" alt="A gift equals a present. The words mean the same"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alterations can be compared to synonyms: a list of words that mean the same thing. For example, the word 'gift' can be swapped with the word 'present'. You might even use it for abbreviations: 'GDPR', for example, is a widely used term, but some people might call it 'AVG' (the Dutch term).&lt;/p&gt;

&lt;p&gt;But why would I call this a hidden gem? First of all, I am not a native English speaker and I don't live in an English-speaking country, so I build bots in languages other than English. Why is this important? Well, QnA Maker has been pre-trained with a lot of synonyms in English and some other big languages. And you guessed it: that's not done for the language I need (not at the moment, at least).&lt;/p&gt;

&lt;p&gt;When you add a list of well-chosen alterations, you make your QnA Maker smarter in a short time, since you need to add fewer alternative questions to your knowledge base.&lt;/p&gt;

&lt;p&gt;Enough talking. I said it's a hidden gem because you cannot do this within the QnA Maker portal; you need to use the API for it.&lt;/p&gt;

&lt;h2&gt;
  
  
  How it's done!
&lt;/h2&gt;

&lt;p&gt;Let's start by opening your favorite REST API client (e.g. Postman).&lt;/p&gt;

&lt;p&gt;The API has two functions for alterations.&lt;/p&gt;

&lt;p&gt;The first one gets a list of all the alterations that have been stored. The second one overwrites your current list.&lt;/p&gt;

&lt;h3&gt;
  
  
  Get current alterations
&lt;/h3&gt;

&lt;p&gt;Call the REST API&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  GET {Endpoint}/qnamaker/v4.0/alterations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Endpoint is for example &lt;a href="https://westus.api.cognitive.microsoft.com"&gt;https://westus.api.cognitive.microsoft.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the headers of your request, add the Ocp-Apim-Subscription-Key, which is equal to the key you can find in the Azure Portal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ABq42xZX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-230443-1024x531.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ABq42xZX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-230443-1024x531.png" alt="You can find the endpoint and key for your cognitive service in the Azure Portal"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Postman it would look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RuTi9LS---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-221606-1024x304.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RuTi9LS---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-221606-1024x304.png" alt="An example of how the REST API can be called in Postman"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you have never used alterations then the first time you execute this, you will receive an empty array:&lt;/p&gt;

&lt;p&gt;{"wordAlterations": []}&lt;/p&gt;

&lt;h2&gt;
  
  
  Add alterations
&lt;/h2&gt;

&lt;p&gt;Now let us add some alterations. To do this, we call the REST API:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  PUT {Endpoint}/qnamaker/v4.0/alterations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The Endpoint is the same as for getting the alterations, and we again need to add the Ocp-Apim-Subscription-Key to the header.&lt;/p&gt;

&lt;p&gt;On top of that, we need to send the alterations themselves. In the body of our request we add something like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
    "wordAlterations": [
        {
            "alterations": ["auto", "wagen"]
        },
        {
            "alterations": ["gebruiker", "medewerker"]
        },
        {
            "alterations": ["wachtwoord", "paswoord"]
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;So the structure consists of a 'wordAlterations' object that is an array of 'alterations' objects. Each 'alterations' object is in turn an array of words that are synonyms.&lt;/p&gt;
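&lt;p&gt;The PUT call can likewise be sketched in Python with the standard library. Again, the endpoint and key are placeholders; the payload matches the JSON body above.&lt;/p&gt;

```python
import json
import urllib.request

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder
SUBSCRIPTION_KEY = "<your-key>"                          # placeholder

# Each entry's "alterations" list is one group of synonyms.
payload = {
    "wordAlterations": [
        {"alterations": ["auto", "wagen"]},
        {"alterations": ["gebruiker", "medewerker"]},
        {"alterations": ["wachtwoord", "paswoord"]},
    ]
}

# Build the PUT request; urllib.request.urlopen(req) would send it.
req = urllib.request.Request(
    ENDPOINT + "/qnamaker/v4.0/alterations",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    method="PUT",
)
print(req.get_method())
```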

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rMarHnEp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-221607-1024x514.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rMarHnEp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://datafish.eu/wp-content/uploads/2019/12/Annotation-2019-12-16-221607-1024x514.png" alt="An example of how adding alterations to the Qna Maker service can be done"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now read this carefully.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; This function replaces all the alterations that were already stored in your QnA Maker service.&lt;/li&gt;
&lt;li&gt; Alterations are applied to all the knowledge bases within your QnA Maker service.&lt;/li&gt;
&lt;/ol&gt;
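&lt;p&gt;Because the PUT call replaces the whole list, a safe pattern is to GET the existing alterations first, merge in your new groups, and PUT the combined list back. A minimal sketch of the merge step (the existing list here is stand-in data for a real GET response):&lt;/p&gt;

```python
# Existing alterations as returned by the GET call (stand-in data).
existing = {"wordAlterations": [{"alterations": ["gift", "present"]}]}

# New synonym groups we want to add without losing the old ones.
new_groups = [{"alterations": ["auto", "wagen"]}]

# Merge, then PUT this combined payload back to the alterations endpoint.
merged = {"wordAlterations": existing["wordAlterations"] + new_groups}
print(len(merged["wordAlterations"]))  # 2
```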

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Sadly, alterations are not built into the QnA Maker portal. But do not forget about them while you are building a knowledge base: they can really improve the quality of your knowledge base and will save you time training it. So: less training, thanks to synonyms.&lt;/p&gt;

&lt;p&gt;More detailed information about the alteration function can be found here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/rest/api/cognitiveservices/qnamaker/alterations"&gt;https://docs.microsoft.com/en-us/rest/api/cognitiveservices/qnamaker/alterations&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Originally posted at: &lt;a href="https://datafish.eu/article/alterations-a-hidden-gem-in-qnamaker/"&gt;https://datafish.eu/article/alterations-a-hidden-gem-in-qnamaker/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>alterations</category>
      <category>qnamaker</category>
      <category>chatbot</category>
      <category>synonyms</category>
    </item>
  </channel>
</rss>
