<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: idiWork</title>
    <description>The latest articles on DEV Community by idiWork (@idiwork).</description>
    <link>https://dev.to/idiwork</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F230391%2F225c82cf-b79d-482b-bd94-47729979f2ed.jpg</url>
      <title>DEV Community: idiWork</title>
      <link>https://dev.to/idiwork</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/idiwork"/>
    <language>en</language>
    <item>
      <title>Brand presence</title>
      <dc:creator>idiWork</dc:creator>
      <pubDate>Mon, 29 Mar 2021 11:45:42 +0000</pubDate>
      <link>https://dev.to/idiwork/experminet-206-brand-presence-dfi</link>
      <guid>https://dev.to/idiwork/experminet-206-brand-presence-dfi</guid>
      <description>&lt;h1&gt;
  
  
  About
&lt;/h1&gt;

&lt;p&gt;Companies and brands have always invested in having a presence, whether by appearing in certain ads or through influencers on social media.&lt;/p&gt;

&lt;p&gt;However, this investment is wasted if there is no traceability of where the brand has appeared, for how long, to how many people and, most importantly, what impact and return on investment it has produced for the business, as well as no way of answering other crucial questions. &lt;/p&gt;

&lt;h1&gt;
  
  
  Idea
&lt;/h1&gt;

&lt;p&gt;Providing traceability of our brand presence is a task that involves a lot of effort because of the volume of data processing it requires. A successful analysis of this data supports the market research and correlation studies that show where to invest in future campaigns to maximize business objectives. The goal of this experiment is to exploit the huge amount of information generated by different data sources and to analyze it for decision-making.&lt;/p&gt;

&lt;p&gt;Thanks to artificial intelligence and parallel data processing, this is now possible. &lt;/p&gt;

&lt;h1&gt;
  
  
  Process
&lt;/h1&gt;

&lt;p&gt;First, we must bring together all the data sources where our brand appears (TV channels, radio, social networks, Twitch, YouTube channels, etc.). Several processes in the Azure cloud will be in charge of this. Then, thanks to parallel data processing and sophisticated artificial intelligence systems, from both Cognitive Services and ad-hoc trained neural networks (all of this also deployed in Azure), our brand can be identified. In addition, the current audience level, as well as other relevant information, will be extracted at all times. All this information will then be stored in a corporate Data Warehouse after an ETL process that consolidates it. Finally, an analysis will be carried out in Power BI to transform this information into knowledge that analysts can exploit for decision-making. &lt;/p&gt;

&lt;h1&gt;
  
  
  Utility
&lt;/h1&gt;

&lt;p&gt;Building brand presence is a common practice in which a large part of a company’s capital is invested. Providing traceability and analyzing its impact is a difficult task if it’s not done with the right tools.  &lt;/p&gt;

&lt;p&gt;Thanks to Artificial Intelligence and Big Data techniques, we can bring all this knowledge to analysts so they can answer their questions, maximize return on investment (ROI) and achieve their business objectives. &lt;/p&gt;

&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;Companies have always striven to maintain a presence in their sector. Each year, they invest huge amounts of money to appear in strategic locations and achieve a greater volume of sales, customers and so on. But do they conduct studies and keep traceability of:&lt;/p&gt;

&lt;p&gt;• Where has my brand appeared? &lt;/p&gt;

&lt;p&gt;• For how long? &lt;/p&gt;

&lt;p&gt;• How many people have seen my brand? &lt;/p&gt;

&lt;p&gt;• In which media? &lt;/p&gt;

&lt;p&gt;• What impact has it had?&lt;/p&gt;

&lt;p&gt;There is no doubt about the repercussion that a presence in these strategic places provides. But if we enrich it with a posteriori studies capable of informing decision-making, our return on investment (ROI) is all but guaranteed.  &lt;/p&gt;

&lt;p&gt;This problem creates real headaches when it comes to handling large volumes of data from different sources and analyzing them. But, thanks to artificial intelligence and parallel data processing, it is now tractable. &lt;/p&gt;

&lt;h1&gt;
  
  
  Architecture
&lt;/h1&gt;

&lt;p&gt;A high-level architecture capable of satisfying the aforementioned needs would be the following: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Tr3f8sEg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nx46q654ubrxixa7ffgq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Tr3f8sEg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nx46q654ubrxixa7ffgq.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Roughly speaking, we would have a series of sources from which we analyze when the brand is present and how many viewers it is reaching. Among these sources, we can find: &lt;/p&gt;

&lt;p&gt;• Twitch platform&lt;/p&gt;

&lt;p&gt;• YouTube&lt;/p&gt;

&lt;p&gt;• Sports such as football, tennis, MotoGP, basketball…&lt;/p&gt;

&lt;p&gt;• TV channels&lt;/p&gt;

&lt;p&gt;• Radio&lt;/p&gt;

&lt;p&gt;• Social networks like Instagram, Facebook…&lt;/p&gt;

&lt;p&gt;These are just a few examples of the wide range of possibilities; we can add as many sources as we want. Let’s look at it in a little more detail: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NJEqYHSf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jdon63507o8a21s70yka.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NJEqYHSf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jdon63507o8a21s70yka.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We would start from a base of services written in Python that are in charge of connecting to the different APIs and sources and capturing their data. Besides connecting, they would make the necessary cuts to keep only the data of interest. &lt;/p&gt;
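&lt;p&gt;As a rough illustration (not the actual implementation), one of these capture services could be sketched as follows. The &lt;code&gt;fetch_posts&lt;/code&gt; callable and the field names are hypothetical placeholders for whichever API is being polled:&lt;/p&gt;

```python
# Minimal sketch of a capture service: poll a source API and keep only
# the fields of interest (the "cuts") before sending them downstream.
# fetch_posts() and the field names are placeholders, not a real API.

FIELDS_OF_INTEREST = ("id", "timestamp", "media_url", "views", "likes")

def cut(raw_post: dict) -> dict:
    """Keep only the fields needed for brand analysis."""
    return {key: raw_post.get(key) for key in FIELDS_OF_INTEREST}

def capture(fetch_posts) -> list:
    """Pull a batch from the source and apply the cuts."""
    return [cut(post) for post in fetch_posts()]
```

&lt;p&gt;Each source (Twitch, YouTube, a TV signal sampler, etc.) would get its own &lt;code&gt;fetch_posts&lt;/code&gt; implementation, while the cut step stays uniform so the downstream AI services always receive a homogeneous payload.&lt;/p&gt;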

&lt;p&gt;This data would then be sent both to Azure Cognitive Services and to custom AI models, which will be responsible for identifying the brand in each of the data sources. &lt;/p&gt;
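&lt;p&gt;As a hedged sketch, the post-processing of a detection response could look like this. The JSON shape mimics the brand-detection output of Azure Computer Vision (a list of detections with a name and a confidence score), but the threshold, field names and sample values here are illustrative assumptions:&lt;/p&gt;

```python
# Post-processing sketch for a brand-detection response. The structure
# approximates Azure Computer Vision's "brands" result; values are fake.

CONFIDENCE_THRESHOLD = 0.7  # tuned per brand and per source in practice

def confirmed_brands(analysis: dict, threshold: float = CONFIDENCE_THRESHOLD) -> list:
    """Return brand names detected with confidence at or above threshold."""
    return [det["name"] for det in analysis.get("brands", [])
            if det.get("confidence", 0.0) >= threshold]

sample = {"brands": [{"name": "Contoso", "confidence": 0.91},
                     {"name": "Fabrikam", "confidence": 0.42}]}
```

&lt;p&gt;Only the detections that survive the threshold would be forwarded to the ETL stage, together with the audience figures captured at that instant.&lt;/p&gt;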

&lt;p&gt;Finally, all this information will travel through Azure Data Factory, where the Business Intelligence Extract, Transform and Load (ETL) process takes place: the information is transformed, analyzed and joined with the rest of the data in our central repository, the Data Warehouse.  &lt;/p&gt;

&lt;p&gt;This last part feeds the subsequent analyses performed with Power BI, which are what we make available to the corresponding department for decision-making. &lt;/p&gt;

&lt;h1&gt;
  
  
  Preparation
&lt;/h1&gt;

&lt;p&gt;The sources from which we are going to extract the information are totally heterogeneous, which requires a set of techniques and scripts to capture the data we need. In this case, we decided to use the Python programming language, which offers a suite of tools capable of handling this heavy task.&lt;/p&gt;

&lt;p&gt;For different social networks we have at our disposal the corresponding API to obtain the desired information (publications, likes, visits, etc.). And, both from Twitch and the different television channels, we have direct access to the signal.&lt;/p&gt;

&lt;p&gt;Later, thanks to artificial intelligence, we will be able to detect whether the brand is present, as in the following images:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--19DvAdmm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rf1b3f73j4gxc4bp0pmn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--19DvAdmm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rf1b3f73j4gxc4bp0pmn.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MI29cRhO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vcx1myctlz62ddr0gh3k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MI29cRhO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vcx1myctlz62ddr0gh3k.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Services
&lt;/h1&gt;

&lt;p&gt;Once the information has been collected, different services will be in charge of detecting the presence of the brands in each image. Apart from customized models that will improve and enrich our system, we will use Computer Vision, which is one of Azure’s Cognitive Services:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KgAaaB9n--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0eom7pj5jpuvzln80m26.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KgAaaB9n--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0eom7pj5jpuvzln80m26.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Analysis of the information
&lt;/h1&gt;

&lt;p&gt;Once we have the architecture assembled and the data has been processed by the Cognitive Services, the next step is extracting information from the pre-processed data to make it available to the analysis departments, which will use it for decision making.&lt;/p&gt;

&lt;p&gt;To tackle this last part, we have decided to rely on a great ally: Power BI.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding the information
&lt;/h2&gt;

&lt;p&gt;Next, we will explain how the information in Power BI is structured.&lt;/p&gt;

&lt;p&gt;At the top we find the sources:&lt;/p&gt;

&lt;p&gt;• TV channels (for example)&lt;br&gt;
  o Tele 5.&lt;/p&gt;

&lt;p&gt;And within each channel we can find different categories, like:&lt;/p&gt;

&lt;p&gt;• Programs&lt;br&gt;
  o The comedy club =&amp;gt; We detect a brand on:&lt;br&gt;
      the stage&lt;br&gt;
      the comedian’s clothes&lt;br&gt;
     -&amp;gt; We identify the comedians and detect the number of spectators&lt;/p&gt;

&lt;p&gt;• Sports&lt;br&gt;
  o Football matches =&amp;gt; We detect a brand on:&lt;br&gt;
     the stadium&lt;br&gt;
     the players’ clothing&lt;br&gt;
    -&amp;gt; We identify those players and detect number of spectators&lt;/p&gt;

&lt;p&gt;• Ads (again, same process)&lt;/p&gt;

&lt;p&gt;In Social networks:&lt;/p&gt;

&lt;p&gt;• Instagram&lt;br&gt;
  o FC Barcelona account&lt;br&gt;
     Publications: We detect the brands&lt;br&gt;
    -&amp;gt; We identify the player and count the number of visits and likes&lt;/p&gt;

&lt;p&gt;• Twitch: Same concept as the previous ones. We would be able to navigate the different streamers, identify them and detect how long a brand has been seen on the screen and for how many viewers.&lt;/p&gt;

&lt;p&gt;• And so on…&lt;/p&gt;

&lt;h1&gt;
  
  
  Analysis
&lt;/h1&gt;

&lt;p&gt;“Impact” is the effect caused by each of the elements. For example, the impact of a football broadcast would be calculated as the product of the seconds a brand appears and the number of spectators. In a social media post, however, the impact equals the number of viewers reached.&lt;/p&gt;
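&lt;p&gt;The two variants of the metric described above can be written down directly; the figures below are made up for illustration:&lt;/p&gt;

```python
# Worked example of the "impact" metric:
#   broadcast impact = seconds on screen x number of spectators
#   social post impact = viewers reached

def broadcast_impact(seconds_on_screen: float, spectators: int) -> float:
    return seconds_on_screen * spectators

def post_impact(viewers_reached: int) -> int:
    return viewers_reached

# A brand shown for 30 seconds to 2 million spectators:
football = broadcast_impact(30, 2_000_000)  # 60_000_000
# An Instagram post that reached 500,000 people:
post = post_impact(500_000)                 # 500_000
```

&lt;p&gt;These functions are simply where each business would plug in its own formula.&lt;/p&gt;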

&lt;p&gt;The “impact” metric is fully customizable and must be defined as the business experts dictate, because each company behaves in a different way.&lt;/p&gt;

&lt;p&gt;Let’s start with the report. In it, the different elements will appear numbered to facilitate their reading and monitoring.&lt;/p&gt;

&lt;h2&gt;
  
  
  Global metrics – tab
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JSblmzej--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sp8frjjbydbr9dypoajs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JSblmzej--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sp8frjjbydbr9dypoajs.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The percentage of impact of each social network. We can see how Instagram and Facebook predominate, the first being the winner.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Percentage of impact by subtype. In this case, we can see how eSports and entertainment stand out from the rest.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Regarding television channels, Cuatro and La Sexta come out on top.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;There is no doubt about the impact that Twitch is gaining. That is why we thought it would be interesting to compare it with the impact of both social networks and TV. In this case, we can see how Twitch could practically be put on the same level as the other two, so the investment made in this platform is proving truly effective.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, in terms of general impact by source (without breaking down by subtype), Twitch would be the one performing most satisfactorily.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Social media – tab
&lt;/h2&gt;

&lt;p&gt;In this tab we are going to enter into a study between the different social networks:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jNhIGk2j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pmufb98og5jup7jrltmg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jNhIGk2j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pmufb98og5jup7jrltmg.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Number of total publications vs the target set. We set a goal of 1000 and we are 67 publications above it.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Although the number of publications has been exceeded, we have not received as many “likes” as expected.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;However, our custom metric relating both “likes” and visits to publications tells us that we have reached far more people than expected, so the objectives are being achieved.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In this chart we can see the impact of our publications by date. February was clearly the busiest month.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;One of the studies to be carried out is at what time of day our publications have the greatest impact. Thanks to this visual, we can see that from 8 to 9 and from 13 to 17 is when we have the greatest impact.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Here we can see the impact of each of the social networks. In addition, it also works as a filter.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If we want to know which person in a publication has the greatest impact, this chart will tell us. Here Cristiano Ronaldo and Lionel Messi are tied at the top of the list, while Iker Casillas seems to be the one who creates the least impact.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We also have a couple of filters at our disposal to filter by “Service”, that is, by a particular social network account or by a particular person.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, we can see the number of publications by type (image vs video).&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If we want to filter by a specific account (FC Barcelona, for example) this would be what we would see:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--t5gyIHEm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l55gu79iavwmq5obgkn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--t5gyIHEm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l55gu79iavwmq5obgkn6.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And here an example of FC Real Madrid:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rH036gBM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ser7w71cd3q38ibfzagn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rH036gBM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ser7w71cd3q38ibfzagn.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since the analysis shows that posts featuring Cristiano Ronaldo or Lionel Messi have the most impact, decision-makers would want these two to appear in most of the posts in order to generate even more impact. And as publications on the FC Barcelona account work very well between 8 and 9, the account should try to publish at that specific time, or between 1 and 3 o’clock, which is a good time to publish for both accounts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Twitch – tab
&lt;/h2&gt;

&lt;p&gt;As we have seen before, Twitch is one of the leading platforms at the moment. And to many, the new television.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bp0dka-B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fhxwtdps1s8v4ngx1med.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bp0dka-B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fhxwtdps1s8v4ngx1med.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Personalized impact vs the objective set.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Seconds the brand appears on screen vs the target.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Number of people we have reached vs the target.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When the posts have had the most impact. In social networks, it is crucial to know the best time slot to publish, so this chart is very important.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Impact per person. In addition, it also works as a filter. We can see how Ibai is the leader, followed by AuronPlay.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The impact by date.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  “TV Channel” tab
&lt;/h2&gt;

&lt;p&gt;This tab shows the analysis of different television channels and programs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SJt1yiWI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sdf9owrcnwurhdfg02kb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SJt1yiWI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sdf9owrcnwurhdfg02kb.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;As in the previous tabs, here we can see the customized impact vs the objective.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Total seconds vs the goal.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;People we have reached vs the objective.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Time period in which the most impact has been achieved, this being mainly between 6:00 p.m. and 3:00 a.m.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Impact by service. As we can see, the ads have greater impact than a football game. But we have to keep in mind that the cost of appearing in an ad differs from the cost of appearing in a stadium or on the players’ shirts, so in this case we should also elaborate metrics such as impact vs invested money.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The impact in different TV channels.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Impact per person who appears on screen with the brand. This information can be used to choose one person over another for future ads, in order to achieve a greater impact.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you want to read more go to &lt;a href="https://www.idiwork.com/projects/experiment-206/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Stay up to date: &lt;a href="https://www.idiwork.com"&gt;idiwork.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Follow us on &lt;a href="https://twitter.com/idiWork"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Links
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/"&gt;Azure&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Azure Cognitive Services&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://powerbi.microsoft.com/en-us/"&gt;PowerBI&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>azure</category>
      <category>cognitiveservices</category>
      <category>bigdata</category>
    </item>
    <item>
      <title>Analysis of hot spots in physical stores</title>
      <dc:creator>idiWork</dc:creator>
      <pubDate>Wed, 03 Feb 2021 09:39:39 +0000</pubDate>
      <link>https://dev.to/idiwork/analysis-of-hot-spots-in-physical-stores-2ldl</link>
      <guid>https://dev.to/idiwork/analysis-of-hot-spots-in-physical-stores-2ldl</guid>
      <description>&lt;h1&gt;
  
  
  About
&lt;/h1&gt;

&lt;p&gt;There are countless processes in a physical store that produce data that can be used and exploited to generate or increase sales processes. &lt;/p&gt;

&lt;p&gt;In addition to measuring the time spent in the store and registering whether the visitor ends up buying something or not, the purpose of experiment #205 is to use facial recognition services to locate hot spots in the store, sentiment analysis to measure the client’s level of satisfaction, and &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Cognitive Services&lt;/a&gt; to analyze bags or objects from other brands.&lt;/p&gt;

&lt;p&gt;With all this information it’s possible to calculate a predictive stock to automatically supply the store.&lt;/p&gt;

&lt;h1&gt;
  
  
  Idea
&lt;/h1&gt;

&lt;p&gt;The idea arises from the need for, and the lack of use of, the information that can be exploited in physical stores. Nowadays, data-driven decision making is crucial for the proper development of a business, and thanks to AI it is now within reach.&lt;/p&gt;

&lt;h1&gt;
  
  
  Process
&lt;/h1&gt;

&lt;p&gt;First, we will use the store’s cameras to analyze the images. Then, we will develop software that collects all the information and sends it to the cloud to be analyzed by &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Cognitive Services&lt;/a&gt;. Once in the cloud, all the necessary analysis, the entire ETL process and a further integration with the corporate Data Warehouse will be carried out and exploited through &lt;a href="https://powerbi.microsoft.com/en-us/"&gt;PowerBI&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Utility
&lt;/h1&gt;

&lt;p&gt;The system lets us find correlations between the average time spent in a store and the probability of a purchase, detect hot spots and patterns for possible rearrangements of the layout, predict stock levels, analyse customer sentiment to identify possible dissatisfaction, and detect the brands that customers consume for segmentation.&lt;/p&gt;

&lt;p&gt;If you want to read more go to &lt;a href="https://www.idiwork.com/projects/experiment-205/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;For many people, artificial intelligence is something remote, and for others, a future that is yet to come. But the reality is different: we can see around us a huge number of examples where artificial intelligence is part of our daily lives. Who has not watched a series on Netflix or booked a hotel room on Booking suggested by the product recommender? Or simply used Google to search the internet? &lt;/p&gt;

&lt;p&gt;On the other hand, as subject matter experts, we are used to seeing companies and large organizations fail to take full advantage of their data, even though data is, without a doubt, ‘the new oil’. Extracting data and making the most of it is a complex task but, done in the right way, the return on investment is more than guaranteed. &lt;/p&gt;

&lt;p&gt;Today we will focus on unraveling a system capable of tracing the patterns that people follow at a shopping center; analyzing their feelings at the arrival and departure, whether they go to the fitting room, buy something and much more. This procedure is carried out with &lt;a href="https://azure.microsoft.com/en-us/"&gt;Azure&lt;/a&gt; and the results are shown in &lt;a href="https://powerbi.microsoft.com/en-us/"&gt;Power BI&lt;/a&gt;. &lt;/p&gt;

&lt;h1&gt;
  
  
  Description of the system
&lt;/h1&gt;

&lt;p&gt;Data from the shopping center’s security cameras is rarely exploited. Thanks to advances in technology, we can study our consumers not only when they interact through our e-commerce platforms but also in the shops. &lt;/p&gt;

&lt;p&gt;The system will answer the following questions: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How many people have entered the shop? &lt;/li&gt;
&lt;li&gt;How long have they stayed inside? &lt;/li&gt;
&lt;li&gt;Which departments has the person visited? &lt;/li&gt;
&lt;li&gt;How much time has the person spent in each department? &lt;/li&gt;
&lt;li&gt;Has the person been in the fitting room? &lt;/li&gt;
&lt;li&gt;Has the person bought anything? &lt;/li&gt;
&lt;li&gt;How long has the person been queuing to pay? &lt;/li&gt;
&lt;li&gt;What is the person’s feeling (positive/negative, maybe surprise at a price, etc.) throughout the visit to the shop? &lt;/li&gt;
&lt;li&gt;Which parts of the store are most often visited?&lt;/li&gt;
&lt;li&gt;Which bags does the person have? &lt;/li&gt;
&lt;/ul&gt;
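&lt;p&gt;Several of these questions (time spent in the store, time spent per department) reduce to aggregating camera sightings per tracked person. A minimal sketch, assuming each sighting is a (timestamp-in-seconds, zone) pair, could be:&lt;/p&gt;

```python
# Sketch: attribute the time between consecutive sightings of a person
# to the zone where they were first seen. The data shape is an
# assumption for illustration, not the real tracking output.

def time_per_zone(sightings):
    """sightings: list of (timestamp_seconds, zone) tuples, in order."""
    totals = {}
    for (t0, zone), (t1, _) in zip(sightings, sightings[1:]):
        totals[zone] = totals.get(zone, 0.0) + (t1 - t0)
    return totals

visit = [(0, "entrance"), (60, "menswear"), (300, "fitting_room"),
         (480, "cash_desk"), (540, "exit")]
```

&lt;p&gt;For this hypothetical visit, the person spent 240 seconds in menswear and 180 in the fitting room, and the sum over all zones gives the total time in the store.&lt;/p&gt;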

&lt;h1&gt;
  
  
  Architecture
&lt;/h1&gt;

&lt;p&gt;The simplified architecture to meet all the needs and answer the previous questions would be this one: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--V7NORjVp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v3zfmdraug6on2y02mda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--V7NORjVp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v3zfmdraug6on2y02mda.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We would start from a base of n shops. Each shop has several security cameras that will be used to trace the patterns and routes of each person. &lt;/p&gt;

&lt;p&gt;At the shop’s closing time (although it could also be done in real time), all the data stored by the security cameras would be sent to Azure’s services for further analysis. &lt;/p&gt;

&lt;p&gt;Let’s see it in detail: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SEN27DxZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/90a75ak9og2jhsglm167.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SEN27DxZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/90a75ak9og2jhsglm167.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each shop will have a database where the different data generated throughout the day will be stored. At the end of the day, an automatic process will send the images captured by the cameras to be analyzed in Azure. In order to meet the business needs, the Face and Computer Vision tools will be used to answer all the target questions that were raised before. &lt;/p&gt;

&lt;p&gt;Finally, all the data generated by &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Azure Cognitive Services&lt;/a&gt; will be stored in an &lt;a href="https://azure.microsoft.com/en-us/services/sql-database/"&gt;Azure SQL Server&lt;/a&gt; in &lt;a href="https://azure.microsoft.com/en-us/overview/what-is-paas/"&gt;PaaS&lt;/a&gt;, for later analysis in Power BI and, thus, making it available for the analysis department. &lt;/p&gt;

&lt;h1&gt;
  
  
  Organization of each shop
&lt;/h1&gt;

&lt;p&gt;Each store will need to be divided into n areas so that the later analysis can be focused in the desired direction. &lt;/p&gt;

&lt;p&gt;For example, in a shop like this: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7xPABGFa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y457yq7s82h6herjkfe9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7xPABGFa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y457yq7s82h6herjkfe9.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The cameras would need to be strategically placed in order to visualize the shop’s: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Aisles &lt;/li&gt;
&lt;li&gt;Entry and exit of the shop &lt;/li&gt;
&lt;li&gt;Entry and exit of the fitting rooms
&lt;/li&gt;
&lt;li&gt;Cash desks &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In addition, the shop also needs to be divided into different areas, for example into red, green and blue zones, which may correspond to the men’s, women’s and children’s departments. Like this: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2WTPy3ja--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1xrluwdmq6xozhhf5euo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2WTPy3ja--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1xrluwdmq6xozhhf5euo.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Services
&lt;/h1&gt;

&lt;p&gt;As discussed above, we will use Microsoft Cognitive Services for image processing:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IlHFH8Gp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/emj4m411zl9qpusrw77f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IlHFH8Gp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/emj4m411zl9qpusrw77f.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WYnhS1kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2eiioi2r7gbjrcb0ugm5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WYnhS1kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2eiioi2r7gbjrcb0ugm5.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the wide spectrum it offers, we will use the &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/face/"&gt;Facial Recognition&lt;/a&gt; and &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/"&gt;Computer Vision&lt;/a&gt; services.&lt;/p&gt;

&lt;h1&gt;
  
  
  Facial Recognition
&lt;/h1&gt;

&lt;p&gt;The Facial Recognition service detects the faces of the people who pass through the space we are studying. In addition to identifying them and differentiating them from each other, it also recognizes facial expressions and tells us the emotion the person is feeling at that specific moment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--emQroZQh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ftii3ahi6qnvy9hx34sb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--emQroZQh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ftii3ahi6qnvy9hx34sb.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Computer Vision
&lt;/h1&gt;

&lt;p&gt;The Computer Vision service identifies the brands of the bags that our customers carry with them:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c8EVUxM4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/65baejhpo6yf2pe15990.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c8EVUxM4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/65baejhpo6yf2pe15990.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In addition, the Computer Vision service locates the people in our area, giving us their x and y coordinates within the establishment so we can identify movement patterns, hot spots and much more.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tskeyO5R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8avhifasj7gdfqfgq0ly.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tskeyO5R--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8avhifasj7gdfqfgq0ly.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Assembling the system
&lt;/h1&gt;

&lt;p&gt;Now that we know the final goal and the parts that compose it, let’s assemble the whole system so that it functions as a whole and provides the information we need to analyze our business and gain a competitive advantage:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;As discussed earlier, the first step is to have each of the stores divided into sections that we are interested in studying. In addition to this, we would conduct a study to identify how many cameras would be needed and where they should be located to be able to control the entire space.&lt;/li&gt;
&lt;li&gt;Once this is done, the next step would be to have a system in charge of collecting the photos from each of the cameras along with metadata such as the time, the camera that took the photo, and anything else that interests us. All the information will be stored in a database (for example, a SQL Server instance) available in each store.&lt;/li&gt;
&lt;li&gt;At the end of each day, there must be a process that will take all the information generated on that day and send it to our services in Azure so we can identify faces, emotions, brands, location in the store, etc.&lt;/li&gt;
&lt;li&gt;Finally, all the information generated by the services will be stored in a centralized database (on an Azure SQL Server instance in PaaS) in order to perform the relevant analyses.&lt;/li&gt;
&lt;/ol&gt;
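&lt;p&gt;The per-store capture log in step 2 can be sketched in a few lines. Here SQLite stands in for the store’s SQL Server instance, and the table and column names are invented for illustration:&lt;/p&gt;

```python
import sqlite3
from datetime import datetime, timezone

def open_store_db(path=":memory:"):
    """Per-store capture log; SQLite stands in for the local SQL Server."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS captures (
                      id        INTEGER PRIMARY KEY,
                      camera_id TEXT NOT NULL,
                      zone      TEXT NOT NULL,
                      taken_at  TEXT NOT NULL,
                      image_ref TEXT NOT NULL)""")
    return db

def record_capture(db, camera_id, zone, image_ref, taken_at=None):
    """Store one photo reference together with its metadata."""
    taken_at = taken_at or datetime.now(timezone.utc).isoformat()
    db.execute("INSERT INTO captures (camera_id, zone, taken_at, image_ref) "
               "VALUES (?, ?, ?, ?)", (camera_id, zone, taken_at, image_ref))
    db.commit()

def captures_for_day(db, day_prefix):
    """Everything captured on a given day, ready for the nightly upload."""
    cur = db.execute("SELECT camera_id, zone, taken_at, image_ref "
                     "FROM captures WHERE taken_at LIKE ?",
                     (day_prefix + "%",))
    return cur.fetchall()
```

&lt;p&gt;The end-of-day process from step 3 would then read the day’s rows and push each referenced image through the cloud services.&lt;/p&gt;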

&lt;p&gt;With the arrival of the GDPR, and to ensure that we don’t send images of customers who haven’t given their explicit consent over the network, we may slightly change the previous assembly:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The first step is the same. We will have each store divided into sections and we would conduct a study to identify how many cameras would be needed and where they should be located. &lt;/li&gt;
&lt;li&gt;We would need exactly the same system to collect the photos from each of the cameras along with the time, the camera that took them and other metadata that interests us. All this information will be stored in the database available in each store. This point remains intact in both methods.&lt;/li&gt;
&lt;li&gt;At the end of each day, there will be a process that takes all the information generated on that day and processes it on the store’s own system, bringing the models of the Azure services mentioned earlier on-premises to process the images as an IoT Edge system. With this technique the faces are stored as unique, irreversible, anonymous identifiers (for example, as hashes).&lt;/li&gt;
&lt;li&gt;Once we have processed the images, the information will be stored in the same database where the images and raw data have been stored.&lt;/li&gt;
&lt;li&gt;At this point we could delete all the raw information so that only anonymous data remains. A third approach would even be to process the images in real time, so that no raw images are stored at all and any privacy-related problems are eliminated.&lt;/li&gt;
&lt;li&gt;Finally, once we have stored all the processed information in the database, we will run one last process before the analysis: a Business Intelligence ETL (Extract, Transform and Load) that gathers the processed data from all the stores into a centralized database in Azure (an Azure SQL Server instance). &lt;/li&gt;
&lt;/ol&gt;
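&lt;p&gt;The anonymization in step 3 can be as simple as a salted one-way hash. This sketch assumes some byte representation of the face (for example a serialized embedding); the same face maps to the same identifier, so visits can still be correlated, but the raw data cannot be recovered and the per-store salt prevents cross-store linking:&lt;/p&gt;

```python
import hashlib

def anonymous_face_id(face_signature: bytes, store_salt: bytes) -> str:
    """One-way identifier for a face: deterministic for the same input,
    irreversible, and salted per store."""
    return hashlib.sha256(store_salt + face_signature).hexdigest()
```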

&lt;h1&gt;
  
  
  Analysis of the information
&lt;/h1&gt;

&lt;p&gt;Once we have assembled the architecture and processed the data using the Cognitive Services, the next step is crucial: we must extract insights from that preprocessed data and bring them to the analysts, who will use them for decision-making.&lt;/p&gt;

&lt;p&gt;To address this last phase, we’ve decided to make use of a great ally: &lt;a href="https://powerbi.microsoft.com/en-us/"&gt;PowerBI&lt;/a&gt;. The report is as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KpnWDPM1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k319jif4vrutnltg1gff.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KpnWDPM1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k319jif4vrutnltg1gff.png" alt="image"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;We have numbered it in such a way that it’s easier to explain and locate the items in the report.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A bar chart in which we see the percentage of the brands of the bags that the customers carried. In this case almost 80% were Zara's.&lt;/li&gt;
&lt;li&gt;We can set different target values. In this case, we have established as our target that 50% of people who enter the store should buy an item. In this example 51.61% of the people ended up buying something.&lt;/li&gt;
&lt;li&gt;In addition to the sales objectives, we have also observed that there may be a correlation between the purchase and the visit to the fitting room. That’s why we established as a target that at least 40% of customers should go through the fitting room. In our case, 36.11% have done so; we’re a little below the target.&lt;/li&gt;
&lt;li&gt;We are also interested in fostering positive feelings within our store. One of the moments where most discontentment occurs is while waiting, so we have set a limit of 10 minutes of queuing. On average, our customers spent 12.24 minutes queuing, a value somewhat above our goal.&lt;/li&gt;
&lt;li&gt;On the other hand, another aspect that also has a correlation with the final purchase is the time spent in the store. We have set 15 minutes as the minimum time a person should spend in the store. In this case, the average time spent in the store was 20.67 minutes, so this target was achieved.&lt;/li&gt;
&lt;li&gt;We also have some gender statistics. In our store 60% of the visitors are women and 40% are men.&lt;/li&gt;
&lt;li&gt;As for age, we can see how men and women are, on average, 34 and 44 years old respectively.&lt;/li&gt;
&lt;li&gt;As for the areas we defined earlier, we can see the percentage of people who passed through the different areas to take a look at the products. In the case of zone 1, 78% of the people who entered the store ended up passing through zone 1.&lt;/li&gt;
&lt;li&gt;In the case of zone 2, 60% did.&lt;/li&gt;
&lt;li&gt;Finally, in the case of zone 3, only 28% ended up visiting it. This may indicate that the area is not well designed or signposted, or simply doesn’t have very attractive products.&lt;/li&gt;
&lt;li&gt;In this bar chart we see the total number of people who entered the store each day and the predominant feeling during that day. This can help us detect key days when more staff or better organization is needed, as well as observe peaks (both high and low) in visits.&lt;/li&gt;
&lt;li&gt;Finally, a filter to segment by date.&lt;/li&gt;
&lt;/ol&gt;
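&lt;p&gt;The targets in points 2 to 5 all boil down to comparing an observed ratio or time against a goal. A small sketch, using the values from the example report (the metric names and the "min"/"max" sense, i.e. whether higher or lower is better, are our own illustration):&lt;/p&gt;

```python
TARGETS = {
    # metric: (goal, sense); "min" means the value should be at least
    # the goal, "max" means it should stay at or below it
    "purchase_rate":     (0.50, "min"),
    "fitting_room_rate": (0.40, "min"),
    "avg_queue_minutes": (10.0, "max"),
    "avg_stay_minutes":  (15.0, "min"),
}

def check_targets(observed, targets=TARGETS):
    """Compare observed store metrics against their configured targets."""
    result = {}
    for metric, (goal, sense) in targets.items():
        value = observed[metric]
        met = value >= goal if sense == "min" else goal >= value
        result[metric] = {"value": value, "target": goal, "met": met}
    return result
```

&lt;p&gt;Feeding in the report’s numbers flags exactly the two misses discussed above: the fitting room rate and the queuing time.&lt;/p&gt;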

&lt;p&gt;In point 11, we see that there is a day with a predominantly negative feeling. We’re talking about January 31:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ptRkjSso--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vyeia16hlv4mgj9z5v59.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ptRkjSso--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vyeia16hlv4mgj9z5v59.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When we filter it we can draw numerous conclusions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;As we have said before, the first thing is that the prevailing feeling on this day is negative.&lt;/li&gt;
&lt;li&gt;We can see how the percentage of purchases is 68.3%, well above the average (51.61%).&lt;/li&gt;
&lt;li&gt;Visits to the fitting rooms dropped to 31.3%. This can happen on a special sales day, when it is more convenient for customers to buy and later return an item than to wait in line for the fitting room.&lt;/li&gt;
&lt;li&gt;Finally, the average time spent queuing has increased dramatically to 42.3 minutes.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Sentiment and pattern analysis
&lt;/h1&gt;

&lt;p&gt;An example of a successful purchase, without any negative emotions or highlights:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PDmK-uxE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tommqkird641uimeumkc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PDmK-uxE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tommqkird641uimeumkc.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;In this chart we can see the path the person follows within our establishment, where the person is, and how much time they spend in the same place. It also shows the emotions the visitor feels throughout their stay. At one of the points it reflects surprise, probably because the price of something they like is lower than expected. Also, after leaving the fitting room the visitor seems happy with the result.&lt;/li&gt;
&lt;li&gt;A slicer that moves the timeline forward so that chart 1 can be viewed dynamically, replaying the person’s journey through the store as if it were happening for real.&lt;/li&gt;
&lt;li&gt;We can also see the actions this person has taken. In this case, the visitor made a purchase (the garment that surprised them) and a return.&lt;/li&gt;
&lt;li&gt;The brands on the bags with which the visitor entered.&lt;/li&gt;
&lt;li&gt;Gender data.&lt;/li&gt;
&lt;li&gt;Age of the visitor.&lt;/li&gt;
&lt;li&gt;The time spent in the store. In this case it was higher than the minimum of 15 minutes we set earlier.&lt;/li&gt;
&lt;li&gt;The time spent in the queue: 8 minutes, less than our maximum.&lt;/li&gt;
&lt;li&gt;And finally, we have at our disposal a slicer for filters per person.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Next, let’s look at an example of a negative situation in the establishment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--on44Ii_Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ndro7f25zd8c00bcpgbo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--on44Ii_Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ndro7f25zd8c00bcpgbo.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This visitor, like the previous one, found a garment that was intriguing and, after passing through the fitting room, was convinced enough to buy it.&lt;/p&gt;

&lt;p&gt;However, we can see that in the end she didn’t take any action. If we look closely at the pivot chart, just as she was queuing her feelings changed drastically, first to anger and finally to disgust.&lt;/p&gt;

&lt;p&gt;And we can clearly see that this person was in the establishment for 32 minutes, 24 of which were spent waiting in line: a wait that makes it unrealistic to expect satisfied customers or pleasant emotions in the establishment.&lt;/p&gt;

&lt;p&gt;These reports are very close to traditional Business Intelligence reports, with graphs, metrics, KPIs, etc. to analyze our business. We can add as many as our analysis and business teams need to make the right decisions. We have chosen these as an example, but countless studies and metrics could be calculated depending on the idiosyncrasies of each establishment.&lt;/p&gt;

&lt;h1&gt;
  
  
  Benefits
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Different KPIs for sales and stock&lt;/li&gt;
&lt;li&gt;Sales Forecast&lt;/li&gt;
&lt;li&gt;Intelligent detection of areas of improvement&lt;/li&gt;
&lt;li&gt;Suggestions of promotions&lt;/li&gt;
&lt;li&gt;Seller analysis&lt;/li&gt;
&lt;li&gt;Customer behavior analysis&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Other cases of use
&lt;/h1&gt;

&lt;p&gt;We have adapted this experiment to the retail industry as we think it’s a case in which it could be applied perfectly in order to get great results and a great return on investment.&lt;/p&gt;

&lt;p&gt;However, there are numerous examples where this system could also be implemented:&lt;/p&gt;

&lt;h2&gt;
  
  
  Sports
&lt;/h2&gt;

&lt;p&gt;Track athletes, their relative positions and the movements they make when scoring. Analyze correlations and find the key to success.&lt;/p&gt;

&lt;p&gt;In addition, we could also analyze the audience that attends these events, their patterns, behavior, etc.&lt;/p&gt;

&lt;h2&gt;
  
  
  Medicine
&lt;/h2&gt;

&lt;p&gt;We know that medicine is one of the sectors where AI is very present. This system could provide us with disease control, tracking of people with the disease and much more. This could help to prevent and/or combat diseases in a much more effective way.&lt;/p&gt;

&lt;h2&gt;
  
  
  Farming
&lt;/h2&gt;

&lt;p&gt;Although it may seem like a very conservative sector, the use of new technologies is the trend, and the competitive advantages that can be achieved through their inclusion are considerable.&lt;br&gt;
This system would allow us to study and control how animals move, what they eat, whether they have drunk enough water, and whether they act accordingly.&lt;/p&gt;

&lt;h2&gt;
  
  
  Transportation
&lt;/h2&gt;

&lt;p&gt;This system can also be very useful in the transport sector. In any space where people, vehicles or objects move, it allows us to analyze all the information completely autonomously and draw conclusions.&lt;/p&gt;

&lt;h2&gt;
  
  
  More
&lt;/h2&gt;

&lt;p&gt;Traceability of people in department stores, where there are a large number of shops&lt;br&gt;
Traceability at airports to maximize their economic performance through passenger purchases&lt;br&gt;
Covid-19 traceability&lt;br&gt;
Optimization of activities (both lucrative and leisure) on cruise ships&lt;br&gt;
Optimization of waiting times in line&lt;br&gt;
Traceability in supermarkets&lt;/p&gt;

&lt;p&gt;If you want to read more go to &lt;a href="https://www.idiwork.com/experiment-205-applied-artificial-intelligence-the-real-one/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Links
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/"&gt;Azure&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Azure Cognitive Services&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/data-factory/"&gt;Azure Data Factory&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/free/machine-learning/"&gt;Azure Machine Learning&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/free/sql-database/"&gt;Azure SQL&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/face/"&gt;Cognitive Services – Face API&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/overview/what-is-paas/"&gt;PaaS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://powerbi.microsoft.com/en-us/"&gt;PowerBI&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>machinelearning</category>
      <category>dataanalysis</category>
      <category>azure</category>
    </item>
    <item>
      <title>Facial Recognition Access Control</title>
      <dc:creator>idiWork</dc:creator>
      <pubDate>Tue, 28 Jan 2020 11:24:28 +0000</pubDate>
      <link>https://dev.to/idiwork/facial-recognition-access-control-lfl</link>
      <guid>https://dev.to/idiwork/facial-recognition-access-control-lfl</guid>
      <description>&lt;h1&gt;
  
  
  About
&lt;/h1&gt;

&lt;p&gt;Experiment #103 explores the possibilities of applying Cognitive Services to create an access control system managed solely by an Artificial Intelligence, giving it the capabilities of speaking human languages and recognizing human faces.&lt;/p&gt;

&lt;h1&gt;
  
  
  Idea
&lt;/h1&gt;

&lt;p&gt;The project tries to simulate an automatic access control system with facial recognition. The application will tell the visitor whether or not he/she is on a whitelist, in order to open the door and welcome them or keep it closed to strangers. The idea is based on an old project from &lt;a href="https://www.hackster.io/windows-iot/windows-iot-facial-recognition-door-e087ce"&gt;hackster.io&lt;/a&gt; created by &lt;a href="https://www.hackster.io/ethankusters"&gt;Ethan Kusters&lt;/a&gt; and &lt;a href="https://www.hackster.io/mazudo"&gt;Masato Sudo&lt;/a&gt; and published in 2016.&lt;/p&gt;

&lt;h1&gt;
  
  
  Utility
&lt;/h1&gt;

&lt;p&gt;For this simulation we will build a smart box containing an item hidden behind a little door. The door will remain locked until the application recognizes a face included in the whitelist. If the Artificial Intelligence recognizes the visitor, the door will automatically open, revealing the item. This way, the owners of the system only have to take care of who is part of the whitelist by enrolling the different visitors.&lt;/p&gt;

&lt;h1&gt;
  
  
  Process
&lt;/h1&gt;

&lt;p&gt;First, we are going to create a Windows Universal Platform application with C# and .NET Core; this application will be our user interface. After that, we will connect a webcam to interact with the &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/face/"&gt;Cognitive Service Vision Face API&lt;/a&gt;. Then we will assemble all the electronic parts: an LED light, a button and a servo motor. We will install &lt;a href="https://docs.microsoft.com/en-us/windows/iot-core/"&gt;Windows 10 IoT Core&lt;/a&gt; on a &lt;a href="https://www.raspberrypi.org/products/raspberry-pi-2-model-b/"&gt;Raspberry Pi 2 Model B&lt;/a&gt; device to run the application on it, using its GPIO pins. We will also build a 3D printed smart box to contain the entire project and the electronics. Finally, we will plug in a speaker to hear the machine speak thanks to the &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/text-to-speech/"&gt;Cognitive Service Text to Speech API&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Advantages
&lt;/h1&gt;

&lt;p&gt;The principal advantage of using Cognitive Services in this simulation is the ability to identify people quickly and accurately. The Artificial Intelligence can work for us 24 hours a day, 7 days a week without interruption. This way a company saves money and resources and becomes much more efficient at controlling access to a building, for example. In addition, the capability to speak gives the machine a more human appearance.&lt;/p&gt;

&lt;h1&gt;
  
  
  Architecture
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--g7kFrItR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_Facial_Recognition_Architecture_Diagram_A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g7kFrItR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_Facial_Recognition_Architecture_Diagram_A.png" alt="PoC facial recognition architecture diagram"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, we created a Windows Universal Application to run the project on desktop and mobile. This app allows us to register the faces of our whitelisted users (01). These people will be the ones with access permissions.&lt;/p&gt;

&lt;p&gt;The app connects to Azure (02) and its &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Cognitive Services&lt;/a&gt; (03) to use the &lt;a href="https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/text-to-speech"&gt;Text to Speech&lt;/a&gt; (04) service and the &lt;a href="https://docs.microsoft.com/en-us/azure/cognitive-services/face/"&gt;Face&lt;/a&gt; (05) service we have previously deployed.&lt;/p&gt;

&lt;p&gt;With the &lt;a href="https://docs.microsoft.com/en-us/azure/cognitive-services/face/"&gt;Face&lt;/a&gt; cognitive service, we can store all users by adding their photos to a Person Group List (06) in the cloud.&lt;/p&gt;

&lt;p&gt;There are reverse arrows in the diagram because the process also works for consulting the Person Group list and telling us whether the visitor is registered on it.&lt;/p&gt;
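&lt;p&gt;The experiment itself is a C# UWP app, but the door-opening decision on top of a Face API identify response can be sketched in a few lines. The JSON shape follows the Face identify result (a list of faces, each with scored candidates); the confidence threshold is our own assumption:&lt;/p&gt;

```python
def should_open_door(identify_result, confidence_threshold=0.7):
    """Open only when a whitelisted person matches a detected face with
    enough confidence; returns (open?, matched personId or None)."""
    for face in identify_result:
        for candidate in face.get("candidates", []):
            if candidate["confidence"] >= confidence_threshold:
                return True, candidate["personId"]
    return False, None
```

&lt;p&gt;A stranger produces either no candidates or only low-confidence ones, so the door stays closed.&lt;/p&gt;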

&lt;p&gt;If you want to read more go to &lt;a href="https://www.idiwork.com/experiment-103-architectural-diagram/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Step by Step
&lt;/h1&gt;

&lt;h2&gt;
  
  
  1. How to work with Face Cognitive Service
&lt;/h2&gt;

&lt;p&gt;First, we are going to create the &lt;a href="https://docs.microsoft.com/en-us/azure/cognitive-services/face/"&gt;Face Resource&lt;/a&gt; from our Azure portal under &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Cognitive Services&lt;/a&gt; group:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VpeSLeov--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://www.idiwork.com/wp-content/uploads/Face_Service.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VpeSLeov--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://www.idiwork.com/wp-content/uploads/Face_Service.gif" alt="Face service GIF"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-103-how-to-modify-the-project-to-work-with-face-cognitive-service-and-servo-motor/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. How to set up the IoT device hardware
&lt;/h2&gt;

&lt;p&gt;We will use a &lt;a href="https://www.raspberrypi.org/products/raspberry-pi-2-model-b/"&gt;Raspberry Pi 2 Model B&lt;/a&gt; to run our IoT project. We also need to attach some peripherals to it, such as a Nano USB Wi-Fi Adapter, a USB Compact Speaker Set and a Microsoft LifeCam HD-3000.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/BEEPtL6jqNA"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;If you want to see the whole process visit &lt;a href="https://www.idiwork.com/experiment-103-how-to-set-up-the-iot-device-hardware-peripherals-and-electronics/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. How to pack and launch the IoT project
&lt;/h2&gt;

&lt;p&gt;The 3D model of the project box was created in &lt;a href="https://www.sketchup.com/"&gt;SketchUp&lt;/a&gt;. Inside the six faces, there are individual spaces for each component.&lt;/p&gt;

&lt;p&gt;Then, we used a &lt;a href="https://www.prusa3d.com/prusaslicer/"&gt;Slicer Software&lt;/a&gt; to convert the STL 3D files into G-Code that our 3D Printer can understand and print.&lt;/p&gt;

&lt;p&gt;Finally, we printed the 3D parts in a &lt;a href="https://www.prusa3d.com/prusaslicer/"&gt;Prusa i3 MK2S&lt;/a&gt; printer with black, red and white filaments from &lt;a href="https://www.smartmaterials3d.com/"&gt;Smart Materials&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/2hq4-DKWP5M"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-103-how-to-build-a-3d-printed-box-to-pack-and-run-the-iot-project/"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Links
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/"&gt;Azure Cognitive Services&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/face/"&gt;Cognitive Services – Face API&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/text-to-speech/"&gt;Cognitive Services – Text to Speech&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.raspberrypi.org/products/raspberry-pi-2-model-b/"&gt;Raspberry Pi 2 Model B&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developer.microsoft.com/en-us/windows/iot"&gt;Windows IoT Core&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/windows/iot-core/connect-your-device/iotdashboard"&gt;Windows IoT Dashboard&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.thingiverse.com/thing:4020615"&gt;Thingiverse 3D Model&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.sketchup.com/"&gt;SketchUp 3D Software&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.prusa3d.com/"&gt;Prusa i3 MK2 3D Printer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.prusa3d.com/prusaslicer/"&gt;PrusaSlicer Software&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.smartmaterials3d.com/"&gt;3D Printing Filament&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Check the code in the &lt;a href="https://github.com/idiWork/Experiment_103"&gt;GitHub repo&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>iot</category>
      <category>dotnet</category>
      <category>ai</category>
    </item>
    <item>
      <title>Create a hotel customer reviews classification with Azure</title>
      <dc:creator>idiWork</dc:creator>
      <pubDate>Tue, 07 Jan 2020 07:59:26 +0000</pubDate>
      <link>https://dev.to/idiwork/create-a-hotel-customer-reviews-classification-with-azure-iep</link>
      <guid>https://dev.to/idiwork/create-a-hotel-customer-reviews-classification-with-azure-iep</guid>
      <description>&lt;h3&gt;
  
  
  About
&lt;/h3&gt;

&lt;p&gt;Experiment #102 explores the possibilities of applying Cognitive Services and Deep Learning to understand and manage a huge amount of information from human interactions without human intervention. We will discover whether the Artificial Intelligence is capable of collecting all this raw data, classifying it and managing all the different situations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Idea
&lt;/h2&gt;

&lt;p&gt;The project tries to simulate the automatic classification of comments or reviews in a business environment. Any company wants to identify critical requests as soon as possible and solve them before they become a big problem, regardless of the number of messages that arrive, the language in which they were written, or the time they were sent.&lt;/p&gt;

&lt;h2&gt;
  
  
  Utility
&lt;/h2&gt;

&lt;p&gt;For this simulation we have chosen the case of a hotel. We will imagine that the customers have access to a mobile application in which they can leave their comments, complaints and suggestions. They can use images to illustrate their point of view or location, and text messages to explain it. Based on their content, the reviews will be classified as good, neutral or bad.&lt;/p&gt;

&lt;p&gt;In this case we want to classify each comment, identify the bad, urgent and important ones to manage them quickly and, if necessary, report them to a different human agent depending on the location (room, bar, restaurant, swimming pool, etc.), who can solve the issue properly.&lt;/p&gt;
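&lt;p&gt;The routing rule just described can be sketched as follows. The sentiment and urgency labels would come from the trained classifier, not from this function, and the field names are invented for illustration:&lt;/p&gt;

```python
def route_review(review):
    """Decide what happens to one classified review.

    `review` is a dict like {"sentiment": "bad", "urgent": True,
    "location": "restaurant"}; the labels are assumed to be produced
    upstream by the classification model."""
    if review["sentiment"] == "bad" and review.get("urgent"):
        return "escalate", review["location"]        # alert the right agent now
    if review["sentiment"] == "bad":
        return "queue_for_staff", review["location"]
    return "archive", None                           # good/neutral: just store
```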

&lt;h2&gt;
  
  
  Process
&lt;/h2&gt;

&lt;p&gt;First, we are going to create an &lt;a href="https://notebooks.azure.com" rel="noopener noreferrer"&gt;Azure Notebooks&lt;/a&gt; project where we can set up a Notebook server. Then we will create and upload some Python files to deploy a Text Summarization service in an &lt;a href="https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-manage-workspace" rel="noopener noreferrer"&gt;Azure Machine Learning Workspace&lt;/a&gt;. After that, we will build and train a simple deep neural network classification model, with &lt;a href="https://keras.io/models/model/" rel="noopener noreferrer"&gt;Keras&lt;/a&gt; and TensorFlow, that will classify the hotel customers’ reviews.&lt;/p&gt;

&lt;p&gt;Also, we will integrate the &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/" rel="noopener noreferrer"&gt;Azure Cognitive Services&lt;/a&gt; with Azure Machine Learning. We will use the Computer Vision API to collect information from pictures or photographs and the Text Analytics API to extract data from human utterances.&lt;/p&gt;
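&lt;p&gt;As an illustration of the Text Analytics side, the snippet below builds the JSON body for the v3 sentiment endpoint: one document per review, each with an id, a language and its text. The surrounding HTTP call (endpoint, key) would come from your Azure resource and is omitted here:&lt;/p&gt;

```python
def sentiment_payload(reviews, language="en"):
    """Build the request body for the Text Analytics v3 sentiment
    endpoint from a list of raw review strings."""
    return {"documents": [
        {"id": str(i), "language": language, "text": text}
        for i, text in enumerate(reviews, start=1)]}
```

&lt;p&gt;The service answers with a sentiment label and scores per document id, which is what our classifier consumes.&lt;/p&gt;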

&lt;h2&gt;
  
  
  Advantages
&lt;/h2&gt;

&lt;p&gt;The main advantage of using Cognitive Services and Deep Learning in this case is the ability to identify very quickly the reviews or comments that need to be handled carefully by a human being because of their importance or urgency.&lt;/p&gt;

&lt;p&gt;Even if tons and tons of messages arrive, we can be sure that the most critical ones will be identified and handled immediately, and that the rest will be classified and stored correctly.&lt;/p&gt;

&lt;p&gt;This way a company will save money and resources and will be much more efficient in attending to customers’ requests.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FPoC_ML_Architecture_Diagram_A-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FPoC_ML_Architecture_Diagram_A-1.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We tried to simulate hotel customer inputs from different sources (01), like social media. These hotel reviews (02) are sent to an Artificial Intelligence service for analysis.&lt;/p&gt;

&lt;p&gt;We used &lt;a href="https://notebooks.azure.com/" rel="noopener noreferrer"&gt;Azure Notebooks&lt;/a&gt; with &lt;a href="https://jupyter.readthedocs.io/en/latest/index.html" rel="noopener noreferrer"&gt;Jupyter&lt;/a&gt; to create, train and deploy the services we need. Some of them are Containerized Services (03) that we trained with Deep Learning, like the Summarize Service (05) and the Classification Service (06). Others are Cognitive Services (04) that we use directly from Microsoft, like Computer Vision (07) and Text Analytics (08).&lt;/p&gt;

&lt;p&gt;If you want to read more go to &lt;a href="https://www.idiwork.com/experiment-102-architectural-diagram/" rel="noopener noreferrer"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step by Step
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. How to create an Azure Notebooks project
&lt;/h3&gt;

&lt;p&gt;First, we can access this Azure Notebooks Service by visiting &lt;a href="https://notebooks.azure.com/" rel="noopener noreferrer"&gt;https://notebooks.azure.com/&lt;/a&gt; and then, we can create a new &lt;a href="https://notebooks.azure.com/home/projects" rel="noopener noreferrer"&gt;Azure Notebooks Project&lt;/a&gt; from “My Projects”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FStepByStep_01_01_project.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FStepByStep_01_01_project.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-102-how-to-create-an-azure-notebooks-project-and-deploy-a-summarization-service/" rel="noopener noreferrer"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. How to construct a Deep Neural Network
&lt;/h3&gt;

&lt;p&gt;Keras is a Python library that allows us to construct Deep Learning models. First, we must import all the needed modules and download the text analytics files from our GitHub repository. The cells of the “.ipynb” notebook look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;re&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;nltk&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;uuid&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pandas&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;pd&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;tensorflow&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;tf&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;keras&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;keras&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;models&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;layers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;optimizers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;regularizers&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;keras.models&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Sequential&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;keras.layers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Activation&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;keras.utils&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;to_categorical&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;urllib.request&lt;/span&gt;
&lt;span class="n"&gt;data_location&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;./data&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
&lt;span class="n"&gt;base_data_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://raw.githubusercontent.com/idiWork/Experiment_102/master/resources/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
&lt;span class="n"&gt;filesToDownload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;reviews_labels.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;reviews_texts.txt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;contractions.py&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;textanalytics.py&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;makedirs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data_location&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;exist_ok&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="nb"&gt;file&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;filesToDownload&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;data_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;base_data_url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;file&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;local_file_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data_location&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;file&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;urllib&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;urlretrieve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data_url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;local_file_path&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;nltk&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;download&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;stopwords&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
&lt;span class="n"&gt;nltk&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;download&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;punkt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;sys&lt;/span&gt;
&lt;span class="n"&gt;sys&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data_location&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;textanalytics&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ta&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
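
&lt;p&gt;Before the training cells run, it can help to picture what such a classification network computes. A minimal NumPy sketch of the forward pass (the layer sizes and the random weights are illustrative only, not the notebook's trained model):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative sizes: 100-dim text features -> 32 hidden units -> 3 classes
# (good / neutral / bad). Weights are random here; Keras would learn them.
W1, b1 = rng.standard_normal((100, 32)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((32, 3)) * 0.1, np.zeros(3)

def predict(features):
    hidden = relu(features @ W1 + b1)   # dense layer + ReLU activation
    return softmax(hidden @ W2 + b2)    # dense layer + softmax over the classes

probs = predict(rng.standard_normal((1, 100)))
print(probs.shape)  # (1, 3): one probability per review class
```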



&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-102-how-to-construct-and-train-a-deep-neural-network-using-keras-and-deploy-the-model-as-an-azure-web-service/" rel="noopener noreferrer"&gt;this post&lt;/a&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  3. How to deploy Azure Cognitive Services
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/" rel="noopener noreferrer"&gt;Computer Vision&lt;/a&gt; is an &lt;a href="https://azure.microsoft.com/en-us/services/cognitive-services/" rel="noopener noreferrer"&gt;Azure Cognitive Service&lt;/a&gt; that allow us to get information from an image. It analyzes the image environment, detects objects inside them and classifies all of them by assigning tags and labels. We can find this resource in our Azure Portal under the “AI + Machine Learning” group:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FExp102_StepByStep_03_01.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.idiwork.com%2Fwp-content%2Fuploads%2FExp102_StepByStep_03_01.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After deploying the service, we must look up and write down the subscription key and the endpoint in order to integrate the service into our project.&lt;/p&gt;
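
&lt;p&gt;As an illustration, the key and endpoint are typically combined like this. This is only a sketch: the endpoint and key values are placeholders, the request is assembled but never sent, and the route and header name follow the Computer Vision REST API's "analyze" operation:&lt;/p&gt;

```python
# Sketch: assemble (but do not send) a Computer Vision "analyze" request.
# The key and endpoint are placeholders; the route and the
# Ocp-Apim-Subscription-Key header follow the Computer Vision REST API.

def build_analyze_request(endpoint, subscription_key, image_url):
    url = endpoint.rstrip("/") + "/vision/v3.2/analyze"
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    params = {"visualFeatures": "Description,Tags"}
    body = {"url": image_url}
    return url, headers, params, body

url, headers, params, body = build_analyze_request(
    "https://example.cognitiveservices.azure.com",  # placeholder endpoint
    "YOUR-SUBSCRIPTION-KEY",                        # placeholder key
    "https://example.com/hotel-room.jpg",           # placeholder review image
)
print(url)
```

&lt;p&gt;Sending it would then be a single &lt;code&gt;requests.post(url, headers=headers, params=params, json=body)&lt;/code&gt; call, whose JSON response carries the generated captions and tags.&lt;/p&gt;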

&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-102-how-to-deploy-and-integrate-azure-cognitive-services-computer-vision-and-text-analytics/" rel="noopener noreferrer"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. How to use Microsoft Flow to send alerts
&lt;/h3&gt;

&lt;p&gt;We have created a very simple C# console project to simulate the reception of hotel reviews. The program makes HTTP requests to the different services we have previously deployed to collect all the information from the image and text of the review. These services include containerized services like Summarize or Classification and Cognitive Services like Computer Vision or Text Analytics.&lt;/p&gt;

&lt;p&gt;All review data will be stored in a SQL database. This is the database definition:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;dbo&lt;/span&gt;&lt;span class="p"&gt;].[&lt;/span&gt;&lt;span class="n"&gt;review&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;IDENTITY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;text&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="k"&gt;MAX&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;64&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;sentiment&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;64&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;2048&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;keys&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;captions&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;categories&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;tags&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nvarchar&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;CONSTRAINT&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;PK_review&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt; &lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
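
&lt;p&gt;To illustrate how a processed review could end up in this table, here is a sketch that uses SQLite as a local stand-in for the Azure SQL database (the column names follow the table above; the row values are invented):&lt;/p&gt;

```python
import sqlite3

# SQLite stand-in for the Azure SQL [dbo].[review] table defined above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE review (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        text TEXT, class TEXT, sentiment TEXT, summary TEXT,
        keys TEXT, image TEXT, captions TEXT, categories TEXT, tags TEXT
    )
""")

# Invented example row: what the services might produce for one bad review.
conn.execute(
    "INSERT INTO review (text, class, sentiment, summary) VALUES (?, ?, ?, ?)",
    ("The shower in room 204 is broken.", "bad", "0.05",
     "Broken shower, room 204."),
)

row = conn.execute("SELECT class, summary FROM review").fetchone()
print(row)  # ('bad', 'Broken shower, room 204.')
```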



&lt;p&gt;If you want to see the whole process just go to &lt;a href="https://www.idiwork.com/experiment-102-how-to-use-microsoft-flow-to-send-an-email-when-an-event-occurs/" rel="noopener noreferrer"&gt;this post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To keep reading about these topics, check out the following links:&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://azure.microsoft.com/en-us/services/logic-apps/" rel="noopener noreferrer"&gt;Azure Logic Apps&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>python</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Using Blockchain &amp; IoT during organ transportation</title>
      <dc:creator>idiWork</dc:creator>
      <pubDate>Tue, 24 Sep 2019 07:46:36 +0000</pubDate>
      <link>https://dev.to/idiwork/using-blockchain-iot-during-organ-transportation-4fe1</link>
      <guid>https://dev.to/idiwork/using-blockchain-iot-during-organ-transportation-4fe1</guid>
      <description>&lt;p&gt;Technology has made possible to imagine incredible things, but we cannot lose sight of the reason we create: people.&lt;/p&gt;

&lt;p&gt;We must continue creating to improve our present and our future, and one of the most important areas where we can improve is the medical field. At idiWork we have imagined what it would be like to combine IoT with blockchain for the transport of volatile goods, such as organs in transplant surgeries.&lt;/p&gt;

&lt;p&gt;Through IoT, we can have continuous information on the state of an organ. With Azure technology, we can transmit information about the organ in real time, and with blockchain we can ensure that it is transmitted securely under any circumstance.&lt;/p&gt;

&lt;p&gt;In addition, through the smart contract that we will create with blockchain, we will maintain a meticulous record of each action performed throughout the process, while also making sure that every party involved stays synchronized and acts at the moment it is required.&lt;/p&gt;

&lt;p&gt;This is the structure of the project, in which we can see how we obtain the information from the physical device and register it properly...&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p_4idsnY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_IoT_Blockchain_Architecture_Diagram_A.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p_4idsnY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_IoT_Blockchain_Architecture_Diagram_A.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;... so that each event and useful information reaches all the users involved:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mS5hbai8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_IoT_Blockchain_Architecture_Diagram_B.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mS5hbai8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://www.idiwork.com/wp-content/uploads/PoC_IoT_Blockchain_Architecture_Diagram_B.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Do you want to know how we have achieved it? In &lt;a href="https://www.idiwork.com/projects/experiment-101/"&gt;idiWork you will find the step by step process in great detail&lt;/a&gt;, from the IoT device setting to the Azure functions development, to learn how this technology works and to continue creating.&lt;/p&gt;

&lt;p&gt;Want to know more?&lt;/p&gt;

&lt;h3&gt;
  
  
  First part: IoT device setup 🔌
&lt;/h3&gt;

&lt;p&gt;In the &lt;a href="https://www.idiwork.com/step-by-step-how-to-set-up-an-iot-device-in-azure-iot-hub/"&gt;first part, we will make the IoT device setup through Azure IoT hub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;We register the device we work with, configuring its Wi-Fi connection and enabling the transmission of information to Azure services. You can even follow the process with the device in our video, an MXChip AZ3166. Check it out:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/WZLupmVKxKs"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h3&gt;
  
  
  Second part: Blockchain Workbench App 🔒
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.idiwork.com/how-to-create-a-blockchain-workbench-app/"&gt;In the second part, you will learn how to create a Blockchain Worbench App&lt;/a&gt;. Here we will detail one of the most important parts of this process: the smart contract. We can define all possible states of the organ that is transported, the minimum and maximum temperature, and the parts that operate in each state as well. Thus everything is coordinated to avoid errors, and facilitate all parties to act in the moment and in the way they should.&lt;/p&gt;

&lt;h3&gt;
  
  
  Third part: Azure Function App 💻
&lt;/h3&gt;

&lt;p&gt;Finally, &lt;a href="https://www.idiwork.com/experiment-101-how-to-create-and-azure-function-app-to-record-telemtry-readings/"&gt;the third part will show how to create an Azure Function App to record all telemetry readings&lt;/a&gt;. We will specify the values of the telemetry readings that we record, and you can explore the C# code in detail, where we see how the actions of the process are authorized and how the information is serialized and deserialized as JSON. &lt;/p&gt;

&lt;p&gt;You can even see this experiment in action, don't miss it!&lt;/p&gt;

&lt;h3&gt;
  
  
  This is how we can imagine, build and create a great future.
&lt;/h3&gt;

&lt;p&gt;Don't hesitate to visit the repository on GitHub, where we publish all our experiments:&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i3JOwpme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/idiWork"&gt;
        idiWork
      &lt;/a&gt; / &lt;a href="https://github.com/idiWork/Experiment_101"&gt;
        Experiment_101
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Using Blockchain &amp;amp; Internet of Things During Organ Transportation
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;h1&gt;
Experiment #101 - Organ Transportation&lt;/h1&gt;
&lt;h4&gt;
Using Blockchain &amp;amp; IoT During Organ Transportation.&lt;/h4&gt;

&lt;h6&gt;
Project Date:&lt;/h6&gt;
&lt;ul&gt;
&lt;li&gt;May - August 2019&lt;/li&gt;
&lt;/ul&gt;
&lt;h6&gt;
Services:&lt;/h6&gt;
&lt;ul&gt;
&lt;li&gt;Azure IoT Central&lt;/li&gt;
&lt;li&gt;Azure IoT Hub&lt;/li&gt;
&lt;li&gt;Azure Event Hubs&lt;/li&gt;
&lt;li&gt;Azure Functions&lt;/li&gt;
&lt;li&gt;Azure Logic Apps&lt;/li&gt;
&lt;li&gt;Azure Blockchain Workbench&lt;/li&gt;
&lt;/ul&gt;
&lt;h6&gt;
Technology used:&lt;/h6&gt;
&lt;ul&gt;
&lt;li&gt;Internet of Things&lt;/li&gt;
&lt;li&gt;Blockchain&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
About&lt;/h2&gt;
&lt;p&gt;This Experiment researches the possibilities of combining Internet of Things (IoT) and Blockchain for transportation of volatile goods. IoT allows us to communicate with devices and collect ambient measurements like temperature and humidity from different sensors. The different components involved send and receive data and the Blockchain structure stores, transmits and verifies this data with integrity and in a secure way.&lt;/p&gt;
&lt;h2&gt;
Idea&lt;/h2&gt;
&lt;p&gt;The experiment attempts to simulate the real-time monitoring of environmental conditions during the transportation of volatile goods from one point to another, verifying at all times that the obtained values fall within the established safety parameters.&lt;/p&gt;
&lt;h2&gt;
Utility&lt;/h2&gt;
&lt;p&gt;For this experiment we chose the case…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/idiWork/Experiment_101"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;Thanks for reading, see you in the next experiment!&lt;/p&gt;

</description>
      <category>blockchain</category>
      <category>iot</category>
      <category>azure</category>
      <category>microsoft</category>
    </item>
  </channel>
</rss>
