<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Zekeriyya Demirci</title>
    <description>The latest articles on DEV Community by Zekeriyya Demirci (@zekeriyyaa_).</description>
    <link>https://dev.to/zekeriyyaa_</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F450548%2F9dbe3cfa-a4b1-4e5b-a752-a017dbbacf2b.jpg</url>
      <title>DEV Community: Zekeriyya Demirci</title>
      <link>https://dev.to/zekeriyyaa_</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/zekeriyyaa_"/>
    <language>en</language>
    <item>
      <title>Traffic Data Analysis with Apache Spark Based on Autonomous Transport Vehicle Data</title>
      <dc:creator>Zekeriyya Demirci</dc:creator>
      <pubDate>Tue, 05 Apr 2022 22:14:47 +0000</pubDate>
      <link>https://dev.to/zekeriyyaa_/traffic-data-analysis-with-apache-spark-based-on-autonomous-transport-vehicle-data-4leg</link>
      <guid>https://dev.to/zekeriyyaa_/traffic-data-analysis-with-apache-spark-based-on-autonomous-transport-vehicle-data-4leg</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gXgMtUc7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rudz7jo6dsvzubgi1jcv.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gXgMtUc7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rudz7jo6dsvzubgi1jcv.gif" alt="Image description" width="480" height="270"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="http://www.finern.com/info_httpwwwfinerncombuu.html"&gt;resource of gif&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The increasing use of autonomous systems in industry makes those systems more complex. Autonomous transportation vehicles, which have many uses, are one example. Especially in places with limited space, such as smart factories, many controls must be considered to prevent accidents that may arise from traffic. Comprehensive solutions are therefore needed to keep the system up for as long as possible.&lt;/p&gt;

&lt;p&gt;You can access the whole project in &lt;a href="https://github.com/zekeriyyaa/Traffic-Data-Analysis-with-Apache-Spark-Based-on-Mobile-Robot-Data"&gt;my GitHub repo&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let’s start…&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;First of all, I would like to clarify that this approach was developed for the “&lt;a href="https://ifarlab.ogu.edu.tr/Icerik/Detay/4"&gt;Development of Autonomous Transportation Vehicles and HMI-M2M Interfaces for Smart Factories&lt;/a&gt;” project financed by TÜBİTAK.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There may be traffic caused by autonomous transportation vehicles (ATVs) in smart factories. Therefore, to ensure the sustainability of the system and increase its efficiency, it is necessary to analyze the data produced by the ATVs and generate useful feedback to feed back into the system. For this purpose, the five analyses given below were taken into account:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Travel time: the elapsed time between the ATV starting its task and finishing it.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Waiting time: the elapsed time the ATV spends waiting while performing its task.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Average speed: the average speed of the ATV while performing its task.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Occupancy: (way length) / (number of vehicles * vehicle length)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Density: (number of vehicles) / (way length)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
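&lt;p&gt;As a small illustration (the function and variable names here are mine, not from the project code), the occupancy and density definitions above can be computed as:&lt;/p&gt;

```python
# Illustrative sketch of the occupancy and density formulas above.
# Names (occupancy, density, way_length_m, ...) are hypothetical.

def occupancy(way_length_m: float, vehicle_count: int, vehicle_length_m: float) -> float:
    """Occupancy: (way length) / (number of vehicles * vehicle length)."""
    return way_length_m / (vehicle_count * vehicle_length_m)

def density(vehicle_count: int, way_length_m: float) -> float:
    """Density: (number of vehicles) / (way length)."""
    return vehicle_count / way_length_m

# Example: a 20 m way segment occupied by 2 ATVs, each 1.0 m long
print(occupancy(20.0, 2, 1.0))  # 10.0
print(density(2, 20.0))         # 0.1
```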

&lt;p&gt;A document instance stored in MongoDB is shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oGGiLgR8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/59sxu8rj52zpe6aciguu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oGGiLgR8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/59sxu8rj52zpe6aciguu.png" alt="Image description" width="315" height="264"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All system components communicate via ROS (Robot Operating System), a middleware. The data generated by the ATV’s sensors is sent to MongoDB in real time. Apache Spark analyzes this data, generates results every 15 minutes, and writes them to MSSQL.&lt;br&gt;
The system architecture is given below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Eu6VuDDC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ggvz84yv4e93hnr2kmmg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Eu6VuDDC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ggvz84yv4e93hnr2kmmg.png" alt="Image description" width="875" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the developed desktop application, it is also possible to visualize the results via the GUI. The application retrieves all analysis results from MSSQL and then visualizes them. It can display all five analysis results grouped by way ID or by timestamp (in 15-minute windows). You can see the min, max, and average results, along with a graphical representation of the selected analysis type, as given below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qJxRiexa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/78gkoyi87srf64mowd83.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qJxRiexa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/78gkoyi87srf64mowd83.png" alt="Image description" width="880" height="281"&gt;&lt;/a&gt;&lt;br&gt;
The ATV moves according to the given waypoints and completes the whole path. Analyses are produced separately for each way. A way denotes the area between two waypoints.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RloDPpOr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e567knrayixfvllvj4b7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RloDPpOr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e567knrayixfvllvj4b7.png" alt="Image description" width="880" height="598"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>python</category>
      <category>mongodb</category>
      <category>datascience</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Augmented Reality Application for Autonomous Transport Vehicle</title>
      <dc:creator>Zekeriyya Demirci</dc:creator>
      <pubDate>Wed, 30 Mar 2022 21:02:28 +0000</pubDate>
      <link>https://dev.to/zekeriyyaa_/augmented-reality-application-for-autonomous-transport-vehicle-4ipp</link>
      <guid>https://dev.to/zekeriyyaa_/augmented-reality-application-for-autonomous-transport-vehicle-4ipp</guid>
<description>&lt;p&gt;&lt;strong&gt;AR (Augmented Reality)&lt;/strong&gt; has a wide range of uses in many fields such as medicine, entertainment, education, and industry. After the Industry 4.0 revolution, it became one of the focal points of smart factory applications. In this article, I will talk about our approach, which applies augmented reality to an ATV (Autonomous Transport Vehicle).&lt;/p&gt;




&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6ZjUjrhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ody8w8uazlibfhz6g2bs.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6ZjUjrhN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ody8w8uazlibfhz6g2bs.jpeg" alt="Image description" width="700" height="466"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.forbes.com/sites/theyec/2019/02/06/augmented-reality-in-business-how-ar-may-change-the-way-we-work/?sh=343f430f51e5"&gt;Augmented Reality In Business — Forbes&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let’s start…&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;First of all, I would like to clarify that this approach was developed for the “&lt;u&gt;Development of Autonomous Transportation Vehicles and HMI-M2M Interfaces for Smart Factories&lt;/u&gt;” project financed by TÜBİTAK.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;You can access the whole project in &lt;a href="https://github.com/zekeriyyaa/Augmented-Reality-on-Autonomous-Guided-Vehicle"&gt;my GitHub repo&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In practice, the aim is to augment the sensor data received from the ATV in order to inform the operator and prevent possible hazards. For this purpose, the ATV and laboratory shown in Figure 1 were used.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jARFcKH1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pn1y7bk7kq3nl3ie3m4d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jARFcKH1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pn1y7bk7kq3nl3ie3m4d.png" alt="Image description" width="700" height="210"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://ifarlab.ogu.edu.tr/"&gt;Figure 1 — ATV and Laboratory Environment&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before this approach was realized in the real environment, it was tested in the Gazebo simulation environment shown in Figure 2.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fqN8PXKJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rf6gmik1e9mc1ka6erfz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fqN8PXKJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rf6gmik1e9mc1ka6erfz.png" alt="Image description" width="700" height="333"&gt;&lt;/a&gt;&lt;br&gt;
Figure 2 — Gazebo simulation environment&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The data used are as follows:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Position (x, y): top-left side of the screen&lt;/li&gt;
&lt;li&gt;Velocity: top-left side of the screen&lt;/li&gt;
&lt;li&gt;Charge percentage: top-left side of the screen&lt;/li&gt;
&lt;li&gt;Laser (270 degrees): red-colored circles&lt;/li&gt;
&lt;li&gt;Route: orange-colored lines&lt;/li&gt;
&lt;/ol&gt;
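&lt;p&gt;As a rough sketch of how a single laser reading can be turned into an overlay point (illustrative only; the function name and frame convention are mine, and the project’s actual AR projection pipeline is not reproduced here):&lt;/p&gt;

```python
import math

def laser_point(range_m: float, angle_deg: float) -> tuple:
    """Convert one laser range/bearing reading into an (x, y) point in the
    vehicle frame, e.g. for drawing the red overlay circles."""
    a = math.radians(angle_deg)
    return (range_m * math.cos(a), range_m * math.sin(a))

# A 270-degree scanner covers bearings from -135 to +135 degrees.
print(laser_point(2.0, 0.0))  # an obstacle 2 m straight ahead: (2.0, 0.0)
```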

&lt;p&gt;ROS (Robot Operating System) is located at the center of the architecture shown in Figure 3. It acts as middleware between system components and allows them to communicate easily.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W9nsIn5t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y019of7bntoe38al3co5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W9nsIn5t--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y019of7bntoe38al3co5.png" alt="Image description" width="700" height="302"&gt;&lt;/a&gt;&lt;br&gt;
Figure 3 — System Architecture&lt;/p&gt;

&lt;p&gt;There are two camera perspectives that you can switch between while the ATV performs its task:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The “ATV Camera”, mounted on the front of the ATV.&lt;/li&gt;
&lt;li&gt;The “AR Camera” of an additional device such as a phone or tablet.&lt;/li&gt;
&lt;/ol&gt;




&lt;p&gt;&lt;strong&gt;Results&lt;/strong&gt;&lt;br&gt;
You can see screenshots of the simulation environment in Figure 4.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9pkOZFCt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ljoajl8r3dvn80f0kyhp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9pkOZFCt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ljoajl8r3dvn80f0kyhp.png" alt="Image description" width="518" height="514"&gt;&lt;/a&gt;&lt;br&gt;
Figure 4 — Gazebo simulation environment&lt;/p&gt;

&lt;p&gt;You can also see screenshots of the real environment in Figure 5.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--P9aUPSHi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l68a5tnnf9f6iljwcqym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--P9aUPSHi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l68a5tnnf9f6iljwcqym.png" alt="Image description" width="700" height="195"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://ifarlab.ogu.edu.tr/"&gt;Figure 5 —Real environment&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can access the application videos using the URLs given below.&lt;br&gt;
&lt;a href="https://drive.google.com/file/d/1twgUH91NDP1vr8UKaw8O8o58-cuKvQFH/view"&gt;1. Simulation Environment AR Camera&lt;/a&gt;&lt;br&gt;
&lt;a href="https://drive.google.com/file/d/1yQWT3beJa_ufd6g9eOfbxqOkZzo6VqDu/view"&gt;2. Real Environment AR Camera&lt;/a&gt;&lt;br&gt;
&lt;a href="https://drive.google.com/file/d/1UhpUbeTwgXvo0dUhdHG4E4hkNhpImz_W/view"&gt;3. Real Environment ATV Camera&lt;/a&gt;&lt;/p&gt;

</description>
      <category>augmentedreality</category>
      <category>robotic</category>
      <category>ros</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
