<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Greg Gradwell</title>
    <description>The latest articles on DEV Community by Greg Gradwell (@greghgradwell).</description>
    <link>https://dev.to/greghgradwell</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F518029%2F336047b8-17d2-4759-b0d5-5f2f1d82b2c2.jpeg</url>
      <title>DEV Community: Greg Gradwell</title>
      <link>https://dev.to/greghgradwell</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/greghgradwell"/>
    <language>en</language>
    <item>
      <title>Part 13: 1+1=1</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Wed, 20 Jan 2021 21:58:11 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-13-1-1-1-3a5</link>
      <guid>https://dev.to/greghgradwell/part-13-1-1-1-3a5</guid>
      <description>&lt;p&gt;No need to pull out your calculator, folks, the math here isn't literal. Today we're talking about how it was possible to add a multirotor controller to our fixed-wing autopilot in just 4 days thanks to the awesomeness of Elixir. Now, if you're thinking "4 days is not nearly enough time to make a good multirotor controller", you're absolutely right. Let's call it the &lt;em&gt;Little Caesar's Hot-N-Ready Controller&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I8-J2dvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/105225148-04533400-5b13-11eb-89b5-38d1451eeae4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I8-J2dvf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/105225148-04533400-5b13-11eb-89b5-38d1451eeae4.png" width="50%"&gt;&lt;/a&gt;&lt;/p&gt; &lt;br&gt;
I mean, I absolutely love their pizza, but certain (former) friends of mine... Anyway, just to prove that the controller isn't terrible, I offer you a video of this quadcopter flying in RealFlight. It's the same mission as the one we've used for our Cessna 150, with a bonus orbit added in the middle. But if you don't have two minutes to spare, I'll save you the trouble: it flies just fine. Would I trust this controller on a real quadcopter that had real rotor blades that made real cuts when they hit your real face? Probably not. Mostly because it lacks all the niceties that a mature controller would have to protect you from your own incompetence. But it's good enough to demonstrate the concept, and that's all I care about at the moment.&lt;br&gt;
&lt;br&gt;&lt;br&gt;&lt;a href="https://youtu.be/ItGxSKOme6Y?t=3"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XGVk9wJJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/105225127-fbfaf900-5b12-11eb-806d-dd972482e834.png" width="100%"&gt;&lt;/a&gt;&lt;br&gt;
&lt;h2&gt;
  
  
  The One-Eyed Man
&lt;/h2&gt;

&lt;p&gt;Full disclosure: I am not even close to an authority on this subject. I wouldn't be the least bit surprised if I'm butchering the Elixir language with how I'm using functions. But it seems reasonable to me and it appears to be working, so for now... wheeeeeeeeeeee.&lt;br&gt;&lt;br&gt;&lt;br&gt;
From the beginning this autopilot was intended to control different types of vehicles. The first prototype was actually a ground vehicle controller, because those tend to fall out of the sky much less. This meant that vehicle-specific parameters were already broken out into separate folders and functions. The model name is specified by a single file located on a USB flash drive installed in the Raspberry Pi (because we don't want to touch our source code, right?). When the autopilot first boots, it checks the name of this file, e.g., "Cessna150", and then loads the appropriate configuration. For one thing, the autopilot will know that it is an airplane. Other pieces of information include the PID controller gains, vehicle turn rate (for planning purposes), and controller limits (rate, attitude, speed, etc.). The type of vehicle is passed to every module that might need it (the Cessna 150 is a "Plane"). At this point we can take advantage of the Elixir function &lt;code&gt;apply&lt;/code&gt;, which allows us to call a function whose name and location (module) are specified at runtime. In other words, if I have a function called &lt;code&gt;add&lt;/code&gt; located in the &lt;code&gt;Math&lt;/code&gt; module, we could expect that:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apply(Math, :add, [2, 3]) #=&amp;gt; 5
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or if we had two controllers, one for a &lt;code&gt;Plane&lt;/code&gt; and one for a &lt;code&gt;Multirotor&lt;/code&gt;, as long as they each had a function called &lt;code&gt;fly&lt;/code&gt;, the following code blocks would both be valid:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apply(Plane, :fly, [1,2,3,4])
apply(Multirotor, :fly, [1,2,3,4])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;But now let's abstract the vehicle type to actually take advantage of the runtime aspect:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vehicle_module = Plane
apply(vehicle_module, :fly, [1,2,3,4])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;See how easy that was? Now everywhere that we have vehicle-specific instructions we can call the correct function according to the vehicle type. This means that our code isn't littered with conditional statements like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cond do
  vehicle_type == Plane -&amp;gt; fly_like_a_plane(1, 2, 3, 4)
  vehicle_type == Multirotor -&amp;gt; fly_like_a_multirotor(1, 2, 3, 4)
end
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Of course, most of the autopilot code is vehicle agnostic, so it remains untouched. But in the future if we decide to support a new type of vehicle (autonomous unicycles, anyone?), we won't have to tear everything apart. Hence the funny math at the start of this. We can add support for an entirely new vehicle without a tremendous amount of effort (assuming we already understand the logic for controlling it). So it's more like &lt;code&gt;1+1=1.05&lt;/code&gt; but our editor, O'Brien, said that either one is fine.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next?
&lt;/h2&gt;

&lt;p&gt;There is one task that we have yet to complete with this autopilot: fly a fully autonomous mission with a real vehicle. Simulation has been the priority of late, but our 2.1m Cessna is just about ready to go, so once the weather gets to 80° I'll be able to stand outside without whining and we'll fly this thing for real. See you soon!&lt;br&gt;&lt;br&gt;-Greg&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fOsUCHiE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/105225178-0d440580-5b13-11eb-8e27-52f3d40f9704.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fOsUCHiE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/105225178-0d440580-5b13-11eb-8e27-52f3d40f9704.gif" width="60%"&gt;&lt;/a&gt;&lt;a href="https://media1.tenor.com/images/511960e849769ffcc6e536e390f9810d/tenor.gif?itemid=17644004"&gt;1&lt;/a&gt;&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>uav</category>
      <category>autopilot</category>
      <category>quadcopter</category>
    </item>
    <item>
      <title>Part 12b: A Eye in the Sky</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Tue, 12 Jan 2021 05:48:33 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-12b-a-eye-in-the-sky-2e2i</link>
      <guid>https://dev.to/greghgradwell/part-12b-a-eye-in-the-sky-2e2i</guid>
      <description>&lt;p&gt;Welcome to the thrilling continuation of our RealFlight simulator discussion. If you remember from last time (I had to go back and look too), we were able to capture video from a downward-facing view in RealFlight and transmit its data via UDP. These images could be collected by a separate process, thereby creating the presence of a third-party "camera". Now we will actually put that camera to work. To give you some context, here's a quick glimpse of what the camera sees on takeoff:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/biq0UXefXZuLEucAAC/giphy-downsized-large.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/biq0UXefXZuLEucAAC/giphy-downsized-large.gif" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now somewhere out in the grass lies a giant red circle. Our camera has been equipped with a clever algorithm to identify this red circle and tell the aircraft to orbit around it. You know what's cool about game engines? The pixels are perfect. So it really only takes like 4 lines of OpenCV code to find the circle. The hardest part is calculating the distance from the aircraft to the circle (which I don't think I got exactly right, so it's a little twitchy). But once the circle has been found, the aircraft will continue to orbit as long as the camera continues to send commands, or until the GCS operator overrides the camera's permission to control the vehicle. In the future, the autopilot could also ignore a peripheral due to other circumstances, such as the need to land before the battery runs out, but that has yet to be implemented. Below is an illustration of the orbit that is added to the aircraft path planner, as well as a sped-up version of the object acquisition and subsequent orbit. After that is a dog playing hockey, because if you watch the airplane circling that dot for too long you're going to throw up.&lt;br&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--g004yPXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/104273143-53ce9b80-5453-11eb-8a3b-d92b5d17abec.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--g004yPXJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/104273143-53ce9b80-5453-11eb-8a3b-d92b5d17abec.gif" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/lYXpHMcSeQiK3Qcdku/giphy-downsized-large.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/lYXpHMcSeQiK3Qcdku/giphy-downsized-large.gif" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://i.giphy.com/media/mzlPR06xgPaj4s8tK6/giphy-downsized-large.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/mzlPR06xgPaj4s8tK6/giphy-downsized-large.gif" width="70%"&gt;&lt;/a&gt;&lt;a href="https://i.imgur.com/FvrzOHA.mp4"&gt;1&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  But whai?
&lt;/h2&gt;

&lt;p&gt;The object circling kinda looks like a record, right? Well good, because at this point I should be sounding like a broken one. The significance of all this is: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The peripheral is modifying the behavior of the aircraft in real time.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Or to quote Ron Popeil:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--S8vjLZUG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/104273166-5af5a980-5453-11eb-975a-78a4eddadf2a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--S8vjLZUG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/104273166-5af5a980-5453-11eb-975a-78a4eddadf2a.png" width="50%"&gt;&lt;/a&gt;&lt;a href="https://youtu.be/GG43jyZ65R8"&gt;2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Obviously, we can't completely forget it, but it does mean that the operator need not be involved in every decision that affects the path of the vehicle. In fact, we can have peripherals that might outperform their human counterparts through the use of coding and algorithms.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--53P1cMLr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/104273182-5df09a00-5453-11eb-98fd-cd76bd652a0e.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--53P1cMLr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/104273182-5df09a00-5453-11eb-98fd-cd76bd652a0e.jpg" width="50%"&gt;&lt;/a&gt;&lt;a href="https://img.ifunny.co/images/c8747af585d04da048efcc0befa382a044c065db00b435e362c438bb20da22a6_1.jpg"&gt;3&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our red circle example may be trivial, but it demonstrates a great use for this autopilot when equipped with a smart camera. A potential search and rescue mission plan could be as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Roam search area using predetermined path (alone or in conjunction with other vehicles)&lt;/li&gt;
&lt;li&gt;If camera finds something of interest:
  &lt;ol&gt;
    &lt;li&gt;Orbit at location&lt;/li&gt;
    &lt;li&gt;Alert operator and provide data for analysis (video feed, still images, etc.)&lt;/li&gt;
    &lt;li&gt;Operator can choose to:
      &lt;ol&gt;
        &lt;li&gt;Confirm that the subject has been found&lt;/li&gt;
        &lt;li&gt;Adjust the vehicle position for more information&lt;/li&gt;
        &lt;li&gt;Dismiss the vehicle to continue its search path&lt;/li&gt;
      &lt;/ol&gt;
    &lt;/li&gt;
  &lt;/ol&gt;
&lt;/li&gt;
&lt;/ol&gt;
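&lt;p&gt;The handoff logic in that plan boils down to a tiny state machine. Here's a hedged sketch in Python; the state names and the "dismiss" command are invented for illustration and are not part of the actual autopilot API.&lt;/p&gt;

```python
SEARCHING, ORBITING = "searching", "orbiting"

def next_state(state, camera_found_target, operator_command):
    """One step of the mission logic sketched above: orbit when the camera
    flags a target, resume the search path when the operator dismisses
    the vehicle. All names here are illustrative."""
    if state == SEARCHING and camera_found_target:
        return ORBITING   # the peripheral redirects the vehicle
    if state == ORBITING and operator_command == "dismiss":
        return SEARCHING  # the operator overrides the peripheral
    return state
```

&lt;p&gt;The operator's "confirm" and "adjust" choices would slot in as additional commands handled while orbiting.&lt;/p&gt;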
&lt;br&gt;&lt;br&gt;
The advantage of this method is that someone doesn't have to be staring at a video monitor for hours at a time. For one thing, that approach would require 100% video stream uptime to be effective. And besides that, unless there's a car chase or a bar fight every 15 minutes, who can pay attention to a screen for that long? We can utilize machines for what they're best at: computationally intensive tasks, complex data fusion, and multi-sensor analysis. Meanwhile our brains can be used where machines are weakest: high-level decision making. The peripherals augment the autopilot, the autopilot augments the vehicle, and the vehicle augments our mission capabilities. And all of this was made possible by attaching a smart camera to our autopilot without any modifications to the source code. That gets me pretty excited, and I've seen a man roast four juicy sirloin steaks without even paying attention, so I know a little something about excitement.&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
In case you want to watch the results of somebody else's video game (it's like Twitch, but without action, sound, or commentary), click on the picture below and you can check out "The Hunt for Red Dot-ober". But if not, there's no hard feelings. I'll shee you shoon.&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
-Greg

&lt;p&gt;&lt;br&gt;&lt;a href="https://youtu.be/lKNvlEVCvBs"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pQNu9Tbg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/104273200-63e67b00-5453-11eb-9d28-2b245507303c.png" width="100%"&gt;&lt;/a&gt;&lt;br&gt;&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>uav</category>
      <category>autopilot</category>
      <category>ai</category>
    </item>
    <item>
      <title>Part 12a: Real to Reel</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 04 Jan 2021 18:40:30 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-12a-real-to-reel-2nc9</link>
      <guid>https://dev.to/greghgradwell/part-12a-real-to-reel-2nc9</guid>
      <description>&lt;p&gt;Welcome back, everyone! I hope you all enjoyed not leaving the house for the holidays. Isn't it crazy that people used to wear pants like every day? Wild.&lt;/p&gt;

&lt;p&gt;During the break I finally got hold of the new computer I needed to help make this next demo possible. For starters, about a month ago I switched simulators, from X-Plane 11 to RealFlight 9.5. In case you're not familiar with &lt;a href="https://www.realflight.com/"&gt;RealFlight&lt;/a&gt;, it is an RC flight simulator, designed to recreate the experience of flying model airplanes (from the comfort of your home). If you've thought about getting involved with RC but don't know where to start, this is a great tool. It does an impressive job of replicating the RC experience, with the added benefit of not having to cry over all your crashed airplanes. But why did I choose to switch simulators? After all, you know I've cried plenty of tears over destroyed aircraft, so that ship has sailed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The aircraft in RealFlight are the same scale as the vehicles I will be testing in real life, so a controller tuned in RF should carry over well to a real flight test&lt;/li&gt;
&lt;li&gt;Flying from the RC pilot's POV (standing on the ground) gives an idea of how small the aircraft will look during a mission (one of the hardest parts of flight testing is recovering a vehicle that is too far away)&lt;/li&gt;
&lt;li&gt;Multirotor vehicles are available (which the Elixir autopilot will eventually be capable of controlling)&lt;/li&gt;
&lt;li&gt;Multiplayer is supported (this will allow for swarm logic to be tested)&lt;/li&gt;
&lt;li&gt;Additional cameras can be added to the vehicle (the purpose of this will be demonstrated shortly)&lt;/li&gt;
&lt;li&gt;If you get bored at work, you can play a video game and claim it's for research&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Gameplay inside RealFlight is illustrated below. Notice the secondary view in the top left corner. This can be set to any of the camera views available in the game, but we're using the camera that has been added to our &lt;a href="https://www.horizonhobby.com/product/carbon-z-cessna-150-2.1m-bnf-basic/EFL1450.html"&gt;E-Flite 2.1m Cessna 150&lt;/a&gt; and is pointed downward. This camera is simulating a third-party device that will be communicating with the autopilot and directing its flight path (but that comes later).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yodxax94--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566539-0496d280-4e77-11eb-9908-01863f62085e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yodxax94--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566539-0496d280-4e77-11eb-9908-01863f62085e.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Pretty cool that we get to fly a simulated version of a model we can actually own, right? Although, I suppose you could also own one of the aircraft in X-Plane, but I already spent my Christmas money on LEGOs, so I might have to wait another few years for that.
&lt;h2&gt;
  
  
  SOAP City
&lt;/h2&gt;

&lt;p&gt;As I'm SURE you recall from our posts about X-Plane, we communicated with the simulator by sending and receiving UDP messages. The good news is that RealFlight also allows us to control the vehicle with our autopilot. The bad news is that we have to use SOAP. &amp;lt;insert joke about smelly computer programmers here&amp;gt;&lt;br&gt;&lt;br&gt;
Now I have nothing against &lt;a href="https://en.wikipedia.org/wiki/SOAP"&gt;SOAP&lt;/a&gt; other than the fact that I had never heard of it before and I had no idea what the heck I was doing. However, Ardupilot has &lt;a href="https://ardupilot.org/dev/docs/sitl-with-realflight.html"&gt;proven&lt;/a&gt; to be capable of integrating with RealFlight, so I knew it was at least possible (and I could use their source code as a starting point). In short, SOAP uses XML-formatted messages, which are most often transmitted over HTTP (as is the case with RealFlight). So the three challenges were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Determining which messages were available to send/receive&lt;/li&gt;
&lt;li&gt;Formatting the messages correctly&lt;/li&gt;
&lt;li&gt;Sending/Receiving messages using HTTP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;My biggest gripe about the whole process was the lack of formal documentation from Knife Edge (the creators of RealFlight) regarding this interface (they call it FlightAxis). I think it is a shame, because they built a really wonderful feature that tremendously expands the usefulness of their software. But undocumented features are about as useful as I was from ages 12 to 27.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IeyoxaKS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566632-2bed9f80-4e77-11eb-9142-75d428c83cda.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IeyoxaKS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566632-2bed9f80-4e77-11eb-9142-75d428c83cda.jpg" width="70%"&gt;&lt;/a&gt;&lt;a href="https://i.imgflip.com/2t4wmk.jpg"&gt;1&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the FlightAxis message structure was understood, it was a matter of making HTTP calls on the Elixir side. This was quite simple thanks to the &lt;a href="https://hexdocs.pm/httpoison/HTTPoison.html"&gt;HTTPoison&lt;/a&gt; library. Every time we would like an update from RealFlight regarding the state of the aircraft, we use the &lt;strong&gt;POST&lt;/strong&gt; method. This results in a two-way transfer of data, as the &lt;strong&gt;POST&lt;/strong&gt; request also includes the servo output from our autopilot (calculated from the previous update). RealFlight will then respond with the updated state of the vehicle, which we can format to simulate various sensor outputs like we did with X-Plane. In the end, it's pretty basic, with the majority of the code dedicated to extracting the aircraft state from the RealFlight response.&lt;/p&gt;
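&lt;p&gt;To make the exchange concrete, here is a hedged Python sketch of building one of those XML request bodies and pulling a value back out of the response. Since FlightAxis is undocumented, the &lt;code&gt;ExchangeData&lt;/code&gt; and &lt;code&gt;ChannelValues&lt;/code&gt; names are illustrative stand-ins; Ardupilot's source is the reference for the real message shapes.&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_exchange_request(servo_channels):
    """Build a SOAP envelope carrying servo outputs to the simulator.
    The method and field names are illustrative stand-ins, since
    FlightAxis itself is undocumented."""
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    exchange = ET.SubElement(body, "ExchangeData")
    channels = ET.SubElement(exchange, "ChannelValues")
    for value in servo_channels:
        ET.SubElement(channels, "item").text = "%.4f" % value
    return ET.tostring(envelope, encoding="unicode")

def parse_scalar(response_xml, tag):
    """Pull one named scalar field (altitude, airspeed, ...) out of a
    SOAP response body."""
    node = ET.fromstring(response_xml).find(".//" + tag)
    return None if node is None else float(node.text)
```

&lt;p&gt;The POST itself is then just this string sent as the HTTP body, with the response handed to the parser; on the Elixir side, HTTPoison plays the same role.&lt;/p&gt;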

&lt;h2&gt;
  
  
  Downward Vlog
&lt;/h2&gt;

&lt;p&gt;Alright, so we've essentially gotten back to the same point we were at with X-Plane: controlling a simulated vehicle with our autopilot. Now it's time to enjoy the fruits of our labor (after some additional labors, of course). We would like to analyze the downward-facing camera feed from the perspective of a third-party peripheral. That is, each frame should be packaged so that it can be consumed as though it came from a real camera. Fortunately, Andrew Tridgell (the systems lead for Ardupilot) had already &lt;a href="https://www.youtube.com/watch?v=SDSxel3N1pw"&gt;demonstrated&lt;/a&gt; this capability (which is how I got the idea in the first place).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dYxY1EEw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566558-0b254a00-4e77-11eb-90d2-3c776708f9e3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dYxY1EEw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/103566558-0b254a00-4e77-11eb-90d2-3c776708f9e3.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By using the Python package &lt;a href="https://github.com/SerpentAI/D3DShot"&gt;d3dshot&lt;/a&gt;, we can grab a screenshot of our RealFlight environment (we'll take just the part showing the downward-facing camera feed), and then send this image data (encoded using OpenCV) over UDP. On another computer we can have a script running with a UDP socket open and waiting to receive these messages. This script is representing the third-party peripheral, which in real life would be capable of obtaining the video footage on its own. Nonetheless, the peripheral now has its data to analyze. The important thing to note here is that this peripheral is self-contained. It is not part of the autopilot (it's written in Python, for one thing), and thus its hardware and software can be developed without any integration concerns, provided that it conforms to the autopilot's API. So while this "peripheral" currently exists on the same computer that is running the autopilot, this is by no means a constraint, and it will soon be moved to a separate piece of hardware. &lt;br&gt;&lt;br&gt;&lt;br&gt;
So what, you may ask, is the point of this peripheral? Well, that will just have to wait until next time. But I think it's pretty cool, so hopefully you'll come back to check it out.&lt;br&gt; You're very patient, dear reader. That's what I appreciates about you.&lt;br&gt;&lt;br&gt;-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>uav</category>
      <category>simulation</category>
    </item>
    <item>
      <title>Part 11: Kibbles 'n Wits</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 21 Dec 2020 22:45:31 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-11-kibbles-n-wits-54bp</link>
      <guid>https://dev.to/greghgradwell/part-11-kibbles-n-wits-54bp</guid>
      <description>&lt;p&gt;Welcome back! If you noticed that there wasn't a write-up last week when there should have been, then I think you're the only one. So congratulations, and please head over to the online store where you can pick up a free t-shirt using the coupon code THEREISNOSTORE. Unfortunately, I am still waiting on some new hardware to arrive that will allow me to illustrate the next step in the journey. So this post is a bit of an interlude, although it is still relevant to the autopilot development. We're going to have a quick discussion on the subject of &lt;strong&gt;&lt;a href="https://en.wikipedia.org/wiki/Eating_your_own_dog_food"&gt;dogfooding&lt;/a&gt;&lt;/strong&gt;. If you're not familiar with the concept (and have an aversion to hyperlinks), it refers to the practice of using (and relying on) the product you are developing. For example, if I were making an email client (is the name G-mail taken?), as soon as I had a viable prototype I would begin to use it for all of my emailing needs. And although I don't have much experience in developing customer-facing products, I can point to plenty of instances where I've been looking at a screen while yelling "Doesn't anybody at your company actually use this stuff?!?"&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--d3fI4Y6S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102828686-2c2b6c80-439a-11eb-805c-f68fa8a6c36f.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--d3fI4Y6S--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102828686-2c2b6c80-439a-11eb-805c-f68fa8a6c36f.jpg"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Even if you don't have customers, and are just writing code so that you can sound cool in conversations (with your mom, I guess?), it can be a useful exercise to approach your software (or hardware) from another perspective. In the case of this project, I found many opportunities for improvement when I began interfacing with the autopilot as a "third-party".
&lt;h2&gt;
  
  
  Fonzie: Private Investigator
&lt;/h2&gt;

&lt;p&gt;One of the few inviolable tenets of this Elixir autopilot is that it has a very clear and simple API (application programming interface). If the capabilities of the aircraft cannot be easily expanded through the use of third-party hardware and software, then we have lost one of the primary motivators for adopting this autopilot over other options. How can we make the interface as simple as possible? The first tactic might seem like it actually adds complexity, but I believe it ultimately contributes to the overall robustness of the system. I will discuss the reasoning for this decision in another post, so for now let's just pretend it's a good idea that we all agree with:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;All third-party software must be contained on separate hardware&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The immediate benefit of this stipulation is that the autopilot source code remains untouched no matter what peripherals are connected. It also means that the peripheral software can be written in any language. In order to communicate with the autopilot, a device must simply adhere to the protocols of the API and be capable of UART read/write. This brings me to the actual topic of this post, which is what I learned about my code when I tried to create a peripheral of my own.&lt;/p&gt;

&lt;h2&gt;
  
  
  I blox, U-blox
&lt;/h2&gt;


&lt;p&gt;If it ain't broke, don't fix it, and the UBX protocol from U-blox works just fine, so that is what I tend to use when I need to send serial messages. It is compact (8 bytes added to each payload) and there are plenty of example parsers available online. My first peripheral was going to be written in Python, which shares enough similarities with Elixir that I decided to port my Elixir parser into Python. Although porting code to another language probably doesn't count as dogfooding, I think it offers many of the same benefits. It forces you to understand what is happening with each line, and oftentimes you will find opportunities for improvement in your original code (as was the case with this module). The nice thing about being your own first customer is that you can actually do something about the deficiencies rather than just be mad at them. Once I was able to create and parse UBX messages in Python, there was only one thing left to do: &lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7m6gO1Xw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/102828696-32214d80-439a-11eb-8cef-2d8ccc6ecccd.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7m6gO1Xw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/102828696-32214d80-439a-11eb-8cef-2d8ccc6ecccd.gif" width="50%"&gt;&lt;/a&gt;&lt;a href="https://media4.giphy.com/media/Lo0IDynmNuIv4WpYrl/giphy.gif"&gt;1&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;&lt;br&gt;
Oh, right. I hadn't actually written the module in Elixir to interface with generic peripherals. While it might seem backwards to write the external code before the internal code, it allowed me to create the ideal "customer" experience first and then adapt the API to match. To be honest, this led to a bigger refactor than I was expecting. Perhaps it would have turned out the same regardless of where I started, but I do like the idea of making the API conform to the use case, and so it was beneficial to create a peripheral while the API was still in its early stages. It was this "dogfooding" experience that really improved the Elixir code and laid the groundwork for more powerful peripherals in the future. And once my new goodies arrive, I'll have a video for you that demonstrates the true potential of this Elixir autopilot project. Until then, just remember, your January diets will be a lot more successful if you pack on some extra pounds in December. So pay it forward and eat those cookies.&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
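&lt;p&gt;Before I go, here's roughly what that UBX framing looks like when ported, sketched in Python with the standard 8-bit Fletcher checksum. This is my own illustrative reconstruction of the frame format, not the project's actual parser.&lt;/p&gt;

```python
def ubx_checksum(data):
    """8-bit Fletcher checksum computed over class, id, length, and payload."""
    ck_a = ck_b = 0
    for byte in data:
        ck_a = (ck_a + byte) % 256
        ck_b = (ck_b + ck_a) % 256
    return ck_a, ck_b

def build_ubx(msg_class, msg_id, payload):
    """Frame a payload as a UBX message: two sync bytes, class, id, a
    two-byte little-endian length, the payload, then two checksum bytes.
    That's the 8 bytes of overhead mentioned above."""
    body = bytes([msg_class, msg_id, len(payload) % 256, len(payload) // 256]) + payload
    ck_a, ck_b = ubx_checksum(body)
    return bytes([0xB5, 0x62]) + body + bytes([ck_a, ck_b])
```

&lt;p&gt;A parser runs the same checksum over what it receives and drops the frame on a mismatch, which is a big part of why UBX holds up so well over a noisy UART.&lt;/p&gt;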
-Greg
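
P.S. For the curious, the UBX framing that makes these parsers so easy to port is tiny: two sync bytes, a class byte, an ID byte, a two-byte little-endian length, the payload, and a two-byte Fletcher checksum, which is where the 8 bytes of overhead come from. Here's a minimal Python sketch (these helpers are illustrative only, not the project's actual parser or message set):

```python
def ubx_checksum(body: bytes) -> bytes:
    """8-bit Fletcher checksum over class, id, length, and payload."""
    ck_a = ck_b = 0
    for byte in body:
        ck_a = (ck_a + byte) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return bytes([ck_a, ck_b])

def ubx_encode(msg_class: int, msg_id: int, payload: bytes) -> bytes:
    """Wrap a payload in a UBX frame: 2 sync + class + id + 2 length + 2 checksum = 8 bytes."""
    body = bytes([msg_class, msg_id]) + len(payload).to_bytes(2, "little") + payload
    return b"\xb5\x62" + body + ubx_checksum(body)

def ubx_decode(frame: bytes):
    """Return (class, id, payload) if the frame is valid, else None."""
    if len(frame) < 8 or frame[:2] != b"\xb5\x62":
        return None
    length = int.from_bytes(frame[4:6], "little")
    body, cksum = frame[2:6 + length], frame[6 + length:8 + length]
    if ubx_checksum(body) != cksum:
        return None
    return frame[2], frame[3], frame[6:6 + length]
```

Nothing here is Python-specific; the same framing logic ports almost line-for-line, which is rather the point of the post.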

</description>
      <category>elixir</category>
      <category>uavs</category>
      <category>autopilot</category>
      <category>hardware</category>
    </item>
    <item>
      <title>Part 10: Plan it, Janet!</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 14 Dec 2020 18:44:47 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-10-plan-it-janet-1kb4</link>
      <guid>https://dev.to/greghgradwell/part-10-plan-it-janet-1kb4</guid>
      <description>&lt;h1&gt;
  
  
  10. Plan it, Janet!
&lt;/h1&gt;

&lt;p&gt;Is it weird that two of my favorite fictional characters are both named Riff Raff? Well, technically one of them is real, but he goes so hard in the paint that he makes me wonder if maybe I'm the one who's fabricated.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OPHWVf2B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102117642-6ec9d380-3df3-11eb-8b39-639563b1dab2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OPHWVf2B--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102117642-6ec9d380-3df3-11eb-8b39-639563b1dab2.jpg" width="50%"&gt;&lt;/a&gt;&lt;a href="https://www.abouttoblow.com/wp-content/uploads/riff-raff-neon-icon.jpg"&gt;1&lt;/a&gt; &lt;/p&gt;
&lt;br&gt;
Unfortunately we're not going to be discussing the amazing freestyle that is "&lt;a href="https://youtu.be/uzKFmq444O4"&gt;Introducing the Icon&lt;/a&gt;". Instead we will talk about the far less exciting topic of knowing where you are and where you're going (lest you end up someplace else). But don't worry, this is not a self-help seminar. Unless you are an autopilot, in which case the Singularity has arrived and I would like to be the first to welcome our robot overlords.
&lt;h2&gt;
  
  
  They're All Going To Laugh At You
&lt;/h2&gt;

&lt;p&gt;If there is one thing I can count on when it comes to testing my inventions, it's that they won't work on the first several tries. I remember the days before I had truly come to terms with this: I would test a prototype without any debugging tools installed, without any error logging, sometimes without even any failsafes. And then of course it doesn't work, so I would probably just, um, try it again? Yeah, super naive, and a huge waste of time. I also routinely failed to have clear success criteria, so at best I was hoping for some qualitative feedback. But as time goes on, and your "friends" keep giving you "constructive criticism" that makes you "cry", you start to learn some things. One tactic that I have grown fond of when it comes to testing vehicle control systems is using repeatable test plans. This is by no means revolutionary, but I believe it is a strategy that can be very useful even for vehicles that are not meant to be autonomous. By that I mean, if I want to collect metrics regarding my vehicle behavior, I likely can get better (and faster) results by repeating a test &lt;em&gt;exactly&lt;/em&gt; over and over, rather than performing many different tests a few times. It's essentially the Bruce Lee approach to flight test.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tHuez0OZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/102117677-75584b00-3df3-11eb-96d1-cec18c7a8234.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tHuez0OZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/102117677-75584b00-3df3-11eb-96d1-cec18c7a8234.gif"&gt;&lt;/a&gt;&lt;a href="https://media2.giphy.com/media/mKPTMhAlmhlRu/giphy.gif?cid=ecf05e47422584188641f77c901febde712d6bd851aa7b66&amp;amp;rid=giphy.gif"&gt;2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And how can you practice the same &lt;del&gt;kick&lt;/del&gt; flight 10,000 times? Well, for starters it helps if your vehicle has an autopilot rather than a human pilot (&lt;strong&gt;EVEN&lt;/strong&gt; if your final product will be controlled by a human). And secondly, you'll want the ability to generate those repeatable tests quickly.&lt;/p&gt;

&lt;h2&gt;
  
  
  The First Flying Car
&lt;/h2&gt;

&lt;p&gt;The smartest decision I made when developing my first autopilot (it's a short list) was testing my path following algorithms on a car before they were put on an airplane. Okay, technically that was only necessary because I made the dumb decision to not use any sort of simulation, but please remember how little I knew at that point. There is one tremendous advantage that ground testing has: you actually know where the vehicle is. If you are flying an airplane, you know where you &lt;em&gt;think&lt;/em&gt; it is. So when we're trying to drive back and forth between two waypoints that happen to have a big yellow line connecting them, it's pretty easy to tell if our car is on the line or not (at least within the error expected by our GPS).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=ZBo-xgQBn4E"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yB194Zm1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120255-201e3880-3df7-11eb-9916-9b15afca5b29.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Once we graduate to the sky, it's anybody's guess. We at least have a pretty good idea on takeoff and landing, so once our high level controller is tuned to satisfaction, that's a great place to start. The nice thing about simulators is that you can spawn your aircraft in the same place every time. This means you can hard-code your start position for a given airport and runway (we'll get to adjustable start positions later). If we know our initial position and heading, we can calculate the entire takeoff and landing sequences relative to that point.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EFRvAbch--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120268-27454680-3df7-11eb-9001-1401275693a0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EFRvAbch--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120268-27454680-3df7-11eb-9001-1401275693a0.png"&gt;&lt;/a&gt;&lt;/p&gt;
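The "relative to that point" math is just a rotation of local offsets into north/east components, plus a flat-Earth conversion to latitude/longitude, which is plenty accurate at runway scale. A rough Python sketch (the helper name and waypoint numbers are illustrative, not the project's actual planner API):

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, spherical approximation

def offset_to_latlon(lat_deg, lon_deg, heading_deg, forward_m, right_m):
    """Place a waypoint forward_m/right_m meters from the start pose,
    where heading_deg is measured clockwise from true north."""
    h = math.radians(heading_deg)
    # Rotate body-frame offsets (forward/right) into north/east
    north = forward_m * math.cos(h) - right_m * math.sin(h)
    east = forward_m * math.sin(h) + right_m * math.cos(h)
    # Flat-Earth conversion from meters to degrees
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

# A takeoff leg laid out relative to the spawn point: climb out straight
# ahead, then a downwind point offset to the right for the return pattern.
start = (47.6, -122.3, 90.0)  # lat, lon, heading (due east)
climb_out = offset_to_latlon(*start, forward_m=500, right_m=0)
downwind = offset_to_latlon(*start, forward_m=500, right_m=300)
```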
&lt;br&gt;
This mission is about as simple as it gets. Start the takeoff roll, leave the ground, climb to our mission altitude, and then come around to land. Although basic, it allows us to stress all of our controllers, particularly the altitude controller, as we have several different altitudes that must be achieved at specific locations. For example, if the aircraft hits the ground when it is supposed to be at the flare altitude, then we know our controller is overshooting. What's nice about this mission is that it is essentially the shortest path to a landing approach, aside from spawning our vehicle in the air (which seems complicated, although probably not impossible). Once we've got the takeoff and landing down, we can move on to some additional flight waypoints. If you'll recall from our discussion about Dubins paths, a racetrack pattern is a very simple way to test your path planning/following logic, as it only incorporates one of the four Dubins-type waypoints (&lt;em&gt;left-left&lt;/em&gt; or &lt;em&gt;right-right&lt;/em&gt;). 
When those look good, I like to try an hourglass pattern, which can utilize the same four waypoint locations, but will stress all four Dubins types.&lt;br&gt;&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QTv1qeAg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120272-29a7a080-3df7-11eb-80c3-480bb65cab20.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QTv1qeAg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120272-29a7a080-3df7-11eb-80c3-480bb65cab20.png" width="30%"&gt;&lt;/a&gt;     &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mApkkGfT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120280-2b716400-3df7-11eb-96f4-f4d88fc3a3e7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mApkkGfT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120280-2b716400-3df7-11eb-96f4-f4d88fc3a3e7.png" width="30%"&gt;&lt;/a&gt;     &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5QL9-BTE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120288-2d3b2780-3df7-11eb-937c-3bc0f5c33fb0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5QL9-BTE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120288-2d3b2780-3df7-11eb-937c-3bc0f5c33fb0.png" width="30%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Going in circles
&lt;/h2&gt;

&lt;p&gt;If this feels like a review, it is. But the paths are perhaps not as important as the progression we are following. Each mission introduces a small amount of complexity, such that if the vehicle does not perform as expected, we have a better chance of identifying the source of the error. We are also building a mission repertoire that will be very useful when it comes time to flight test. Before we ever need to start chasing down corner cases, we'll want to demonstrate that we can fly the same mission many times in a row without any issues. Speaking of that repertoire (whose first 'r' is &lt;b&gt;very&lt;/b&gt; sneaky), wouldn't it be nice if we could tell the vehicle to just fly in circles? For one thing, it will help us to tune our orbit-following gains. It's also a great way to press "pause" in your simulation. And finally, this will be a useful feature down the road when we've got a camera onboard the aircraft and would like to observe something on the ground. Of course, if we have a mission loaded we don't want to throw it away just to orbit for a while, so the mission parameters will be saved and ready to resume once we're done orbiting.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qBPHEATv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120298-31674500-3df7-11eb-86d2-8c6f7767982f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qBPHEATv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120298-31674500-3df7-11eb-86d2-8c6f7767982f.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Lost in Earth
&lt;/h2&gt;

&lt;p&gt;If you're like me, you get lost all the time. So the notion of having to repeatedly find the same exact hard-coded start location seems pretty daunting. I suppose you could place a big flag there, but then it would be in the way when it comes time to land and you'd have to remember to remove it. And while I'm sure you're thinking, "Who would be dumb enough to ignore a giant red flag?" I am. I am dumb enough.&lt;br&gt;Now although I might not be capable of pinpointing my exact location, I can usually tell if I'm at the right airport. So what if we keep our flight waypoints fixed, but allow our takeoff and landing waypoints to move relative to our starting position? Then all we need to do is set the airplane down on the runway and point it in roughly the right direction. This was the strategy I used with my first autopilot, and it worked really well.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pc97vEng--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120306-34623580-3df7-11eb-9316-a9f8d44da203.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pc97vEng--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120306-34623580-3df7-11eb-9316-a9f8d44da203.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Yeet on over
&lt;/h2&gt;

&lt;p&gt;But what if you're in the middle of a mission and something more important comes up?&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zwhVlYte--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120316-3926e980-3df7-11eb-9e35-aac87de2271e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zwhVlYte--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120316-3926e980-3df7-11eb-9e35-aac87de2271e.png"&gt;&lt;/a&gt;&lt;a href="https://pics.onsizzle.com/bae-come-over-me-i-cant-amat-the-zoo-bae-2623471.png"&gt;3&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Ideally we wouldn't have to wait for the mission to complete. So the final addition to our planning toolbelt is a "land now" command. This creates a new Dubins path, with the vehicle's current location and heading as the first waypoint, leading directly to a landing.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dhQnONaJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120326-3e843400-3df7-11eb-91a6-5154dfb553d5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dhQnONaJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/102120326-3e843400-3df7-11eb-91a6-5154dfb553d5.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Moving Pictures
&lt;/h2&gt;

&lt;p&gt;As awesome as it is to look at pictures of missions, I think a video might be a little more entertaining. But if I showed it to you now, you wouldn't need to come back next time. So instead, I will leave you with all the things that Meat Loaf wouldn't do for love. See you soon!&lt;br&gt;&lt;br&gt;-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>uavs</category>
      <category>simulation</category>
    </item>
    <item>
      <title>Part 9: Con Air(frame)</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Thu, 10 Dec 2020 16:07:50 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-9-con-air-frame-30de</link>
      <guid>https://dev.to/greghgradwell/part-9-con-air-frame-30de</guid>
      <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rKMdV8Kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/101680480-9e549680-3a15-11eb-8ec7-dced3d1368bb.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rKMdV8Kg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/101680480-9e549680-3a15-11eb-8ec7-dced3d1368bb.gif"&gt;&lt;/a&gt;&lt;a href="https://i.imgur.com/EaXaVpN.gif"&gt;1&lt;/a&gt;&lt;/p&gt;&lt;/span&gt; &lt;br&gt;
I've been &lt;del&gt;reading&lt;/del&gt; imagining all your feedback, and I couldn't agree more. The action movie references have really dwindled lately, and it's unacceptable. The engineering should probably take a back seat. Hey, you want to feel old? &lt;em&gt;Con Air&lt;/em&gt; came out in 1997. Oof.&lt;br&gt;&lt;br&gt;&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6mkvzodJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680495-a3b1e100-3a15-11eb-9afe-49356649f995.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6mkvzodJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680495-a3b1e100-3a15-11eb-9afe-49356649f995.jpg"&gt;&lt;/a&gt;&lt;a href="https://i.kym-cdn.com/photos/images/newsfeed/001/706/064/57e.jpg"&gt;2&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Airframe Integration
&lt;/h2&gt;

&lt;p&gt;Alright, down to business. Today we'll be commiserating about how difficult it can be to take all your hardware from the bench and mount it in your vehicle. It's like when you see a gigantic couch at Costco that looks like it will fit in your studio apartment simply because it's surrounded by 140,000 square feet of warehouse. The mistake I make most often is forgetting that I will still need to access the hardware once it's installed. Yes, technically there might be room, but what happens when you need to replace the microSD card that is buried inside the fuselage and tucked up against a wall? This oversight cost me a TON of time, because I chose an ill-suited airframe for my first rounds of flight tests. Well, in truth, the &lt;a href="https://www.horizonhobby.com/product/ec-1500-twin-1.5m-pnp/EFL5775.html"&gt;Horizon Hobby EC-1500&lt;/a&gt; was the wrong choice for a number of reasons, but its lack of usable internal volume was one of them. With just the autopilot stack installed, it looks like this cockpit will have plenty of room. But once we add the flight pack and connect all the wiring, it gets really full and difficult to work with. And this was an early iteration, without the voltage/current monitoring hardware that we discussed in the previous write-up.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tJJ3GfYX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680538-b3312a00-3a15-11eb-8b62-7ec231f35dbe.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tJJ3GfYX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680538-b3312a00-3a15-11eb-8b62-7ec231f35dbe.jpg" width="50%"&gt;&lt;/a&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3At9g84u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680547-b4faed80-3a15-11eb-8a28-e63115469f96.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3At9g84u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680547-b4faed80-3a15-11eb-8a28-e63115469f96.jpg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Back to Basics
&lt;/h2&gt;

&lt;p&gt;Your first love always holds a special place in your heart, and with RC aircraft it is no exception. My favorite airplane to fly is the &lt;a href="https://www.horizonhobby.com/product/t-28-trojan-1.1m-pnp/PKZ8275.html"&gt;Parkzone T-28&lt;/a&gt; (sadly no longer available, unless you bought 5 of them a few years ago...😬). It is the vehicle I used for all of my first fixed-wing autopilot testing. It's light, flies great, and has lots of space on the inside for hardware. After my extremely unsuccessful tests with the EC-1500, I went back to simpler times (and &lt;a href="https://youtu.be/pCH26ZN7nkU"&gt;Simple Wafers wafer cookies&lt;/a&gt;). One thing I realized was that in order to test most of the autopilot functionality, I did not need GPS. By removing the ArduSimple board from the INS, I could use a smaller microcontroller and greatly reduce the overall footprint. I essentially just had an IMU at this point, but it was still outputting the same VectorNav INS message, so none of the Elixir code had to change. I also replaced the separate battery module with a &lt;a href="https://sixfab.com/product/raspberry-pi-power-management-ups-hat/"&gt;Sixfab Raspberry Pi UPS&lt;/a&gt; (uninterruptible power supply), which stacked on the Pi nicely and reduced some clutter. The resulting assembly was decently compact, and easy to install in the airplane:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8Ds-_WVw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680583-bfb58280-3a15-11eb-8847-fc6f1275959d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8Ds-_WVw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680583-bfb58280-3a15-11eb-8847-fc6f1275959d.jpg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IgdjBPZM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680599-c7752700-3a15-11eb-872e-6e51dd417272.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IgdjBPZM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680599-c7752700-3a15-11eb-872e-6e51dd417272.jpg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although the fit is tight on the sides, there is plenty of room in the front and the back, which is where all of the connections are made. With a custom adapter to secure the stack inside the fuselage, this setup was a breeze to handle. It enabled me to tune the gains for the rate and attitude controllers, which are the most crucial for stable flight. Once the attitude controller is working well, it takes away a lot of the pilot workload, and allows the pilot to test the high-level controller with the confidence that they can fall back to attitude mode if the high-level gains are unsatisfactory.&lt;br&gt;&lt;br&gt;&lt;/p&gt;

&lt;p&gt;With the attitude controller thoroughly (barely) vetted, I was able to move on to the next airframe, which would have room for a two-node setup. Instead of jumping to that hardware right away, I took the single stack from the small T-28 and flew with this first. Despite the difference in scale, I assumed that the controller gains would be close enough to start with, which turned out to be the case. Did I mention the bigger airframe was also a T-28? The Carbon-Z series of aircraft from E-flite are fantastic and the &lt;a href="https://www.horizonhobby.com/product/carbon-z-t-28-2.0m-bnf-basic-with-as3x/EFL1350.html"&gt;2.0m T-28&lt;/a&gt; is no exception. It has flaps and retractable landing gear, and looks awesome in the air.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kb1N6Pq6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680613-cd6b0800-3a15-11eb-9f48-ade80cef9879.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kb1N6Pq6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680613-cd6b0800-3a15-11eb-9f48-ade80cef9879.jpg" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
With the stack installed, we've got a ton of room.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Lneh8zCq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680980-4ff3c780-3a16-11eb-92aa-6edb5b1a6e7b.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Lneh8zCq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680980-4ff3c780-3a16-11eb-92aa-6edb5b1a6e7b.jpg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt; 

&lt;h2&gt;
  
  
  1 + 1 = 4
&lt;/h2&gt;


&lt;p&gt;The tough part about connecting an additional node is that it requires equipment beyond the node itself. In order for the nodes to communicate, they must be connected via Ethernet, and each must have an IP address. For the sake of simplicity and expandability, this is currently accomplished by means of an Ethernet switch and a small router. Fortunately they both require 5V to operate, so powering them is trivial. However, that still adds two pieces of hardware once we make the leap beyond a single node. We've also reintroduced the original INS. Now the travesty in all this is that I only took one picture of the complete setup...and it's blurry. So in case you think your brain is malfunctioning due to the rat's nest of wiring, you're probably fine. However, if a fire is not struck deep within your heart after you witness Cameron Poe enjoying the sweet feeling of the wind flowing through his majestic mane, then you might want to see a doctor.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PLcj_nSt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680681-e5db2280-3a15-11eb-8aa6-92bafefb87d3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PLcj_nSt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101680681-e5db2280-3a15-11eb-8aa6-92bafefb87d3.jpg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;&lt;br&gt;
&lt;span&gt;&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zmvk7EDZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/101680694-eb386d00-3a15-11eb-8180-74b093abd660.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zmvk7EDZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/101680694-eb386d00-3a15-11eb-8180-74b093abd660.gif" width="50%"&gt;&lt;/a&gt;&lt;a href="http://www.reddit.com/r/reactiongifs/comments/3pse4c/mrw_when_i_leave_the_apartment_complexes_gym/"&gt;3&lt;/a&gt;&lt;/p&gt;&lt;/span&gt;&lt;br&gt;&lt;br&gt;
The larger airframe flew very well, despite the additional weight. I was able to validate the two-node setup (the peripherals were split between the nodes), and began tuning the high-level controller gains. At this point it was time to retire the USB connections, and move to UART communication with the peripherals, which would add robustness to both the software and hardware aspects of the design. After all, "Eliminating handshakes makes light work". But that's a story for another time.&lt;br&gt;&lt;br&gt;-Greg

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>uavs</category>
      <category>sensors</category>
    </item>
    <item>
      <title>Part 8: The Dolt of Volt</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 07 Dec 2020 16:48:00 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-8-the-dolt-of-volt-p9d</link>
      <guid>https://dev.to/greghgradwell/part-8-the-dolt-of-volt-p9d</guid>
      <description>&lt;p&gt;Throughout my career, I've had the nasty habit of holding on to preconceived notions for far too long without questioning their validity. Often times, I will let a single bad experience with a product or technology cause me to discount it for months or years, until I am introduced to it again and realize what I've been missing. Sometimes I just make a boneheaded decision up front and then live with it (one of the many dangers of working mostly alone). And thus was the case when it came to me and voltage regulators.&lt;/p&gt;

&lt;h2&gt;
  
  
  Buck the trend
&lt;/h2&gt;

&lt;p&gt;I don't know why, but I've got a propensity for step-down regulators. Not just, like, in general. That would be weird. But if one of my devices needs a particular voltage, I'm more inclined to use a step-down (buck) converter than a step-up (boost) converter. Honestly, I think it's just because in my mind I've managed to conflate voltage with gravity, and OBVIOUSLY it's easier to go downhill. In my defense, buck converters tend to be more efficient than boost converters for the voltage and current that I'm usually dealing with. But the difference isn't huge, maybe 5-10% (and I've actually found a boost converter with essentially the same performance, so this argument has even less merit now). But let's just say that I can buck at 95% efficiency and boost at 85%. Seems like buck is the obvious choice, right? If you said yes in your head just to make me feel better, I truly appreciate it. Perhaps a little more context would help. Our Raspberry Pis require 5V to operate, and we're using batteries with lithium chemistry, so they're roughly 3.7V per cell. In other words, we're either boosting from 3.7V or bucking from 7.4V: one cell or two cells. Or, basically this:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Mkl45Ci8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376044-f0a28580-3864-11eb-9aed-79f969426d49.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Mkl45Ci8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376044-f0a28580-3864-11eb-9aed-79f969426d49.png"&gt;&lt;/a&gt;&lt;a href="https://power.tenergy.com/tenergy-li-ion-7-4v-2600mah-rechargeable-battery-pack-w-pcb-2s1p-19-24wh-5a-rate/"&gt;1&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Now, the difference between one and two cells may not seem like a lot, but when it comes to aircraft, every gram counts (yeah, we're doing the metric thing, sorry for the convenience). Not only that, but if we want to increase the capacity by connecting additional cells in parallel, we will obviously need to add twice as many for the 7.4V setup. So potentially the weight penalty could be significant. And while this seems obvious in hindsight, my dumb brain was still stuck in snow day mode, and I was sledding down that hill from 7.4 to 5, having the time of my life. The whole reason for this anecdote is that in the following example I am using a 7.4V battery pack. I have since seen the error of my ways, and have switched to 3.7V, but I haven't yet put together a complete setup. So let's just say that I went to Doofus Jail, but I'm out on parole, and I see no reason why my engineering voting rights should not be restored.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f8JnVHZG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pics.onsizzle.com/youre-a-unit-of-power-harry-mawatt-25594323.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f8JnVHZG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pics.onsizzle.com/youre-a-unit-of-power-harry-mawatt-25594323.png"&gt;&lt;/a&gt;&lt;a href="https://pics.onsizzle.com/youre-a-unit-of-power-harry-mawatt-25594323.png"&gt;2&lt;/a&gt;&lt;/p&gt;
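If you want to see that trade on paper, here is the back-of-the-envelope version in Python (the cell capacity and mass are placeholder numbers for an 18650-class cell, not measurements):

```python
def runtime_per_gram(cells, efficiency, load_w=5.0, cell_wh=9.6, cell_g=47.0):
    """Hours of 5V runtime per gram of battery: a crude figure of merit.
    cell_wh and cell_g are placeholder values for an 18650-class Li-ion cell."""
    return (cells * cell_wh * efficiency / load_w) / (cells * cell_g)

boost_1s = runtime_per_gram(cells=1, efficiency=0.85)  # boosting from 3.7V
buck_2s = runtime_per_gram(cells=2, efficiency=0.95)   # bucking from 7.4V

# Per gram of battery, bucking only wins by its efficiency edge (0.95/0.85,
# roughly 12%), while every parallel capacity bump now costs two cells
# instead of one.
```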
&lt;br&gt;
You know what's cool about having your battery die in the middle of a flight? Nothing. So we'd like to monitor our battery pack voltage, and for extra credit, let's get the current as well so we can calculate the energy discharged. Once again, we're using a 2-cell LiPo, so we can expect a maximum voltage of roughly 8.4V (when super-duper charged), and our Pololu &lt;a href="https://www.pololu.com/product/3782"&gt;step-down regulator&lt;/a&gt; is capable of providing 3.2A. This means that a voltage/current sensor like &lt;a href="https://www.ti.com/lit/ds/symlink/ina260.pdf"&gt;INA260&lt;/a&gt; from Texas Instruments will do just fine. And would you look at that? Adafruit sells a &lt;a href="https://www.adafruit.com/product/4226"&gt;breakout board&lt;/a&gt; ready to rock. We can communicate to this sensor over I²C (once we've ported their library to Elixir, of course), which will give us the battery voltage and current. Once we've added a &lt;a href="https://www.amazon.com/gp/product/B082PVGYX3"&gt;small protoboard&lt;/a&gt; to allow for two output connections (this particular setup was powering a Pi as well as a USB hub) and attached everything to the Tenergy &lt;a href="https://power.tenergy.com/tenergy-li-ion-7-4v-2600mah-rechargeable-battery-pack-w-pcb-2s1p-19-24wh-5a-rate/"&gt;LiPo pack&lt;/a&gt;, it looks like this:&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Hc5Gz7dr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375745-9d303780-3864-11eb-9795-320552a7edd0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Hc5Gz7dr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375745-9d303780-3864-11eb-9795-320552a7edd0.jpg" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;
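The INA260 itself is refreshingly simple to read: one 16-bit register for current and one for bus voltage, each with a 1.25 milli-unit LSB. A Python sketch of the register math (the smbus2 snippet is indicative only; check the datasheet and your wiring before trusting it):

```python
INA260_ADDR = 0x40      # default I2C address (configurable via address pins)
REG_CURRENT = 0x01      # signed, 1.25 mA per LSB
REG_BUS_VOLTAGE = 0x02  # unsigned, 1.25 mV per LSB

def raw_to_current_a(raw: int) -> float:
    """Convert the 16-bit two's-complement current register to amps."""
    if raw & 0x8000:
        raw -= 1 << 16
    return raw * 1.25e-3

def raw_to_voltage_v(raw: int) -> float:
    """Convert the bus voltage register to volts."""
    return raw * 1.25e-3

def energy_wh(samples, dt_s):
    """Integrate fixed-rate (volts, amps) samples into watt-hours discharged."""
    return sum(v * i for v, i in samples) * dt_s / 3600.0

# On real hardware, something like the following would poll the sensor
# (INA260 register words are big-endian, while SMBus reads them little-endian):
# from smbus2 import SMBus
# with SMBus(1) as bus:
#     raw = bus.read_word_data(INA260_ADDR, REG_BUS_VOLTAGE)
#     raw = ((raw & 0xFF) << 8) | (raw >> 8)  # swap to big-endian
#     print(raw_to_voltage_v(raw))
```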
&lt;br&gt;
That little two-sided connector on the left is a &lt;a href="https://www.sparkfun.com/products/14495"&gt;Qwiic Adapter&lt;/a&gt; from Sparkfun that allows for I²C connections using their Qwiic cables (the Adafruit equivalent is STEMMA). Both companies make a bunch of boards/sensors with this connector, making it really easy to swap devices or change the length of your cable. Once your design is a bit more polished you'll probably want to move to soldered wiring (or a PCB), but I've had good results with these connection types.&lt;br&gt;&lt;br&gt;&lt;br&gt;
The pack above has a 2600mAh capacity, which is plenty for a single-node setup. For multiple nodes, we will also require power for the ethernet switch and router (used only to assign IP addresses), so it is necessary to upgrade the voltage regulator to something a little &lt;a href="https://www.pololu.com/product/4091"&gt;bigger&lt;/a&gt;. We've also swapped the proto board for a &lt;a href="https://www.amazon.com/MBSS-Solderable-Breadboard-Proto-Board/dp/B082PV1V6S"&gt;larger&lt;/a&gt; one, with the final result as follows. As you can see, the &lt;a href="https://power.tenergy.com/at-tenergy-18650-7-4v-5200mah-rechargeable-battery-pack-w-pcb-2s2p-38-48wh-5a-rate/"&gt;battery&lt;/a&gt; is quite a bit bigger as well. It is also from Tenergy, and has a capacity of 5200mAh.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YfYn2Txe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375760-a15c5500-3864-11eb-97fc-1aa36c0dcb78.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YfYn2Txe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375760-a15c5500-3864-11eb-97fc-1aa36c0dcb78.jpg" width="80%"&gt;&lt;/a&gt;&lt;/p&gt;
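For a rough sense of what these capacities buy you, the energy math is one line. The load wattages and 90% regulator efficiency below are my guesses, not measurements, so swap in your own numbers:

```python
# Back-of-the-envelope runtime for the packs above. The load figures and
# regulator efficiency are placeholder guesses; measure your actual draw.
def runtime_hours(capacity_mah, pack_v, load_w, efficiency=0.9):
    """Hours of runtime from pack energy, derated for regulator losses."""
    return capacity_mah / 1000.0 * pack_v * efficiency / load_w

small_pack = runtime_hours(2600, 7.4, 5.0)   # single-node setup, ~3.5 h
big_pack = runtime_hours(5200, 7.4, 12.0)    # nodes + switch + router
```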

&lt;h2&gt;
  
  
  Flight Ranger
&lt;/h2&gt;

&lt;p&gt;So this is great for our autopilot power, but if we plan on motorin', we're going to need battery power for our propulsion system. This will also be a LiPo pack, but the current draw will be more than the INA260 can handle. We'll be dealing in the ballpark of 24V and 60A, so something like this &lt;a href="https://www.sparkfun.com/products/16408"&gt;KR Sense&lt;/a&gt; board will be more appropriate.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jd1h-H4x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376307-46772d80-3865-11eb-8230-6aa6a36b7737.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jd1h-H4x--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376307-46772d80-3865-11eb-8230-6aa6a36b7737.jpg"&gt;&lt;/a&gt;&lt;a href="https://cdn.sparkfun.com//assets/parts/1/5/1/4/1/16408-KR_Sense_90A_Current_and_Voltage_Sensor-02.jpg"&gt;3&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
However, rather than I²C, this sensor communicates using an analog voltage output. Unfortunately, the Raspberry Pi does not have any pins capable of analog-to-digital conversion, so we must employ a separate ADC to handle the analog readings. Sparkfun makes a handy &lt;a href="https://www.sparkfun.com/products/15334"&gt;board&lt;/a&gt; for just such an occasion, which has 4 analog inputs and uses the Qwiic connector that you've heard so much about.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PA63WZrL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376317-48d98780-3865-11eb-96b4-d71ee544cadc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PA63WZrL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101376317-48d98780-3865-11eb-96b4-d71ee544cadc.jpg"&gt;&lt;/a&gt;&lt;a href="https://cdn.sparkfun.com//assets/parts/1/3/8/5/7/15334-SparkFun_Qwiic_12_Bit_ADC_-_4_Channel__ADS1015_-01.jpg"&gt;4&lt;/a&gt;&lt;/p&gt;
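The software side of that chain is just two scale factors, sketched here in Python. Heads up: the full-scale voltage and the volts-per-volt/volts-per-amp sense factors below are placeholders I made up for illustration; pull the real values from your ADC configuration and the KR Sense datasheet:

```python
# Sketch of turning a 12-bit ADC reading back into a battery measurement.
# ADC_FULL_SCALE_V and the SENSE_* factors are hypothetical placeholders.
ADC_BITS = 12
ADC_FULL_SCALE_V = 3.3       # assumed ADC input range
SENSE_V_PER_V = 63.69e-3     # hypothetical: sense-pin volts per battery volt
SENSE_V_PER_A = 36.60e-3     # hypothetical: sense-pin volts per amp

def adc_to_volts(raw):
    """Raw 12-bit reading -> voltage seen at the ADC pin."""
    return raw / float(2 ** ADC_BITS - 1) * ADC_FULL_SCALE_V

def battery_voltage(raw):
    return adc_to_volts(raw) / SENSE_V_PER_V

def battery_current(raw):
    return adc_to_volts(raw) / SENSE_V_PER_A
```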
&lt;br&gt;
The final product turns out much more compact than you'd imagine, because we solder our XT-60 battery connectors directly to the KR Sense board. After wrapping some Kapton tape to insulate the boards and adding some &lt;a href="https://www.amazon.com/gp/product/B07XYYSTQB"&gt;heavy duty double-sided tape&lt;/a&gt;, our propulsion voltage/current sensor looks like this:&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EiHEYrSO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375776-a6b99f80-3864-11eb-9fb7-faca79600fdc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EiHEYrSO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375776-a6b99f80-3864-11eb-9fb7-faca79600fdc.jpg" width="50%"&gt;&lt;/a&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8QSQ1tbv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375781-a9b49000-3864-11eb-94e9-16ae9daac750.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8QSQ1tbv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101375781-a9b49000-3864-11eb-94e9-16ae9daac750.jpg" width="48.8%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Great Responsibility
&lt;/h2&gt;

&lt;p&gt;Well there you have it, folks. I just talked about voltage regulation and current sensing for what felt like 20 minutes. And now I am &lt;em&gt;sensing&lt;/em&gt; that you are bored. But next time might be more interesting! We're going to look at how we actually cram all this stuff into an airplane. You'd be amazed how quickly that darn fuselage fills up. But until then, I hope you walk around with your head a little higher, knowing that your universe can now be protected from dead batteries mid-flight.&lt;br&gt;&lt;br&gt;&lt;br&gt;
-Greg&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fg92gBOb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://i.imgur.com/saF5rwZ.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fg92gBOb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://i.imgur.com/saF5rwZ.gif"&gt;&lt;/a&gt;&lt;a href="https://i.imgur.com/saF5rwZ.gif"&gt;5&lt;/a&gt;&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>uavs</category>
      <category>sensors</category>
    </item>
    <item>
      <title>Part 7: The Mimic Gimmick</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Thu, 03 Dec 2020 16:52:03 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-7-the-mimic-gimmick-f4m</link>
      <guid>https://dev.to/greghgradwell/part-7-the-mimic-gimmick-f4m</guid>
      <description>&lt;p&gt;What's that saying? &lt;em&gt;"Fake it until you make it."&lt;/em&gt; Well today's topic goes a little something like this:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Make it and fake it until you can afford to partake it.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;For as long as there have been prototypes, there has been the question of &lt;strong&gt;make vs. buy&lt;/strong&gt;. If you're anything like me, you're a real sucker for the &lt;strong&gt;make&lt;/strong&gt; argument, &lt;em&gt;especially&lt;/em&gt; when you actually understand how something works. But it sure is nice to just unwrap your newly acquired piece of hardware, plug it in, and get on with your life. However, sometimes that hardware is really expensive, and you're just working on some crazy idea that will likely never generate a dollar, let alone enough revenue to justify a $5,000 inertial navigation system (INS). So if the plan is to someday use a &lt;a href="https://www.vectornav.com/products/VN-300"&gt;VN-300 Dual Antenna GNSS-Aided INS&lt;/a&gt;, but our capital is currently tied up saving for a &lt;a href="https://www.lego.com/en-us/product/nintendo-entertainment-system-71374?CMP=AFC-AffiliateUS-msYS1Nvjv4c-3624890-115554-1"&gt;LEGO NES Console&lt;/a&gt;, what can we do in the meantime?&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Make It
&lt;/h2&gt;

&lt;p&gt;There's nothing terribly complicated about how an INS works. Unless you need accurate, reliable results, at which point it gets a little difficult. If there is one thing I know from experience, it's that a mediocre INS is sufficient to enable a fixed-wing aircraft to fly autonomously. In my case, "mediocre" is being extremely generous. So this thing doesn't have to be cosmic, it just needs to work reasonably well and provide consistent output. If there's one part we really want to get right, it's the inertial measurement unit (IMU). This will be giving us the attitude of the vehicle (&lt;code&gt;roll&lt;/code&gt;/&lt;code&gt;pitch&lt;/code&gt;/&lt;code&gt;yaw&lt;/code&gt;), as well as the body axes rotation rates (I usually refer to these as &lt;code&gt;roll_rate&lt;/code&gt;, &lt;code&gt;pitch_rate&lt;/code&gt;, and &lt;code&gt;yaw_rate&lt;/code&gt;. I know this is incorrect, so if you need to scream into a pillow or something, now's your chance).&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iMhbRb_K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057755-41526f80-3541-11eb-98c9-9182b45d4b27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iMhbRb_K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057755-41526f80-3541-11eb-98c9-9182b45d4b27.png"&gt;&lt;/a&gt;&lt;a href="https://invensense.tdk.com/wp-content/uploads/2016/06/DS-000189-ICM-20948-v1.3.pdf"&gt;1&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
I won't get too into the weeds with the IMU, because it would all just be second-hand information at best. But two sensor breakout boards that I have been using recently are the Bosch &lt;a href="https://www.sparkfun.com/products/14686"&gt;BNO080&lt;/a&gt; and the InvenSense &lt;a href="https://www.sparkfun.com/products/15335"&gt;ICM-20948&lt;/a&gt;. I would recommend the ICM-20948, as it is cheaper ($17), more stable, and as you will soon read, equipped with a really awesome feature.

&lt;ul&gt;
&lt;li&gt;If you want to collect accelerometer and gyroscope data and fuse them together yourself, check out this page: &lt;a href="https://x-io.co.uk/open-source-imu-and-ahrs-algorithms"&gt;https://x-io.co.uk/open-source-imu-and-ahrs-algorithms&lt;/a&gt;. They've got source code! You'll be up and running within a day.&lt;/li&gt;
&lt;li&gt;If you want to trust engineers that work on IMUs for a living, go with the ICM-20948 and utilize their Digital Motion Processor (DMP), which FINALLY has working source code available &lt;a href="https://github.com/ericalbers/ICM20948_DMP_Arduino"&gt;https://github.com/ericalbers/ICM20948_DMP_Arduino&lt;/a&gt; (HUGE shoutout to Eric Albers. I've been trying to find code like this for 5+ years). The DMP does all the sensor fusion calculations for you, so you can just get straight to the answer. I haven't flown using it yet, but so far on the bench it looks 👍 &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Alright, we've got attitude and attitude_rate, now we need some position and velocity information. But first, let me point out that I am NOT using the magnetometer from the IMU to determine vehicle heading (the direction it's pointing). Remember back in the day when you had to wave your iPhone around in a figure-8 so that Google Maps knew which way you were facing? Yeah, that's because the earth's magnetic field is super weak. And there's all sorts of magnetic interference on a vehicle with electricity and moving metallic parts, so I swore off magnetometers years ago. Fortunately there is a great way to determine heading using GPS. Oh, right, I forgot to mention that we'll be using GPS to obtain position and velocity (although you probably knew this already). But wait, there's more! Actually it's just another GPS. But it's not really doing as much work.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SzQ_I7Ll--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057774-444d6000-3541-11eb-8f5a-b5744ec6370a.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SzQ_I7Ll--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057774-444d6000-3541-11eb-8f5a-b5744ec6370a.jpg"&gt;&lt;/a&gt;&lt;a href="https://imgflip.com/i/4opm5k"&gt;2&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
The basic theory behind using a dual-GPS solution to determine heading is that if you have two GPS units, you can get an accurate estimate of their position relative to each other. So if these units are attached to your vehicle at known locations, then you can determine the heading of your vehicle. Super simple, super cool. Speaking of simple (no, not you, dear reader), the company ArduSimple makes an affordable &lt;a href="https://www.ardusimple.com/product/simplertk2b-heading-basic-starter-kit-ip67/"&gt;Dual-GPS kit&lt;/a&gt; with really straightforward integration (you're familiar with u-blox protocol, right?). And while $540 certainly isn't cheap, we're still well under our $5k baseline.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5Qgb7Sw6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057788-48797d80-3541-11eb-915e-6fba634b50d4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5Qgb7Sw6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057788-48797d80-3541-11eb-915e-6fba634b50d4.png" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;
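A toy version of that idea, in Python for brevity: with both antennas mounted at known spots on the airframe, the bearing from one position fix to the other gives you vehicle heading. (A real moving-baseline RTK solution does far more work than this, but the geometry is the point.)

```python
import math

def heading_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2, in degrees 0-360."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```

If the rear antenna sits at the first fix and the front antenna at the second, that bearing is the vehicle heading, no magnetometer required.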

&lt;p&gt;Alright, so we have data sources for attitude, attitude_rate, position, and velocity, but so far these are just measurements. In other words, they will be subject to noise and biases, and we don't want to have our state estimate jumping around from data point to data point. We need some sort of a filter. If you want to be cool, use an &lt;a href="https://www.seas.harvard.edu/courses/cs281/papers/unscented.pdf"&gt;Unscented Kalman Filter&lt;/a&gt; (UKF). If you actually want to implement your INS, use an &lt;a href="https://en.wikipedia.org/wiki/Extended_Kalman_filter"&gt;Extended Kalman Filter&lt;/a&gt;. Now don't get me wrong, UKFs are freaking RAD. They make a ton of sense, and are clearly the better theoretical way to estimate the state of non-linear systems. But if you can find one working example that includes source code or even a detailed algorithm, then you win the internet. Conversely, EKFs are being used in several open source autopilots (Ardupilot, PX4), and thus you can look at real source code if you have the patience. The unfortunate part about most UKFs and EKFs is that they are HUGE in terms of how many states they contain. This makes debugging really difficult when you're trying to implement your first Kalman filter and have no idea where to start. So after failing several times over the past 5 years to get a 22-state EKF running, I aimed a little lower. I was able to take advantage of a Udacity "one free month" promotion, and enrolled in their Flying Car Nanodegree for the sole purpose of completing the estimation project that included a &lt;a href="https://github.com/udacity/FCND-Estimation-CPP"&gt;7-state EKF&lt;/a&gt; (position: 3, velocity: 3, heading: 1). One of the best parts about this project was that it included a quadcopter simulator, with which you could test your EKF algorithm. I modified the EKF to use roll and pitch coming directly from my IMU (they calculate attitude in a different way), and the results looked pretty good. 
The final hardware stack looked like this (the Xbee radio in the bottom right is not related):&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QOuLxK5l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057801-4dd6c800-3541-11eb-881c-fdd5514eda9c.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QOuLxK5l--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/101057801-4dd6c800-3541-11eb-881c-fdd5514eda9c.JPG" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
The ArduSimple sits on top of a &lt;a href="https://www.sparkfun.com/products/14812"&gt;Sparkfun RedBoard Turbo&lt;/a&gt;, which uses my favorite chip, the ATSAMD21G18 (Cheetos are second, only because I'm not exactly sure if they count as a chip). The INS code lives on the RedBoard and combines the IMU data with GPS data in the 7-state EKF, to produce the attitude, attitude_rate, position, and velocity information that we need. In case you're not familiar with the size of Arduino boards, this stack is about 3.2"x3.15", while the VectorNav VN-300 that we are emulating is 1.77"x1.73". So ours is quite a bit bigger, but I can make 8 of these for the price of one VN-300.
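Rewinding to the filter itself: the predict/update heartbeat that the 7-state EKF repeats for every state is easier to grok in one dimension. A scalar Python toy, not the actual filter code:

```python
def predict(x, p, u, q):
    """Propagate the state estimate x and its variance p forward in time."""
    return x + u, p + q

def update(x, p, z, r):
    """Fuse a measurement z (with variance r) into the estimate."""
    k = p / (p + r)                    # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.0, 1.0                        # initial guess: no idea, high variance
for z in [1.2, 0.9, 1.1]:              # noisy measurements of a true value of 1.0
    x, p = predict(x, p, 0.0, 0.01)
    x, p = update(x, p, z, 0.25)
```

After three measurements the estimate converges toward the true value while the variance shrinks, which is exactly the smoothing behavior we want from the INS.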

&lt;blockquote&gt;
&lt;p&gt;Note: The RingBuffer.h file for any Sparkfun SAMD board must be modified to increase the &lt;code&gt;SERIAL_BUFFER_SIZE&lt;/code&gt;. I changed it from 64 to 256. The messages coming from our u-blox can be 100 bytes, and will be dropped if the buffer is too small. That bug was really painful.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Step 2: Fake It
&lt;/h2&gt;

&lt;p&gt;Now that we've got the INS producing acceptable results, how do we integrate it with the autopilot? Because the plan is to eventually replace our stack with a real VN-300, we'd like to have that swap be as painless as possible. The easiest way to do that is to have our INS output messages that conform to the VectorNav binary message protocol. For all you datasheet lovers, here's the goods: &lt;a href="http://rmrco.com/proj/shiprad/vn300-user-manual.pdf"&gt;http://rmrco.com/proj/shiprad/vn300-user-manual.pdf&lt;/a&gt; (VectorNav requires you to register in order to download their datasheet, which, in my opinion, is really not awesome).&lt;br&gt;&lt;br&gt;
At the time I was developing this INS I was fortunate enough to have access to a real VN-200 (single GPS), so I could test my message parser with a real piece of hardware. I've never understood why most message protocol datasheets don't include examples of an actual message. It's such a crucial piece of information. So if you're trying to develop your own VectorNav binary parser and need some messages to test with, drop me a line!&lt;br&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Test It
&lt;/h2&gt;

&lt;p&gt;Okay, great, so I have an INS sitting at my desk, and the &lt;code&gt;roll&lt;/code&gt;/&lt;code&gt;pitch&lt;/code&gt;/&lt;code&gt;yaw&lt;/code&gt; looks pretty reasonable, but how the heck do I test the position and velocity algorithms without having to strap my whole desk to a truck and drive around? You guessed it. X-Plane. With the UDP data output from X-Plane, we can simulate messages that the INS would expect to be coming from the two u-blox (lowercase company names drive me nuts) GPS units. So instead of receiving GPS updates from real hardware, the INS will be getting the position/velocity from our vehicle in X-Plane, and calculating its solution accordingly. I used this same method to test my IMU algorithm: rates and accelerations go in, attitude comes out. After the bugs were worked out, our INS hardware could be fully in the loop of our X-Plane simulation. This approach proved quite successful, as the INS code required only a few tweaks when it was placed on a real vehicle. For comparison, the last time I developed an INS, I strapped down the hardware to a golf cart and drove around for days chasing all the bugs. So if you think that this current attempt is a little hacky, it used to be a LOT worse 😆&lt;/p&gt;
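One parser detail worth writing down while we're at it: every VectorNav binary message (the protocol from Step 2) is guarded by a 16-bit CRC computed over each byte between the 0xFA sync byte and the CRC itself. This is the CRC16-CCITT routine from their user manual, ported from C to Python:

```python
def vn_crc(data):
    """16-bit CRC used by the VectorNav binary protocol (CRC16-CCITT,
    zero initial value), computed over the bytes after the 0xFA sync byte."""
    crc = 0
    for byte in data:
        crc = ((crc >> 8) | (crc << 8)) & 0xFFFF  # swap bytes
        crc ^= byte
        crc ^= (crc & 0xFF) >> 4
        crc = (crc ^ (crc << 12)) & 0xFFFF
        crc = (crc ^ ((crc & 0xFF) << 5)) & 0xFFFF
    return crc
```

A received message is valid when the CRC computed over its body matches the trailing two bytes; dropping messages that fail this check saved me from acting on corrupted serial data more than once.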

&lt;h2&gt;
  
  
  Step 4: Replace It
&lt;/h2&gt;

&lt;p&gt;Okay, so I'm not actually at step 4 yet, which will be when our INS wannabe is replaced by a real-deal VN-300. But until then, our first three steps produced pretty good results, as proven by several flight tests. So the next time you've got $582.25 lying around, maybe think about building your own Dual-GPS INS?&lt;br&gt;&lt;br&gt;&lt;/p&gt;

&lt;p&gt;Next time we'll talk about something a little more exciting. "Impossible!" you say, as you awake from the boredom-induced slumber that can only come from something as dull as a state estimation lecture. How about something a little more universally applicable, like step-down and step-up voltage regulators and the tools we can use to monitor our batteries? Wow, it's a good thing you've got the weekend to prepare 😮&lt;br&gt;&lt;br&gt;&lt;/p&gt;

&lt;p&gt;-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>uav</category>
      <category>autopilot</category>
      <category>estimation</category>
    </item>
    <item>
      <title>Part 6: SimPATHy</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 30 Nov 2020 16:21:16 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-6-simpathy-5e0k</link>
      <guid>https://dev.to/greghgradwell/part-6-simpathy-5e0k</guid>
      <description>&lt;h1&gt;
  
  
  6. SimPATHy
&lt;/h1&gt;

&lt;p&gt;For someone who keeps using the word "autopilot", I sure have avoided the autonomous part of the project for a long time. Well, hold onto your socks because today we'll be describing how the vehicle actually pilots itself. And the good news is, the explanation is quite simple:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PHmht45V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631032-28974f00-32e0-11eb-9d21-b738923a2a20.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PHmht45V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631032-28974f00-32e0-11eb-9d21-b738923a2a20.jpg"&gt;&lt;/a&gt;&lt;a href="https://i.kym-cdn.com/photos/images/original/000/183/103/alens.jpg"&gt;1&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
But how did the aliens do it? One theory is that they used &lt;a href="https://en.wikipedia.org/wiki/Dubins_path"&gt;Dubins paths&lt;/a&gt;. If you've never heard of a Dubins path, that's probably because you spent your college years with friends. Big mistake. In case you don't feel like clicking on that nicely added link, a Dubins path is essentially the shortest path between two points given a constraint on the curvature (turning radius) of the path. To create a Dubins path you just need to specify two or more waypoints, each of which includes the following information:

&lt;ul&gt;
&lt;li&gt;3D location (latitude, longitude, altitude)&lt;/li&gt;
&lt;li&gt;Heading (direction the vehicle should be headed upon reaching the waypoint)&lt;/li&gt;
&lt;li&gt;Speed&lt;/li&gt;
&lt;li&gt;Vehicle turning radius (or rate of turn)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each segment will include a constant-radius turn, a straight line, and another constant-radius turn, as seen in this example:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9aQjFok1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631052-2d5c0300-32e0-11eb-81f8-a73e25526200.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9aQjFok1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631052-2d5c0300-32e0-11eb-81f8-a73e25526200.png" width="30%"&gt;&lt;/a&gt;&lt;a href="https://commons.wikimedia.org/w/index.php?curid=46816719"&gt;2&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Pretty much everything I know about Dubins paths I learned from Randal Beard's book &lt;em&gt;Small Unmanned Aircraft: Theory and Practice&lt;/em&gt;. What's awesome, is that you can view the contents of the book for free on his website: &lt;a href="https://uavbook.byu.edu"&gt;https://uavbook.byu.edu&lt;/a&gt;. This material covers much more than just Dubins paths, and is a great resource if you are interested in aircraft autonomy.&lt;br&gt;&lt;br&gt;
The unfortunate part about Dubins paths is that they involve a LOT of calculations. They aren't difficult, but there are plenty of opportunities for human error to creep in. I have rewritten these equations in 4 different languages, and I made copious mistakes every single time. So if you're going to take a shot at path planning with Dubins, be sure you have a way to quickly create and test your paths. The nice thing is that Dubins paths are pretty straightforward to display, as they do not involve any complex curvature. If you have a graphical interface that can handle straight lines and circles, then you can display Dubins paths. And if you can display them, you can find out rather quickly when you have problems.
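To give a flavor of those calculations, here's a toy right-straight-right (RSR) candidate length in a flat X/Y frame, in Python for brevity, with a single shared radius. A real planner also evaluates the RSL, LSR, and LSL candidates and keeps the shortest:

```python
import math

def mod2pi(a):
    """Wrap an angle into [0, 2*pi)."""
    return a % (2 * math.pi)

def rsr_length(start, end, r):
    """Length of the right-straight-right Dubins candidate.
    start/end are (x, y, heading) tuples, heading in radians from +x."""
    x0, y0, h0 = start
    x1, y1, h1 = end
    # Right-turn circle centers sit 90 degrees clockwise of each heading.
    cx0, cy0 = x0 + r * math.cos(h0 - math.pi / 2), y0 + r * math.sin(h0 - math.pi / 2)
    cx1, cy1 = x1 + r * math.cos(h1 - math.pi / 2), y1 + r * math.sin(h1 - math.pi / 2)
    d = math.hypot(cx1 - cx0, cy1 - cy0)      # straight-segment length
    theta = math.atan2(cy1 - cy0, cx1 - cx0)  # straight-segment heading
    arc0 = mod2pi(h0 - theta)                 # clockwise sweep onto the line
    arc1 = mod2pi(theta - h1)                 # clockwise sweep off the line
    return d + r * (arc0 + arc1)
```

Even this one candidate has plenty of room for sign errors in the arc sweeps, which is exactly why you want a display to sanity-check against.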
&lt;h2&gt;
  
  
  The Scenic View
&lt;/h2&gt;

&lt;p&gt;One of the most popular libraries for creating user interfaces in Elixir is called &lt;a href="https://github.com/boydm/scenic"&gt;Scenic&lt;/a&gt;. If you don't like words without moving pictures and sound, check out the ElixirConf 2018 introduction to Scenic: &lt;a href="https://youtu.be/1QNxLNMq3Uw"&gt;https://youtu.be/1QNxLNMq3Uw&lt;/a&gt;. There's no doubt that Scenic is far more powerful than what I have used it for, but at the very least it's great for creating simple shapes and text and updating them on demand. Since my ground control station (GCS) pretty much only uses rectangles, circles, lines, and text boxes, it was a great fit. The first thing I created was a display to include the current vehicle information, as well as the corresponding autopilot commands.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oJjSh2cF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631074-32b94d80-32e0-11eb-9761-e6d498bf4877.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oJjSh2cF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631074-32b94d80-32e0-11eb-9761-e6d498bf4877.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
The values in the column along the left are all the estimated values coming from the vehicle. The commands are contained in the rows on the right. This gives you a good sense of the controller pipeline: &lt;code&gt;speed&lt;/code&gt;/&lt;code&gt;course&lt;/code&gt;/&lt;code&gt;altitude&lt;/code&gt; commands are converted to &lt;code&gt;thrust&lt;/code&gt;/&lt;code&gt;roll&lt;/code&gt;/&lt;code&gt;pitch&lt;/code&gt;/&lt;code&gt;yaw&lt;/code&gt; commands, which are converted to &lt;code&gt;rollrate&lt;/code&gt;/&lt;code&gt;pitchrate&lt;/code&gt;/&lt;code&gt;yawrate&lt;/code&gt; commands (thrust carries over directly). This interface was useful for tuning the PID controllers in the simulator, as I could quickly compare commanded and actual values. Once the vehicle flew well under manual control (and I had a decent handle on how to use Scenic), I could get to those Dubins paths everyone's been talking about.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4y07DN2e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631086-351ba780-32e0-11eb-8a0c-2f69c34731ee.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4y07DN2e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631086-351ba780-32e0-11eb-8a0c-2f69c34731ee.jpg"&gt;&lt;/a&gt;&lt;a href="https://imgflip.com/i/4obe5k"&gt;3&lt;/a&gt;&lt;/p&gt;
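In code, that pipeline is just nested loops, with each stage's output becoming the next stage's command. A Python cartoon with made-up proportional gains and limits (the real loops are full PIDs with integrators and trims):

```python
def p_loop(cmd, actual, kp, limit):
    """Proportional loop with symmetric output saturation."""
    out = kp * (cmd - actual)
    return max(-limit, min(limit, out))

# course error (rad) -> roll command (rad) -> roll-rate command (rad/s);
# the gains and limits here are illustrative, not tuned values.
roll_cmd = p_loop(0.3, 0.0, kp=1.2, limit=0.6)
rollrate_cmd = p_loop(roll_cmd, 0.05, kp=4.0, limit=2.0)
```

The display made tuning this cascade bearable: each row of commands on the right is one of these loop outputs, sitting next to the estimate it's chasing.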
&lt;br&gt;
The first step was to get a simple, repeatable path to display correctly. I used a racetrack with all right-hand turns. In case you didn't look at any of the required reading, there are four possible Dubins segments in regards to the turn direction for the starting and ending turns: &lt;em&gt;left-left&lt;/em&gt;, &lt;em&gt;left-right&lt;/em&gt;, &lt;em&gt;right-left&lt;/em&gt;, and &lt;em&gt;right-right&lt;/em&gt;. A right-handed racetrack only tests one of the cases: &lt;em&gt;right-right&lt;/em&gt;. This is approximately not 100% of the cases. But it's a start.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MxrLH4ml--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631105-3947c500-32e0-11eb-857e-6fb03248b989.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MxrLH4ml--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631105-3947c500-32e0-11eb-857e-6fb03248b989.png" width="30%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
You'll notice the arcs are either green or red, and there's some small circles at the center of said arcs. But this example is terrible for explaining what those are, so let's go to the next path I tried, which was an hourglass. The waypoint locations and headings are identical to the racetrack, but the order is different, which results in all of the four Dubins segments being present. &lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5QPsz9Yz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631114-3c42b580-32e0-11eb-95b7-4851991717bc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5QPsz9Yz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631114-3c42b580-32e0-11eb-95b7-4851991717bc.png" width="30%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now for the red/green business (if you've never seen &lt;a href="https://www.youtube.com/user/RedGreenTV"&gt;The Red Green Show&lt;/a&gt;, you need to get your life together). As you recall from roughly two minutes ago, a Dubins segment is described as a turn-straight-turn. In color form, I have displayed this as green-white-red. The center of each turn arc is represented by a similarly-colored circle. If the two consecutive segments share a turn direction, e.g., a &lt;em&gt;right-left&lt;/em&gt; followed by a &lt;em&gt;left-left&lt;/em&gt; share a left turn, their arc centers are identical.&lt;br&gt;&lt;/p&gt;

&lt;p&gt;I wish I had taken screenshots of all the messed up paths I was creating until I fixed the bugs in my code. I assure you it took a while. One of the biggest ways I kept screwing up was related to my decision to keep all waypoint locations in geographic coordinates (latitude/longitude) as opposed to converting them to Cartesian coordinates (X/Y/Z). This was contrary to how I had done things in the past, so I'm sure the newness contributed to some of the errors. But I think the biggest issue was converting latitude/longitude to displacements in terms of pixels, as was necessary in order to plot the coordinates in Scenic. This is because one degree of latitude is not the same distance as one degree of longitude. Throw in the desire to create a constant margin surrounding the waypoints, and I managed to run in circles for a couple of days trying to put that darn √2 in the right spot.&lt;br&gt;&lt;br&gt;
Another added bit of difficulty was that typical Dubins path algorithms assume a constant turn radius for all waypoints. This seemed like an unnecessary constraint to me, so I reworked the equations to treat the starting and ending turn radii as separate variables. After several days of failing and crying, I understood why people just picked a single radius and stuck with it. Nevertheless, it's possible, and here is the path to prove it: &lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w6TnG4u7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631125-3f3da600-32e0-11eb-829e-6fea0fd46dbe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w6TnG4u7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631125-3f3da600-32e0-11eb-829e-6fea0fd46dbe.png" width="30%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once I was able to create paths correctly, I added takeoff and landing points. The autopilot adjusts its path-following logic depending on the type of waypoint. For example, during takeoff, course corrections will be achieved using yaw instead of roll until the aircraft has left the ground. Flap settings are also dependent on the type of waypoint (takeoff flaps, landing flaps, no flaps for cruise). For the sake of stressing the path generator and the autopilot, I created missions at random when flying in the simulator. Below is one such path, which sort of looks like a slot car controller. I believe this is proof that aliens also have slot car technology. But that will be the subject of an entirely different series.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UMtV4xkb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100632065-3ac5bd00-32e1-11eb-870d-d35c75fcff03.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UMtV4xkb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100632065-3ac5bd00-32e1-11eb-870d-d35c75fcff03.png" width="49%"&gt;&lt;/a&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7aM1047d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631158-482e7780-32e0-11eb-905a-ef871b846107.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7aM1047d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631158-482e7780-32e0-11eb-905a-ef871b846107.jpg" width="49%"&gt;&lt;/a&gt;&lt;a href="https://i.ebayimg.com/images/g/4xYAAOSwCFhfAF05/s-l1600.jpg"&gt;4&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Not all who wander are testing path following algorithms
&lt;/h2&gt;

&lt;p&gt;When we take the Scenic GCS and the Scenic path planner and fly our aircraft in X-Plane, this is the view. The bottom right corner contains the terminal output, which in this case is simply showing that our cluster only contains 1 node, and therefore the cluster is complete.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fdRsJnUk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631185-4c5a9500-32e0-11eb-88f1-ebcb4b1e6e29.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fdRsJnUk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100631185-4c5a9500-32e0-11eb-88f1-ebcb4b1e6e29.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Did I mention I have been able to do all of this testing with the free X-Plane demo version? Maybe if this autopilot ever makes some money I'll go back and buy a copy (look under your chair! Just kidding). But until then, we'll keep chatting about spinning copper into gold. Next time we'll look at how to replicate a VectorNav VN-300 INS for approximately 12% of the cost (and 5% of the capability, but who's counting).&lt;br&gt;&lt;br&gt;&lt;/p&gt;

&lt;p&gt;-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>simulation</category>
      <category>uav</category>
    </item>
    <item>
      <title>Part 5b: Good HIL Hunting</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Thu, 26 Nov 2020 16:09:53 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-5b-good-hil-hunting-352f</link>
      <guid>https://dev.to/greghgradwell/part-5b-good-hil-hunting-352f</guid>
      <description>&lt;p&gt;If you remember from last time, we were parsing X-Plane data output messages and converting them to native Elixir messages that contained the same information. By doing this, the &lt;code&gt;Estimation&lt;/code&gt; module could not tell that its data was coming from a simulator instead of a real aircraft. Here is the data path, as previously shown:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tKsqamsN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943289-eab98a00-2d25-11eb-8721-182d00da88ac.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tKsqamsN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943289-eab98a00-2d25-11eb-8721-182d00da88ac.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Estimation&lt;/code&gt; module might not know the difference, but the sensor path is being totally bypassed. At some point, once we're content with the performance of our aircraft control in the simulator, we would like to stress more of the system. The next step in the progression was to take the X-Plane data and create messages that conformed to the protocols of our sensors, in this case a &lt;a href="https://www.vectornav.com/products/VN-300"&gt;VectorNav VN-300 INS&lt;/a&gt; and a &lt;a href="https://www.terabee.com/shop/lidar-tof-range-finders/teraranger-evo-60m/"&gt;TeraRanger Evo 60m ToF (time of flight)&lt;/a&gt; sensor. Both of these sensors were connected via USB, so it was simple to spoof their presence. I replaced each of the sensors with a unique USB-to-Serial adapter, with chips such as the &lt;a href="https://www.sparkfun.com/products/15096"&gt;CH340C&lt;/a&gt;, &lt;a href="https://www.sparkfun.com/products/13263"&gt;FT231X&lt;/a&gt;, &lt;a href="https://www.adafruit.com/product/3309"&gt;CP2104&lt;/a&gt;, and &lt;a href="https://www.sparkfun.com/products/9873"&gt;FT232RL&lt;/a&gt; (pictured below). If you're not familiar with USB-to-Serial interfaces, check out one of those boards.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--b6uysSeY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311141-f0041800-2f63-11eb-97f2-444614fec62b.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--b6uysSeY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311141-f0041800-2f63-11eb-97f2-444614fec62b.jpg" width="30%"&gt;&lt;/a&gt;&lt;a href="https://www.adafruit.com/product/3309"&gt;1&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By attaching a jumper between the TX and RX pins on the adapter, it's possible to send a spoofed sensor message and then "receive" it on the same interface, which is connected to the simulation computer. This message can then be parsed by the sensor module and processed as though it came from a real sensor. The data path then turns into something like this:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OOcMY_4k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311157-f98d8000-2f63-11eb-8494-eaf1933406c4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OOcMY_4k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311157-f98d8000-2f63-11eb-8494-eaf1933406c4.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
Notice that essentially the entire sensor path is being stressed. The only difference between this and the setup on a real vehicle is that our "sensor" messages are coming from the computer instead of a real sensor. But the messages are exactly the same, and thus the bytes are getting parsed in the exact same way. The entire setup is very compact.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WSzSUG9W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311191-0f02aa00-2f64-11eb-897d-33f201b93809.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WSzSUG9W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311191-0f02aa00-2f64-11eb-897d-33f201b93809.jpg"&gt;&lt;/a&gt;&lt;a href="https://imgflip.com/i/4nw1gh"&gt;2&lt;/a&gt;&lt;/p&gt;
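If you want to try the TX-to-RX jumper trick without any hardware at all, a POSIX pseudo-terminal behaves the same way: bytes written to one end come back out the other. A Python sketch (on a real adapter you'd open the serial device with a serial library instead):

```python
import os
import tty

# A pseudo-terminal stands in for the jumpered USB-to-Serial adapter:
# bytes written to one end come back out the other, like TX wired to RX.
master, slave = os.openpty()
tty.setraw(slave)  # raw mode: no echo, no newline translation

spoofed = b"$FAKE,1,2,3*00\r\n"  # stand-in for a spoofed sensor frame
os.write(master, spoofed)        # the fake "sensor" transmits...
received = b""
while len(received) < len(spoofed):
    received += os.read(slave, 1024)  # ...the autopilot side reads it back
print(received == spoofed)  # True
os.close(master)
os.close(slave)
```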

&lt;h3&gt;
  
  
  Don't believe me?
&lt;/h3&gt;

&lt;p&gt;Here's what it takes to integrate an RC receiver, VN-300, TeraRanger Evo 60m, and Xbee radio:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YVyt0yqP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311196-12963100-2f64-11eb-8dab-0fd6f66eb946.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YVyt0yqP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311196-12963100-2f64-11eb-8dab-0fd6f66eb946.jpeg" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The USB hub connects to the computer running X-Plane, and utilizes all of the data paths that a real vehicle would require except for the &lt;code&gt;Actuation&lt;/code&gt; module. But don't worry, we thought of that. Remember that at this stage the actuation messages (aileron, elevator, rudder, throttle, flaps) are being sent directly to X-Plane via UDP. But we'd really like to be driving servos, or at least pretending that we are, because the next step in our simulation is going to be running the code on representative hardware (Raspberry Pis). &lt;br&gt;Before I talk about the &lt;code&gt;Actuation&lt;/code&gt; solution, let's look at the final HIL (hardware-in-the-loop) setup. First, how about a picture that includes a lot of stuff that I won't explain? 😀 What's important is that each of those monitors in the bottom right corner is attached to a Raspberry Pi, and the Pis are connected to the Ethernet switch on the left, and thus connected to each other.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r6AgSyYq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311211-17f37b80-2f64-11eb-9a57-75ed387b1572.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r6AgSyYq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100311211-17f37b80-2f64-11eb-9a57-75ed387b1572.JPG" width="100%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that you're thoroughly overwhelmed, let's zoom in on the part that's relevant for this conversation. The configuration shown here includes 6 peripherals, some simulated and some real (the actual hardware is present), that are connected to the two Raspberry Pi nodes (off-screen to the right):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;VN-300 (simulated)&lt;/li&gt;
&lt;li&gt;TeraRanger Evo 60m (simulated)&lt;/li&gt;
&lt;li&gt;Xbee 900 MHz radio (real)&lt;/li&gt;
&lt;li&gt;Frsky 2.4 GHz receiver (real)&lt;/li&gt;
&lt;li&gt;Pololu Servo Controller 1 (real)&lt;/li&gt;
&lt;li&gt;Pololu Servo Controller 2 (real)
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nXYHhBTE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100369804-3be1ac00-2fba-11eb-8a28-8c39a4eba8d4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nXYHhBTE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100369804-3be1ac00-2fba-11eb-8a28-8c39a4eba8d4.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;
Instead of looping the sensor data back into the same USB-to-Serial adapter as with the pure simulator setup, each adapter is connected to an identical version, with the TX and RX wires reversed.&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Sim RX -&amp;gt; Node TX&lt;br&gt;&lt;br&gt;
Sim TX -&amp;gt; Node RX&lt;br&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This means the output of the simulator becomes the input to the Raspberry Pi node, i.e., the simulator is creating spoofed sensor data that is being read by the Raspberry Pis. In turn, the Pis are creating real servo commands, generated by the ungainly MUX stack at the bottom of the photo (see post #4 for more details about that). And FINALLY we can talk about the &lt;code&gt;Actuation&lt;/code&gt; module. Attached to the final servo MUX is a Feather M0 board. Its job is to read the PWM outputs coming from the MUX and create a serial message that will be sent back to the simulator. This message will be read and converted to the same X-Plane control message we were using earlier to drive the ailerons, elevator, rudder, throttle, and flaps. WHICH MEANS, the &lt;code&gt;Actuation&lt;/code&gt; module is being fully stressed, as it is driving real servos (or so it thinks), and the output from those servos is being used by X-Plane to control the simulated vehicle.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UXfFlmi5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100370892-d4c4f700-2fbb-11eb-870a-e29f60bd1b7c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UXfFlmi5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/100370892-d4c4f700-2fbb-11eb-870a-e29f60bd1b7c.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  In SIMpler Terms
&lt;/h2&gt;

&lt;p&gt;So what's the point? Before I ever put this autopilot in an actual aircraft, the entire software and hardware stack had been stressed, and countless bugs were found that otherwise would have been uncovered during flight test. I was able to test hardware failure scenarios ("what happens when I unplug this sensor?") and watch how the system responded (it needed some tweaks). I could also test my ground control station software using telemetry coming from actual radios (remember last time when I said I would talk about the GCS? Well, that's gonna have to wait). The HIL proved to be a tremendous resource, and is something I plan to keep updating as the design progresses.&lt;br&gt;&lt;br&gt;&lt;br&gt;
At this point you probably don't believe me when I suggest what we'll talk about next time, so I won't betray your trust further. But I'm leaning towards Dubins paths. I know, how did you get so lucky?? 😀&lt;br&gt;&lt;br&gt;&lt;br&gt;
-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>simulation</category>
      <category>uav</category>
    </item>
    <item>
      <title>Part 5a: "X" Marks the Spot</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 23 Nov 2020 17:05:03 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-5a-x-marks-the-spot-42oo</link>
      <guid>https://dev.to/greghgradwell/part-5a-x-marks-the-spot-42oo</guid>
      <description>&lt;p&gt;Today we're going to talk about something near and dear to my heart: &lt;strong&gt;staying inside.&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
At several points in my career I have been involved with flight test programs of varying levels of sophistication and technical complexity. The biggest drawback for me was that the aircraft usually had to fly outdoors. I thought engineering was a civilized activity, but apparently there are some parts that require what basically amounts to camping in the jungle. Fortunately for those of us that shudder at the thought of being too far from the fridge, there is a great tool we can leverage to delay those perilous journeys for as long as possible: &lt;em&gt;computer simulation&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Simulation is a tool in your arsenal that can be very useful in the right situation. In case you tend towards physical prototypes for the purpose of testing (as was often my strategy), allow me to offer a few thoughts on the subject:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The more often you'll be testing your system, the more you'll want a simulator. The iterations are MUCH faster.&lt;/li&gt;
&lt;li&gt;Replicating failure scenarios is much easier with a simulator.&lt;/li&gt;
&lt;li&gt;Simulators make it easier to spread testing responsibilities across your team, i.e., there's no excuse for Software to send it to Test before it's been simulated.&lt;/li&gt;
&lt;li&gt;Do you remember "&lt;a href="https://youtu.be/YmEKjqgEGpE?t=534"&gt;Pilot Wings&lt;/a&gt;" for Super NES? That game was so awesome.&lt;/li&gt;
&lt;li&gt;A simulation is only as good as the model of your system.&lt;/li&gt;
&lt;li&gt;If you feel like making your own simulator, maybe don't? And yeah, now you think this is some sort of cool challenge. But it's not, I promise. You've got way better uses of your time.&lt;/li&gt;
&lt;li&gt;At some point, no matter how good your simulator is, you will have to test your system in the real world.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The first time I attempted a fixed-wing autopilot, I used Matlab to simulate all of my path-planning algorithms. This was a huge help, and you'd think it would have dawned on me that I should test the rest of my software in a similar manner before putting it on a vehicle. Unfortunately I was still pretty new to software development, and since I didn't have any formal education on the subject, I really didn't know any better. So instead, I tested all of my low-level code on an actual vehicle. I at least had the sense to start with an &lt;a href="https://youtu.be/ZBo-xgQBn4E"&gt;RC car&lt;/a&gt;, thereby eliminating that pesky third dimension. However, I was constantly discovering bugs related to parts of the code that were being reached for the first time (oh, yeah, I also knew nothing about unit tests 😵). So the overall test program was certainly much slower than it needed to be, although in the end I did succeed in creating a system capable of &lt;a href="https://youtu.be/7-N5IQFGF_I"&gt;fully-autonomous flight&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Two rodeos is still &lt;a href="https://twitter.com/simoncholland/status/722404063678226432"&gt;a very low number of rodeos&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;So here we are. My second attempt at an autopilot from the ground up, and thus my second attempt at testing an autopilot. One thing going for me is that I no longer live near the airport and taxiway that had proven so crucial to my tests in the past. This really forced me to find a way to simulate my testing environment. Another piece of luck is that I happened to talk to my brother shortly before I was ready to start testing. He mentioned all the cool stuff he was doing with the flight simulator &lt;a href="https://www.x-plane.com/"&gt;X-Plane&lt;/a&gt; (he is a pilot in real life) and how capable it was in terms of third-party expansions. After some quick research it was clear that this was the perfect tool for testing my autopilot at home. &lt;strong&gt;AT HOME.&lt;/strong&gt; The key to X-Plane's usefulness as an external simulator is its UDP data interface. &lt;/p&gt;

&lt;h2&gt;
  
  
  UDP: Yeah You Know Me
&lt;/h2&gt;

&lt;p&gt;Do you like sending messages? Do you hate being left on "read"? Then UDP might be for you. If you don't know what the UDP protocol is, I will let you look it up, because if I tried to explain it here you would quickly realize that I don't know what it is either. But here's what you can do with it. There is a long list of &lt;a href="https://www.x-plane.com/kb/data-set-output-table/"&gt;data messages&lt;/a&gt; that X-Plane will send via UDP to an IP address and port that you specify. These include handy pieces of information like roll/pitch/yaw (attitude), or latitude/longitude/altitude (LLA). Likewise you can send messages &lt;strong&gt;&lt;em&gt;to&lt;/em&gt;&lt;/strong&gt; X-Plane, such as the vehicle control inputs (aileron, elevator, rudder, throttle, flaps). This means that I could have the autopilot running on the same network as X-Plane (doesn't have to be the same computer) and the two could seamlessly interact. What's even better is that with the addition of a few interfaces, the autopilot code had no idea that it was talking to a simulator rather than being installed in a vehicle. In other words, the code isn't littered with &lt;code&gt;if (sim==true)&lt;/code&gt; statements. So I was able to stress every code path that would be touched during a real flight.&lt;br&gt;Hopefully there is something within this last paragraph that interests you, because we'll be talking about this for a while...&lt;/p&gt;
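For reference, X-Plane's legacy DATA datagram is a 5-byte header ("DATA" plus one internal-use byte) followed by 36-byte records: a little-endian int32 group index and eight float32 values, with -999 marking fields to leave untouched. A sketch of building one in Python (the group index for control inputs varies between X-Plane versions, so treat `CONTROL_INDEX` as a placeholder):

```python
import struct

XPLANE_UNUSED = -999.0   # X-Plane's "leave this field alone" sentinel
CONTROL_INDEX = 8        # placeholder: look up the control-input group
                         # number in your X-Plane version's data set table

def build_data_packet(index, values):
    """Build a legacy X-Plane DATA datagram: b'DATA' plus one internal-use
    byte, then a 36-byte record (little-endian int32 index + 8 float32s)."""
    vals = (list(values) + [XPLANE_UNUSED] * 8)[:8]
    return b"DATA\x00" + struct.pack("<i8f", index, *vals)

# e.g. aileron/elevator/rudder deflections; unused slots stay -999
pkt = build_data_packet(CONTROL_INDEX, [0.05, -0.02, 0.0])
# sock.sendto(pkt, ("127.0.0.1", 49000))  # 49000 is X-Plane's default port
print(len(pkt))  # 41: 5-byte header + one 36-byte record
```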

&lt;h2&gt;
  
  
  Step 1 - Let's Cheat
&lt;/h2&gt;


&lt;p&gt;X-Plane messages were integrated in steps. Eventually I will demonstrate how data messages were processed and converted into fake sensor messages, and then read by real hardware. But that is several steps away. I started with a process that resembled &lt;a href="https://www.amazon.com/Test-Driven-Development-Kent-Beck/dp/0321146530"&gt;Test-driven Development&lt;/a&gt;, in the sense that I inserted a TON of duplication, and then slowly removed it. To understand what I mean by duplication, let's look at the data path for a typical estimation sensor (IMU, INS, LiDAR, etc):&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--I-muFFJO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943278-e7be9980-2d25-11eb-80ac-22d833d00329.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--I-muFFJO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943278-e7be9980-2d25-11eb-80ac-22d833d00329.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;&lt;br&gt;
The sensor is attached to a physical interface, which receives its serial data, parses it, and then sends an Elixir message to the &lt;code&gt;Estimation&lt;/code&gt; module. In order to fully simulate a sensor, I would need to get the data from X-Plane to travel this full path (and like I said, eventually we will). But before we do all that work, wouldn't it be nice to see if this whole X-Plane thing will even do what we want? Let's just go straight from X-Plane to &lt;code&gt;Estimation&lt;/code&gt; like this:&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tKsqamsN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943289-eab98a00-2d25-11eb-8721-182d00da88ac.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tKsqamsN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943289-eab98a00-2d25-11eb-8721-182d00da88ac.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;&lt;br&gt;
So much easier! In order to test the autopilot I needed the following information:

&lt;ul&gt;
&lt;li&gt;Body rates (3-axis gyroscope output)&lt;/li&gt;
&lt;li&gt;Attitude (roll/pitch/yaw)&lt;/li&gt;
&lt;li&gt;Velocity (North/East/Down)&lt;/li&gt;
&lt;li&gt;Position (Latitude/Longitude/Altitude)&lt;/li&gt;
&lt;li&gt;AGL (Above ground level, e.g., laser altimeter)&lt;/li&gt;
&lt;/ul&gt;


&lt;p&gt;This data was available in X-Plane through messages #3, 16, 17, 20, and 21.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c6P5yZpy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943183-be9e0900-2d25-11eb-92ea-a8bb12da75a5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c6P5yZpy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99943183-be9e0900-2d25-11eb-92ea-a8bb12da75a5.png"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;&lt;br&gt;
The hardest part was parsing the UDP messages. Here are a couple of resources I used to figure this out: &lt;br&gt;&lt;a href="http://jefflewis.net/XPlaneUDP_9.html"&gt;http://jefflewis.net/XPlaneUDP_9.html&lt;/a&gt;&lt;br&gt;&lt;a href="http://www.nuclearprojects.com/xplane/info"&gt;http://www.nuclearprojects.com/xplane/info&lt;/a&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
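Based on those references, the receiving side mostly comes down to slicing the datagram into 36-byte records. A Python sketch (group numbers differ between X-Plane versions, so check your own data set table):

```python
import struct

def parse_data_packet(datagram):
    """Split an X-Plane DATA datagram into {group_index: [8 floats]}:
    a 5-byte header followed by 36-byte records, per the links above."""
    if datagram[:4] != b"DATA":
        raise ValueError("not a DATA datagram")
    records = {}
    body = datagram[5:]
    for offset in range(0, len(body) - 35, 36):
        index, *values = struct.unpack_from("<i8f", body, offset)
        records[index] = values
    return records

# Round-trip a fake attitude record (group 17 carried pitch/roll/heading
# in the version I used; verify against your own data set table):
fake = b"DATA\x00" + struct.pack("<i8f", 17, 1.5, -0.5, 270.0, *([-999.0] * 5))
print(parse_data_packet(fake)[17][:3])  # [1.5, -0.5, 270.0]
```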
Once I had the data parsed, I sent Elixir-style messages straight to the Estimation module, where it was processed as though it had come from a sensor module. Conversely, instead of sending actuation outputs to servos, I created UDP messages to send to X-Plane. Once I got those right, X-Plane treated my aileron/elevator/throttle/rudder servo commands as though they came from a pilot sitting inside the aircraft.&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
If this all sounds suspiciously simple, IT IS! I cannot stress enough how ridiculously easy (in the grand scheme of things) it was to integrate X-Plane into my autopilot pipeline. I realize that I skipped over all the hard parts, and instead offered you some hand-waving similar to my college calculus classes that went from &lt;em&gt;"x=1, solve for x"&lt;/em&gt; to &lt;strong&gt;"x=y+3. Why were you born?"&lt;/strong&gt; in the span of a single lecture. But that's because I figure most of you aren't going to actually try this at home. If you do, please drop me a line and I'd be happy to go into more detail. And if you think this was super boring, then you'll hate the next post, because it's even more of this. I will talk about how I simulated sensor data at a low level, and also demonstrate the super basic ground station GUI that I created using Scenic. But I promise, once we're out of Sim City, we'll get back to all of my failures that happened in the real world.&lt;br&gt;&lt;br&gt;&lt;br&gt;&lt;br&gt;
-Greg

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>uav</category>
      <category>xplane</category>
    </item>
    <item>
      <title>Part 4: Attested Development</title>
      <dc:creator>Greg Gradwell</dc:creator>
      <pubDate>Mon, 23 Nov 2020 17:02:38 +0000</pubDate>
      <link>https://dev.to/greghgradwell/part-4-attested-development-b2o</link>
      <guid>https://dev.to/greghgradwell/part-4-attested-development-b2o</guid>
      <description>&lt;p&gt;Do you ever look back on your life and think: &lt;/p&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wFQEyUQO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/99624137-82477180-29e2-11eb-9256-02db313af921.gif"&gt;&lt;a href="https://giphy.com/gifs/11BAxHG7paxJcI"&gt;1&lt;/a&gt;

&lt;p&gt;Great! Me too. And even though I'm constantly convinced I've made it to the other side of &lt;em&gt;senseless&lt;/em&gt;, I continue to provide examples of the contrary. Today I figured we could talk about one such instance involving our good friend, the &lt;strong&gt;&lt;em&gt;multiplexor&lt;/em&gt;&lt;/strong&gt;.&lt;br&gt;&lt;/p&gt;

&lt;p&gt;As I'm sure you remember from Part 2, a multiplexor takes input signals from two different sources, and based on some criterion passes only one of them on. This is a very useful tool, as it allows for a direct link between the RC receiver and the servos in the vehicle. If things go south with the autopilot, I can simply switch to manual control, which gives me that extra bit of comfort as I manually pilot the airplane directly into the ground due to poor decision making.&lt;br&gt;&lt;br&gt;
Things got a little more complicated when I added a second node to the autopilot. Before the servo signal reaches the final multiplexor (which I'm tired of typing, so we'll call it a MUX from now on), there must first be a decision as to which of the two node's signals will be used. If only there were a way to choose between two input signals...That's right! More multiplexors. I mean MUXs. MUXes? MUXi? So basically this autopilot is just turtles all the way down. Perhaps a diagram will help. Let's start with the single-node configuration (created using &lt;a href="https://app.diagrams.net/"&gt;https://app.diagrams.net/&lt;/a&gt;, formerly &lt;a href="http://draw.io"&gt;http://draw.io&lt;/a&gt;)&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--on9zt5uW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624422-24675980-29e3-11eb-864b-4c78ef632a87.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--on9zt5uW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624422-24675980-29e3-11eb-864b-4c78ef632a87.png" width="70%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;In case the acronym &lt;em&gt;PWM&lt;/em&gt; doesn't ring a bell, it stands for "&lt;a href="https://en.wikipedia.org/wiki/Servo_control"&gt;Pulse-width modulation&lt;/a&gt;" and is the type of signal that a servo expects to receive. The fact that I'm sending PWM signals to the MUX is important to note now, because that will change later.&lt;br&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So the aircraft servos are connected to the MUX which is either forwarding autopilot servo commands or RC receiver servo commands. This is pretty simple, and it led to the following hardware stack: &lt;/p&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fgY5SJQ6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624490-3cd77400-29e3-11eb-961b-c0dcb11fc4b8.JPG" width="50%"&gt;

&lt;p&gt;The RC receiver is on top. Below it is an &lt;a href="https://www.adafruit.com/product/2772"&gt;Adafruit Feather M0 Basic Proto&lt;/a&gt; board, loaded with a program that takes serial messages from the autopilot and converts them to PWM signals. In the bottom right is a &lt;a href="https://www.pololu.com/product/2806"&gt;4-channel MUX&lt;/a&gt; from Pololu. You may notice there are two yellow/red/black servo connectors coming out of the receiver and exiting to the right (stage left). These were connected directly to the aircraft flaps and landing gear, as this MUX does not have enough channels to support them.&lt;br&gt;&lt;br&gt;
This setup isn't too bad. The wiring is pretty much as minimal as it could get. But when we expand this design concept to the dual-node configuration, things get ugly. Let's look at the diagram:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Fp8NgWbu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624548-524c9e00-29e3-11eb-9d59-8e3d0c523cf7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Fp8NgWbu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624548-524c9e00-29e3-11eb-9d59-8e3d0c523cf7.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;YIKES. My guess is you didn't bother to look at it closely, and honestly neither would I. But you at least noticed all the extra lines and stuff, right? So why are they necessary? Well, each of the autopilot nodes has two responsibilities:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Act as primary source for approximately one half of the servos&lt;/li&gt;
&lt;li&gt; Act as secondary source for the rest of the servos&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is the reason there are two MUXes before we even get to the final Boss. MUX 1 and MUX 2 each supply roughly half of the servo outputs, which are combined into a complete input for MUX 3. Perhaps a few scenarios will help. Let's assume that MUX 1 is responsible for channels 1-3, and MUX 2 is responsible for channels 4-6.&lt;/p&gt;

&lt;h3&gt;
  
  
  Scenario 1: Normal Operation
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;MUX 1: uses &lt;strong&gt;primary&lt;/strong&gt; inputs, coming from Autopilot Node 1&lt;/li&gt;
&lt;li&gt;MUX 2: uses &lt;strong&gt;primary&lt;/strong&gt; inputs, coming from Autopilot Node 2&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Scenario 2: Autopilot Node 1 is dead
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;MUX 1: uses &lt;strong&gt;secondary&lt;/strong&gt; inputs, coming from Autopilot Node 2&lt;/li&gt;
&lt;li&gt;MUX 2: uses &lt;strong&gt;primary&lt;/strong&gt; inputs, coming from Autopilot Node 2&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Scenario 3: Autopilot Node 2 is dead
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;MUX 1: uses &lt;strong&gt;primary&lt;/strong&gt; inputs, coming from Autopilot Node 1&lt;/li&gt;
&lt;li&gt;MUX 2: uses &lt;strong&gt;secondary&lt;/strong&gt; inputs, coming from Autopilot Node 1&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So as long as one of the autopilot nodes is healthy and producing servo commands, all of the servos will be covered. But still, this is a pretty dense solution. Steve Holt!&lt;/p&gt;
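&lt;p&gt;The three scenarios boil down to a simple per-MUX rule: forward the primary input while its node is alive, otherwise fall back to the secondary. A toy sketch of that rule (the function, its health-flag interface, and the "none" case are mine, purely for illustration):&lt;/p&gt;

```python
def select_source(primary_alive, secondary_alive):
    """Return which input a MUX should forward: the primary wins
    as long as the node feeding it is healthy."""
    if primary_alive:
        return "primary"
    if secondary_alive:
        return "secondary"
    return "none"  # both nodes dead: no valid servo commands at all

# Scenario 2: Autopilot Node 1 is dead.
# MUX 1's primary comes from Node 1; its secondary comes from Node 2.
node1_alive, node2_alive = False, True
mux1 = select_source(primary_alive=node1_alive, secondary_alive=node2_alive)
mux2 = select_source(primary_alive=node2_alive, secondary_alive=node1_alive)
```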
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_oAuUKxG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624609-67293180-29e3-11eb-8f68-789820de860d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_oAuUKxG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624609-67293180-29e3-11eb-8f68-789820de860d.jpg" width="80%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;In this example, servo control is coming from Pololu &lt;a href="https://www.pololu.com/product/1350"&gt;Maestro Servo Controllers&lt;/a&gt;. This setup was actually created before the single-node setup mentioned earlier, so the chronology is a bit out of order here. The next revision we see will again feature the Feather M0. And if you're looking for the RC receiver, it's hiding under the monster &lt;a href="https://acroname.com/store/s56-rxmux-1"&gt;8-channel MUX&lt;/a&gt; from Acroname.&lt;br&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Where did we go wrong?
&lt;/h2&gt;

&lt;p&gt;How did the addition of just one more node turn this solution into such a hardware nightmare? Allow me to highlight the assumptions that were carried from the single-node setup to the dual-node setup.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VOc085i5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/99624657-78723e00-29e3-11eb-9182-ad4106834ad4.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VOc085i5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://user-images.githubusercontent.com/2257561/99624657-78723e00-29e3-11eb-9182-ad4106834ad4.gif"&gt;&lt;/a&gt;&lt;a href="https://i.pinimg.com/originals/58/03/1b/58031bef9a023a631b17e34884a7e18c.jpg"&gt;2&lt;/a&gt;&lt;/p&gt;
&lt;br&gt;
The requirements had changed, but I did not revisit any of the decisions that went into the design of the single-MUX configuration. I just took that concept and made it bigger. The two most impactful assumptions were:&lt;br&gt;
&lt;/blockquote&gt;

&lt;ol&gt;
&lt;li&gt; I need to use a dedicated servo controller to produce PWM signals&lt;/li&gt;
&lt;li&gt; I need to use an off-the-shelf MUX rather than create my own&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Assumption #2 was somewhat of a byproduct of #1. If I'm using a separate PWM controller, then my signals will be PWM, and therefore an off-the-shelf MUX makes total sense. But what if I could use a microcontroller to generate PWM signals? Could I also use that board to act as a MUX? And if my MUX is on a microcontroller, do I need to send PWM signals to it, or can I use something more versatile, like UART? Great questions. Here's the updated diagram with the answer:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--myIAgaKq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624704-8de76800-29e3-11eb-9408-b98c7213e50f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--myIAgaKq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624704-8de76800-29e3-11eb-9408-b98c7213e50f.png"&gt;&lt;/a&gt;&lt;br&gt;Dual-MUX with Arduino&lt;/p&gt;
&lt;br&gt;
As you can see, things got a lot simpler. The &lt;em&gt;Servo Controller/MUX&lt;/em&gt; is a Feather M0 board that takes two serial inputs — one from the Autopilot node (primary), and one from the other Servo Controller/MUX (secondary) — and selects the appropriate one. At this point, it converts these serial commands into PWM signals, which are sent along to Boss MUX 3. And if this much sleeker diagram isn't enough to get you excited, then check out what this means for the hardware. There are way fewer wires, and it takes up much less space.&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pawclwuo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624761-ad7e9080-29e3-11eb-8067-edc972e06436.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pawclwuo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624761-ad7e9080-29e3-11eb-8067-edc972e06436.JPG" width="40%"&gt;&lt;/a&gt;    &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UVGZTD3X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624786-bcfdd980-29e3-11eb-8c5a-86a88e56708e.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UVGZTD3X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624786-bcfdd980-29e3-11eb-8c5a-86a88e56708e.JPG" width="40%"&gt;&lt;/a&gt;&lt;br&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z2R5FtIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624805-c8510500-29e3-11eb-9657-3d393434a4b8.jpeg" class="article-body-image-wrapper"&gt;&lt;img 
src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z2R5FtIt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/2257561/99624805-c8510500-29e3-11eb-9657-3d393434a4b8.jpeg" width="50%"&gt;&lt;/a&gt;&lt;/p&gt;
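&lt;p&gt;One way a Servo Controller/MUX like this could choose between its two serial inputs is by message freshness: keep forwarding the primary stream while frames are arriving on time, and fall back to the secondary once the primary goes quiet. Here's a sketch of that timeout logic &amp;mdash; the 200 ms threshold and the whole structure are my assumptions, not the project's actual firmware:&lt;/p&gt;

```python
PRIMARY_TIMEOUT_MS = 200  # assumed: fall back if primary is silent this long

class SerialMux:
    """Tracks when a valid frame last arrived on each serial input
    and forwards the primary unless it has gone stale."""

    def __init__(self):
        self.last_primary_ms = None
        self.last_secondary_ms = None

    def frame_received(self, source, now_ms):
        # Record the arrival time of a valid frame on one input.
        if source == "primary":
            self.last_primary_ms = now_ms
        else:
            self.last_secondary_ms = now_ms

    def active_source(self, now_ms):
        # Prefer the primary while it is fresh; otherwise use the
        # secondary if we have ever heard from it.
        if (self.last_primary_ms is not None
                and now_ms - self.last_primary_ms < PRIMARY_TIMEOUT_MS):
            return "primary"
        if self.last_secondary_ms is not None:
            return "secondary"
        return "none"

mux = SerialMux()
mux.frame_received("primary", now_ms=0)
mux.frame_received("secondary", now_ms=10)
```

On a real board, `now_ms` would come from something like the microcontroller's millisecond tick counter, and "valid frame" would mean a serial message that passed its checksum.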


&lt;h2&gt;
  
  
  What's the point?
&lt;/h2&gt;

&lt;p&gt;Oh, you wanted a moral to this story? 😰 Let's start with what I do know. I lived with that bulky dual-MUX setup &lt;strong&gt;for weeks&lt;/strong&gt;. The moment I had the idea to make my own MUX, the rest of the design fell into place, and the hardware became a lot smaller (yes, I realize it's taller, but imagine the airplane like it's a moving truck &amp;mdash; floor space is at a premium). In my gut I knew that the original solution just wasn't cutting it. I think I accepted the status quo because I figured I was using commercially available parts, so that must be as good as it gets. What's the lesson? Geez, I'm not exactly sure. But I think it has something to do with always leaving a note...&lt;br&gt;&lt;/p&gt;

&lt;p&gt;Come back next time and we'll discuss how I was able to test the software and hardware in a realistic simulation environment. I would X-plane further, but I don't want to give it away.😉&lt;br&gt;&lt;br&gt;&lt;br&gt;
-Greg&lt;/p&gt;

</description>
      <category>elixir</category>
      <category>autopilot</category>
      <category>mux</category>
      <category>hardware</category>
    </item>
  </channel>
</rss>
