<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Sharon Anderson</title>
    <description>The latest articles on DEV Community by Sharon Anderson (@sharonandersondev).</description>
    <link>https://dev.to/sharonandersondev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F732682%2F8a9bfbe0-fa50-4a45-8626-ee9da929e3af.jpeg</url>
      <title>DEV Community: Sharon Anderson</title>
      <link>https://dev.to/sharonandersondev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sharonandersondev"/>
    <language>en</language>
    <item>
      <title>Music Visualiser with Three.JS</title>
      <dc:creator>Sharon Anderson</dc:creator>
      <pubDate>Wed, 01 Dec 2021 11:02:12 +0000</pubDate>
      <link>https://dev.to/sharonandersondev/music-visualiser-with-threejs-49l3</link>
      <guid>https://dev.to/sharonandersondev/music-visualiser-with-threejs-49l3</guid>
      <description>&lt;p&gt;In an attempt to learn THREE.js — the 3D rendering WebGL framework and WebAudio API, I made something that visualises the music in a very simple way. This article documents the whole process.&lt;br&gt;
Final thing first:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/n3rkF0el0AQ"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;(Just use a .mp3 / .mp4 / .wav file to see it work. If you don’t have one handy, any short audio clip will do.)&lt;br&gt;
A Primer on the Web Audio API&lt;br&gt;
The HTML5 audio element, when combined with the Web Audio API, becomes quite powerful: it lets you process audio and add effects dynamically to any kind of source.&lt;/p&gt;

&lt;p&gt;The Web Audio API involves handling audio operations inside an audio context and has been designed to allow modular routing. Basic audio operations are performed with audio nodes, which are linked together to form an audio routing graph. Several sources — with different types of channel layouts — are supported even within a single context. This modular design provides the flexibility to create complex audio functions with dynamic effects.&lt;/p&gt;

&lt;p&gt;The audio pipeline starts by creating an audio context. It should have at least a single audio source — which can be thought of as an entry point for external files, mic input, oscillators, etc. Once we have a source in place, the signal is processed and moved ahead in the pipeline using audio nodes. After processing, the signal(s) are routed to the audio destination, which can only be a single one in the whole context.&lt;/p&gt;
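
&lt;p&gt;The pipeline described above can be sketched in a few lines. This is a minimal, browser-only sketch; function names such as buildAudioGraph and binToFrequency are mine, not from the original article:&lt;/p&gt;

```javascript
// Minimal Web Audio graph: one context, one source, one processing node
// (an AnalyserNode, handy for visualisers), and the single destination.
// Browser-only; audioElement is assumed to be an HTML5 audio element.
function buildAudioGraph(audioElement) {
  const ctx = new AudioContext();
  const source = ctx.createMediaElementSource(audioElement); // entry point
  const analyser = ctx.createAnalyser();                     // processing node
  analyser.fftSize = 2048;
  source.connect(analyser);          // source -> node
  analyser.connect(ctx.destination); // node -> the one destination
  return { ctx, analyser };
}

// Pure helper: the frequency (in Hz) represented by FFT bin i is
// i * sampleRate / fftSize.
function binToFrequency(i, sampleRate, fftSize) {
  return (i * sampleRate) / fftSize;
}
```

&lt;p&gt;With a 44100 Hz context and an fftSize of 2048, bin 1024 corresponds to 22050 Hz, the Nyquist frequency.&lt;/p&gt;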

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--72l_IWBs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l58qub21ma7j23xtczpc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--72l_IWBs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l58qub21ma7j23xtczpc.png" alt="Image description" width="422" height="310"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Modular Routing&lt;/p&gt;

&lt;p&gt;The simplest illustration has a single source connected directly to the destination inside the context, with no effects or processing in between. Why would anyone use this? Sometimes you just want to play a sound without any changes.&lt;/p&gt;
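
&lt;p&gt;In code, that simplest graph is a single connect call. A browser-only sketch; the function name is illustrative:&lt;/p&gt;

```javascript
// Source wired straight to the destination: the sound plays unchanged,
// with no processing nodes in between.
function playUnprocessed(audioElement) {
  const ctx = new AudioContext();
  ctx.createMediaElementSource(audioElement).connect(ctx.destination);
  return ctx;
}
```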

&lt;p&gt;Much more complex setups can also be built with this API.&lt;/p&gt;


&lt;p&gt;Read the complete article here: &lt;a href="https://www.epicprogrammer.com/2021/11/music-visualiser-with-threejs-web-audio.html"&gt;https://www.epicprogrammer.com/2021/11/music-visualiser-with-threejs-web-audio.html&lt;/a&gt;&lt;br&gt;
Content inspired by &lt;a href="//youtube.com/c/epicprogrammer?sub_confirmation=1"&gt;Epic Programmer&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>tutorial</category>
      <category>programming</category>
    </item>
    <item>
      <title>Infinite Lights</title>
      <dc:creator>Sharon Anderson</dc:creator>
      <pubDate>Sun, 07 Nov 2021 11:17:39 +0000</pubDate>
      <link>https://dev.to/sharonandersondev/infinite-lights-44l4</link>
      <guid>https://dev.to/sharonandersondev/infinite-lights-44l4</guid>
      <description>&lt;p&gt;In this article, we’ll use Three.js and learn how to instantiate geometries to create thousands (even millions) of lights.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Preparing the road and camera&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To begin we’ll create a new "Road" class to encapsulate all the logic for our plane. It’s going to be a basic "PlaneBufferGeometry" with its height being the road’s length.&lt;/p&gt;

&lt;p&gt;We want this plane to be flat on the ground and go further away. But Three.js creates a vertical plane at the center of the scene. We’ll rotate it on the x-axis to make it flat on the ground (y-axis).&lt;/p&gt;

&lt;p&gt;We’ll also move it by half its length on the z-axis to position the start of the plane at the center of the scene.&lt;/p&gt;

&lt;p&gt;We’re moving it on the z-axis because translation is applied after rotation: we set the plane’s length along the y-axis, but after the rotation that length runs along the z-axis.&lt;/p&gt;
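
&lt;p&gt;Putting the three steps together, a hedged sketch of the Road setup (assuming three.js is available as the global THREE; the names, colors, and segment counts are illustrative, not from the article):&lt;/p&gt;

```javascript
// Rotate the default vertical plane flat onto the ground, then push it
// back by half its length so it starts at the center of the scene.
function createRoad(width, length) {
  const geometry = new THREE.PlaneBufferGeometry(width, length, 20, 200);
  const material = new THREE.MeshBasicMaterial({ color: 0x101012 });
  const mesh = new THREE.Mesh(geometry, material);
  mesh.rotation.x = -Math.PI / 2; // flat on the ground (rotation applied first)
  mesh.position.z = -length / 2;  // translation happens after the rotation
  return mesh;
}

// Pure helper showing why the length ends up on the z-axis:
// rotate a point about the x-axis by the given angle.
function rotateX(point, angle) {
  const c = Math.cos(angle);
  const s = Math.sin(angle);
  const x = point[0], y = point[1], z = point[2];
  return [x, c * y - s * z, s * y + c * z];
}
// The plane's far edge sits at (0, length, 0) before the rotation and at
// roughly (0, 0, -length) after rotating by -PI/2 about x.
```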

&lt;ul&gt;
&lt;li&gt;Instantiating the lights&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Although some lights are longer or thicker than others, they all share the same geometry. Instead of creating a separate mesh for each light, causing lots of draw calls, we can take advantage of instantiation.&lt;/p&gt;

&lt;p&gt;Instantiation is the equivalent of telling WebGL “Hey buddy, render this SAME geometry X amount of times”. This process reduces the number of draw calls to 1.&lt;/p&gt;

&lt;p&gt;Although it’s the same result, rendering X objects, the process is very different. Let’s compare it with buying 50 chocolates at a store:&lt;/p&gt;

&lt;p&gt;A draw call is the equivalent of going to the store, buying only one chocolate and then coming back. Then we repeat the process for all 50 chocolates. Paying for the chocolate (rendering) at the store is pretty fast, but going to the store and coming back (draw calls) takes a little bit of time. The more draw calls, the more trips to the store, the more time.&lt;/p&gt;

&lt;p&gt;With instantiation, we’re going to the store and buying all 50 chocolates and coming back. You still have to go and come back from the store (draw call) one time. But you saved up those 49 extra trips.&lt;/p&gt;

&lt;p&gt;A fun experiment to test this even further: Try to delete 50 different files from your computer, then try to delete just one file of equivalent size to all 50 combined. You’ll notice that even though it’s the same combined file size, the 50 files take more time to be deleted than the single file of equivalent size 😉&lt;/p&gt;

&lt;p&gt;Coming back to the code: to instantiate we’ll copy our tubeGeometry over to an InstancedBufferGeometry. Then we’ll tell it how many instances we’ll need. In our case, it’s going to be a number multiplied by 2 because we want two lights per “car”.&lt;/p&gt;

&lt;p&gt;Next we’ll have to use that instanced geometry to create our mesh.&lt;/p&gt;

&lt;p&gt;Although it looks the same, Three.js is now rendering 100 tubes in the same position. To move them to their respective positions we’ll use an InstancedBufferAttribute.&lt;/p&gt;

&lt;p&gt;While a regular BufferAttribute describes the base shape (for example, its position, uvs, and normals), an InstancedBufferAttribute describes each instance of the base shape. In our case, each instance is going to have a different aOffset and a different radius/length aMetrics.&lt;/p&gt;
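
&lt;p&gt;A hedged sketch of those two steps together (assuming three.js as the global THREE; makeOffsets and the road dimensions are my illustrations, not values from the article):&lt;/p&gt;

```javascript
// Copy the base tube geometry into an InstancedBufferGeometry, tell it how
// many instances to draw (two lights per "car"), and give each instance
// its own aOffset via an InstancedBufferAttribute.
function createInstancedLights(tubeGeometry, nPairs, material) {
  const instanced = new THREE.InstancedBufferGeometry().copy(tubeGeometry);
  const count = nPairs * 2;
  instanced.instanceCount = count;
  const offsets = new Float32Array(makeOffsets(count, 20, 400));
  instanced.setAttribute("aOffset", new THREE.InstancedBufferAttribute(offsets, 3, false));
  return new THREE.Mesh(instanced, material);
}

// Pure helper: one (x, y, z) offset per instance, spread across the road's
// width and along its length.
function makeOffsets(count, roadWidth, roadLength) {
  const offsets = [];
  for (let i = 0; i !== count; i++) {
    offsets.push((Math.random() - 0.5) * roadWidth); // x: across the road
    offsets.push(0);                                 // y: on the ground
    offsets.push(-Math.random() * roadLength);       // z: along the road
  }
  return offsets;
}
```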

&lt;p&gt;At render time, each instance passes through the vertex shader, and WebGL gives us the attribute values corresponding to that instance. We can then position each instance using those values.&lt;/p&gt;
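
&lt;p&gt;On the shader side, reading a per-instance value looks roughly like this (a simplified sketch, not the article’s actual shader):&lt;/p&gt;

```javascript
// Each instance runs this vertex shader with its own value of aOffset,
// so the shared tube geometry is shifted to a different spot per instance.
const vertexShader = `
  attribute vec3 aOffset;
  void main() {
    vec3 transformed = position + aOffset; // per-instance placement
    gl_Position = projectionMatrix * modelViewMatrix * vec4(transformed, 1.0);
  }
`;
```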

&lt;ul&gt;
&lt;li&gt;Positioning the lights&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We want to have two roads of lights coming from different directions. Let’s create the second TailLights and move each to their respective position. To center them both, we’ll move them by half the middle island’s width and half the road’s width.&lt;/p&gt;
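
&lt;p&gt;The centering math is simple. A sketch with illustrative numbers (the island and road widths here are assumptions, not values from the article):&lt;/p&gt;

```javascript
// Each road of lights moves away from the center by half the middle
// island's width plus half the road's width, in opposite directions.
function roadOffsets(islandWidth, roadWidth) {
  const shift = islandWidth / 2 + roadWidth / 2;
  return { leftLights: -shift, rightLights: shift };
}
```

&lt;p&gt;With a 2-unit island and 9-unit roads, each TailLights group moves 5.5 units from the center.&lt;/p&gt;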

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/hw_e5n1DqPo"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;This article was inspired by Epic Programmer; the original article is here: &lt;a href="https://www.epicprogrammer.com/2021/11/infinite-lights-with-threejs.html"&gt;https://www.epicprogrammer.com/2021/11/infinite-lights-with-threejs.html&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>programming</category>
    </item>
  </channel>
</rss>
