<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Vu Hung Nguyen (Hưng)</title>
    <description>The latest articles on DEV Community by Vu Hung Nguyen (Hưng) (@vuhung16au).</description>
    <link>https://dev.to/vuhung16au</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3232360%2F429cad34-b570-4f33-8b28-b971615831d0.png</url>
      <title>DEV Community: Vu Hung Nguyen (Hưng)</title>
      <link>https://dev.to/vuhung16au</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/vuhung16au"/>
    <language>en</language>
    <item>
      <title>The Mathematics and Engineering Behind 3D Heart Animations Synchronized with Music</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Sun, 16 Nov 2025 06:34:50 +0000</pubDate>
      <link>https://dev.to/vuhung16au/the-mathematics-and-engineering-behind-3d-heart-animations-synchronized-with-music-3dd1</link>
      <guid>https://dev.to/vuhung16au/the-mathematics-and-engineering-behind-3d-heart-animations-synchronized-with-music-3dd1</guid>
      <description>&lt;h1&gt;
  
  
  The Mathematics and Engineering Behind 3D Heart Animations Synchronized with Music
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;What happens when you combine parametric equations, audio signal processing, and real-time visualization? You get mesmerizing 3D heart animations that dance to music. This project started as a simple mathematical visualization and evolved into a comprehensive system for creating music-synchronized animations using Python, NumPy, Matplotlib, and librosa.&lt;/p&gt;

&lt;p&gt;In this deep-dive, I'll walk you through the technical journey of building this system, from the mathematical foundations of the 3D heart shape to the audio engineering that makes hearts pulse in perfect sync with music.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Demo Videos:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=twa5GX2UVf4" rel="noopener noreferrer"&gt;H9 Effect: Cuba to New Orleans&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=Jv5ItACRHC4" rel="noopener noreferrer"&gt;I2 Effect: Five Hearts&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=Canpma8k4UI" rel="noopener noreferrer"&gt;I3 Effect: Birthday Celebration&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The Key Ideas
&lt;/h2&gt;

&lt;p&gt;At its core, this project solves three main challenges:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Mathematical Modeling&lt;/strong&gt;: Creating a 3D heart shape using parametric equations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio Analysis&lt;/strong&gt;: Extracting musical features (beats, tempo, loudness, bass, onsets) from audio files&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Real-Time Synchronization&lt;/strong&gt;: Mapping audio features to visual transformations in real-time&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The magic happens when these three systems work together: each frame of the animation queries the audio features at that exact moment in time, creating a synchronized experience where the heart "feels" the music.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Mathematics: Parametric Heart Equations
&lt;/h2&gt;

&lt;p&gt;The 3D heart shape is defined by parametric equations using two parameters: &lt;code&gt;u&lt;/code&gt; ∈ [0, π] and &lt;code&gt;v&lt;/code&gt; ∈ [0, 2π].&lt;/p&gt;

&lt;h3&gt;
  
  
  The Core Equations
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;x = sin(u) · (15·sin(v) - 4·sin(3v))
y = 8·cos(u)
z = sin(u) · (15·cos(v) - 5·cos(2v) - 2·cos(3v) - cos(v))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These equations create a heart shape by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Using &lt;code&gt;sin(u)&lt;/code&gt; as a scaling factor that varies from 0 to 1 and back to 0&lt;/li&gt;
&lt;li&gt;Combining multiple harmonics of &lt;code&gt;sin(v)&lt;/code&gt; and &lt;code&gt;cos(v)&lt;/code&gt; to create the heart's characteristic curves&lt;/li&gt;
&lt;li&gt;Weighting these harmonics with the coefficients (15, -4, -5, -2, -1) to control the shape's proportions&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Implementation in Python
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Generate parameter grids
&lt;/span&gt;&lt;span class="n"&gt;u&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;linspace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;u_points&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;v&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;linspace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;v_points&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;u_grid&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;v_grid&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;meshgrid&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;u&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Flatten for scatter plot
&lt;/span&gt;&lt;span class="n"&gt;u_flat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;u_grid&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;flatten&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="n"&gt;v_flat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;v_grid&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;flatten&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# Apply parametric equations
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;u_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;15&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;u_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;u_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="mi"&gt;15&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt;
    &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v_flat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3D Rotation
&lt;/h3&gt;

&lt;p&gt;To rotate the heart around the Y-axis, we apply a rotation matrix:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;alpha_deg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;frame&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;tempo_factor&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;total_frames&lt;/span&gt;
&lt;span class="n"&gt;alpha_rad&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;deg2rad&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha_deg&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;x_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha_rad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha_rad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;y_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;  &lt;span class="c1"&gt;# Y-axis rotation doesn't affect Y coordinate
&lt;/span&gt;&lt;span class="n"&gt;z_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha_rad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha_rad&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;tempo_factor&lt;/code&gt; adapts rotation speed based on the music's tempo, making the heart rotate faster during upbeat sections.&lt;/p&gt;
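&lt;p&gt;The article doesn't spell out how &lt;code&gt;tempo_factor&lt;/code&gt; is computed, but a minimal sketch is to interpolate the local tempo at the current time and divide by a reference BPM. The helper name and the 120 BPM baseline below are assumptions for illustration, not the project's actual code:&lt;/p&gt;

```python
import numpy as np

def tempo_factor_at(current_time, tempo_times, tempo_values, reference_bpm=120.0):
    # Hypothetical helper: local tempo relative to a reference BPM.
    # 1.0 means "rotate at baseline speed"; 1.25 means 25% faster.
    local_tempo = np.interp(current_time, tempo_times, tempo_values)
    return float(local_tempo / reference_bpm)

# A track that accelerates from 120 to 150 BPM over a minute:
tempo_times = [0.0, 30.0, 60.0]
tempo_values = [120.0, 135.0, 150.0]
factor_start = tempo_factor_at(0.0, tempo_times, tempo_values)
factor_end = tempo_factor_at(60.0, tempo_times, tempo_values)
```

&lt;p&gt;With a mapping like this, the same rotation formula automatically speeds up and slows down with the song.&lt;/p&gt;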




&lt;h2&gt;
  
  
  The Sound Engineering: Audio Feature Extraction with Librosa
&lt;/h2&gt;

&lt;p&gt;Librosa is a Python library for music and audio analysis. It provides the tools we need to extract musical features that drive our visualizations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Audio Analysis Pipeline
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;analyze_audio.py&lt;/code&gt; script performs a comprehensive analysis:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Beat Detection&lt;/strong&gt;: Identifies the rhythmic pulse of the music&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tempo Tracking&lt;/strong&gt;: Calculates BPM and tracks tempo changes over time&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Onset Detection&lt;/strong&gt;: Finds moments when new sounds begin&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RMS Energy (Loudness)&lt;/strong&gt;: Measures overall volume over time&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bass Analysis&lt;/strong&gt;: Extracts low-frequency energy&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Beat Detection
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;tempo&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;beat_frames&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;beat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;beat_track&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;beat_times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;frames_to_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;beat_frames&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This gives us a list of timestamps where beats occur, which we use to make the heart pulse.&lt;/p&gt;

&lt;h3&gt;
  
  
  Dynamic Tempo Tracking
&lt;/h3&gt;

&lt;p&gt;For longer pieces with tempo changes, we track tempo over time:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;tempo_times&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;tempo_values&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;beat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tempo&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="n"&gt;aggregate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;median&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;hop_length&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;512&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a time series of tempo values, allowing the heart's rotation speed to adapt to tempo changes throughout the song.&lt;/p&gt;

&lt;h3&gt;
  
  
  RMS Energy (Loudness)
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;rms&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;feature&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;rms&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;rms_times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;frames_to_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rms&lt;/span&gt;&lt;span class="p"&gt;)),&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# Normalize to 0-1 range
&lt;/span&gt;&lt;span class="n"&gt;rms_normalized&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rms&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;rms&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rms&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;rms&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;RMS energy measures the "loudness" at each moment. We use it to control camera zoom: louder sections bring the camera closer for a more intimate view.&lt;/p&gt;
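&lt;p&gt;A minimal sketch of that loudness-to-zoom mapping, assuming the normalized RMS series from above; the &lt;code&gt;near&lt;/code&gt;/&lt;code&gt;far&lt;/code&gt; distances and the helper name are illustrative, not the project's exact values:&lt;/p&gt;

```python
import numpy as np

def camera_distance(current_time, rms_times, rms_values, far=30.0, near=18.0):
    # Interpolate normalized loudness (0..1) at this moment, then map it
    # linearly onto a distance: silence stays far away, loud peaks zoom in.
    loudness = np.interp(current_time, rms_times, rms_values)
    return far - (far - near) * loudness

rms_times = [0.0, 1.0, 2.0]
rms_values = [0.0, 1.0, 0.5]   # normalized RMS
d_quiet = camera_distance(0.0, rms_times, rms_values)
d_loud = camera_distance(1.0, rms_times, rms_values)
```

&lt;p&gt;In a Matplotlib 3D scene, a distance like this could drive the axis limits each frame to emulate a camera dolly.&lt;/p&gt;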

&lt;h3&gt;
  
  
  Bass Extraction
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Extract bass frequencies (typically 20-250 Hz)
&lt;/span&gt;&lt;span class="n"&gt;S&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stft&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;bass_freqs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;where&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;freqs&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;freqs&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="mi"&gt;250&lt;/span&gt;&lt;span class="p"&gt;))[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;bass_energy&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;abs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;S&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;bass_freqs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;:]),&lt;/span&gt; &lt;span class="n"&gt;axis&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Bass frequencies create the "thump" in music. We use bass strength to control the heart's brightness: more bass means a brighter, more glowing heart.&lt;/p&gt;
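&lt;p&gt;A sketch of how raw bass energy can become a brightness (alpha) value, assuming the same normalize-then-interpolate pattern used for RMS; the helper name and the 0.5/0.4 constants are illustrative:&lt;/p&gt;

```python
import numpy as np

def bass_brightness(current_time, bass_times, bass_energy, base=0.5, span=0.4):
    # Normalize raw bass energy to 0..1, interpolate at this moment,
    # then map into an alpha range: more bass means a brighter heart.
    e = np.asarray(bass_energy, dtype=float)
    normalized = (e - e.min()) / (e.max() - e.min() + 1e-9)
    bass = np.interp(current_time, bass_times, normalized)
    return base + span * bass

bass_times = [0.0, 0.5, 1.0]
bass_energy = [10.0, 50.0, 30.0]
alpha_peak = bass_brightness(0.5, bass_times, bass_energy)  # brightest at the bass hit
```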

&lt;h3&gt;
  
  
  Onset Detection
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;onset_frames&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;onset&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;onset_detect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;units&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;frames&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;onset_times&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;librosa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;frames_to_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;onset_frames&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;sr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Onsets mark the beginning of new musical events. Strong onsets trigger additional heartbeat pulses, creating visual emphasis on musical accents.&lt;/p&gt;
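&lt;p&gt;One way to model those extra pulses is a spike that decays exponentially after each onset. The decay constant and helper name below are assumptions for illustration; the project's effects may shape accents differently:&lt;/p&gt;

```python
import numpy as np

def onset_pulse(current_time, onset_times, decay=0.15):
    # Spike to 1.0 at each onset, then fade out with time constant `decay`.
    onsets = np.asarray(onset_times, dtype=float)
    past = onsets[onsets <= current_time]
    if past.size == 0:
        return 0.0
    dt = current_time - past.max()  # seconds since the most recent onset
    return float(np.exp(-dt / decay))

pulse_on = onset_pulse(0.8, [0.15, 0.8, 1.2])     # exactly on an onset
pulse_faded = onset_pulse(0.75, [0.15, 0.8, 1.2])  # 0.6 s after the last one
```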

&lt;h3&gt;
  
  
  Output: JSON Feature File
&lt;/h3&gt;

&lt;p&gt;All extracted features are saved to a JSON file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"beat_times"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.9&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"tempo_times"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"tempo_values"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;120.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;121.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;120.8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"rms_times"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.023&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.046&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"rms_values"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"bass_times"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.023&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.046&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"bass_values"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"onset_times"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mf"&gt;0.15&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;1.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This JSON file becomes the "score" that the animation follows.&lt;/p&gt;
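&lt;p&gt;Loading that score on the animation side can be as simple as the sketch below; the list of required keys is inferred from the fields shown above, and the validation is an assumption about what the effects need:&lt;/p&gt;

```python
import json

REQUIRED_SERIES = [
    "beat_times", "tempo_times", "tempo_values",
    "rms_times", "rms_values",
    "bass_times", "bass_values", "onset_times",
]

def load_features(path):
    # Read the JSON "score" produced by the analysis step and
    # fail fast if any series the effects query is missing.
    with open(path) as f:
        features = json.load(f)
    missing = [key for key in REQUIRED_SERIES if key not in features]
    if missing:
        raise KeyError(f"feature file is missing: {missing}")
    return features
```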




&lt;h2&gt;
  
  
  Synchronizing Sound and Visuals
&lt;/h2&gt;

&lt;p&gt;The synchronization happens in real-time during animation rendering. Each frame queries the audio features at that exact moment.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Update Function
&lt;/h3&gt;

&lt;p&gt;Every effect implements an &lt;code&gt;update(frame)&lt;/code&gt; function that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Calculates the current time: &lt;code&gt;current_second = frame / fps&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Queries audio features at that time&lt;/li&gt;
&lt;li&gt;Applies transformations based on those features&lt;/li&gt;
&lt;li&gt;Updates the 3D scatter plot&lt;/li&gt;
&lt;/ol&gt;
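&lt;p&gt;Those four steps can be sketched as a skeleton. The feature names match the JSON file described earlier; the frame rate, the 0.2 and 0.4 mapping constants, and the return value are illustrative, since a real effect would push the results into the Matplotlib scatter artist instead:&lt;/p&gt;

```python
import numpy as np

FPS = 30  # assumed frame rate

def update(frame, features):
    # 1. Current position in the song
    current_second = frame / FPS
    # 2. Query audio features at this moment
    rms = np.interp(current_second, features["rms_times"], features["rms_values"])
    bass = np.interp(current_second, features["bass_times"], features["bass_values"])
    # 3. Map features to visual parameters
    scale = 1.0 + 0.2 * rms   # louder -> larger heart
    alpha = 0.5 + 0.4 * bass  # more bass -> brighter points
    # 4. A real effect would now update the 3D scatter plot
    return scale, alpha

features = {
    "rms_times": [0.0, 1.0], "rms_values": [0.0, 1.0],
    "bass_times": [0.0, 1.0], "bass_values": [1.0, 0.0],
}
scale0, alpha0 = update(0, features)     # t = 0 s
scale30, alpha30 = update(30, features)  # t = 1 s
```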

&lt;h3&gt;
  
  
  Audio Feature Queries
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_beat_intensity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;current_time&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;beat_times&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;window&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Returns 0-1 intensity based on proximity to nearest beat.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;distances&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;abs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;beat_times&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;current_time&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;nearest_distance&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;min&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;distances&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;nearest_distance&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;window&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;intensity&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nearest_distance&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;window&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;intensity&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="mf"&gt;0.0&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This function measures how close the current frame is to a beat. Within the 0.1-second window, intensity falls off linearly from 1.0 (exactly on the beat) to 0.0 at the window's edge; outside the window it returns 0.0.&lt;/p&gt;
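&lt;p&gt;A quick sanity check of that falloff (re-declaring the function from above; the beat times are made up):&lt;/p&gt;

```python
import numpy as np

def get_beat_intensity(current_time, beat_times, window=0.1):
    """Returns 0-1 intensity based on proximity to nearest beat."""
    distances = np.abs(np.array(beat_times) - current_time)
    nearest_distance = np.min(distances)
    if nearest_distance < window:
        return float(1.0 - nearest_distance / window)
    return 0.0

beats = [0.5, 1.0, 1.5]                 # hypothetical beat times (seconds)
print(get_beat_intensity(1.0, beats))   # exactly on a beat -> 1.0
print(get_beat_intensity(1.05, beats))  # halfway into the window -> ~0.5
print(get_beat_intensity(1.3, beats))   # outside the window -> 0.0
```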

&lt;h3&gt;
  
  
  Applying Audio Features to Visuals
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Heartbeat pulse on beats
&lt;/span&gt;&lt;span class="n"&gt;heartbeat_scale&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;beat_intensity&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;heartbeat_scale&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mf"&gt;0.2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;beat_intensity&lt;/span&gt;  &lt;span class="c1"&gt;# Pulse 20% larger
&lt;/span&gt;
&lt;span class="c1"&gt;# Apply scaling
&lt;/span&gt;&lt;span class="n"&gt;x_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x_base&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;heartbeat_scale&lt;/span&gt;
&lt;span class="n"&gt;y_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;y_base&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;heartbeat_scale&lt;/span&gt;
&lt;span class="n"&gt;z_rotated&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;z_base&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;heartbeat_scale&lt;/span&gt;

&lt;span class="c1"&gt;# Brightness responds to bass
&lt;/span&gt;&lt;span class="n"&gt;alpha&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mf"&gt;0.4&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;bass&lt;/span&gt;  &lt;span class="c1"&gt;# More bass = brighter
&lt;/span&gt;
&lt;span class="c1"&gt;# Zoom responds to loudness
&lt;/span&gt;&lt;span class="n"&gt;zoom_factor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;base_zoom&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;loudness&lt;/span&gt;  &lt;span class="c1"&gt;# Louder = closer
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each audio feature maps to a visual property:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Beats&lt;/strong&gt; → Heartbeat pulse (scale)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tempo&lt;/strong&gt; → Rotation speed&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Loudness&lt;/strong&gt; → Camera zoom&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bass&lt;/strong&gt; → Brightness/alpha&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Onsets&lt;/strong&gt; → Additional pulses&lt;/li&gt;
&lt;/ul&gt;
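&lt;p&gt;These mappings can be collected into a single helper. This is a sketch: the function name and the flat &lt;code&gt;features&lt;/code&gt; dict are illustrative, but the constants mirror the snippets shown earlier:&lt;/p&gt;

```python
def map_features_to_visuals(features, base_zoom=15.0):
    """Map one frame's audio features to visual parameters.
    Constants mirror the snippets above; the dict shape is illustrative."""
    return {
        'scale': 1.0 + 0.2 * features['beat_intensity'],  # beats -> heartbeat pulse
        'rotation_speed': features['tempo'] / 75.0,       # tempo -> spin rate
        'zoom': base_zoom - 3.0 * features['loudness'],   # louder -> closer
        'alpha': 0.5 + 0.4 * features['bass'],            # more bass -> brighter
    }

frame = {'beat_intensity': 1.0, 'tempo': 120.0, 'loudness': 0.8, 'bass': 0.5}
print(map_features_to_visuals(frame))
```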




&lt;h2&gt;
  
  
  FFmpeg: Combining Video and Audio
&lt;/h2&gt;

&lt;p&gt;Matplotlib's &lt;code&gt;FFMpegWriter&lt;/code&gt; creates video-only files. We use FFmpeg to combine the video with the original audio track.&lt;/p&gt;

&lt;h3&gt;
  
  
  Video Generation
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;matplotlib.animation&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;FFMpegWriter&lt;/span&gt;

&lt;span class="n"&gt;writer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;FFMpegWriter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fps&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;bitrate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;5000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;anim&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;save&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;outputs/video.mp4&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;writer&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;writer&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a silent video file with the animation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Audio Combination
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; outputs/video.mp4 &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-i&lt;/span&gt; inputs/audio.mp3 &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-c&lt;/span&gt;:v copy &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-b&lt;/span&gt;:a 192k &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-shortest&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
       outputs/final_video+audio.mp4
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key FFmpeg options:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;-c:v copy&lt;/code&gt;: Copy video stream (no re-encoding, faster)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-c:a aac&lt;/code&gt;: Encode audio as AAC&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-b:a 192k&lt;/code&gt;: Audio bitrate (192 kbps is good quality)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-shortest&lt;/code&gt;: End when shortest stream ends (syncs video and audio length)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-y&lt;/code&gt;: Overwrite output file without asking&lt;/li&gt;
&lt;/ul&gt;
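&lt;p&gt;If you drive FFmpeg from Python instead of a shell, the same options can be assembled with the standard &lt;code&gt;subprocess&lt;/code&gt; module (a sketch; the helper name is mine):&lt;/p&gt;

```python
import subprocess

def build_mux_command(video_path, audio_path, output_path):
    """Assemble the FFmpeg invocation from the options above."""
    return [
        'ffmpeg',
        '-i', video_path,    # silent animation video
        '-i', audio_path,    # original music track
        '-c:v', 'copy',      # copy video stream, no re-encode
        '-c:a', 'aac',       # encode audio to AAC
        '-b:a', '192k',
        '-shortest',         # stop at the shorter stream
        '-y', output_path,
    ]

cmd = build_mux_command('outputs/video.mp4', 'inputs/audio.mp3', 'outputs/final.mp4')
# subprocess.run(cmd, check=True)  # requires ffmpeg on PATH
print(' '.join(cmd))
```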

&lt;h3&gt;
  
  
  Automated Workflow
&lt;/h3&gt;

&lt;p&gt;PowerShell build scripts automate the entire process:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Step 1: Analyze audio&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;python&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;analyze_audio.py&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;inputs/song.mp3&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c"&gt;# Step 2: Generate video&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;python&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;heart_animation.py&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;--effect&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;I2&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nt"&gt;--audio-features&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;song_features.json&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nt"&gt;--resolution&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;large&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nt"&gt;--output&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;outputs/video.mp4&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c"&gt;# Step 3: Combine with audio&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;ffmpeg&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-i&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;outputs/video.mp4&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-i&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;inputs/song.mp3&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nt"&gt;-c&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;copy&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-c&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="nx"&gt;a&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;aac&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-b&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="nx"&gt;a&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;192k&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nt"&gt;-shortest&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;\&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="n"&gt;outputs/final.mp4&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Python and Matplotlib: The Visualization Engine
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Matplotlib's 3D Scatter Plot
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;mpl_toolkits.mplot3d&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Axes3D&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;matplotlib.pyplot&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;plt&lt;/span&gt;

&lt;span class="n"&gt;fig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;plt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;figure&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;figsize&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;width&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;height&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;dpi&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;dpi&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ax&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;fig&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_subplot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;111&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;projection&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3d&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Create scatter plot
&lt;/span&gt;&lt;span class="n"&gt;scatter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cmap&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;magma&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;alpha&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Set view angle
&lt;/span&gt;&lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;view_init&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;elev&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;azim&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;45&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Set axis limits (controls zoom)
&lt;/span&gt;&lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_xlim&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_ylim&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_zlim&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;zoom_factor&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Animation with FuncAnimation
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;matplotlib.animation&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;FuncAnimation&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;frame&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Calculate current time
&lt;/span&gt;    &lt;span class="n"&gt;current_second&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;frame&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;fps&lt;/span&gt;

    &lt;span class="c1"&gt;# Query audio features
&lt;/span&gt;    &lt;span class="n"&gt;beat_intensity&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;get_beat_intensity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;current_second&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;beat_times&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;loudness&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;get_loudness_at_time&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;current_second&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;rms_times&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;rms_values&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Apply transformations
&lt;/span&gt;    &lt;span class="c1"&gt;# ... rotation, scaling, camera movement ...
&lt;/span&gt;
    &lt;span class="c1"&gt;# Update scatter plot
&lt;/span&gt;    &lt;span class="n"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_offsets3d&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x_new&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_new&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z_new&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_alpha&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alpha&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Update camera
&lt;/span&gt;    &lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;view_init&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;elev&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;elevation&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;azim&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;azimuth&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_xlim&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;zoom&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;zoom&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;

&lt;span class="n"&gt;anim&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;FuncAnimation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fig&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;update&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;frames&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;total_frames&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
                    &lt;span class="n"&gt;interval&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="n"&gt;fps&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;blit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key points:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;blit=False&lt;/code&gt;: Required for 3D plots (blitting doesn't work with 3D)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;interval=1000/fps&lt;/code&gt;: Frame delay in milliseconds (~33.3 ms at 30 fps); this only affects interactive playback, since the saved video's frame rate comes from the writer's &lt;code&gt;fps&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Each frame calls &lt;code&gt;update()&lt;/code&gt; with the frame number&lt;/li&gt;
&lt;/ul&gt;
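&lt;p&gt;The frame/time bookkeeping is worth making explicit, since every audio-feature lookup depends on it (using H9's ~698 s runtime as an example):&lt;/p&gt;

```python
fps = 30
duration = 698.0                    # e.g. H9's ~698 s runtime
total_frames = int(duration * fps)  # 20940 frames to render
interval_ms = 1000 / fps            # ~33.3 ms between frames

def frame_to_time(frame, fps):
    """Invert the frame index back to a timestamp for audio-feature lookups."""
    return frame / fps

print(total_frames)             # 20940
print(frame_to_time(300, fps))  # frame 300 -> 10.0 seconds
```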

&lt;h3&gt;
  
  
  Performance Optimization
&lt;/h3&gt;

&lt;p&gt;For multi-heart effects (I2, I3), we use lower point density:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;density_multipliers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;lower&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;0.35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;   &lt;span class="c1"&gt;# ~5,000 points per heart
&lt;/span&gt;    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;low&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;      &lt;span class="c1"&gt;# ~10,000 points
&lt;/span&gt;    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;medium&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;0.75&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;# ~22,500 points
&lt;/span&gt;    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;high&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;      &lt;span class="c1"&gt;# ~40,000 points
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With five hearts at 40,000 points each, that's 200,000 points to render per frame. Using &lt;code&gt;lower&lt;/code&gt; density reduces this to roughly 25,000 points, making rendering feasible.&lt;/p&gt;
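&lt;p&gt;The quoted point counts imply that the multiplier scales both parametric dimensions, so counts scale with its square; a short sketch that reproduces the figures above (the squared relationship is inferred from those counts):&lt;/p&gt;

```python
density_multipliers = {
    'lower': 0.35,   # ~5,000 points per heart
    'low': 0.5,      # ~10,000
    'medium': 0.75,  # ~22,500
    'high': 1.0,     # ~40,000
}
BASE_POINTS = 40_000  # per heart at 'high' density

def points_per_heart(density):
    # the multiplier scales each of the two parametric dimensions,
    # so the point count scales with its square
    return round(BASE_POINTS * density_multipliers[density] ** 2)

print(points_per_heart('lower'))      # 4900, i.e. the ~5,000 quoted above
print(5 * points_per_heart('lower'))  # 24500 for five hearts, the ~25,000 figure
```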




&lt;h2&gt;
  
  
  Putting It All Together: The Complete Pipeline
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Audio Analysis&lt;/strong&gt; (librosa)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Input: Audio file (MP3, WAV, etc.)&lt;/li&gt;
&lt;li&gt;Output: JSON file with beat times, tempo, loudness, bass, onsets&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Heart Generation&lt;/strong&gt; (NumPy)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Input: Density setting, formula coefficients&lt;/li&gt;
&lt;li&gt;Output: 3D point cloud (x, y, z coordinates)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Animation Rendering&lt;/strong&gt; (Matplotlib)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Input: Heart points, audio features JSON, effect configuration&lt;/li&gt;
&lt;li&gt;Process: For each frame, query audio features and apply transformations&lt;/li&gt;
&lt;li&gt;Output: Video file (MP4, no audio)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Audio Combination&lt;/strong&gt; (FFmpeg)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Input: Video file + original audio file&lt;/li&gt;
&lt;li&gt;Output: Final video with synchronized audio&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Effect H9: Cuba to New Orleans - A Musical Journey
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Duration:&lt;/strong&gt; ~698 seconds (11.6 minutes)&lt;/p&gt;

&lt;p&gt;H9 is an epic journey through a single piece of music, featuring strategic "through-heart" passages where the camera passes through the heart's core at key musical moments.&lt;/p&gt;

&lt;h3&gt;
  
  
  Technical Highlights
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Dynamic Tempo Adaptation&lt;/strong&gt;: Rotation speed adapts to tempo changes throughout the 11-minute piece&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Through-Heart Passages&lt;/strong&gt;: Camera zooms through the heart at specific timestamps, synchronized with musical transitions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phase-Based Narrative&lt;/strong&gt;: 14 distinct phases, each responding to different sections of the music&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Feature Sync&lt;/strong&gt;: Combines beats, tempo, loudness, bass, and onsets for comprehensive audio response&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Implementation Details
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Phase 3: First through-heart passage (100-120s)
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;current_second&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mf"&gt;120.0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;phase_t&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;current_second&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mf"&gt;100.0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;20.0&lt;/span&gt;
    &lt;span class="c1"&gt;# Zoom through heart: 50 → 5 → 50
&lt;/span&gt;    &lt;span class="n"&gt;zoom_factor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;45&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;phase_t&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# Camera passes through
&lt;/span&gt;    &lt;span class="n"&gt;elevation&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;phase_t&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The through-heart effect is achieved by zooming the camera from a distance (zoom=50) to very close (zoom=5) and back, creating the illusion of passing through the heart.&lt;/p&gt;
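&lt;p&gt;The zoom curve for such a passage can be written as a single function (a sketch using the constants from the snippet above):&lt;/p&gt;

```python
import numpy as np

def through_heart_zoom(t, start=100.0, end=120.0, far=50.0, near=5.0):
    """Zoom curve for a through-heart passage: far -> near -> far over [start, end]."""
    phase_t = (t - start) / (end - start)
    return far - (far - near) * np.sin(np.pi * phase_t)

print(through_heart_zoom(100.0))  # 50.0 at the start
print(through_heart_zoom(110.0))  # 5.0 at closest approach
print(through_heart_zoom(120.0))  # back to ~50.0
```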




&lt;h2&gt;
  
  
  Effect I2: Five Hearts - Comprehensive Audio Feature Synchronization
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Duration:&lt;/strong&gt; Matches audio file length&lt;/p&gt;

&lt;p&gt;I2 demonstrates multi-heart visualization where each heart responds to a different audio feature, creating a visual symphony.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Five Hearts
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Heart 1 (Beats)&lt;/strong&gt;: Pulses on detected beats, magma colormap&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Heart 2 (Tempo)&lt;/strong&gt;: Rotation speed adapts to tempo, YlOrRd colormap&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Heart 3 (Loudness)&lt;/strong&gt;: Scales with RMS energy, Blues colormap&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Heart 4 (Bass)&lt;/strong&gt;: Brightness responds to bass, Greens colormap&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Heart 5 (Onsets)&lt;/strong&gt;: Pulses on musical events, Purples colormap&lt;/li&gt;
&lt;/ol&gt;
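&lt;p&gt;The per-heart assignments can be kept in a small lookup table, matching the &lt;code&gt;i % 5&lt;/code&gt; indexing used in the update loop below (names are illustrative):&lt;/p&gt;

```python
HEART_CONFIG = [
    ('beats',    'magma'),
    ('tempo',    'YlOrRd'),
    ('loudness', 'Blues'),
    ('bass',     'Greens'),
    ('onsets',   'Purples'),
]

def heart_role(i):
    """Feature and colormap for heart i, cycling with i % 5."""
    return HEART_CONFIG[i % 5]

print(heart_role(0))  # ('beats', 'magma')
print(heart_role(6))  # wraps around to ('tempo', 'YlOrRd')
```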

&lt;h3&gt;
  
  
  Technical Implementation
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Each heart gets its own scatter plot
&lt;/span&gt;&lt;span class="n"&gt;scatter1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;colors1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cmap&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;magma&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;alpha&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;scatter2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ax&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;colors2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cmap&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;YlOrRd&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;alpha&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.6&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# ... etc for hearts 3, 4, 5
&lt;/span&gt;
&lt;span class="c1"&gt;# In update function, process each heart independently
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x_orig&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_orig&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z_orig&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hearts&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Assign feature based on heart index
&lt;/span&gt;    &lt;span class="n"&gt;feature_type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;feature_type&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;  &lt;span class="c1"&gt;# Beats
&lt;/span&gt;        &lt;span class="n"&gt;scale&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mf"&gt;0.2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;beat_intensity&lt;/span&gt;
    &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;feature_type&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;  &lt;span class="c1"&gt;# Tempo
&lt;/span&gt;        &lt;span class="n"&gt;tempo_factor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current_tempo&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;75.0&lt;/span&gt;
        &lt;span class="n"&gt;rotation_speed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tempo_factor&lt;/span&gt;
    &lt;span class="c1"&gt;# ... etc
&lt;/span&gt;
    &lt;span class="c1"&gt;# Update this heart's scatter plot
&lt;/span&gt;    &lt;span class="n"&gt;scatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_offsets3d&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x_new&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_new&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z_new&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Camera Strategy
&lt;/h3&gt;

&lt;p&gt;The camera switches between modes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multi-heart frame&lt;/strong&gt;: Wide view showing all 5 hearts&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Individual focus&lt;/strong&gt;: Close-up on one heart at a time&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Orbital motion&lt;/strong&gt;: Camera orbits around the center&lt;/li&gt;
&lt;/ul&gt;
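&lt;p&gt;As a rough sketch, this mode switching can be driven by a small scheduling function that maps elapsed time to camera angles. The cycle length, angles, and mode boundaries below are illustrative assumptions, not the project's actual values:&lt;/p&gt;

```python
def camera_angles(t, cycle=12.0):
    """Return (elevation, azimuth) in degrees for time t seconds."""
    phase = (t % cycle) / cycle  # position inside the current cycle, 0..1
    if phase >= 2.0 / 3.0:
        # Orbital motion: azimuth sweeps around the scene center
        return 20.0, 360.0 * (phase - 2.0 / 3.0) * 3.0
    elif phase >= 1.0 / 3.0:
        # Individual focus: low elevation, fixed close-up angle
        return 10.0, 45.0
    # Multi-heart frame: elevated wide shot of all hearts
    return 35.0, 0.0
```

&lt;p&gt;In Matplotlib, the returned pair would be applied each frame with &lt;code&gt;ax.view_init(elev, azim)&lt;/code&gt;.&lt;/p&gt;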




&lt;h2&gt;
  
  
  Effect I3: Birthday Celebration - 11 Hearts to 16 Hearts
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Duration:&lt;/strong&gt; Matches audio file length (typically ~60 seconds for "Happy Birthday")&lt;/p&gt;

&lt;p&gt;I3 is a special celebration effect that transitions from 11 hearts to 16 hearts, with number displays (11, 16, 2025) appearing at strategic moments.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Two-Phase Design
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Phase 1 (0-50%)&lt;/strong&gt;: 11 hearts in circular formation&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;1 center heart + 5 inner circle + 5 outer circle&lt;/li&gt;
&lt;li&gt;Each heart syncs to different audio features&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Phase 2 (50-100%)&lt;/strong&gt;: 16 hearts in 4x4 grid&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Smooth transition adds 5 more hearts&lt;/li&gt;
&lt;li&gt;More complex interactions and patterns&lt;/li&gt;
&lt;/ul&gt;
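&lt;p&gt;One way to sketch the two-phase design is a helper that maps normalized progress to the number of visible hearts; the width of the blend window here is an illustrative assumption:&lt;/p&gt;

```python
def active_hearts(progress, phase_split=0.5, blend=0.05):
    """Number of visible hearts at normalized progress in [0, 1]."""
    if progress >= phase_split + blend:
        return 16          # Phase 2: full 4x4-grid population
    if progress >= phase_split:
        t = (progress - phase_split) / blend  # 0..1 inside the blend window
        return 11 + int(round(5 * t))         # extra hearts appear one by one
    return 11              # Phase 1: 1 center + 5 inner + 5 outer
```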

&lt;h3&gt;
  
  
  Number Display
&lt;/h3&gt;

&lt;p&gt;Numbers are displayed as text overlays using Matplotlib's text annotation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Show "11" during 11-heart phase
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;current_second&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;phase5_end&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;current_second&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;total_duration&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mf"&gt;0.11&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;fig&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.85&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;11&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
             &lt;span class="n"&gt;fontsize&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;72&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ha&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;center&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;va&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;center&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
             &lt;span class="n"&gt;color&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;white&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;alpha&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;weight&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bold&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Heart Positioning
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_get_heart_positions_11&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;11 hearts: 1 center + 5 inner + 5 outer&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;positions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;  &lt;span class="c1"&gt;# Center
&lt;/span&gt;
    &lt;span class="c1"&gt;# Inner circle
&lt;/span&gt;    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;angle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;angle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;angle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;positions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

    &lt;span class="c1"&gt;# Outer circle
&lt;/span&gt;    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;angle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pi&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
        &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cos&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;angle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;angle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;positions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;positions&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Lessons Learned: The "Vibe Coding" Approach
&lt;/h2&gt;

&lt;p&gt;This project evolved through what I call "vibe coding": an iterative, creative development process driven by prompts and experimentation rather than strict upfront planning.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Prompt-Driven Development
&lt;/h3&gt;

&lt;p&gt;Each effect started as a detailed prompt in &lt;code&gt;Prompt.md&lt;/code&gt; describing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Visual narrative (phases, camera movements, transitions)&lt;/li&gt;
&lt;li&gt;Audio synchronization strategy&lt;/li&gt;
&lt;li&gt;Technical requirements&lt;/li&gt;
&lt;li&gt;Implementation notes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These prompts served as both specification and inspiration, allowing the code to evolve organically.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Learnings
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Start Simple, Iterate Complex&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Effect A (simple rotation) → Effect I3 (16 hearts with number display)&lt;/li&gt;
&lt;li&gt;Each effect built on previous learnings&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Audio Features Are Rich&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Initially used only beats&lt;/li&gt;
&lt;li&gt;Discovered tempo, loudness, bass, and onsets each add unique visual dimensions&lt;/li&gt;
&lt;li&gt;Combining multiple features creates more nuanced responses&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Performance Matters&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;40,000 points per heart × 16 hearts = 640,000 points per frame&lt;/li&gt;
&lt;li&gt;Lower density (5,000 points) still looks great and renders 128× faster&lt;/li&gt;
&lt;li&gt;Always profile before optimizing&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Modular Design Wins&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Base effect class with &lt;code&gt;update(frame)&lt;/code&gt; method&lt;/li&gt;
&lt;li&gt;Audio sync functions reusable across effects&lt;/li&gt;
&lt;li&gt;Easy to create new effects by extending the base class&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Automation Is Essential&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build scripts handle: audio analysis → video generation → audio combination&lt;/li&gt;
&lt;li&gt;Saves hours of manual work&lt;/li&gt;
&lt;li&gt;Makes experimentation faster&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Documentation as You Go&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;Prompt.md&lt;/code&gt; captures design decisions&lt;/li&gt;
&lt;li&gt;Code comments explain "why" not just "what"&lt;/li&gt;
&lt;li&gt;Makes revisiting code months later much easier&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
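&lt;p&gt;The modular-design point can be sketched as a minimal base class with an &lt;code&gt;update(frame)&lt;/code&gt; method; the class and subclass names below are illustrative, not the project's actual API:&lt;/p&gt;

```python
class BaseEffect:
    """Minimal sketch of a shared effect interface."""

    def __init__(self, duration_s, fps=30):
        self.duration_s = duration_s
        self.fps = fps
        self.total_frames = int(duration_s * fps)

    def update(self, frame):
        """Compute state for one frame; subclasses override this."""
        raise NotImplementedError


class PulseEffect(BaseEffect):
    def update(self, frame):
        # Scale grows linearly from 1.0 to 1.2 across the clip
        progress = frame / max(self.total_frames - 1, 1)
        return 1.0 + 0.2 * progress
```

&lt;p&gt;New effects then only need to subclass and override &lt;code&gt;update&lt;/code&gt;, while audio-sync helpers stay reusable across all of them.&lt;/p&gt;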

&lt;h3&gt;
  
  
  The Creative Process
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Listen to the music&lt;/strong&gt; - Understand its structure, energy, and emotional arc&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Design the narrative&lt;/strong&gt; - Plan phases that match the music's journey&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Map audio to visuals&lt;/strong&gt; - Decide which features drive which visual properties&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Implement and iterate&lt;/strong&gt; - Code, render, watch, adjust, repeat&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Refine timing&lt;/strong&gt; - Fine-tune phase boundaries and transitions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Optimize performance&lt;/strong&gt; - Balance quality and render time&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This process is more art than science: it requires intuition, experimentation, and patience.&lt;/p&gt;




&lt;h2&gt;
  
  
  Demo Videos
&lt;/h2&gt;

&lt;p&gt;See these effects in action:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.youtube.com/watch?v=twa5GX2UVf4" rel="noopener noreferrer"&gt;H9: Cuba to New Orleans&lt;/a&gt;&lt;/strong&gt; - An 11-minute musical journey with through-heart passages&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.youtube.com/watch?v=Jv5ItACRHC4" rel="noopener noreferrer"&gt;I2: Five Hearts&lt;/a&gt;&lt;/strong&gt; - Five hearts, each dancing to different audio features&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://www.youtube.com/watch?v=Canpma8k4UI" rel="noopener noreferrer"&gt;I3: Birthday Celebration&lt;/a&gt;&lt;/strong&gt; - 11 hearts transform to 16 hearts with number displays&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This project demonstrates how mathematics, signal processing, and visualization can combine to create something beautiful. The technical stack (NumPy for math, librosa for audio, Matplotlib for visualization, FFmpeg for video processing) is powerful yet accessible.&lt;/p&gt;

&lt;p&gt;The real magic isn't in any single technology, but in how they work together:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Mathematics&lt;/strong&gt; provides the shape&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio analysis&lt;/strong&gt; provides the rhythm&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Synchronization&lt;/strong&gt; provides the connection&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visualization&lt;/strong&gt; provides the beauty&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Whether you're interested in parametric equations, audio signal processing, or creative coding, there's something here to explore. The code is open source, the techniques are documented, and the possibilities are endless.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next Steps:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Explore the codebase: &lt;a href="https://github.com/vuhung16au/math-olympiad-ml/tree/main/MathsHeartShaped3D" rel="noopener noreferrer"&gt;GitHub Repository&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Try creating your own effect&lt;/li&gt;
&lt;li&gt;Analyze your favorite music&lt;/li&gt;
&lt;li&gt;Make something beautiful&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Created with Python, mathematics, and a passion for making music visible.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>maths</category>
      <category>animation</category>
      <category>music</category>
      <category>python</category>
    </item>
    <item>
      <title>Create Synthetic Data - A Comprehensive Guideline</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Sun, 09 Nov 2025 06:24:05 +0000</pubDate>
      <link>https://dev.to/vuhung16au/create-synthetic-data-a-comprehensive-guideline-49bf</link>
      <guid>https://dev.to/vuhung16au/create-synthetic-data-a-comprehensive-guideline-49bf</guid>
      <description>&lt;p&gt;Overview&lt;br&gt;
This document will guide you how to create synthetic data using Python, Cursor, to solve the problem of "being hungry for data" when real data is not available.&lt;/p&gt;

&lt;p&gt;What is Synthetic Data?&lt;br&gt;
Synthetic data is artificially generated data that mimics the statistical properties of real-world data. It is often used in scenarios where real data is scarce, sensitive, or expensive to obtain. Synthetic data can be used for testing, training machine learning models, and validating algorithms without compromising privacy or security.&lt;/p&gt;

&lt;p&gt;Why Use Synthetic Data?&lt;br&gt;
Privacy: Synthetic data can be generated without using any real personal information, making it a safer alternative for testing and development.&lt;br&gt;
Cost-Effective: Generating synthetic data can be more cost-effective than collecting and maintaining real datasets.&lt;br&gt;
Flexibility: Synthetic data can be tailored to specific requirements, allowing for the creation of diverse datasets that cover various scenarios.&lt;br&gt;
Scalability: Synthetic data can be generated in large volumes, making it suitable for big data analytics.&lt;br&gt;
Methodology&lt;br&gt;
Python faker library is used to generate synthetic data.&lt;br&gt;
Probabilistic approaches are employed to ensure the synthetic data closely resembles real-world data distributions.&lt;br&gt;
Machine learning techniques can be applied to refine the synthetic data generation process.&lt;br&gt;
Neural networks can be utilized to create more complex and realistic synthetic datasets, such as GAN (Generative Adversarial Networks)&lt;br&gt;
Notes:&lt;/p&gt;

&lt;p&gt;Translating a dataset from one language to another can be a good choice.&lt;br&gt;
Expand or extend an existing dataset by generating synthetic samples based on the original data distribution.&lt;br&gt;
For example, IoT sensor datasets can be extended by:&lt;/p&gt;

&lt;p&gt;Generating more time series data points following observed patterns&lt;br&gt;
Adding noise and anomalies for robustness testing&lt;br&gt;
Simulating different environmental conditions&lt;br&gt;
Creating multi-sensor correlation scenarios&lt;br&gt;
Example Synthetic Data Generation Project&lt;br&gt;
Define the features and structure of the synthetic datasets to be generated, do it manually, save it to FEATURES.md.&lt;/p&gt;

&lt;p&gt;Example Prompt&lt;br&gt;
Help me create a Python script that generates synthetic data for stock prices.&lt;/p&gt;

&lt;p&gt;Refer to FEATURES.md for the fields to include: date, open, high, low, close, adjusted close, and volume.&lt;/p&gt;

&lt;p&gt;Folder Structure&lt;br&gt;
The folder structure is as follows:&lt;/p&gt;

&lt;p&gt;synthetic-data/&lt;br&gt;
├── datasets/&lt;br&gt;
│   ├── small/&lt;br&gt;
│   ├── medium/&lt;br&gt;
│   ├── large/&lt;br&gt;
│   ├── README.md&lt;br&gt;
│   ├── FEATURES.md&lt;br&gt;
│   └── .gitignore&lt;br&gt;
├── scripts/&lt;br&gt;
│   ├── generate_datasets.py&lt;br&gt;
│   ├── compress_datasets.py&lt;br&gt;
│   ├── README.md&lt;br&gt;
│   └── .gitignore&lt;br&gt;
├── requirements.txt&lt;br&gt;
├── README.md&lt;br&gt;
├── Makefile&lt;br&gt;
└── .gitignore&lt;br&gt;
datasets: where the generated synthetic data is stored, split into small, medium, and large subfolders&lt;br&gt;
scripts: contains the Python scripts to generate and compress synthetic data&lt;br&gt;
README.md: how to install and run the scripts using uv with a virtual environment named .venv&lt;br&gt;
FEATURES.md: documents the features of the synthetic data generation process&lt;br&gt;
requirements.txt: list of Python dependencies&lt;br&gt;
README.md Structure&lt;br&gt;
Overview &amp;amp; Purpose&lt;br&gt;
Prerequisites (Python version (3.9+), libraries)&lt;br&gt;
Installation Instructions (using uv with .venv)&lt;br&gt;
Quick Start Guide&lt;br&gt;
Basic usage examples&lt;br&gt;
Common workflows&lt;br&gt;
Configuration Options&lt;br&gt;
Conclusion&lt;br&gt;
Objectives of the Synthetic Datasets&lt;br&gt;
Provide ready-to-use datasets to demonstrate ML workflows&lt;br&gt;
Cover supervised, unsupervised, and semi-supervised learning (and suggest more options if any)&lt;br&gt;
Support task types:&lt;br&gt;
classification,&lt;br&gt;
regression,&lt;br&gt;
clustering&lt;br&gt;
time-series forecasting: In this project, we focus on generating stock prices data&lt;br&gt;
anomaly detection&lt;br&gt;
recommendation systems&lt;br&gt;
graph analysis&lt;br&gt;
sentiment analysis&lt;br&gt;
Note: Implement all these tasks if possible, they will be needed for comprehensive ML demonstrations.&lt;/p&gt;

&lt;p&gt;Support a range of dataset sizes and feature counts&lt;br&gt;
--size Options (Number of Samples)&lt;br&gt;
small: 1,000 – 10,000&lt;br&gt;
medium: 10,000 – 100,000&lt;br&gt;
large: 100,000 – 1,000,000&lt;br&gt;
extra large: 1,000,000 – 10,000,000&lt;br&gt;
Feature targets&lt;br&gt;
Features: 5 – 50: Number of features/columns in the dataset to generate. Do not set if FEATURES.md is present&lt;br&gt;
Classes: 2 – 10 (classification)&lt;br&gt;
Clusters: 2 – 10 (clustering)&lt;br&gt;
Dataset format&lt;br&gt;
CSV: Comma-separated values for easy import into various tools&lt;br&gt;
Optionally, support for compressed formats like .csv.gz for large datasets&lt;br&gt;
Encoding: UTF-8 to ensure compatibility&lt;br&gt;
Example Run Command&lt;br&gt;
python scripts/generate_datasets.py&lt;br&gt;
--task classification&lt;br&gt;
--size small&lt;br&gt;
--num-samples 5000&lt;br&gt;
--num-classes 3&lt;br&gt;
--random-state 2025&lt;/p&gt;

&lt;p&gt;scripts/generate_datasets.py Parameters&lt;br&gt;
-t, --task: Type of machine learning task (classification, regression, clustering, etc.)&lt;br&gt;
-s, --size: Size of the dataset to generate (small, medium, large, extra large)&lt;br&gt;
-n, --num-samples: Number of samples/rows in the dataset&lt;br&gt;
-f, --num-features: Number of features/columns in the dataset (if not using FEATURES.md)&lt;br&gt;
-c, --num-classes: Number of classes (for classification tasks)&lt;br&gt;
-k, --num-clusters: Number of clusters (for clustering tasks)&lt;br&gt;
--random-state: Seed for random number generation to ensure reproducibility&lt;br&gt;
--output-format: Format of the output dataset (CSV, CSV.GZ). Default is CSV&lt;br&gt;
--output-dir: Directory to save the generated datasets. Default is datasets/&lt;br&gt;
Note: When using --size, the script automatically determines the number of samples within the specified range. Use --num-samples to override with an exact number.&lt;/p&gt;
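&lt;p&gt;A minimal sketch of this command-line interface with argparse is below; the defaults and the hyphenated &lt;code&gt;extra-large&lt;/code&gt; choice value are assumptions, not taken from the actual script:&lt;/p&gt;

```python
import argparse

def build_parser():
    """Sketch of the generate_datasets.py CLI described above."""
    p = argparse.ArgumentParser(description="Generate synthetic datasets")
    p.add_argument("-t", "--task", default="classification")
    p.add_argument("-s", "--size",
                   choices=["small", "medium", "large", "extra-large"])
    p.add_argument("-n", "--num-samples", type=int)
    p.add_argument("-f", "--num-features", type=int)
    p.add_argument("-c", "--num-classes", type=int, default=2)
    p.add_argument("-k", "--num-clusters", type=int, default=3)
    p.add_argument("--random-state", type=int, default=2025)
    p.add_argument("--output-format", choices=["csv", "csv.gz"], default="csv")
    p.add_argument("--output-dir", default="datasets/")
    return p
```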

&lt;p&gt;Makefile targets&lt;br&gt;
make help           # List available targets&lt;br&gt;
make create-all     # Generate representative datasets across tasks and sizes&lt;br&gt;
make compress-all   # Compress all CSV datasets (creates .csv.gz)&lt;br&gt;
make clean          # Delete all CSV files in datasets/&lt;br&gt;
make clean-gzip     # Delete all .csv.gz files in datasets/&lt;br&gt;
make test           # Run the test suite&lt;br&gt;
make sample         # Generate small sample datasets for testing&lt;br&gt;
make visualize      # Create visualizations of dataset distributions &lt;br&gt;
Required libraries&lt;br&gt;
faker: For generating fake data such as names, addresses, emails, etc.&lt;br&gt;
pandas: For data manipulation and analysis&lt;br&gt;
scikit-learn: For generating datasets with specific characteristics&lt;br&gt;
tensorflow or pytorch: For advanced synthetic data generation using neural networks&lt;br&gt;
You can decide what libraries to use based on your specific needs and the complexity of the synthetic data you want to generate.&lt;/p&gt;

&lt;p&gt;FEATURES.md Example&lt;br&gt;
Features of Synthetic Data Generation of stock prices&lt;/p&gt;

&lt;p&gt;Synthetic Data Features&lt;br&gt;
The synthetic data features include:&lt;/p&gt;

&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;&lt;th&gt;Feature Name&lt;/th&gt;&lt;th&gt;Description&lt;/th&gt;&lt;th&gt;Data Type&lt;/th&gt;&lt;th&gt;Example Values&lt;/th&gt;&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;&lt;td&gt;Date&lt;/td&gt;&lt;td&gt;Date of the stock price record&lt;/td&gt;&lt;td&gt;Date&lt;/td&gt;&lt;td&gt;2015-03-31&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Open&lt;/td&gt;&lt;td&gt;Opening price of the stock&lt;/td&gt;&lt;td&gt;Float&lt;/td&gt;&lt;td&gt;0.555&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;High&lt;/td&gt;&lt;td&gt;Highest price of the stock&lt;/td&gt;&lt;td&gt;Float&lt;/td&gt;&lt;td&gt;0.595&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Low&lt;/td&gt;&lt;td&gt;Lowest price of the stock&lt;/td&gt;&lt;td&gt;Float&lt;/td&gt;&lt;td&gt;0.53&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Close&lt;/td&gt;&lt;td&gt;Closing price of the stock&lt;/td&gt;&lt;td&gt;Float&lt;/td&gt;&lt;td&gt;0.565&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Adj Close&lt;/td&gt;&lt;td&gt;Adjusted closing price of the stock&lt;/td&gt;&lt;td&gt;Float&lt;/td&gt;&lt;td&gt;0.565&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td&gt;Volume&lt;/td&gt;&lt;td&gt;Trading volume of the stock&lt;/td&gt;&lt;td&gt;Integer&lt;/td&gt;&lt;td&gt;4816294&lt;/td&gt;&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;

&lt;p&gt;Sample data&lt;br&gt;
Date,Open,High,Low,Close,Adj Close,Volume&lt;br&gt;
2015-03-31,0.555,0.595,0.53,0.565,0.565,4816294&lt;br&gt;
2015-04-01,0.575,0.58,0.555,0.565,0.565,4376660&lt;br&gt;
2015-04-02,0.56,0.565,0.535,0.555,0.555,2779640&lt;/p&gt;

&lt;p&gt;Final Notes&lt;br&gt;
R is a strong alternative to Python for synthetic data generation.&lt;br&gt;
Combining synthetic data generation with LLMs to create more context-aware data is a promising direction; tailor your prompts accordingly.&lt;br&gt;
faker can be replaced by using LLM API calls to generate more realistic and diverse synthetic data samples.&lt;/p&gt;
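&lt;p&gt;As an illustration of the stock price schema above, here is a small random-walk generator using only the standard library. The price model and value ranges are assumptions for demonstration, not a realistic market model:&lt;/p&gt;

```python
import random
from datetime import date, timedelta

def generate_stock_prices(days=5, start=date(2015, 3, 31), seed=2025):
    """Illustrative random-walk generator for the FEATURES.md fields."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    rows, close = [], 0.565
    for i in range(days):
        d = start + timedelta(days=i)
        # Each day opens near the previous close and drifts randomly
        open_p = close * (1 + rng.uniform(-0.02, 0.02))
        close = open_p * (1 + rng.uniform(-0.03, 0.03))
        high = max(open_p, close) * (1 + rng.uniform(0, 0.02))
        low = min(open_p, close) * (1 - rng.uniform(0, 0.02))
        volume = rng.randint(1_000_000, 6_000_000)
        rows.append((d.isoformat(), round(open_p, 3), round(high, 3),
                     round(low, 3), round(close, 3), round(close, 3), volume))
    return rows
```

&lt;p&gt;Writing the rows out with the header &lt;code&gt;Date,Open,High,Low,Close,Adj Close,Volume&lt;/code&gt; reproduces the sample CSV format shown above.&lt;/p&gt;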

</description>
      <category>datasets</category>
      <category>synthetic</category>
      <category>ai</category>
      <category>programming</category>
    </item>
    <item>
      <title>Write a Research Proposal in 2 Hours, Not a Day! ⏱️</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Thu, 06 Nov 2025 22:28:08 +0000</pubDate>
      <link>https://dev.to/vuhung16au/write-a-research-proposal-in-2-hours-not-a-day-gm6</link>
      <guid>https://dev.to/vuhung16au/write-a-research-proposal-in-2-hours-not-a-day-gm6</guid>
      <description>&lt;p&gt;With help of AI, you can draft a solid research proposal in just a couple of hours instead of days. Here's how:&lt;/p&gt;

&lt;p&gt;Let's say we need to write a research proposal on NLP (Natural Language Processing), specifically the classification of social media posts.&lt;/p&gt;

&lt;p&gt;Step 1: Come up with some research ideas&lt;br&gt;
Open Perplexity Comet, browse aclanthology.org and search for the keywords "social media"&lt;br&gt;
Add more keywords if needed like GNN, multimodal, sentiment analysis, misinformation, disinformation, BERT, transformers, etc.&lt;/p&gt;

&lt;p&gt;Notes:&lt;/p&gt;

&lt;p&gt;Browse more sources like ACM NLP conference papers, Google Scholar, ScienceDirect, etc if needed.&lt;/p&gt;

&lt;p&gt;Open all the papers on new tabs in Comet. Now I have 40 papers to read through, not manually, but with AI assistance!&lt;/p&gt;

&lt;p&gt;Talk to "Comet" with a prompt: Research all open tabs and give me 10 research questions about classifications of social media posts. Cite the sources.&lt;/p&gt;

&lt;p&gt;Step 2: Narrow down to 3 research ideas&lt;br&gt;
Read through all 10 research ideas from Comet and pick 3 that I like the most.&lt;br&gt;
Manually refine the 3 research ideas to make them more specific.&lt;br&gt;
Talk to Comet again with a prompt:&lt;/p&gt;

&lt;p&gt;Suggest me a research proposal title and abstract based on the following 3 research questions:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;[Refined Research Question 1]&lt;/li&gt;
&lt;li&gt;[Refined Research Question 2]&lt;/li&gt;
&lt;li&gt;[Refined Research Question 3]&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Step 3: Prepare a working directory&lt;br&gt;
The folder structure looks like this:&lt;/p&gt;

&lt;p&gt;research-proposal/&lt;br&gt;
    ├── main.tex&lt;br&gt;
    ├── Makefile&lt;br&gt;
    ├── styles/         # User-defined LaTeX styles, Overleaf templates, etc.&lt;br&gt;
    ├── references.bib  # BibTeX file for references with &lt;code&gt;abstract&lt;/code&gt; field included&lt;br&gt;
    ├── references/     # Downloaded papers go here, not mandatory&lt;br&gt;
    └── samples/        # Sample research proposals in LaTeX format for reference&lt;br&gt;
Step 4: Prepare references.bib&lt;br&gt;
Use Comet to extract BibTeX entries from all the papers opened in Step 1.&lt;br&gt;
The next magic starts here:&lt;/p&gt;

&lt;p&gt;Write the research proposal&lt;br&gt;
Open Cursor and use the following prompt to write the research proposal in one go:&lt;/p&gt;

&lt;p&gt;Help me write a research proposal.&lt;/p&gt;

&lt;p&gt;Save your response to main.tex&lt;/p&gt;

&lt;p&gt;Create a Makefile to compile the LaTeX document with targets:&lt;/p&gt;

&lt;p&gt;all: compile the document&lt;br&gt;
pdf: compile to PDF&lt;br&gt;
clean: remove auxiliary files&lt;br&gt;
bib: compile the bibliography&lt;br&gt;
view: open the compiled PDF&lt;br&gt;
Title: {}&lt;/p&gt;

&lt;p&gt;Research questions:&lt;/p&gt;

&lt;p&gt;[Refined Research Question 1]&lt;br&gt;
[Refined Research Question 2]&lt;br&gt;
[Refined Research Question 3]&lt;br&gt;
The structure of the research proposal should include the following sections:&lt;/p&gt;

&lt;p&gt;Background&lt;br&gt;
Current Research Landscape and Related Work&lt;br&gt;
Research Motivation&lt;br&gt;
Research Questions: Include the 3 research questions above&lt;br&gt;
Methodology&lt;br&gt;
Requirements&lt;br&gt;
Timeline&lt;br&gt;
References: Use the Bibtex entries from references.bib&lt;br&gt;
Requirements:&lt;/p&gt;

&lt;p&gt;Only cite the papers from the references.bib file, don't make up any citations.&lt;br&gt;
Compiled PDF file should be less than 6 pages.&lt;br&gt;
Tone should be formal and academic.&lt;br&gt;
Methodology section should explain how I plan to answer the research questions.&lt;br&gt;
Methodology section should include a diagram illustrating the research framework using LaTeX packages like tikz&lt;br&gt;
Requirements section should list the hardware, software, datasets, and other resources needed for the research.&lt;br&gt;
Timeline section should include a Gantt chart showing the timeline of the research activities over 36 months, with milestones and deliverables using LaTeX packages like pgfgantt&lt;br&gt;
Make sure the make pdf command works without errors.&lt;br&gt;
Additional requirements:&lt;/p&gt;

&lt;p&gt;Refer to samples/research-proposal1.tex and samples/*.tex for formatting and structure guidance.&lt;br&gt;
Step 5: Humanize and finalize&lt;br&gt;
Humanize the text. I use stealthwriter or naturalwrite&lt;br&gt;
Read through the generated main.tex file and make necessary edits to improve clarity, coherence, and flow.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>research</category>
      <category>automation</category>
      <category>proposal</category>
    </item>
    <item>
      <title>Using Coding Agents for Smarter Research Citation Management</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Thu, 06 Nov 2025 07:14:03 +0000</pubDate>
      <link>https://dev.to/vuhung16au/using-coding-agents-for-smarter-research-citation-management-4na2</link>
      <guid>https://dev.to/vuhung16au/using-coding-agents-for-smarter-research-citation-management-4na2</guid>
      <description>&lt;p&gt;As researchers and developers, we often rely on tools like Zotero or Mendeley to manage our citations. They’re great for collecting and organizing references, but I’ve found them limited when it comes to &lt;strong&gt;searching&lt;/strong&gt; and &lt;strong&gt;extracting insights&lt;/strong&gt; from saved citations.  &lt;/p&gt;

&lt;p&gt;Over time, I’ve developed a more flexible workflow that combines traditional reference management with coding agents (like Cursor or Copilot) to make literature organization smarter and more automated.&lt;/p&gt;

&lt;h3&gt;
  
  
  My Workflow
&lt;/h3&gt;

&lt;p&gt;Here’s how I set it up:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;I create a single &lt;code&gt;.bib&lt;/code&gt; file to store all references.&lt;br&gt;&lt;br&gt;
Each entry includes the paper’s download link, abstract, keywords, and notes—usually a short summary or reason for citation.  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;All downloaded papers are stored in a separate folder, ideally organized by topic.  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For quick exploration, I use coding agents (such as Cursor or GitHub Copilot) to search and extract information directly from the &lt;code&gt;.bib&lt;/code&gt; file.&lt;br&gt;&lt;br&gt;
This works especially well if the &lt;code&gt;.bib&lt;/code&gt; file contains abstracts, keywords, and short notes—even without full-text access.  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For deeper retrieval, I extend the approach using &lt;strong&gt;Retrieval-Augmented Generation (RAG)&lt;/strong&gt;.&lt;br&gt;&lt;br&gt;
This allows the coding agent to access both the &lt;code&gt;.bib&lt;/code&gt; file and the full-text papers I’ve downloaded. Essentially, the agent "reads" from my personal library instead of querying the open web.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When prompting, I always instruct the agent to rely &lt;strong&gt;only&lt;/strong&gt; on the saved citations (and downloaded papers, if available).&lt;br&gt;&lt;br&gt;
The response must include source references. This minimizes AI hallucination and ensures outputs are grounded in verified material.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
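&lt;p&gt;As a minimal sketch of the “quick exploration” step: before involving an agent at all, plain shell tools can already answer “which entries mention X?”. The file name and the entry below are hypothetical:&lt;/p&gt;

```shell
# Hypothetical sketch: keyword search over a single .bib file.
# The file and entry are illustrative, not from a real library.
printf '%s\n' \
  '@article{doe2023rag,' \
  '  title    = {Retrieval-Augmented Generation for Research Workflows},' \
  '  keywords = {RAG, retrieval, citations},' \
  '  abstract = {We study grounding literature answers in a personal library.}' \
  '}' > references.bib

# Show matching lines with a little context around each hit
grep -i -B 3 -A 1 'retrieval' references.bib
```

&lt;p&gt;An agent doing the same job simply scales this up: it matches on keywords, abstracts, and notes semantically rather than on exact strings.&lt;/p&gt;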

&lt;h3&gt;
  
  
  Why This Works
&lt;/h3&gt;

&lt;p&gt;This approach gives me:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Full control over citation context and metadata
&lt;/li&gt;
&lt;li&gt;Local, reproducible research environment
&lt;/li&gt;
&lt;li&gt;Powerful semantic search using AI tools
&lt;/li&gt;
&lt;li&gt;Less risk of fabricated or irrelevant information
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If I only need to summarize main ideas across papers, storing abstracts in the &lt;code&gt;.bib&lt;/code&gt; file is enough.&lt;br&gt;&lt;br&gt;
When I need to refer to conclusions or methodology in detail, I make sure the full paper is downloaded and indexed.&lt;/p&gt;
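&lt;p&gt;For the full-text case, the most basic local “retrieval” is a recursive search over the papers folder. This sketch assumes the PDFs have already been converted to plain text (e.g. with pdftotext); the file and its contents are hypothetical:&lt;/p&gt;

```shell
# Hypothetical sketch: find which downloaded papers mention a query term.
# Assumes papers/ holds plain-text extractions of the PDFs.
mkdir -p papers
printf 'We conclude that retrieval improves grounding.\n' > papers/doe2023rag.txt

# List every file under papers/ that mentions the term, case-insensitively
grep -ril 'retrieval' papers   # prints: papers/doe2023rag.txt
```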




&lt;p&gt;This workflow blends classic academic management with modern AI-driven retrieval—perfect for researchers who want to integrate intelligent automation without losing academic rigor.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>citation</category>
      <category>research</category>
    </item>
    <item>
      <title>Advanced Prompt Techniques for the Next Level of Vibe Coding</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Wed, 08 Oct 2025 13:39:56 +0000</pubDate>
      <link>https://dev.to/vuhung16au/advanced-prompt-techniques-for-the-next-level-of-vibe-coding-2dbb</link>
      <guid>https://dev.to/vuhung16au/advanced-prompt-techniques-for-the-next-level-of-vibe-coding-2dbb</guid>
      <description>&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;In modern software engineering, &lt;em&gt;prompt engineering&lt;/em&gt; for AI coding assistants (Copilot, Cursor, etc.) is rapidly maturing. Through three of my open-source projects—&lt;strong&gt;CodeGlow&lt;/strong&gt;, &lt;strong&gt;Marlinga&lt;/strong&gt;, and &lt;strong&gt;PyTorch Mastery&lt;/strong&gt;—I've systematically experimented with advanced prompting strategies. Below, I distill actionable insights, struggles, and specific tactics from real-world GitHub repos, issues, and PRs that can level up your vibe coding workflow.&lt;/p&gt;




&lt;h3&gt;
  
  
  1. Project Contexts &amp;amp; Summaries
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CodeGlow&lt;/strong&gt;: A cross-language code beautifier focused on rapid formatting for Python, Java, and JavaScript, optimized for copying into rich-text editors like MS Word.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Marlinga&lt;/strong&gt;: A fictional Martian colony translator, mixing English and conlang vocabulary, requiring creative prompts to bridge code, story, and UX logic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;PyTorch Mastery&lt;/strong&gt;: A learning journey in deep learning, emphasizing best practices in code modularity, policy enforcement via Copilot, and educational template generation.[1][2]&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  2. Writing Effective Prompts for Copilot &amp;amp; Cursor
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Key Techniques:
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Modular Instruction Policies&lt;/strong&gt; ([[CodeGlow PRs &amp;amp; Issues]]):[3][4][5][6]&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Place &lt;code&gt;.github/copilot-instructions.md&lt;/code&gt; in the root or organize “instruction modules” by topic (e.g., Next.js, RTF export, copy-to-clipboard).&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Tip&lt;/em&gt;: Explicitly state, “Use OOP for all major components,” to bias Copilot towards maintainable class-based patterns.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Prompting for Helper Functions &amp;amp; Templates&lt;/strong&gt; ([PyTorch Mastery PR #61]):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Use helper functions for all non-trivial logic, with type hints and short docstrings.”&lt;/li&gt;
&lt;li&gt;“Prefer &lt;code&gt;@dataclass&lt;/code&gt; configs over ad hoc dicts for parameter management.”&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Standardized Alias and Library Prompts&lt;/strong&gt; ([PyTorch Mastery PR #51]):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Always &lt;code&gt;import torch.nn.functional as F&lt;/code&gt; and reference as such for all activation and loss.”&lt;/li&gt;
&lt;li&gt;Encourage Copilot/Cursor to echo community conventions; e.g., “Apply &lt;code&gt;sns.set_style('darkgrid')&lt;/code&gt; for model visualizations.”&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;User Story-Driven Prompts&lt;/strong&gt; ([CodeGlow User Story Issue]):[7]&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Summarize requirements as user stories, then prompt for modular code matching each story.”&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Instructional Prompts for UI/UX&lt;/strong&gt; ([CodeGlow/Marlinga Issues/PRs]):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Enforce explicit dark/light toggle as a user-selectable setting.”&lt;/li&gt;
&lt;li&gt;“Add footer linking GitHub and LinkedIn—Copilot should not hardcode URLs.”&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
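&lt;p&gt;To make the instruction-file idea concrete, here is a minimal sketch of scaffolding one from the shell. The path matches the convention above; the policy lines are illustrative examples drawn from this article, not the actual CodeGlow policies:&lt;/p&gt;

```shell
# Hypothetical sketch: scaffold a project-level Copilot instruction file.
mkdir -p .github
printf '%s\n' \
  '# Project-wide Copilot policies' \
  '- Use OOP for all major components.' \
  '- Use helper functions for all non-trivial logic, with type hints and docstrings.' \
  '- Prefer @dataclass configs over ad hoc dicts for parameter management.' \
  > .github/copilot-instructions.md

wc -l .github/copilot-instructions.md
```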




&lt;h3&gt;
  
  
  3. Innovative Approaches &amp;amp; Lessons Learned
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Modular Prompt Policies Lead to Modular Code&lt;/strong&gt;:

&lt;ul&gt;
&lt;li&gt;Projects with granular Copilot instructions (split by topic) saw clearer and more reusable code.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Prompting for Validation Infrastructure&lt;/strong&gt; ([PyTorch Mastery]):

&lt;ul&gt;
&lt;li&gt;Writing, “Create &lt;code&gt;validate_code_style.py&lt;/code&gt; to audit for OOP/helper function ratios,” let Copilot generate automation for code review.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Policy Compliance Tracking&lt;/strong&gt;:

&lt;ul&gt;
&lt;li&gt;Prompt-assisted development lifted helper-function compliance to 57%, well above the 30% target [PR #61].&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;




&lt;h3&gt;
  
  
  4. Prompt Effectiveness: Reflections &amp;amp; Struggles
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Prompt Drift Is Real&lt;/strong&gt;:
Without sustained policy files, “verbal” instructions buried in issues/PRs quickly get lost—leading to Copilot code regressions (e.g., GravityScript).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Explicit Examples Drive Accuracy&lt;/strong&gt;:
Adding positive and negative code examples in Copilot instruction files improved reliability.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Manual “Prompt Refactoring” Yields Results&lt;/strong&gt;:
Rewriting instructions to clarify roles (class vs. function, what to refactor, what to leave dynamic) directly impacted code intelligibility and review overhead.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  5. Advanced Tips for Senior Vibe Coders
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Centralize Prompt Policies&lt;/strong&gt;:
Store in &lt;code&gt;.github/copilot-instructions/&lt;/code&gt; and make modular; reference them in onboarding docs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use JSON or XML for Data Definitions&lt;/strong&gt;:
Standardizing on these formats for all prompt-related data empowers Copilot/Cursor to better infer schemas and serialization rules.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Explicit Pipeline Prompts&lt;/strong&gt;:
Rather than vague instructions, dictate, e.g., “After formatting, output in both JSON and XML; update docs automatically.”&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feedback-Driven Prompt Iteration&lt;/strong&gt;:
Routinely collect Copilot feedback via PR comments/issues and bake lessons into future instruction updates.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enforce Documentation via Prompts&lt;/strong&gt;:
“Every function requires a docstring and type hints—reject PRs without both.”&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Scaling “vibe coding” to production and team contexts means you control not only the code—but the &lt;em&gt;instructional context&lt;/em&gt; Copilot and Cursor see. Modularity, clarity, and explicitness in prompts aren’t just for the AI—they’re for your team, your future self, and robust engineering velocity.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;What advanced prompt tactics have leveled up your Copilot experience? Share your own in the comments or fork any of my [featured repos] for more!&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;References &amp;amp; Deep Dives&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;[CodeGlow PR: Copilot instruction policy modularization][3]&lt;/li&gt;
&lt;li&gt;[PyTorch Mastery: OOP and helper function prompt policy][1]&lt;/li&gt;
&lt;li&gt;[hf-transformer-trove: Visualization set-up Copilot policy][2]&lt;/li&gt;
&lt;li&gt;Issues and PRs on Copilot policies, user stories, &amp;amp; UX-driven prompting&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;a href="https://github.com/vuhung16au/pytorch-mastery" rel="noopener noreferrer"&gt;1&lt;/a&gt;, &lt;br&gt;
&lt;a href="https://github.com/vuhung16au/hf-transformer-trove" rel="noopener noreferrer"&gt;2&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/16" rel="noopener noreferrer"&gt;3&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/13" rel="noopener noreferrer"&gt;4&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/8" rel="noopener noreferrer"&gt;5&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/3" rel="noopener noreferrer"&gt;6&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/4" rel="noopener noreferrer"&gt;7&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.to/new"&gt;8&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow" rel="noopener noreferrer"&gt;9&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/Marlinga" rel="noopener noreferrer"&gt;10&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/pytorch-mastery" rel="noopener noreferrer"&gt;11&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/26" rel="noopener noreferrer"&gt;12&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/24" rel="noopener noreferrer"&gt;13&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/22" rel="noopener noreferrer"&gt;14&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/20" rel="noopener noreferrer"&gt;15&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/11" rel="noopener noreferrer"&gt;16&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/10" rel="noopener noreferrer"&gt;17&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/9" rel="noopener noreferrer"&gt;18&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/7" rel="noopener noreferrer"&gt;19&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/6" rel="noopener noreferrer"&gt;20&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/5" rel="noopener noreferrer"&gt;21&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/issues/2" rel="noopener noreferrer"&gt;22&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/27" rel="noopener noreferrer"&gt;23&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/25" rel="noopener noreferrer"&gt;24&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/23" rel="noopener noreferrer"&gt;25&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/21" rel="noopener noreferrer"&gt;26&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/19" rel="noopener noreferrer"&gt;27&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/18" rel="noopener noreferrer"&gt;28&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/17" rel="noopener noreferrer"&gt;29&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/pull/12" rel="noopener noreferrer"&gt;30&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/commits/main/README.md" rel="noopener noreferrer"&gt;31&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/commits?author=Copilot" rel="noopener noreferrer"&gt;32&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/commits?author=vuhung16au" rel="noopener noreferrer"&gt;33&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/CodeGlow/commit/1315c79f09f17fef08fa77e3cd7a246300b2d9af" rel="noopener noreferrer"&gt;34&lt;/a&gt;&lt;br&gt;
&lt;a href="https://github.com/vuhung16au/GravityScript/tree/main/.github" rel="noopener noreferrer"&gt;35&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>opensource</category>
      <category>productivity</category>
    </item>
    <item>
      <title>FFmpeg Tutorial: Creating Professional Videos with Advanced Filtering and GPU Acceleration</title>
      <dc:creator>Vu Hung Nguyen (Hưng)</dc:creator>
      <pubDate>Wed, 01 Oct 2025 12:21:37 +0000</pubDate>
      <link>https://dev.to/vuhung16au/ffmpeg-tutorial-creating-professional-videos-with-advanced-filtering-and-gpu-acceleration-3h6</link>
      <guid>https://dev.to/vuhung16au/ffmpeg-tutorial-creating-professional-videos-with-advanced-filtering-and-gpu-acceleration-3h6</guid>
      <description>&lt;h1&gt;
  
  
  &lt;strong&gt;FFmpeg Tutorial: Creating Professional Videos with Advanced Filtering and GPU Acceleration&lt;/strong&gt;
&lt;/h1&gt;


&lt;p&gt;&lt;em&gt;Master the art of video processing with FFmpeg: from basic operations to complex video looping, GPU acceleration, and automated video creation pipelines.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Table of Contents&lt;/strong&gt;
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;Getting Started with FFmpeg&lt;/li&gt;
&lt;li&gt;Basic Video Operations&lt;/li&gt;
&lt;li&gt;Advanced Video Filtering&lt;/li&gt;
&lt;li&gt;GPU Acceleration on Mac Silicon&lt;/li&gt;
&lt;li&gt;Creating Looped Videos&lt;/li&gt;
&lt;li&gt;Video Processing Pipeline&lt;/li&gt;
&lt;li&gt;Real-World Examples&lt;/li&gt;
&lt;li&gt;Best Practices&lt;/li&gt;
&lt;li&gt;Troubleshooting&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Introduction&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;FFmpeg is the Swiss Army knife of video processing: a powerful, open-source multimedia framework that can handle virtually any video or audio format. Whether you're creating YouTube content, processing surveillance footage, or building automated video pipelines, FFmpeg provides the tools you need to manipulate media files with precision and efficiency.&lt;/p&gt;

&lt;p&gt;In this comprehensive tutorial, we'll explore advanced FFmpeg techniques used in real-world video production pipelines, including GPU acceleration, complex filtering, and automated video creation workflows.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Getting Started with FFmpeg&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Installation on macOS&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;For Mac users, especially those with Apple Silicon (M1/M2/M3), FFmpeg can be installed via Homebrew:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install FFmpeg with hardware acceleration support&lt;/span&gt;
brew &lt;span class="nb"&gt;install &lt;/span&gt;ffmpeg

&lt;span class="c"&gt;# Verify installation and check available encoders&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-encoders&lt;/span&gt; | &lt;span class="nb"&gt;grep &lt;/span&gt;videotoolbox

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Basic Command Structure&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FFmpeg commands follow this general pattern:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ffmpeg &lt;span class="o"&gt;[&lt;/span&gt;global options] &lt;span class="nt"&gt;-i&lt;/span&gt; input_file &lt;span class="o"&gt;[&lt;/span&gt;input options] &lt;span class="o"&gt;[&lt;/span&gt;output options] output_file

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Basic Video Operations&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;1. Video Format Conversion&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Convert MP4 to AVI&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 output.avi

&lt;span class="c"&gt;# Convert with specific codec&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v libx264 &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac output.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;2. Video Resizing&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Resize to specific dimensions&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="nv"&gt;scale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1920:1080 output.mp4

&lt;span class="c"&gt;# Resize maintaining aspect ratio&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="nv"&gt;scale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1920:-1 output.mp4

&lt;span class="c"&gt;# Resize with specific resolution presets&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="nv"&gt;scale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1280:720 output_hd.mp4  &lt;span class="c"&gt;# HD&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="nv"&gt;scale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1920:1080 output_fhd.mp4 &lt;span class="c"&gt;# Full HD&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="nv"&gt;scale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;640:480 output_sd.mp4    &lt;span class="c"&gt;# SD&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;3. Removing Audio from Video&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Remove audio track&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v copy &lt;span class="nt"&gt;-an&lt;/span&gt; output_no_audio.mp4

&lt;span class="c"&gt;# Extract audio only&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-vn&lt;/span&gt; &lt;span class="nt"&gt;-c&lt;/span&gt;:a copy output_audio.aac

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;4. Trimming Videos&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Trim to first 4 seconds&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-t&lt;/span&gt; 4 output_trimmed.mp4

&lt;span class="c"&gt;# Trim from 10 seconds to 20 seconds&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-ss&lt;/span&gt; 10 &lt;span class="nt"&gt;-t&lt;/span&gt; 10 output_trimmed.mp4

&lt;span class="c"&gt;# Trim from specific time to end&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-ss&lt;/span&gt; 00:01:30 output_from_1min30sec.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Advanced Video Filtering&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;1. Complex Filter Graphs&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;FFmpeg's &lt;code&gt;filter_complex&lt;/code&gt; allows you to chain multiple operations:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Scale and reverse video&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"[0:v]scale=1920:1080,reverse[v]"&lt;/span&gt; &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v]"&lt;/span&gt; output.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;2. Video Splitting and Concatenation&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Split video into two streams, reverse one, then concatenate&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
  [0:v]split=2[v1][v2];
  [v2]reverse[v2_rev];
  [v1][v2_rev]concat=n=2:v=1:a=0[v_cycle]
"&lt;/span&gt; &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v_cycle]"&lt;/span&gt; output_forward_reverse.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;3. Logo Overlay&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Add logo to bottom-left corner&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; video.mp4 &lt;span class="nt"&gt;-i&lt;/span&gt; logo.png &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
  [0:v][1:v]overlay=10:H-h-10
"&lt;/span&gt; output_with_logo.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;GPU Acceleration on Mac Silicon&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;One of the most significant advantages of modern Macs is the built-in hardware acceleration. Using GPU acceleration can provide &lt;strong&gt;up to 8.5x faster encoding&lt;/strong&gt; compared to CPU-only processing.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Enabling Hardware Acceleration&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Use Apple's hardware encoder for H.264&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac output.mp4

&lt;span class="c"&gt;# Use hardware encoder for HEVC/H.265&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v hevc_videotoolbox &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac output.mp4

&lt;span class="c"&gt;# With quality settings&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-q&lt;/span&gt;:v 50 &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac output.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Performance Comparison&lt;/strong&gt;
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;strong&gt;Method&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Encoding Speed&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;CPU Usage&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Quality&lt;/strong&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Software (libx264)&lt;/td&gt;
&lt;td&gt;1x&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;td&gt;Excellent&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hardware (videotoolbox)&lt;/td&gt;
&lt;td&gt;8.5x&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;Good&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
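&lt;p&gt;These figures come from my own runs; a rough way to check the gap on your machine is to time both encoders against the same synthetic clip (results vary with content, settings, and hardware):&lt;/p&gt;

```shell
# Rough benchmark sketch: same encode, software vs. hardware H.264.
# h264_videotoolbox exists only on macOS; timings vary per machine.
if command -v ffmpeg >/dev/null; then
  # Generate a short synthetic clip so the comparison is self-contained
  ffmpeg -y -f lavfi -i testsrc=duration=5:size=1280x720:rate=30 sample.mp4
  time ffmpeg -y -i sample.mp4 -c:v libx264 -an sw.mp4
  time ffmpeg -y -i sample.mp4 -c:v h264_videotoolbox -an hw.mp4 ||
    echo "h264_videotoolbox unavailable on this platform"
else
  echo "ffmpeg not installed; skipping benchmark"
fi
```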

&lt;h2&gt;
  
  
  &lt;strong&gt;Creating Looped Videos&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Creating seamless looped videos is essential for background content, music videos, and ambient videos. Here are several approaches:&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;1. Simple Video Looping&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Loop video 5 times&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-stream_loop&lt;/span&gt; 5 &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt; copy output_looped.mp4

&lt;span class="c"&gt;# Infinite loop (until audio ends)&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-stream_loop&lt;/span&gt; &lt;span class="nt"&gt;-1&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-i&lt;/span&gt; audio.mp3 &lt;span class="nt"&gt;-c&lt;/span&gt; copy &lt;span class="nt"&gt;-shortest&lt;/span&gt; output.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;2. Forward-Reverse Looping&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This creates a seamless loop by playing the video forward, then in reverse:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Method 1: Single command with complex filter&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
  [0:v]split=2[v1][v2];
  [v2]reverse[v2_rev];
  [v1][v2_rev]concat=n=2:v=1:a=0[v_cycle];
  [v_cycle]loop=loop=-1:size=1:start=0[v_looped]
"&lt;/span&gt; &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v_looped]"&lt;/span&gt; output_seamless_loop.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;3. Two-Step Approach (More Reliable)&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Step 1: Create forward-reverse cycle&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
  [0:v]split=2[v1][v2];
  [v2]reverse[v2_rev];
  [v1][v2_rev]concat=n=2:v=1:a=0[v_cycle]
"&lt;/span&gt; &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v_cycle]"&lt;/span&gt; &lt;span class="nt"&gt;-t&lt;/span&gt; 16 video_cycle.mp4

&lt;span class="c"&gt;# Step 2: Loop the cycle to match audio duration&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-stream_loop&lt;/span&gt; &lt;span class="nt"&gt;-1&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; video_cycle.mp4 &lt;span class="nt"&gt;-i&lt;/span&gt; audio.mp3 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="nt"&gt;-shortest&lt;/span&gt; output.mp4

&lt;span class="c"&gt;# Clean up&lt;/span&gt;
&lt;span class="nb"&gt;rm &lt;/span&gt;video_cycle.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Video Processing Pipeline&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Automated Video Creation Workflow&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Here's a complete pipeline for creating professional videos:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
&lt;span class="c"&gt;# Complete video processing pipeline&lt;/span&gt;

&lt;span class="c"&gt;# Step 1: Prepare video (remove sound, resize, add logo)&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="s2"&gt;"scale=1920:1080,overlay=10:H-h-10:enable='between(t,0,inf)'"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; logo.png &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-an&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  video_prepared.mp4

&lt;span class="c"&gt;# Step 2: Create seamless loop&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; video_prepared.mp4 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
    [0:v]split=2[v1][v2];
    [v2]reverse[v2_rev];
    [v1][v2_rev]concat=n=2:v=1:a=0[v_cycle]
  "&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v_cycle]"&lt;/span&gt; &lt;span class="nt"&gt;-t&lt;/span&gt; 16 video_cycle.mp4

&lt;span class="c"&gt;# Step 3: Combine with audio&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;-stream_loop&lt;/span&gt; &lt;span class="nt"&gt;-1&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; video_cycle.mp4 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; concatenated_audio.mp3 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="nt"&gt;-shortest&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  final_output.mp4

&lt;span class="c"&gt;# Cleanup&lt;/span&gt;
&lt;span class="nb"&gt;rm &lt;/span&gt;video_cycle.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Makefile Integration&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight make"&gt;&lt;code&gt;&lt;span class="c"&gt;# Makefile for automated video processing
&lt;/span&gt;&lt;span class="nv"&gt;OUTPUT_NAME&lt;/span&gt; &lt;span class="o"&gt;?=&lt;/span&gt; my-video
&lt;span class="nv"&gt;VIDEO_RESOLUTION&lt;/span&gt; &lt;span class="o"&gt;?=&lt;/span&gt; HD

&lt;span class="nl"&gt;all&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;$(OUTPUT_NAME).mp4&lt;/span&gt;

&lt;span class="nl"&gt;$(OUTPUT_NAME).mp4&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;video_prepared.mp4 audio_files/&lt;/span&gt;
    &lt;span class="p"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Creating final video..."&lt;/span&gt;
    &lt;span class="p"&gt;@&lt;/span&gt;./combine_video_audio.sh &lt;span class="nt"&gt;--output-name&lt;/span&gt; &lt;span class="p"&gt;$(&lt;/span&gt;OUTPUT_NAME&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nl"&gt;video_prepared.mp4&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;input.mp4&lt;/span&gt;
    &lt;span class="p"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Preparing video..."&lt;/span&gt;
    &lt;span class="p"&gt;@&lt;/span&gt;./prepare_video.sh &lt;span class="nt"&gt;--resolution&lt;/span&gt; &lt;span class="p"&gt;$(&lt;/span&gt;VIDEO_RESOLUTION&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nl"&gt;clean&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt;
    &lt;span class="p"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;-f&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt;.mp4 &lt;span class="k"&gt;*&lt;/span&gt;.mp3
    &lt;span class="p"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Cleanup completed"&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Real-World Examples&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Example 1: Waterfall Background Video&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Create a 2.5-hour waterfall video with music&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s2"&gt;"Ban_Gioc_Waterfall.mp4"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s2"&gt;"concatenated_audio.mp3"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-filter_complex&lt;/span&gt; &lt;span class="s2"&gt;"
    [0:v]scale=1920:1080,setpts=PTS-STARTPTS[video_scaled];
    [video_scaled]split=2[v1][v2];
    [v2]reverse[v2_rev];
    [v1][v2_rev]concat=n=2:v=1:a=0[v_cycle];
    [v_cycle]loop=loop=-1:size=1:start=0[v_looped]
  "&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-map&lt;/span&gt; &lt;span class="s2"&gt;"[v_looped]"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-map&lt;/span&gt; 1:a &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-shortest&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-r&lt;/span&gt; 24 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="s2"&gt;"waterfall_music_video.mp4"&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Example 2: Batch Video Processing&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;
&lt;span class="c"&gt;# Process multiple videos with GPU acceleration&lt;/span&gt;

&lt;span class="k"&gt;for &lt;/span&gt;video &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt;.mp4&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;do
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Processing &lt;/span&gt;&lt;span class="nv"&gt;$video&lt;/span&gt;&lt;span class="s2"&gt;..."&lt;/span&gt;
    ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$video&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="nt"&gt;-vf&lt;/span&gt; &lt;span class="s2"&gt;"scale=1920:1080,crop=1920:1080:0:0"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="nt"&gt;-t&lt;/span&gt; 4 &lt;span class="se"&gt;\&lt;/span&gt;
      &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;video&lt;/span&gt;&lt;span class="p"&gt;%.*&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;_processed.mp4"&lt;/span&gt;
&lt;span class="k"&gt;done&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Example 3: YouTube-Optimized Video&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Create YouTube-optimized video with chapters&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; video.mp4 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-i&lt;/span&gt; audio.mp3 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;:a aac &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-b&lt;/span&gt;:a 128k &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-movflags&lt;/span&gt; +faststart &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-metadata&lt;/span&gt; &lt;span class="nv"&gt;title&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"My Video Title"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-metadata&lt;/span&gt; &lt;span class="nv"&gt;description&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"Video description"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  youtube_ready.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  &lt;strong&gt;Best Practices&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;1. Performance Optimization&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Always use GPU acceleration&lt;/strong&gt; when available (&lt;code&gt;h264_videotoolbox&lt;/code&gt; on macOS)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use appropriate quality settings&lt;/strong&gt; (&lt;code&gt;-q:v 50&lt;/code&gt; gives a good quality/size balance)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Process very large files in chunks&lt;/strong&gt; to keep memory use bounded&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use &lt;code&gt;-shortest&lt;/code&gt;&lt;/strong&gt; when combining video and audio of different lengths&lt;/li&gt;
&lt;/ul&gt;
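&lt;p&gt;A portable wrapper can apply the first recommendation automatically: probe the local FFmpeg build for &lt;code&gt;h264_videotoolbox&lt;/code&gt; and fall back to software &lt;code&gt;libx264&lt;/code&gt; elsewhere. A minimal sketch (the &lt;code&gt;pick_encoder&lt;/code&gt; helper is illustrative, not part of the original pipeline):&lt;/p&gt;

```shell
#!/bin/bash
# Choose a hardware encoder when the local FFmpeg build supports it,
# otherwise fall back to software x264. The encoder list is passed in
# as a string so the function can be exercised without running ffmpeg.
pick_encoder() {
    local available="$1"   # e.g. the output of: ffmpeg -hide_banner -encoders
    if printf '%s\n' "$available" | grep -q 'h264_videotoolbox'; then
        echo "h264_videotoolbox"
    else
        echo "libx264"
    fi
}

# Typical usage:
# ENCODER=$(pick_encoder "$(ffmpeg -hide_banner -encoders)")
# ffmpeg -y -i input.mp4 -c:v "$ENCODER" -c:a aac output.mp4
```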

&lt;h3&gt;
  
  
  &lt;strong&gt;2. Quality Settings&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# High quality (larger file)&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-q&lt;/span&gt;:v 20 output.mp4

&lt;span class="c"&gt;# Balanced quality/size&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-q&lt;/span&gt;:v 50 output.mp4

&lt;span class="c"&gt;# Smaller file size&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-c&lt;/span&gt;:v h264_videotoolbox &lt;span class="nt"&gt;-q&lt;/span&gt;:v 60 output.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;3. Error Handling&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Always use -y to overwrite output files&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 output.mp4

&lt;span class="c"&gt;# Check for errors in scripts&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="nv"&gt;$?&lt;/span&gt; &lt;span class="nt"&gt;-ne&lt;/span&gt; 0 &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"FFmpeg failed!"&lt;/span&gt;
    &lt;span class="nb"&gt;exit &lt;/span&gt;1
&lt;span class="k"&gt;fi&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
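&lt;p&gt;An alternative to checking &lt;code&gt;$?&lt;/code&gt; after the fact is to fail fast with &lt;code&gt;set -euo pipefail&lt;/code&gt; and label each step so the failing command is obvious. A sketch along those lines (&lt;code&gt;run_step&lt;/code&gt; is an illustrative helper name):&lt;/p&gt;

```shell
#!/bin/bash
# Fail fast: any failing command (including inside pipes) aborts the script.
set -euo pipefail

# Run one pipeline step and report which step failed before exiting.
run_step() {
    local label="$1"; shift
    if ! "$@"; then
        echo "Step failed: $label" >&2
        return 1
    fi
}

# Typical usage (replace with real ffmpeg invocations):
# run_step "prepare video" ffmpeg -y -i input.mp4 video_prepared.mp4
# run_step "mux audio"     ffmpeg -y -i video_prepared.mp4 -i audio.mp3 -shortest out.mp4
```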



&lt;h3&gt;
  
  
  &lt;strong&gt;4. Memory Management&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# For large files, process in segments&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-ss&lt;/span&gt; 00:00:00 &lt;span class="nt"&gt;-t&lt;/span&gt; 00:05:00 &lt;span class="nt"&gt;-c&lt;/span&gt; copy segment1.mp4
ffmpeg &lt;span class="nt"&gt;-i&lt;/span&gt; input.mp4 &lt;span class="nt"&gt;-ss&lt;/span&gt; 00:05:00 &lt;span class="nt"&gt;-t&lt;/span&gt; 00:05:00 &lt;span class="nt"&gt;-c&lt;/span&gt; copy segment2.mp4

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
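&lt;p&gt;Rather than hand-writing one command per segment, the offsets can be generated in a loop. A sketch that only prints the commands, so the arithmetic is easy to verify before running anything (&lt;code&gt;sec_to_ts&lt;/code&gt; and &lt;code&gt;segment_cmds&lt;/code&gt; are illustrative names):&lt;/p&gt;

```shell
#!/bin/bash
# Split a long input into fixed-length segments without re-encoding.
# sec_to_ts converts integer seconds into the HH:MM:SS form ffmpeg expects.
sec_to_ts() {
    printf '%02d:%02d:%02d' $(($1 / 3600)) $(($1 % 3600 / 60)) $(($1 % 60))
}

# Print one "ffmpeg ... -c copy" command per chunk of the given length.
segment_cmds() {
    local total="$1" chunk="$2" start=0 i=1
    while [ "$start" -lt "$total" ]; do
        echo "ffmpeg -y -i input.mp4 -ss $(sec_to_ts "$start") -t $(sec_to_ts "$chunk") -c copy segment$i.mp4"
        start=$((start + chunk))
        i=$((i + 1))
    done
}

# segment_cmds 900 300 prints the three 5-minute segment commands.
```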



&lt;h2&gt;
  
  
  &lt;strong&gt;Troubleshooting&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Common Issues and Solutions&lt;/strong&gt;
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;"No such file or directory"&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Check file paths and permissions&lt;/li&gt;
&lt;li&gt;Ensure input files exist&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;"Codec not found"&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Install FFmpeg with full codec support&lt;/li&gt;
&lt;li&gt;Use &lt;code&gt;brew install ffmpeg&lt;/code&gt; on macOS&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;High CPU usage&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Enable GPU acceleration with &lt;code&gt;h264_videotoolbox&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;With software encoders such as &lt;code&gt;libx264&lt;/code&gt;, use &lt;code&gt;-preset fast&lt;/code&gt; for faster encoding&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio/video sync issues&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Use the &lt;code&gt;-shortest&lt;/code&gt; flag when combining streams&lt;/li&gt;
&lt;li&gt;Check that frame rates match, forcing one with the &lt;code&gt;-r&lt;/code&gt; parameter if needed&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Out of memory errors&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Process files in smaller chunks&lt;/li&gt;
&lt;li&gt;Use the &lt;code&gt;-threads&lt;/code&gt; parameter to limit CPU usage by capping worker threads&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
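&lt;p&gt;For the CPU and memory items above, one practical default is to cap encoder threads at half the available cores, leaving headroom for the rest of the system. A sketch using the POSIX &lt;code&gt;getconf&lt;/code&gt; interface (the half-the-cores heuristic is a suggestion, not a rule from the original pipeline):&lt;/p&gt;

```shell
#!/bin/bash
# Cap FFmpeg's worker threads at half the available cores so long encodes
# don't starve the rest of the system. getconf is POSIX and works on both
# macOS and Linux; fall back to 1 if the query fails.
cores=$(getconf _NPROCESSORS_ONLN 2>/dev/null || echo 1)
threads=$((cores / 2))
if [ "$threads" -lt 1 ]; then threads=1; fi
echo "Using $threads thread(s)"

# Typical usage:
# ffmpeg -y -threads "$threads" -i input.mp4 -c:v libx264 -preset fast output.mp4
```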

&lt;h3&gt;
  
  
  &lt;strong&gt;Debugging Commands&lt;/strong&gt;
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Check video information&lt;/span&gt;
ffprobe &lt;span class="nt"&gt;-v&lt;/span&gt; quiet &lt;span class="nt"&gt;-print_format&lt;/span&gt; json &lt;span class="nt"&gt;-show_format&lt;/span&gt; &lt;span class="nt"&gt;-show_streams&lt;/span&gt; input.mp4

&lt;span class="c"&gt;# Test encoding without output&lt;/span&gt;
ffmpeg &lt;span class="nt"&gt;-f&lt;/span&gt; lavfi &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="nv"&gt;testsrc&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;duration&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;1:size&lt;span class="o"&gt;=&lt;/span&gt;1920x1080:rate&lt;span class="o"&gt;=&lt;/span&gt;30 &lt;span class="nt"&gt;-f&lt;/span&gt; null -

&lt;span class="c"&gt;# Monitor system resources&lt;/span&gt;
top &lt;span class="nt"&gt;-pid&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;pgrep ffmpeg&lt;span class="si"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
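&lt;p&gt;The &lt;code&gt;ffprobe&lt;/code&gt; JSON shown above can also be consumed in scripts without &lt;code&gt;jq&lt;/code&gt;; a &lt;code&gt;sed&lt;/code&gt; expression is enough to pull out a single field such as the duration. A sketch with a hypothetical sample string standing in for real &lt;code&gt;ffprobe&lt;/code&gt; output:&lt;/p&gt;

```shell
#!/bin/bash
# Extract the "duration" field from ffprobe's JSON output using only sed,
# so the check works on machines without jq installed.
extract_duration() {
    sed -n 's/.*"duration": *"\([0-9.]*\)".*/\1/p' | head -1
}

# Hypothetical sample; in practice pipe in the output of:
#   ffprobe -v quiet -print_format json -show_format input.mp4
sample='{ "format": { "filename": "input.mp4", "duration": "9015.34" } }'
printf '%s\n' "$sample" | extract_duration
```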



&lt;h2&gt;
  
  
  &lt;strong&gt;Conclusion&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;FFmpeg is an incredibly powerful tool for video processing, and with the right techniques, you can create professional-quality videos efficiently. The key is to:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Leverage hardware acceleration&lt;/strong&gt; whenever possible&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use complex filters&lt;/strong&gt; for advanced effects&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automate repetitive tasks&lt;/strong&gt; with scripts and Makefiles&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Optimize for your target platform&lt;/strong&gt; (YouTube, social media, etc.)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Whether you're creating ambient videos, processing surveillance footage, or building automated video pipelines, FFmpeg provides the flexibility and power you need to achieve professional results.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Additional Resources&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://ffmpeg.org/documentation.html" rel="noopener noreferrer"&gt;FFmpeg Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://ffmpeg.org/ffmpeg-filters.html" rel="noopener noreferrer"&gt;FFmpeg Filters&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developer.apple.com/documentation/videotoolbox" rel="noopener noreferrer"&gt;Apple VideoToolbox Documentation&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;This tutorial is based on real-world video processing pipelines used in production environments. The examples and techniques shown here have been tested and optimized for performance and quality.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ffmpeg</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
