<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kunal Chakraborty</title>
    <description>The latest articles on DEV Community by Kunal Chakraborty (@kunal_chakraborty_6a1ab69).</description>
    <link>https://dev.to/kunal_chakraborty_6a1ab69</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3734034%2Fe4dc4cc9-b244-4d1a-b7c8-12a7e78ba25c.jpeg</url>
      <title>DEV Community: Kunal Chakraborty</title>
      <link>https://dev.to/kunal_chakraborty_6a1ab69</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kunal_chakraborty_6a1ab69"/>
    <language>en</language>
    <item>
      <title>From Idea to macOS-Style Portfolio: React, GSAP Animations &amp; Higher-Order Components</title>
      <dc:creator>Kunal Chakraborty</dc:creator>
      <pubDate>Tue, 17 Mar 2026 03:35:57 +0000</pubDate>
      <link>https://dev.to/kunal_chakraborty_6a1ab69/from-idea-to-macos-style-portfolio-react-gsap-animations-higher-order-components-3p29</link>
      <guid>https://dev.to/kunal_chakraborty_6a1ab69/from-idea-to-macos-style-portfolio-react-gsap-animations-higher-order-components-3p29</guid>
      <description>&lt;p&gt;For this project, I set out to build a macOS-inspired portfolio using React, Tailwind CSS, and GSAP. At first, the goal was simple: create something visually impressive. But as the project evolved, it became much more than that. It turned into a deep exploration of animation systems, reusable architecture, and professional development workflows.&lt;/p&gt;

&lt;p&gt;The final result isn’t just a portfolio—it behaves like an application, complete with draggable windows, smooth transitions, and a responsive Dock that mimics macOS interactions.&lt;/p&gt;

&lt;p&gt;⚡ The Stack That Made It Possible&lt;/p&gt;

&lt;p&gt;React acted as the backbone of the application. Instead of treating the UI as static pages, I approached it like an operating system. Each “app” is essentially a window with its own lifecycle—open, close, minimize, and focus. Managing which window stays on top required careful state handling, especially when dealing with z-index and user interactions.&lt;/p&gt;
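&lt;p&gt;To make that focus handling concrete, here is a minimal sketch of the kind of store this implies (plain JavaScript with hypothetical names; the actual project exposes this behavior through a React store called useWindowStore):&lt;/p&gt;

```javascript
// Minimal sketch of per-window state: each window tracks isOpen and zIndex,
// and focusing a window bumps it above everything else.
function createWindowStore(keys) {
  const windows = {};
  keys.forEach((key, i) => {
    windows[key] = { isOpen: false, zIndex: i + 1 };
  });
  let topZ = keys.length;

  return {
    windows,
    openWindow(key) {
      windows[key].isOpen = true;
      topZ += 1;
      windows[key].zIndex = topZ; // newly opened window comes to the front
    },
    closeWindow(key) {
      windows[key].isOpen = false;
    },
    focusWindow(key) {
      topZ += 1;
      windows[key].zIndex = topZ; // clicked window jumps above the rest
    },
  };
}

const store = createWindowStore(["notes", "terminal"]);
store.openWindow("notes");
store.openWindow("terminal");
store.focusWindow("notes"); // "notes" now has the highest zIndex
```

&lt;p&gt;A monotonically increasing counter avoids re-sorting every window on each click: only the focused window's zIndex ever changes.&lt;/p&gt;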

&lt;p&gt;For styling, Tailwind CSS made it surprisingly easy to recreate macOS’s signature glassmorphism. Using utilities like backdrop blur and semi-transparent backgrounds, I was able to achieve the frosted-glass effect without writing heavy custom CSS. This allowed me to focus more on behavior rather than styling complexity.&lt;/p&gt;

&lt;p&gt;GSAP is where everything came to life. While CSS animations can handle basic transitions, they fall short when precision and performance are critical. GSAP provided the control needed to build fluid, responsive animations—especially for interactions like window transitions and Dock behavior.&lt;/p&gt;

&lt;p&gt;🖱️ Recreating the macOS Dock Experience&lt;/p&gt;

&lt;p&gt;One of the most satisfying parts of this project was implementing the Dock magnification effect. It’s a subtle detail, but it adds a lot to the overall experience.&lt;/p&gt;

&lt;p&gt;The logic is based on proximity. As the cursor moves closer to an icon, the icon scales up smoothly. This required calculating the distance between the cursor and each icon, then mapping that distance to a scale value. GSAP made it easy to interpolate these values and animate them efficiently.&lt;/p&gt;

&lt;p&gt;What seems like a small UI detail actually required careful tuning to feel natural. Too fast, and it feels jittery. Too slow, and it feels unresponsive. Getting this right made a huge difference in the final polish.&lt;/p&gt;
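&lt;p&gt;The proximity-to-scale mapping described above can be sketched as a small pure function (hypothetical names and tuning values; the real project animates the result with GSAP):&lt;/p&gt;

```javascript
// Map the cursor's distance from an icon to a scale factor.
// Icons farther than maxDistance stay at scale 1; closer icons
// grow linearly toward maxScale.
function scaleForDistance(distance, maxDistance = 150, maxScale = 1.6) {
  if (distance >= maxDistance) return 1; // out of range: no magnification
  const closeness = 1 - distance / maxDistance; // 0 (far) .. 1 (right on top)
  return 1 + (maxScale - 1) * closeness;
}

// On each mousemove, something like
//   gsap.to(icon, { scale: scaleForDistance(d), duration: 0.2 });
// lets GSAP smooth out the jumps between frames.
```

&lt;p&gt;The tween duration is the tuning knob mentioned above: shorter feels jittery, longer feels unresponsive.&lt;/p&gt;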

&lt;p&gt;🏗️ Scaling the Window System with a Higher-Order Component&lt;/p&gt;

&lt;p&gt;As the project grew, managing each window individually started to feel repetitive. Every window needed the same core behavior—animations, drag functionality, focus handling, and layering.&lt;/p&gt;

&lt;p&gt;Rather than duplicating this logic across multiple components, I introduced a Higher-Order Component (HOC) called WindowWrapper.&lt;/p&gt;

&lt;p&gt;This abstraction allowed me to treat every app window as a plug-and-play component while centralizing all the complex behavior in one place.&lt;/p&gt;

&lt;p&gt;💡 The Idea Behind WindowWrapper&lt;/p&gt;

&lt;p&gt;The goal was simple:&lt;/p&gt;

&lt;p&gt;Wrap any component and instantly give it “macOS window behavior.”&lt;/p&gt;

&lt;p&gt;This includes:&lt;/p&gt;

&lt;p&gt;Open/close animations&lt;/p&gt;

&lt;p&gt;Dragging with GSAP Draggable&lt;/p&gt;

&lt;p&gt;Focus management (z-index handling)&lt;/p&gt;

&lt;p&gt;Conditional rendering based on state&lt;/p&gt;

&lt;p&gt;🧠 The Implementation&lt;/p&gt;

&lt;p&gt;Here’s a simplified version of how it works:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="s2"&gt;`const WindowWrapper = (Component, windowKey) =&amp;gt; {
    const Wrapped = (props) =&amp;gt; {
        const { focusWindow, windows } = useWindowStore();
        const { isOpen, zIndex } = windows[windowKey];
        const ref = useRef(null);

        // Animation when opening
        useGSAP(() =&amp;gt; {
            const el = ref.current;
            if (!el || !isOpen) return;

            el.style.display = "block";

            gsap.fromTo(
                el,
                { scale: 0.8, opacity: 0, y: 40 },
                { scale: 1, opacity: 1, y: 0, duration: 0.1, ease: "power4.out" }
            );
        }, [isOpen]);

        // Draggable behavior
        useGSAP(() =&amp;gt; {
            const el = ref.current;
            if (!el) return;

            const header = el.querySelector('#window-header');

            const [instance] = Draggable.create(el, {
                trigger: header || el,
                onPress: () =&amp;gt; focusWindow(windowKey)
            });

            return () =&amp;gt; instance.kill();
        }, []);

        // Show/hide window
        useLayoutEffect(() =&amp;gt; {
            const el = ref.current;
            if (!el) return;
            el.style.display = isOpen ? "block" : "none";
        }, [isOpen]);

        return (
            &amp;lt;section
                id={windowKey}
                ref={ref}
                style={{ zIndex }}
                className="absolute"
            &amp;gt;
                &amp;lt;Component {...props} /&amp;gt;
            &amp;lt;/section&amp;gt;
        );
    };

    return Wrapped;
};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;⚙️ Why This Approach Works&lt;/p&gt;

&lt;p&gt;This pattern solved multiple problems at once.&lt;/p&gt;

&lt;p&gt;Instead of tightly coupling UI logic with individual components, I separated behavior from presentation. Each app component now focuses only on its UI, while the wrapper handles system-level responsibilities.&lt;/p&gt;

&lt;p&gt;It also made the project incredibly scalable. Adding a new “app” is as simple as wrapping it:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;const NotesAppWindow = WindowWrapper(NotesApp, "notes");&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;No extra logic. No duplication. Just plug and play.&lt;/p&gt;

&lt;p&gt;⚙️ Thinking Beyond Code: Professional Workflow&lt;/p&gt;

&lt;p&gt;At some point, I realized that writing good UI code wasn’t enough. Managing the project properly was just as important.&lt;/p&gt;

&lt;p&gt;I moved away from pushing changes directly to the main branch and started using a feature-branch workflow. Every new feature—whether it was a clock in the menu bar or a new draggable window—was developed in its own branch and merged through a pull request. This gave me a chance to review my own code, clean things up, and avoid breaking existing features.&lt;/p&gt;

&lt;p&gt;Versioning also became important as the project evolved. I followed semantic versioning to keep track of changes in a structured way. Major updates marked significant milestones, minor versions introduced new features, and patches handled bug fixes. This added clarity and made the project feel more like a real product.&lt;/p&gt;

&lt;p&gt;To streamline deployment, I set up a CI/CD pipeline using GitHub Actions. Now, whenever a pull request is merged into the main branch, the project automatically builds and deploys to Netlify. If something breaks, the deployment fails, preventing unstable code from going live. This automation removed a lot of manual effort and made the workflow much more reliable.&lt;br&gt;
Live link: &lt;a href="https://kunal-dev-portfolio-six.vercel.app" rel="noopener noreferrer"&gt;https://kunal-dev-portfolio-six.vercel.app&lt;/a&gt;&lt;br&gt;
GitHub link: &lt;a href="https://github.com/KunalChakraborty445" rel="noopener noreferrer"&gt;https://github.com/KunalChakraborty445&lt;/a&gt;&lt;/p&gt;

</description>
      <category>frontend</category>
      <category>react</category>
      <category>showdev</category>
      <category>ui</category>
    </item>
    <item>
      <title>Building an Emotion-Based Music Recommendation System: When AI Meets Your Mood</title>
      <dc:creator>Kunal Chakraborty</dc:creator>
      <pubDate>Tue, 27 Jan 2026 11:43:48 +0000</pubDate>
      <link>https://dev.to/kunal_chakraborty_6a1ab69/-building-an-emotion-based-music-recommendation-system-when-ai-meets-your-mood-1ce3</link>
      <guid>https://dev.to/kunal_chakraborty_6a1ab69/-building-an-emotion-based-music-recommendation-system-when-ai-meets-your-mood-1ce3</guid>
      <description>&lt;p&gt;Creating a music player is straightforward; building an AI that reads your face and curates the perfect playlist is a whole different challenge. For my latest project, I ventured into the fascinating intersection of computer vision and music technology, creating a system that analyzes your facial expressions and recommends Spotify tracks to match your mood.&lt;/p&gt;

&lt;p&gt;While the sleek interface and instant recommendations are what users experience, the real magic happens behind the scenes—training emotion detection models, integrating with Spotify's API, and building a Flask backend that ties it all together. Here's how I brought emotions and algorithms into harmony.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Tech Stack: AI Meets Music
&lt;/h1&gt;

&lt;p&gt;To build a system that truly understands emotions and delivers perfect music matches, I combined cutting-edge ML libraries with production-ready APIs:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Machine Learning &amp;amp; Computer Vision:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;OpenCV&lt;/strong&gt;: The powerhouse for real-time face detection and video stream processing. Captures frames from the webcam and preprocesses them for the emotion detection model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TensorFlow/Keras&lt;/strong&gt;: Built and trained a Convolutional Neural Network (CNN) on the FER-2013 dataset to classify seven emotions: Happy, Sad, Angry, Surprised, Fearful, Disgusted, and Neutral.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DeepFace&lt;/strong&gt;: Initially experimented with this pre-trained model for quick prototyping before fine-tuning my own model for better accuracy.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Backend &amp;amp; API:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Flask&lt;/strong&gt;: A lightweight Python framework perfect for serving the ML model and handling API requests. Built RESTful endpoints for emotion detection and music recommendations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spotify Web API&lt;/strong&gt;: The gateway to millions of songs. Used OAuth 2.0 authentication and accessed audio features (valence, energy, tempo) to match songs with detected emotions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spotipy&lt;/strong&gt;: A Python wrapper that made interacting with Spotify's API incredibly simple—from searching tracks to creating custom playlists.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Frontend:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;HTML/CSS/JavaScript&lt;/strong&gt;: A clean, responsive interface with real-time webcam preview and dynamic playlist rendering.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Chart.js&lt;/strong&gt;: Visualized emotion probabilities and song audio features in beautiful, interactive charts.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Core Features: From Face to Playlist
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Real-Time Emotion Detection
&lt;/h3&gt;

&lt;p&gt;The heart of the system is the emotion detection pipeline. Here's how it works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Capture&lt;/strong&gt;: OpenCV accesses the webcam and captures frames at 30 FPS&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Detect&lt;/strong&gt;: Haar Cascade classifier detects faces in each frame&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Preprocess&lt;/strong&gt;: Crop the face, convert to grayscale, resize to 48x48 pixels&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Predict&lt;/strong&gt;: Feed the preprocessed image into the CNN model&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Classify&lt;/strong&gt;: Model outputs probabilities for all seven emotions&lt;/li&gt;
&lt;/ol&gt;
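
&lt;p&gt;Steps 3 and 4 can be sketched in plain NumPy (a hypothetical helper; the real pipeline uses cv2.resize and the trained Keras model, swapped out here so the shape handling is visible):&lt;/p&gt;

```python
import numpy as np

def preprocess_face(face_rgb):
    """Turn a cropped RGB face into the (1, 48, 48, 1) input the CNN expects."""
    # Grayscale via the standard luminance weights
    gray = (face_rgb[..., 0] * 0.299
            + face_rgb[..., 1] * 0.587
            + face_rgb[..., 2] * 0.114)
    # Nearest-neighbour resize to 48x48
    # (the real pipeline would use cv2.resize(gray, (48, 48)))
    h, w = gray.shape
    rows = (np.arange(48) * h // 48).astype(int)
    cols = (np.arange(48) * w // 48).astype(int)
    small = gray[np.ix_(rows, cols)]
    # Normalise to [0, 1] and add batch/channel axes for model.predict
    return (small / 255.0).reshape(1, 48, 48, 1)
```

&lt;p&gt;The prediction step is then a single model.predict call on this array, returning a length-7 probability vector over the emotion classes.&lt;/p&gt;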

&lt;p&gt;&lt;strong&gt;The Challenge:&lt;/strong&gt; Real-time processing without lag. The solution? Efficient frame skipping and model optimization:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Process every 3rd frame for better performance
&lt;/span&gt;&lt;span class="n"&gt;frame_count&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;frame_count&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;emotion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;detect_emotion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;frame&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;current_emotion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;emotion&lt;/span&gt;
&lt;span class="n"&gt;frame_count&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Smart Music Mapping
&lt;/h3&gt;

&lt;p&gt;Detecting emotions is only half the battle. The real magic is translating emotions into music that &lt;em&gt;feels right&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Algorithm:&lt;/strong&gt; I created an emotion-to-audio-feature mapping based on Spotify's audio analysis:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;emotion_mapping&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;happy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;valence&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;energy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tempo&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;180&lt;/span&gt;&lt;span class="p"&gt;)},&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sad&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;valence&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.4&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;energy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.4&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tempo&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)},&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;angry&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;valence&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;energy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tempo&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;120&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)},&lt;/span&gt;
    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;calm&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;valence&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;energy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;tempo&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;90&lt;/span&gt;&lt;span class="p"&gt;)},&lt;/span&gt;
    &lt;span class="c1"&gt;# ... more mappings
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;The Result:&lt;/strong&gt; When you're sad, you get melancholic acoustic tracks with low energy. When you're happy, upbeat pop songs with high valence flood your screen.&lt;/p&gt;

&lt;h3&gt;
  
  
  Personalized Recommendations
&lt;/h3&gt;

&lt;p&gt;The system doesn't just pick random songs—it learns from your Spotify listening history:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Get user's top tracks for personalization
&lt;/span&gt;&lt;span class="n"&gt;top_tracks&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;current_user_top_tracks&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;limit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;time_range&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;short_term&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;top_genres&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;extract_genres&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;top_tracks&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Combine emotion-based features with user preferences
&lt;/span&gt;&lt;span class="n"&gt;recommendations&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;recommendations&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;seed_genres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;top_genres&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;target_valence&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;emotion_valence&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;target_energy&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;emotion_energy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;limit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Training the Emotion Detection Model
&lt;/h2&gt;

&lt;p&gt;Building an accurate emotion classifier was the most challenging—and rewarding—part of this project.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Dataset Journey
&lt;/h3&gt;

&lt;p&gt;I started with the &lt;strong&gt;FER-2013 dataset&lt;/strong&gt; (35,000+ facial images labeled with emotions). But raw data is messy:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Inconsistent lighting conditions&lt;/li&gt;
&lt;li&gt;Mislabeled images&lt;/li&gt;
&lt;li&gt;Class imbalance (way more "happy" faces than "disgusted")&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The Solution:&lt;/strong&gt; Data augmentation and class weighting:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;datagen&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ImageDataGenerator&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;rotation_range&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;width_shift_range&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;height_shift_range&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;horizontal_flip&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;zoom_range&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.2&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Balance classes with weighted loss
&lt;/span&gt;&lt;span class="n"&gt;class_weights&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;compute_class_weight&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;balanced&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
                                     &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;unique&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                                     &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  The CNN Architecture
&lt;/h3&gt;

&lt;p&gt;After experimenting with different architectures, I settled on a custom CNN:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Sequential&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
    &lt;span class="nc"&gt;Conv2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;relu&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;input_shape&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;48&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;48&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)),&lt;/span&gt;
    &lt;span class="nc"&gt;BatchNormalization&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="nc"&gt;MaxPooling2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.25&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;

    &lt;span class="nc"&gt;Conv2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;64&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;relu&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;BatchNormalization&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="nc"&gt;MaxPooling2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.25&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;

    &lt;span class="nc"&gt;Conv2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;128&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;relu&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;BatchNormalization&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="nc"&gt;MaxPooling2D&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.4&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;

    &lt;span class="nc"&gt;Flatten&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;256&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;relu&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;softmax&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# 7 emotions
&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;The Results:&lt;/strong&gt; After 50 epochs, the model achieved 67% validation accuracy—not perfect, but solid for real-world use.&lt;/p&gt;

&lt;h2&gt;
  
  
  Professional Workflow: From Notebook to Production
&lt;/h2&gt;

&lt;p&gt;Moving from a Jupyter notebook to a production Flask app taught me invaluable lessons about ML engineering.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;Model Optimization &amp;amp; Serialization&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Trained models are large (40+ MB). I learned to optimize and save them properly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Save the model in HDF5 format
&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;save&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;emotion_model.h5&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# In production, load once at startup
&lt;/span&gt;&lt;span class="n"&gt;emotion_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;load_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;emotion_model.h5&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;The Lesson:&lt;/strong&gt; Loading the model on every request killed performance. Load it once when the Flask app starts.&lt;/p&gt;
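&lt;p&gt;A minimal, standard-library sketch of that load-once pattern (the sleep and dict stand in for the real load_model call; they are assumptions for illustration):&lt;/p&gt;

```python
import time
from functools import lru_cache

@lru_cache(maxsize=1)
def get_emotion_model():
    # Stand-in for load_model('emotion_model.h5'); the sleep simulates
    # the expensive one-time deserialization.
    time.sleep(0.05)
    return {"name": "emotion_model"}

# Call once at startup; every request handler then reuses the cached object.
model_a = get_emotion_model()  # slow: pays the load cost
model_b = get_emotion_model()  # fast: returns the same cached model
```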

&lt;h3&gt;
  
  
  2. &lt;strong&gt;API Rate Limiting &amp;amp; Caching&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Spotify's Web API enforces rate limits (calculated over a rolling 30-second window), so I implemented caching to avoid hitting them:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;functools&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;lru_cache&lt;/span&gt;

&lt;span class="nd"&gt;@lru_cache&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;maxsize&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;128&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_recommendations&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;emotion&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user_id&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Cache recommendations for 5 minutes
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;fetch_spotify_recommendations&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;emotion&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. &lt;strong&gt;Error Handling for Real Users&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In notebooks, errors crash the cell. In production, they crash the user's experience. I built comprehensive error handling:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;emotion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;detect_emotion&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;frame&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Emotion detection failed: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;emotion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;neutral&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;  &lt;span class="c1"&gt;# Fallback to neutral
&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;tracks&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;get_spotify_tracks&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;emotion&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="n"&gt;SpotifyException&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;jsonify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Spotify service unavailable&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}),&lt;/span&gt; &lt;span class="mi"&gt;503&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. &lt;strong&gt;Environment Variables &amp;amp; Secrets&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Never expose API keys. I used environment variables for all sensitive data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;dotenv&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;load_dotenv&lt;/span&gt;

&lt;span class="nf"&gt;load_dotenv&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="n"&gt;SPOTIFY_CLIENT_ID&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getenv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;SPOTIFY_CLIENT_ID&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;SPOTIFY_CLIENT_SECRET&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getenv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;SPOTIFY_CLIENT_SECRET&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Deployment Challenges &amp;amp; Solutions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Challenge 1: &lt;strong&gt;Large Model Files&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Deploying the 40 MB model plus TensorFlow's heavy dependencies pushed the build past Heroku's 500 MB slug size limit.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt; Switched to AWS EC2 with more storage, and optimized the model using TensorFlow Lite for a 60% size reduction.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenge 2: &lt;strong&gt;Webcam Access in Browser&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Modern browsers require HTTPS for webcam access.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt; Set up SSL certificates using Let's Encrypt and configured Flask to serve over HTTPS.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenge 3: &lt;strong&gt;Cold Start Latency&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;First request took 10+ seconds as the model loaded.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt; Implemented a "warmup" endpoint that loads the model on server startup and keeps it in memory.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What Worked:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pre-trained Building Blocks&lt;/strong&gt;: Using OpenCV's ready-made Haar cascade face detector saved weeks of training time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spotify's Audio Features&lt;/strong&gt;: Valence and energy metrics are surprisingly accurate for emotion-music matching.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flask's Simplicity&lt;/strong&gt;: For ML projects, Flask's lightweight nature beats Django's complexity.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What I'd Do Differently:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Start with Pre-trained Models&lt;/strong&gt;: I spent weeks training from scratch. Fine-tuning a pre-trained model (like VGGFace) would have been faster.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add User Feedback Loop&lt;/strong&gt;: Currently, the system doesn't learn if recommendations are accurate. A "thumbs up/down" feature would improve over time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Better Lighting Handling&lt;/strong&gt;: The model struggles in low light. Adding brightness normalization would help.&lt;/li&gt;
&lt;/ul&gt;
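&lt;p&gt;For the low-light point above, plain histogram equalization is a cheap first step. A numpy-only sketch (the real pipeline would more likely use OpenCV's equalizeHist or CLAHE):&lt;/p&gt;

```python
import numpy as np

def normalize_brightness(gray):
    # Histogram-equalize a uint8 grayscale face crop so dim frames use
    # the full 0-255 intensity range before hitting the classifier.
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    span = cdf[-1] - cdf_min
    if span == 0:  # flat image: nothing to equalize
        return gray
    lut = np.round((cdf - cdf_min) / span * 255)
    return lut.astype(np.uint8)[gray]
```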

&lt;h2&gt;
  
  
  Future Enhancements: Taking It to the Next Level
&lt;/h2&gt;

&lt;p&gt;The current version delivers solid emotion-based recommendations, but there's so much more potential. Here's what's on the roadmap:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;Multi-Person Emotion Detection&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Right now, it detects one face. Imagine a party mode where it analyzes everyone's emotions and creates a playlist that fits the room's vibe:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Detect multiple faces simultaneously&lt;/li&gt;
&lt;li&gt;Average emotion scores across all faces&lt;/li&gt;
&lt;li&gt;Weight recommendations toward the dominant emotion&lt;/li&gt;
&lt;li&gt;Create collaborative playlists that satisfy the group&lt;/li&gt;
&lt;/ul&gt;
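&lt;p&gt;The average-then-weight steps above could start as simply as this (a sketch; the per-face score format is an assumption about the classifier's output):&lt;/p&gt;

```python
from collections import Counter

def room_emotion(per_face_scores):
    # per_face_scores: one dict of emotion -> probability per detected face.
    # Average across all faces, then pick the dominant emotion for the room.
    totals = Counter()
    for scores in per_face_scores:
        totals.update(scores)
    averaged = {e: t / len(per_face_scores) for e, t in totals.items()}
    dominant = max(averaged, key=averaged.get)
    return dominant, averaged
```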

&lt;h3&gt;
  
  
  2. &lt;strong&gt;Voice Tone Analysis&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Facial expressions are only part of the story. Voice carries emotional cues too:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Integrate speech emotion recognition using librosa and PyAudio&lt;/li&gt;
&lt;li&gt;Analyze pitch, tone, and speaking rate&lt;/li&gt;
&lt;li&gt;Combine facial and vocal emotions for more accurate detection&lt;/li&gt;
&lt;li&gt;Especially useful for users on phone calls or podcasts&lt;/li&gt;
&lt;/ul&gt;
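&lt;p&gt;Combining the two modalities can begin as a weighted blend (the 60/40 split below is an illustrative assumption, not a tuned value):&lt;/p&gt;

```python
def fuse_emotions(face_scores, voice_scores, face_weight=0.6):
    # Blend facial and vocal probability dicts; emotions missing from one
    # modality contribute zero from that side.
    emotions = set(face_scores) | set(voice_scores)
    fused = {
        e: face_weight * face_scores.get(e, 0.0)
           + (1.0 - face_weight) * voice_scores.get(e, 0.0)
        for e in emotions
    }
    return max(fused, key=fused.get), fused
```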

&lt;h3&gt;
  
  
  3. &lt;strong&gt;Emotion History &amp;amp; Trends&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Track emotional patterns over time:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build a "mood journal" showing emotion trends across days/weeks&lt;/li&gt;
&lt;li&gt;Identify triggers (time of day, day of week) for different emotions&lt;/li&gt;
&lt;li&gt;Recommend music proactively: "You're usually stressed on Monday mornings. Here's a calming playlist."&lt;/li&gt;
&lt;li&gt;Visualize emotional journeys with interactive graphs&lt;/li&gt;
&lt;/ul&gt;
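&lt;p&gt;The day-of-week trend idea reduces to a small aggregation (a sketch; the log format is an assumption):&lt;/p&gt;

```python
from collections import defaultdict
from datetime import datetime

def mood_trends(log):
    # log: (iso_timestamp, emotion) pairs, one per detection.
    # Returns {weekday: {emotion: count}} - enough raw material to spot
    # patterns like "usually stressed on Monday mornings".
    trends = defaultdict(lambda: defaultdict(int))
    for ts, emotion in log:
        day = datetime.fromisoformat(ts).strftime("%A")
        trends[day][emotion] += 1
    return {day: dict(counts) for day, counts in trends.items()}
```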

&lt;h3&gt;
  
  
  4. &lt;strong&gt;Integration with Smart Home&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Turn the system into an ambient mood manager:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect with Philips Hue to adjust lighting based on emotion&lt;/li&gt;
&lt;li&gt;Integrate with smart speakers for hands-free music control&lt;/li&gt;
&lt;li&gt;Create "mood scenes": Sad = dim lights + melancholic music&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. &lt;strong&gt;Mobile App&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Currently web-based, but a mobile app would be more practical:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;React Native or Flutter for cross-platform development&lt;/li&gt;
&lt;li&gt;On-device ML inference using TensorFlow Lite (faster, more private)&lt;/li&gt;
&lt;li&gt;Background emotion tracking with periodic check-ins&lt;/li&gt;
&lt;li&gt;Push notifications: "Feeling stressed? Try this playlist."&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Final Reflections:
&lt;/h2&gt;

&lt;p&gt;Building an emotion-based music recommendation system taught me that AI isn't just about algorithms—it's about understanding human experiences and translating them into meaningful interactions. The project challenged me to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bridge the gap between ML research and production engineering&lt;/li&gt;
&lt;li&gt;Design systems that feel intuitive, not intrusive&lt;/li&gt;
&lt;li&gt;Balance accuracy with real-time performance&lt;/li&gt;
&lt;li&gt;Think about privacy (webcam data never leaves the device)&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;"Technology should adapt to humans, not the other way around. If your music player doesn't understand your mood, is it really smart?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This project transformed me from someone who "knows Python" to someone who can &lt;strong&gt;build end-to-end AI applications&lt;/strong&gt; that solve real problems. And with voice analysis, multi-person detection, and smart home integration on the horizon, this is just the beginning of the journey.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Tech Stack Summary:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;ML/CV&lt;/strong&gt;: TensorFlow, Keras, OpenCV, DeepFace&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backend&lt;/strong&gt;: Flask, Python, Spotipy&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;APIs&lt;/strong&gt;: Spotify Web API (OAuth 2.0)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Frontend&lt;/strong&gt;: HTML, CSS, JavaScript, Chart.js&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment&lt;/strong&gt;: AWS EC2, Let's Encrypt SSL&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Future&lt;/strong&gt;: Speech Recognition, TensorFlow Lite, IoT Integration&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Link:&lt;/strong&gt; The live demo is still under development.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;What emotions should music apps detect? How would you improve this system? Share your thoughts in the comments below!&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Building a Full-Stack Job Portal: From MVP to Production-Ready Platform</title>
      <dc:creator>Kunal Chakraborty</dc:creator>
      <pubDate>Tue, 27 Jan 2026 11:14:43 +0000</pubDate>
      <link>https://dev.to/kunal_chakraborty_6a1ab69/building-a-full-stack-job-portal-from-mvp-to-production-ready-platform-24f4</link>
      <guid>https://dev.to/kunal_chakraborty_6a1ab69/building-a-full-stack-job-portal-from-mvp-to-production-ready-platform-24f4</guid>
      <description>&lt;p&gt;Creating a simple job board is one thing; building a production-ready job portal with real-time features, state management, and error tracking is another beast entirely. For my latest project, I dove headfirst into the world of full-stack development, building a comprehensive job portal using React, Node.js, Express, and a suite of modern tools.&lt;br&gt;
While the sleek UI and smooth interactions are what users see, the real challenge was architecting a scalable backend, managing complex state, and ensuring the application could handle real-world scenarios. Here's how I built it from the ground up.&lt;br&gt;
🛠️ The Tech Stack: Modern Full-Stack Architecture&lt;br&gt;
To build a robust, scalable job portal, I carefully selected technologies that work seamlessly together:&lt;br&gt;
Frontend:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;React&lt;/strong&gt;: The foundation for building a dynamic, component-based UI. Handled everything from job listings to application forms with reusable components.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Zustand&lt;/strong&gt;: A lightweight state management solution that made managing global state (user authentication, job filters, application status) incredibly simple compared to Redux.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Axios&lt;/strong&gt;: Streamlined all HTTP requests with interceptors for authentication tokens and centralized error handling.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Backend:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Node.js + Express&lt;/strong&gt;: Built a RESTful API that handles user authentication, job postings, applications, and more. Express middleware made route protection and validation a breeze.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MongoDB&lt;/strong&gt;: A flexible NoSQL database perfect for storing diverse job postings, user profiles, and application data with varying structures.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Production Tools:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sentry&lt;/strong&gt;: Real-time error tracking that alerts me the moment something breaks in production. No more "it works on my machine" excuses.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JWT Authentication&lt;/strong&gt;: Secure token-based authentication that keeps user sessions safe and stateless.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Features That Matter
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Smart Job Filtering &amp;amp; Search
&lt;/h3&gt;

&lt;p&gt;Users can filter jobs by location, salary range, experience level, and job type. The challenge? Making it fast and responsive.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Solution:&lt;/strong&gt; Implemented debounced search with Axios interceptors to cancel previous requests, ensuring the backend isn't overwhelmed with every keystroke:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;javascript// Debounced search with cleanup
useEffect(() =&amp;gt; {
  const timer = setTimeout(() =&amp;gt; {
    fetchJobs(searchQuery);
  }, 500);
  return () =&amp;gt; clearTimeout(timer);
}, [searchQuery]);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Real-Time Application Status
&lt;/h3&gt;

&lt;p&gt;Job seekers need to know where they stand. I built a real-time status tracking system that shows whether applications are "Pending," "Under Review," or "Rejected."&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Zustand Magic:&lt;/strong&gt; State updates instantly across all components without prop drilling:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const useApplicationStore = create((set) =&amp;gt; ({
  applications: [],
  updateStatus: (id, status) =&amp;gt; 
    set((state) =&amp;gt; ({
      applications: state.applications.map(app =&amp;gt;
        app.id === id ? { ...app, status } : app
      )
    }))
}));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Secure User Authentication
&lt;/h3&gt;

&lt;p&gt;Built a complete authentication flow with registration, login, and protected routes. JWT tokens are stored securely, and Axios interceptors automatically attach them to every request.&lt;/p&gt;

&lt;h2&gt;
  
  
  🏗️ Professional Workflow: Engineering Excellence
&lt;/h2&gt;

&lt;p&gt;As the project scaled, I quickly learned that great code isn't enough—you need systems and processes.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;API-First Development&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Before writing a single line of frontend code, I designed the entire API structure. I documented every endpoint, request/response format, and error code using Postman collections.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Lesson:&lt;/strong&gt; This approach prevented countless "I need one more field" moments and made frontend development smooth and predictable.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. &lt;strong&gt;Error Handling That Actually Helps&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Initially, my error messages were generic: "Something went wrong." With Sentry integrated, I now get:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The exact line of code that failed&lt;/li&gt;
&lt;li&gt;The user's browser and OS&lt;/li&gt;
&lt;li&gt;The API request that triggered the error&lt;/li&gt;
&lt;li&gt;A full stack trace&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The Result:&lt;/strong&gt; Bug fixes went from "hours of debugging" to "pinpoint and patch in minutes."&lt;/p&gt;

&lt;h3&gt;
  
  
  3. &lt;strong&gt;Environment Variables &amp;amp; Security&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Never hardcode API keys or database URLs. I set up proper environment variable management:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# .env files for different environments
REACT_APP_API_URL=http://localhost:5000   # Development API
MONGODB_URI=mongodb://localhost:27017     # Local DB
SENTRY_DSN=your_sentry_dsn_here           # Error tracking
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  4. &lt;strong&gt;Git Workflow: Feature Branches &amp;amp; PRs&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;I abandoned the chaos of committing directly to main. Every feature now gets its own branch:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;feature/job-filters → Added advanced filtering&lt;/li&gt;
&lt;li&gt;feature/user-dashboard → Built the applicant dashboard&lt;/li&gt;
&lt;li&gt;bugfix/axios-interceptor → Fixed token refresh logic&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each PR includes a description, screenshots (for UI changes), and a checklist of what was tested.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment &amp;amp; CI/CD
&lt;/h2&gt;

&lt;p&gt;Manual deployment is error-prone and time-consuming. I automated everything using GitHub Actions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Workflow:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Push code to a feature branch&lt;/li&gt;
&lt;li&gt;Open a PR → automated tests run&lt;/li&gt;
&lt;li&gt;Merge to main → GitHub Actions builds the frontend and backend&lt;/li&gt;
&lt;li&gt;Automatic deployment to production (Vercel for frontend, Railway for backend)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If any test fails, deployment is blocked. The live site stays stable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What Worked:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Zustand over Redux&lt;/strong&gt;: For this project's complexity, Zustand was perfect. Less boilerplate, easier to understand.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Axios Interceptors&lt;/strong&gt;: Centralized error handling and token management saved countless hours.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sentry&lt;/strong&gt;: Catching errors before users report them is a game-changer.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What I'd Do Differently:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;TypeScript from Day One&lt;/strong&gt;: Adding it midway through was painful. Type safety would have prevented many bugs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Better Database Indexing&lt;/strong&gt;: As job listings grew, search queries slowed down. Learned the importance of MongoDB indexes the hard way.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Future Plans: Taking It Further
&lt;/h2&gt;

&lt;p&gt;The current version is solid, but there's always room to grow. Here's what's next on the roadmap:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;Interview Scheduling System&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Recruiters and candidates waste time coordinating interviews via email. I'm building an integrated calendar system where:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Recruiters set available time slots&lt;/li&gt;
&lt;li&gt;Candidates select their preferred times&lt;/li&gt;
&lt;li&gt;Automatic email confirmations (using Nodemailer)&lt;/li&gt;
&lt;li&gt;Google Calendar integration for both parties&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  2. &lt;strong&gt;AI-Powered Job Assistant&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Imagine a chatbot that helps job seekers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Resume Analysis&lt;/strong&gt;: Upload your resume, get suggestions on how to improve it for specific jobs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Job Recommendations&lt;/strong&gt;: "Based on your skills, here are 5 jobs you're qualified for."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Interview Prep&lt;/strong&gt;: Practice common interview questions with AI feedback&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The Tech:&lt;/strong&gt; Planning to integrate OpenAI's API for natural language processing and personalized recommendations.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. &lt;strong&gt;Advanced Analytics Dashboard&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;For recruiters, data is everything. I'm building a dashboard showing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Application conversion rates&lt;/li&gt;
&lt;li&gt;Most effective job posting strategies&lt;/li&gt;
&lt;li&gt;Candidate pipeline visualization&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  🎬 Final Reflections
&lt;/h2&gt;

&lt;p&gt;Building a full-stack job portal taught me more than any tutorial ever could. It's not just about knowing React or Node.js—it's about understanding:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to architect scalable systems&lt;/li&gt;
&lt;li&gt;Why error tracking matters in production&lt;/li&gt;
&lt;li&gt;The importance of professional workflows (PRs, CI/CD, versioning)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;"Anyone can write code that works today. Engineers write code that works tomorrow, next month, and next year."&lt;/p&gt;

&lt;p&gt;This project transformed me from someone who "can code" to someone who can ship production-grade applications. And with the AI assistant and interview scheduling features on the horizon, this is just the beginning.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tech Stack Summary:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frontend&lt;/strong&gt;: React, Zustand, Axios, Tailwind CSS&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backend&lt;/strong&gt;: Node.js, Express, MongoDB, JWT&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DevOps&lt;/strong&gt;: Sentry, GitHub Actions, Vercel, Railway&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Future&lt;/strong&gt;: OpenAI API, Calendar APIs, Advanced Analytics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/KunalChakraborty445/job-portal-full-stack" rel="noopener noreferrer"&gt;https://github.com/KunalChakraborty445/job-portal-full-stack&lt;/a&gt;&lt;br&gt;
Live Demo: &lt;a href="https://job-portal-full-stack-client-wine.vercel.app" rel="noopener noreferrer"&gt;https://job-portal-full-stack-client-wine.vercel.app&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
