<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: sarthak bhardwaj</title>
    <description>The latest articles on DEV Community by sarthak bhardwaj (@sarthak_bhardwaj_05aba55d).</description>
    <link>https://dev.to/sarthak_bhardwaj_05aba55d</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2541640%2F9914d46e-38b2-4dab-9213-80896e20e73a.jpg</url>
      <title>DEV Community: sarthak bhardwaj</title>
      <link>https://dev.to/sarthak_bhardwaj_05aba55d</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sarthak_bhardwaj_05aba55d"/>
    <language>en</language>
    <item>
      <title>Turning Music Into Art — Building a Synesthesia Simulator with Gemini</title>
      <dc:creator>sarthak bhardwaj</dc:creator>
      <pubDate>Sun, 14 Sep 2025 18:54:05 +0000</pubDate>
      <link>https://dev.to/sarthak_bhardwaj_05aba55d/turning-music-into-art-building-a-synesthesia-simulator-with-gemini-3459</link>
      <guid>https://dev.to/sarthak_bhardwaj_05aba55d/turning-music-into-art-building-a-synesthesia-simulator-with-gemini-3459</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/google-ai-studio-2025-09-03"&gt;Google AI Studio Multimodal Challenge&lt;/a&gt;&lt;/em&gt;  &lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built the &lt;strong&gt;Synesthesia Simulator&lt;/strong&gt;, an AI-powered applet designed to translate sound and imagery into a unified, cross-sensory artistic experience. It creatively simulates the neurological trait of synesthesia, allowing users to &lt;em&gt;see music as color&lt;/em&gt; and &lt;em&gt;hear pictures as melodies&lt;/em&gt;.  &lt;/p&gt;

&lt;p&gt;The applet provides a creative and exploratory space for users to discover novel connections between their senses. You can upload an &lt;strong&gt;audio file&lt;/strong&gt;, an &lt;strong&gt;image file&lt;/strong&gt;, or both, and the AI generates:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;A Descriptive Scene&lt;/strong&gt; – A vivid, artistic narrative describing the blended sensory experience.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Creative Prompts&lt;/strong&gt; – Inspiring ideas for writing, art, or reflection based on the output.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A Generated Vision&lt;/strong&gt; – A unique AI-generated image representing the fusion of your audio and/or visual inputs.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Creative Chat&lt;/strong&gt; – An interactive chat session with a creative AI assistant, primed with the context of your generated experience, to explore ideas further.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;My goal was to create a tool that not only showcases advanced AI but also serves as a source of inspiration — particularly for creative and neurodiverse individuals who may naturally think in cross-sensory ways. It's not a medical tool, but a &lt;strong&gt;canvas for imagination&lt;/strong&gt;.  &lt;/p&gt;




&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Live Applet Link:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
➡️ &lt;a href="https://ai.studio/apps/drive/1Zc-tOnjDSxIu59zuIDHLcKzF-EHAjA2n" rel="noopener noreferrer"&gt;Launch the Synesthesia Simulator Here&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshots &amp;amp; Walkthrough:&lt;/strong&gt;  &lt;/p&gt;

&lt;p&gt;Here’s the main interface where you can upload an audio file and an image:&lt;br&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fexzof70lp0jfpu3m8bft.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fexzof70lp0jfpu3m8bft.png" alt="Synesthesia Simulator Upload" width="800" height="477"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;After processing, the applet presents the AI's synesthetic interpretation alongside a newly generated piece of art. The app includes a built-in audio visualizer that reacts to your music, with customizable color schemes:&lt;br&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7c0vip9y94ovvn0opek.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7c0vip9y94ovvn0opek.png" alt="Synesthesia Simulator Output &amp;amp; Visualizer" width="800" height="414"&gt;&lt;/a&gt;      &lt;/p&gt;

&lt;p&gt;You can also explore the experience further with a context-aware creative AI assistant:&lt;br&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feceeys5kxysd2qn1fox6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feceeys5kxysd2qn1fox6.png" alt="Synesthesia Chat" width="800" height="411"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;The applet also keeps a history of your past creations:&lt;br&gt;&lt;br&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkgsj1m8j8dmq7p3murmq.png" alt="History" width="800" height="414"&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How I Used Google AI Studio
&lt;/h2&gt;

&lt;p&gt;Google AI Studio and the Gemini API power this entire experience. I combined multiple models in a seamless pipeline to handle complex multimodal tasks:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Gemini 2.5 Flash (Multimodal Understanding):&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Core of the simulator.
&lt;/li&gt;
&lt;li&gt;Handles system prompt + user prompt + audio file bytes + image file bytes all in one request.
&lt;/li&gt;
&lt;li&gt;Outputs structured JSON (&lt;code&gt;descriptiveScene&lt;/code&gt;, &lt;code&gt;creativePrompts&lt;/code&gt;, &lt;code&gt;imageGenerationInstruction&lt;/code&gt;) for reliable integration into the UI.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Imagen 4.0 (Image Generation):&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Translates the &lt;code&gt;imageGenerationInstruction&lt;/code&gt; from Gemini into tangible artwork.
&lt;/li&gt;
&lt;li&gt;Creates visuals that embody the cross-sensory interpretation.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Gemini 2.5 Flash (Conversational AI):&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Powers the &lt;strong&gt;Creative Chat&lt;/strong&gt;.
&lt;/li&gt;
&lt;li&gt;A new chat session is initialized with the &lt;code&gt;descriptiveScene&lt;/code&gt; and &lt;code&gt;creativePrompts&lt;/code&gt; as context.
&lt;/li&gt;
&lt;li&gt;Turns the assistant into a creative partner, offering deeper exploration of the user’s generated experience.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
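&lt;p&gt;To make the pipeline concrete, here is a minimal Python sketch of the two model calls using the &lt;code&gt;google-genai&lt;/code&gt; SDK. The applet itself is an AI Studio app, so treat this as an illustration rather than its actual source: the Imagen model id, the prompt wording, and the MIME types are assumptions.&lt;/p&gt;

```python
import json

# Keys the applet expects in Gemini's structured output (from the post).
EXPECTED_KEYS = ("descriptiveScene", "creativePrompts", "imageGenerationInstruction")

def parse_analysis(raw_text):
    """Validate the structured JSON before it reaches the UI or Imagen."""
    data = json.loads(raw_text)
    missing = [k for k in EXPECTED_KEYS if k not in data]
    if missing:
        raise ValueError("analysis is missing keys: " + ", ".join(missing))
    return data

def run_pipeline(audio_bytes=None, image_bytes=None):
    """Step 1: Gemini multimodal analysis. Step 2: Imagen generation."""
    # SDK import kept local so parse_analysis stays usable without it.
    from google import genai
    from google.genai import types

    client = genai.Client()  # picks up the API key from the environment
    parts = [types.Part.from_text(text="Describe the blended sensory experience.")]
    if audio_bytes:
        parts.append(types.Part.from_bytes(data=audio_bytes, mime_type="audio/mpeg"))
    if image_bytes:
        parts.append(types.Part.from_bytes(data=image_bytes, mime_type="image/png"))

    analysis = client.models.generate_content(
        model="gemini-2.5-flash",
        contents=parts,
        config=types.GenerateContentConfig(
            system_instruction="Blend the inputs into one synesthetic experience.",
            response_mime_type="application/json",
        ),
    )
    result = parse_analysis(analysis.text)

    image = client.models.generate_images(
        model="imagen-4.0-generate-001",  # assumed model id
        prompt=result["imageGenerationInstruction"],
    )
    return result, image
```

&lt;p&gt;Forcing &lt;code&gt;response_mime_type&lt;/code&gt; to JSON is what makes the output dependable enough to feed straight into the UI and into Imagen.&lt;/p&gt;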




&lt;h2&gt;
  
  
  Multimodal Features
&lt;/h2&gt;

&lt;p&gt;The &lt;strong&gt;multimodal capabilities of Gemini&lt;/strong&gt; are what make this applet possible:  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Cross-Modal Understanding:&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Goes beyond analyzing audio and images separately.
&lt;/li&gt;
&lt;li&gt;Interprets emotional tone of melodies, maps rhythms to textures, and links color palettes to musical patterns.
&lt;/li&gt;
&lt;li&gt;Produces the descriptive scene that defines the synesthetic simulation.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Sense-Blending for Generation:&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Uses cross-modal insights to drive Imagen prompts.
&lt;/li&gt;
&lt;li&gt;Example: &lt;em&gt;“Abstract glowing waves of violet and silver flowing in rhythm with deep piano chords.”&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;Generates true synthesis of sound + visual inputs.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Contextual Conversation:&lt;/strong&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creative Chat expands the experience.
&lt;/li&gt;
&lt;li&gt;Users can ask: &lt;em&gt;“What does the color red sound like in this song?”&lt;/em&gt; or &lt;em&gt;“Tell me a story based on the third creative prompt.”&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;The assistant responds with context-aware, imaginative answers.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
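&lt;p&gt;The contextual conversation in the last bullet can be sketched the same way: fold the generated scene and prompts into the system instruction of a fresh chat session. Again a hedged Python illustration with &lt;code&gt;google-genai&lt;/code&gt;; the exact context wording is an assumption.&lt;/p&gt;

```python
def build_chat_context(analysis):
    """Turn the generated scene and prompts into a system instruction
    for the Creative Chat (the exact wording here is an assumption)."""
    bullets = "\n".join("- " + p for p in analysis["creativePrompts"])
    return (
        "You are a creative partner. The user just generated this "
        "synesthetic scene:\n" + analysis["descriptiveScene"] +
        "\n\nCreative prompts so far:\n" + bullets
    )

def start_creative_chat(client, analysis):
    """Open a fresh chat session primed with the generated experience."""
    from google.genai import types
    return client.chats.create(
        model="gemini-2.5-flash",
        config=types.GenerateContentConfig(
            system_instruction=build_chat_context(analysis),
        ),
    )

# chat = start_creative_chat(client, analysis)
# reply = chat.send_message("What does the color red sound like in this song?")
```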




&lt;p&gt;✨ Thank you for checking out my project!  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Submission by:&lt;/strong&gt; &lt;a href="https://dev.to/sarthak_bhardwaj_05aba55d"&gt;@sarthak_bhardwaj_05aba55d&lt;/a&gt;  &lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>googleaichallenge</category>
      <category>ai</category>
      <category>gemini</category>
    </item>
    <item>
      <title>Prove Your Health Status, Not Your Identity: Building ZK-VCR on Midnight</title>
      <dc:creator>sarthak bhardwaj</dc:creator>
      <pubDate>Mon, 08 Sep 2025 03:45:01 +0000</pubDate>
      <link>https://dev.to/sarthak_bhardwaj_05aba55d/prove-your-health-status-not-your-identity-building-zk-vcr-on-midnight-1l01</link>
      <guid>https://dev.to/sarthak_bhardwaj_05aba55d/prove-your-health-status-not-your-identity-building-zk-vcr-on-midnight-1l01</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/midnight-2025-08-20"&gt;Midnight Network "Privacy First" Challenge&lt;/a&gt; - Protect That Data prompt&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built &lt;strong&gt;ZK-VCR (Verifiable Credential Oracle)&lt;/strong&gt;, a decentralized application that pioneers a new standard for privacy in on-chain transactions. It allows users to prove they meet specific health criteria (like having a low cardiovascular risk score) to a smart contract without ever revealing their underlying personal health information.&lt;/p&gt;

&lt;p&gt;The project solves the "Leaky Bucket" problem of modern data privacy, where users are forced to hand over sensitive data to multiple services, risking exposure with every new interaction. ZK-VCR replaces this with an "Airlock" model, built on the philosophy of &lt;strong&gt;Privacy for the User, Transparency for the Algorithm, and Governance for the Source.&lt;/strong&gt; A user's data never leaves their device; instead, a Zero-Knowledge proof is generated locally and sent to the chain for verification.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;The complete source code and detailed documentation are available on GitHub:&lt;br&gt;
&lt;strong&gt;&lt;a href="https://github.com/SarthakB11/zk-vcr" rel="noopener noreferrer"&gt;Source Code &amp;amp; Complete Documentation&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here is a live recording of the ZK-VCR Command-Line Interface (CLI) in action, demonstrating the complete end-to-end flow from a clinic generating a credential to a user privately verifying it on-chain.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://github.com/user-attachments/assets/1b93a224-a133-44aa-903a-e9bb0911e9a8" rel="noopener noreferrer"&gt;Watch the ZK-VCR Demo&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How I Used Midnight's Technology
&lt;/h2&gt;

&lt;p&gt;This project is built from the ground up using Midnight's core technology stack:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Compact Language:&lt;/strong&gt; The entire on-chain logic, including the smart contract and all ZK circuits, is written in Compact. I used it to define the contract's state (like the &lt;code&gt;owner&lt;/code&gt; and the &lt;code&gt;trustedIssuers&lt;/code&gt; map) and to implement the complex, privacy-preserving logic inside the &lt;code&gt;submitHealthProof&lt;/code&gt; circuit. Key features like &lt;code&gt;persistentHash&lt;/code&gt; were used to create a ZK-friendly signature scheme.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;MidnightJS SDK:&lt;/strong&gt; The entire off-chain portion of the DApp—the user CLI, administrator panel, and issuer tool—is built in TypeScript and uses the &lt;code&gt;MidnightJS&lt;/code&gt; SDK. This library was essential for all blockchain interactions, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Deploying and finding contracts on the testnet.&lt;/li&gt;
&lt;li&gt;  Managing wallets and private state.&lt;/li&gt;
&lt;li&gt;  Constructing and submitting transactions to call the contract's circuits.&lt;/li&gt;
&lt;li&gt;  Querying the public on-chain state to display it to the user.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;&lt;strong&gt;Private Witnesses:&lt;/strong&gt; The core of the DApp's privacy model relies on Compact's witness system. The user's sensitive &lt;code&gt;VerifiableCredential&lt;/code&gt; and the administrator's &lt;code&gt;ownerSecretKey&lt;/code&gt; are passed as private witnesses, meaning they are used in the ZK proof computation but never revealed on-chain.&lt;/p&gt;&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Data Protection as a Core Feature
&lt;/h2&gt;

&lt;p&gt;Data protection isn't just a feature of ZK-VCR; it is the central design principle.&lt;/p&gt;

&lt;p&gt;The "Airlock" model ensures that the user's personal health information (PHI) &lt;strong&gt;never leaves their local machine&lt;/strong&gt;. When a user wants to prove their eligibility for a service, the &lt;code&gt;submitHealthProof&lt;/code&gt; ZK circuit is executed locally. The only artifact that is ever sent to the public blockchain is the anonymous ZK proof itself.&lt;/p&gt;

&lt;p&gt;This proof mathematically demonstrates three things without revealing the underlying data:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; The health data came from a trusted source (verified via a cryptographic signature).&lt;/li&gt;
&lt;li&gt; The health data meets the publicly defined criteria (the risk model was evaluated on it inside the circuit).&lt;/li&gt;
&lt;li&gt; The proof is fresh and not being replayed (verified via a challenge-nonce).&lt;/li&gt;
&lt;/ol&gt;
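&lt;p&gt;As a plain-Python analogue (not Compact, and with hypothetical field names), the three checks look roughly like this; in ZK-VCR they run inside the &lt;code&gt;submitHealthProof&lt;/code&gt; circuit, so only the proof of their success ever reaches the chain:&lt;/p&gt;

```python
import hashlib
import hmac

def check_credential(credential, issuer_key, risk_threshold, expected_nonce):
    """Illustrative sketch of the three properties the proof establishes.
    Field names ("record", "signature", "nonce", "riskScore") are
    hypothetical stand-ins, not ZK-VCR's actual schema."""
    # 1. Trusted source: verify a hash-based signature over the record,
    #    analogous to the persistentHash scheme described above.
    payload = repr(sorted(credential["record"].items())).encode()
    sig = hmac.new(issuer_key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, credential["signature"]):
        return False
    # 2. Public criteria: evaluate the risk model on the private values.
    if credential["record"]["riskScore"] > risk_threshold:
        return False
    # 3. Freshness: the proof is bound to the verifier's challenge nonce,
    #    so it cannot be replayed later.
    return credential["nonce"] == expected_nonce
```

&lt;p&gt;The difference on Midnight is that these checks execute inside a ZK circuit with the credential as a private witness, so the verifier learns only the boolean outcome.&lt;/p&gt;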

&lt;p&gt;The smart contract learns only a single binary fact: "An anonymous user has successfully proven they are low-risk." It learns nothing about their cholesterol, their blood pressure, or whether they smoke. This provides powerful, mathematically guaranteed privacy that is fundamentally superior to policy-based promises.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup Instructions / Tutorial
&lt;/h2&gt;

&lt;p&gt;The following is a complete, step-by-step guide to setting up the development environment and running the full ZK-VCR demo.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://github.com/SarthakB11/zk-vcr/blob/master/TUTORIAL.md" rel="noopener noreferrer"&gt;View the Full Tutorial on GitHub&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Submission by: &lt;a href="https://dev.to/sarthak_bhardwaj_05aba55d"&gt;@sarthak_bhardwaj_05aba55d&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>midnightchallenge</category>
      <category>web3</category>
      <category>blockchain</category>
    </item>
  </channel>
</rss>
