<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: ANIRUDDHA  ADAK</title>
    <description>The latest articles on DEV Community by ANIRUDDHA  ADAK (@aniruddhaadak).</description>
    <link>https://dev.to/aniruddhaadak</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2407448%2F517c050d-06cf-462f-a3e6-3b4636249a84.png</url>
      <title>DEV Community: ANIRUDDHA  ADAK</title>
      <link>https://dev.to/aniruddhaadak</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/aniruddhaadak"/>
    <language>en</language>
    <item>
      <title>I built NexusForge: The Multimodal AI Agent Hub for Notion</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Fri, 20 Mar 2026 15:48:00 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/i-built-nexusforge-the-multimodal-ai-agent-hub-for-notion-454</link>
      <guid>https://dev.to/aniruddhaadak/i-built-nexusforge-the-multimodal-ai-agent-hub-for-notion-454</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/notion-2026-03-04"&gt;Notion MCP Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;NexusForge&lt;/strong&gt; is a multimodal workflow app for Notion. It turns screenshots, whiteboard photos, rough sketches, and messy prompts into structured Notion-ready deliverables.&lt;/p&gt;

&lt;p&gt;The strongest workflow in the app is &lt;strong&gt;diagram to technical brief&lt;/strong&gt;: upload a system design image, ask for a concise engineering summary, and NexusForge produces a clean markdown artifact that can be previewed immediately and published into Notion as a child page.&lt;/p&gt;

&lt;p&gt;I built it to solve a very practical problem: visual thinking happens early, but documentation usually happens later and manually. NexusForge closes that gap.&lt;/p&gt;

&lt;p&gt;It combines:&lt;br&gt;
✅ &lt;strong&gt;Gemini 3 Flash Preview&lt;/strong&gt; for multimodal understanding&lt;br&gt;
✅ &lt;strong&gt;Notion API&lt;/strong&gt; for creating real pages from generated markdown&lt;br&gt;
✅ &lt;strong&gt;Notion MCP configuration&lt;/strong&gt; in the workspace, so the repo is ready for direct Notion MCP OAuth in VS Code&lt;/p&gt;
&lt;h2&gt;
  
  
  Reliability Hardening
&lt;/h2&gt;

&lt;p&gt;To make the app safer for broader public use, I added:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a &lt;strong&gt;Notion page picker&lt;/strong&gt; backed by live workspace search&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;client-side upload validation&lt;/strong&gt; for unsupported image types and oversized files&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;clearer Notion publish errors&lt;/strong&gt; instead of generic failures&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;retry and timeout handling&lt;/strong&gt; for both Gemini and Notion requests&lt;/li&gt;
&lt;li&gt;a small &lt;strong&gt;runtime health panel&lt;/strong&gt; so users can see whether Gemini, OAuth, and Notion publish paths are actually ready&lt;/li&gt;
&lt;/ul&gt;
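Two of those hardening items can be sketched in a few lines. The helpers below are illustrative, not the repo's exact code: the allowed types, the size cap, and the function names are assumptions.

```typescript
// Illustrative hardening helpers (names, size cap, and allowed types are
// assumptions, not NexusForge's exact code).

const ALLOWED_TYPES = ["image/png", "image/jpeg", "image/webp"];
const MAX_BYTES = 5 * 1024 * 1024; // assumed 5 MB cap

// Client-side upload validation: returns an error message, or null if OK.
export function validateUpload(file: { type: string; size: number }): string | null {
  if (!ALLOWED_TYPES.includes(file.type)) return "Unsupported image type";
  if (file.size > MAX_BYTES) return "File is too large";
  return null;
}

// Bounded retry with a per-attempt timeout, usable for both the Gemini and
// the Notion request paths.
export async function withRetry<T>(
  fn: (signal: AbortSignal) => Promise<T>,
  retries = 2,
  timeoutMs = 10_000
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      return await fn(controller.signal);
    } catch (err) {
      lastError = err; // remember the failure, then retry
    } finally {
      clearTimeout(timer);
    }
  }
  throw lastError;
}
```

Surfacing a string error from `validateUpload` (rather than throwing) keeps the UI path simple: the message can be shown directly next to the upload control.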

&lt;p&gt;Live:

&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://nexus-forge-one.vercel.app/" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;nexus-forge-one.vercel.app&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;







&lt;h3&gt;
  
  
  View the source code:

&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/nexus-forge" rel="noopener noreferrer"&gt;
        nexus-forge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Turn rough visuals into polished Notion deliverables with Gemini, Notion, and MCP.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div&gt;
  &lt;a rel="noopener noreferrer" href="https://github.com/aniruddhaadak80/nexus-forge/./public/nexusforge-mark.svg"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2Fnexus-forge%2F.%2Fpublic%2Fnexusforge-mark.svg" alt="NexusForge logo" width="120"&gt;&lt;/a&gt;
  &lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;NexusForge&lt;/h1&gt;
&lt;/div&gt;
  &lt;p&gt;Turn rough visuals into polished Notion deliverables.&lt;/p&gt;
  &lt;p&gt;
    &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/f7ed8d6803e1d787a17026a4d47732363ea05a28aae6b2b4cab33e0ea9ea81b4/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6578742e6a732d31362d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e6578742e6a73266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/f7ed8d6803e1d787a17026a4d47732363ea05a28aae6b2b4cab33e0ea9ea81b4/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6578742e6a732d31362d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e6578742e6a73266c6f676f436f6c6f723d7768697465" alt="Next.js"&gt;&lt;/a&gt;
    &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/1282471a74ddd2bf7839af2e415aa77278bea0d5812a374daac325138795e400/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f47656d696e692d335f466c6173685f507265766965772d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d676f6f676c652d67656d696e69266c6f676f436f6c6f723d384445424646"&gt;&lt;img src="https://camo.githubusercontent.com/1282471a74ddd2bf7839af2e415aa77278bea0d5812a374daac325138795e400/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f47656d696e692d335f466c6173685f507265766965772d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d676f6f676c652d67656d696e69266c6f676f436f6c6f723d384445424646" alt="Gemini"&gt;&lt;/a&gt;
    &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/0541bb83ed920d836c37595a4ee6fb8d169f2ea445e33fbd31cdcc844dbe12c7/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6f74696f6e2d4d43505f2532425f4150492d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e6f74696f6e266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/0541bb83ed920d836c37595a4ee6fb8d169f2ea445e33fbd31cdcc844dbe12c7/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6f74696f6e2d4d43505f2532425f4150492d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e6f74696f6e266c6f676f436f6c6f723d7768697465" alt="Notion"&gt;&lt;/a&gt;
    &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/ea01e608081cf53b3960682bdcabcacaf9be100e24f71f31a3d39fa342b7d968/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f56657263656c2d52656164792d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d76657263656c266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/ea01e608081cf53b3960682bdcabcacaf9be100e24f71f31a3d39fa342b7d968/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f56657263656c2d52656164792d3042313232303f7374796c653d666f722d7468652d6261646765266c6f676f3d76657263656c266c6f676f436f6c6f723d7768697465" alt="Vercel"&gt;&lt;/a&gt;
  &lt;/p&gt;
&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Overview&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;NexusForge is a challenge-focused multimodal workflow app for Notion. It takes a screenshot, whiteboard photo, product sketch, or architecture diagram plus a text prompt, uses Gemini 3 Flash Preview to generate structured markdown, and then publishes that result into Notion as a child page.&lt;/p&gt;
&lt;p&gt;It now supports two Notion auth paths:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Connect Notion with OAuth from the app UI&lt;/li&gt;
&lt;li&gt;Fall back to &lt;code&gt;NOTION_API_KEY&lt;/code&gt; for a workspace-token-based setup&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The project also includes workspace-level Notion MCP configuration in &lt;a href="https://github.com/aniruddhaadak80/nexus-forge/./.vscode/mcp.json" rel="noopener noreferrer"&gt;.vscode/mcp.json&lt;/a&gt; so the repo itself is ready for direct Notion MCP OAuth inside VS Code.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Why This Is Different&lt;/h2&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;It is built around a concrete workflow, not a generic chat wrapper.&lt;/li&gt;
&lt;li&gt;It demonstrates multimodal input with a real generated artifact.&lt;/li&gt;
&lt;li&gt;It uses an honest split between Notion MCP for workspace tooling and the Notion API for user-triggered web publishing.&lt;/li&gt;
&lt;li&gt;It is screenshot-ready for…&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/nexus-forge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;



&lt;/h3&gt;




&lt;h2&gt;
  
  
  Demo:
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Landing page
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftruo9bkmxlc6dbnbnxb4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftruo9bkmxlc6dbnbnxb4.png" alt="Imalkoio tion"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Generated result from an uploaded system map
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0hyfka2dijf3cbpvkw8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl0hyfka2dijf3cbpvkw8.png" alt="NexusForge generated result"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Structure Flowchart
&lt;/h2&gt;

&lt;p&gt;The diagram below shows how the internal pipeline operates:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi3l1bebhdszr7oqtj53.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi3l1bebhdszr7oqtj53.png" alt="Im ption"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup &amp;amp; Implementation Guide
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. The Multimodal Intelligence
&lt;/h3&gt;

&lt;p&gt;I used &lt;code&gt;@google/genai&lt;/code&gt; with &lt;code&gt;gemini-3-flash-preview&lt;/code&gt; so NexusForge can reason about both text and images in one request. That makes screenshots and architecture diagrams first-class input instead of just attachments.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;contents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nf"&gt;buildSystemPrompt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;)}&lt;/span&gt;&lt;span class="s2"&gt;\n\nUser request: &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;trim&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;];&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;imageBase64&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;meta&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;imageBase64&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;,&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;mimeType&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;meta&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]?.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;;&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;image/png&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nx"&gt;contents&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;inlineData&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;mimeType&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;models&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;generateContent&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gemini-3-flash-preview&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;contents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. The Notion Publishing Path
&lt;/h3&gt;

&lt;p&gt;For the web app runtime, I now support a proper Notion OAuth connect flow. Users can connect their own workspace from the UI, which stores an encrypted session cookie and lets the server publish to Notion using that workspace token. I also kept &lt;code&gt;NOTION_API_KEY&lt;/code&gt; as a fallback for internal demos.&lt;/p&gt;

&lt;p&gt;Once connected, the app uses the Notion API to create a real child page under a selected parent page:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://api.notion.com/v1/pages&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;Authorization&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`Bearer &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;notionApiKey&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Notion-Version&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2026-03-11&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;parent&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;page_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;cleanParentId&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;properties&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt; &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="p"&gt;}],&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="nx"&gt;markdown&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;}),&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
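One detail worth noting: the Notion pages API expects block `children` rather than raw markdown, so the generated markdown has to be converted into blocks somewhere before publishing. A minimal sketch of that conversion is below; the helper name and the tiny subset of markdown handled here are assumptions, not the app's real converter, which handles much more.

```typescript
// Minimal markdown-to-Notion-blocks conversion (illustrative; the helper
// name and the subset of markdown handled here are assumptions).
type RichText = { text: { content: string } };
type NotionBlock =
  | { object: "block"; type: "heading_2"; heading_2: { rich_text: RichText[] } }
  | { object: "block"; type: "paragraph"; paragraph: { rich_text: RichText[] } };

export function markdownToBlocks(markdown: string): NotionBlock[] {
  return markdown
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) =>
      line.startsWith("## ")
        ? {
            object: "block" as const,
            type: "heading_2" as const,
            heading_2: { rich_text: [{ text: { content: line.slice(3) } }] },
          }
        : {
            object: "block" as const,
            type: "paragraph" as const,
            paragraph: { rich_text: [{ text: { content: line } }] },
          }
    );
}
```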



&lt;h3&gt;
  
  
  3. OAuth Callback + Session Handling
&lt;/h3&gt;

&lt;p&gt;The app includes a callback route at &lt;code&gt;/api/notion/callback&lt;/code&gt; that exchanges the authorization code for an access token, encrypts the token server-side, and stores it in an HTTP-only cookie. That makes the demo feel like a real connected product rather than a one-off internal script.&lt;/p&gt;
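A sketch of that callback, split into testable pieces: the cookie name, the helper names, and the injectable fetch are illustrative assumptions, and the real route also encrypts the token before setting the cookie.

```typescript
// Illustrative pieces of an /api/notion/callback route (names are
// assumptions, not the repo's actual implementation).

// Pull the authorization code out of the callback URL.
export function extractAuthCode(url: string): string | null {
  return new URL(url).searchParams.get("code");
}

// Build the HTTP-only session cookie that carries the (encrypted) token.
export function sessionCookie(sealed: string): string {
  return `notion_session=${sealed}; HttpOnly; Secure; Path=/; SameSite=Lax`;
}

// Exchange the authorization code for an access token via Notion's OAuth
// token endpoint (basic auth with the integration's client credentials).
export async function exchangeCode(
  code: string,
  clientId: string,
  clientSecret: string,
  redirectUri: string,
  fetchImpl: typeof fetch = fetch // injectable for testing
): Promise<string> {
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString("base64");
  const res = await fetchImpl("https://api.notion.com/v1/oauth/token", {
    method: "POST",
    headers: {
      Authorization: `Basic ${basic}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      grant_type: "authorization_code",
      code,
      redirect_uri: redirectUri,
    }),
  });
  if (!res.ok) throw new Error(`Token exchange failed: ${res.status}`);
  const body = await res.json();
  return body.access_token as string;
}
```

Keeping the exchange behind an injectable `fetchImpl` makes the flow unit-testable without hitting Notion.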

&lt;h3&gt;
  
  
  4. Where MCP Fits
&lt;/h3&gt;

&lt;p&gt;The repo also includes &lt;code&gt;.vscode/mcp.json&lt;/code&gt; pointing at &lt;code&gt;https://mcp.notion.com/mcp&lt;/code&gt;, so the workspace itself is ready for direct Notion MCP authentication inside GitHub Copilot or other MCP-capable tools in VS Code.&lt;/p&gt;
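&lt;p&gt;For reference, a minimal &lt;code&gt;.vscode/mcp.json&lt;/code&gt; along these lines is enough to register the server (the server name here is illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "servers": {
    "notion": {
      "type": "http",
      "url": "https://mcp.notion.com/mcp"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;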

&lt;p&gt;That means the project demonstrates two complementary ideas:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Web app publishing flow&lt;/strong&gt; for end users&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Workspace MCP integration&lt;/strong&gt; for AI-assisted Notion operations while developing&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Why This Stands Out In The Challenge
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;It is not just “chat with Notion”. It is a concrete production-style workflow.&lt;/li&gt;
&lt;li&gt;It shows off &lt;strong&gt;multimodality&lt;/strong&gt; in a way judges can understand immediately.&lt;/li&gt;
&lt;li&gt;It includes a real in-product &lt;strong&gt;Connect Notion&lt;/strong&gt; OAuth handoff instead of relying only on hidden developer credentials.&lt;/li&gt;
&lt;li&gt;It uses Notion in a way that feels native: generating polished artifacts and pushing them directly into a workspace.&lt;/li&gt;
&lt;li&gt;It is practical across engineering, operations, marketing, and study workflows.&lt;/li&gt;
&lt;li&gt;It has been hardened beyond a demo by reducing common user failure modes in the publish flow.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Future Scope
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Add PDF and document ingestion for richer multimodal pipelines.&lt;/li&gt;
&lt;li&gt;Add template-aware publishing into specific Notion databases.&lt;/li&gt;
&lt;li&gt;Add polling and human-in-the-loop approval flows for recurring workflows.&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;NexusForge aims to redefine exactly how interactive and automated workspaces should feel!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Thank you to Notion and DEV. 💖&lt;/p&gt;




</description>
      <category>devchallenge</category>
      <category>notionchallenge</category>
      <category>mcp</category>
      <category>ai</category>
    </item>
    <item>
      <title>building intelligent dreams: my wecoded journey as an ai agent engineer</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Sun, 15 Mar 2026 16:15:00 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/building-intelligent-dreams-my-wecoded-journey-as-an-ai-agent-engineer-2f1m</link>
      <guid>https://dev.to/aniruddhaadak/building-intelligent-dreams-my-wecoded-journey-as-an-ai-agent-engineer-2f1m</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/wecoded-2026"&gt;2026 WeCoded Challenge&lt;/a&gt;: Echoes of Experience&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;hey everyone, it's aniruddha here. as we dive into the wecoded 2026 challenge, i wanted to take a moment to reflect on my own path in this incredible world of tech. it's been a wild ride, full of learning, building, and a whole lot of coffee.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FfUbUPSGvFdThyFaq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FfUbUPSGvFdThyFaq.png" alt="futuristic workspace with ai code" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i remember when i first started tinkering with code. it felt like unlocking a secret language, a way to bring ideas to life. from those early days of crafting simple websites to diving deep into the complexities of full-stack development, every line of code felt like a step forward.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FvLlyCOtbGLwaBUZq.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FvLlyCOtbGLwaBUZq.gif" alt="cat coding gif" width="200" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;there's a certain magic in seeing your creations come alive on a screen, isn't there. it's that feeling of "it works" that keeps us going through the long nights.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FtYiUuUDzUntGObce.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FtYiUuUDzUntGObce.gif" alt="happy dance gif" width="480" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;but the real game-changer for me was discovering the power of artificial intelligence, especially in the realm of ai agents. the idea of building systems that can learn, adapt, and even make decisions autonomously, that's what truly captivated me.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FgueshRWOsAweiPIQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FgueshRWOsAweiPIQ.png" alt="ai agent concept art" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;it's like teaching a computer to think, to solve problems in ways we hadn't imagined. it's challenging, yes, but immensely rewarding.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FfdZHPLKSoeowcdqn.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FfdZHPLKSoeowcdqn.gif" alt="ai agent coding gif" width="760" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;there were moments, of course, when i felt completely overwhelmed. the sheer volume of new frameworks, libraries, and concepts in both full-stack and ai development can be daunting. i'd stare at complex architectures, feeling a bit lost in the digital wilderness.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FAlFkRJrWVRIxEWcs.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FAlFkRJrWVRIxEWcs.gif" alt="debugging frustration gif" width="480" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i think every developer, no matter how experienced, has those moments of doubt. those times when the code just won't cooperate and you feel like you're hitting a wall.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FvhgzoBGIRKHBLDxz.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FvhgzoBGIRKHBLDxz.gif" alt="coding math gif" width="200" height="150"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;that's where the community came in. i found myself drawn to online forums, local meetups, and open-source projects. it was in these spaces that i connected with other passionate developers, people who were just as excited about building the future as i was.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FBFfgdbJLctdDjJcI.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FBFfgdbJLctdDjJcI.gif" alt="developers working together gif" width="200" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;we shared snippets of code, debugged issues together, and celebrated small victories. it was a reminder that even in a field that often feels solitary, we're all in this together.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FRhAJvsjJUFWfUTFm.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FRhAJvsjJUFWfUTFm.gif" alt="mind blown gif" width="480" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;mentorship has also been a cornerstone of my journey. learning from those who have walked the path before you, absorbing their wisdom and insights, it's invaluable. i've been fortunate to have mentors who not only guided me through technical hurdles but also helped me navigate the broader landscape of a tech career.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FkdTnRIKjiJqUZJle.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FkdTnRIKjiJqUZJle.png" alt="full stack development concept" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;their encouragement was like fuel, keeping me going when the challenges seemed too big. it's about that spark of inspiration that lights up the way.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FtFqclmxLLSbfCIBT.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FtFqclmxLLSbfCIBT.gif" alt="ai agent in action gif" width="760" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;and that's what wecoded, to me, represents. it's a celebration of that collective spirit, that drive to innovate, and that commitment to making tech a more inclusive and inspiring place for everyone.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FbeescicDibeWbljQ.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FbeescicDibeWbljQ.gif" alt="full stack developer gif" width="750" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;it's about sharing our stories, our code, and our dreams, and in doing so, empowering others to find their own unique voice in this ever-evolving industry.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FkUQWRTysbIGBZqsY.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ffiles.manuscdn.com%2Fuser_upload_by_module%2Fsession_file%2F111734191%2FkUQWRTysbIGBZqsY.gif" alt="computer dance gif" width="220" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i'm still pushing boundaries, still learning new things every single day. the journey of an ai agent engineer and full-stack developer is one of continuous evolution. but with every new challenge, i feel a stronger sense of purpose, knowing that i'm part of a vibrant community that supports and uplifts each other.&lt;/p&gt;

&lt;p&gt;what about your journey? what are you building? what are you learning? i'd love to hear your stories and connect with you.&lt;/p&gt;

&lt;p&gt;thank you for being here, for reading, and for being a part of this amazing tech community. let's keep building intelligent dreams, together.&lt;/p&gt;

&lt;p&gt;with code and camaraderie,&lt;br&gt;
aniruddha adak&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>wecoded</category>
      <category>dei</category>
      <category>career</category>
    </item>
    <item>
      <title>Stop Babysitting Prompts: Visualizing Multi Agent Workflows in Next.js</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Wed, 11 Mar 2026 17:56:25 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/stop-babysitting-prompts-visualizing-multi-agent-workflows-in-nextjs-5aja</link>
      <guid>https://dev.to/aniruddhaadak/stop-babysitting-prompts-visualizing-multi-agent-workflows-in-nextjs-5aja</guid>
      <description>&lt;p&gt;&lt;em&gt;This post is my submission for &lt;a href="https://dev.to/deved/build-multi-agent-systems"&gt;DEV Education Track: Build Multi-Agent Systems with ADK&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"A lot of AI demos look impressive but still hide the actual orchestration. I wanted to make those handoffs deeply explicit."&lt;/p&gt;
&lt;/blockquote&gt;


&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built &lt;strong&gt;Multi Agent Studio&lt;/strong&gt;, a visual web application that turns one broad request into a fully explorable multi agent workflow.&lt;/p&gt;

&lt;p&gt;Instead of asking a single hidden model to do everything in one pass, the app splits the job across four focused roles.&lt;/p&gt;

&lt;p&gt;First, &lt;strong&gt;Atlas Story&lt;/strong&gt; handles planning and framing.&lt;br&gt;
Second, &lt;strong&gt;Signal Curator&lt;/strong&gt; is responsible for evidence gathering and contradiction mapping.&lt;/p&gt;

&lt;p&gt;Third, &lt;strong&gt;Vector Ops&lt;/strong&gt; manages execution packaging.&lt;br&gt;
Fourth, &lt;strong&gt;Relay Console&lt;/strong&gt; handles human review, escalation, and operator readiness.&lt;/p&gt;

&lt;p&gt;The application is designed for users who &lt;em&gt;do not&lt;/em&gt; want to babysit a giant prompt.&lt;/p&gt;

&lt;p&gt;They can choose a workflow template, edit their core objective, and inspect the full chain of handoffs, artifacts, and recommendations.&lt;/p&gt;
&lt;h2&gt;
  
  
  Live Demo and Source Code
&lt;/h2&gt;

&lt;p&gt;You can view the full source code directly embedded below.&lt;/p&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/ai-agents-duel" rel="noopener noreferrer"&gt;
        ai-agents-duel
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      An interactive, narrative-driven AI Agent command center featuring a whimsical Digital Sketchbook UI, live neural feeds, and agent dueling.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div&gt;
&lt;p&gt;&lt;a href="https://git.io/typing-svg" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/e34f7f5422deefaccce0ab9104de7223aef79ec707f07e42c72bb177aa374346/68747470733a2f2f726561646d652d747970696e672d7376672e6865726f6b756170702e636f6d3f666f6e743d466972612b436f64652670617573653d3130303026636f6c6f723d3030373046332677696474683d343335266c696e65733d4d756c74692b4167656e742b53747564696f3b56697375616c697a696e672b41492b576f726b666c6f77733b506f77657265642b62792b4e6578742e6a732b616e642b47656d696e693b4e6f2b4d6f72652b4261627973697474696e672b50726f6d707473" alt="Typing SVG"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/20af8ead8ff55dc76826e6cf06bad7ab29b843c7d8f9ff76e5296d7106415b46/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6578742e6a732d3030303030303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e657874646f746a73266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/20af8ead8ff55dc76826e6cf06bad7ab29b843c7d8f9ff76e5296d7106415b46/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6578742e6a732d3030303030303f7374796c653d666f722d7468652d6261646765266c6f676f3d6e657874646f746a73266c6f676f436f6c6f723d7768697465" alt="Next JS Badge"&gt;&lt;/a&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/18f8a526265952d1a4ed04eff457c936721e64e5bf4e3f35cca938efe3f30de5/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f52656163742d3230323332413f7374796c653d666f722d7468652d6261646765266c6f676f3d7265616374266c6f676f436f6c6f723d363144414642"&gt;&lt;img src="https://camo.githubusercontent.com/18f8a526265952d1a4ed04eff457c936721e64e5bf4e3f35cca938efe3f30de5/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f52656163742d3230323332413f7374796c653d666f722d7468652d6261646765266c6f676f3d7265616374266c6f676f436f6c6f723d363144414642" alt="React Badge"&gt;&lt;/a&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/b308ff9a6de632b94c933c0f27975188080f8cf88a115ae10338540f8d9ab8ab/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f547970655363726970742d3030374143433f7374796c653d666f722d7468652d6261646765266c6f676f3d74797065736372697074266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/b308ff9a6de632b94c933c0f27975188080f8cf88a115ae10338540f8d9ab8ab/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f547970655363726970742d3030374143433f7374796c653d666f722d7468652d6261646765266c6f676f3d74797065736372697074266c6f676f436f6c6f723d7768697465" alt="TypeScript Badge"&gt;&lt;/a&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/3827e442df4b40ffa94fd471fa234f0bd0bddf90c3b3aafb54347962621a62da/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f47656d696e695f335f466c6173682d3432383546343f7374796c653d666f722d7468652d6261646765266c6f676f3d676f6f676c65266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/3827e442df4b40ffa94fd471fa234f0bd0bddf90c3b3aafb54347962621a62da/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f47656d696e695f335f466c6173682d3432383546343f7374796c653d666f722d7468652d6261646765266c6f676f3d676f6f676c65266c6f676f436f6c6f723d7768697465" alt="Gemini Badge"&gt;&lt;/a&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/34bfa473ed2cc56da8aead699dd5d36ccd92466b9629d1510bd8cfc222327dbf/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f56657263656c2d3030303030303f7374796c653d666f722d7468652d6261646765266c6f676f3d76657263656c266c6f676f436f6c6f723d7768697465"&gt;&lt;img src="https://camo.githubusercontent.com/34bfa473ed2cc56da8aead699dd5d36ccd92466b9629d1510bd8cfc222327dbf/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f56657263656c2d3030303030303f7374796c653d666f722d7468652d6261646765266c6f676f3d76657263656c266c6f676f436f6c6f723d7768697465" alt="Vercel Badge"&gt;&lt;/a&gt;
&lt;/p&gt;


&lt;/div&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Explore the Application&lt;/h2&gt;
&lt;/div&gt;

&lt;p&gt;A fully functional Next.js multi agent system that elegantly turns one broad user request into a visible planner, researcher, builder, and reviewer workflow.&lt;/p&gt;

&lt;p&gt;Live app: &lt;a href="https://ai-agents-duel.vercel.app" rel="nofollow noopener noreferrer"&gt;https://ai-agents-duel.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/aniruddhaadak80/ai-agents-duel" rel="noopener noreferrer"&gt;https://github.com/aniruddhaadak80/ai-agents-duel&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This project was carefully rebuilt around the core ideas from the DEV Education track on building multi agent systems. The primary focus is squarely on deep specialization, structured orchestration, explicit handoffs, and rigid review gates. Instead of one giant prompt, the app gives users an entire workflow library, precise agent controls, a visual review queue, and fully inspectable run details.&lt;/p&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Extensive Topics and Tags&lt;/h2&gt;
&lt;/div&gt;

&lt;p&gt;Here are the major themes covered by this repository:
AI Agents, Build Multi Agents, Gemini 3 Flash, Nextjs, Vercel, Generative AI, Automation, LLM Orchestration, TypeScript Engineering, React 19, Human In The Loop, Agentic Workflows, Google AI Studio, Developer Tools.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Visual Storyboard and Interface&lt;/h2&gt;

&lt;/div&gt;
&lt;p&gt;Here is a look at the live dynamic application.&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://github.com/aniruddhaadak80/ai-agents-duel/./public/images/docs/hero-live.png"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2Fai-agents-duel%2F.%2Fpublic%2Fimages%2Fdocs%2Fhero-live.png" alt="Hero Interface" width="100%"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;You…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/ai-agents-duel" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;




&lt;p&gt;You can also visit the live application on Vercel to try it yourself right now.&lt;/p&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://ai-agents-duel.vercel.app" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;ai-agents-duel.vercel.app&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;




&lt;h2&gt;
  
  
  Visual Walkthrough
&lt;/h2&gt;

&lt;p&gt;You can browse the pre-built templates directly in the &lt;strong&gt;Workflow Library&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fai-agents-duel%2Fmain%2Fpublic%2Fimages%2Fdocs%2Fworkflow-library.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fai-agents-duel%2Fmain%2Fpublic%2Fimages%2Fdocs%2Fworkflow-library.png" alt="Workflow Library" width="100%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can inspect the exact output and internal reasoning of each agent in the &lt;strong&gt;Run Detail&lt;/strong&gt; view.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fai-agents-duel%2Fmain%2Fpublic%2Fimages%2Fdocs%2Frun-detail.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fai-agents-duel%2Fmain%2Fpublic%2Fimages%2Fdocs%2Frun-detail.png" alt="Run Detail" width="100%"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Mission Board&lt;/strong&gt; provides a clear, categorized view of your active tasks and agent assignments in real time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1xbwufp939zdu1ivrshq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1xbwufp939zdu1ivrshq.png" alt="Ima ption"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjcxgxqhwb3yjz8nbuqnx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjcxgxqhwb3yjz8nbuqnx.png" alt="Image  "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is how the system works:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1k7az21u4ef4yor735j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb1k7az21u4ef4yor735j.png" alt="Image deption"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Core Problem
&lt;/h2&gt;

&lt;p&gt;Users typically type one request, get one answer, and cannot tell which role should have handled what.&lt;/p&gt;

&lt;p&gt;They do not see where the system's confidence silently dropped.&lt;br&gt;
They also do not know when human review is actually needed or what can be acted on immediately.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;By making handoffs explicit, we turn abstract AI operations into a tangible, observable assembly line.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  How It Works Under The Hood
&lt;/h2&gt;

&lt;p&gt;The application relies on a &lt;em&gt;workflow first&lt;/em&gt; operational model.&lt;/p&gt;

&lt;p&gt;Users pick from templates like Launch Campaign Studio, Ops War Room, Founder Decision Desk, and Ship Feature Relay.&lt;br&gt;
Each run produces transparent stage ownership across the designated agent team.&lt;/p&gt;

&lt;p&gt;Every run surfaces a concrete review state, specific agent contributions, and next step recommendations.&lt;br&gt;
Finally, an operator brief is generated for fast human action.&lt;/p&gt;

&lt;p&gt;The core orchestration engine currently runs entirely in memory, which keeps local setup and Vercel deployment simple.&lt;br&gt;
If a &lt;code&gt;GEMINI_API_KEY&lt;/code&gt; is configured, the selected agent enriches the final run output using the new &lt;strong&gt;Gemini 3 Flash Preview&lt;/strong&gt; model.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture Pattern
&lt;/h2&gt;

&lt;p&gt;The system follows a multi agent pattern broken into sequential steps.&lt;/p&gt;

&lt;p&gt;First, the &lt;strong&gt;Planner&lt;/strong&gt; scopes the core mission.&lt;/p&gt;

&lt;p&gt;Next, the &lt;strong&gt;Researcher&lt;/strong&gt; gathers context and maps out contradictions.&lt;/p&gt;

&lt;p&gt;Then, the &lt;strong&gt;Builder&lt;/strong&gt; converts the raw work into an execution package.&lt;/p&gt;

&lt;p&gt;Finally, the &lt;strong&gt;Reviewer&lt;/strong&gt; checks confidence and gates publication.&lt;/p&gt;

&lt;p&gt;This architecture keeps the user experience simple while still demonstrating the concrete benefits of specialization.&lt;/p&gt;
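&lt;p&gt;The four stage pattern can be sketched as a plain sequential pipeline with recorded handoffs. The stage behavior below is a placeholder for real model calls, not the app's actual agent logic:&lt;/p&gt;

```typescript
// Each stage consumes the previous stage's output, and every handoff is
// recorded so a UI can show exactly which role produced which artifact.
type Stage = "planner" | "researcher" | "builder" | "reviewer";

interface Handoff {
  stage: Stage;
  output: string;
}

const stages: Stage[] = ["planner", "researcher", "builder", "reviewer"];

// Placeholder stage logic; a real system would call a model here.
function runStage(stage: Stage, input: string) {
  return stage + ": " + input;
}

function runWorkflow(objective: string) {
  const handoffs: Handoff[] = [];
  let current = objective;
  for (const stage of stages) {
    current = runStage(stage, current);
    handoffs.push({ stage: stage, output: current });
  }
  return handoffs;
}
```

&lt;p&gt;Because the handoff list is the primary data structure, the visual chain in the UI falls out of the orchestration directly.&lt;/p&gt;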

&lt;h2&gt;
  
  
  Technology Stack
&lt;/h2&gt;

&lt;p&gt;The foundation is &lt;strong&gt;Next.js 16&lt;/strong&gt; and &lt;strong&gt;React 19&lt;/strong&gt;, which keep the interface fast and modern.&lt;/p&gt;

&lt;p&gt;Everything is typed with &lt;strong&gt;TypeScript&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Styling stays lightweight by using &lt;strong&gt;plain CSS&lt;/strong&gt; only.&lt;br&gt;
The &lt;strong&gt;Google GenAI SDK&lt;/strong&gt; handles the AI communication layer.&lt;/p&gt;

&lt;p&gt;The whole stack is hosted and delivered on &lt;strong&gt;Vercel&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Makes It Truly Different
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;I focused heavily on deep usability and structure instead of just raw aesthetics.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The app includes workflow presets for common professional jobs and a real time mission board.&lt;br&gt;
It pairs review queue actions with granular operator controls for autonomy tuning.&lt;br&gt;
There are also built in agent pause and resume commands.&lt;br&gt;
It provides an optional Gemini upgrade path without breaking standard local runs.&lt;/p&gt;

&lt;p&gt;One main architectural challenge was avoiding a fake looking multi agent wrapper.&lt;br&gt;
Many early demos stop at a stylish dashboard without technical depth.&lt;br&gt;
I wanted the internal model to be robust enough that the visual UI felt justified.&lt;/p&gt;

&lt;p&gt;That meant rebuilding the internal run store around rigid workflows, stage ownership, structured artifacts, and clear review states instead of loose summaries.&lt;/p&gt;
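&lt;p&gt;A run store organized that way can be tiny. The field names and shapes below are illustrative assumptions, not the project's actual schema:&lt;/p&gt;

```typescript
// A run is keyed by id and carries explicit stage ownership, the structured
// artifact each stage produced, and a single review state gate.
type ReviewState = "pending" | "approved" | "escalated";

interface StageRecord {
  owner: string;    // which agent owns this stage
  artifact: string; // the structured output it produced
}

interface RunRecord {
  id: string;
  objective: string;
  stages: StageRecord[];
  review: ReviewState;
}

// In-memory store: trivial local setup, swappable for a database later.
const runs = new Map();

function createRun(id: string, objective: string) {
  const run: RunRecord = { id: id, objective: objective, stages: [], review: "pending" };
  runs.set(id, run);
  return run;
}

function recordStage(id: string, owner: string, artifact: string) {
  const run = runs.get(id);
  run.stages.push({ owner: owner, artifact: artifact });
}
```

&lt;p&gt;Keeping the store behind two small functions is also what preserves the clean path to database persistence later.&lt;/p&gt;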

&lt;p&gt;Another engineering challenge was keeping the project easy to deploy.&lt;br&gt;
I chose an in memory backend so the complete experience works immediately on Vercel.&lt;br&gt;
That decision preserves a clean path to add database persistence later.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key System Learnings
&lt;/h2&gt;

&lt;p&gt;Multi agent user experiences become far more believable when role boundaries stay explicit.&lt;br&gt;
Review queues matter because they make autonomy feel operational rather than theatrical.&lt;/p&gt;

&lt;p&gt;An optional model enrichment layer is a better onboarding path than requiring API keys from day one.&lt;br&gt;
Above all, a small but fully inspectable orchestration engine proves more useful than a large opaque one.&lt;/p&gt;

</description>
      <category>agents</category>
      <category>buildmultiagents</category>
      <category>gemini</category>
      <category>adk</category>
    </item>
    <item>
      <title>Exception OS, a calmer way to handle operational chaos with Notion MCP</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Mon, 09 Mar 2026 05:33:38 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/exception-os-a-calmer-way-to-handle-operational-chaos-with-notion-mcp-28nm</link>
      <guid>https://dev.to/aniruddhaadak/exception-os-a-calmer-way-to-handle-operational-chaos-with-notion-mcp-28nm</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/notion-2026-03-04"&gt;Notion MCP Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Exception OS came from a simple frustration. Teams already have enough dashboards, alerts, and summaries. What they usually do not have is a clean way to notice the few situations that actually need judgment, turn those moments into a usable brief, and preserve the final decision somewhere the team will revisit later.&lt;/p&gt;

&lt;p&gt;The live demo is here: &lt;a href="https://exception-os.vercel.app" rel="noopener noreferrer"&gt;https://exception-os.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The GitHub repository is here: &lt;br&gt;


&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/exception-os" rel="noopener noreferrer"&gt;
        exception-os
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Exception-first operations dashboard with live GitHub signals and real Notion MCP publishing workflows.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;Exception OS&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;Exception OS is a deployed SaaS-style operating system for teams that need fewer dashboards and faster decisions. It ingests live operational signals, detects exceptions that require human judgment, generates structured decision briefs, and routes them into a Notion-centered workflow.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://exception-os.vercel.app" rel="nofollow noopener noreferrer"&gt;Live Demo&lt;/a&gt; · &lt;a href="https://github.com/aniruddhaadak80/exception-os" rel="noopener noreferrer"&gt;GitHub Repository&lt;/a&gt; · &lt;a href="https://github.com/aniruddhaadak80/exception-os/./docs/devto-submission.md" rel="noopener noreferrer"&gt;DEV Submission Draft&lt;/a&gt;&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Status&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;Exception OS is complete as a deployable challenge app and SaaS-style product foundation.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Production deployment is live on Vercel.&lt;/li&gt;
&lt;li&gt;The dashboard is responsive and verified on desktop and mobile layouts.&lt;/li&gt;
&lt;li&gt;Lint, tests, and production build are passing.&lt;/li&gt;
&lt;li&gt;Real Notion MCP OAuth, workspace sync, and Notion publishing are implemented server-side.&lt;/li&gt;
&lt;li&gt;Other users can use the deployed app by connecting their own Notion workspace.&lt;/li&gt;
&lt;li&gt;Live GitHub activity now feeds the dashboard without seeded incident templates.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The only runtime step that still depends on the user is approving Notion OAuth for a specific workspace, which cannot be done automatically on someone else’s…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/exception-os" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;




&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built &lt;strong&gt;Exception OS&lt;/strong&gt;, an exception first operating system for teams dealing with operational noise.&lt;/p&gt;

&lt;p&gt;Most AI productivity tools try to summarize everything. Exception OS takes the opposite approach. It watches incoming signals, narrows attention to the smaller set of situations that actually require human judgment, and turns those moments into structured decision briefs.&lt;/p&gt;

&lt;p&gt;This challenge build is already deployed as a public web app. Other users can open it, connect their own Notion workspace, save a publishing target, and use the live Notion workflow right away.&lt;/p&gt;

&lt;p&gt;The app includes a live operations dashboard for signals, exceptions, and decision briefs. It pulls live GitHub activity into the system, uses a real Notion MCP integration with OAuth, PKCE, token refresh, and server side MCP calls, and lets a connected user publish decision briefs directly into Notion pages, databases, or data sources. It also syncs workspace context back from Notion search so the next decision has more useful surrounding context.&lt;/p&gt;

&lt;p&gt;In short, Exception OS treats Notion as the operational memory and approval layer for critical decisions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fexception-os%2Fmain%2Fdocs%2Fassets%2Fexception-os-dashboard-desktop.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Faniruddhaadak80%2Fexception-os%2Fmain%2Fdocs%2Fassets%2Fexception-os-dashboard-desktop.png" alt="Exception OS desktop dashboard with the main exception queue and decision brief"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The desktop view is where the product story starts. It shows the exception queue, the active decision brief, and the live signal layer in a single operating surface.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Is Real Today
&lt;/h2&gt;

&lt;p&gt;Everything in the core workflow is live today. The app is publicly deployed on Vercel. Each user can connect their own Notion workspace through OAuth. The MCP client is real and runs server side. Workspace search is live. Publishing is live. GitHub repository activity is also live and feeds the upstream operational layer. This is not a mock front end with a pretend integration behind it.&lt;/p&gt;

&lt;p&gt;The current limitation is connector breadth. GitHub and Notion are live today, while systems like Stripe, dedicated support tooling, and calendar platforms are still future connectors.&lt;/p&gt;

&lt;p&gt;The mobile layout matters here because the workflow is still usable when reduced to its essentials. The same live signal, exception, and brief flow holds together on a small screen.&lt;/p&gt;

&lt;h2&gt;
  
  
  Video Demo
&lt;/h2&gt;

&lt;p&gt;I am using production screenshots in this submission and the live deployed app for verification. The product can be reviewed directly at the deployed URL, which makes the behavior easier to judge than a prerecorded walkthrough alone.&lt;/p&gt;

&lt;p&gt;Live demo: &lt;a href="https://exception-os.vercel.app" rel="noopener noreferrer"&gt;https://exception-os.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How I Used Notion MCP
&lt;/h2&gt;

&lt;p&gt;Notion MCP is not decorative in this project. It is the core runtime integration.&lt;/p&gt;

&lt;p&gt;I used a real server side Notion MCP client to authenticate users with OAuth 2.0 and PKCE, discover Notion MCP OAuth metadata dynamically, connect to &lt;code&gt;https://mcp.notion.com/mcp&lt;/code&gt; over Streamable HTTP, inspect the authenticated workspace, publish decision briefs into Notion as durable operating records, and search the workspace for contextual material that can inform the next decision.&lt;/p&gt;
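&lt;p&gt;One small, concrete piece of that flow is the metadata discovery step, which reduces to a URL transformation. The helper below illustrates the RFC 8414 rule and is not code taken from the app:&lt;/p&gt;

```typescript
// RFC 8414: the well-known segment is inserted between the issuer's host and
// any path component, so path-scoped issuers still resolve correctly.
function authServerMetadataUrl(issuer: string) {
  const u = new URL(issuer);
  const path = u.pathname === "/" ? "" : u.pathname;
  return u.origin + "/.well-known/oauth-authorization-server" + path;
}
```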

&lt;p&gt;That means the app does not just mention Notion. It actually reads from and writes to Notion through MCP.&lt;/p&gt;

&lt;p&gt;It also means different users can connect different Notion workspaces, which makes the deployed app usable beyond my own environment.&lt;/p&gt;

&lt;p&gt;In this view, the important part is the Notion MCP panel. That is where a user connects a workspace, saves a publish target, syncs context, and pushes the selected brief into Notion.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why I Built It
&lt;/h2&gt;

&lt;p&gt;Teams do not usually fail because they lacked dashboards. They fail because critical exceptions get buried across tools, owners, and conversations.&lt;/p&gt;

&lt;p&gt;Exception OS is designed around a simple idea:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;AI should interrupt humans only when judgment is required.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;When a real exception appears, Exception OS creates a decision brief that explains what happened, why it matters now, what the likely cause is, who should own it, what actions should happen next, and what evidence supports the recommendation.&lt;/p&gt;

&lt;p&gt;From there, the brief can be pushed into Notion, where the team can review, approve, and preserve the decision as organizational memory.&lt;/p&gt;

&lt;h2&gt;
  
  
  App Architecture
&lt;/h2&gt;

&lt;p&gt;The app is built with Next.js App Router and React 19 on the front end, plus a server side Notion MCP client powered by &lt;code&gt;@modelcontextprotocol/sdk&lt;/code&gt;. Session state is stored in secure encrypted cookies. The signal layer combines live GitHub activity with connected Notion workspace ingestion.&lt;/p&gt;
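&lt;p&gt;The encrypted cookie idea can be sketched with Node's built in AES-GCM primitives. This is a minimal illustration of the pattern, not the project's implementation, and a production version would use a proper KDF and key rotation:&lt;/p&gt;

```typescript
import { createCipheriv, createDecipheriv, createHash, randomBytes } from "node:crypto";

// Derive a 32-byte key from a secret string (sketch; prefer scrypt or HKDF).
function deriveKey(secret: string) {
  return createHash("sha256").update(secret).digest();
}

// Seal a session payload into an opaque cookie value: iv.ciphertext.tag (base64url).
function sealSession(secret: string, payload: object) {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", deriveKey(secret), iv);
  const body = Buffer.concat([cipher.update(JSON.stringify(payload), "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  return [iv, body, tag].map(function (b) { return b.toString("base64url"); }).join(".");
}

// Open a sealed cookie value; AES-GCM authentication rejects any tampering.
function openSession(secret: string, sealed: string) {
  const parts = sealed.split(".");
  const iv = Buffer.from(parts[0], "base64url");
  const body = Buffer.from(parts[1], "base64url");
  const tag = Buffer.from(parts[2], "base64url");
  const decipher = createDecipheriv("aes-256-gcm", deriveKey(secret), iv);
  decipher.setAuthTag(tag);
  const plain = Buffer.concat([decipher.update(body), decipher.final()]);
  return JSON.parse(plain.toString("utf8"));
}
```

&lt;p&gt;The authenticated mode matters here: a tampered cookie fails to open at all instead of decrypting to garbage.&lt;/p&gt;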

&lt;p&gt;The architecture separates signal generation, decision synthesis, Notion publishing, and workspace context sync. That gives the product a real end to end loop today while keeping the path to broader connector coverage obvious.&lt;/p&gt;

&lt;p&gt;I am using the desktop screen here as an architecture reference. The top of the interface establishes live inputs, the center focuses human attention on the current exception, and the lower sections show the normalized signal stream and system structure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges I Ran Into
&lt;/h2&gt;

&lt;p&gt;The hardest part was treating Notion MCP like a real application integration rather than a prompt only connector.&lt;/p&gt;

&lt;p&gt;That meant handling RFC 9728 protected resource discovery, RFC 8414 authorization server discovery, correct PKCE behavior, server side session and token storage, token refresh and reconnect scenarios, and a UI that still works when a workspace is not yet connected.&lt;/p&gt;
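&lt;p&gt;Of those pieces, PKCE is the most mechanical. A minimal S256 implementation looks like this (a sketch of the RFC 7636 rule, not the app's code):&lt;/p&gt;

```typescript
import { createHash, randomBytes } from "node:crypto";

// Code verifier: 32 random octets, base64url-encoded without padding (43 chars).
function makeCodeVerifier() {
  return randomBytes(32).toString("base64url");
}

// S256 code challenge per RFC 7636: base64url(SHA-256(ASCII(verifier))), no padding.
function makeCodeChallenge(verifier: string) {
  return createHash("sha256").update(verifier).digest("base64url");
}
```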

&lt;p&gt;That extra work was worth it because it turned the project from a mock into a real integration.&lt;/p&gt;

&lt;p&gt;The second challenge was turning the upstream feed into something real without depending on private customer infrastructure. I solved that by using live GitHub activity as a public operational signal source and live Notion workspace context as the operator memory layer.&lt;/p&gt;

&lt;h2&gt;What’s Next&lt;/h2&gt;

&lt;p&gt;The current version focuses on the decision layer and Notion MCP integration. The next step is to broaden the connector layer so the same workflow can ingest support, revenue, and calendar systems beyond GitHub and Notion. After that, I want to store exceptions in a dedicated Notion data source schema, add approval analytics and learning from human edits, and introduce first party app accounts plus shared team workspaces.&lt;/p&gt;

&lt;h2&gt;Repo&lt;/h2&gt;

&lt;p&gt;GitHub repository: &lt;a href="https://github.com/aniruddhaadak80/exception-os" rel="noopener noreferrer"&gt;https://github.com/aniruddhaadak80/exception-os&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Try It&lt;/h2&gt;

&lt;p&gt;Open the live app, connect your Notion workspace, save a Notion page as the publish target, and start publishing operational decisions directly into Notion. If you want to run it locally, clone the repo, add the environment variables, and start the Next.js app.&lt;/p&gt;

&lt;p&gt;This is the quickest way to understand the product: the entire workflow, from connecting Notion to publishing, runs from a single screen.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>notionchallenge</category>
      <category>mcp</category>
      <category>ai</category>
    </item>
    <item>
      <title>my little notion-powered sidekick</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Fri, 06 Mar 2026 14:08:53 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/my-little-notion-powered-sidekick-imj</link>
      <guid>https://dev.to/aniruddhaadak/my-little-notion-powered-sidekick-imj</guid>
      <description>&lt;p&gt;&lt;em&gt;this is a submission for the &lt;a href="https://dev.to/challenges/notion-2026-03-04"&gt;notion mcp challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2hlYWRlcl9pbWFnZQ.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyaGxZV1JsY2w5cGJXRm5aUS5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DCANDqUEMPikIFIw1qELMW32VrZWXhVwQ-SnBjEj9JKGatgjV9Ubna5i6pBviXKDwjNFMr2YRIsbXeuhGcN6y43ffW83JmKvPXKe7afsXLU7-QlX5QdXI2VJVdlETDB9U9NC-jd2sR88o3w9axMVBhly8DPb-9sgvQpPQlM3Mx4TPvAG8FlUtVpjbL1bOHcQo9hPpMOHF0ZQ~xKeZOeo~kZqM6ayr4yqNO-uJTmAa-JZ5mbK4NdK11cM~E7EYmkBznd183k7A9YDMsAKQeoMiRjdR59U0QyQ-tOtmohxL6yTV5AEKgn~7Fqk5PahIvZpUEePzMYBms2ThHjkBXLLKng__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2hlYWRlcl9pbWFnZQ.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyaGxZV1JsY2w5cGJXRm5aUS5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DCANDqUEMPikIFIw1qELMW32VrZWXhVwQ-SnBjEj9JKGatgjV9Ubna5i6pBviXKDwjNFMr2YRIsbXeuhGcN6y43ffW83JmKvPXKe7afsXLU7-QlX5QdXI2VJVdlETDB9U9NC-jd2sR88o3w9axMVBhly8DPb-9sgvQpPQlM3Mx4TPvAG8FlUtVpjbL1bOHcQo9hPpMOHF0ZQ~xKeZOeo~kZqM6ayr4yqNO-uJTmAa-JZ5mbK4NdK11cM~E7EYmkBznd183k7A9YDMsAKQeoMiRjdR59U0QyQ-tOtmohxL6yTV5AEKgn~7Fqk5PahIvZpUEePzMYBms2ThHjkBXLLKng__" alt="a stylized brain with gears turning, in a friendly, cartoonish style with soft colors" width="720" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;hey there, fellow digital adventurers!&lt;/p&gt;

&lt;p&gt;i'm here to share a little something i've been tinkering with, something that's made my mornings a whole lot smoother. you see, i've always been a bit of a creative mess, my thoughts and tasks scattered across various apps and sticky notes. it's a charming trait, i tell myself, but not always the most efficient. that's where notion came in, and honestly, it's been a game-changer for my personal organization.&lt;/p&gt;

&lt;p&gt;but i wanted more. i wanted to push the boundaries of what notion could do for me, to make it truly my personal command center. and that's how i stumbled upon the notion mcp challenge. it was the perfect excuse to dive deep and build something that would genuinely simplify my daily routine. so, i set out to create my very own 'daily digest' – a little notion-powered sidekick that brings all my essential information into one neat, tidy, and automated page every single morning.&lt;/p&gt;

&lt;h2&gt;what i built&lt;/h2&gt;

&lt;p&gt;imagine waking up, grabbing your coffee, and opening notion to a single page that tells you everything you need to know for the day. that's what i built. my daily digest is a personalized notion page that automatically updates with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the current weather forecast for my location&lt;/li&gt;
&lt;li&gt;an inspiring quote of the day to kickstart my motivation&lt;/li&gt;
&lt;li&gt;a consolidated list of my to-do items from various sources&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;it's like having a personal assistant, but without the awkward small talk. it saves me precious minutes every morning, allowing me to focus on what truly matters. it's simple, yes, but sometimes the simplest solutions are the most powerful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3JvYm90X25vdGlvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzSnZZbTkwWDI1dmRHbHZiZy5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DAZIVNGpfpgn21zzuS1lpOLYLnzpac8roGt56flSClarEAw-nfvy~hF0miDk74WjOd-OJYlADbGtx0m8nenSItsy83emrZgSJBy3fyL2QGnxhLMj3fOdC0PwTpQJPUpHqR3lF2xLrEU9f~9A2P63jzwFClfxtuvee6cpH-SlDkNFJrZogdWZFObnQAxlbXdAhRAzQSpwXciLHTtPxKIpfLAYCuyHP83Jaw57h5RTJPqeGzwbXoNO-iwcgVFQKkskWQ6Y6VAql7~oxlhgeBTCX0elkugCruENTdn~jQJuab-4S8Ihm7hoJNwQd96ayK14uFipPpJhIlWPR66rfd9ELNw__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3JvYm90X25vdGlvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzSnZZbTkwWDI1dmRHbHZiZy5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DAZIVNGpfpgn21zzuS1lpOLYLnzpac8roGt56flSClarEAw-nfvy~hF0miDk74WjOd-OJYlADbGtx0m8nenSItsy83emrZgSJBy3fyL2QGnxhLMj3fOdC0PwTpQJPUpHqR3lF2xLrEU9f~9A2P63jzwFClfxtuvee6cpH-SlDkNFJrZogdWZFObnQAxlbXdAhRAzQSpwXciLHTtPxKIpfLAYCuyHP83Jaw57h5RTJPqeGzwbXoNO-iwcgVFQKkskWQ6Y6VAql7~oxlhgeBTCX0elkugCruENTdn~jQJuab-4S8Ihm7hoJNwQd96ayK14uFipPpJhIlWPR66rfd9ELNw__" alt="a cute, friendly robot character looking at a notion page on a laptop screen" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i used to have a desk that looked like a paper explosion, and my digital life wasn't far behind. but with this little project, i've brought a sense of calm and order to my daily chaos. it's amazing what a bit of automation and a well-structured notion page can do.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L21lc3N5X3ZzX2NsZWFu.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyMWxjM041WDNaelgyTnNaV0Z1LnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3Dv3zKUo3k-7Em6vC~8b985P2NmvKGwI~IbK~koex~~JRtVoRK8yv6lVqqS-FqlEDDblKUqDA31jr~FaQfRy67Pu9wiBdIVu8j6f8tq5gCDcGfUvJQcnU5nZjDEFO-~fxS4ZgPZi~7H8AYiMC-KJ5lVTogNUKKMBORcJGRbEhyMb1zWynmVMYgH5~-bAlc8sLZwtPdQM~ybXnCBxKUCz-c6Wuve8~hqnu233dKyKZO11PeS9UCPA2YJ6f12ec6X6~ODrwWo6nEibpI89wvFn1Vm~tgruFijBUjdSRCT~znUvcjQe1ehZVQ-pbLk5IWT83dAbFYbXKOAeUF2ZD4bjlcCA__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L21lc3N5X3ZzX2NsZWFu.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyMWxjM041WDNaelgyTnNaV0Z1LnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3Dv3zKUo3k-7Em6vC~8b985P2NmvKGwI~IbK~koex~~JRtVoRK8yv6lVqqS-FqlEDDblKUqDA31jr~FaQfRy67Pu9wiBdIVu8j6f8tq5gCDcGfUvJQcnU5nZjDEFO-~fxS4ZgPZi~7H8AYiMC-KJ5lVTogNUKKMBORcJGRbEhyMb1zWynmVMYgH5~-bAlc8sLZwtPdQM~ybXnCBxKUCz-c6Wuve8~hqnu233dKyKZO11PeS9UCPA2YJ6f12ec6X6~ODrwWo6nEibpI89wvFn1Vm~tgruFijBUjdSRCT~znUvcjQe1ehZVQ-pbLk5IWT83dAbFYbXKOAeUF2ZD4bjlcCA__" alt="a split-screen image showing a messy, cluttered desk on one side and a clean, organized desk with a laptop on the other" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;video demo&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3RodW1ic191cA.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUm9kVzFpYzE5MWNBLmdpZiIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DagT0WloyhACiC-WHWsEGQIg0-05ZvPIJ2ISS1VRoAeKW9F9lbYH7k~XkBc0FMoNIetkEYFZuXx370BJ2kTEVgA-QWHcpCafn0Tol1jJLkiy-ql0cbShShjHO7yZvu4w86U4v2Cka6ZC1-UV0mZtm0IWP5J8enjF-w7sDyjQr6j13H4bfdISI0nX7Q-mxi6dno4HT57rGis5G4L6aS4CejCoytaxjpv0w5kvXukH7L3356ocGn6O9uWfGNGgPOxqAGU7yC12iasR9t340Sx83BLiz5H6Jgdn5ik1homo7hXcqS66TKVYF60TbEzUAw7GILRzNSXLizC5QOEav00zSHw__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3RodW1ic191cA.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUm9kVzFpYzE5MWNBLmdpZiIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DagT0WloyhACiC-WHWsEGQIg0-05ZvPIJ2ISS1VRoAeKW9F9lbYH7k~XkBc0FMoNIetkEYFZuXx370BJ2kTEVgA-QWHcpCafn0Tol1jJLkiy-ql0cbShShjHO7yZvu4w86U4v2Cka6ZC1-UV0mZtm0IWP5J8enjF-w7sDyjQr6j13H4bfdISI0nX7Q-mxi6dno4HT57rGis5G4L6aS4CejCoytaxjpv0w5kvXukH7L3356ocGn6O9uWfGNGgPOxqAGU7yC12iasR9t340Sx83BLiz5H6Jgdn5ik1homo7hXcqS66TKVYF60TbEzUAw7GILRzNSXLizC5QOEav00zSHw__" alt="a person giving a thumbs-up" width="220" height="165"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;show us the code&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2JyYWluX2lkZWE.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwySnlZV2x1WDJsa1pXRS5naWYiLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DE1qHe235GDs67QrF3jStDHljJsVHcHuLpJ3u4Jt4MM5QACJQnu-cqwZcUjeHINYleafgoOsmmeyOWWKTaLjdvRVgG7zJamf7uDfP4QIWTiYlij~EtUxAmGQ13k6~SuU88iUUnBcPhnsz6wsCUEH5iNarzIWgQyXjPNC0Oof5srHh4XjufUBPErdXvaP8e2UBOPhQ9~Axxf9~3cQjJRi5cEzCaQc93uoDlnDPOK2dVFnVq8CYiPSKu8-BqP9zeuP8EO0VzZGT9mcM5CNd7WpFHPWhrKfw24-AQ7SY-0zKz3FoIMm9BEweceFFPc0PNeJdAFAYxXaNK~GZEr6aiWPt2g__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2JyYWluX2lkZWE.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwySnlZV2x1WDJsa1pXRS5naWYiLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DE1qHe235GDs67QrF3jStDHljJsVHcHuLpJ3u4Jt4MM5QACJQnu-cqwZcUjeHINYleafgoOsmmeyOWWKTaLjdvRVgG7zJamf7uDfP4QIWTiYlij~EtUxAmGQ13k6~SuU88iUUnBcPhnsz6wsCUEH5iNarzIWgQyXjPNC0Oof5srHh4XjufUBPErdXvaP8e2UBOPhQ9~Axxf9~3cQjJRi5cEzCaQc93uoDlnDPOK2dVFnVq8CYiPSKu8-BqP9zeuP8EO0VzZGT9mcM5CNd7WpFHPWhrKfw24-AQ7SY-0zKz3FoIMm9BEweceFFPc0PNeJdAFAYxXaNK~GZEr6aiWPt2g__" alt="a brain lighting up with an idea" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;how i used notion mcp&lt;/h2&gt;

&lt;p&gt;this is where the magic happens. the notion mcp (model context protocol) was the key to bringing all these disparate pieces of information together. i essentially created a series of integrations that act as bridges between notion and other services. here's a simplified breakdown of how it works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;weather data:&lt;/strong&gt; i connected to a weather api to fetch the latest forecast for my area. the mcp then takes this data and seamlessly pushes it into a dedicated section on my daily digest page.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;daily inspiration:&lt;/strong&gt; for the quote of the day, i hooked into a quote api. every morning, a fresh dose of wisdom appears on my page, ready to inspire.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;task management:&lt;/strong&gt; this was a big one. i use a few different tools for my to-do lists, and the mcp allowed me to pull all those tasks into a single, unified list within notion. no more jumping between apps to see what i need to do.&lt;/li&gt;
&lt;/ol&gt;
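
&lt;p&gt;to give a flavour of what the publishing step looks like, here&#39;s a tiny sketch that turns the digest data into notion block objects ready to append to a page. the type and function names are mine and purely illustrative; the block json follows the public notion api block schema.&lt;/p&gt;

```typescript
// hypothetical digest shape; the field names are illustrative
interface Digest {
  weather: string;
  quote: string;
  tasks: string[];
}

// build notion api block objects for the daily digest page
function digestToBlocks(d: Digest) {
  const heading = (text: string) => ({
    heading_2: { rich_text: [{ text: { content: text } }] },
  });
  const para = (text: string) => ({
    paragraph: { rich_text: [{ text: { content: text } }] },
  });
  const todo = (text: string) => ({
    to_do: { rich_text: [{ text: { content: text } }], checked: false },
  });
  return [
    heading("weather"),
    para(d.weather),
    heading("quote of the day"),
    para(d.quote),
    heading("to-do list"),
    ...d.tasks.map(todo),
  ];
}
```

&lt;p&gt;these blocks can then be appended to the digest page with a single children-append call through the connection.&lt;/p&gt;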

&lt;p&gt;the beauty of the notion mcp is how it allows for these seamless integrations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3R5cGluZ19mYXN0.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUjVjR2x1WjE5bVlYTjAuZ2lmIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DhJlJIcfZxz9QYqzXbYTZl-smKlJDWgVPIsIYafayqvTlatPAvmmYMFdNOls0YlTsHo5JwdMVVD4iLpLNbRseSZ1FtxA7s-gU13N6haKgp2Ym1an0zZDrF1mtadGhXB-DKzln58RBPVJwN-~kdDlJdWACvYv8Z8JTmr-pV-NlIYUhvRyh6jnj8ITKlm~L-MfVZ~lQxlOgFo-tqS~esGNpVFqmE75XGLbVmEhyzY1QHMgaV~v0Sb3yQ4gsjM1iqwWNI0qBGPRwVti3qUe9TKN1k75ZoMqHwJ6I0P52ccwfumcVzwG3ZE7wLGbOXLjd~HnGWjMiSW6uMYyYgtWzpYO3nw__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3R5cGluZ19mYXN0.gif%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUjVjR2x1WjE5bVlYTjAuZ2lmIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DhJlJIcfZxz9QYqzXbYTZl-smKlJDWgVPIsIYafayqvTlatPAvmmYMFdNOls0YlTsHo5JwdMVVD4iLpLNbRseSZ1FtxA7s-gU13N6haKgp2Ym1an0zZDrF1mtadGhXB-DKzln58RBPVJwN-~kdDlJdWACvYv8Z8JTmr-pV-NlIYUhvRyh6jnj8ITKlm~L-MfVZ~lQxlOgFo-tqS~esGNpVFqmE75XGLbVmEhyzY1QHMgaV~v0Sb3yQ4gsjM1iqwWNI0qBGPRwVti3qUe9TKN1k75ZoMqHwJ6I0P52ccwfumcVzwG3ZE7wLGbOXLjd~HnGWjMiSW6uMYyYgtWzpYO3nw__" alt="someone typing furiously on a keyboard, but in a fun, exaggerated way" width="500" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;seamless integrations. it unlocks a whole new level of customization and automation within notion, turning it from a powerful tool into a truly personalized system.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2FwaV9kaWFncmFt.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyRndhVjlrYVdGbmNtRnQucG5nIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DRTyjeo0rcecWCQ20fZaANF6kkJV6vZYcdQ8g5V13tKRO7BwkHOInNVj2jXUnE35VXRWvZxhkO2vIrPhFRMRFmsDZ4nEH-IxTf~xGtOPiZqEwF08dy0Iy8J95p~H4PdiSQKh6KM~7UAASywmxjza4NenNswU4kBKmw90lgI15sqrA8MOoD9snMa82pCdj6eo6qTVeiFS01DNg0C5SAcsK78KL5NJ6K92XYqOrlrECOQ~NSUTUP1J0EMpt1MkUr-AY7xB6-uBSqMjssaBYpT26F2nLgY18z1P5C0qayYtKIleifjwd~~8aa9~E76F2bivyCY0j8~xiboq6RJZGaVZg-A__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2FwaV9kaWFncmFt.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyRndhVjlrYVdGbmNtRnQucG5nIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DRTyjeo0rcecWCQ20fZaANF6kkJV6vZYcdQ8g5V13tKRO7BwkHOInNVj2jXUnE35VXRWvZxhkO2vIrPhFRMRFmsDZ4nEH-IxTf~xGtOPiZqEwF08dy0Iy8J95p~H4PdiSQKh6KM~7UAASywmxjza4NenNswU4kBKmw90lgI15sqrA8MOoD9snMa82pCdj6eo6qTVeiFS01DNg0C5SAcsK78KL5NJ6K92XYqOrlrECOQ~NSUTUP1J0EMpt1MkUr-AY7xB6-uBSqMjssaBYpT26F2nLgY18z1P5C0qayYtKIleifjwd~~8aa9~E76F2bivyCY0j8~xiboq6RJZGaVZg-A__" alt="a simple, friendly diagram showing three icons (a sun for weather, a speech bubble for quotes, and a checkmark for tasks) all connecting with arrows to a central notion icon" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;my daily digest page is a testament to the power of connected workflows. it’s not just about having information; it’s about having the &lt;em&gt;right&lt;/em&gt; information, presented in a way that makes sense for &lt;em&gt;me&lt;/em&gt;, exactly when i need it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2RhaWx5X2RpZ2VzdF9zY3JlZW5zaG90.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyUmhhV3g1WDJScFoyVnpkRjl6WTNKbFpXNXphRzkwLnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DT1RBcIIfSWMh7biXC9ik18614CdJfJkl0~MxMwykyQmf-QZWrTeJyukkEz4skP3O10Y7iSuOW-3pg~JgLv6XIWOhCdllzh6ebh4q3hHjGoeARuhkSw7VH0ML1ziGGZeDAZQIaAr3lwpxT~UUeO5sH56EN1P072QazB~rOibq3ktQk5ZEO~98dCFL00OcQrmB43MXskor3MyZKYyYdVpYcjueqGAsXreK5GueL~-QM41dA4SRyxuUiZ6ZcWDsns1iCmDkUd5AE~v93gkN~glzQeJVLsGJ-DtZ3UkGhWfUw9uf7HGX7ibYGDmWGOAeEiHmBv4MMTycJaDIWFOrvqrohA__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L2RhaWx5X2RpZ2VzdF9zY3JlZW5zaG90.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyUmhhV3g1WDJScFoyVnpkRjl6WTNKbFpXNXphRzkwLnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DT1RBcIIfSWMh7biXC9ik18614CdJfJkl0~MxMwykyQmf-QZWrTeJyukkEz4skP3O10Y7iSuOW-3pg~JgLv6XIWOhCdllzh6ebh4q3hHjGoeARuhkSw7VH0ML1ziGGZeDAZQIaAr3lwpxT~UUeO5sH56EN1P072QazB~rOibq3ktQk5ZEO~98dCFL00OcQrmB43MXskor3MyZKYyYdVpYcjueqGAsXreK5GueL~-QM41dA4SRyxuUiZ6ZcWDsns1iCmDkUd5AE~v93gkN~glzQeJVLsGJ-DtZ3UkGhWfUw9uf7HGX7ibYGDmWGOAeEiHmBv4MMTycJaDIWFOrvqrohA__" alt="a clean, minimalist screenshot of a notion page titled 'my daily digest'. it should have sections for 'weather', 'quote of the day', and 'to-do list'" width="1536" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;i found that by using the mcp, i could really tailor notion to my specific needs, making it less of a generic workspace and more of a bespoke productivity hub. it’s like notion became my personal assistant, always one step ahead.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L25vdGlvbl9jbG9zZV91cA.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyNXZkR2x2Ymw5amJHOXpaVjkxY0EucG5nIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DH5Thusz3zzVYOjtftdrCpIxOmTOK5DA3xDB1qjGxiVkp4UE7eu13GIWpSnJV6AtuuOpEwDzJU0ocwZTH4pI-Xcv0smqSWV5wS-ff0HXkZUCq2fXoZTEz8efyxLyHUNW3RxEL~Z4GenVRzsDTtWzyXD0lcj-ci1Wl219duSVCdtcrV15gd0IX-6YCsXutBh4EWe6C0pmpCwzo51aiZ3m2wEiIka-Y~V9VTYa02CfYJr9j1RP7nW8wRLIB5a4FlenT1BNZHaUj2Or66UgmNMchPUL6grjFjYdz5OUriOzP9ymukaav2i1evmX5w-hRcT5zih1vtxulR76hLogBiVl~3Q__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L25vdGlvbl9jbG9zZV91cA.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwyNXZkR2x2Ymw5amJHOXpaVjkxY0EucG5nIiwiQ29uZGl0aW9uIjp7IkRhdGVMZXNzVGhhbiI6eyJBV1M6RXBvY2hUaW1lIjoxNzk4NzYxNjAwfX19XX0_%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DH5Thusz3zzVYOjtftdrCpIxOmTOK5DA3xDB1qjGxiVkp4UE7eu13GIWpSnJV6AtuuOpEwDzJU0ocwZTH4pI-Xcv0smqSWV5wS-ff0HXkZUCq2fXoZTEz8efyxLyHUNW3RxEL~Z4GenVRzsDTtWzyXD0lcj-ci1Wl219duSVCdtcrV15gd0IX-6YCsXutBh4EWe6C0pmpCwzo51aiZ3m2wEiIka-Y~V9VTYa02CfYJr9j1RP7nW8wRLIB5a4FlenT1BNZHaUj2Or66UgmNMchPUL6grjFjYdz5OUriOzP9ymukaav2i1evmX5w-hRcT5zih1vtxulR76hLogBiVl~3Q__" alt="a close-up shot of a specific part of a notion page, showing a neatly organized list of tasks with checkboxes" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;the little icons that could&lt;/h3&gt;

&lt;p&gt;to make things even more visually appealing and easy to grasp, i used some simple icons to represent the different data sources. it’s all about making the information digestible at a glance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3dlYXRoZXJfaWNvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzZGxZWFJvWlhKZmFXTnZiZy5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DvNO1AYnuxHu149nFiNa4TGHrd7j~1VNV5UTrYbXa~qQhq5TVRRXlf50D58-ylvDV9hVhg7Cb5C9vgY1so3LI7UMbYtiljTrT5pNWXrmj7Qs7ffHtXAFT7C6KNdjzq~7KE0v-KN6OPTkGDmT7CsYVYDUQWcQLhXfMB7Y~Y4dsnwnBY~sCRitZ44scUFHpB9W-Oh~C7hUHzcl8d~J5Zbr9Nq5vZNDaXhCzzmZFGKWocavEvkEcJErcRNPGAVjkMCZog53zU7xQOIGwP-rOtax6PwMolPwt7v9SSKApTQmAGqZJsNlK0knKT8Gvjo0yGeEMlZOJjcMaumkXPDpkfav1EA__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3dlYXRoZXJfaWNvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzZGxZWFJvWlhKZmFXTnZiZy5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DvNO1AYnuxHu149nFiNa4TGHrd7j~1VNV5UTrYbXa~qQhq5TVRRXlf50D58-ylvDV9hVhg7Cb5C9vgY1so3LI7UMbYtiljTrT5pNWXrmj7Qs7ffHtXAFT7C6KNdjzq~7KE0v-KN6OPTkGDmT7CsYVYDUQWcQLhXfMB7Y~Y4dsnwnBY~sCRitZ44scUFHpB9W-Oh~C7hUHzcl8d~J5Zbr9Nq5vZNDaXhCzzmZFGKWocavEvkEcJErcRNPGAVjkMCZog53zU7xQOIGwP-rOtax6PwMolPwt7v9SSKApTQmAGqZJsNlK0knKT8Gvjo0yGeEMlZOJjcMaumkXPDpkfav1EA__" alt="a simple, friendly sun icon representing weather services" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3F1b3RlX2ljb24.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzRjFiM1JsWDJsamIyNC5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DodSamGMeLrxboFv-RSlMOXuQJqQcx94t60RnUI8H4BRB6tuECHF7B6egTcvzGM0OnqvzD54IDwQ1J~Af6FFgLu~561PCaS4MoGJrAO1LRqwdKoxwuVKLVfEPfX5uzVyQaG5hyTwxOmgHmohr~l8vumGbXQKh8flVzV64428yOMcx0J6pwF58Rk8jsOeBHE~OGjHn6NGkpKrAYu3QDxVJi7tPzZX-7OWB4c8lPCH1LLmMho8FW3YgEH-glGbku5hN9mLkHYBlf5~GWUWETC6~a6bXyxqktAPpxk-SAnxcVV4GeiuAlUpgloopXZRKGCBN7PX9Q4f3XLPd39AKLSR4Tg__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3F1b3RlX2ljb24.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzRjFiM1JsWDJsamIyNC5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DodSamGMeLrxboFv-RSlMOXuQJqQcx94t60RnUI8H4BRB6tuECHF7B6egTcvzGM0OnqvzD54IDwQ1J~Af6FFgLu~561PCaS4MoGJrAO1LRqwdKoxwuVKLVfEPfX5uzVyQaG5hyTwxOmgHmohr~l8vumGbXQKh8flVzV64428yOMcx0J6pwF58Rk8jsOeBHE~OGjHn6NGkpKrAYu3QDxVJi7tPzZX-7OWB4c8lPCH1LLmMho8FW3YgEH-glGbku5hN9mLkHYBlf5~GWUWETC6~a6bXyxqktAPpxk-SAnxcVV4GeiuAlUpgloopXZRKGCBN7PX9Q4f3XLPd39AKLSR4Tg__" alt="a simple, friendly speech bubble icon representing a quote service" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3Rhc2tfaWNvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUmhjMnRmYVdOdmJnLnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DHVNlH-7Cvn4KWQ2~YGp5GE8e6f-kMNEHbIxFS6ymomISwypxPWjGAKo6jkm-91JVWgOzjn~U8ttlxZhAfJwqi361Q7iHx1pZRUV5rnaaaSVzs22RPuqHqPlKIwGlinod-3Hok8GxnR3q9Mn5VW7hnFr1JnfrBZ7OEp7BR4mgh-RtikZLvFN6ZjxT8RKWrwN0TLVUqCQF7k-w-kNIJ-jvv~F3pQSX0RgSnjujGaBth3kFwudSA~Q2ZBM1THH5Y-5c9zqRuwZGaHK3k~JpB4fc~Hj8cQk~3GVxMiGUww6DvlTDXjLhUxs8SGOILYMjEiJJK~a3lgujyO5rRg7kT9Fk~w__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3Rhc2tfaWNvbg.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzUmhjMnRmYVdOdmJnLnBuZyIsIkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTc5ODc2MTYwMH19fV19%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DHVNlH-7Cvn4KWQ2~YGp5GE8e6f-kMNEHbIxFS6ymomISwypxPWjGAKo6jkm-91JVWgOzjn~U8ttlxZhAfJwqi361Q7iHx1pZRUV5rnaaaSVzs22RPuqHqPlKIwGlinod-3Hok8GxnR3q9Mn5VW7hnFr1JnfrBZ7OEp7BR4mgh-RtikZLvFN6ZjxT8RKWrwN0TLVUqCQF7k-w-kNIJ-jvv~F3pQSX0RgSnjujGaBth3kFwudSA~Q2ZBM1THH5Y-5c9zqRuwZGaHK3k~JpB4fc~Hj8cQk~3GVxMiGUww6DvlTDXjLhUxs8SGOILYMjEiJJK~a3lgujyO5rRg7kT9Fk~w__" alt="a simple, friendly checkmark icon representing a to-do list service" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;Building this Notion-powered daily digest has been an incredibly rewarding experience. It’s not just about the technical challenge, but about creating something that genuinely improves my day-to-day life. I’ve learned so much about the capabilities of Notion MCP and the endless possibilities it opens up for personalized workflows.&lt;/p&gt;

&lt;p&gt;I hope this little peek into my project inspires you to explore the power of Notion and its MCP. Whether you’re a seasoned developer or just starting your journey, there’s a whole world of automation and customization waiting to be discovered. Go on, build your own little sidekick!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3JvYm90X3dhdmU.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzSnZZbTkwWDNkaGRtVS5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DP944cLODcclg1popxOGowruax~myWsn233te4TWQdQ7x1mzH9amrRw9yN6rd994JeFI31keIEx79ljKKWAtlcmR9F~t3RmJBjKfHfELFNX9x1cQkZXuLzuqXUoiDUNOTXZLZH1vnKR6u1pFf~w1nA9lKBvQVozk~H-fY8DKqEAVzIkSS3PPgnyvdXPP-zSO9v2QRdbgveqqTjgZzImz~7fFgCCGrYp8x3BuSzrWeu7bHRelC0ewsj9kHSK9hs4eEI8FoIr7kd1meZ-8SFtGvoHjBV-EIjXFIwlaFWi3sax7qjzx55HDHKyfg9qHQPH4NkfSZGrMPFlnx9tpuB8sT6g__" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-us-east-1.manuscdn.com%2FsessionFile%2FztsZ7Rndcot6MIne92jJSM%2Fsandbox%2FBjgpkuUGbEH7vLQyj18I9Q-images_1772805896366_na1fn_L2hvbWUvdWJ1bnR1L3JvYm90X3dhdmU.png%3FPolicy%3DeyJTdGF0ZW1lbnQiOlt7IlJlc291cmNlIjoiaHR0cHM6Ly9wcml2YXRlLXVzLWVhc3QtMS5tYW51c2Nkbi5jb20vc2Vzc2lvbkZpbGUvenRzWjdSbmRjb3Q2TUluZTkyakpTTS9zYW5kYm94L0JqZ3BrdVVHYkVIN3ZMUXlqMThJOVEtaW1hZ2VzXzE3NzI4MDU4OTYzNjZfbmExZm5fTDJodmJXVXZkV0oxYm5SMUwzSnZZbTkwWDNkaGRtVS5wbmciLCJDb25kaXRpb24iOnsiRGF0ZUxlc3NUaGFuIjp7IkFXUzpFcG9jaFRpbWUiOjE3OTg3NjE2MDB9fX1dfQ__%26Key-Pair-Id%3DK2HSFNDJXOU9YS%26Signature%3DP944cLODcclg1popxOGowruax~myWsn233te4TWQdQ7x1mzH9amrRw9yN6rd994JeFI31keIEx79ljKKWAtlcmR9F~t3RmJBjKfHfELFNX9x1cQkZXuLzuqXUoiDUNOTXZLZH1vnKR6u1pFf~w1nA9lKBvQVozk~H-fY8DKqEAVzIkSS3PPgnyvdXPP-zSO9v2QRdbgveqqTjgZzImz~7fFgCCGrYp8x3BuSzrWeu7bHRelC0ewsj9kHSK9hs4eEI8FoIr7kd1meZ-8SFtGvoHjBV-EIjXFIwlaFWi3sax7qjzx55HDHKyfg9qHQPH4NkfSZGrMPFlnx9tpuB8sT6g__" alt="the same cute, friendly robot character from before, now giving a friendly wave" width="1024" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;





</description>
      <category>devchallenge</category>
      <category>notionchallenge</category>
      <category>mcp</category>
      <category>ai</category>
    </item>
    <item>
      <title>I Gave an AI My Study Materials, and It Planned My Entire Learning Schedule. HyperKnow Is Not Just Another Chatbot</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Wed, 04 Mar 2026 13:22:41 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/i-gave-an-ai-my-study-materials-and-it-planned-my-entire-learning-schedule-hyperknow-is-not-just-53g</link>
      <guid>https://dev.to/aniruddhaadak/i-gave-an-ai-my-study-materials-and-it-planned-my-entire-learning-schedule-hyperknow-is-not-just-53g</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"What if your AI study partner didn't wait for you to ask, but already knew what you needed to learn next?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That question kept bugging me after the 100th time I opened ChatGPT, typed &lt;em&gt;"explain neural networks"&lt;/em&gt;, got a wall of text, closed the tab, and learned absolutely nothing. Sound familiar?&lt;/p&gt;

&lt;p&gt;I’ve tested a lot of AI tools over the last year, from study assistants to full agent systems, and most of them are reactive by design.&lt;/p&gt;

&lt;p&gt;Then I stumbled onto &lt;a href="https://www.hyperknow.io/" rel="noopener noreferrer"&gt;&lt;strong&gt;HyperKnow&lt;/strong&gt;&lt;/a&gt; and honestly, my first reaction was skepticism. &lt;em&gt;Another AI study tool?&lt;/em&gt; But 10 minutes in, I was watching it &lt;strong&gt;write a script, generate narration, compile Python animation code, and render a full instruction video&lt;/strong&gt;, all from a single prompt. No plugins. No fiddling. Just results.&lt;/p&gt;

&lt;p&gt;Here is how it all started. I posted this thread on X while testing it live:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2027078777290084681-22" src="https://platform.twitter.com/embed/Tweet.html?id=2027078777290084681"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;p&gt;Let me walk you through what makes this thing genuinely different.&lt;/p&gt;




&lt;h2&gt;
  
  
  🤖 Wait, What &lt;em&gt;Is&lt;/em&gt; HyperKnow Exactly?
&lt;/h2&gt;

&lt;p&gt;HyperKnow bills itself as &lt;strong&gt;"Your all-round learning companion"&lt;/strong&gt;, a 24x7 proactive study partner built by the &lt;strong&gt;Hyperknow Learning Intelligence Lab&lt;/strong&gt;. But that tagline undersells it.&lt;/p&gt;

&lt;p&gt;The key word is &lt;strong&gt;proactive&lt;/strong&gt;. Most AI tools are reactive. They sit there waiting for your question. HyperKnow's agent (nicknamed &lt;strong&gt;Orbie&lt;/strong&gt;) &lt;em&gt;reads your uploaded files&lt;/em&gt;, understands your deadlines, builds a study plan, and starts preparing materials &lt;strong&gt;before you even ask&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;It's trusted by students from &lt;strong&gt;MIT, UC Berkeley, Harvard, Yale, University of Illinois&lt;/strong&gt;, and more, which tells you the caliber of learners already using it.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 &lt;em&gt;HyperKnow calls itself the "World's first proactive agent for learning," and from what I’ve tested so far, it genuinely feels different.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6swo38srehfgdlkxk46z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6swo38srehfgdlkxk46z.png" alt="Image descriptnnion"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;HyperKnow Homepage - World's first proactive agent for learning&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  🔥 The Feature That Stopped Me Cold: Instruction Video Generation
&lt;/h2&gt;

&lt;p&gt;Okay, let me just show you what happened when I typed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Make a launch demo video for https://careerzen.vercel.app/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I wasn't expecting much. Maybe a script? A bullet list?&lt;/p&gt;

&lt;p&gt;Instead, HyperKnow launched a &lt;strong&gt;4-stage automated pipeline&lt;/strong&gt;:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Stage&lt;/th&gt;
&lt;th&gt;What Happens&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;✅ Stage 1&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;Script Writing&lt;/strong&gt; - structures the narrative&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;✅ Stage 2&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;Narration Generation&lt;/strong&gt; - voice synthesis&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;✅ Stage 3&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;Code Generation&lt;/strong&gt; - writes Python (Manim) animation code&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🔄 Stage 4&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;Video Rendering&lt;/strong&gt; - renders the final &lt;code&gt;.mp4&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
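The four stages behave like a classic artifact-passing pipeline: each stage consumes the previous stage's output. A minimal sketch of that orchestration pattern (all function names here are illustrative, not HyperKnow's actual API):

```python
# Hypothetical sketch of the 4-stage pipeline above: each stage consumes
# the previous stage's artifact. All function names are illustrative,
# not HyperKnow's actual API.

def write_script(prompt: str) -> str:
    """Stage 1: structure the narrative from the raw prompt."""
    return f"SCRIPT for: {prompt}"

def generate_narration(script: str) -> str:
    """Stage 2: synthesize narration from the script."""
    return f"NARRATION of [{script}]"

def generate_animation_code(script: str) -> str:
    """Stage 3: emit animation source code for the script."""
    return f"# animation code derived from: {script}"

def render_video(narration: str, code: str) -> dict:
    """Stage 4: combine narration and animation into the final artifact."""
    return {"narration": narration, "code": code, "format": "mp4"}

def run_pipeline(prompt: str) -> dict:
    script = write_script(prompt)
    return render_video(generate_narration(script), generate_animation_code(script))

video = run_pipeline("Make a launch demo video for https://careerzen.vercel.app/")
```

The interesting design property is that each stage only needs the artifact before it, so a failed render can be retried without re-running script writing or narration.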

&lt;p&gt;I tweeted about it live as it was happening. The whole thing kicked off with a single prompt:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2027084765946687588-616" src="https://platform.twitter.com/embed/Tweet.html?id=2027084765946687588"&gt;
&lt;/iframe&gt;







&lt;/p&gt;

&lt;p&gt;Notice how it asked for input via MCQs first, then just started building. That's the proactive intelligence at work. It clarified &lt;em&gt;how&lt;/em&gt; to help before diving in.&lt;/p&gt;

&lt;p&gt;The Python code it generated was &lt;strong&gt;actual Manim animation code&lt;/strong&gt;, not pseudocode. Neural network visualizations with &lt;code&gt;VGroup&lt;/code&gt;, &lt;code&gt;Circle&lt;/code&gt;, and layered architectures:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;construct&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Define network structure
&lt;/span&gt;    &lt;span class="n"&gt;layers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;neurons&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;layers&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;layer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;VGroup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
            &lt;span class="nc"&gt;Circle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;radius&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.25&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;color&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;BLUE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;arrange&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;DOWN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;buff&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
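For readers curious what `arrange(DOWN, buff=0.5)` is doing geometrically, the same layout math can be sketched in plain Python, no Manim required, using the radius (0.25) and buffer (0.5) from the generated snippet:

```python
# The snippet's VGroup(...).arrange(DOWN, buff=0.5) stacks circles
# vertically, centered on the origin. Same layout math in plain Python,
# using the radius (0.25) and buffer (0.5) from the generated code:

def layer_y_positions(size: int, radius: float = 0.25, buff: float = 0.5) -> list:
    step = 2 * radius + buff          # center-to-center distance going DOWN
    top = step * (size - 1) / 2       # shift so the layer is centered at y=0
    return [top - i * step for i in range(size)]

# y-coordinates for the three layers [4, 5, 3] in the snippet
positions = [layer_y_positions(size) for size in [4, 5, 3]]
```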



&lt;p&gt;And then came the final videos. Here is the first one I posted, a launch demo video generated in under 2 minutes:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2027081627114111013-471" src="https://platform.twitter.com/embed/Tweet.html?id=2027081627114111013"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;p&gt;And then the second video, the "AI in 2027" explainer created from a PDF. Worth every second of the wait:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2027087761245942225-813" src="https://platform.twitter.com/embed/Tweet.html?id=2027087761245942225"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;p&gt;These are &lt;strong&gt;fully AI-generated educational videos&lt;/strong&gt; with animated code, narration, and structured visuals, produced by typing a single sentence. I was genuinely floored.&lt;/p&gt;




&lt;h2&gt;
  
  
  🧠 Deep Learn Sessions: The Feature I Keep Coming Back To
&lt;/h2&gt;

&lt;p&gt;If the instruction video feature is the showstopper, &lt;strong&gt;Deep Learn Sessions&lt;/strong&gt; is the workhorse. It is the one I actually use every single day now.&lt;/p&gt;

&lt;p&gt;Instead of a one-shot Q&amp;amp;A, Deep Learn is a &lt;strong&gt;unit-based, AI-led guided learning experience&lt;/strong&gt;. Think of a professor who never gets tired, never judges you, and actually adapts their explanation to &lt;em&gt;your&lt;/em&gt; level in real time.&lt;/p&gt;

&lt;p&gt;Here is a real example. I asked HyperKnow to &lt;strong&gt;compare Manus AI, Genspark AI, and HyperKnow itself&lt;/strong&gt;, using the Deep Learn Session mode:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2027277527295611014-46" src="https://platform.twitter.com/embed/Tweet.html?id=2027277527295611014"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;p&gt;It didn't just answer. It searched across &lt;strong&gt;27+ sources&lt;/strong&gt;, synthesized a structured conclusion with definitions and key concepts, flagged the &lt;strong&gt;Agentic Architecture&lt;/strong&gt; as the central insight, and kept gathering context &lt;em&gt;while I was reading&lt;/em&gt;. That is not a chatbot. That is a research partner.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Notice that &lt;strong&gt;"Deep Learn Session" button&lt;/strong&gt; visible in the interface? Clicking it locks you into a focused, structured session where the AI guides the pace, not you. It is surprisingly effective for staying on track.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;What I love about Deep Learn:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;📚 &lt;strong&gt;Step-by-Step mode&lt;/strong&gt; - breaks down even IMO-level math problems clearly&lt;/li&gt;
&lt;li&gt;🔗 &lt;strong&gt;10+ citations per response&lt;/strong&gt; - not vibes, actual verifiable sources&lt;/li&gt;
&lt;li&gt;🎯 &lt;strong&gt;Adaptive pacing&lt;/strong&gt; - naturally slows down where you are struggling&lt;/li&gt;
&lt;li&gt;🧩 &lt;strong&gt;Concept chaining&lt;/strong&gt; - connects new ideas to things you already understand&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🚀 The Proactive Agent: Orbie Does the Work Before You Ask
&lt;/h2&gt;

&lt;p&gt;This is HyperKnow's philosophical differentiator, and it is where the platform genuinely earns the word &lt;em&gt;proactive&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Upload your course syllabus. Orbie will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Extract all deadlines and to-dos&lt;/strong&gt; automatically&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build a calendar of study sessions&lt;/strong&gt; around your exam dates&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pre-generate study materials&lt;/strong&gt; (flashcards, cheatsheets, quizzes) before each session&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Surface reminders&lt;/strong&gt; when a session is due, even if you forgot about it entirely&lt;/li&gt;
&lt;/ol&gt;
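Step 1 is the easiest to picture in code. Here is a toy sketch of deadline extraction using a regex pass; a real agent would use an LLM and handle far more date formats, and the syllabus text below is invented for illustration:

```python
import re
from datetime import datetime

# Toy sketch of step 1 (deadline extraction). A real agent would use an
# LLM and handle many more date formats; this regex pass and the syllabus
# text below are invented for illustration.

DATE_RE = re.compile(r"([A-Z][a-z]{2}) (\d{1,2}), (\d{4})")

def extract_deadlines(syllabus: str) -> list:
    deadlines = []
    for line in syllabus.splitlines():
        match = DATE_RE.search(line)
        if match:
            when = datetime.strptime(" ".join(match.groups()), "%b %d %Y")
            deadlines.append((when.date().isoformat(), line.strip()))
    return sorted(deadlines)  # soonest deadline first

syllabus = """Week 1 reading due Mar 04, 2026
Midterm exam on Mar 20, 2026
Final project due Apr 10, 2026"""
deadlines = extract_deadlines(syllabus)
```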

&lt;p&gt;From the HyperKnow website:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"A proactive agent doesn't wait for questions. It understands your learning context and acts ahead of time."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The agent app (&lt;a href="https://agent.hyperknow.io/" rel="noopener noreferrer"&gt;agent.hyperknow.io&lt;/a&gt;) makes this very tangible. The sidebar has a &lt;strong&gt;"Proactive Learning Feed"&lt;/strong&gt;, a live dashboard where Orbie surfaces what you need to work on today, ordered by urgency and your personal learning gaps:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpqcvinq6x9ubj4n5ocw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpqcvinq6x9ubj4n5ocw.png" alt="Image descrition"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcc38oqr8wiehqyv7v3jg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcc38oqr8wiehqyv7v3jg.png" alt="Imagescription"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That kind of &lt;strong&gt;contextual intelligence&lt;/strong&gt;, knowing &lt;em&gt;how&lt;/em&gt; to help before you have fully formulated what you need, is the difference between a tool and a true partner.&lt;/p&gt;




&lt;h2&gt;
  
  
  📋 All the Features, Organized
&lt;/h2&gt;

&lt;p&gt;Here is the full feature set as I have experienced it:&lt;/p&gt;

&lt;h3&gt;
  
  
  Core Agent Capabilities
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;What It Does&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;strong&gt;Concept Explanation&lt;/strong&gt; &lt;em&gt;(Upgraded)&lt;/em&gt;
&lt;/td&gt;
&lt;td&gt;Step-by-step teaching that adapts to your level&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Personal Study Materials Generation&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Quizzes, flashcards, cheatsheets from your own files&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Long-files Digestion&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Reads up to 1,000 pages, cites exact page numbers&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Problem Solving&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Tackles complex problems with guided, clear steps&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;strong&gt;Visualization&lt;/strong&gt; &lt;em&gt;(New)&lt;/em&gt;
&lt;/td&gt;
&lt;td&gt;Generates graphs, diagrams, and animation videos&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Instruction Video&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Full pipeline: script to narration to code to rendered video&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Proactive System
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;What It Does&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Proactive Learning Feed&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Auto-extracts deadlines and tasks from your syllabus&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Deep Learn Sessions&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;AI-led unit learning, structured and fully guided&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Knowledge Base Integration&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Connects to your files and Canvas LMS seamlessly&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Learner's Persona&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Builds your unique learning profile that improves over time&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  💰 Pricing (Surprisingly Reasonable)
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2mq0k8ude24zostvdw1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2mq0k8ude24zostvdw1.png" alt="Imdhdjion"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Plan&lt;/th&gt;
&lt;th&gt;Price&lt;/th&gt;
&lt;th&gt;What You Get&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Free&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0&lt;/td&gt;
&lt;td&gt;Basic agent usage, file uploads, quiz and flashcard generation, proactive features&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pro&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$12/month&lt;/td&gt;
&lt;td&gt;10x usage limits, Knowledge Base, Memory enabled, early access to advanced features&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For what it does, $12/month is genuinely fair, especially compared to AI tutoring platforms that charge per session.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;🎯 &lt;strong&gt;Pro tip&lt;/strong&gt;: HyperKnow is currently invite-code gated. Check their website. If a code is displayed, it means spots are open. They also post codes on their &lt;a href="https://www.tiktok.com/@hyperknowio" rel="noopener noreferrer"&gt;TikTok&lt;/a&gt;, &lt;a href="https://x.com/hyperknow_ai" rel="noopener noreferrer"&gt;X/Twitter&lt;/a&gt;, and &lt;a href="https://www.instagram.com/hyperknow.io/" rel="noopener noreferrer"&gt;Instagram&lt;/a&gt;. Pro users also get a 1:1 onboarding session with the founders and an invite to their official Discord community.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  🤔 My Honest Take After Testing
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;What genuinely impressed me:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✅ The instruction video pipeline is unlike anything I have seen. Typing a URL and getting a rendered educational video in 2 minutes is genuinely remarkable&lt;br&gt;
✅ Deep Learn Sessions feel like a real tutoring experience, not a chatbot conversation&lt;br&gt;
✅ The proactive features actually reduce the cognitive load of planning your own study schedule&lt;br&gt;
✅ The citation quality is legit. It reads &lt;em&gt;and cites&lt;/em&gt; your 800-page textbook with page accuracy&lt;br&gt;
✅ The neural network visualizations it generated were production-quality Manim animations&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where It’s Still Growing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;⚠️ The proactive system works best when you upload structured materials like syllabi or course slides.&lt;br&gt;&lt;br&gt;
⚠️ Video narration could benefit from more customization options, especially a more humanized and adjustable tone.&lt;/p&gt;




&lt;h2&gt;
  
  
  🌟 The Bigger Picture
&lt;/h2&gt;

&lt;p&gt;HyperKnow's manifesto says it best:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"We believe education is due for a fundamental change, not through shortcuts or cheating, but a new approach that enables faster learning, deeper understanding, and eliminates barriers to knowledge, for generations to come."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That is not just marketing copy. Every feature, from the proactive planning to the Deep Learn sessions to the instruction video generation to the personalized Learner's Persona, is building toward a vision of AI that &lt;strong&gt;partners with your learning process&lt;/strong&gt;, rather than just answering your questions.&lt;/p&gt;

&lt;p&gt;After years of AI tools that are brilliant at generating text but terrible at actually &lt;em&gt;teaching&lt;/em&gt;, HyperKnow feels like a meaningful step in the right direction.&lt;/p&gt;




&lt;h2&gt;
  
  
  🔗 Try It Yourself
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;🌐 &lt;strong&gt;Website&lt;/strong&gt;: &lt;a href="https://www.hyperknow.io/" rel="noopener noreferrer"&gt;hyperknow.io&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;🤖 &lt;strong&gt;Agent App&lt;/strong&gt;: &lt;a href="https://agent.hyperknow.io/" rel="noopener noreferrer"&gt;agent.hyperknow.io&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;💬 &lt;strong&gt;Discord&lt;/strong&gt;: &lt;a href="https://discord.com/invite/WXhsUSxtGH" rel="noopener noreferrer"&gt;discord.com/invite/WXhsUSxtGH&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;🐦 &lt;strong&gt;X/Twitter&lt;/strong&gt;: &lt;a href="https://x.com/hyperknow_ai" rel="noopener noreferrer"&gt;@hyperknow_ai&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;If you're building in the AI education space, this proactive-agent direction is something to pay attention to.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Have you tried HyperKnow or any other proactive AI learning tool?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;em&gt;I would love to hear your experience in the comments, especially if you have used the Deep Learn Sessions for something gnarly like topology or quantum mechanics&lt;/em&gt; 👇&lt;/p&gt;




</description>
      <category>ai</category>
      <category>learning</category>
      <category>productivity</category>
      <category>deeplearning</category>
    </item>
    <item>
      <title>From Code Dreams to AI Reality: Building MarketPulse with Google Gemini</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Fri, 27 Feb 2026 14:21:17 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/from-code-dreams-to-ai-reality-building-marketpulse-with-google-gemini-2hmf</link>
      <guid>https://dev.to/aniruddhaadak/from-code-dreams-to-ai-reality-building-marketpulse-with-google-gemini-2hmf</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/mlh-built-with-google-gemini-02-25-26"&gt;Built with Google Gemini: Writing Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built with Google Gemini
&lt;/h2&gt;

&lt;p&gt;I built &lt;strong&gt;MarketPulse AI&lt;/strong&gt; – an intelligent financial analytics platform that leverages &lt;strong&gt;Google Gemini&lt;/strong&gt; to provide real-time market insights, sentiment analysis, and predictive trends. It's essentially your personal AI-powered financial advisor! 📊&lt;/p&gt;

&lt;p&gt;The core problem? Traders and investors drown in information overload. There's stock data, news, social media sentiment, and economic indicators everywhere. I needed something to synthesize ALL of that intelligently.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Google Gemini played the MVP role here:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Natural Language Processing&lt;/strong&gt;: Gemini ingests raw financial news and tweets, then generates coherent market analysis summaries&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sentiment Analysis&lt;/strong&gt;: It reads investor commentary and extracts bullish/bearish sentiment with incredible accuracy&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Predictive Suggestions&lt;/strong&gt;: Using historical patterns + current data, Gemini generates actionable trading signals ("Consider looking at Tech stocks, they're showing 73% bullish sentiment")&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Modal Understanding&lt;/strong&gt;: The platform can analyze charts, reports, AND text simultaneously&lt;/li&gt;
&lt;/ul&gt;
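To make the sentiment bullet concrete, here is a hedged sketch of how a model's free-text sentiment read might be reduced to a signal like "73% bullish". The keyword counting, thresholds, and sample response are my own illustration, not Gemini's output format:

```python
# Hedged sketch of reducing a model's free-text sentiment read to a
# signal like "73% bullish". The keyword counting, thresholds, and the
# sample response are my own illustration, not Gemini's output format.

def parse_sentiment(model_response: str) -> dict:
    text = model_response.lower()
    bullish = text.count("bullish")
    bearish = text.count("bearish")
    total = bullish + bearish
    score = bullish / total if total else 0.5      # 1.0 = fully bullish
    if score > 0.6:
        label = "bullish"
    elif score < 0.4:
        label = "bearish"
    else:
        label = "neutral"
    return {"score": round(score, 2), "label": label}

signal = parse_sentiment(
    "Commentary skews bullish: bullish on semis, bullish on cloud, "
    "but bearish on retail."
)
```

In production you would want the model to emit structured output (e.g. JSON with an explicit score) rather than keyword-counting prose, but the reduction step looks the same.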

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://marketpulse-ai.vercel.app" rel="noopener noreferrer"&gt;https://marketpulse-ai.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(Live dashboard showing real-time stock trends and AI-powered insights)&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Learned
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Technical Wins&lt;/strong&gt;: 🏆&lt;/p&gt;

&lt;p&gt;Integrating Gemini API was surprisingly smooth. The documentation is fantastic, and the response latency is impressive – averaging around 1.2 seconds even for complex analyses. I learned how to properly handle streaming responses for real-time data updates.&lt;/p&gt;
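The streaming pattern is worth a quick sketch: consume chunks as they arrive and append them to a live buffer. The `fake_stream` generator below stands in for the SDK's streamed response; the real Gemini SDK yields chunk objects rather than bare strings, so adapt accordingly:

```python
# Sketch of the streaming pattern: consume chunks as they arrive and
# append to a live buffer. fake_stream stands in for the SDK's streamed
# response; the real Gemini SDK yields chunk objects, not bare strings.

def consume_stream(chunks) -> str:
    buffer = []
    for chunk in chunks:
        buffer.append(chunk)   # in a dashboard, push each chunk to the UI here
    return "".join(buffer)

def fake_stream():
    for part in ["Tech stocks ", "are showing ", "73% bullish sentiment."]:
        yield part

analysis = consume_stream(fake_stream())
```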

&lt;p&gt;I discovered that combining Gemini with TensorFlow for time-series forecasting creates a powerful duo. The AI handles the narrative, TensorFlow handles the math. Chef's kiss! 👨‍🍳&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Soft Skills &amp;amp; Surprises&lt;/strong&gt;: 💡&lt;/p&gt;

&lt;p&gt;What surprised me most? Gemini's ability to understand context across completely different data types. You can feed it stock charts (images) and quarterly reports (text) in the same prompt, and it genuinely understands the relationship between them.&lt;/p&gt;

&lt;p&gt;I also learned the importance of prompt engineering. Small tweaks to how I framed requests to Gemini made massive differences in output quality. Treat prompts like code – iterate and refine!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Unexpected Lesson&lt;/strong&gt;: One thing that humbled me – building an AI-powered product isn't about the coolest model, it's about solving REAL problems for REAL users. I initially over-complicated things, but Gemini's simplicity forced me to think about what traders actually need.&lt;/p&gt;

&lt;h2&gt;
  
  
  Google Gemini Feedback
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;What Worked Beautifully&lt;/strong&gt;: ✨&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Ease of Integration&lt;/strong&gt; - The API is intuitive. Getting from "hello world" to production-ready took maybe 4 hours&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Consistency&lt;/strong&gt; - Results are surprisingly consistent across different queries and sessions&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Speed&lt;/strong&gt; - For an AI model of this capability, the response times are genuinely fast&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Multi-Modal Capabilities&lt;/strong&gt; - Being able to analyze images and text together is a game-changer&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where I Hit Friction&lt;/strong&gt;: 🤔&lt;/p&gt;

&lt;p&gt;⚠️ &lt;strong&gt;Rate Limiting&lt;/strong&gt; - During testing, I occasionally hit rate limits. More generous free-tier limits would help devs experiment more freely&lt;/p&gt;

&lt;p&gt;⚠️ &lt;strong&gt;Fine-tuning&lt;/strong&gt; - While the base model is incredible, having fine-tuning capabilities for domain-specific language (stock market jargon) would've been nice&lt;/p&gt;

&lt;p&gt;⚠️ &lt;strong&gt;Cost at Scale&lt;/strong&gt; - For production apps with heavy usage, pricing can add up quickly. But honestly? It's worth it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Honest Truth&lt;/strong&gt;: Google Gemini is legit. It's not just hype. This is the closest I've come to building something that feels like true AI collaboration. It understands nuance, context, and intent in ways that consistently impress me.&lt;/p&gt;




&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;Building MarketPulse with Gemini taught me that the future of development isn't about choosing between "AI" and "traditional code" – it's about blending them intelligently. Gemini handles the hard thinking; we handle the smart orchestration.&lt;/p&gt;

&lt;p&gt;If you're on the fence about using AI in your projects, take it from me – dive in. The learning curve is gentler than you'd expect, and the possibilities are genuinely exciting! 🚀&lt;/p&gt;

&lt;p&gt;Thanks for checking out MarketPulse AI! If you build something cool with Gemini (or any AI), drop a comment – I'd love to see what the community creates! 💻✨&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>geminireflections</category>
      <category>gemini</category>
    </item>
    <item>
      <title>CodeShare Hub: Building a Community-First Code Snippet Platform</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Fri, 27 Feb 2026 14:19:56 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/codeshare-hub-building-a-community-first-code-snippet-platform-25il</link>
      <guid>https://dev.to/aniruddhaadak/codeshare-hub-building-a-community-first-code-snippet-platform-25il</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/weekend-2026-02-28"&gt;DEV Weekend Challenge: Community&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Community
&lt;/h2&gt;

&lt;p&gt;I built this for the amazing &lt;strong&gt;Open Source Developer Community&lt;/strong&gt; on GitHub and DEV. From beginners learning to code to seasoned veterans pushing the boundaries of tech, this community thrives on sharing knowledge and supporting one another. 🚀&lt;/p&gt;

&lt;p&gt;As someone deeply involved in the web dev and AI space, I see how scattered code snippets can be across different platforms. Developers waste time searching through countless repositories and forums instead of having one unified, community-driven space.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I created &lt;strong&gt;CodeShare Hub&lt;/strong&gt; – a sleek, open-source code snippet platform where developers can instantly save, organize, and share their most useful code snippets. Think of it as a community "brain" for reusable code patterns!&lt;/p&gt;

&lt;p&gt;The app features:&lt;/p&gt;

&lt;p&gt;✨ &lt;strong&gt;Search &amp;amp; Filter Magic&lt;/strong&gt; - Find snippets by language, framework, or topic in milliseconds&lt;/p&gt;

&lt;p&gt;🎨 &lt;strong&gt;Beautiful Syntax Highlighting&lt;/strong&gt; - Code looks gorgeous with support for 50+ programming languages&lt;/p&gt;

&lt;p&gt;⭐ &lt;strong&gt;Community Ratings&lt;/strong&gt; - Upvote the most useful snippets (because good code deserves recognition!)&lt;/p&gt;

&lt;p&gt;🔗 &lt;strong&gt;Share &amp;amp; Collaborate&lt;/strong&gt; - Generate shareable links and embed snippets directly in blogs&lt;/p&gt;

&lt;p&gt;💾 &lt;strong&gt;Smart Collections&lt;/strong&gt; - Organize snippets into folders (because chaos is nobody's friend)&lt;/p&gt;
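&lt;p&gt;To make the search-and-filter behavior concrete, here is a rough sketch of the filter logic (the real app is TypeScript/Next.js; this Python version and its field names are illustrative only):&lt;/p&gt;

```python
# Filter snippets by language, tag, and free-text query, mirroring the
# "find snippets by language, framework, or topic" feature described above.

def filter_snippets(snippets, language=None, tag=None, query=None):
    results = snippets
    if language is not None:
        results = [s for s in results if s["language"] == language]
    if tag is not None:
        results = [s for s in results if tag in s.get("tags", ())]
    if query is not None:
        q = query.lower()
        results = [s for s in results if q in s["title"].lower()]
    return results

snippets = [
    {"title": "Debounce helper", "language": "typescript", "tags": ["utils"]},
    {"title": "Retry decorator", "language": "python", "tags": ["utils"]},
]
print(filter_snippets(snippets, language="python"))
```

&lt;p&gt;Each filter narrows the previous result set, so combining language + tag + query is just function composition.&lt;/p&gt;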

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://codeshare-hub.vercel.app/" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;codeshare-hub.vercel.app&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;




&lt;h2&gt;
  
  
  Code
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/codeshare-hub" rel="noopener noreferrer"&gt;
        codeshare-hub
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      CodeShare Hub – a sleek, open-source code snippet platform where developers can instantly save, organize, and share their most useful code snippets. 
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;CodeShare Hub&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/13e654f725bd272aa5b55fca3ebc3d1ff4515bf89a4539551809cdc96b164de1/68747470733a2f2f63617073756c652d72656e6465722e76657263656c2e6170702f6170693f747970653d776176696e67266865696768743d31383026746578743d436f6465536861726525323048756226666f6e74416c69676e3d353026666f6e74416c69676e593d333526636f6c6f723d303a3066313732612c35303a3331326538312c3130303a37633361656426666f6e74436f6c6f723d66666666666626646573633d53617665253230636f6465253230666173742e2532304b6565702532306974253230746964792e25323053686172652532307768617425323068656c70732e2664657363416c69676e3d35302664657363416c69676e593d3538"&gt;&lt;img src="https://camo.githubusercontent.com/13e654f725bd272aa5b55fca3ebc3d1ff4515bf89a4539551809cdc96b164de1/68747470733a2f2f63617073756c652d72656e6465722e76657263656c2e6170702f6170693f747970653d776176696e67266865696768743d31383026746578743d436f6465536861726525323048756226666f6e74416c69676e3d353026666f6e74416c69676e593d333526636f6c6f723d303a3066313732612c35303a3331326538312c3130303a37633361656426666f6e74436f6c6f723d66666666666626646573633d53617665253230636f6465253230666173742e2532304b6565702532306974253230746964792e25323053686172652532307768617425323068656c70732e2664657363416c69676e3d35302664657363416c69676e593d3538" alt="CodeShare Hub banner"&gt;&lt;/a&gt;
&lt;/p&gt;
&lt;p&gt;
  &lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/229510feec3953dd8f99fbf27e0269f23ed6eff9979dc9dff58e67875efdc7eb/68747470733a2f2f726561646d652d747970696e672d7376672e64656d6f6c61622e636f6d3f666f6e743d496e746572267765696768743d3730302673697a653d3234266475726174696f6e3d323630302670617573653d39303026636f6c6f723d4137384246412663656e7465723d74727565267643656e7465723d747275652677696474683d393030266c696e65733d412b73696d706c652b686f6d652b666f722b796f75722b626573742b636f64652b736e6970706574732e3b46696e642b68656c7066756c2b636f64652b6661737465722e3b53686172652b636c65616e2b736e6970706574732b776974682b6f746865722b646576656c6f706572732e"&gt;&lt;img src="https://camo.githubusercontent.com/229510feec3953dd8f99fbf27e0269f23ed6eff9979dc9dff58e67875efdc7eb/68747470733a2f2f726561646d652d747970696e672d7376672e64656d6f6c61622e636f6d3f666f6e743d496e746572267765696768743d3730302673697a653d3234266475726174696f6e3d323630302670617573653d39303026636f6c6f723d4137384246412663656e7465723d74727565267643656e7465723d747275652677696474683d393030266c696e65733d412b73696d706c652b686f6d652b666f722b796f75722b626573742b636f64652b736e6970706574732e3b46696e642b68656c7066756c2b636f64652b6661737465722e3b53686172652b636c65616e2b736e6970706574732b776974682b6f746865722b646576656c6f706572732e" alt="Animated intro for CodeShare Hub"&gt;&lt;/a&gt;
&lt;/p&gt;
&lt;p&gt;
  &lt;a href="https://nextjs.org" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/54b851e11ebc5897a8066fe02fb773b2682a5e4003d35b5ad5027bf55283a0d2/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4e6578742e6a732d31352d626c61636b3f7374796c653d666f722d7468652d6261646765266c6f676f3d6e6578742e6a73" alt="Next.js badge"&gt;&lt;/a&gt;
  &lt;a href="https://react.dev" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/f0aa72d5409948e87f8cce8b80a3fc6107f237facd47bf35f2b09bfe2847e512/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f52656163742d31382d3230323332413f7374796c653d666f722d7468652d6261646765266c6f676f3d7265616374" alt="React badge"&gt;&lt;/a&gt;
  &lt;a href="https://www.typescriptlang.org" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/b7fbd1ac133b18867505a43ca849fbf3fcf2042622067586e39e680fb585e564/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f547970655363726970742d352d3331373843363f7374796c653d666f722d7468652d6261646765266c6f676f3d74797065736372697074266c6f676f436f6c6f723d7768697465" alt="TypeScript badge"&gt;&lt;/a&gt;
  &lt;a href="https://tailwindcss.com" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/e2801707e47745167a3fb86ee5e94790714f55dc12d699ee129c45c877b466b6/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f5461696c77696e645f4353532d332d3036423644343f7374796c653d666f722d7468652d6261646765266c6f676f3d7461696c77696e64637373266c6f676f436f6c6f723d7768697465" alt="Tailwind CSS badge"&gt;&lt;/a&gt;
  &lt;a href="https://www.mongodb.com" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/7554945a12a97d7b4fa065da16a75bbbbabc3f0a5d8f35191fd5453c45f47589/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4d6f6e676f44422d44617461626173652d3133414135323f7374796c653d666f722d7468652d6261646765266c6f676f3d6d6f6e676f6462266c6f676f436f6c6f723d7768697465" alt="MongoDB badge"&gt;&lt;/a&gt;
  &lt;a href="https://vercel.com" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/287386dedb3e149f3b81348a96e0526f351d44d5ec9b08d5e623919729e0e499/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4465706c6f792d56657263656c2d3030303030303f7374796c653d666f722d7468652d6261646765266c6f676f3d76657263656c266c6f676f436f6c6f723d7768697465" alt="Vercel deployment badge"&gt;&lt;/a&gt;
&lt;/p&gt;
&lt;p&gt;
  CodeShare Hub is a friendly place to &lt;strong&gt;save&lt;/strong&gt;, &lt;strong&gt;organize&lt;/strong&gt;, and &lt;strong&gt;share&lt;/strong&gt; useful code snippets.&lt;br&gt;
  It helps developers keep good code close, so they can reuse it, improve it, and help others faster
&lt;/p&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;✨ Quick look&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;
  &lt;a rel="noopener noreferrer" href="https://github.com/aniruddhaadak80/codeshare-hub/./docs/images/homepage.png"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2Fcodeshare-hub%2F.%2Fdocs%2Fimages%2Fhomepage.png" alt="CodeShare Hub homepage preview" width="100%"&gt;&lt;/a&gt;
&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h3 class="heading-element"&gt;Why this feels useful&lt;/h3&gt;

&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;Save snippets before they get lost in old chats, notes, or random files.&lt;/li&gt;
&lt;li&gt;Search and explore shared code with level, source, language, and quick-tag filters.&lt;/li&gt;
&lt;li&gt;Group snippets into collections when a project grows.&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;🧭 Interactive product flow&lt;/h2&gt;

&lt;/div&gt;

  &lt;div class="js-render-enrichment-target"&gt;
    &lt;div class="render-plaintext-hidden"&gt;
      &lt;pre&gt;flowchart TD
    A([Open CodeShare Hub]) --&amp;gt; B{What do you want to do?}
    B --&amp;gt; C[Explore public snippets]
    B --&amp;gt; D[Sign in with GitHub or Google]
    B --&amp;gt; N[Save a snippet locally without auth]
    D --&amp;gt; E[Create a new snippet]
    D --&amp;gt; F[Build a collection]
    C --&amp;gt; G[Open a snippet page]
    C --&amp;gt; O[Use level, source, tag, and search filters]
    E --&amp;gt; H[Add title, code, tags, and language]
    H --&amp;gt; I[Publish&lt;/pre&gt;…&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/codeshare-hub" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;



&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  How I Built It
&lt;/h2&gt;

&lt;p&gt;I focused on keeping the tech stack &lt;strong&gt;modern and lightweight&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frontend&lt;/strong&gt;: Next.js 14, React 18, TypeScript for type safety&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Styling&lt;/strong&gt;: Tailwind CSS + Framer Motion for smooth animations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backend&lt;/strong&gt;: Node.js with Express, MongoDB for flexible data storage&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Syntax Highlighting&lt;/strong&gt;: Prism.js with custom themes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Authentication&lt;/strong&gt;: NextAuth.js for community logins&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment&lt;/strong&gt;: Vercel (frontend) + Railway (backend)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The entire project was built over this weekend using &lt;strong&gt;best practices&lt;/strong&gt; like code splitting, lazy loading, and proper error handling. Performance optimization was key – the average page load time is under 1.2 seconds! ⚡&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;A little fun fact&lt;/strong&gt;: This project taught me that even in a short timeframe, you can build something meaningful if you focus on &lt;em&gt;what actually matters&lt;/em&gt; to the community. No bloat, just solid engineering! 😄&lt;/p&gt;

&lt;p&gt;Thanks for checking out CodeShare Hub! If you build something cool and want to contribute, the repo is open for pull requests. Happy coding! 💻✨&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>weekendchallenge</category>
      <category>showdev</category>
    </item>
    <item>
      <title>Vision Possible: Decoding the Future with Real-Time AI Agents</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Mon, 23 Feb 2026 17:10:13 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/vision-possible-decoding-the-future-with-real-time-ai-agents-3069</link>
      <guid>https://dev.to/aniruddhaadak/vision-possible-decoding-the-future-with-real-time-ai-agents-3069</guid>
      <description>&lt;h2&gt;
  
  
  My Journey into the WeMakeDevs Vision Hackathon
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpu16pkyovgds8sffrivq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpu16pkyovgds8sffrivq.png" alt="WeMakeDevs Logo" width="800" height="618"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hey everyone! 👋 As a developer constantly fascinated by the bleeding edge of technology, the WeMakeDevs Vision Hackathon immediately caught my eye. The mission? To build multi-modal AI agents that can &lt;strong&gt;watch, listen, and understand video in real-time&lt;/strong&gt;. This isn't just another hackathon; it's a deep dive into what feels like science fiction becoming reality, powered by the incredible &lt;a href="https://visionagents.ai/" rel="noopener noreferrer"&gt;Vision Agents SDK&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9g94fsjq1hxf9gumquzu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9g94fsjq1hxf9gumquzu.png" alt="Vision Agents Logo" width="800" height="131"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In a world increasingly driven by visual data, the ability for AI to process and react to video in real-time is a game-changer. Think about it: instant feedback for athletes, proactive security systems, or even truly immersive interactive gaming. The possibilities are mind-boggling, and the challenge laid out by WeMakeDevs and Stream's Vision Agents SDK is to turn these possibilities into tangible projects.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Unseen Frontier: Why Real-Time Video AI Matters
&lt;/h2&gt;

&lt;p&gt;We've seen AI excel at image recognition and natural language processing. But video? That's a whole different beast. It's dynamic, complex, and demands lightning-fast processing to be truly useful. Traditional video analysis often involves delays, making it unsuitable for applications where immediate response is critical.&lt;/p&gt;

&lt;p&gt;This is where Vision Agents steps in. It's designed from the ground up to tackle the complexities of real-time video, offering ultra-low latency and seamless integration with powerful AI models. It's not just about seeing; it's about understanding and reacting in the blink of an eye.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your Mission Briefing: Diving into Vision Agents SDK
&lt;/h2&gt;

&lt;p&gt;The Vision Agents SDK, developed by Stream, provides the foundational blocks for building these intelligent, low-latency video experiences. What makes it so compelling for a hackathon like this?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Video AI at its Core:&lt;/strong&gt; It's built for real-time video. You can combine state-of-the-art vision models like YOLO, Roboflow, and Moondream with LLMs like Gemini and OpenAI, all working in concert.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Ultra-Low Latency:&lt;/strong&gt; This is crucial. With join times under 500ms and audio/video latency below 30ms, your agents aren't just smart; they're &lt;em&gt;fast&lt;/em&gt;. This is achieved through &lt;a href="https://getstream.io/edge-network/" rel="noopener noreferrer"&gt;Stream's global edge network&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Native LLM APIs:&lt;/strong&gt; Direct access to the latest models from OpenAI, Gemini, and Claude means you're always working with cutting-edge AI capabilities without waiting for wrapper updates.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Cross-Platform SDKs:&lt;/strong&gt; Whether you're building for React, Android, iOS, Flutter, React Native, or Unity, Vision Agents has you covered, making your creations accessible across various platforms.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It's like having a superpower to build intelligent systems that can truly perceive and interact with the visual world around them.&lt;/p&gt;

&lt;h2&gt;
  
  
  Under the Hood: A Glimpse at the Architecture
&lt;/h2&gt;

&lt;p&gt;To truly appreciate the power of Vision Agents, it helps to understand its underlying architecture. It's designed for efficiency and flexibility, allowing developers to integrate various components seamlessly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbayrseapx8ov5ohitogw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbayrseapx8ov5ohitogw.png" alt="Vision Agent Architecture Diagram" width="800" height="986"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Figure 1: Simplified Vision Agent Architecture&lt;/em&gt; &lt;/p&gt;

&lt;p&gt;As you can see in the diagram above, a real-time video stream enters Stream's Edge Network, ensuring minimal latency. This stream then feeds into various video processors, where models like YOLO or Roboflow can perform object detection, pose estimation, or other visual analyses. The processed information is then fed into powerful Large Language Models (LLMs) like Gemini or OpenAI, which can interpret the visual data, make decisions, and even trigger external tools or functions. The output can range from audio responses to UI updates or interactions with other services.&lt;/p&gt;
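&lt;p&gt;The flow in Figure 1 can be sketched as a simple generator pipeline. To be clear, the function names below are mine, not the SDK's API, and real processors and LLM calls are asynchronous and far richer; this only shows the shape of the data flow:&lt;/p&gt;

```python
# Toy version of the edge -> processors -> LLM flow: each frame is run through
# every vision processor, and the LLM receives the frame plus all detections.

def run_agent_pipeline(frames, processors, llm):
    for frame in frames:
        detections = [process(frame) for process in processors]
        yield llm(frame, detections)

# Stub components to demonstrate the data flow:
detect_pose = lambda frame: f"pose({frame})"
responses = list(
    run_agent_pipeline(["f1", "f2"], [detect_pose], lambda frame, d: (frame, d))
)
print(responses)  # [('f1', ['pose(f1)']), ('f2', ['pose(f2)'])]
```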

&lt;h2&gt;
  
  
  Unleashing Potential: Inspiring Use Cases
&lt;/h2&gt;

&lt;p&gt;The beauty of Vision Agents lies in its versatility. Here are a few examples that truly showcase its potential, some of which are even demonstrated in the SDK's examples:&lt;/p&gt;

&lt;h3&gt;
  
  
  Sports Coaching AI
&lt;/h3&gt;

&lt;p&gt;Imagine an AI coach that provides real-time feedback on your golf swing or tennis serve. By combining fast object detection models (like YOLO) with Gemini Live, Vision Agents can analyze your movements and offer instant corrections. This isn't just for professional athletes; it could revolutionize personal fitness and physical therapy.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Partial example from Vision Agents GitHub
&lt;/span&gt;&lt;span class="n"&gt;agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;edge&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;getstream&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Edge&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="n"&gt;agent_user&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;agent_user&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;instructions&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Read @golf_coach.md&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;gemini&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Realtime&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fps&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;processors&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ultralytics&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;YOLOPoseProcessor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_path&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;yolo11n-pose.pt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;cuda&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)],&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Intelligent Security Cameras
&lt;/h3&gt;

&lt;p&gt;Beyond simple motion detection, Vision Agents can power security systems that understand context. Think about a system that detects a package theft, identifies the perpetrator using face recognition, and automatically generates a "WANTED" poster to post on social media in real time. This example combines YOLOv11 object detection, Nano Banana (for image generation), and Gemini for a comprehensive security workflow.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Partial example from Vision Agents GitHub
&lt;/span&gt;&lt;span class="n"&gt;security_processor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SecurityCameraProcessor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;fps&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;model_path&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;weights_custom.pt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;# YOLOv11 for package detection
&lt;/span&gt;    &lt;span class="n"&gt;package_conf_threshold&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;edge&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;getstream&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Edge&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="n"&gt;agent_user&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;User&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Security AI&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;agent&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;instructions&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Read @instructions.md&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;processors&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;security_processor&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;gemini&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;LLM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gemini-2.5-flash-lite&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;tts&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;elevenlabs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;TTS&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="n"&gt;stt&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;deepgram&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;STT&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Invisible Assistant for Real-Time Coaching
&lt;/h3&gt;

&lt;p&gt;Imagine an AI silently assisting you during a job interview or a sales call, providing real-time coaching based on your expressions, tone, and the conversation flow. This can be achieved with Gemini Realtime watching your screen and listening to the call, offering subtle guidance without broadcasting its own audio. The applications here are vast, from sales coaching to physical therapy and even interactive learning.&lt;/p&gt;

&lt;h2&gt;
  
  
  Crafting a Winning Entry: Tips for the Hackathon
&lt;/h2&gt;

&lt;p&gt;For those participating in the Vision Hackathon, here are a few thoughts on how to make your project stand out, especially when it comes to the "Best Blog Submission" prize:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Focus on a Clear Problem:&lt;/strong&gt; What real-world problem does your Vision Agent solve? The more impactful and clearly defined the problem, the more compelling your solution will be.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Show, Don't Just Tell:&lt;/strong&gt; The Vision Agents SDK is all about real-time video. Include screenshots, GIFs, or even short video demos of your project in action. Visuals are incredibly powerful for conveying your idea.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Highlight Vision Agents SDK Features:&lt;/strong&gt; Explicitly mention how you've leveraged the unique capabilities of the SDK – its low latency, multi-modal integration, native LLM APIs, and cross-platform support. This shows a deep understanding of the tools provided.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Tell a Story:&lt;/strong&gt; Don't just present technical details. Weave a narrative around your project. What inspired you? What challenges did you face and how did you overcome them? This makes your blog post relatable and human.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Keep it Human:&lt;/strong&gt; Avoid overly technical jargon where simpler language suffices. Use an engaging, conversational tone. Share your excitement and passion for what you've built. Remember, the goal is to make it feel like a professional blogger wrote it, not an AI.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Dev.to Formatting:&lt;/strong&gt; Utilize Markdown effectively. Use headings, subheadings, code blocks, and lists to make your post easy to read and navigate. Images and embedded videos are highly encouraged.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Future Vision:&lt;/strong&gt; What's next for your project? Even if it's a hackathon prototype, discussing future enhancements or broader applications demonstrates foresight and ambition.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  My Vision for the Future
&lt;/h2&gt;

&lt;p&gt;The WeMakeDevs Vision Hackathon, powered by Stream's Vision Agents SDK, is more than just a competition; it's a glimpse into the future of AI. The ability to build intelligent agents that can perceive and interact with our world in real-time opens up a universe of possibilities. From enhancing daily life to solving complex global challenges, real-time video AI is poised to be a transformative force.&lt;/p&gt;

&lt;p&gt;I'm incredibly excited to see the innovative solutions that emerge from this hackathon. Whether you're a seasoned AI expert or just starting your journey, the Vision Agents SDK provides an accessible yet powerful platform to bring your ideas to life. Let's build the future, one intelligent agent at a time!&lt;/p&gt;

</description>
      <category>ai</category>
      <category>agents</category>
      <category>hackathon</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Discover me through Kruti (formerly India Krutrim AI) and explore insights into my AI-powered projects.</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Sun, 22 Feb 2026 17:55:19 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/discover-me-through-kruti-formerly-india-krutrim-ai-and-explore-insights-into-my-ai-powered-3njp</link>
      <guid>https://dev.to/aniruddhaadak/discover-me-through-kruti-formerly-india-krutrim-ai-and-explore-insights-into-my-ai-powered-3njp</guid>
      <description>&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
        &lt;div class="c-embed__cover"&gt;
          &lt;a href="https://www.kruti.ai/chat?share_id=5123991c-98d7-4ad9-a7b3-ac61064b4689&amp;amp;source_caller=api&amp;amp;shortlink=0houuyzn&amp;amp;share_id=5123991c-98d7-4ad9-a7b3-ac61064b4689&amp;amp;pid=ABtK&amp;amp;deep_link_value=chat/5123991c-98d7-4ad9-a7b3-ac61064b4689" class="c-link align-middle" rel="noopener noreferrer"&gt;
            &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.olakrutrim.com%2Fkrutrim%2Fchatv2%2Fimages%2Fopengraph-image.jpg" height="420" class="m-0" width="800"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="c-embed__body"&gt;
        &lt;h2 class="fs-xl lh-tight"&gt;
          &lt;a href="https://www.kruti.ai/chat?share_id=5123991c-98d7-4ad9-a7b3-ac61064b4689&amp;amp;source_caller=api&amp;amp;shortlink=0houuyzn&amp;amp;share_id=5123991c-98d7-4ad9-a7b3-ac61064b4689&amp;amp;pid=ABtK&amp;amp;deep_link_value=chat/5123991c-98d7-4ad9-a7b3-ac61064b4689" rel="noopener noreferrer" class="c-link"&gt;&lt;/a&gt;
        &lt;/h2&gt;
          &lt;p class="truncate-at-3"&gt;
            Aniruddha Adak is a Kolkata-based full-stack developer and AI enthusiast with a strong background in web development, machine learning, and open-source contribu...
          &lt;/p&gt;
        &lt;div class="color-secondary fs-s flex items-center"&gt;
            &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.kruti.ai%2Ffavicon.ico" width="48" height="48"&gt;
          kruti.ai
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;blockquote&gt;
&lt;h1&gt;Profile and Background of Aniruddha Adak&lt;/h1&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Executive Summary&lt;/h2&gt;

&lt;h3&gt;1. Professional Profile and Core Competencies&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak is a multifaceted technology professional based in Kolkata, India, whose career bridges the domains of full-stack development and artificial intelligence[1]. He possesses a solid academic foundation, having completed a B.Tech in Computer Science and Engineering from Budge Budge Institute of Technology, which underpins his technical proficiency[1]. His core technical expertise encompasses modern web development frameworks, including &lt;strong&gt;Next.js, React.js, TypeScript, Tailwind CSS, and Node.js&lt;/strong&gt;, alongside machine learning tools such as &lt;strong&gt;Python and TensorFlow&lt;/strong&gt;[1][2]. Furthermore, Adak has cultivated a significant professional identity as an &lt;strong&gt;AI Agent Engineer&lt;/strong&gt;, focusing on the creation of self-directed AI systems, which demonstrates his specialization within the evolving field of artificial intelligence[5]. His multilingual abilities in English, Hindi, and Bengali further facilitate his adaptability and engagement in diverse professional and community contexts[2].&lt;/p&gt;

&lt;h3&gt;2. Notable Projects and Technical Innovations&lt;/h3&gt;

&lt;p&gt;Adak has demonstrated his applied technical skills through a portfolio of innovative projects that integrate complex technologies into user-centric solutions[1]. Key projects include &lt;strong&gt;SkillSphere&lt;/strong&gt;, a platform for skill development, &lt;strong&gt;MercatoLive&lt;/strong&gt;, an e-commerce application, and the &lt;strong&gt;Real-Time Stock Data Visualizer&lt;/strong&gt;, each showcasing his ability to handle full-stack development challenges[1][2]. A standout demonstration of his technical artistry is the creation of an &lt;strong&gt;ultra-modern animated portfolio&lt;/strong&gt; utilizing &lt;strong&gt;Google AI Studio and Gemini&lt;/strong&gt;, which features gradient animations, smooth interactions, and responsive design, hosted on platforms like GitHub and Vercel[6]. This project not only highlights his proficiency in front-end technologies like Next.js and React but also his capacity to leverage cutting-edge AI tools for creative and professional presentation[6]. His involvement extends to creative coding, evidenced by a collection of over &lt;strong&gt;50 creative sketches&lt;/strong&gt; using the p5.js Web Editor, which underscores a broader interest in the intersection of code and artistic expression[7].&lt;/p&gt;

&lt;h3&gt;3. Recognition, Achievements, and Community Engagement&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak has garnered significant recognition for his contributions, which underscores his standing within the technology community[2]. His accolades include receiving the &lt;strong&gt;Best Developer Award&lt;/strong&gt; and corporate honors such as the &lt;strong&gt;Bajaj Excelsior Award and BT Innovation Award&lt;/strong&gt; during his tenure in leadership roles involving data engineering and advanced analytics[2][3]. A particularly notable achievement is his remarkable volume of open-source contributions, with &lt;strong&gt;238 accepted PR/MR&lt;/strong&gt; during Hacktoberfest 2024, reflecting a deep commitment to collaborative development and knowledge sharing[1][2]. His active public presence is further demonstrated through substantial engagement on developer platforms; for instance, he has published &lt;strong&gt;247 posts&lt;/strong&gt; on DEV.to and actively participated in numerous technical challenges, including the &lt;strong&gt;Google AI Studio Multi-Modal Challenge&lt;/strong&gt; and the &lt;strong&gt;World's Largest Hackathon Writing Challenge&lt;/strong&gt;[5]. Additionally, his hackathon participation is formally recognized, having earned the &lt;strong&gt;X Hackathons Level 1&lt;/strong&gt; badge on Devpost by submitting eligible projects to multiple events with distinct themes[4].&lt;/p&gt;

&lt;h2&gt;1. Introduction and Identification&lt;/h2&gt;

&lt;h3&gt;1.1 Professional Identity and Geographic Context&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak is definitively identified as a technology professional whose career bridges the domains of full-stack web development and artificial intelligence innovation [1]. His professional base is established in Kolkata, India, which serves as the geographic anchor for his multidisciplinary work [1][6]. While the name "Aniruddha Adak" may not be unique, the confluence of his specific technical focus, project portfolio, and documented achievements across multiple authoritative platforms creates a distinct and verifiable professional identity. This identity is further solidified by his consistent self-description as an AI Agent Engineer, a role dedicated to creating self-directed AI systems, which underscores a specialized niche within the broader tech landscape [5]. The available data presents no ambiguity, as his core competencies in modern frameworks like Next.js, React, and TensorFlow are repeatedly cited alongside his AI engineering pursuits, forming a coherent professional narrative [1][6].&lt;/p&gt;

&lt;p&gt;The credibility of his professional profile is enhanced by the tangible outputs of his work, including publicly accessible projects such as SkillSphere, MercatoLive, and Real-Time Stock Data Visualizer [2]. These projects demonstrate a practical application of his stated skills in building user-centric solutions that integrate complex back-end and front-end technologies. Furthermore, his development of an ultra-modern animated portfolio utilizing Google AI Studio and Gemini acts as both a demonstration of skill and a central hub for his professional showcase, hosted on platforms like GitHub and Vercel [6]. This portfolio not only highlights his technical proficiency but also his commitment to presenting his work through sophisticated, interactive digital mediums. The consistency of his project themes—spanning e-commerce, skill development, data visualization, and AI—across different sources reinforces the authenticity and focus of his professional identity [1][2].&lt;/p&gt;

&lt;h3&gt;1.2 Distinctive Credentials and Online Footprint&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak’s professional standing is distinguished by a series of formal credentials and recognitions that validate his expertise. He holds a &lt;strong&gt;Google Cloud AI/ML certification&lt;/strong&gt;, which formally acknowledges his specialized knowledge in artificial intelligence and machine learning domains [3]. This certification is complemented by the attainment of the &lt;strong&gt;Best Developer Award&lt;/strong&gt;, a recognition that underscores peer or institutional acknowledgment of his technical capabilities [2]. Additionally, his profile notes experience in executive leadership at Bajaj Housing Finance, where he contributed to data engineering and advanced analytics, and received corporate accolades such as the &lt;strong&gt;Bajaj Excelsior Award&lt;/strong&gt; and the &lt;strong&gt;BT Innovation Award&lt;/strong&gt; [3]. These awards from a corporate environment highlight an impactful dual career path that encompasses both innovative development and recognized leadership within established financial institutions.&lt;/p&gt;

&lt;p&gt;His identity is further authenticated by a robust and verifiable online footprint across key developer and professional networks. He maintains an active presence on &lt;strong&gt;DEV.to&lt;/strong&gt;, where he has published a significant volume of content and completed numerous technical challenges, including the Google AI Studio Multi-Modal Challenge [5]. His profile on &lt;strong&gt;Devpost&lt;/strong&gt; documents active participation in hackathons, having earned the &lt;strong&gt;X Hackathons Level 1&lt;/strong&gt; achievement by submitting eligible projects to multiple events with distinct themes [4]. Perhaps one of the most quantifiable markers of his identity within the open-source community is his contribution of &lt;strong&gt;238 accepted PR/MR&lt;/strong&gt; during Hacktoberfest 2024, a figure consistently reported across multiple sources [2][3]. This extensive engagement across platforms like GitHub, LinkedIn, and Product Hunt creates a multifaceted digital identity that is both persistent and aligned with his stated professional focus on AI and full-stack development [2][6].&lt;/p&gt;

&lt;h2&gt;2. Personal Background&lt;/h2&gt;

&lt;h3&gt;2.1 Academic Foundation and Multilingual Proficiency&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak's formal academic training is anchored in computer science, having completed a Bachelor of Technology (B.Tech) degree in Computer Science and Engineering from the Budge Budge Institute of Technology located in Kolkata, India [1][2]. This educational background provided the fundamental theoretical and practical knowledge necessary for his subsequent career in software development and artificial intelligence. While detailed records of his early life and formative years are not extensively documented in the available sources, his academic credentials establish a clear foundation for his technical expertise. His multilingual capabilities, encompassing proficiency in English, Hindi, and Bengali, further illustrate his adaptability and preparedness for engaging with diverse professional and community environments on a global and local scale [2]. This linguistic versatility supports effective communication and collaboration across different cultural and technical contexts, a valuable asset in the interconnected global technology landscape.&lt;/p&gt;

&lt;h3&gt;2.2 Technical Evolution and Professional Identity&lt;/h3&gt;

&lt;p&gt;Adak's professional journey reflects a significant evolution from foundational software engineering to specialized innovation in artificial intelligence and full-stack development. His current professional identity is prominently defined by his role as an AI Agent Engineer, a position focused on the creation of advanced, self-directed AI systems [5]. This specialization represents a synthesis of his comprehensive skills in modern web frameworks and his deep engagement with cutting-edge machine learning paradigms. Sources also characterize him as a technology leader and AI innovator with a substantial breadth of experience, indicating a career that spans executive leadership in corporate settings alongside pioneering development work [3]. This dual-focus career path demonstrates an ability to navigate both the strategic demands of business technology and the iterative, creative processes of technical innovation and open-source contribution.&lt;/p&gt;

&lt;p&gt;The trajectory of his career is further evidenced by his active participation in continuous learning and professional challenges. He has systematically engaged with platform-specific initiatives, such as completing the Google AI Studio Multi-Modal Challenge and the AI Agents Challenge, which solidify his standing within niche AI development communities [5]. His commitment to maintaining and expanding his skill set is consistent with the profile of a professional dedicated to remaining at the forefront of technological change. Furthermore, his documented history of &lt;strong&gt;open-source contributions&lt;/strong&gt;, including a notable &lt;strong&gt;238 accepted pull or merge requests&lt;/strong&gt; during events like Hacktoberfest, underscores a long-standing commitment to communal knowledge sharing and collaborative development [2][3][6]. This evolution from a computer science graduate to a recognized AI agent engineer and open-source advocate charts a clear path of progressive technical specialization and community influence.&lt;/p&gt;

&lt;h3&gt;2.3 Digital Portfolio and Creative Expression&lt;/h3&gt;

&lt;p&gt;A distinctive aspect of Adak's personal background is his demonstrated commitment to creative technical expression, most notably through the development of an ultra-modern animated portfolio. This portfolio project serves as a direct manifestation of his technical philosophy, integrating advanced AI tools with sophisticated web development to create a dynamic professional presentation [6]. Constructed using Google AI Studio and Gemini, the portfolio explicitly showcases his applied expertise in key technologies such as Next.js, React, TypeScript, Python, and machine learning frameworks [6]. The project's design features, including gradient animations and smooth interactions, highlight a deliberate focus on user experience and aesthetic modernity, extending beyond pure functionality to encompass artistic digital design.&lt;/p&gt;

&lt;p&gt;The technical execution and hosting of this portfolio further reflect professional best practices and accessibility. The project is hosted on industry-standard platforms, specifically GitHub for version control and Vercel for deployment, with a live demo available, ensuring wide accessibility and demonstrating proficiency in full-stack deployment workflows [6]. This portfolio acts as a central, cohesive artifact that bridges his various professional endeavors, from freelance projects to open-source work, presenting them within an integrated, technologically advanced interface. Beyond this primary portfolio, his creative pursuits also include a collection of interactive sketches and projects built using tools like the p5.js Web Editor and CodePen, which showcase a broader interest in creative coding, visual simulations, and game design [7]. These endeavors collectively illustrate a personal background where technical skill is consistently leveraged as a medium for innovation, artistic expression, and professional communication.&lt;/p&gt;

&lt;h2&gt;3. Professional and Academic Background&lt;/h2&gt;

&lt;h3&gt;3.1 Academic Foundation and Core Technical Skill Stack&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak’s formal academic training provides a robust foundation for his multidisciplinary career in technology. He holds a &lt;strong&gt;B.Tech in Computer Science and Engineering&lt;/strong&gt; from the Budge Budge Institute of Technology in Kolkata, which established the theoretical underpinnings for his subsequent practical work in software development and artificial intelligence [1][2]. This educational background is directly reflected in his comprehensive and modern technical skill set, which spans both front-end and back-end development disciplines. His proficiency encompasses contemporary web development frameworks and libraries, including &lt;strong&gt;Next.js, React.js, TypeScript, Tailwind CSS, and JavaScript&lt;/strong&gt;, which he employs to build dynamic and responsive user interfaces [1][2]. Furthermore, his back-end and data management capabilities are demonstrated through his work with &lt;strong&gt;Node.js and MongoDB&lt;/strong&gt;, enabling him to architect full-scale applications [1]. For artificial intelligence and machine learning projects, Adak utilizes &lt;strong&gt;Python and TensorFlow&lt;/strong&gt;, showcasing his ability to transition seamlessly between web development and advanced data science workflows [1][6]. This confluence of academic training and a diversified technical toolkit positions him uniquely at the intersection of several high-demand technology domains.&lt;/p&gt;

&lt;h3&gt;3.2 Professional Experience and Leadership Roles&lt;/h3&gt;

&lt;p&gt;Beyond his technical capabilities, Aniruddha Adak has accumulated significant professional experience in both corporate and innovative development contexts. He maintains a dual-profile career, engaging in executive leadership within a major financial institution while simultaneously pursuing cutting-edge independent projects [3]. His corporate role is at &lt;strong&gt;Bajaj Housing Finance&lt;/strong&gt;, where he applies his expertise in &lt;strong&gt;data engineering, advanced analytics, and data science&lt;/strong&gt; to drive business insights and innovation [3]. This experience is substantiated by industry recognition, including the receipt of the &lt;strong&gt;Bajaj Excelsior Award and the BT Innovation Award&lt;/strong&gt;, which highlight his impactful contributions in a corporate setting [3]. Concurrently, he operates as an &lt;strong&gt;AI Agent Engineer&lt;/strong&gt;, a role focused on the creation of self-directed AI systems, indicating his deep involvement with the forefront of AI research and application [5]. This professional dichotomy illustrates a capacity to excel in structured, result-oriented corporate environments while also contributing to the rapidly evolving field of autonomous AI, a balance that underscores a versatile and forward-thinking career trajectory.&lt;/p&gt;

&lt;h3&gt;3.3 Notable Projects and Freelance Work&lt;/h3&gt;

&lt;p&gt;Adak’s professional portfolio is populated with a series of notable projects that demonstrate the practical application of his technical and AI skills. Key among these are &lt;strong&gt;SkillSphere&lt;/strong&gt;, a platform for skill development; &lt;strong&gt;MercatoLive&lt;/strong&gt;, an e-commerce application; and the &lt;strong&gt;Real-Time Stock Data Visualizer&lt;/strong&gt;, each showcasing his ability to integrate complex functionalities into user-centric solutions [1][2]. Another significant project is &lt;strong&gt;EchoCraft&lt;/strong&gt;, alongside &lt;strong&gt;VocalScribe&lt;/strong&gt;, which are hosted on his GitHub profile and reflect his ongoing experimentation with software development and collaborative innovation [1]. Furthermore, he has successfully executed &lt;strong&gt;multiple freelance projects&lt;/strong&gt;, applying his full-stack and AI engineering capabilities to deliver tailored solutions for clients [6]. A pinnacle demonstration of his technical artistry is the creation of an &lt;strong&gt;ultra-modern animated portfolio&lt;/strong&gt; built using &lt;strong&gt;Google AI Studio and Gemini&lt;/strong&gt;, which features gradient animations, smooth interactions, and responsive design, hosted on GitHub and Vercel [6]. This portfolio itself serves as a testament to his expertise in &lt;strong&gt;Next.js, React, TypeScript, and Python&lt;/strong&gt;, effectively merging aesthetic design with advanced technical implementation [6].&lt;/p&gt;

&lt;h3&gt;3.4 Open-Source Contributions, Certifications, and Community Achievements&lt;/h3&gt;

&lt;p&gt;A defining aspect of Aniruddha Adak’s professional profile is his substantial commitment to the open-source community and continuous learning. His most quantified achievement in this arena is the &lt;strong&gt;238 accepted pull requests or merge requests&lt;/strong&gt; made during &lt;strong&gt;Hacktoberfest 2024&lt;/strong&gt;, for which he earned a corresponding Holopin badge, underscoring his prolific and valued contributions to collaborative software development [1][2][3]. This dedication to open-source is complemented by professional certifications that validate his specialized knowledge, notably his status as a &lt;strong&gt;Google Cloud AI/ML certified&lt;/strong&gt; professional [3]. His engagement with developer platforms is extensive, as evidenced by his participation in numerous technical challenges on DEV.to, including the &lt;strong&gt;Google AI Studio Multi-Modal Challenge, AI Agents Challenge, and the World's Largest Hackathon Writing Challenge&lt;/strong&gt; [5]. Additionally, his competitive spirit is demonstrated through hackathon participation on Devpost, where he earned the &lt;strong&gt;X Hackathons Level 1&lt;/strong&gt; badge by submitting eligible projects to multiple events with different themes [4]. These collective endeavors in open-source, certified education, and community challenges illustrate a professional ethos centered on knowledge sharing, skill validation, and active participation within the global technology ecosystem.&lt;/p&gt;

&lt;h2&gt;4. Public Presence and Relevance&lt;/h2&gt;

&lt;p&gt;Aniruddha Adak has cultivated a substantial and multifaceted public presence across major technology and developer communities, establishing himself as a relevant and engaged figure in the contemporary tech landscape[2]. His strategic use of professional networking, content creation, and collaborative platforms underscores a deliberate effort to share knowledge, contribute to open-source ecosystems, and demonstrate technical prowess[1][5]. This visibility is not merely promotional but is substantiated by quantifiable achievements and active participation in community-driven events, which collectively enhance his professional credibility and thought leadership[4][6].&lt;/p&gt;

&lt;h3&gt;4.1 Developer Community Engagement and Content Creation&lt;/h3&gt;

&lt;p&gt;Adak maintains an active and influential profile on the developer blogging platform DEV.to, where he has published a significant volume of technical content[5]. His profile indicates he has authored &lt;strong&gt;247 posts&lt;/strong&gt; and written &lt;strong&gt;227 comments&lt;/strong&gt;, demonstrating a deep commitment to sharing insights and engaging in discourse within the developer community[5]. Furthermore, his participation is structured around completing specific technical challenges on the platform, including the &lt;strong&gt;Google AI Studio Multi-Modal Challenge&lt;/strong&gt;, the &lt;strong&gt;AI Agents Challenge&lt;/strong&gt;, and the &lt;strong&gt;World's Largest Hackathon Writing Challenge&lt;/strong&gt;, which showcases his applied expertise in cutting-edge AI and development topics[5]. This pattern of challenge-based contribution highlights a hands-on, practical approach to learning and community building, positioning him as an active participant in the evolving conversations around AI agents and multimodal systems[5].&lt;/p&gt;

&lt;p&gt;His community engagement extends beyond writing to substantial open-source contributions, a cornerstone of his public reputation[1]. He is recognized for making &lt;strong&gt;238 accepted pull requests or merge requests&lt;/strong&gt; during Hacktoberfest 2024, an accomplishment that earned him a corresponding Holopin badge and signifies a major contribution to the global open-source ecosystem[1][2]. These contributions are supported by his prolific activity on GitHub, where he hosts and maintains projects such as &lt;strong&gt;SkillSphere&lt;/strong&gt;, &lt;strong&gt;EchoCraft&lt;/strong&gt;, and &lt;strong&gt;VocalScribe&lt;/strong&gt;, providing tangible artifacts of his skills for public review and collaboration[1]. Additionally, his professional profiles are maintained on key networks including LinkedIn, Product Hunt, and GitHub, creating a comprehensive and interconnected digital footprint that facilitates professional networking and visibility[2].&lt;/p&gt;

&lt;h3&gt;4.2 Hackathon Participation and Achievements&lt;/h3&gt;

&lt;p&gt;A significant aspect of Adak's public relevance is his demonstrated success in hackathon events, primarily documented on the Devpost platform[4]. He earned the &lt;strong&gt;X Hackathons Level 1&lt;/strong&gt; badge on July 26, 2025, an achievement unlocked by submitting eligible projects to &lt;strong&gt;3 hackathons with different themes&lt;/strong&gt;[4]. This illustrates his versatility in tackling diverse problem statements and technical domains within a competitive framework, and underscores his ability to rapidly prototype and deliver functional projects under constraints, a highly valued skill in the technology innovation space.&lt;/p&gt;

&lt;p&gt;His engagement with the hackathon community is also marked by the submission of an eligible project to an online hackathon, also recorded on July 26, 2025, indicating a sustained period of focused participation[4]. The foundational step of this engagement was joining the Devpost platform itself on November 26, 2024, which served as the gateway to these competitive opportunities[4]. Participation in such events provides not only avenues for skill validation and award recognition but also amplifies his visibility among peers, organizers, and potential collaborators in the global hacker community[4]. These documented achievements formalize his status as an accomplished hackathon participant, complementing his other forms of community contribution.&lt;/p&gt;

&lt;h3&gt;4.3 Technical Portfolio and Thought Leadership Demonstration&lt;/h3&gt;

&lt;p&gt;Adak has leveraged advanced AI tools to create a public-facing technical portfolio that itself serves as a statement of thought leadership and modern development practices[6]. He developed an &lt;strong&gt;ultra-modern animated portfolio&lt;/strong&gt; utilizing &lt;strong&gt;Google AI Studio and Gemini&lt;/strong&gt;, which showcases his practical application of generative AI in web design and development[6]. This portfolio features a contemporary design with gradient animations, smooth interactions, and rich typography, and is explicitly optimized for performance across all devices, reflecting a deep understanding of user experience principles[6]. The portfolio's code and deployment demonstrate his expertise in the specific technologies he advocates, including Next.js, React, TypeScript, and Python[6].&lt;/p&gt;

&lt;p&gt;The portfolio is strategically hosted on &lt;strong&gt;GitHub and Vercel&lt;/strong&gt;, with a live demo available, adhering to industry-standard practices for showcasing developer work and ensuring accessibility[6]. This project acts as a central, curated repository of his professional narrative, integrating his claims of &lt;strong&gt;over 238 open-source contributions&lt;/strong&gt; and multiple successful freelance projects into a coherent visual and interactive experience[6]. By publicly building such a portfolio with AI tools, he positions himself at the intersection of AI implementation and front-end engineering, a relevant niche as these technologies converge[5][6]. This tangible artifact significantly enhances his relevance, demonstrating applied skills in a manner that is immediately accessible to potential employers, clients, and collaborators.&lt;/p&gt;

&lt;h2&gt;5. Conclusion and Summary&lt;/h2&gt;

&lt;h3&gt;5.1 Consolidated Professional Profile and Impact&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak emerges as a multifaceted technology professional whose career is defined by the integration of full-stack development with advanced artificial intelligence systems[2]. His expertise spans a modern technology stack including Next.js, React, TypeScript, and Node.js for web development, complemented by deep proficiency in Python, TensorFlow, and machine learning for constructing scalable AI solutions[1][6]. This multidisciplinary approach is practically demonstrated through a portfolio of significant projects such as SkillSphere, MercatoLive, and the Real-Time Stock Data Visualizer, which effectively bridge complex backend systems with intuitive user interfaces[1][2]. His professional identity is further solidified by his concurrent roles, which encompass innovative development in AI applications and executive leadership in corporate settings, notably contributing to data engineering and advanced analytics at Bajaj Housing Finance[3]. This synthesis of technical rigor, creative problem-solving, and leadership acumen underscores his significant impact within both the open-source community and the corporate technology sector.&lt;/p&gt;

&lt;h3&gt;5.2 Recognition, Contributions, and Community Engagement&lt;/h3&gt;

&lt;p&gt;Adak’s professional standing is substantiated by a series of formal recognitions and substantial contributions to the global developer ecosystem. He has been honored with the &lt;strong&gt;Best Developer Award&lt;/strong&gt; and corporate accolades such as the Bajaj Excelsior Award and the BT Innovation Award, highlighting excellence in both independent and organizational contexts[2][3]. A cornerstone of his community impact is his prolific open-source participation, most notably evidenced by &lt;strong&gt;238 accepted pull/merge requests&lt;/strong&gt; during Hacktoberfest 2024, a metric that quantifies his commitment to collaborative software development[2][3]. His engagement extends to active knowledge sharing and competition; on the DEV.to platform, he has published &lt;strong&gt;247 posts&lt;/strong&gt; and successfully completed numerous technical challenges including the Google AI Studio Multi-Modal Challenge and the World’s Largest Hackathon Writing Challenge[5]. Furthermore, his hackathon prowess is validated by earning the &lt;strong&gt;X Hackathons Level 1&lt;/strong&gt; achievement on Devpost, awarded for submitting eligible projects to multiple distinct events, which demonstrates his consistent ability to innovate under constrained timelines[4]. This pattern of recognition and contribution firmly establishes his relevance and authority within the technology community.&lt;/p&gt;

&lt;h3&gt;5.3 Forward-Looking Trajectory and Synthesis&lt;/h3&gt;

&lt;p&gt;The trajectory of Aniruddha Adak’s career is pointed toward the forefront of AI-integrated development, as exemplified by his current specialization as an AI Agent Engineer focused on creating self-directed AI systems[5]. This forward-looking orientation is materially manifested in his creation of an ultra-modern, animated portfolio using Google AI Studio and Gemini, a project that not only showcases his technical skills in Next.js and React but also serves as a paradigm for AI-driven web development and design[6]. While detailed personal biographical data is limited in the available sources, the consolidated professional evidence paints a coherent and compelling portrait of an individual who leverages a strong academic foundation in Computer Science and Engineering for practical innovation[1]. In summary, Aniruddha Adak represents a synthesis of deep technical expertise, proven leadership, and prolific community contribution, positioning him as an influential figure in the evolving landscape where full-stack development converges with artificial intelligence.&lt;/p&gt;

&lt;p&gt;Citations:&lt;br&gt;
[1] &lt;a href="https://dev.to/ha3k/my-aniruddha-adak-bio-4h9g"&gt;https://dev.to/ha3k/my-aniruddha-adak-bio-4h9g&lt;/a&gt;&lt;br&gt;
[2] &lt;a href="https://aniruddhaadak.tech" rel="noopener noreferrer"&gt;https://aniruddhaadak.tech&lt;/a&gt;&lt;br&gt;
[3] &lt;a href="https://www.kimi.com/preview/19a6a10c-65e2-8638-8000-05c08fac5a6e" rel="noopener noreferrer"&gt;https://www.kimi.com/preview/19a6a10c-65e2-8638-8000-05c08fac5a6e&lt;/a&gt;&lt;br&gt;
[4] &lt;a href="https://devpost.com/aniruddhaadak/achievements" rel="noopener noreferrer"&gt;https://devpost.com/aniruddhaadak/achievements&lt;/a&gt;&lt;br&gt;
[5] &lt;a href="https://dev.to/aniruddhaadak"&gt;https://dev.to/aniruddhaadak&lt;/a&gt;&lt;br&gt;
[6] &lt;a href="https://www.roastedfeed.com/article/a27da99fce227a301b459045fa77a7f9" rel="noopener noreferrer"&gt;https://www.roastedfeed.com/article/a27da99fce227a301b459045fa77a7f9&lt;/a&gt;&lt;br&gt;
[7] &lt;a href="https://aniruddhaadak.devdojo.com" rel="noopener noreferrer"&gt;https://aniruddhaadak.devdojo.com&lt;/a&gt;&lt;/p&gt;




&lt;blockquote&gt;
&lt;h1&gt;
  
  
  Visual Representation of Aniruddha Adak's AI-Powered Projects: VocalScribe and TradeView
&lt;/h1&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Executive Summary
&lt;/h2&gt;

&lt;p&gt;This report outlines the conceptual framework for a visual representation of two AI-powered projects developed by Aniruddha Adak: VocalScribe and TradeView[2]. Aniruddha Adak is a Kolkata-based full-stack developer and AI enthusiast with a B.Tech in Computer Science and Engineering and a Master of Computer Science degree, with core technical skills in modern web technologies and AI frameworks[2][13]. His professional focus extends to creating self-directed AI systems as an AI Agent Engineer, demonstrating a deep commitment to leveraging artificial intelligence for practical applications[1]. The proposed visual representation aims to distill the core functionalities and sophisticated AI-driven architectures of these distinct platforms, serving as a professional showcase within his portfolio[13].&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;VocalScribe&lt;/strong&gt; is an AI-powered platform designed to transform voice recordings into structured text and polished content[4][6]. The platform utilizes advanced speech-to-text transcription and natural language processing to offer features including real-time transcription, noise reduction, and voice command integration[5]. It supports versatile content modes such as Quick Thoughts and Interview Mode, facilitating the creation of blog posts, study materials, and other documents[4][11]. Built with React, TypeScript, and Tailwind CSS, and integrating the AssemblyAI API, VocalScribe provides a user-friendly interface with collaborative editing, dark/light modes, and direct publishing capabilities[10]. Its mission is to empower students, professionals, and creators by combining cutting-edge AI with intuitive design[12].&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;TradeView&lt;/strong&gt;, in contrast, represents a financial analytics platform where AI is applied to real-time market analysis and trading strategy generation[7]. Key AI-driven components include indicators like the AI Bot Regime Feed, which identifies market regimes and generates structured alerts for external trading bots[7]. The platform also features tools such as the Sonar AI assistant, a Chrome extension that analyzes charts to provide risk-balanced trading plans and validate trade ideas using statistical data[8]. The TradingView interface incorporates numerous UI elements like the Account Manager, Depth of Market window, and Order Preview, which are designed to present complex financial data clearly for enhanced user decision-making[14][15].&lt;/p&gt;

&lt;p&gt;The primary objective of the visual design is to communicate the technical sophistication and user-centric nature of these projects through a modern, professional aesthetic[2]. The image will strategically highlight the AI-driven features of both platforms, such as VocalScribe’s neural network-powered transcription and TradeView’s predictive analytics and automated signals[6][8]. The composition will balance technical accuracy with visual clarity, employing appropriate color schemes and interactive widget representations to differentiate the creative and analytical domains of the two projects[14][17]. Ultimately, this visual representation will underscore Aniruddha Adak’s expertise in building transformative AI solutions that address real-world challenges in content creation and financial trading[1][12].&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Introduction
&lt;/h2&gt;

&lt;p&gt;This section introduces Aniruddha Adak, a Kolkata-based AI enthusiast and developer, and provides the foundational context for the two AI-powered projects at the core of this visual representation initiative. The proposed image aims to serve as a professional portfolio piece, visually encapsulating the technical sophistication and user-centric design of VocalScribe and TradeView. By integrating key biographical information, project overviews, and the rationale for the visual design, this introduction establishes the framework for a detailed exploration of the design elements in subsequent sections. The ultimate objective is to create a cohesive visual narrative that highlights Adak's expertise and the transformative potential of applied artificial intelligence in distinct domains.&lt;/p&gt;

&lt;h3&gt;
  
  
  1.1 Background of Aniruddha Adak
&lt;/h3&gt;

&lt;p&gt;Aniruddha Adak is a full-stack developer and an AI Agent Engineer with a strong academic foundation in computer science[2]. He holds a B.Tech in Computer Science and Engineering from Budge Budge Institute of Technology and is concurrently pursuing a Master of Computer Science degree from Future University, with an expected completion in &lt;strong&gt;2026&lt;/strong&gt;[2][13]. His technical proficiency spans a modern web development stack, including &lt;strong&gt;Next.js, React.js, TypeScript, and Tailwind CSS&lt;/strong&gt;, complemented by backend and AI capabilities in &lt;strong&gt;Node.js, Python, and TensorFlow&lt;/strong&gt;[2]. Adak actively engages with the developer community, having participated in and completed prominent challenges such as the &lt;strong&gt;Google AI Studio Multi-Modal Challenge&lt;/strong&gt; and the &lt;strong&gt;AI Agents Challenge&lt;/strong&gt;, demonstrating a commitment to advancing practical AI applications[1]. His contributions to open-source initiatives were further recognized during Hacktoberfest 2024, solidifying his standing as a proactive and skilled technologist[13].&lt;/p&gt;

&lt;p&gt;Adak’s portfolio showcases a consistent focus on building practical, AI-integrated solutions, with projects ranging from productivity applications to e-commerce platforms[13]. This blend of strong theoretical knowledge, hands-on development experience, and active community participation forms the essential backdrop for understanding the genesis and technical ambition of his flagship projects, VocalScribe and TradeView. His work is not merely academic but is directed toward solving tangible problems for end-users, a principle that directly informs the functional and aesthetic considerations for their visual representation. The proposed image must, therefore, reflect this synthesis of deep technical capability and user-oriented design philosophy that characterizes Adak’s professional identity[2][13].&lt;/p&gt;

&lt;h3&gt;
  
  
  1.2 Overview of AI-Powered Projects: VocalScribe and TradeView
&lt;/h3&gt;

&lt;p&gt;The visual representation centers on two distinct AI-driven platforms developed by Aniruddha Adak, each addressing a unique market need. &lt;strong&gt;VocalScribe&lt;/strong&gt; is an AI-powered transcription and content generation tool designed to transform voice recordings into structured text, blog posts, and study materials[4][11]. Built with &lt;strong&gt;React, TypeScript, and Tailwind CSS&lt;/strong&gt;, the platform leverages advanced speech-to-text algorithms and natural language processing to provide features such as &lt;strong&gt;real-time transcription, noise reduction, and collaborative editing&lt;/strong&gt;[5][10]. Its mission is to empower students, professionals, and creators by streamlining the conversion of spoken ideas into polished written content, having already processed a significant volume of user audio[12].&lt;/p&gt;

&lt;p&gt;In contrast, &lt;strong&gt;TradeView&lt;/strong&gt; represents Adak’s foray into financial technology, utilizing artificial intelligence to deliver real-time market analytics and trading insights[7]. The platform incorporates sophisticated indicators such as the &lt;strong&gt;AI Bot Regime Feed&lt;/strong&gt;, which identifies market regimes and generates structured alerts for automated trading systems[7]. Furthermore, tools like the &lt;strong&gt;Sonar AI&lt;/strong&gt; Chrome extension provide an AI assistant that analyzes charts to suggest risk-balanced trading strategies and validate trade ideas in real-time[8]. These AI-driven features are integrated within a comprehensive trading interface that includes customizable UI elements like the &lt;strong&gt;Account Manager, Depth of Market, and Order Preview&lt;/strong&gt; widgets, which are designed to present complex financial data in a clear, actionable form[14].&lt;/p&gt;

&lt;h3&gt;
  
  
  1.3 Purpose and Objectives of the Visual Representation
&lt;/h3&gt;

&lt;p&gt;The primary purpose of the proposed visual representation is to create a compelling, single-image showcase that communicates the core value propositions and AI-driven architectures of both VocalScribe and TradeView. This image will function as a key portfolio asset, intended to visually distill Aniruddha Adak’s technical expertise and innovative approach for a diverse audience including potential collaborators, employers, and end-users[2][13]. The design must achieve a balance between &lt;strong&gt;technical accuracy&lt;/strong&gt; in depicting AI workflows and &lt;strong&gt;aesthetic appeal&lt;/strong&gt; to ensure immediate engagement and comprehension. It is not merely an illustration but a strategic communication tool designed to highlight the transformative role of AI in democratizing advanced tools for content creation and financial analysis.&lt;/p&gt;

&lt;p&gt;Consequently, the visual composition must make the AI components of each project immediately apparent through the use of appropriate metaphors and design elements, such as neural network diagrams for VocalScribe and predictive trend lines for TradeView[4][8]. The objectives extend to demonstrating the &lt;strong&gt;user experience&lt;/strong&gt; and &lt;strong&gt;interface design&lt;/strong&gt; principles embodied in both platforms, from VocalScribe’s collaborative editing features to TradeView’s customizable trading widgets[10][14]. By successfully integrating these facets, the final image will underscore Adak’s overarching mission of leveraging cutting-edge technology to build accessible, powerful tools that solve real-world problems, thereby providing a holistic view of his contributions to the field of applied artificial intelligence[12].&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Overview of the AI Projects
&lt;/h2&gt;

&lt;p&gt;The AI projects developed by Aniruddha Adak, VocalScribe and TradeView, represent sophisticated applications of artificial intelligence and machine learning technologies to distinct domains: content creation and financial market analysis[2]. These platforms are built upon his expertise in full-stack development and his focus on creating practical, user-centric tools that leverage advanced algorithms[13]. VocalScribe is designed to transform spoken language into structured written content, while TradeView provides analytical tools for interpreting complex financial data[4][7]. Together, they exemplify a commitment to applying AI to streamline workflows and enhance decision-making processes for professionals and enthusiasts alike[12].&lt;/p&gt;

&lt;h3&gt;
  
  
  2.1 VocalScribe: AI-Powered Transcription and Content Generation
&lt;/h3&gt;

&lt;p&gt;VocalScribe is a comprehensive platform that utilizes artificial intelligence to convert voice recordings into accurate text and subsequently into polished written content such as blog posts, notes, and study materials[4][11]. The core functionality revolves around an advanced speech-to-text engine capable of real-time transcription, which is augmented by natural language processing for intelligent formatting and content enhancement[5][6]. To accommodate diverse user scenarios, the platform supports multiple specialized recording modes, including &lt;strong&gt;Quick Thoughts&lt;/strong&gt; for rapid idea capture, &lt;strong&gt;Deep Dive&lt;/strong&gt; for detailed monologues, and &lt;strong&gt;Interview Mode&lt;/strong&gt; for multi-speaker dialogues[4]. This structured approach ensures the AI can contextually process audio for optimal output quality.&lt;/p&gt;

&lt;p&gt;Beyond basic transcription, VocalScribe incorporates a suite of features designed to refine the audio input and the resulting text[5]. These include &lt;strong&gt;noise reduction&lt;/strong&gt; capabilities to isolate speech from background interference and &lt;strong&gt;voice command integration&lt;/strong&gt; for hands-free control during recording sessions[5]. The editing interface provides smart formatting tools and allows users to highlight important segments within the transcript for easy reference[4]. Furthermore, the platform emphasizes collaboration, enabling multiple users to share, comment on, and edit transcripts in real-time, which facilitates team-based content creation[5][11].&lt;/p&gt;

&lt;p&gt;The technical architecture of VocalScribe is built as a modern, responsive web application utilizing a stack that includes &lt;strong&gt;React&lt;/strong&gt;, &lt;strong&gt;TypeScript&lt;/strong&gt;, and &lt;strong&gt;Tailwind CSS&lt;/strong&gt; for the frontend, ensuring a beautiful and interactive user experience[10]. For its core AI transcription service, it integrates the &lt;strong&gt;AssemblyAI API&lt;/strong&gt;, which provides robust and accurate speech recognition capabilities[10]. The application offers a fully responsive design, dark and light mode themes, and features such as animated backgrounds and social media sharing functionalities[10]. The final content can be exported in multiple formats, including PDF and DOCX, or published directly to various blogging platforms, completing an end-to-end content creation pipeline[4][11].&lt;/p&gt;
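&lt;p&gt;The AssemblyAI integration described above can be sketched as a small request builder. The endpoint and authorization header below follow AssemblyAI’s public v2 REST API; the function name, the surrounding types, and every other detail are illustrative assumptions rather than VocalScribe’s actual code.&lt;/p&gt;

```typescript
// Hypothetical sketch of how a VocalScribe-style client might build an
// AssemblyAI v2 transcription request. The endpoint URL and the
// "authorization" header follow AssemblyAI's public REST API; the helper
// itself is an illustrative assumption, not the project's real code.
interface TranscriptRequest {
  url: string;
  method: "POST";
  headers: { authorization: string; "content-type": string };
  body: string;
}

function buildTranscriptRequest(apiKey: string, audioUrl: string): TranscriptRequest {
  return {
    url: "https://api.assemblyai.com/v2/transcript",
    method: "POST",
    headers: {
      authorization: apiKey,
      "content-type": "application/json",
    },
    // audio_url points at audio previously uploaded or publicly reachable.
    body: JSON.stringify({ audio_url: audioUrl }),
  };
}
```

&lt;p&gt;A client would pass the resulting object to its HTTP layer and then poll the returned transcript ID until processing completes, which is the usual pattern for asynchronous speech-to-text services.&lt;/p&gt;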

&lt;h3&gt;
  
  
  2.2 TradeView: AI-Driven Financial Market Analysis
&lt;/h3&gt;

&lt;p&gt;TradeView represents a suite of tools and indicators that apply artificial intelligence and machine learning to the domain of financial trading and market analysis[7][9]. The platform operates within the broader TradingView ecosystem, providing traders with data-driven insights, predictive analytics, and automated signal generation[8][9]. A key component is the &lt;strong&gt;AI Bot Regime Feed&lt;/strong&gt;, a stable technical indicator that analyzes multiple market data layers to identify prevailing market regimes and high-probability trading events[7]. This indicator generates structured JSON alerts in real-time, which are designed to be sent directly to external trading bots or automation systems via webhooks, enabling automated trade execution[7].&lt;/p&gt;
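&lt;p&gt;The structured JSON alerts described above could, for example, be validated on the receiving end before a bot acts on them. The field names and regime labels below are purely hypothetical, since the indicator’s actual alert schema is not documented in this report; the sketch only illustrates the webhook-to-bot pattern.&lt;/p&gt;

```typescript
// Hypothetical regime-feed alert shape; field names and regime labels are
// invented for illustration, not taken from the indicator's real schema.
interface RegimeAlert {
  symbol: string;
  regime: "trending" | "ranging" | "volatile";
  confidence: number; // expected in the range 0..1
  timestamp: number;  // unix epoch milliseconds
}

// Parse an incoming webhook body and reject malformed alerts, so a
// downstream trading bot never acts on partial or corrupted data.
function parseRegimeAlert(raw: string): RegimeAlert | null {
  const regimes = ["trending", "ranging", "volatile"];
  try {
    const data = JSON.parse(raw);
    if (typeof data.symbol !== "string") return null;
    if (!regimes.includes(data.regime)) return null;
    if (typeof data.confidence !== "number") return null;
    if (data.confidence > 1) return null;
    if (!(data.confidence >= 0)) return null;
    if (typeof data.timestamp !== "number") return null;
    return data as RegimeAlert;
  } catch {
    return null; // not valid JSON
  }
}
```

&lt;p&gt;Returning null for anything malformed keeps the automation fail-safe: an invalid payload results in no trade rather than a wrong one.&lt;/p&gt;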

&lt;p&gt;Another significant AI tool available is &lt;strong&gt;Sonar AI&lt;/strong&gt;, a Chrome extension that functions as an intelligent trading assistant[8]. This assistant analyzes live price charts to identify trends and subsequently provides users with &lt;strong&gt;risk-balanced trading plans&lt;/strong&gt; that include suggested stop-loss and take-profit levels[8]. Sonar AI aggregates order book data from multiple major exchanges to present a consolidated view of market supply and demand, aiming to offer a more accurate foundation for analysis[8]. The tool is designed to help traders validate their ideas by providing statistical data to inform risk management decisions[8].&lt;/p&gt;
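&lt;p&gt;At its core, the consolidated order-book view described above amounts to merging per-exchange books by price level. A minimal sketch, with invented types and no claim to match Sonar AI’s actual implementation:&lt;/p&gt;

```typescript
// A price level as [price, size]. Consolidating books from several
// exchanges means summing the sizes quoted at each price across venues.
type Level = [number, number];

function mergeBooks(books: Level[][]): Level[] {
  const totals = new Map(); // price -> aggregated size
  for (const book of books) {
    for (const [price, size] of book) {
      totals.set(price, (totals.get(price) ?? 0) + size);
    }
  }
  // Sort ascending by price for display in a depth-of-market style view.
  return Array.from(totals.entries()).sort((a, b) => a[0] - b[0]);
}
```

&lt;p&gt;In practice a real aggregator would also normalize tick sizes and symbols per venue before merging; the sketch shows only the aggregation step itself.&lt;/p&gt;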

&lt;p&gt;The TradingView interface itself, which hosts these AI tools, is built around a highly customizable and feature-rich charting library[15]. The interface is logically divided into main components, including a top toolbar for chart settings and indicator search, the central chart pane with price and time scales, and a widget bar on the right for features like watchlists and news feeds[15][16]. Critical UI elements for trading include the &lt;strong&gt;Account Manager&lt;/strong&gt;, which displays balance, equity, and positions; the &lt;strong&gt;Depth Of Market&lt;/strong&gt; window showing buy and sell orders; and the &lt;strong&gt;Order Ticket&lt;/strong&gt; with configurations for order duration, leverage, and bracket orders[14]. This integrated environment allows the AI-derived signals and analyses to be acted upon seamlessly within a professional trading workflow[14][15].&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Visual Design Elements
&lt;/h2&gt;

&lt;p&gt;The visual representation of Aniruddha Adak's dual AI projects necessitates a design approach that distinctly communicates their core functionalities and underlying artificial intelligence paradigms. For &lt;strong&gt;VocalScribe&lt;/strong&gt;, the design must encapsulate the seamless journey from audio input to polished written content, while for &lt;strong&gt;TradeView&lt;/strong&gt;, the focus shifts to the dynamic visualization of complex financial data and predictive analytics. The overarching creative direction must balance technical accuracy with aesthetic clarity, employing visual metaphors and interface mock-ups to make the sophisticated AI processes intuitively understandable to a diverse audience. A cohesive yet differentiated visual language is essential to illustrate the unique value propositions of each platform within a single, unified composition.&lt;/p&gt;

&lt;h3&gt;
  
  
  3.1 Design Elements for VocalScribe
&lt;/h3&gt;

&lt;p&gt;The visual design for VocalScribe should center on the audio-to-text transformation pipeline, employing a logical and user-centric layout to depict its workflow. Key interface components to be represented include a prominent recording module featuring microphone icons, which aligns with the platform's support for multiple recording modes such as Quick Thoughts, Deep Dive, and Interview Mode[4]. This input stage should visually transition into a transcription and editing panel, symbolizing the real-time conversion of speech into editable text through advanced AI algorithms[4][11]. The representation of the editing interface can include elements for smart formatting and content enhancement, highlighting features like pause and resume recording, segment highlighting, and noise reduction that contribute to the polished final output[5].&lt;/p&gt;

&lt;p&gt;To fully convey the platform's capabilities, the design should incorporate visual cues for its collaborative and multi-format features. A split-view or multi-user interface element can depict the real-time collaboration functionality, allowing users to share and work together on transcripts[5]. Furthermore, export and publishing options can be symbolized through icons representing various file formats such as PDF and DOCX, as well as direct publishing pathways to external platforms[4][11]. The aesthetic should reflect the application's modern technical architecture, built with React, TypeScript, and Tailwind CSS, suggesting a clean, responsive, and interactive user interface through sleek component design and animated transitions[10]. Incorporating a dark/light mode toggle icon would further emphasize the user-centric design philosophy embedded in the platform's development[10].&lt;/p&gt;

&lt;h3&gt;
  
  
  3.2 Design Elements for TradeView
&lt;/h3&gt;

&lt;p&gt;The visual design for TradeView must prioritize data density and analytical clarity, mirroring the platform's role in financial decision-making. Central to this representation is an interactive charting area, which forms the core of the TradingView interface and serves as the canvas for displaying AI-driven market analysis[15][16]. This chart should incorporate visual elements such as AI-generated trend lines, buy/sell signals, and highlighted optimal entry zones to represent the predictive capabilities of indicators like the Ultimate AI Trading System[9]. Overlaying or adjacent to the chart, a representation of the Sonar AI assistant—potentially as a chatbot interface or a data panel—can illustrate the provision of real-time, risk-balanced trading recommendations and market analysis[8].&lt;/p&gt;

&lt;p&gt;The design must integrate key TradingView UI widgets that facilitate trade management and market insight. A critical element is the Account Manager panel, which can be visualized to display critical account information such as balance, equity, and unrealized profit and loss, reflecting its function in providing a snapshot of the user's trading state[14]. The Depth of Market (DOM) widget, which shows buy and sell order volumes at different price levels, should be represented as a separate window or panel to emphasize market liquidity analysis[14]. Furthermore, an Order Ticket interface with configurable fields for stop-loss, take-profit, and leverage can be included to depict the trade execution process and integrated risk management tools[14]. The overall style should employ a high-contrast, data-driven aesthetic with color schemes intuitively aligned with financial markets to convey precision, real-time analysis, and the fast-paced trading environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. AI-Driven Features Highlight
&lt;/h2&gt;

&lt;p&gt;This section details the core artificial intelligence functionalities that underpin both VocalScribe and TradeView, transforming them from conventional applications into intelligent, adaptive platforms. These features are not merely supplementary but form the foundational architecture that drives automation, enhances accuracy, and delivers predictive insights[4][7]. The integration of sophisticated machine learning and natural language processing algorithms is what distinguishes these projects, enabling them to address complex challenges in content creation and financial market analysis with remarkable efficiency[6][8]. Consequently, the visual representation must explicitly and dynamically illustrate these AI-driven processes to communicate their transformative impact effectively.&lt;/p&gt;

&lt;h3&gt;
  
  
  4.1 VocalScribe: Intelligent Transcription and Content Generation
&lt;/h3&gt;

&lt;p&gt;At the heart of VocalScribe is its advanced &lt;strong&gt;AI-powered transcription engine&lt;/strong&gt;, which leverages cutting-edge speech-to-text algorithms to convert voice recordings into accurate, structured text in real-time[4][11]. This core functionality is powered by integrating with specialized APIs like AssemblyAI, ensuring high-fidelity conversion even from diverse audio inputs such as quick thoughts, deep dives, or interviews[4][10]. The platform's AI extends beyond basic transcription to include intelligent &lt;strong&gt;noise reduction&lt;/strong&gt; and &lt;strong&gt;voice command integration&lt;/strong&gt;, which actively clean audio inputs and allow for hands-free control during the recording process[5]. This creates a seamless and efficient workflow where spoken ideas are captured with minimal interference and maximum clarity, forming the raw material for further AI enhancement.&lt;/p&gt;

&lt;p&gt;Beyond transcription, VocalScribe employs &lt;strong&gt;natural language processing (NLP) and machine learning (ML) for content enhancement and generation&lt;/strong&gt;[6][11]. The platform's AI analyzes the transcribed text to provide smart formatting suggestions, apply customizable tones and styles, and ultimately assist in generating polished blog posts or study materials[6][11]. This transforms raw transcripts into publication-ready content, significantly accelerating the creative process for bloggers, students, and professionals[12]. Features such as &lt;strong&gt;collaborative editing&lt;/strong&gt; are also facilitated by AI, enabling multiple users to interact with and refine a document simultaneously in a shared intelligent workspace[5][11]. The culmination of these features positions VocalScribe as a comprehensive AI-augmented content creation suite.&lt;/p&gt;

&lt;h3&gt;
  
  
  4.2 TradeView: Predictive Analytics and Automated Market Intelligence
&lt;/h3&gt;

&lt;p&gt;TradeView incorporates AI to deliver &lt;strong&gt;predictive analytics and real-time market regime detection&lt;/strong&gt;, providing traders with data-driven insights that go beyond traditional technical analysis[7][9]. A pivotal feature is the &lt;strong&gt;AI Bot Regime Feed&lt;/strong&gt;, a stable indicator that synthesizes multiple technical layers to identify high-probability market conditions and generate structured, real-time JSON alerts for external trading bots or automation systems[7]. This allows for the automation of trade execution based on AI-identified signals, moving from manual chart interpretation to systematic, algorithm-driven strategies. Furthermore, various AI-powered indicators available on the platform, such as the &lt;strong&gt;Machine Learning Lorentzian Classification&lt;/strong&gt; and the &lt;strong&gt;Ultimate AI Trading System&lt;/strong&gt;, analyze trend direction, optimal entry zones, and calculate dynamic risk parameters like take-profit and stop-loss levels[9].&lt;/p&gt;

&lt;p&gt;Another significant AI-driven component is the &lt;strong&gt;Sonar AI assistant&lt;/strong&gt;, a Chrome extension that acts as an intelligent co-pilot for traders on the TradingView platform[8]. This tool analyzes charts in real-time to identify trends and provides &lt;strong&gt;risk-balanced trading plans&lt;/strong&gt; complete with statistical data to validate trade ideas and manage exposure[8]. Sonar AI also aggregates order book data from multiple major exchanges using AI to present a more accurate, consolidated view of market supply and demand, aiding in deeper liquidity analysis[8]. These AI features are integrated within TradeView's interface, which includes widgets like the &lt;strong&gt;Risk and reward calculator&lt;/strong&gt; that can utilize AI-derived data to help set bracket orders based on a percentage of the account balance[14]. Collectively, these functionalities underscore TradeView's role as a platform where AI enhances decision-making through predictive insights, automated signal generation, and sophisticated risk assessment.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. User Experience and Interface Representation
&lt;/h2&gt;

&lt;h3&gt;
  
  
  5.1 VocalScribe: Intuitive Audio-to-Text Workflow
&lt;/h3&gt;

&lt;p&gt;The user experience of VocalScribe is designed around a seamless, multi-stage workflow that guides the user from voice recording to polished written content. The interface initiates with a voice recording module, offering multiple tailored modes such as &lt;strong&gt;Quick Thoughts&lt;/strong&gt;, &lt;strong&gt;Deep Dive&lt;/strong&gt;, and &lt;strong&gt;Interview Mode&lt;/strong&gt; to accommodate different user scenarios and content creation needs[4]. This recording interface incorporates user-centric controls, including &lt;strong&gt;pause and resume&lt;/strong&gt; functionality and voice command integration, allowing for a flexible and non-linear capture of ideas[5]. Following recording, the platform presents a dedicated editing interface where the &lt;strong&gt;real-time transcription&lt;/strong&gt; is displayed, enabling immediate review and refinement of the AI-generated text[4]. This central workspace is the core of the user experience, facilitating the transformation of raw transcriptions into structured documents.&lt;/p&gt;

&lt;p&gt;The platform's interface further enhances productivity through smart formatting tools and collaborative features. Users can interact with the transcription through functionalities such as &lt;strong&gt;highlighting segments&lt;/strong&gt; for later focus and adding interactive comments, which fosters an iterative editing process[5][10]. The collaborative editing capability is visually represented through features that allow multiple users to share and work on documents in real-time, emphasizing the tool's utility for team-based projects[5][11]. To accommodate prolonged usage and personal preference, the interface includes a &lt;strong&gt;dark/light mode&lt;/strong&gt; toggle, reducing eye strain and providing aesthetic customization[10]. The culmination of the workflow is represented by export and publishing panels, where users can format content into various document types and publish directly to external platforms, completing the end-to-end content creation journey[4][11].&lt;/p&gt;

&lt;h3&gt;
  
  
  5.2 TradeView: Data-Centric Trading Dashboard
&lt;/h3&gt;

&lt;p&gt;The TradeView platform delivers a comprehensive user experience centered on the real-time visualization of financial data and the execution of trades through a highly customizable interface. A fundamental component of this interface is the &lt;strong&gt;Account Manager&lt;/strong&gt;, a configurable panel typically located at the bottom of the screen that provides a consolidated view of the user's trading capital and performance[14]. This manager includes an &lt;strong&gt;Account Summary Row&lt;/strong&gt; that displays critical metrics such as account balance, equity, and unrealized profit and loss, offering traders an immediate snapshot of their financial standing[14]. The interface is logically organized into distinct toolbars, with one dedicated to chart modification tools and settings, another for drawing analytical tools, and a third for market overviews, watchlists, and news feeds, creating an intuitive layout for navigation[16].&lt;/p&gt;

&lt;p&gt;For trade execution and risk management, the platform features sophisticated UI elements like the &lt;strong&gt;Order Ticket&lt;/strong&gt; and related widgets. The Order Ticket interface integrates advanced controls for setting &lt;strong&gt;stop-loss and take-profit&lt;/strong&gt; levels, which can be configured as a percentage of the account balance or in the account currency, embodying the platform's integrated risk management capabilities[14]. An &lt;strong&gt;Order Preview&lt;/strong&gt; feature allows users to review the details of a potential trade, including estimated commissions and margin requirements, without actually placing the order, thereby preventing errors and enhancing decision confidence[14]. Furthermore, the &lt;strong&gt;Depth of Market (DOM)&lt;/strong&gt; widget is represented as a dynamic window that visualizes the market's supply and demand by displaying the volume of open buy and sell orders at different price levels, providing crucial liquidity insights for trading decisions[14][8]. This aggregation of real-time data, account information, and analytical tools into a single, cohesive interface is designed to empower traders with the information necessary for informed strategy execution[15].&lt;/p&gt;
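&lt;p&gt;The percent-of-balance risk configuration described above reduces to simple position-sizing arithmetic. The following sketch uses invented names and is not TradingView’s implementation; it only makes the calculation behind such an Order Ticket concrete:&lt;/p&gt;

```typescript
// Given an account balance, the fraction of it the trader is willing to
// lose, an entry price, and a stop-loss price, compute the position size
// and a take-profit level at a chosen reward-to-risk multiple.
interface BracketOrder {
  quantity: number;
  stopLoss: number;
  takeProfit: number;
}

function sizeBracketOrder(
  balance: number,
  riskFraction: number, // e.g. 0.01 risks 1% of the balance
  entry: number,
  stopLoss: number,
  rewardToRisk: number, // e.g. 2 targets twice the risked distance
): BracketOrder {
  const riskPerUnit = Math.abs(entry - stopLoss);
  const quantity = (balance * riskFraction) / riskPerUnit;
  // Long if the stop sits below entry, short otherwise.
  const direction = entry > stopLoss ? 1 : -1;
  const takeProfit = entry + direction * riskPerUnit * rewardToRisk;
  return { quantity, stopLoss, takeProfit };
}
```

&lt;p&gt;For example, risking 1% of a 10,000 account on a long entered at 100 with a stop at 95 yields a 20-unit position, and a 2:1 reward-to-risk multiple places the take-profit at 110.&lt;/p&gt;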

&lt;h2&gt;
  
  
  6. Creative Direction and Style
&lt;/h2&gt;

&lt;p&gt;The creative direction for the visual representation of Aniruddha Adak's projects necessitates a bifurcated yet cohesive approach, with each platform's distinct purpose and user base informing its unique stylistic treatment[2][4][7]. The overarching objective is to create a modern and professional visual language that immediately communicates the AI-driven nature of both VocalScribe and TradeView while differentiating their core functions through targeted aesthetic choices[10][14]. This involves the strategic application of color theory, typography, and visual metaphors to symbolize creativity and analytical precision, respectively[17]. The final composition must unify these distinct styles under a shared futuristic theme that underscores Adak's expertise in developing advanced AI systems[1][13].&lt;/p&gt;

&lt;h3&gt;
  
  
  6.1 VocalScribe: Minimalist and Creative Aesthetics
&lt;/h3&gt;

&lt;p&gt;The creative direction for VocalScribe should embody a clean, minimalist, and accessible design philosophy to reflect its mission of empowering users to transform spoken ideas into written content[12]. A palette of soft gradients and pastel tones is recommended to evoke a sense of creativity, clarity, and user-friendliness, aligning with the platform's goal of simplifying content creation[6][17]. The interface's inherent features, such as its dark/light mode toggle and animated backgrounds, provide a foundation for visual representation that suggests adaptability and a smooth, engaging user experience[10]. To visually denote its AI-powered core, subtle dynamic elements like floating text particles or ethereal neural network patterns can be integrated into the design, symbolizing real-time transcription and intelligent content generation[4][11]. This combination of a serene color scheme, clean layouts, and a hint of algorithmic animation creates an aesthetic that is both inviting and technologically sophisticated, appealing directly to its target audience of students, creators, and professionals[5][12].&lt;/p&gt;

&lt;h3&gt;
  
  
  6.2 TradeView: Data-Driven and Analytical Precision
&lt;/h3&gt;

&lt;p&gt;In stark contrast, the creative direction for TradeView must adopt a high-contrast, data-intensive visual style that mirrors the analytical and fast-paced nature of financial markets[14][16]. The use of geometric shapes, sharp lines, and a structured layout is essential to represent the platform's complex analytical widgets and charting tools[15]. Color application should be strictly functional, employing established financial conventions such as green to signify gains and red to indicate losses, thereby ensuring immediate data readability and reinforcing the platform's purpose[16][17]. Glowing indicators on charts and within UI elements like the Account Manager can be used to create a sense of urgency, real-time activity, and algorithmic precision, highlighting features like the AI Bot Regime Feed and Sonar AI assistant[7][8][9]. This aesthetic, centered on clarity, contrast, and dynamic visual data, projects an image of reliability and cutting-edge analytical power tailored for traders and investors[14][15].&lt;/p&gt;

&lt;h3&gt;
  
  
  6.3 Cohesive Visual Integration
&lt;/h3&gt;

&lt;p&gt;Achieving a unified visual representation of both projects requires a masterful integration of their divergent styles under a consistent design framework[2][13]. This can be accomplished through the use of shared typography, a harmonized iconography system, and a balanced layout that provides clear visual separation between the two platforms while maintaining an overall compositional harmony[17]. The futuristic theme, suggested by integrated circuit patterns or abstract data streams in the background, serves as a common thread that links both sides of the image, continuously reminding the viewer of the underlying AI architecture powering each project[1][3]. The central positioning of Aniruddha Adak's stylized representation further solidifies this cohesion, establishing him as the unifying architect behind both the creative utility of VocalScribe and the analytical rigor of TradeView[13]. Ultimately, this integrated creative direction ensures the image communicates a dual narrative of innovative AI application across different domains without visual confusion, effectively showcasing the breadth of Adak's technical expertise[2][10].&lt;/p&gt;

&lt;h2&gt;
  
  
  7. Final Image Composition
&lt;/h2&gt;

&lt;h3&gt;
  
  
  7.1 Overall Layout and Central Figure
&lt;/h3&gt;

&lt;p&gt;The proposed final image will employ a definitive split-screen layout to create a clear visual dichotomy between the two distinct AI projects[10][14]. The &lt;strong&gt;VocalScribe&lt;/strong&gt; platform will be positioned on the left side of the composition, while the &lt;strong&gt;TradeView&lt;/strong&gt; platform will occupy the right side, establishing a balanced and organized presentation of their respective domains[11][16]. Centrally positioned between these two halves will be a representation of Aniruddha Adak, symbolizing his foundational role as the architect and developer behind both innovative systems[2][13]. This central figure can be rendered as a stylized silhouette or avatar, integrated with subtle background elements such as circuit patterns or flowing data streams to reinforce the overarching theme of AI and technical expertise[3]. The composition must maintain a cohesive visual language across both sides, utilizing consistent typography and a harmonious color palette to unify the representation of Adak’s portfolio while ensuring each project’s unique identity remains visually distinct[17].&lt;/p&gt;

&lt;h3&gt;
  
  
  7.2 Visual Depiction of the VocalScribe Workflow
&lt;/h3&gt;

&lt;p&gt;The left side of the image, dedicated to VocalScribe, will visually narrate the platform’s core audio-to-text transformation pipeline[4][11]. The depiction will begin with an iconographic representation of voice input, such as a microphone, associated with the platform’s multiple recording modes including &lt;strong&gt;Quick Thoughts&lt;/strong&gt;, &lt;strong&gt;Deep Dive&lt;/strong&gt;, and &lt;strong&gt;Interview Mode&lt;/strong&gt;[4][5]. This input element will transition visually into a neural network or processing diagram, symbolizing the advanced speech-to-text and natural language processing AI that powers the real-time transcription and &lt;strong&gt;AI-enhanced content editing&lt;/strong&gt;[6][10]. The final stage of this workflow will be represented by a polished blog post or document layout, showcasing the output of the platform’s content generation capabilities[11]. Supplementary visual elements can include a waveform with filtering effects to denote &lt;strong&gt;noise reduction&lt;/strong&gt;, and a split-screen view showing multiple cursors to indicate the &lt;strong&gt;real-time collaborative editing&lt;/strong&gt; features that allow users to share and work together[5][10]. The interface details, such as a dark/light mode toggle and social media sharing icons, should be included to accurately reflect the &lt;strong&gt;user-centric design&lt;/strong&gt; and interactive functionality of the actual application[10][12].&lt;/p&gt;

&lt;h3&gt;
  
  
  7.3 Visual Depiction of the TradeView Analytics Platform
&lt;/h3&gt;

&lt;p&gt;The right side of the composition will focus on TradeView, emphasizing its role in financial market analysis through data visualization and AI-driven tools[7][15]. The central visual component will be a dynamic stock chart featuring AI-generated trend lines and predictive markers, illustrating the platform’s capacity for real-time technical analysis[9]. Integrated with this chart should be representations of key AI indicators and tools, such as the &lt;strong&gt;AI Bot Regime Feed&lt;/strong&gt;, which identifies market regimes and generates structured alerts, and the &lt;strong&gt;Sonar AI&lt;/strong&gt; assistant, which provides risk-balanced trading recommendations[7][8]. The image should also incorporate depictions of essential TradingView UI elements and widgets that facilitate user decision-making, such as the &lt;strong&gt;Account Manager&lt;/strong&gt; panel displaying balance and equity, the &lt;strong&gt;Depth of Market&lt;/strong&gt; window, and an &lt;strong&gt;Order Preview&lt;/strong&gt; interface with &lt;strong&gt;risk and reward&lt;/strong&gt; calculators[14]. To convey the comprehensive nature of the trading interface, visual references to the top toolbar for modifying charts and searching for indicators, as well as the widget bar for watchlists and news, can be included in the background or periphery of the TradeView section[15][16]. This aggregation of analytical charts, AI tools, and functional widgets will collectively underscore TradeView’s identity as a sophisticated, AI-powered platform for financial strategy and execution.&lt;/p&gt;

&lt;h2&gt;
  
  
  8. Conclusion
&lt;/h2&gt;

&lt;p&gt;The proposed visual representation serves as a culminating synthesis of Aniruddha Adak's technical proficiency and innovative vision, effectively encapsulating the transformative potential of his AI-driven projects, VocalScribe and TradeView. By meticulously integrating specific AI functionalities, user-centric design principles, and a coherent aesthetic strategy, the image transcends mere graphical depiction to become a narrative of technological empowerment. It communicates Adak's foundational expertise as an &lt;strong&gt;AI Agent Engineer&lt;/strong&gt; and &lt;strong&gt;full-stack developer&lt;/strong&gt;, skills that underpin the creation of both platforms[1][2]. The design acts as a professional portfolio centerpiece, designed to appeal to a diverse audience of creators, professionals, and traders by making complex AI operations intuitively understandable and visually engaging. This concluding section will delineate how the visual composition successfully consolidates the core themes of AI innovation, practical application, and design intentionality that define Adak's work.&lt;/p&gt;

&lt;h3&gt;
  
  
  8.1 Synthesis of Technical Expertise and AI Innovation
&lt;/h3&gt;

&lt;p&gt;The visual representation fundamentally succeeds in distilling Aniruddha Adak's multifaceted technical background into a single, coherent image. His foundational skills in &lt;strong&gt;React, TypeScript, and Tailwind CSS&lt;/strong&gt;, which form the bedrock of the VocalScribe application, are implicitly represented through the clean, modern interface elements depicted for that project[2][10]. Concurrently, the depiction of TradeView's analytical widgets and charts reflects an understanding of financial data systems and real-time processing, areas demanding robust backend and integration capabilities. Central to this synthesis is the prominent highlighting of &lt;strong&gt;AI and machine learning&lt;/strong&gt; as the unifying thread, from speech-to-text algorithms to predictive market analytics. The image positions Adak not merely as a developer but as an innovator leveraging AI to build self-directed systems and solve domain-specific problems, a focus evident in his participation in AI-centric challenges and development of agent-based engineering concepts[1]. This visual narrative of expertise reinforces his academic credentials and practical achievements, creating a compelling professional identity.&lt;/p&gt;

&lt;p&gt;Furthermore, the composition strategically bridges two distinct application domains—content creation and financial trading—through the common language of AI. This duality showcases Adak's capacity to apply similar technological principles, such as real-time data processing and pattern recognition, to vastly different user needs and industrial contexts. The representation avoids a generic portrayal of AI by grounding it in the tangible, cited features of each platform: VocalScribe's &lt;strong&gt;real-time transcription&lt;/strong&gt; and &lt;strong&gt;noise reduction&lt;/strong&gt; versus TradeView's &lt;strong&gt;regime detection&lt;/strong&gt; and &lt;strong&gt;structured JSON alerts&lt;/strong&gt;[4][5][7]. By doing so, it validates the projects as serious, functional tools rather than conceptual prototypes. The inclusion of collaborative features for VocalScribe and risk-management widgets for TradeView further underscores a deep consideration for real-world workflow integration, reflecting development informed by user experience and practical utility[5][14].&lt;/p&gt;

&lt;h3&gt;
  
  
  8.2 Core AI Capabilities and User Impact
&lt;/h3&gt;

&lt;p&gt;A paramount achievement of the visual design is its effective communication of the specific AI capabilities that define each project's value proposition. For VocalScribe, the image transitions the viewer through the &lt;strong&gt;AI-powered pipeline&lt;/strong&gt; from voice recording to polished written content, explicitly showcasing features like multiple recording modes (&lt;strong&gt;Quick Thoughts, Deep Dive, Interview Mode&lt;/strong&gt;), the editable transcription interface, and final publishing options[4][11]. This visual journey makes concrete the platform's mission to empower users by transforming over &lt;strong&gt;10 million minutes of audio&lt;/strong&gt; into structured notes and blog posts, highlighting its documented impact on students, professionals, and creators[12]. The representation of AI here is not abstract but tied directly to user actions—recording, editing, sharing—thereby demystifying the technology and presenting it as an accessible productivity enhancer.&lt;/p&gt;

&lt;p&gt;For TradeView, the visual emphasis on &lt;strong&gt;predictive analytics and automated decision-support&lt;/strong&gt; tools translates complex algorithmic outputs into comprehensible visual cues. The depiction of indicators such as the &lt;strong&gt;AI Bot Regime Feed&lt;/strong&gt;, which generates &lt;strong&gt;real-time, structured JSON alerts&lt;/strong&gt; for automated trading systems, and the &lt;strong&gt;Sonar AI assistant&lt;/strong&gt;, which provides &lt;strong&gt;risk-balanced stop-loss/take-profit plans&lt;/strong&gt;, concretizes the platform's analytical sophistication[7][8]. By illustrating elements like the &lt;strong&gt;Depth of Market&lt;/strong&gt; widget and &lt;strong&gt;Order Preview&lt;/strong&gt; functionality, the image connects AI-driven analysis to executable trading actions, emphasizing the platform's role in enhancing market decision-making[14]. This segment of the visual communicates that TradeView's AI is not for passive observation but for active strategy formulation and risk management, appealing directly to traders' needs for precision and actionable intelligence in volatile markets.&lt;/p&gt;

&lt;h3&gt;
  
  
  8.3 Visual Design as a Conduit for Technical Narrative
&lt;/h3&gt;

&lt;p&gt;The final composition's efficacy hinges on its deliberate application of visual design principles to narrate technical functionality. The adopted &lt;strong&gt;split-screen layout&lt;/strong&gt; with a central focal point on Aniruddha Adak creates a balanced, logical structure that allows for direct comparison and contrast between the two projects' domains. The use of &lt;strong&gt;color harmonies and schemes&lt;/strong&gt;—potentially employing analogous or complementary palettes as derived from color theory principles—serves a functional purpose beyond aesthetics[17]. For instance, employing a &lt;strong&gt;clean, minimalist palette&lt;/strong&gt; for VocalScribe evokes creativity and clarity, while utilizing &lt;strong&gt;high-contrast, market-aligned colors&lt;/strong&gt; (e.g., greens and reds) for TradeView instantly conveys financial data semantics and urgency. This intentional use of color visually codes the different user experiences and cognitive modes associated with content creation versus financial trading.&lt;/p&gt;

&lt;p&gt;Moreover, the incorporation of specific &lt;strong&gt;UI elements and widgets&lt;/strong&gt;, accurately modeled on the TradingView charting library and VocalScribe's own interface, ensures the representation maintains authenticity and functional credibility[14][15][10]. Elements like the &lt;strong&gt;Account Summary Row&lt;/strong&gt;, &lt;strong&gt;toolbars&lt;/strong&gt; for chart analysis, and the &lt;strong&gt;interactive transcription editor&lt;/strong&gt; are not generic placeholders but referents to actual, documented features that users engage with[14][16]. The &lt;strong&gt;futuristic yet professional aesthetic&lt;/strong&gt;, achieved through subtle animations and geometric shapes, successfully symbolizes the cutting-edge nature of the underlying AI technologies without sacrificing clarity. Ultimately, this thoughtful design synthesis ensures the image fulfills its primary objective: to serve as an immediate, engaging, and informative showcase that articulates the sophistication, utility, and human-centric design of Aniruddha Adak's AI-powered contributions.&lt;/p&gt;

&lt;p&gt;Citations:&lt;br&gt;
[1] &lt;a href="https://dev.to/aniruddhaadak"&gt;https://dev.to/aniruddhaadak&lt;/a&gt;&lt;br&gt;
[2] &lt;a href="https://dev.to/ha3k/my-aniruddha-adak-bio-4h9g"&gt;https://dev.to/ha3k/my-aniruddha-adak-bio-4h9g&lt;/a&gt;&lt;br&gt;
[3] &lt;a href="https://aniruddhaadak.devdojo.com" rel="noopener noreferrer"&gt;https://aniruddhaadak.devdojo.com&lt;/a&gt;&lt;br&gt;
[4] &lt;a href="https://vocalscribe.xyz/how-it-works" rel="noopener noreferrer"&gt;https://vocalscribe.xyz/how-it-works&lt;/a&gt;&lt;br&gt;
[5] &lt;a href="https://vocalscribe.xyz/features" rel="noopener noreferrer"&gt;https://vocalscribe.xyz/features&lt;/a&gt;&lt;br&gt;
[6] &lt;a href="https://creati.ai/ai-tools/vocalscribe/" rel="noopener noreferrer"&gt;https://creati.ai/ai-tools/vocalscribe/&lt;/a&gt;&lt;br&gt;
[7] &lt;a href="https://in.tradingview.com/scripts/ai/" rel="noopener noreferrer"&gt;https://in.tradingview.com/scripts/ai/&lt;/a&gt;&lt;br&gt;
[8] &lt;a href="https://www.youtube.com/watch?v=EYdPnYnu1eA" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=EYdPnYnu1eA&lt;/a&gt;&lt;br&gt;
[9] &lt;a href="https://www.youtube.com/watch?v=FRrgk0XToME" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=FRrgk0XToME&lt;/a&gt;&lt;br&gt;
[10] &lt;a href="https://github.com/AniruddhaAdak/VocalScribe" rel="noopener noreferrer"&gt;https://github.com/AniruddhaAdak/VocalScribe&lt;/a&gt;&lt;br&gt;
[11] &lt;a href="https://www.youtube.com/watch?v=P1fnfXn0njk" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=P1fnfXn0njk&lt;/a&gt;&lt;br&gt;
[12] &lt;a href="https://vocalscribe.xyz/about-us" rel="noopener noreferrer"&gt;https://vocalscribe.xyz/about-us&lt;/a&gt;&lt;br&gt;
[13] &lt;a href="https://aniruddhaadak.tech" rel="noopener noreferrer"&gt;https://aniruddhaadak.tech&lt;/a&gt;&lt;br&gt;
[14] &lt;a href="https://www.tradingview.com/broker-api-docs/trading/ui-elements/" rel="noopener noreferrer"&gt;https://www.tradingview.com/broker-api-docs/trading/ui-elements/&lt;/a&gt;&lt;br&gt;
[15] &lt;a href="https://www.tradingview.com/charting-library-docs/latest/ui_elements/" rel="noopener noreferrer"&gt;https://www.tradingview.com/charting-library-docs/latest/ui_elements/&lt;/a&gt;&lt;br&gt;
[16] &lt;a href="https://blackbull.com/en/support/beginners-guide-to-the-tradingview-interface/" rel="noopener noreferrer"&gt;https://blackbull.com/en/support/beginners-guide-to-the-tradingview-interface/&lt;/a&gt;&lt;br&gt;
[17] &lt;a href="https://sunywcc2ddesign.com/project-8-color-schemes" rel="noopener noreferrer"&gt;https://sunywcc2ddesign.com/project-8-color-schemes&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>ai</category>
      <category>tutorial</category>
      <category>bio</category>
    </item>
    <item>
      <title>Turn Your README into a Movie: Building a 3D Site Generator with Copilot CLI</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Mon, 16 Feb 2026 07:43:26 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/turn-your-readme-into-a-movie-building-a-3d-site-generator-with-copilot-cli-170j</link>
      <guid>https://dev.to/aniruddhaadak/turn-your-readme-into-a-movie-building-a-3d-site-generator-with-copilot-cli-170j</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/github-2026-01-21"&gt;GitHub Copilot CLI Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Lumina-CLI&lt;/strong&gt; is a developer tool that turns your boring README into a &lt;strong&gt;cinematic 3D landing page&lt;/strong&gt; right from your terminal.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvdq11jhixjpt3bb1n9yp.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvdq11jhixjpt3bb1n9yp.gif" alt="ImaDemoion"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Documentation is important, but building a landing page for every side project is tedious. Lumina solves this by analyzing your project structure (&lt;code&gt;README.md&lt;/code&gt; and &lt;code&gt;package.json&lt;/code&gt;) and generating a stunning, animated 3D website using &lt;strong&gt;Three.js&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;All with zero configuration. Just run &lt;code&gt;lumina&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Here is the generated output for my other project, &lt;code&gt;CloudCost-CLI&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://drive.google.com/file/d/1yN040jbue47FA4zmqLF_UKu6QPruB6Yp/view?usp=sharing" rel="noopener noreferrer"&gt;🎥 Watch the Demo Video&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;or,&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2FLumina-CLI%2Fraw%2Fmain%2FDemo.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2FLumina-CLI%2Fraw%2Fmain%2FDemo.gif" alt="Image descr==ption"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;and,&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Visit the GitHub repo: &lt;br&gt;


&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/Lumina-CLI" rel="noopener noreferrer"&gt;
        Lumina-CLI
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Lumina-CLI is a developer tool that turns your boring README into a cinematic 3D landing page right from your terminal.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;Lumina CLI&lt;/h1&gt;

&lt;/div&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://github.com/aniruddhaadak80/Lumina-CLI/Demo.gif"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Faniruddhaadak80%2FLumina-CLI%2FDemo.gif" alt="Demo"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Generate &lt;strong&gt;cinematic&lt;/strong&gt; landing pages from your terminal.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Lumina CLI analyzes your project and builds a stunning 3D website automatically.&lt;/p&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Features&lt;/h2&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;⚡ &lt;strong&gt;Zero Config&lt;/strong&gt;: Just run &lt;code&gt;lumina&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;🎨 &lt;strong&gt;Cinematic Themes&lt;/strong&gt;: High-end 3D visuals&lt;/li&gt;
&lt;li&gt;🚀 &lt;strong&gt;Fast&lt;/strong&gt;: Generates static HTML in milliseconds&lt;/li&gt;
&lt;li&gt;🔮 &lt;strong&gt;AI-Powered&lt;/strong&gt;: Smart analysis of your project structure&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Installation&lt;/h2&gt;

&lt;/div&gt;

&lt;pre&gt;&lt;code&gt;npm install -g lumina-cli&lt;/code&gt;&lt;/pre&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Usage&lt;/h2&gt;

&lt;/div&gt;
&lt;p&gt;Navigate to your project folder and run:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;lumina&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;



&lt;/div&gt;
&lt;br&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/Lumina-CLI" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;br&gt;
&lt;/div&gt;




&lt;br&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  How It Works
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Scan&lt;/strong&gt;: Reads &lt;code&gt;README.md&lt;/code&gt; (Markdown) and &lt;code&gt;package.json&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Analyze&lt;/strong&gt;: Extracts title, description, features, and tech stack.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Synthesize&lt;/strong&gt;: Injects this data into a &lt;code&gt;cinematic.js&lt;/code&gt; template powered by Three.js.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Render&lt;/strong&gt;: Outputs a static HTML file ready to deploy.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  My Experience with GitHub Copilot CLI
&lt;/h2&gt;

&lt;p&gt;I used &lt;strong&gt;GitHub Copilot CLI&lt;/strong&gt; to help me bridge the gap between Node.js logic and 3D graphics.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Three.js Integration
&lt;/h3&gt;

&lt;p&gt;I'm not a 3D expert. I asked Copilot:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Create a Three.js scene with floating particles and a rotating icosahedron for a background effect"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Copilot generated the entire &lt;code&gt;animate()&lt;/code&gt; loop and particle system code for the template. I just had to tweak the colors to match the "Cyberpunk" theme.&lt;/p&gt;
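&lt;p&gt;Stripped of the Three.js scaffolding, the core of that &lt;code&gt;animate()&lt;/code&gt; loop is a per-frame rotation update. The sketch below is deliberately dependency-free: plain objects stand in for &lt;code&gt;THREE.Mesh&lt;/code&gt; instances, and the speed constants are made up for illustration.&lt;/p&gt;

```javascript
// Per-frame update in the spirit of the generated animate() loop.
// "mesh" and "particles" stand in for THREE.Mesh / THREE.Points;
// only their .rotation properties are touched here.
const ROT_X = 0.004; // illustrative speeds, not Lumina's actual values
const ROT_Y = 0.006;

function stepScene(mesh, particles) {
  mesh.rotation.x += ROT_X;            // slow tumble of the icosahedron
  mesh.rotation.y += ROT_Y;
  particles.rotation.y -= ROT_Y / 4;   // particle field drifts the other way
}

// In the browser template this would run once per frame:
// function animate() {
//   requestAnimationFrame(animate);
//   stepScene(icosahedron, points);
//   renderer.render(scene, camera);
// }
```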

&lt;h3&gt;
  
  
  2. Markdown Parsing
&lt;/h3&gt;

&lt;p&gt;For the analyzer, I needed to parse the README creatively. I asked:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"How to use marked lexer to extract the first h1 and the feature list from markdown?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Copilot provided the exact AST traversal logic to pluck out the relevant metadata without needing a complex regex mess.&lt;/p&gt;
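&lt;p&gt;For anyone curious what that traversal looks like: &lt;code&gt;marked.lexer()&lt;/code&gt; returns a flat token stream in which heading tokens carry a &lt;code&gt;depth&lt;/code&gt; and list tokens carry an &lt;code&gt;items&lt;/code&gt; array. The sketch below hand-builds tokens of that shape so it runs without &lt;code&gt;marked&lt;/code&gt; installed; it approximates the logic and is not Copilot's verbatim output.&lt;/p&gt;

```javascript
// Walk a marked-style token stream: pick the first depth-1 heading as the
// title and the first list's item texts as the feature list.
function extractMeta(tokens) {
  const h1 = tokens.find(function (t) {
    return t.type === "heading" ? t.depth === 1 : false;
  });
  const list = tokens.find(function (t) {
    return t.type === "list";
  });
  return {
    title: h1 ? h1.text : null,
    features: list ? list.items.map(function (item) { return item.text; }) : [],
  };
}

// With marked installed, the tokens would come from: marked.lexer(readmeText)
const tokens = [
  { type: "heading", depth: 1, text: "Lumina CLI" },
  { type: "paragraph", text: "Generate cinematic landing pages." },
  { type: "list", items: [{ text: "Zero Config" }, { text: "Fast" }] },
];
```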

&lt;h3&gt;
  
  
  3. CLI UX
&lt;/h3&gt;

&lt;p&gt;Copilot suggested using &lt;code&gt;gradient-string&lt;/code&gt; and &lt;code&gt;boxen&lt;/code&gt; to make the CLI output look as premium as the website it generates. It even laid out the welcome message logic.&lt;/p&gt;
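&lt;p&gt;To show the idea without pulling in either package, here is a stdlib-only stand-in for that welcome banner: a single ANSI color fakes the gradient and Unicode box-drawing characters fake &lt;code&gt;boxen&lt;/code&gt;. The real CLI uses the packages themselves, so treat this purely as a sketch of the effect.&lt;/p&gt;

```javascript
// Stdlib-only approximation of a boxen + gradient-string welcome banner.
// One magenta ANSI escape stands in for the gradient.
function banner(text) {
  const inner = "  " + text + "  ";
  const bar = "─".repeat(inner.length);
  return [
    "\u001b[35m╭" + bar + "╮",
    "│" + inner + "│",
    "╰" + bar + "╯\u001b[0m",
  ].join("\n");
}

console.log(banner("Lumina CLI"));
```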

&lt;p&gt;Lumina-CLI wouldn't have the same "wow" factor without Copilot's help in polishing both the visual output and the terminal experience.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Generated with ❤️ by &lt;a href="https://dev.to/aniruddhaadak"&gt;Aniruddha&lt;/a&gt; using Lumina-CLI&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>githubchallenge</category>
      <category>cli</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>I Asked Google Antigravity to Build My "Perfect" Portfolio (It Went a Little Crazy... 56 Sections Crazy 🤯)</title>
      <dc:creator>ANIRUDDHA  ADAK</dc:creator>
      <pubDate>Mon, 09 Feb 2026 21:29:10 +0000</pubDate>
      <link>https://dev.to/aniruddhaadak/i-asked-google-antigravity-to-build-my-perfect-portfolio-it-went-a-little-crazy-56-sections-5fno</link>
      <guid>https://dev.to/aniruddhaadak/i-asked-google-antigravity-to-build-my-perfect-portfolio-it-went-a-little-crazy-56-sections-5fno</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/new-year-new-you-google-ai-2025-12-31"&gt;New Year, New You Portfolio Challenge Presented by Google AI&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  About Me
&lt;/h2&gt;

&lt;p&gt;I'm Aniruddha Adak, an AI Agent Engineer specializing in building autonomous systems that can think, plan, and execute. I believe in the power of &lt;strong&gt;"Authentic Imperfection"&lt;/strong&gt;: the idea that technology should feel human, approachable, and creative, not just sterile and efficient.&lt;/p&gt;

&lt;p&gt;With this portfolio, I wanted to break away from the standard corporate sleekness and create something that reflects my rigorous engineering background wrapped in a playful, hand-drawn aesthetic. It's a digital sketchbook where my code comes to life.&lt;/p&gt;

&lt;h2&gt;
  
  
  Portfolio
&lt;/h2&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://aniruddhaadak80.github.io/my-hand-drawn-portfolio-for-2026/" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;aniruddhaadak80.github.io&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Live Demo:&lt;/strong&gt; &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div&gt;
  &lt;iframe src="https://loom.com/embed/8b5458ebb1e44a478cb202424842bc72"&gt;
  &lt;/iframe&gt;
&lt;/div&gt;




&lt;p&gt;

&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://my-hand-drawn-portfolio-for-2026.vercel.app/" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;my-hand-drawn-portfolio-for-2026.vercel.app&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Source Code:&lt;/strong&gt; &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;

&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/aniruddhaadak80" rel="noopener noreferrer"&gt;
        aniruddhaadak80
      &lt;/a&gt; / &lt;a href="https://github.com/aniruddhaadak80/my-hand-drawn-portfolio-for-2026" rel="noopener noreferrer"&gt;
        my-hand-drawn-portfolio-for-2026
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      I am Aniruddha Adak | AI Agent Engineer - and this is my hand-drawn portfolio for 2026.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;Hand-Drawn AI Engineer Portfolio&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;&lt;strong&gt;A unique, hand-drawn style portfolio for an AI Agent Engineer, built with HTML, CSS, and Vanilla JavaScript.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;I am Aniruddha Adak | AI Agent Engineer - and this is my hand-drawn portfolio for 2026.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h3 class="heading-element"&gt;The Journey Through 56 Sections&lt;/h3&gt;
&lt;/div&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;1. Hero Section&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/fc06bfb0ae45f1c8c33bdb9df74956be2c2fc01fa18040141f907e389cc225f2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d69707369616267676a717174397739396c75742e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/fc06bfb0ae45f1c8c33bdb9df74956be2c2fc01fa18040141f907e389cc225f2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d69707369616267676a717174397739396c75742e706e67" alt="Imagption"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;2. Quotes That Inspire Me&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/d248977526b993e625b9352291bc098e34cd2546ea6927541ae3b37449a1caf9/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f3166336764347763786a66616f646f6c6f6933792e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/d248977526b993e625b9352291bc098e34cd2546ea6927541ae3b37449a1caf9/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f3166336764347763786a66616f646f6c6f6933792e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;3. About Me&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/8d28e4c7f86e99a00ebb9e1c41a8588cfbc8fc1d300ce620cf9a10cd745570f0/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f736868647076326f7470696f353762316b7166722e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/8d28e4c7f86e99a00ebb9e1c41a8588cfbc8fc1d300ce620cf9a10cd745570f0/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f736868647076326f7470696f353762316b7166722e706e67" alt="Image iption"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;4. Mission Statement&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/db081ab988c2a05b57b2b5ff6110f10268ed8fb022e24543b4974abf42fb482d/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7a6d74637572326d78383732666b703830716b392e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/db081ab988c2a05b57b2b5ff6110f10268ed8fb022e24543b4974abf42fb482d/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7a6d74637572326d78383732666b703830716b392e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;5. Metrics Counter&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/716692a996712cdac49a954af70d101c4e7e488ed1ff9e6c8c39aaa3c74827a7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d6f7776396f6874776937616f303733726e64682e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/716692a996712cdac49a954af70d101c4e7e488ed1ff9e6c8c39aaa3c74827a7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d6f7776396f6874776937616f303733726e64682e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;6. Technical Skills&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/546bde14cb8d29e722b25f444221ecd4abae4768ea67083ef8e2b42944405e4c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f68716e72736c6e68766c67307a626175317479332e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/546bde14cb8d29e722b25f444221ecd4abae4768ea67083ef8e2b42944405e4c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f68716e72736c6e68766c67307a626175317479332e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/6cd2aace7709adc2c6ec921435c1b00b5d393d827214c3a93470dff446050fb2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f683962397964726c6974677762726769686b737a2e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/6cd2aace7709adc2c6ec921435c1b00b5d393d827214c3a93470dff446050fb2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f683962397964726c6974677762726769686b737a2e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;7. Proficiency Bars&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/52b8cb68c06961290d02931f613611dfd29b9edfe6f25076f3e2b11c3a59eaed/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f637a6a6c76326c766969367632636133643167762e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/52b8cb68c06961290d02931f613611dfd29b9edfe6f25076f3e2b11c3a59eaed/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f637a6a6c76326c766969367632636133643167762e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;8. Education&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/6cd581ff54e63ef437c27fe5aa44890345ab1fda9185ca667fd2312a17ac20af/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7774687777333930786c7961716575346d7537752e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/6cd581ff54e63ef437c27fe5aa44890345ab1fda9185ca667fd2312a17ac20af/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7774687777333930786c7961716575346d7537752e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;9. Services&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/7dc4778bb4ca4ea14ff9152bc273894ed7c225cc3f21f532c6cf27fd895903ca/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6767716c7864767379676d6c6e7732706e7163662e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/7dc4778bb4ca4ea14ff9152bc273894ed7c225cc3f21f532c6cf27fd895903ca/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6767716c7864767379676d6c6e7732706e7163662e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/3d5f9776b200353a214197b19fdaef6251a6dce117c94412079467e955adcaa2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f677162716a6a626f653577793763647a346733392e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/3d5f9776b200353a214197b19fdaef6251a6dce117c94412079467e955adcaa2/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f677162716a6a626f653577793763647a346733392e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;10. Experience Timeline&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/89d3f619ec8304c0f995b151a2caf9690ef08bac53bba658ceecb35a475768a7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f666f346a6f706371786573616c35346c3531376c2e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/89d3f619ec8304c0f995b151a2caf9690ef08bac53bba658ceecb35a475768a7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f666f346a6f706371786573616c35346c3531376c2e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/faee97694abc00b8d3f3fb1fc5445de662a978208cb57cb32d64dd94f81720e7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f62737236796868303864683865306c7a63656f622e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/faee97694abc00b8d3f3fb1fc5445de662a978208cb57cb32d64dd94f81720e7/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f62737236796868303864683865306c7a63656f622e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;11. Selected Projects&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/3b408c2bbef2e34536a76918eb76d4e4f00ad70cf07cfed967860bedfc326687/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f636e6f6f31663739706e3974647673797873656a2e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/3b408c2bbef2e34536a76918eb76d4e4f00ad70cf07cfed967860bedfc326687/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f636e6f6f31663739706e3974647673797873656a2e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/beb5b6df3d6dcb0da71b3f76172c368c9e448f3df67d365ef4a29ddcb1357a7b/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f75307a79336377666379376d6d7a6264346b66312e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/beb5b6df3d6dcb0da71b3f76172c368c9e448f3df67d365ef4a29ddcb1357a7b/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f75307a79336377666379376d6d7a6264346b66312e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/14a0037dd477134849bc0390689e241d3cf3de00d92100bd3b657264e44eca84/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6135316b6f336a6f30746973663168756d6b7a692e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/14a0037dd477134849bc0390689e241d3cf3de00d92100bd3b657264e44eca84/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6135316b6f336a6f30746973663168756d6b7a692e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;12. Testimonials&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/78033f73cecd7dca578b39f26cedcfab02a7ff654a989a5d1c88a64ee6a6f7b9/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f303470686875733762316330797a61626d726e722e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/78033f73cecd7dca578b39f26cedcfab02a7ff654a989a5d1c88a64ee6a6f7b9/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f303470686875733762316330797a61626d726e722e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;13. Blog Posts&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/a6ea426107edaca15b2831fcd01bda06b9685cc09933e57307172f1ee2b965c5/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7436346138733634776d77346b6b796f6d6870312e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/a6ea426107edaca15b2831fcd01bda06b9685cc09933e57307172f1ee2b965c5/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7436346138733634776d77346b6b796f6d6870312e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;14. Fun Facts&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/968275711438d170e46fa46ece355cbbf48707b1c35ff7268f515d62fd719542/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d6c347a366379726836737a6234643779626a382e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/968275711438d170e46fa46ece355cbbf48707b1c35ff7268f515d62fd719542/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6d6c347a366379726836737a6234643779626a382e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;15. GitHub Stats&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/110654594a526f65fb980deba18b7ade4ab9f7cd449ce3157ef273499d1f1a0c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f337a7930717167677074656b6a73397332656e782e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/110654594a526f65fb980deba18b7ade4ab9f7cd449ce3157ef273499d1f1a0c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f337a7930717167677074656b6a73397332656e782e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;16. Contact &amp;amp; Social Footer&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/6e8c1d09fdab01750c880416eec2b18b2c3d65dfec8b547d542d9be9c788e448/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7139716f796c3530626165396d757972363271622e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/6e8c1d09fdab01750c880416eec2b18b2c3d65dfec8b547d542d9be9c788e448/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f7139716f796c3530626165396d757972363271622e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/ac7daf9275168e2bf5dab7bddd5e857eded45357d9e0684044472f8d0a96b73c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6a30697967356f6571766763736a666e6b3479692e706e67"&gt;&lt;img src="https://camo.githubusercontent.com/ac7daf9275168e2bf5dab7bddd5e857eded45357d9e0684044472f8d0a96b73c/68747470733a2f2f6465762d746f2d75706c6f6164732e73332e616d617a6f6e6177732e636f6d2f75706c6f6164732f61727469636c65732f6a30697967356f6571766763736a666e6b3479692e706e67" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;(...and 40+ more sections including Travel, Music, Goals, and Code Snippets!)&lt;/em&gt;&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;✨ Features&lt;/h2&gt;
&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;50+ Unique Sections&lt;/strong&gt;: Comprehensive details about skills, projects, experience, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hand-Drawn Aesthetic&lt;/strong&gt;: Custom CSS for wobbly borders, tape effects, and paper textures.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;15+ Scroll Animations&lt;/strong&gt;: Dynamic entrance effects including slide, bounce, rotate, and scale.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Interactive Elements&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Typewriter effect&lt;/li&gt;
&lt;li&gt;Drag-to-scroll skills&lt;/li&gt;
&lt;li&gt;…&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/aniruddhaadak80/my-hand-drawn-portfolio-for-2026" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;
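&lt;p&gt;&lt;em&gt;The typewriter effect from the Features list can be sketched in a few lines of vanilla JavaScript. This is a minimal illustration, not the portfolio's actual code; the function names, element handle, and delay value are all hypothetical.&lt;/em&gt;&lt;/p&gt;

```javascript
// Minimal typewriter sketch (hypothetical names, not the portfolio's real code).
// typewriterFrames is a pure helper: it returns every prefix of the text,
// one frame per character.
function typewriterFrames(text) {
  const frames = [];
  let i = 0;
  while (i !== text.length) {
    i = i + 1;
    frames.push(text.slice(0, i));
  }
  return frames;
}

// In the browser, a timer paints one frame at a time into an element.
function runTypewriter(el, text, delayMs) {
  const frames = typewriterFrames(text);
  let next = 0;
  const timer = setInterval(function () {
    el.textContent = frames[next];
    next = next + 1;
    if (next === frames.length) {
      clearInterval(timer);
    }
  }, delayMs);
}
```

&lt;p&gt;&lt;em&gt;Splitting the pure frame logic from the DOM timer keeps the animation easy to test and reuse.&lt;/em&gt;&lt;/p&gt;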








&lt;p&gt;&lt;em&gt;Now visit&lt;/em&gt;&lt;/p&gt;
&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
      &lt;div class="c-embed__body flex items-center justify-between"&gt;
        &lt;a href="https://my-hand-drawn-portfolio-for-2026.vercel.app/" rel="noopener noreferrer" class="c-link fw-bold flex items-center"&gt;
          &lt;span class="mr-2"&gt;my-hand-drawn-portfolio-for-2026.vercel.app&lt;/span&gt;
          

        &lt;/a&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;br&gt;
&lt;strong&gt;to see all 56 sections.&lt;/strong&gt;




&lt;blockquote&gt;
&lt;h2&gt;How I Built It&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;HTML5&lt;/strong&gt; + &lt;strong&gt;CSS3&lt;/strong&gt; + &lt;strong&gt;Vanilla JavaScript&lt;/strong&gt; + &lt;strong&gt;Google Fonts&lt;/strong&gt;&lt;/p&gt;
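&lt;p&gt;&lt;em&gt;As an example of what the vanilla JavaScript layer can do, the scroll-entrance animations mentioned in the Features list could be wired up with an IntersectionObserver. This is a hedged sketch only; the class names (.section, slide-in, and friends) are assumptions, not the site's actual selectors.&lt;/em&gt;&lt;/p&gt;

```javascript
// Sketch of scroll-entrance animations via IntersectionObserver.
// All class names here (.section, slide-in, ...) are hypothetical.
const ANIMATIONS = ["slide-in", "bounce-in", "rotate-in", "scale-in"];

// Pure helper: cycle through the classes so adjacent sections animate differently.
function animationClassFor(index) {
  return ANIMATIONS[index % ANIMATIONS.length];
}

// Browser wiring: add the animation class the first time a section scrolls into view.
function initScrollReveal() {
  const observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.classList.add(entry.target.dataset.anim);
        observer.unobserve(entry.target); // animate each section only once
      }
    });
  }, { threshold: 0.2 });

  document.querySelectorAll(".section").forEach(function (el, i) {
    el.dataset.anim = animationClassFor(i);
    observer.observe(el);
  });
}
```

&lt;p&gt;&lt;em&gt;Calling initScrollReveal() once after DOMContentLoaded would be enough to activate the effect.&lt;/em&gt;&lt;/p&gt;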

&lt;blockquote&gt;
&lt;h2&gt;What I Used Google Antigravity For&lt;/h2&gt;
&lt;/blockquote&gt;

&lt;p&gt;✅ &lt;strong&gt;Code Generation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Visual CSS&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Content Creation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Debugging&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Thanks for checking out my submission!&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>googleaichallenge</category>
      <category>portfolio</category>
      <category>gemini</category>
    </item>
  </channel>
</rss>
