<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kaarthik Andavar</title>
    <description>The latest articles on DEV Community by Kaarthik Andavar (@kaarthik108).</description>
    <link>https://dev.to/kaarthik108</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1102647%2Ff8de1cf2-cef9-4e3a-9bca-c0510119bcf0.png</url>
      <title>DEV Community: Kaarthik Andavar</title>
      <link>https://dev.to/kaarthik108</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kaarthik108"/>
    <language>en</language>
    <item>
      <title>BrandVibe</title>
      <dc:creator>Kaarthik Andavar</dc:creator>
      <pubDate>Mon, 26 May 2025 06:19:19 +0000</pubDate>
      <link>https://dev.to/kaarthik108/brandvibe-47b1</link>
      <guid>https://dev.to/kaarthik108/brandvibe-47b1</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/brightdata-2025-05-07"&gt;Bright Data AI Web Access Hackathon&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built a Brand Intelligence Dashboard that gives businesses real-time insights into how their brand is perceived across social media and news platforms. The system analyzes mentions from Twitter, LinkedIn, Reddit, and news sources to provide sentiment analysis, ethical context evaluation, and trending topic identification.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrec9cy877rjf7fu0oef.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrec9cy877rjf7fu0oef.png" alt="BrandVibe Landing" width="800" height="539"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fohhh3dir2jcgqbe21eap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fohhh3dir2jcgqbe21eap.png" alt="BrandVibe dashboard" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The frontend is a Next.js application that provides an intuitive interface where users can input any brand name, location, and category to get comprehensive analytics. The dashboard displays sentiment breakdowns, platform-specific insights, word clouds of trending themes, and ethical highlights that might impact brand reputation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F63e0uhk0ng1k5pvxhnda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F63e0uhk0ng1k5pvxhnda.png" alt="BrandVibe visuals" width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Agent Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ui9zg8jt6tqaa4b5w4w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ui9zg8jt6tqaa4b5w4w.png" alt="Image description" width="800" height="833"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Repository:&lt;/strong&gt; &lt;br&gt;
Backend - &lt;a href="https://github.com/kaarthik108/Know-your-Brand" rel="noopener noreferrer"&gt;https://github.com/kaarthik108/Know-your-Brand&lt;/a&gt;&lt;br&gt;
Frontend - &lt;a href="https://github.com/kaarthik108/kyb" rel="noopener noreferrer"&gt;https://github.com/kaarthik108/kyb&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Live Demo:&lt;/strong&gt; &lt;a href="https://kyb-nine.vercel.app/" rel="noopener noreferrer"&gt;https://kyb-nine.vercel.app/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The application works in two main phases:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Analysis Request&lt;/strong&gt;: Users enter brand details or select from predefined options (Tesla, Apple, Microsoft)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Agent&lt;/strong&gt;: The agent searches for brand mentions across platforms through the Bright Data MCP server&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Key features include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multi-platform sentiment analysis with visual breakdowns&lt;/li&gt;
&lt;li&gt;Ethical context identification for CSR-related themes&lt;/li&gt;
&lt;li&gt;Interactive word clouds showing trending topics&lt;/li&gt;
&lt;li&gt;Platform-specific mention tracking with engagement metrics&lt;/li&gt;
&lt;li&gt;Real-time polling interface during data collection&lt;/li&gt;
&lt;/ul&gt;
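&lt;p&gt;The real-time polling interface boils down to a loop like the following minimal Python sketch; the status-payload shape and field names are assumptions for illustration (the actual frontend does this from Next.js):&lt;/p&gt;

```python
import time

def poll_analysis(fetch_status, interval_s=5, max_attempts=60):
    """Call fetch_status() until the analysis reports completion.

    fetch_status is any callable returning a dict like
    {"status": "pending" | "complete", "result": {...}} (illustrative shape).
    """
    for _ in range(max_attempts):
        payload = fetch_status()
        if payload.get("status") == "complete":
            return payload.get("result")
        time.sleep(interval_s)
    raise TimeoutError("analysis did not complete in time")
```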

&lt;p&gt;Tools:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Google ADK (agent framework) with the Bright Data MCP server&lt;/li&gt;
&lt;li&gt;o4-mini to extract structured responses&lt;/li&gt;
&lt;li&gt;Backend deployed on Google Cloud Run&lt;/li&gt;
&lt;li&gt;Next.js frontend on Vercel&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How I Used Bright Data's Infrastructure
&lt;/h2&gt;

&lt;p&gt;The core of this system relies on Bright Data's MCP server to power four specialized AI agents that work in parallel:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Discovery &amp;amp; Access&lt;/strong&gt;: The agents use Bright Data's infrastructure to discover and access content across Twitter, LinkedIn, Reddit, and news websites. This includes navigating complex authentication systems and dynamic content loading that would be impractical with traditional scraping.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Extraction&lt;/strong&gt;: Each platform agent extracts structured data including post content, timestamps, engagement metrics, and author information. Bright Data's reliable extraction capabilities ensure we get consistent, clean data even from JavaScript-heavy social media platforms.&lt;/p&gt;
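&lt;p&gt;An extracted mention record might look like the sketch below; the field names are illustrative assumptions, not the project's actual schema:&lt;/p&gt;

```python
from dataclasses import dataclass, field

@dataclass
class Mention:
    platform: str           # e.g. "twitter", "linkedin", "reddit", "news"
    text: str               # the post or article content
    author: str             # extracted author name or handle
    timestamp: str          # ISO-8601 timestamp as extracted
    engagement: dict = field(default_factory=dict)  # likes, shares, comments

m = Mention(platform="reddit",
            text="Loving the new release",
            author="u/example",
            timestamp="2025-05-26T06:00:00Z",
            engagement={"upvotes": 42})
```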

&lt;p&gt;&lt;strong&gt;Real-time Interaction&lt;/strong&gt;: The agents interact with dynamic web pages, handling infinite scroll feeds, loading more content, and navigating platform-specific UI elements to gather comprehensive mention data.&lt;/p&gt;

&lt;p&gt;The MCP server integration allows our backend to coordinate these four agents simultaneously, dramatically reducing the time needed to gather comprehensive brand intelligence from multiple sources.&lt;/p&gt;
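&lt;p&gt;Running the four platform agents simultaneously can be sketched with &lt;code&gt;asyncio.gather&lt;/code&gt;; the &lt;code&gt;run_agent&lt;/code&gt; body below is a stand-in for the real ADK + Bright Data MCP calls:&lt;/p&gt;

```python
import asyncio

async def run_agent(platform, brand):
    # Stand-in for the real ADK agent backed by the Bright Data MCP server.
    await asyncio.sleep(0)
    return {"platform": platform, "brand": brand, "mentions": []}

async def analyze_brand(brand):
    platforms = ["twitter", "linkedin", "reddit", "news"]
    # All four platform agents run concurrently instead of sequentially.
    return await asyncio.gather(*(run_agent(p, brand) for p in platforms))

results = asyncio.run(analyze_brand("Tesla"))
```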

&lt;h2&gt;
  
  
  Performance Improvements
&lt;/h2&gt;

&lt;p&gt;Using Bright Data's real-time web access created significant improvements over traditional approaches:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Speed&lt;/strong&gt;: Instead of sequential API calls or unreliable scraping, our parallel agent architecture powered by Bright Data completes comprehensive brand analysis in 2-5 minutes across all four platforms.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reliability&lt;/strong&gt;: Traditional web scraping often fails due to anti-bot measures, rate limiting, or dynamic content. Bright Data's infrastructure handles these challenges automatically, giving us consistent data collection success rates.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Quality&lt;/strong&gt;: The MCP server ensures we capture complete context around mentions - not just the text, but engagement metrics, temporal data, and surrounding conversation threads that provide richer sentiment analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scalability&lt;/strong&gt;: The system can analyze any brand without platform-specific API limitations or access restrictions. This makes it viable for businesses of any size to get enterprise-level brand intelligence.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real-time Insights&lt;/strong&gt;: Unlike static datasets or delayed API responses, Bright Data enables truly current brand monitoring, allowing businesses to respond quickly to emerging trends or reputation issues.&lt;/p&gt;

&lt;p&gt;The combination of AI agents with Bright Data's web access infrastructure transforms brand monitoring from a manual, time-intensive process into an automated, comprehensive intelligence system.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>brightdatachallenge</category>
      <category>ai</category>
      <category>webdata</category>
    </item>
    <item>
      <title>TryOutfit: Virtual Outfit Try-On with AI</title>
      <dc:creator>Kaarthik Andavar</dc:creator>
      <pubDate>Mon, 27 May 2024 02:20:51 +0000</pubDate>
      <link>https://dev.to/kaarthik108/tryoutfit-virtual-outfit-try-on-with-ai-3ndl</link>
      <guid>https://dev.to/kaarthik108/tryoutfit-virtual-outfit-try-on-with-ai-3ndl</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/awschallenge"&gt;The AWS Amplify Fullstack TypeScript Challenge &lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;TryOutfit&lt;/strong&gt;: Virtual Outfit Try-On with AI&lt;/p&gt;

&lt;p&gt;I recently built an AI web application called &lt;strong&gt;TryOutfit&lt;/strong&gt; that allows users to virtually try on outfits using AI. Powered by the IDM-VTON model from Replicate, TryOutfit provides an interactive experience for users to visualize how different outfits would look on them or on predefined models.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F30t61m8jn3qhwle9t044.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F30t61m8jn3qhwle9t044.png" alt="screenshot of the website tryoutfit" width="800" height="1024"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How It Works
&lt;/h2&gt;

&lt;p&gt;TryOutfit offers a user-friendly interface where users can either upload their own image or choose from a selection of predefined models. Once an image is selected, the application uses the IDM-VTON AI model to generate a virtual try-on of the chosen outfit on the selected image. This enables users to see how the outfit would look on them or on the model without physically trying it on.&lt;/p&gt;

&lt;p&gt;All generated images are automatically deleted from the database one hour after creation.&lt;/p&gt;
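&lt;p&gt;One common way to implement a one-hour expiry like this is a DynamoDB TTL attribute, sketched here with illustrative attribute names (an assumption, not necessarily how TryOutfit does it):&lt;/p&gt;

```python
import time

TTL_SECONDS = 3600  # generations expire one hour after creation

def item_with_ttl(generation_id, image_url, now=None):
    # DynamoDB deletes items automatically once the epoch-seconds value
    # in the table's configured TTL attribute has passed.
    now = int(time.time()) if now is None else now
    return {
        "id": generation_id,
        "imageUrl": image_url,
        "expiresAt": now + TTL_SECONDS,
    }
```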

&lt;h2&gt;
  
  
  Demo and Code
&lt;/h2&gt;

&lt;p&gt;APP - &lt;a href="https://www.tryoutfit.app/"&gt;https://www.tryoutfit.app/&lt;/a&gt;&lt;br&gt;
Code - &lt;a href="https://github.com/kaarthik108/tryoutfit"&gt;https://github.com/kaarthik108/tryoutfit&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The application follows a serverless architecture, leveraging AWS services to handle various functionalities. User authentication is managed by Amazon Cognito, ensuring secure access to the application. User-uploaded images are stored in S3 buckets, while outfit and user data is stored in DynamoDB tables. &lt;/p&gt;

&lt;p&gt;The AI model from Replicate is seamlessly integrated into the application to generate the virtual try-on images. To optimize performance, webhooks deliver the inference results as soon as the model finishes, instead of the application polling and waiting for processing to complete. Additionally, TryOutfit offers shareable links to the generated model outfits, which remain accessible for one hour, enabling users to easily share their virtual try-on results with others.&lt;/p&gt;
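&lt;p&gt;The webhook hand-off can be sketched as a small receiver: Replicate POSTs the prediction JSON when processing finishes, so the app reacts to completion instead of polling. Treat the exact payload shape and the storage step as illustrative:&lt;/p&gt;

```python
import json

def handle_replicate_webhook(body):
    # body is the raw JSON string Replicate POSTs for a prediction.
    prediction = json.loads(body)
    if prediction.get("status") != "succeeded":
        return {"stored": False, "reason": prediction.get("status", "unknown")}
    # In the real app, the output URL would be written to DynamoDB here.
    return {"stored": True, "output": prediction.get("output")}
```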

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8IS2dLz1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://mk7iyaq7oqz5ihbw.public.blob.vercel-storage.com/tryoutfitgif-eiuPhSbL44vuZ2MIqgYeIzlZLmrZrF.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8IS2dLz1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://mk7iyaq7oqz5ihbw.public.blob.vercel-storage.com/tryoutfitgif-eiuPhSbL44vuZ2MIqgYeIzlZLmrZrF.gif" width="412" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrations
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Next.js 14 (with server actions)&lt;/li&gt;
&lt;li&gt;Amazon S3 (to store image files)&lt;/li&gt;
&lt;li&gt;Amazon DynamoDB (to serve product data and persist inference data)&lt;/li&gt;
&lt;li&gt;Amazon Cognito (API key auth)&lt;/li&gt;
&lt;li&gt;Replicate (AI model)&lt;/li&gt;
&lt;li&gt;AWS Amplify (hosting)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Connected Components and/or Feature Full&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The StorageImage component from Amplify UI was used to display images by their S3 path.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Building TryOutfit using AWS Amplify and various AWS services has been a rewarding experience. Amplify simplifies the development process by providing a set of tools and services that make it easy to build scalable and feature-rich applications. The integration of AI models, such as the one from Replicate, adds an exciting dimension to the application, enabling users to virtually try on outfits from the comfort of their own devices.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>awschallenge</category>
      <category>amplify</category>
      <category>fullstack</category>
    </item>
    <item>
      <title>Di1 - AI Driven Insights With Cloudflare</title>
      <dc:creator>Kaarthik Andavar</dc:creator>
      <pubDate>Sun, 14 Apr 2024 10:35:10 +0000</pubDate>
      <link>https://dev.to/kaarthik108/di1-ai-driven-insights-with-cloudflare-1jbd</link>
      <guid>https://dev.to/kaarthik108/di1-ai-driven-insights-with-cloudflare-1jbd</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/devteam/join-us-for-the-cloudflare-ai-challenge-3000-in-prizes-5f99"&gt;Cloudflare AI Challenge&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;Di1 is an AI-powered text-to-SQL (T2SQL) chatbot for Cloudflare D1. It lets users interact with the Y Combinator dataset using natural language queries and can be customized for any data stored in Cloudflare D1. Users ask questions about the dataset, and Di1 generates the corresponding SQL queries, retrieves the relevant information from the D1 database, and produces charts and graphs. The chatbot provides an intuitive, user-friendly interface for exploring the dataset without requiring knowledge of SQL syntax.&lt;/p&gt;
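&lt;p&gt;The core text-to-SQL step amounts to assembling a prompt from the user's question plus retrieved schema context and handing it to the chat model. This minimal sketch uses an assumed template, not Di1's exact prompt:&lt;/p&gt;

```python
def build_t2sql_prompt(question, schema_context):
    # schema_context is whatever RAG retrieval returned for this question,
    # e.g. the relevant CREATE TABLE statements.
    return (
        "You answer questions by writing a single SQLite-compatible SQL "
        "query for Cloudflare D1.\n"
        f"Relevant schema:\n{schema_context}\n"
        f"Question: {question}\nSQL:"
    )
```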

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F63o8dh26gpzmovmhnjvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F63o8dh26gpzmovmhnjvy.png" alt="Screenshot of the website Di1" width="800" height="605"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Deployed app - &lt;a href="https://di1-iyr.pages.dev/"&gt;https://di1-iyr.pages.dev/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Generated example chat - &lt;a href="https://di1-iyr.pages.dev/chat/NQFtTkt"&gt;https://di1-iyr.pages.dev/chat/NQFtTkt&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  My Code
&lt;/h2&gt;

&lt;p&gt;The source code for Di1 is available on GitHub: &lt;a href="https://github.com/kaarthik108/di1"&gt;https://github.com/kaarthik108/di1&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Journey
&lt;/h2&gt;

&lt;p&gt;I went all out on &lt;strong&gt;Cloudflare&lt;/strong&gt;'s product offerings. &lt;br&gt;
The project is built with the following stack:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare Workers AI&lt;/strong&gt; - Embedding model&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare Vectorize&lt;/strong&gt; - Store vectors for RAG&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare AI Gateway&lt;/strong&gt; - Caching LLM queries and rate limiting&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare Pages&lt;/strong&gt; - Hosting the Next.js frontend&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare Python Workers&lt;/strong&gt; - For importing CSV data to D1&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloudflare D1&lt;/strong&gt; - For context query with RAG, and also to store chat responses.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenAI&lt;/strong&gt; - GPT-4 Turbo for function calling&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Building Di1 was a valuable learning experience within a short timeframe. &lt;br&gt;
The design process was rewarding, and the integration of multiple Cloudflare products resulted in a comprehensive solution.&lt;/p&gt;

&lt;p&gt;Achieving high accuracy with function calling using the Workers AI model &lt;code&gt;@hf/nousresearch/hermes-2-pro-mistral-7b&lt;/code&gt; proved challenging compared to &lt;code&gt;GPT-4&lt;/code&gt;. As the app relies heavily on function calling capabilities, this is an area for future exploration as Cloudflare expands its model offerings.&lt;/p&gt;
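&lt;p&gt;For reference, the function-calling contract the app depends on can be expressed as an OpenAI-style tool definition like the sketch below; the tool name and parameters are illustrative assumptions, not Di1's exact schema:&lt;/p&gt;

```python
# Illustrative OpenAI-style tool definition for a text-to-SQL chatbot:
# the model is asked to call "run_sql" with a query (and optional chart
# type) instead of answering in free text.
RUN_SQL_TOOL = {
    "type": "function",
    "function": {
        "name": "run_sql",
        "description": "Execute a read-only SQL query against Cloudflare D1",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "SQLite-compatible SELECT statement",
                },
                "chart": {"type": "string", "enum": ["bar", "line", "none"]},
            },
            "required": ["query"],
        },
    },
}
```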

&lt;p&gt;&lt;strong&gt;Multiple Models and/or Triple Task Types&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The app currently uses two LLMs: &lt;br&gt;
Workers AI - embedding model &lt;code&gt;@cf/baai/bge-large-en-v1.5&lt;/code&gt;&lt;br&gt;
OpenAI - chat model for function calling, &lt;code&gt;GPT-4 Turbo&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Team Submissions: &lt;a href="https://dev.to/kaarthik108"&gt;https://dev.to/kaarthik108&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloudflarechallenge</category>
      <category>devchallenge</category>
      <category>ai</category>
      <category>nextjs</category>
    </item>
    <item>
      <title>Introducing Supa-Dash✨</title>
      <dc:creator>Kaarthik Andavar</dc:creator>
      <pubDate>Wed, 10 Apr 2024 08:31:21 +0000</pubDate>
      <link>https://dev.to/kaarthik108/introducing-supa-dash-3pek</link>
      <guid>https://dev.to/kaarthik108/introducing-supa-dash-3pek</guid>
      <description>&lt;p&gt;A High-Performance, Multi-Filter Dashboard Built with Vercel AI SDK and Supabase&lt;/p&gt;

&lt;p&gt;We recently participated in a hackathon where we built &lt;strong&gt;Supa-Dash&lt;/strong&gt;, a fast and efficient multi-filter dashboard that leverages the power of the Vercel AI SDK and Supabase. This project showcases the seamless integration of text-to-SQL visuals, delivering a smooth and interactive user experience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2c32oh4xeb5vuinyx4x1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2c32oh4xeb5vuinyx4x1.png" alt="Image description" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Real-time Data Visualization: Supa-Dash provides real-time, dynamic visualizations based on user-defined filters and queries.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Text-to-SQL Integration: Users can input natural language queries, which are automatically converted into SQL statements using the Vercel AI SDK, enabling effortless data exploration.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scalable and Performant: Built on top of Next.js and Supabase, Supa-Dash ensures high performance and scalability, handling large datasets with ease.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
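&lt;p&gt;The multi-filter side of the dashboard amounts to translating user-selected filters into a parameterized query before it reaches Postgres. A minimal sketch, where the placeholder style and column handling are assumptions for illustration:&lt;/p&gt;

```python
def filters_to_sql(table, filters):
    # Build a parameterized WHERE clause from a dict of dashboard filters,
    # keeping values out of the SQL string itself.
    clauses, params = [], []
    for i, (column, value) in enumerate(sorted(filters.items()), start=1):
        clauses.append(f"{column} = ${i}")
        params.append(value)
    where = f" WHERE {' AND '.join(clauses)}" if clauses else ""
    return f"SELECT * FROM {table}{where}", params
```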

&lt;h2&gt;
  
  
  Tech Stack
&lt;/h2&gt;

&lt;p&gt;Supa-Dash is built using the following cutting-edge technologies:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Next.js: A powerful React framework for building server-side rendered and statically generated web applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Supabase: An open-source Firebase alternative that provides a fast and secure PostgreSQL database with real-time capabilities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Vercel AI SDK: A comprehensive toolkit for integrating AI capabilities into web applications, enabling natural language processing and text-to-SQL conversion.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Tremor: React components to build charts and dashboards.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Drizzle ORM: A modern and intuitive ORM (Object-Relational Mapping) library for TypeScript and JavaScript, simplifying database interactions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Shadcn UI: A beautifully designed component library that offers a wide range of pre-built UI components, enabling rapid development of visually appealing interfaces.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Live Demo
&lt;/h2&gt;

&lt;p&gt;Check out the live demo of Supa-Dash: &lt;a href="https://supa-dash.vercel.app"&gt;Supa-Dash&lt;/a&gt;&lt;/p&gt;

</description>
      <category>nextjs</category>
      <category>hackathon</category>
      <category>ai</category>
      <category>devchallenge</category>
    </item>
    <item>
      <title>AWS Bedrock on Snowflake (Talk to Claude, LLAMA)</title>
      <dc:creator>Kaarthik Andavar</dc:creator>
      <pubDate>Wed, 18 Oct 2023 20:19:34 +0000</pubDate>
      <link>https://dev.to/kaarthik108/aws-bedrock-on-snowflake-talk-to-claude-llama-5h73</link>
      <guid>https://dev.to/kaarthik108/aws-bedrock-on-snowflake-talk-to-claude-llama-5h73</guid>
      <description>&lt;p&gt;Directly Linking AWS Bedrock and Snowflake Without the Hassle&lt;/p&gt;

&lt;p&gt;Yes, it is that simple to connect to AWS Bedrock directly from Snowflake, without the need for a Snowflake external function or an API Gateway.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;When AI Sneaks into Data Warehousing: Expect the Unexpected&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi1fzfhdn4qkb0vn2byf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi1fzfhdn4qkb0vn2byf.png" alt="snowflake-aws"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview of AWS Bedrock
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Amazon Bedrock&lt;/strong&gt; is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications, simplifying development while maintaining privacy and security.&lt;br&gt;
Amazon Bedrock is serverless, so you don't have to manage any infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  What services are required to run the project?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;AWS Lambda&lt;/li&gt;
&lt;li&gt;Snowflake External Access Integration&lt;/li&gt;
&lt;li&gt;Snowflake Stored Procedure&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: This process can be simplified further, removing the need for AWS Lambda and using only a Snowflake stored procedure, once the Snowflake Anaconda channel upgrades its boto3 package to a Bedrock-aware version (boto3&amp;gt;=1.28.57)&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  What does the whole process look like?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwyj4vpffqdzj6op63bs3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwyj4vpffqdzj6op63bs3.png" alt="snowflake-awsbedrock-architecture"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integration Steps
&lt;/h2&gt;

&lt;h2&gt;
  
  
  1. Setting up Snowflake Secrets
&lt;/h2&gt;

&lt;p&gt;First, we create two Snowflake secrets, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, used to authenticate calls to AWS services.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

CREATE OR REPLACE SECRET AWS_ACCESS_KEY_ID
    TYPE = GENERIC_STRING
    SECRET_STRING = '....';

CREATE OR REPLACE SECRET AWS_SECRET_ACCESS_KEY
    TYPE = GENERIC_STRING
    SECRET_STRING = '......';



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  2. Configuring Snowflake Network Rules
&lt;/h2&gt;

&lt;p&gt;Now create a Snowflake network rule that tells Snowflake to allow egress for this specific traffic.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

CREATE OR REPLACE NETWORK RULE AWS_EGRESS_RULE
    MODE = EGRESS
    TYPE = HOST_PORT
    VALUE_LIST = (
        'lambda.us-east-1.amazonaws.com'
    );


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  3. Setting Up External Access Integration in Snowflake
&lt;/h2&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION bedrock_client
    ALLOWED_NETWORK_RULES = (AWS_EGRESS_RULE)
    ALLOWED_AUTHENTICATION_SECRETS = (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    ENABLED = TRUE;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  4. Creating the Stored Procedure
&lt;/h2&gt;

&lt;p&gt;This procedure will call the AWS Lambda function.&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
CREATE OR REPLACE 
  PROCEDURE snow_claude(question STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.9'
EXTERNAL_ACCESS_INTEGRATIONS = (bedrock_client)
SECRETS = ('AWS_ACCESS_KEY_ID' = AWS_ACCESS_KEY_ID, 'AWS_SECRET_ACCESS_KEY' = AWS_SECRET_ACCESS_KEY)
PACKAGES=('snowflake-snowpark-python', 'boto3', 'botocore', 'requests')
HANDLER='handler'
AS
$$
import json

import _snowflake
import boto3
from snowflake.snowpark import Session

# Your AWS credentials
AWS_ACCESS_KEY_ID = _snowflake.get_generic_secret_string("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = _snowflake.get_generic_secret_string("AWS_SECRET_ACCESS_KEY")

#Lambda function name
FUNCTION_NAME = "snowbedrock"

# Initialize the boto3 client for Lambda
session = boto3.session.Session(region_name="us-east-1")
lambda_client = session.client(
    "lambda",
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
)

def invoke_lambda_function(payload):
    response = lambda_client.invoke(
        FunctionName=FUNCTION_NAME,
        InvocationType="RequestResponse",
        Payload=json.dumps(payload),
    )
    # Since the Lambda response is already parsed into a string
    return response["Payload"].read().decode("utf-8")

def handler(session: Session, question: str):
    body = {"question": question}
    try:
        response = invoke_lambda_function(body)
        parsed_response = json.loads(response)
        return parsed_response
    except Exception as err:
        print("An error occurred:", err)
        raise

$$
;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  5. Lambda Function Setup
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: The boto3 package bundled with AWS Lambda is not up to date; make sure to zip the latest version (boto3&amp;gt;=1.28.57) as a layer, since older versions do not include the Bedrock APIs.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

#snowbedrock

import json
import boto3
import os

def claude(prompt):
    bedrock_runtime = boto3.client(
        service_name="bedrock-runtime",
        region_name="us-east-1",
    )
    TEMPLATE = f'Human: "{prompt}"\nAssistant: '

    body = json.dumps(
        {
            "prompt": TEMPLATE,
            "max_tokens_to_sample": 256,
            "stop_sequences": [],
            "temperature": 0,
            "top_p": 0.9,
        }
    )
    modelId = "anthropic.claude-instant-v1"
    accept = "application/json"
    contentType = "application/json"
    response = bedrock_runtime.invoke_model(
        body=body, modelId=modelId, accept=accept, contentType=contentType
    )
    response_body = json.loads(response.get("body").read())
    print("Model Response:", response_body)

    return response_body

def lambda_handler(event, context) -&amp;gt; str:    
    question = event.get("question")
    if not question:
        return "No question provided"

    answer = claude(question)
    return answer["completion"]


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Make sure the Lambda has permission to invoke Bedrock:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:*",
      "Resource": "*"
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;




&lt;h2&gt;
  
  
  6. Time to test
&lt;/h2&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

call snow_claude('who is this elon musk?');



&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;code&gt;Elon Musk is a South African-born American entrepreneur and businessman. Some key things to know about him: - He is the founder, CEO, and Chief Engineer of SpaceX…..&lt;/code&gt;&lt;/p&gt;




&lt;p&gt;I hope this guide simplifies your AWS Bedrock and Snowflake integration process.&lt;br&gt;
Follow for more insightful content and to explore the world of open-source with me!&lt;br&gt;
Medium: &lt;a href="https://kaarthikandavar.medium.com/" rel="noopener noreferrer"&gt;Kaarthikandavar&lt;/a&gt;&lt;br&gt;
X: &lt;a href="https://twitter.com/kaarthikcodes" rel="noopener noreferrer"&gt;Kaarthikcodes&lt;/a&gt;&lt;br&gt;
LinkedIn: &lt;a href="https://www.linkedin.com/in/kaarthik-andavar-b32a27143/" rel="noopener noreferrer"&gt;Kaarthik&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thanks for Reading! Just another day in the tech soap opera.&lt;/p&gt;

</description>
      <category>snowflake</category>
      <category>aws</category>
      <category>serverless</category>
      <category>dataengineering</category>
    </item>
  </channel>
</rss>
