<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: brandon0405</title>
    <description>The latest articles on DEV Community by brandon0405 (@brandon0405).</description>
    <link>https://dev.to/brandon0405</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3832925%2F8626f5c2-650b-4042-87f8-b6e95fc0d5d1.jpg</url>
      <title>DEV Community: brandon0405</title>
      <link>https://dev.to/brandon0405</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/brandon0405"/>
    <language>en</language>
    <item>
      <title>Your AWS Data Can Now Power Google AI — No Migration Required: Inside Google Cloud's Cross-Cloud Lakehouse</title>
      <dc:creator>brandon0405</dc:creator>
      <pubDate>Sat, 25 Apr 2026 18:15:25 +0000</pubDate>
      <link>https://dev.to/brandon0405/your-aws-data-can-now-power-google-ai-no-migration-required-inside-google-clouds-cross-cloud-3ci8</link>
      <guid>https://dev.to/brandon0405/your-aws-data-can-now-power-google-ai-no-migration-required-inside-google-clouds-cross-cloud-3ci8</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/google-cloud-next-2026-04-22"&gt;Google Cloud NEXT Writing Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Problem Every Data Engineer Knows Too Well
&lt;/h2&gt;

&lt;p&gt;Picture this: your company has years of carefully curated data sitting in Amazon S3. It powers your dashboards, your pipelines, your ML models. Then someone in leadership asks: "Can we run this through Google's AI?"&lt;/p&gt;

&lt;p&gt;And you already know what that means. Migration. Weeks of ETL work. Egress fees that make your stomach drop. Data duplication across clouds. Governance nightmares. The risk of breaking something that's already working.&lt;/p&gt;

&lt;p&gt;For years, the cloud data world operated on an unspoken rule: &lt;strong&gt;pick your cloud and stay there&lt;/strong&gt;. Moving between providers wasn't impossible — it was just painful enough that most teams didn't bother.&lt;/p&gt;

&lt;p&gt;At Google Cloud NEXT '26, that rule changed. Google announced the &lt;strong&gt;Cross-Cloud Lakehouse&lt;/strong&gt;, built on Apache Iceberg, and it might be the most underrated announcement of the entire event — especially if you care about where AI is actually heading.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Is the Cross-Cloud Lakehouse?
&lt;/h2&gt;

&lt;p&gt;Before diving into the announcements, some quick context: as of April 20, 2026, Google renamed BigLake to &lt;strong&gt;Google Cloud Lakehouse&lt;/strong&gt;, and BigLake Metastore is now the &lt;strong&gt;Lakehouse Runtime Catalog&lt;/strong&gt;. If you've worked with BigLake before, it's the same APIs and CLI commands — just a new name that better reflects what it actually does.&lt;/p&gt;

&lt;p&gt;The Cross-Cloud Lakehouse extends Google Cloud Lakehouse to let you query data in AWS (and Azure, coming later this year) directly from Google Cloud using BigQuery, Dataproc, and Apache Spark — &lt;strong&gt;without migrating your data or building complex ETL pipelines&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;It works in two layers:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Metadata layer:&lt;/strong&gt; Your remote Apache Iceberg catalogs (like Databricks Unity Catalog or AWS Glue) connect to Google's Lakehouse, which discovers your data without copying any files and authenticates securely through Workload Identity Federation — no long-lived access keys needed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Transport layer:&lt;/strong&gt; Google integrates Cross-Cloud Interconnect (CCI) directly into the data plane. By combining CCI's dedicated private networking with Apache Iceberg REST Catalog, cross-cloud queries run with low latency and without the massive egress fees you'd normally pay routing traffic over the public internet.&lt;/p&gt;

&lt;p&gt;The result: your agents and analysts can query data in AWS S3 as if it were sitting right there in Google Cloud.&lt;/p&gt;




&lt;h2&gt;
  
  
  The 4 Announcements That Actually Matter
&lt;/h2&gt;

&lt;p&gt;Google announced its next-generation Cross-Cloud Lakehouse with four concrete breakthroughs. Let me break each one down beyond the keynote surface level.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Fully Managed Iceberg Storage with Real Interoperability
&lt;/h3&gt;

&lt;p&gt;This one matters more than it sounds. Previously, if you used Apache Spark for ETL into Iceberg REST Catalog tables, you couldn't write through BigQuery or use its storage management features. You had to choose one or the other.&lt;/p&gt;

&lt;p&gt;Now there's true read/write interoperability between BigQuery and Managed Service for Apache Spark, including Iceberg-compatible engines like Spark, Trino, Flink — and third-party engines like Databricks and Snowflake (in Preview). One copy of your data, multiple engines, no compromises.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Cross-Cloud Caching: The Feature Nobody's Talking About
&lt;/h3&gt;

&lt;p&gt;This is the one that makes cross-cloud economically viable. Google introduced an intelligent cache that stores cross-cloud data on the &lt;strong&gt;first read&lt;/strong&gt;, slashing egress fees and dramatically accelerating follow-on queries for your AWS and Azure data.&lt;/p&gt;

&lt;p&gt;In plain terms: the first time you query your S3 data from BigQuery, it gets cached on Google's side. Every subsequent query is fast and cheap. The penalty for cross-cloud access shrinks to nearly nothing after that first read.&lt;/p&gt;
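To make the math concrete, here's a tiny back-of-the-envelope model. The per-GB rates are placeholder assumptions I picked for illustration — not Google's or AWS's actual pricing:

```typescript
// Illustrative cost model for cross-cloud reads with a first-read cache.
// Rates are made-up assumptions for the sketch, not real cloud pricing.
const EGRESS_PER_GB = 0.09;     // assumed cross-cloud egress rate, USD/GB
const CACHED_READ_PER_GB = 0.0; // assumed cost once data sits in Google's cache

function scanCost(queries: number, gbScanned: number, cached: boolean): number {
  if (!cached) {
    // Without caching, every query re-crosses the cloud boundary.
    return queries * gbScanned * EGRESS_PER_GB;
  }
  // With caching, only the first read pays egress; the rest hit the cache.
  return gbScanned * EGRESS_PER_GB + (queries - 1) * gbScanned * CACHED_READ_PER_GB;
}

// 100 queries over a 500 GB table:
const withoutCache = scanCost(100, 500, false); // all 100 queries pay egress
const withCache = scanCost(100, 500, true);     // only the first read does
```

Under these assumed rates, a hundred repeated queries cost a hundred times the egress without the cache, and roughly one egress charge with it.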

&lt;h3&gt;
  
  
  3. Lightning Engine for Apache Spark: Up to 4.5x Faster
&lt;/h3&gt;

&lt;p&gt;Google's Lightning Engine is a real-time, serverless Spark engine that delivers up to 4.5x faster performance than open-source Spark alternatives, and up to 2x better price-performance over the leading proprietary competitor for large datasets.&lt;/p&gt;

&lt;p&gt;Flipkart, Lowe's, and Meesho are already accelerating their Apache Spark workloads with it. That's not a beta experiment — that's production scale.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. An Estimated 117% ROI in Under 6 Months
&lt;/h3&gt;

&lt;p&gt;Google's own analysis puts the estimated ROI of this agentic-first lakehouse approach at &lt;strong&gt;117%&lt;/strong&gt;, with payback in under six months. Spotify is already unlocking innovation with it.&lt;/p&gt;

&lt;p&gt;Take vendor-published ROI numbers with appropriate skepticism — but the underlying logic is sound. If you eliminate data movement costs, reduce ETL complexity, and let multiple engines share a single data copy, the math does work in your favor.&lt;/p&gt;
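For what it's worth, the underlying arithmetic is easy to check yourself. The dollar figures below are invented placeholders; only the formulas matter:

```typescript
// ROI and payback math behind a "117% ROI, payback under 6 months" style claim.
// The input figures are invented placeholders for illustration.
function roiPercent(totalBenefit: number, totalCost: number): number {
  // ROI = (benefit - cost) / cost, expressed as a percentage.
  return ((totalBenefit - totalCost) / totalCost) * 100;
}

function paybackMonths(totalCost: number, monthlySavings: number): number {
  // Months until cumulative savings cover the initial cost.
  return totalCost / monthlySavings;
}

// Hypothetical: $1.0M total cost, $2.17M benefit over the analysis window,
// with $200k/month in avoided egress and ETL spend.
const roi = roiPercent(2_170_000, 1_000_000);    // ≈ 117
const payback = paybackMonths(1_000_000, 200_000); // 5 months
```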




&lt;h2&gt;
  
  
  The Angle Most People Are Missing: This Is About AI Agents, Not Just Data
&lt;/h2&gt;

&lt;p&gt;Here's where I think the real story is, and why this announcement deserves more attention than it's getting.&lt;/p&gt;

&lt;p&gt;Everyone at Next '26 is talking about Gemini Enterprise Agent Platform, Agent Studio, agentic workflows. But there's a foundational problem with all of that: &lt;strong&gt;an AI agent is only as intelligent as the data it can access&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If your agent hits a cross-cloud wall — high latency, expensive egress, proprietary catalog lock-in — its autonomy is broken. It can't reason across your full data estate. It can only see the slice of data you've managed to centralize, which in most enterprises is a fraction of the whole.&lt;/p&gt;

&lt;p&gt;The Cross-Cloud Lakehouse isn't a data feature. It's the infrastructure layer that makes truly capable AI agents possible in multi-cloud enterprises — which is to say, virtually every real enterprise.&lt;/p&gt;




&lt;h2&gt;
  
  
  What It Looks Like in Practice
&lt;/h2&gt;

&lt;p&gt;You don't need an account to follow this. Here's what a cross-cloud query actually looks like once you've set up federation:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;user_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;action&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;total_actions&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="nv"&gt;`your-project.federated_aws_catalog.your_namespace.your_table`&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;event_date&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="s1"&gt;'2026-04-01'&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's standard BigQuery SQL. The table in that query physically lives in Amazon S3. No data moved. No migration. No special connector to manage. Google Cloud Lakehouse handles the metadata translation and secure data access transparently.&lt;/p&gt;

&lt;p&gt;You can also read Cross-Cloud Lakehouse data directly from Apache Spark clusters without managing separate AWS credentials or S3 connectors — Lakehouse automatically provides temporary, scoped S3 credentials to Spark through the Iceberg REST Catalog interface.&lt;/p&gt;




&lt;h2&gt;
  
  
  My Take
&lt;/h2&gt;

&lt;p&gt;As a final-year IT Engineering student, I've spent a fair amount of time experimenting with both AWS and Google Cloud under their free tiers — and honestly, the experience has been mostly positive. Both platforms are incredibly powerful for learning, and the free tier generosity means you can build real things without spending a cent.&lt;/p&gt;

&lt;p&gt;That said, I learned the hard way that "free tier" requires constant attention. I once left several AWS services running after a learning project, forgot about them, and woke up to a bill of over $120. AWS did refund me after I explained the situation, but that moment of panic stuck with me. The anxiety of not knowing exactly what's running, and what it's costing, is real — especially when you're a student with no budget.&lt;/p&gt;

&lt;p&gt;That's why the cost angle of this Cross-Cloud Lakehouse announcement stands out to me the most. The cross-cloud caching feature in particular — where data from S3 is cached on Google's side after the first read, dramatically reducing egress fees on subsequent queries — is the kind of thing that changes the math for smaller teams and learners, not just enterprise giants. Egress fees are one of the most frustrating hidden costs in cloud, and the fact that Google is tackling that directly instead of just promising "seamless interoperability" is meaningful.&lt;/p&gt;

&lt;p&gt;My honest skepticism? I'd want to see real benchmarks from teams outside of Google's own case studies before trusting this in a production environment. The 117% ROI figure is compelling on paper, but vendor-published numbers always deserve scrutiny. I'd also want to understand the failure modes — what happens when the cross-cloud connection has latency spikes, or when the cache goes stale? For learning and experimentation, this looks genuinely exciting. For production at scale, I'd want at least six months of community battle-testing first.&lt;/p&gt;




&lt;h2&gt;
  
  
  What's Coming Next (And Why It Matters)
&lt;/h2&gt;

&lt;p&gt;The ecosystem is already bigger than most people realize. The Cross-Cloud Lakehouse already supports bi-directional federation with Databricks, Oracle Autonomous Database, Snowflake, SAP, Salesforce Data360, ServiceNow, Workday, and more. Azure support is coming later this year.&lt;/p&gt;

&lt;p&gt;Catalog federation is also launching in Preview for AWS Glue, Databricks, SAP, and Snowflake — with Confluent Tableflow support coming soon.&lt;/p&gt;

&lt;p&gt;This isn't Google building a feature. This is Google positioning itself as the &lt;strong&gt;analytical brain of a multi-cloud world&lt;/strong&gt; — a place where your data stays wherever it is, but your AI runs on Google's infrastructure.&lt;/p&gt;

&lt;p&gt;Whether that bet pays off depends on adoption and performance at scale, which we'll know more about in the months ahead. But the architectural direction is clear, and the announcement at Next '26 was the starting gun.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;"The real test will be whether enterprises trust Google enough to let it be the query layer over their AWS data — what do you think?"&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://cloud.google.com/blog/products/data-analytics/whats-new-in-the-agentic-data-cloud" rel="noopener noreferrer"&gt;What's New in the Agentic Data Cloud — Google Cloud Blog&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://cloud.google.com/blog/products/data-analytics/the-future-of-data-lakehouse-for-the-agentic-era" rel="noopener noreferrer"&gt;The Future of Data Lakehouse for the Agentic Era — Google Cloud Blog&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://cloud.google.com/blog/topics/google-cloud-next/google-cloud-next-2026-wrap-up/" rel="noopener noreferrer"&gt;Google Cloud NEXT '26 Wrap-Up — All 260 Announcements&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.cloud.google.com/lakehouse/docs/about-cross-cloud-lakehouse" rel="noopener noreferrer"&gt;Cross-Cloud Lakehouse Documentation&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devchallenge</category>
      <category>cloudnextchallenge</category>
      <category>googlecloud</category>
    </item>
    <item>
      <title>Before It Becomes Trash — An AI-Powered Circular Economy App for Everyday Objects</title>
      <dc:creator>brandon0405</dc:creator>
      <pubDate>Mon, 20 Apr 2026 02:38:56 +0000</pubDate>
      <link>https://dev.to/brandon0405/before-it-becomes-trash-an-ai-powered-circular-economy-app-for-everyday-objects-2ab6</link>
      <guid>https://dev.to/brandon0405/before-it-becomes-trash-an-ai-powered-circular-economy-app-for-everyday-objects-2ab6</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for &lt;a href="https://dev.to/challenges/weekend-2026-04-16"&gt;Weekend Challenge: Earth Day Edition&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Before It Becomes Trash — Helping Everyday Objects Get a Second Chance
&lt;/h1&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;For this Earth Day challenge, I wanted to build something practical.&lt;/p&gt;

&lt;p&gt;A lot of sustainability projects focus on awareness, visualization, or habit tracking. I wanted to focus on a smaller but very real moment: standing in front of a damaged object and not knowing whether it should be repaired, reused, recycled, or thrown away.&lt;/p&gt;

&lt;p&gt;That is what &lt;strong&gt;Before It Becomes Trash&lt;/strong&gt; is built for.&lt;/p&gt;

&lt;p&gt;It is a circular economy web app that helps users decide what to do with everyday damaged objects before discarding them. A user can describe an item, upload an image, and receive a structured recommendation powered by AI. The app then helps turn that recommendation into a concrete rescue action and stores the result in a persistent user history.&lt;/p&gt;

&lt;p&gt;Instead of treating sustainability as a distant global concept, I wanted to turn it into a local, personal decision support tool.&lt;/p&gt;

&lt;p&gt;Core actions supported by the app:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;create and analyze an item&lt;/li&gt;
&lt;li&gt;upload an image to enrich the evaluation&lt;/li&gt;
&lt;li&gt;receive a structured recommendation: &lt;strong&gt;repair, reuse, recycle, or discard&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;save rescue actions to a personal history&lt;/li&gt;
&lt;li&gt;earn badges for successful rescues&lt;/li&gt;
&lt;li&gt;record a symbolic badge memo on Solana Devnet&lt;/li&gt;
&lt;li&gt;use the interface in multiple languages&lt;/li&gt;
&lt;li&gt;explore a public demo flow without logging in&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1zjwhqecvh4s340hs18.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1zjwhqecvh4s340hs18.png" alt=" " width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Live App:&lt;/strong&gt; &lt;a href="https://before-it-becomes-trash.vercel.app/" rel="noopener noreferrer"&gt;before-it-becomes-trash.vercel.app&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Public Demo for Judges:&lt;/strong&gt; &lt;a href="https://before-it-becomes-trash.vercel.app/demo" rel="noopener noreferrer"&gt;before-it-becomes-trash.vercel.app/demo&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I also created a &lt;strong&gt;public &lt;code&gt;/demo&lt;/code&gt; route&lt;/strong&gt; so judges can review the experience without creating an account. It uses mock data, but preserves the same visual flow and core interaction structure as the main app.&lt;/p&gt;

&lt;p&gt;That was an intentional product decision: I wanted the judging experience to be quick and frictionless while still keeping the real authenticated app architecture in place.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d2ylcgfreupqac8ls56.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d2ylcgfreupqac8ls56.png" alt=" " width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnysslsn95a2bpd5kti9u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnysslsn95a2bpd5kti9u.png" alt=" " width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flrur61dk0dges2ppfzi0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flrur61dk0dges2ppfzi0.png" alt=" " width="800" height="310"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Code
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;GitHub Repository:&lt;/strong&gt; &lt;a href="https://github.com/brandon0405/Before-It-Becomes-Trash" rel="noopener noreferrer"&gt;github.com/brandon0405/Before-It-Becomes-Trash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The repository contains the full implementation, including the authenticated flow, demo mode, AI analysis pipeline, Supabase persistence, internationalization, and Solana memo integration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdbakjfhhbpt5gdtxktfv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdbakjfhhbpt5gdtxktfv.png" alt=" " width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  How I Built It
&lt;/h2&gt;

&lt;p&gt;I built the app with a full-stack approach using a modern TypeScript-based architecture.&lt;/p&gt;

&lt;h3&gt;
  
  
  Product approach
&lt;/h3&gt;

&lt;p&gt;My goal was not to build another generic “eco app.” I wanted to build a tool that answers one very practical question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;What should I do with this object before I throw it away?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That led to a product centered on &lt;strong&gt;decision support for damaged objects&lt;/strong&gt;, not just environmental awareness.&lt;/p&gt;

&lt;h3&gt;
  
  
  Technical architecture
&lt;/h3&gt;

&lt;p&gt;The app is built with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Next.js 14&lt;/strong&gt; with App Router&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;TypeScript&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Tailwind CSS&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Auth0&lt;/strong&gt; for authentication&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Supabase&lt;/strong&gt; for persistence&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Google Gemini&lt;/strong&gt; for structured multilingual analysis&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Solana Devnet&lt;/strong&gt; for symbolic badge memos&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;zod&lt;/strong&gt; for validation&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Database design
&lt;/h3&gt;

&lt;p&gt;I used Supabase to persist the main entities of the application:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;profiles&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;items&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;analyses&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;rescue_actions&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;badges&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;blockchain_records&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This separation helped keep the logic clear between:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the original object&lt;/li&gt;
&lt;li&gt;the AI-generated analysis&lt;/li&gt;
&lt;li&gt;the final rescue action&lt;/li&gt;
&lt;li&gt;the badge and symbolic blockchain record&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  AI analysis
&lt;/h3&gt;

&lt;p&gt;Gemini is used to generate structured recommendations rather than purely conversational output.&lt;/p&gt;

&lt;p&gt;That was important because the UI needs reliable categories and displayable fields, not just a paragraph of text. The model is used to help determine whether an object should be repaired, reused, recycled, or discarded, while also generating supporting reasoning.&lt;/p&gt;

&lt;p&gt;The app also supports multilingual analysis, which matches the multilingual UI.&lt;/p&gt;

&lt;h3&gt;
  
  
  Internationalization
&lt;/h3&gt;

&lt;p&gt;One part I especially wanted to push further was language support.&lt;/p&gt;

&lt;p&gt;Something that bothered me about other submissions I'd seen was that they were English-only. As a Spanish speaker, I felt compelled to internationalize the site so that as many people as possible could use it.&lt;/p&gt;

&lt;p&gt;The app includes a custom i18n implementation with &lt;strong&gt;7 languages&lt;/strong&gt;, plus &lt;strong&gt;RTL support for Arabic&lt;/strong&gt;. That meant thinking beyond translated strings and handling direction-aware layouts properly.&lt;/p&gt;
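Direction handling ultimately boils down to deriving the document direction from the active locale. A minimal sketch of that piece (the locale list here is trimmed to Arabic for brevity):

```typescript
// Direction-aware rendering: derive the document direction from the locale.
// Only Arabic is listed here for brevity; the app ships 7 languages.
const RTL_LOCALES = ["ar"];

function dirFor(locale: string): "rtl" | "ltr" {
  // Compare on the primary language subtag so regional variants
  // like "ar-EG" are also rendered right-to-left.
  const lang = locale.split("-")[0];
  return RTL_LOCALES.includes(lang) ? "rtl" : "ltr";
}
```

The returned value feeds the top-level dir attribute on the html element, and layout utilities key off it for direction-aware spacing.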

&lt;h3&gt;
  
  
  Demo mode
&lt;/h3&gt;

&lt;p&gt;Since full-stack challenge apps can be harder to evaluate when login is required, I added a public &lt;code&gt;/demo&lt;/code&gt; route with mock data and the same visual flow pattern as the authenticated experience.&lt;/p&gt;

&lt;p&gt;That gave me a better balance between:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a real app architecture&lt;/li&gt;
&lt;li&gt;and a judge-friendly review experience&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qrlmf91332p5re7ez0f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2qrlmf91332p5re7ez0f.png" alt=" " width="726" height="911"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Above: the form where the user enters object details and uploads an image.&lt;/p&gt;

&lt;h3&gt;
  
  
  Real user flow
&lt;/h3&gt;

&lt;p&gt;The authenticated flow looks like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The user signs in with Auth0&lt;/li&gt;
&lt;li&gt;The user creates a new item&lt;/li&gt;
&lt;li&gt;The user enters a description and optionally uploads an image&lt;/li&gt;
&lt;li&gt;The payload is validated with zod&lt;/li&gt;
&lt;li&gt;Gemini generates a structured evaluation&lt;/li&gt;
&lt;li&gt;The UI displays a recommendation&lt;/li&gt;
&lt;li&gt;The analysis is stored in Supabase&lt;/li&gt;
&lt;li&gt;The user saves a rescue action&lt;/li&gt;
&lt;li&gt;A badge can be awarded&lt;/li&gt;
&lt;li&gt;A symbolic memo is recorded on Solana Devnet&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fztcke5gc6srkgptrv6kw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fztcke5gc6srkgptrv6kw.png" alt=" " width="639" height="861"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To keep the blockchain layer honest and lightweight, I used Solana Devnet to record a symbolic memo tied to rescue badge events. It is not the core of the product, but a verifiable trace of positive circular actions.&lt;/p&gt;

&lt;p&gt;For judges, the &lt;code&gt;/demo&lt;/code&gt; route offers a no-login path through essentially the same product story.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bdcjf86nyawvr225mk5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bdcjf86nyawvr225mk5.png" alt=" " width="719" height="854"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Above: the recommendation card, with one of the four possible outcomes visible.&lt;/p&gt;

&lt;h3&gt;
  
  
  Technical challenges I ran into
&lt;/h3&gt;

&lt;h4&gt;
  
  
  1. Making AI output reliable enough for the UI
&lt;/h4&gt;

&lt;p&gt;A free-form AI answer is not ideal when the app needs structured fields.&lt;/p&gt;

&lt;p&gt;I solved that by designing the Gemini integration around structured output and validating the incoming data before rendering. This made the analysis much more stable and usable in the interface.&lt;/p&gt;
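Here's a minimal sketch of that validate-before-render step. The field names are illustrative, and while the real app validates with zod, a hand-rolled type guard keeps the sketch dependency-free:

```typescript
// Validate a raw AI response into the structured shape the UI expects.
// Field names are illustrative; the real app does this with zod schemas.
type Recommendation = "repair" | "reuse" | "recycle" | "discard";

interface Analysis {
  recommendation: Recommendation;
  reasoning: string;
}

const VALID_RECOMMENDATIONS = ["repair", "reuse", "recycle", "discard"];

function parseAnalysis(raw: unknown): Analysis | null {
  if (typeof raw !== "object" || raw === null) return null;
  const obj = raw as { recommendation?: unknown; reasoning?: unknown };
  if (typeof obj.reasoning !== "string") return null;
  if (
    typeof obj.recommendation !== "string" ||
    !VALID_RECOMMENDATIONS.includes(obj.recommendation)
  ) {
    return null; // unexpected category: reject instead of rendering garbage
  }
  return {
    recommendation: obj.recommendation as Recommendation,
    reasoning: obj.reasoning,
  };
}
```

A malformed model reply (say, a category outside the four allowed ones) yields `null` and triggers the fallback path instead of reaching the UI.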

&lt;h4&gt;
  
  
  2. Preventing duplicate saves
&lt;/h4&gt;

&lt;p&gt;Saving the same action twice is easy to do when users retry or click again after a delay.&lt;/p&gt;

&lt;p&gt;To avoid noisy records, I added duplicate prevention and explicit &lt;code&gt;409&lt;/code&gt; conflict handling when appropriate.&lt;/p&gt;
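The idea reduces to an idempotency check before the insert. This in-memory sketch stands in for the real check against the database, and the function name is hypothetical:

```typescript
// Duplicate-save guard: retrying the same save yields a 409-style conflict
// instead of a second row. In-memory stand-in for the real database check.
const savedKeys: string[] = [];

function saveRescueAction(userId: string, itemId: string): { status: number } {
  const key = userId + ":" + itemId;
  if (savedKeys.includes(key)) {
    return { status: 409 }; // conflict: this action was already saved
  }
  savedKeys.push(key);
  return { status: 201 }; // created
}
```

The first save succeeds; an impatient double-click or retry gets a clean conflict response the client can ignore.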

&lt;h4&gt;
  
  
  3. Handling AI failures gracefully
&lt;/h4&gt;

&lt;p&gt;I did not want the whole app to feel broken if the AI call failed.&lt;/p&gt;

&lt;p&gt;So I implemented a robust fallback path, allowing the application to degrade gracefully instead of collapsing the user experience.&lt;/p&gt;
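The pattern looks roughly like this — the fallback copy below is a hypothetical placeholder, not the app's real wording:

```typescript
// Degrade gracefully when the AI call fails, instead of breaking the page.
// The fallback content is a hypothetical placeholder for this sketch.
interface AnalysisResult {
  recommendation: string;
  reasoning: string;
}

async function analyzeWithFallback(
  callModel: () => Promise<AnalysisResult>
): Promise<AnalysisResult> {
  try {
    return await callModel();
  } catch {
    // Safe generic answer the UI can still render when the model is down.
    return {
      recommendation: "recycle",
      reasoning:
        "Automatic analysis is unavailable right now; please check local recycling guidelines.",
    };
  }
}
```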

&lt;h4&gt;
  
  
  4. Making the app usable in multiple languages
&lt;/h4&gt;

&lt;p&gt;Supporting seven languages, plus Arabic RTL, pushed the project beyond simple translation. Layout, spacing, and component behavior had to stay coherent across multiple language contexts.&lt;/p&gt;

&lt;h4&gt;
  
  
  5. Using Solana in a way that felt honest
&lt;/h4&gt;

&lt;p&gt;I did not want blockchain to feel forced.&lt;/p&gt;

&lt;p&gt;Instead of making it the center of the product, I used Solana Devnet to write a symbolic memo tied to rescue badges. That kept it lightweight, real, and aligned with the product idea.&lt;/p&gt;

&lt;h4&gt;
  
  
  6. Handling iPhone image uploads for AI analysis
&lt;/h4&gt;

&lt;p&gt;Another issue I ran into was image compatibility.&lt;/p&gt;

&lt;p&gt;Photos taken on iPhones are often uploaded in &lt;strong&gt;HEIC&lt;/strong&gt;, which is not always handled reliably by AI pipelines or downstream processing steps. Large image sizes were also increasing the chance of failed requests or unnecessary payload weight.&lt;/p&gt;

&lt;p&gt;To solve this, I added a client-side preprocessing step before sending images to the AI:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;automatically convert &lt;strong&gt;HEIC&lt;/strong&gt; images to &lt;strong&gt;JPEG&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;resize and compress images to reduce file size&lt;/li&gt;
&lt;li&gt;reject oversized files with a clear user-facing error message if they still exceed the limit&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This made the upload flow much more reliable and improved the overall experience, especially for mobile users.&lt;/p&gt;
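That decision logic can be sketched as a pure function. The 5 MB limit and the JPEG target are assumptions for illustration; the actual HEIC decode and re-encode happen client-side in the browser:

```typescript
// Decide what to do with an incoming photo before sending it for analysis.
// The 5 MB limit and the "4x uncompressible" cutoff are illustrative assumptions.
const MAX_UPLOAD_BYTES = 5 * 1024 * 1024;

type UploadPlan =
  | { action: "convert-to-jpeg" }        // iPhone HEIC/HEIF: re-encode first
  | { action: "resize-compress" }        // supported format but heavy: shrink it
  | { action: "reject"; reason: string } // too large even after compression
  | { action: "upload-as-is" };

function planUpload(mimeType: string, sizeBytes: number): UploadPlan {
  if (mimeType === "image/heic" || mimeType === "image/heif") {
    return { action: "convert-to-jpeg" };
  }
  // Simplification: treat anything several times over the limit as uncompressible.
  if (sizeBytes > MAX_UPLOAD_BYTES * 4) {
    return { action: "reject", reason: "File is too large even after compression." };
  }
  if (sizeBytes > MAX_UPLOAD_BYTES) {
    return { action: "resize-compress" };
  }
  return { action: "upload-as-is" };
}
```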

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vuyf3yybjqkfb67k5lu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vuyf3yybjqkfb67k5lu.png" alt=" " width="800" height="475"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftqoboeyw0ympvbqz8eql.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftqoboeyw0ympvbqz8eql.png" alt=" " width="800" height="244"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkr26nmnuc7w7uk19e1b4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkr26nmnuc7w7uk19e1b4.png" alt=" " width="800" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  What I learned
&lt;/h3&gt;

&lt;p&gt;This project taught me that sustainability tools become much more interesting when they help with &lt;strong&gt;real decisions&lt;/strong&gt;, not just awareness.&lt;/p&gt;

&lt;p&gt;It also reinforced that AI is much more useful in production-like experiences when constrained into a structured output format.&lt;/p&gt;

&lt;p&gt;And on the UX side, I learned that building a challenge project is not just about shipping features. It is also about making the project easy to understand and easy to review. The &lt;code&gt;/demo&lt;/code&gt; route ended up being one of the most important decisions in the whole build.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to run locally
&lt;/h3&gt;

&lt;h4&gt;
  
  
  1. Clone the repository
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/brandon0405/Before-It-Becomes-Trash.git
&lt;span class="nb"&gt;cd &lt;/span&gt;Before-It-Becomes-Trash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  2. Install dependencies
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  3. Create environment variables
&lt;/h4&gt;

&lt;p&gt;Create a &lt;code&gt;.env.local&lt;/code&gt; file with the required values:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;APP_BASE_URL=http://localhost:3000

AUTH0_SECRET=your_auth0_secret
AUTH0_BASE_URL=http://localhost:3000
AUTH0_ISSUER_BASE_URL=https://your-auth0-domain
AUTH0_CLIENT_ID=your_auth0_client_id
AUTH0_CLIENT_SECRET=your_auth0_client_secret
AUTH0_AUDIENCE=

GEMINI_API_KEY=your_gemini_api_key
GEMINI_MODEL=gemini-1.5-flash

SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key

SQLITE_BACKUP_ENABLED=true
SQLITE_BACKUP_PATH=./.data/local-backup.db

SOLANA_NETWORK=devnet
SOLANA_PRIVATE_KEY_JSON=[1,2,3]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h4&gt;
  
  
  4. Run the development server
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm run dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  5. Open it locally
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Main app: &lt;a href="http://localhost:3000" rel="noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Public demo: &lt;a href="http://localhost:3000/demo" rel="noopener noreferrer"&gt;http://localhost:3000/demo&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What I would improve next
&lt;/h2&gt;

&lt;p&gt;If I keep building this project, the next things I would work on are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;more material-aware recommendations&lt;/li&gt;
&lt;li&gt;better image-based classification&lt;/li&gt;
&lt;li&gt;location-aware recycling guidance&lt;/li&gt;
&lt;li&gt;richer explainability for recommendations&lt;/li&gt;
&lt;li&gt;stronger accessibility testing&lt;/li&gt;
&lt;li&gt;a deeper rescue scoring system&lt;/li&gt;
&lt;li&gt;community-contributed reuse ideas&lt;/li&gt;
&lt;li&gt;more meaningful onchain proof beyond symbolic memos&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prize Categories
&lt;/h2&gt;

&lt;p&gt;I am submitting this project for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Best Use of Google Gemini&lt;/li&gt;
&lt;li&gt;Best Use of Solana&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Gemini is used for structured, multilingual item analysis.&lt;br&gt;
Solana Devnet is used to record symbolic badge memos tied to rescue actions.&lt;/p&gt;
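
&lt;p&gt;For a sense of what a symbolic badge memo could contain, here is a hedged sketch of composing the memo text before it would be attached to a Memo Program instruction. The &lt;code&gt;bibt|v1&lt;/code&gt; prefix and the field layout are assumptions for illustration, not the project's actual on-chain format:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;// Hypothetical memo layout; the real app's format may differ.
function buildRescueMemo({ itemId, action, timestamp }) {
  const allowed = ["repaired", "reused", "recycled"];
  if (!allowed.includes(action)) {
    throw new Error(`unsupported rescue action: ${action}`);
  }
  // Memo instructions carry raw bytes on-chain, so a compact
  // pipe-delimited string is cheaper than a JSON blob.
  return `bibt|v1|${itemId}|${action}|${timestamp}`;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;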

&lt;h2&gt;
  
  
  Closing
&lt;/h2&gt;

&lt;p&gt;For Earth Day, I did not want to build something that only talks about the planet in abstract terms.&lt;/p&gt;

&lt;p&gt;I wanted to build something for the moment when a person is holding a real object and asking:&lt;/p&gt;

&lt;p&gt;Should this be repaired, reused, recycled, or thrown away?&lt;/p&gt;

&lt;p&gt;That is the space Before It Becomes Trash is trying to support.&lt;/p&gt;

&lt;p&gt;If you try the demo, I would genuinely love feedback on the product and the implementation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Which object would you test first?&lt;/li&gt;
&lt;li&gt;Would you trust the recommendation?&lt;/li&gt;
&lt;li&gt;What would make the rescue decision feel more useful?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Thanks for reading.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>weekendchallenge</category>
    </item>
  </channel>
</rss>
