Kai Chew

GheiaGrid: Reimagining Decentralized Urban Farming & Carbon Mining

DEV Weekend Challenge: Earth Day

This is a submission for Weekend Challenge: Earth Day Edition

What I Built

GheiaGrid is a decentralized infrastructure that transforms ordinary urban balconies and rooftops into "Autonomous Carbon Sinks."


Inspired by the urgent need for planetary action this Earth Day and by our earlier project, Decentralized Farming IoT - Arduino to Dashboard, GheiaGrid reimagines our relationship with the environment by turning passive consumers into active, decentralized carbon sequesterers. Think of it as "Mining Carbon." You aren't just farming; you are contributing to a verifiable, global cooling network.

GheiaGrid is a full-stack Next.js platform designed to securely ingest, store, and verify IoT sensor data (like soil moisture and CO2 levels) from distributed urban farming nodes while providing AI-driven diagnostics for plant health. To achieve this, we architected a highly scalable "Trust Chain": Sensor Identity -> Data Telemetry -> Intelligence -> Blockchain Reward.

Demo

Live Demo: https://gheia-750841821481.us-central1.run.app/

Code

🌍 GheiaGrid

License Next.js TypeScript Solana Gemini Framer Motion

A decentralized, zero-trust bio-economic grid for urban farming.

GheiaGrid is an IoT-driven, AI-powered platform designed to securely ingest distributed urban farming node data, archive it immutably, and reward ecological action. The platform bridges the gap between biological systems, machine-to-machine agents, and the blockchain using enterprise-grade zero-trust principles.


⚡ Architecture & Tech Stack

This project features a highly decoupled, modern tech stack designed for security, scale, and interactivity:

  • Frontend: Next.js 15 (App Router), React 19, Tailwind CSS v4, and Framer Motion for a highly reactive, physics-based UI.
  • Theme: "Sleek Interface" (Cyberpunk-meets-nature aesthetic, Inter & JetBrains Mono typography).
  • Identity & Access (Zero-Trust): Secures the grid using Auth0 Machine-to-Machine (M2M) authentication. The backend dynamically downloads the Auth0 JWKS to mathematically verify the RSA signature and enforces the write:sensor_data scope before accepting the payload.
  • Data Lake (Snowflake): Built to handle massive streams of continuous telemetry (moisture, ambient CO2, temperature) via the Snowflake SDK…

How I Built It

GheiaGrid was built by layering scalable, enterprise-grade technologies to create a zero-trust, bio-economic grid. GitHub Copilot (via VS Code) acted as our primary AI pair-programmer, drastically accelerating the architectural scaffolding and helping us generate the boilerplate for connecting these complex services together.

*(Image: auth0-dev-github-copilot)*

*(Image: snowflake dev copilot)*

Here is a breakdown of our technical approach and the interesting decisions we made:

🔒 Phase 1: Zero-Trust IoT Security (Auth0 for Agents)

The Goal: Prevent malicious actors from flooding the database with spoofed sensor data to falsely claim carbon rewards.

  • Machine-to-Machine (M2M) Auth: Because physical edge sensors cannot manually log in via a browser, we built a standalone mock-sensor.mjs script that uses a client_credentials grant to securely fetch a JWT from Auth0.
  • Cryptographic Verification: When the Next.js API receives a telemetry payload, it dynamically downloads the Auth0 JSON Web Key Set (JWKS) to mathematically verify the RSA signature of the token.
  • Scope Enforcement: We strictly enforce the write:sensor_data scope. If a compromised internal service pings the endpoint without that exact permission, it is instantly rejected with a 403 Forbidden.
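The scope-enforcement step described above can be sketched as a small guard function. This is a minimal illustration, not the actual handler: it assumes the JWT signature has already been verified against the Auth0 JWKS (e.g. with a library such as jose), and that `claims` is the decoded token payload. The function name and response shape are illustrative.

```javascript
// Illustrative scope-enforcement guard. Assumes `claims` is the payload of a
// JWT whose RSA signature was already verified against the Auth0 JWKS.
function authorizeTelemetry(claims) {
  // Auth0 delivers granted scopes as a single space-separated string.
  const scopes = (claims.scope || "").split(" ");
  if (!scopes.includes("write:sensor_data")) {
    // A compromised service without the exact permission is rejected outright.
    return { status: 403, body: "Forbidden: missing write:sensor_data scope" };
  }
  return { status: 200, body: "OK" };
}
```

The key design point is that possession of a valid token is not enough: the token must also carry the exact write:sensor_data scope, so a stolen credential from another service in the grid still cannot inject telemetry.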

*(Image: auth0 agent)*

❄️ Phase 2: The Data Lake (Snowflake)

The Goal: Permanently and reliably store incoming telemetry data without brittle database schemas crashing every time a new sensor type is added.

  • VARIANT Data Types: We utilized Snowflake's native PARSE_JSON() function. Instead of making individual rigid columns for temperature, moisture, etc., we drop the entire JSON payload into a flexible VARIANT column, future-proofing our data ingestion.
  • Edge-Optimized Ingestion: Using the snowflake-sdk in our API, the data is handed off to Snowflake as a "fire-and-forget" Promise (errors handled via .catch() rather than awaited). This allows the API to return a 200 OK to the IoT device instantly, saving precious battery life on the edge sensor while the heavy database insert happens in the background.
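The fire-and-forget handoff looks roughly like the sketch below. It is a simplified stand-in: the real code calls snowflake-sdk's connection methods, whereas here the insert is abstracted into an injected `insertFn`, and the function and field names are illustrative.

```javascript
// Illustrative "fire-and-forget" ingestion: respond to the sensor immediately,
// let the heavy Snowflake insert settle in the background.
function ingestTelemetry(payload, insertFn) {
  // Kick off the insert but deliberately do NOT await it.
  insertFn(payload).catch((err) => {
    // Log and move on; the sensor already received its 200 OK.
    console.error("Snowflake insert failed:", err);
  });
  // Return instantly so the edge device can power down its radio.
  return { status: 200, body: "accepted" };
}
```

The trade-off is at-most-once delivery on failure: a dropped insert is logged but not retried, which is acceptable for high-frequency telemetry where the next reading arrives minutes later.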

*(Image: snowflake sql query)*

🤖 Phase 3: Multimodal AI Diagnostics (Google Gemini)

The Goal: Allow urban farmers to upload photos of sick plants and receive immediate, telemetry-aware diagnostics.

  • Vision & Context: We integrated the Gemini API to act as the visual engine powering plant diagnostics. By passing Base64 encoded image strings securely to the model, Gemini can identify issues like chlorosis or nutrient deficiencies.
  • Custom Markdown Parsing: To make the AI output actionable, we built a custom regex-based parser that takes raw string streams from Gemini and automatically formats them into a beautiful, color-coded UI (Warnings in red, Health checks in green).
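A stripped-down version of the regex-based classifier might look like this. The category names, keyword patterns, and function name are illustrative assumptions; the actual parser and its rules may differ.

```javascript
// Illustrative regex-based classifier: tag each line of the Gemini response so
// the UI can render warnings in red and healthy findings in green.
function classifyDiagnostics(raw) {
  return raw
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => {
      if (/warning|deficien|chlorosis|disease/i.test(line)) {
        return { level: "warning", text: line.trim() }; // rendered in red
      }
      if (/healthy|normal|good/i.test(line)) {
        return { level: "health", text: line.trim() }; // rendered in green
      }
      return { level: "info", text: line.trim() };
    });
}
```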

*(Image: gemini ai diagnostics)*

🔗 Phase 4: Blockchain Provenance (Solana)

The Goal: Prove mathematically to auditors or carbon-credit buyers that our stored ecological data was never tampered with.

  • Ledger Strategy: Rather than paying high gas fees to store full JSON payloads on-chain, we use Node's native crypto library to generate a SHA-256 hash of the raw telemetry text payload.
  • Web3 Integration: Using @solana/web3.js, we store only the data hash on the Solana Devnet. If anyone maliciously alters the telemetry data sitting in Snowflake, the hashes will no longer match the blockchain, instantly exposing the tamper.

*(Image: solana dev copilot)*

🎨 Phase 5: The Reactive UI

The Goal: Make the dashboard feel "alive" and directly tethered to the backend pipelines.

  • We utilized Framer Motion for a staggered, physics-based UI that boots up sequentially, simulating a tactical HUD.
  • The frontend silently polls the local in-memory store for new data. When a payload is successfully verified through the pipeline, the UI detects the new timestamp, updates the node grid with live moisture data, and pulses the analytics counters in real-time.
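The freshness check behind the polling loop can be sketched as a pure comparison against the last rendered timestamp. This is a simplified illustration (names and shapes are assumptions, not the actual store code):

```javascript
// Illustrative freshness check: the dashboard remembers the last timestamp it
// rendered and only re-animates when a strictly newer reading arrives.
function detectFreshReading(lastSeenTs, latestReading) {
  if (!latestReading) {
    return { fresh: false, lastSeenTs };
  }
  if (lastSeenTs !== null && latestReading.ts <= lastSeenTs) {
    // Same payload as last poll: no pulse, no re-render.
    return { fresh: false, lastSeenTs };
  }
  // New timestamp detected: update the node grid and pulse the counters.
  return { fresh: true, lastSeenTs: latestReading.ts };
}
```

Keeping this check pure makes the polling loop trivial to test and means the UI animations fire exactly once per verified payload, not once per poll.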

*(Image: cloud run deploy)*

*(Image: cloud run env)*

Prize Categories

  • Overall Earth Day Winner: Most Earth Day projects are trackers or calculators. Ours is active infrastructure. By framing urban balconies as "Autonomous Carbon Sinks," we’ve turned a hobby (gardening) into a verifiable environmental service.
  • Best Use of Auth0 for Agents: Our use of JWKS (JSON Web Key Sets) to verify RSA signatures on the fly shows a deep understanding of security. We’re treating our sensors as first-class "agents," which is exactly what this category looks for.
  • Best Use of Google Gemini: We aren't just using Gemini as a text generator; we're using it as a Computer Vision expert for plant diagnostics.
  • Best Use of Snowflake: Our "fire-and-forget" ingestion strategy is a brilliant nod to real-world engineering—minimizing the time an IoT device spends with its radio on to preserve battery life. It's also a system that won't break when a new sensor type (like pH or light intensity) is added tomorrow.
  • Best Use of GitHub Copilot: Copilot is most effective when handling the "glue code" and boilerplate for complex SDKs. We successfully integrated five major, disparate enterprise APIs (Auth0, Snowflake, Solana, Gemini, Next.js) in a single weekend.
  • Best Use of Solana: Storing SHA-256 hashes on the Devnet ledger creates a mathematically immutable audit trail. This makes our "LeafTokens" or carbon credits credible to third-party auditors, which is a sophisticated use case for Web3.

Team Submissions: @kheai @yeemun122
