<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mustapha Adekunle</title>
    <description>The latest articles on DEV Community by Mustapha Adekunle (@engrkrooozy).</description>
    <link>https://dev.to/engrkrooozy</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3703169%2F51f56cdd-2d21-4d15-8581-e651497ef29c.jpg</url>
      <title>DEV Community: Mustapha Adekunle</title>
      <link>https://dev.to/engrkrooozy</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/engrkrooozy"/>
    <language>en</language>
    <item>
      <title>I Built a Real-Time AI Legal Assistant That Reads Your Documents and Talks to You — Here's How</title>
      <dc:creator>Mustapha Adekunle</dc:creator>
      <pubDate>Mon, 16 Mar 2026 17:05:38 +0000</pubDate>
      <link>https://dev.to/engrkrooozy/i-built-a-real-time-ai-legal-assistant-that-reads-your-documents-and-talks-to-you-heres-how-28i</link>
      <guid>https://dev.to/engrkrooozy/i-built-a-real-time-ai-legal-assistant-that-reads-your-documents-and-talks-to-you-heres-how-28i</guid>
      <description>&lt;p&gt;&lt;em&gt;Built for the Google Gemini Live Agent Challenge. This article covers what it does, the architecture behind it, and how to run it yourself.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Imagine receiving a letter from your landlord. The words "NOTICE TO QUIT" are printed in bold at the top. Below that: dense legal text, references to statutes you've never heard of, a deadline buried somewhere in the third paragraph.&lt;/p&gt;

&lt;p&gt;You don't know if you have three days or thirty. You don't know if this is serious or routine. You don't know your rights.&lt;/p&gt;

&lt;p&gt;For 60 million Americans facing civil legal crises every year — evictions, debt collection, court summons, insurance denials — this is a real, daily experience. And most of them face it alone, because a lawyer costs $400 an hour and legal aid organizations have years-long waitlists.&lt;/p&gt;

&lt;p&gt;That's the problem I set out to address with &lt;strong&gt;ClearRight&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Is ClearRight?
&lt;/h2&gt;

&lt;p&gt;ClearRight is a real-time voice AI assistant that helps people understand their legal documents and know their rights. You upload a document, an AI reads it and gives you an instant briefing, and then you talk — out loud, naturally, conversationally — with Clara, a legal information assistant powered by Google's Gemini Live API.&lt;/p&gt;

&lt;p&gt;No typing. No waiting. No cost.&lt;/p&gt;

&lt;p&gt;The experience is designed to feel like having a knowledgeable, calm friend walk you through a scary letter — not like querying a legal database.&lt;/p&gt;

&lt;p&gt;Here's what happens when you use it:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1 — Upload your document.&lt;/strong&gt; You drop in a PDF or photo of any legal document. Within a few seconds, the AI reads it using Gemini's multimodal vision and surfaces a structured analysis: what kind of document it is, whether it's high, medium, or low risk, the two most important things you need to know, and four specific questions you should be asking.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2 — Read the briefing.&lt;/strong&gt; Before you say a word, you already know what you're dealing with. The right panel shows you the document type ("Pay or Quit Notice"), the risk level (a red "High Risk" badge), key points ("You have 3 days to pay $1,200 or vacate the property"), and tappable question chips. Tap any chip and it's sent to Clara instantly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3 — Talk to Clara.&lt;/strong&gt; Start a voice session and speak naturally. Clara hears you, responds with audio, and can be interrupted mid-sentence. She uses Google Search in real time to ground her answers in current law. She tells you your rights, explains what the document is asking you to do, and always ends by pointing you to free legal aid resources — because she provides legal &lt;em&gt;information&lt;/em&gt;, not legal &lt;em&gt;advice&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Technical Architecture
&lt;/h2&gt;

&lt;p&gt;ClearRight has two independently deployed services on Google Cloud Run: a Python FastAPI backend and a Next.js frontend. Let me walk through how each piece works.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvbipqctmwhl7z5gw5x6x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvbipqctmwhl7z5gw5x6x.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Document Processing: Two Gemini Calls in One Upload
&lt;/h3&gt;

&lt;p&gt;When you upload a document, the backend makes two sequential calls to &lt;code&gt;gemini-2.5-flash&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The first call passes the raw file bytes directly to the model using &lt;code&gt;Part.from_bytes()&lt;/code&gt; with the file's MIME type. Gemini's multimodal vision reads the document — PDF, JPEG, PNG, HEIC, whatever — and returns the full extracted text. This is more reliable than traditional OCR because it understands document structure, handles handwriting, and correctly interprets scanned pages.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;models&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;generate_content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gemini-2.5-flash&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;contents&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Part&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_bytes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;file_bytes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;mime_type&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content_type&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Please extract and transcribe ALL text content from this document...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;],&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;extracted_text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The second call takes that extracted text and asks the model to produce a structured JSON analysis:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;analysis_prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Based on the document above, return ONLY a valid JSON object:
{
  &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;doc_type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;type in 2-4 words (e.g. Lease Agreement)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;,
  &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;risk_level&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;high or medium or low&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;,
  &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;key_points&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;: [&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;key point 1&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;, &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;key point 2&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;],
  &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;suggested_questions&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;: [&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;question 1?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;, &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;question 2?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;, &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;question 3?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;, &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;question 4?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;]
}&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The analysis step is fault-tolerant — if JSON parsing fails, the upload still succeeds and the document is still available for Clara to read. The UI just won't show the analysis card. This matters in production: you never want a secondary feature to break the primary flow.&lt;/p&gt;
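&lt;p&gt;A minimal sketch of that fallback pattern, with hypothetical helper and field names (the required fields follow the JSON schema above):&lt;/p&gt;

```python
import json

def parse_analysis(raw: str):
    """Parse the model's JSON analysis; return None instead of raising.

    Illustrative sketch: when this returns None, the upload response still
    carries the extracted text, and the UI simply omits the analysis card.
    """
    # Models sometimes wrap JSON in markdown fences; strip them first.
    cleaned = raw.strip().strip("`")
    cleaned = cleaned.removeprefix("json").strip()
    try:
        analysis = json.loads(cleaned)
    except json.JSONDecodeError:
        return None  # secondary feature fails quietly
    # Require the fields the analysis card depends on.
    if not all(k in analysis for k in ("doc_type", "risk_level", "key_points")):
        return None
    return analysis
```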

&lt;h3&gt;
  
  
  The Voice Agent: Google ADK + Gemini Live API
&lt;/h3&gt;

&lt;p&gt;The live conversation is powered by Google's Agent Development Kit (ADK) and the Gemini Live API. Here's the setup:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;root_agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;clara_legal_assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gemini-2.5-flash-native-audio-latest&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Clara — a compassionate legal information assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;instruction&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;AGENT_INSTRUCTION&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;google_search&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The model &lt;code&gt;gemini-2.5-flash-native-audio-latest&lt;/code&gt; is a &lt;em&gt;native audio&lt;/em&gt; model — it processes and generates audio directly, without converting speech to text and back. This is what makes the conversation feel natural: there's no robotic cadence, no processing pauses between speech segments. Clara sounds like a person.&lt;/p&gt;

&lt;p&gt;When a WebSocket connection opens, the backend starts an ADK &lt;code&gt;InMemoryRunner&lt;/code&gt; session and configures it for bidirectional streaming:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;run_config&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;RunConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;streaming_mode&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;bidi&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;response_modalities&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Modality&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;AUDIO&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="n"&gt;realtime_input_config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;RealtimeInputConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;automatic_activity_detection&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;AutomaticActivityDetection&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;start_of_speech_sensitivity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;StartSensitivity&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;START_SENSITIVITY_HIGH&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;end_of_speech_sensitivity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;EndSensitivity&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;END_SENSITIVITY_HIGH&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;prefix_padding_ms&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;silence_duration_ms&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="n"&gt;speech_config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;SpeechConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;voice_config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;VoiceConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;prebuilt_voice_config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;PrebuiltVoiceConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;voice_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Aoede&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The VAD (Voice Activity Detection) settings are tuned for low latency: 200ms silence duration means Clara starts responding quickly after you stop talking, and high sensitivity means she catches soft speech.&lt;/p&gt;

&lt;p&gt;If a document was uploaded, it's injected into the session before any user interaction:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;live_request_queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send_content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;Content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;parts&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nc"&gt;Part&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;[DOCUMENT UPLOADED BY USER]&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;document_context&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)],&lt;/span&gt;
&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;live_request_queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send_content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;Content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;parts&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nc"&gt;Part&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Please greet me briefly, confirm you&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ve read my document, &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
                     &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;and tell me the single most important thing I should know about it.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)],&lt;/span&gt;
&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Clara's first words when you connect are always relevant to your document. She's already read it.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Audio Pipeline: AudioWorklet in the Browser
&lt;/h3&gt;

&lt;p&gt;Getting real-time audio in and out of a browser is more involved than it sounds. ClearRight uses two custom &lt;code&gt;AudioWorkletProcessor&lt;/code&gt; instances running on dedicated audio threads.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Recording (16kHz):&lt;/strong&gt; The recorder worklet captures microphone input, resamples to 16kHz PCM, and sends chunks to the main thread roughly every 100ms. It also runs a simple energy-based VAD that detects speech start and end events, which the UI uses to drive the "you're speaking" indicator and flush the playback buffer when you interrupt.&lt;/p&gt;
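&lt;p&gt;The recorder worklet itself is TypeScript, but the energy-based detection idea is compact enough to sketch in Python. The threshold and hangover values below are illustrative, not ClearRight's actual tuning:&lt;/p&gt;

```python
# Sketch of an energy-based VAD over ~100ms frames of 16kHz PCM samples.

def rms_energy(frame):
    """Root-mean-square energy of a frame of float samples in [-1, 1]."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

class EnergyVAD:
    def __init__(self, threshold=0.02, hangover_frames=5):
        self.threshold = threshold        # RMS above this counts as speech
        self.hangover = hangover_frames   # quiet frames tolerated before "end"
        self.quiet_run = 0
        self.speaking = False

    def process(self, frame):
        """Return 'start', 'end', or None for each incoming frame."""
        if rms_energy(frame) > self.threshold:
            self.quiet_run = 0
            if not self.speaking:
                self.speaking = True
                return "start"   # UI: show the speaking indicator, flush playback
        elif self.speaking:
            self.quiet_run += 1
            if self.quiet_run > self.hangover:
                self.speaking = False
                self.quiet_run = 0
                return "end"
        return None
```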

&lt;p&gt;&lt;strong&gt;Playback (24kHz):&lt;/strong&gt; The player worklet receives 24kHz PCM chunks from the backend via WebSocket and plays them in order. When the user starts speaking (detected by the recorder), the player's buffer is flushed immediately — this is what makes interruption work. Clara stops mid-sentence, the buffer clears, and the session moves forward.&lt;/p&gt;
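&lt;p&gt;That ordering-plus-flush behavior is the core of the interruption mechanic. Here's a toy Python model of the queue; the real player is an &lt;code&gt;AudioWorkletProcessor&lt;/code&gt;, and the silence frame size here is a placeholder:&lt;/p&gt;

```python
from collections import deque

class PlaybackBuffer:
    """Toy model: 24kHz PCM chunks play in arrival order; a barge-in
    (speech detected on the recorder side) flushes everything queued."""

    def __init__(self):
        self.chunks = deque()

    def enqueue(self, pcm_chunk):
        self.chunks.append(pcm_chunk)

    def next_chunk(self):
        """Called by the render loop; returns silence when nothing is queued."""
        if self.chunks:
            return self.chunks.popleft()
        return b"\x00" * 960  # placeholder silence frame

    def flush(self):
        """User started speaking: drop pending audio so playback stops."""
        self.chunks.clear()
```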

&lt;p&gt;One important implementation detail: AudioContexts should be &lt;em&gt;suspended&lt;/em&gt;, not &lt;em&gt;closed&lt;/em&gt;, between sessions. Chrome does not allow re-registering an AudioWorklet module on a context that has been previously used. Suspending the context keeps the worklet thread alive for reuse.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// On disconnect — suspend, don't close&lt;/span&gt;
&lt;span class="nx"&gt;audioRecorderContextRef&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;suspend&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;
&lt;span class="nx"&gt;audioPlayerContextRef&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;suspend&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;

&lt;span class="c1"&gt;// On reconnect — resume the existing context&lt;/span&gt;
&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;state&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;suspended&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;resume&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Clara's Persona
&lt;/h3&gt;

&lt;p&gt;Clara is not just a legal search engine. The system prompt defines a careful persona:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Warm and reassuring.&lt;/strong&gt; The people talking to Clara are scared. They received a threatening letter and don't know what it means. Clara's tone is the same as a trusted, knowledgeable friend — not a lawyer billing by the hour.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Plain English always.&lt;/strong&gt; No legal jargon without immediate explanation. If Clara says "FDCPA," she immediately follows with what that means in plain language.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Information, not advice.&lt;/strong&gt; Clara explains what documents say, what rights exist under the law, what deadlines apply. She does not tell you what to do in your specific situation — that requires a licensed attorney. Every substantive response ends with a pointer to free legal aid resources.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Grounded answers.&lt;/strong&gt; Clara never guesses at specific deadlines or statute numbers. She uses &lt;code&gt;google_search&lt;/code&gt; when she needs current information, state-specific rules, or local legal aid contacts.&lt;/li&gt;
&lt;/ul&gt;
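&lt;p&gt;To make this concrete, here is a condensed, hypothetical sketch of how rules like these can be written into a system instruction. It is not ClearRight's actual &lt;code&gt;AGENT_INSTRUCTION&lt;/code&gt;:&lt;/p&gt;

```python
# Hypothetical sketch only -- not ClearRight's real system prompt.
AGENT_INSTRUCTION_SKETCH = """You are Clara, a warm, calm legal information assistant.

Rules:
1. Plain English only. If you must use a legal term (e.g. "FDCPA"),
   explain it immediately in everyday words.
2. Provide legal INFORMATION (what a document says, what rights and
   deadlines exist under the law), never legal ADVICE about what this
   specific person should do.
3. Never guess at deadlines or statute numbers. Use google_search for
   anything current, state-specific, or local.
4. End every substantive answer by pointing to free legal aid resources.
"""
```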




&lt;h2&gt;
  
  
  Running ClearRight Yourself
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.11+&lt;/li&gt;
&lt;li&gt;Node.js 18+&lt;/li&gt;
&lt;li&gt;A free Google AI Studio API key from &lt;a href="https://aistudio.google.com" rel="noopener noreferrer"&gt;aistudio.google.com&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Chrome or Edge (required for AudioWorklet support)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  1. Clone and configure
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/engr-krooozy/clearright.git
&lt;span class="nb"&gt;cd &lt;/span&gt;clearright

&lt;span class="nb"&gt;cp&lt;/span&gt; .env.example server/.env
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Edit &lt;code&gt;server/.env&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GOOGLE_API_KEY=your_api_key_here
APP_NAME=clearright
AGENT_VOICE=Aoede
AGENT_LANGUAGE=en-US
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Start everything
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;./run_local.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This starts the FastAPI backend on port 8000 and the Next.js frontend on port 3000. Open &lt;code&gt;http://localhost:3000&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Try it
&lt;/h3&gt;

&lt;p&gt;Upload any PDF — a lease, a letter from a debt collector, an employment contract. Watch the analysis card appear. Then click "Talk to Clara" and ask anything.&lt;/p&gt;

&lt;p&gt;If you don't have a legal document handy, try uploading the terms of service from any app. The analysis will still work and Clara can walk you through what you're actually agreeing to.&lt;/p&gt;

&lt;p&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkto084wdoq916z5bjtm1.png" alt=" "&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Things I Learned Building This
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Voice-first UX is genuinely different from chat-first UX.&lt;/strong&gt; The initial instinct was to show a conversation transcript. But the native audio model doesn't surface text through the ADK event stream — it outputs audio only. That constraint forced a better design decision: front-load the useful information (the analysis card) and let the voice feel like voice, not a chat interface with audio bolted on.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multimodal document processing is remarkably capable.&lt;/strong&gt; Gemini 2.5 Flash handles handwritten notes, photographs of physical documents, and scanned PDFs with high fidelity. I tested it with water-damaged lease printouts, photos of letters taken on a phone, and PDFs with multi-column legal formatting. The extraction quality was consistently better than expected.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The ADK does the hard orchestration work.&lt;/strong&gt; Session management, tool call routing, event streaming, and the live connection lifecycle are all handled by the ADK. What you write is the agent definition, the system prompt, and the application logic around it. That said, understanding what events actually flow through &lt;code&gt;run_live()&lt;/code&gt; for a native audio model required careful debugging — always add server-side event logging during development.&lt;/p&gt;
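&lt;p&gt;That logging habit is cheap to adopt. A generic sketch, assuming nothing about ADK's event objects beyond their arriving as an async stream (real events carry richer types and payloads worth logging too):&lt;/p&gt;

```python
import asyncio

async def log_events(event_stream, log=print):
    """Log each event's type as it flows through any async event stream.

    Generic debugging shim: event_stream is any async iterable, so this
    works with a live session loop or a fake stream in tests.
    """
    seen = []
    async for event in event_stream:
        log(f"[live] {type(event).__name__}")
        seen.append(event)
    return seen
```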

&lt;p&gt;&lt;strong&gt;Guardrails are a product decision, not just a legal disclaimer.&lt;/strong&gt; The distinction between legal information and legal advice shapes every aspect of Clara's persona. Defining that boundary clearly in the system prompt, and testing that the model respects it across a range of scenarios, was as important as any technical implementation detail.&lt;/p&gt;




&lt;h2&gt;
  
  
  What's Next
&lt;/h2&gt;

&lt;p&gt;The foundation is solid. A few natural extensions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Persistent sessions&lt;/strong&gt; — Return to a conversation about your document across multiple days&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Document comparison&lt;/strong&gt; — "Here's my old lease and my new one — what changed?"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;State-specific legal aid routing&lt;/strong&gt; — Automatically surface the right legal aid organization based on the document type and detected jurisdiction&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spanish and other languages&lt;/strong&gt; — Legal document complexity is compounded by language barriers; this is an obvious high-impact next step&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Try It
&lt;/h2&gt;

&lt;p&gt;The live version is deployed at: &lt;strong&gt;&lt;a href="https://clearright-ui-q63ub5ulzq-uc.a.run.app" rel="noopener noreferrer"&gt;https://clearright-ui-q63ub5ulzq-uc.a.run.app&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The full source code is on GitHub: &lt;strong&gt;&lt;a href="https://github.com/engr-krooozy/clearright" rel="noopener noreferrer"&gt;https://github.com/engr-krooozy/clearright&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This project was built for the &lt;a href="https://geminiliveagentchallenge.devpost.com/" rel="noopener noreferrer"&gt;Gemini Live Agent Challenge&lt;/a&gt; hosted by Google.&lt;/p&gt;

&lt;p&gt;If you're building something with the Gemini Live API or ADK, I hope this walkthrough saves you some debugging time. The audio pipeline details in particular took a while to get right — happy to answer questions.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;ClearRight provides general legal information only — not legal advice. For guidance specific to your situation, consult a licensed attorney or contact your local legal aid organization.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;#GeminiLiveAgentChallenge #GoogleAI #ADK #GeminiLiveAPI #LegalTech&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Deploy Enterprise Applications on the Cloud in Minutes, Not Months</title>
      <dc:creator>Mustapha Adekunle</dc:creator>
      <pubDate>Sat, 14 Feb 2026 19:58:59 +0000</pubDate>
      <link>https://dev.to/engrkrooozy/deploy-enterprise-applications-on-the-cloud-in-minutes-not-months-4h47</link>
      <guid>https://dev.to/engrkrooozy/deploy-enterprise-applications-on-the-cloud-in-minutes-not-months-4h47</guid>
      <description>&lt;p&gt;⚡ &lt;strong&gt;Deploy Enterprise Applications on the Cloud in Minutes, Not Months&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Tired of wrestling with cloud infrastructure just to get an application running?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;RAD (Rapid Application Deployment)&lt;/strong&gt; is an open-source framework that lets you deploy production-ready applications on the Cloud with minimal configuration — &lt;strong&gt;no deep Terraform knowledge required.&lt;/strong&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  📦 Choose From 16+ Ready-to-Deploy Applications
&lt;/h1&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Application&lt;/th&gt;
&lt;th&gt;What It Does&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;🏭 &lt;strong&gt;Odoo&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;All-in-one ERP and business management suite&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🎓 &lt;strong&gt;Moodle&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Learning management system for online courses&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📝 &lt;strong&gt;WordPress&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;CMS for websites and blogs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🤖 &lt;strong&gt;N8N&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Workflow automation that connects apps and APIs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🏥 &lt;strong&gt;OpenEMR&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Electronic medical records and practice management&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🐍 &lt;strong&gt;Django&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Python web framework for custom applications&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📰 &lt;strong&gt;Ghost&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Publishing and newsletter platform&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🧩 &lt;strong&gt;Strapi / Directus&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Headless CMS and data platform&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🛒 &lt;strong&gt;Medusa&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Headless e-commerce platform&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📚 &lt;strong&gt;Wiki.js&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Wiki and documentation platform&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h1&gt;
  
  
  ✅ What You Get Out of the Box
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;🗄️ Managed PostgreSQL or MySQL databases with automated backups&lt;/li&gt;
&lt;li&gt;🔐 Private networking and Secret Manager integration — &lt;strong&gt;security by default&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;📊 Auto-scaling and &lt;strong&gt;scale-to-zero&lt;/strong&gt; — pay only for what you use&lt;/li&gt;
&lt;li&gt;💓 Health checks, monitoring dashboards, and alerting&lt;/li&gt;
&lt;li&gt;💾 NFS and Cloud Storage for persistent file storage&lt;/li&gt;
&lt;li&gt;🔒 SSL/TLS encryption throughout&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  🙋 Who Is This For?
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;👨‍💻 Developers who want production infrastructure without the ops overhead&lt;/li&gt;
&lt;li&gt;👥 Small teams deploying internal tools, learning platforms, or client projects&lt;/li&gt;
&lt;li&gt;💼 Freelancers and consultants delivering cloud solutions to clients&lt;/li&gt;
&lt;li&gt;🙌 Anyone who has tried deploying these applications manually and wished there was a better way&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 RAD handles the infrastructure plumbing — VPC networking, IAM, database provisioning, container orchestration — so you can focus on &lt;strong&gt;using&lt;/strong&gt; the application, not fighting the cloud.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://rad.techequity.cloud/?ref=TIKS57NPPU0D" rel="noopener noreferrer"&gt;Get started and deploy your first application today&lt;/a&gt;.&lt;/strong&gt; 👇&lt;/p&gt;

&lt;p&gt;Watch a short &lt;a href="https://storage.googleapis.com/rad-public-2b65/demos/15-user-deploy-application_module.mp4" rel="noopener noreferrer"&gt;sample video&lt;/a&gt; to see how simple deployment is.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;#GoogleCloud&lt;/code&gt; &lt;code&gt;#CloudRun&lt;/code&gt; &lt;code&gt;#Terraform&lt;/code&gt; &lt;code&gt;#OpenSource&lt;/code&gt; &lt;code&gt;#RAD&lt;/code&gt; &lt;code&gt;#NoOps&lt;/code&gt; &lt;code&gt;#CloudDeployment&lt;/code&gt; &lt;code&gt;#Moodle&lt;/code&gt; &lt;code&gt;#Odoo&lt;/code&gt; &lt;code&gt;#WordPress&lt;/code&gt; &lt;code&gt;#DevTools&lt;/code&gt;&lt;/p&gt;

</description>
      <category>odoo</category>
      <category>rad</category>
      <category>googlecloud</category>
    </item>
    <item>
      <title>⚡ Deploy Enterprise Applications on the Cloud in Minutes, Not Months</title>
      <dc:creator>Mustapha Adekunle</dc:creator>
      <pubDate>Sat, 07 Feb 2026 07:52:52 +0000</pubDate>
      <link>https://dev.to/engrkrooozy/deploy-enterprise-applications-on-the-cloud-in-minutes-not-months-1bi7</link>
      <guid>https://dev.to/engrkrooozy/deploy-enterprise-applications-on-the-cloud-in-minutes-not-months-1bi7</guid>
      <description>&lt;p&gt;Tired of wrestling with cloud infrastructure just to get an application running?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;RAD (Rapid Application Deployment)&lt;/strong&gt; is an open-source framework that lets you deploy production-ready applications on the Cloud with minimal configuration — &lt;strong&gt;no deep Terraform knowledge required.&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  📦 Choose From 16+ Ready-to-Deploy Applications
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Application&lt;/th&gt;
&lt;th&gt;What It Does&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;🏭&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Odoo&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Full ERP — CRM, accounting, e-commerce, manufacturing&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🎓&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Moodle&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Learning management for online courses and training&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📝&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;WordPress&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Content management and marketing sites&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🤖&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;N8N&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Workflow automation (including AI-enhanced with vector DB &amp;amp; LLM)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🏥&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;OpenEMR&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Electronic health records for medical practices&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🐍&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Django&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Custom Python web applications&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📰&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Ghost&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Professional blogging and newsletters&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🧩&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Strapi / Directus&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Headless CMS and content APIs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;🛒&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Medusa&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Headless e-commerce platform&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;📚&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Wiki.js&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Knowledge base and documentation&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  ✅ What You Get Out of the Box
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;🗄️ Managed PostgreSQL or MySQL databases with automated backups&lt;/li&gt;
&lt;li&gt;🔐 Private networking and Secret Manager integration — &lt;strong&gt;security by default&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;📊 Auto-scaling and &lt;strong&gt;scale-to-zero&lt;/strong&gt; — pay only for what you use&lt;/li&gt;
&lt;li&gt;💓 Health checks, monitoring dashboards, and alerting&lt;/li&gt;
&lt;li&gt;💾 NFS and Cloud Storage for persistent file storage&lt;/li&gt;
&lt;li&gt;🔒 SSL/TLS encryption throughout&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 RAD handles the infrastructure plumbing — VPC networking, IAM, database provisioning, container orchestration — so you can focus on &lt;strong&gt;using&lt;/strong&gt; the application, not fighting the cloud.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://rad.techequity.cloud/?ref=TIKS57NPPU0D" rel="noopener noreferrer"&gt;Get started and deploy your first application today&lt;/a&gt;.&lt;/strong&gt; 👇&lt;/p&gt;

&lt;p&gt;&lt;code&gt;#GoogleCloud&lt;/code&gt; &lt;code&gt;#CloudRun&lt;/code&gt; &lt;code&gt;#Terraform&lt;/code&gt; &lt;code&gt;#OpenSource&lt;/code&gt; &lt;code&gt;#RAD&lt;/code&gt; &lt;code&gt;#NoOps&lt;/code&gt; &lt;code&gt;#CloudDeployment&lt;/code&gt; &lt;code&gt;#Moodle&lt;/code&gt; &lt;code&gt;#Odoo&lt;/code&gt; &lt;code&gt;#WordPress&lt;/code&gt; &lt;code&gt;#DevTools&lt;/code&gt;&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>terraform</category>
      <category>odoo</category>
      <category>moodle</category>
    </item>
  </channel>
</rss>
