<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Fiewor John</title>
    <description>The latest articles on DEV Community by Fiewor John (@fiewor).</description>
    <link>https://dev.to/fiewor</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F750787%2Ff563f113-a603-4f72-b89c-ea6b94613f65.png</url>
      <title>DEV Community: Fiewor John</title>
      <link>https://dev.to/fiewor</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/fiewor"/>
    <language>en</language>
    <item>
      <title>How Gemma 4 Helped Us Cut the Last Wire Keeping AI Out of African Classrooms</title>
      <dc:creator>Fiewor John</dc:creator>
      <pubDate>Mon, 11 May 2026 14:41:21 +0000</pubDate>
      <link>https://dev.to/fiewor/how-gemma-4-helped-us-cut-the-last-wire-keeping-ai-out-of-african-classrooms-315i</link>
      <guid>https://dev.to/fiewor/how-gemma-4-helped-us-cut-the-last-wire-keeping-ai-out-of-african-classrooms-315i</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/google-gemma-2026-05-06"&gt;Gemma 4 Challenge: Build with Gemma 4&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Problem Started Long Before This Hackathon
&lt;/h2&gt;

&lt;p&gt;In 2021, as a postgraduate student, I built something called &lt;a href="https://www.linkedin.com/posts/john-fiewor-365484127_azure-ai-aiineducation-ugcPost-7010767935166164992-gkdo?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAB83RP0B1K-CvhaaKEtWP1XEL_aX0EkIPpE" rel="noopener noreferrer"&gt;AI-Grader&lt;/a&gt;. The idea was simple: teachers spend an unreasonable amount of their lives marking scripts. AI could change that. The first version used Microsoft Azure's Cognitive Services — OCR for handwriting, key phrase extraction to compare answers. It worked, but it was brittle. It compared keywords, not understanding. It could not tell you &lt;em&gt;why&lt;/em&gt; it gave a grade.&lt;/p&gt;

&lt;p&gt;Then ChatGPT dropped, and the gap between what I had built and what was now possible became impossible to ignore.&lt;/p&gt;

&lt;p&gt;In early 2024, I teamed up with two people I had only just met, and together we rebuilt everything from scratch — this time with Google Gemini — during the Google BuildWithAI Hackathon organised by GDG Lagos. We placed &lt;a href="https://www.linkedin.com/posts/john-fiewor-365484127_buildwithai-hackathon-ai-activity-7193176535451766784-VPqg?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAB83RP0B1K-CvhaaKEtWP1XEL_aX0EkIPpE" rel="noopener noreferrer"&gt;third nationally&lt;/a&gt;, renamed the project GradrAI, and walked away with confirmation that this problem was real and that Gemini was the right foundation.&lt;/p&gt;

&lt;p&gt;GradrAI's relationship with Google had officially begun.&lt;/p&gt;




&lt;h2&gt;
  
  
  What GradrAI Does
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://gradrai.com/" rel="noopener noreferrer"&gt;GradrAI&lt;/a&gt; is an AI-powered assessment platform for educators in Africa. Three core workflows:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Paper-Based Test (PBT) grading&lt;/strong&gt;: Teachers upload scanned student scripts, a question paper, and a marking guide. Gemini reads the handwriting, evaluates each answer, and returns per-question scores, explanations, and personalised student feedback — in minutes, not days.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Computer-Based Test (CBT) generation and grading&lt;/strong&gt;: Teachers upload lecture notes or past papers. The platform extracts topics, generates a structured quiz (MCQ, essay, or hybrid), produces a marking guide, and grades submissions automatically.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SmartPrep&lt;/strong&gt;: A student-facing practice mode for JAMB UTME and WASSCE. Students work through real past exam questions and receive AI-generated explanations and feedback on every answer — not just whether they got it right, but why, and what to revisit.&lt;/p&gt;

&lt;p&gt;The cloud version works well. Schools loved the demos. Then came the part nobody wants to hear at a demo: &lt;em&gt;"We don't have reliable internet."&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Conversation That Started Everything
&lt;/h2&gt;

&lt;p&gt;On the 20th of April, I shared a &lt;a href="https://youtube.com/shorts/d0gTthacB5c?si=Sfz_XA90ywjU8CnD" rel="noopener noreferrer"&gt;NetworkChuck video&lt;/a&gt; about offline AI models with my team. The idea of a fully offline GradrAI came up immediately, because we had real context for why it mattered.&lt;/p&gt;

&lt;p&gt;Three schools we had demoed to had given us near-identical feedback: a private secondary school in Lagos said they lacked the infrastructure to adopt a cloud-dependent solution. A school in Ogun State said internet costs were not something they were prepared to cover for teachers. A third school, also in Lagos, raised the same concern.&lt;/p&gt;

&lt;p&gt;These were not fringe cases. They were our target market telling us clearly that the cloud dependency was the barrier. Not the price. Not the product. The wire running from the classroom to the internet.&lt;/p&gt;

&lt;p&gt;We started building the offline app that same day. Students could take exams offline and MCQs were graded instantly — but essay questions still sat in a queue, waiting for an internet connection to sync to the cloud. We had solved the last mile while leaving the first mile untouched.&lt;/p&gt;

&lt;p&gt;Then, shortly before the DEV Community announced the Gemma 4 Challenge, &lt;a href="https://www.freecodecamp.org/" rel="noopener noreferrer"&gt;FreeCodeCamp&lt;/a&gt; released a tutorial on &lt;a href="https://youtu.be/HNVaYYxmwLU?si=SACW0arJa0PHRCtm" rel="noopener noreferrer"&gt;open model coding essentials&lt;/a&gt; that specifically called out Gemma 4 as &lt;em&gt;"very smart with low memory usage."&lt;/em&gt; When the hackathon dropped on May 6th, the stars had aligned. This feature needed to be finished.&lt;/p&gt;




&lt;h2&gt;
  
  
  Taking the Entire Pipeline Offline
&lt;/h2&gt;

&lt;p&gt;The cloud CBT pipeline has four AI operations:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Topic extraction&lt;/strong&gt; — read a PDF, identify key topics and weights&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Exam generation&lt;/strong&gt; — produce a structured quiz from those topics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Marking guide generation&lt;/strong&gt; — produce the evaluation rubric for essays&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Theory grading&lt;/strong&gt; — evaluate student answers against the rubric&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Operations 3 and 4 are text-in, text-out. Straightforward to move offline.&lt;/p&gt;
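&lt;p&gt;As a sketch of why those two are straightforward: a text-only operation reduces to assembling a prompt and handing it to the provider. The function below is illustrative only — the field names are assumptions, not GradrAI's actual schema.&lt;/p&gt;

```javascript
// Illustrative only: how a text-in, text-out grading call can be assembled.
// Field names (question, rubric, maxMarks) are assumptions, not GradrAI's schema.
function buildGradingPrompt({ question, rubric, studentAnswer, maxMarks }) {
  return [
    "You are an exam grader. Grade strictly against the rubric.",
    `Question: ${question}`,
    `Marking guide: ${rubric}`,
    `Maximum marks: ${maxMarks}`,
    `Student answer: ${studentAnswer}`,
    'Reply with JSON: { "score": number, "explanation": string }',
  ].join("\n");
}
```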

&lt;p&gt;Operations 1 and 2 are the hard part. The cloud implementation passes Google Cloud Storage URIs directly to Vertex AI. Ollama cannot resolve a GCS URI. To run these locally, we rasterise PDF pages to images in memory, base64-encode them, and pass them as inline multimodal parts to Gemma 4.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// offlineProcessor.js — PDF pages to base64 image parts, entirely in memory&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;pdfToPng&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;pdf-to-png-converter&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;rasterisePdfToInlineParts&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;localFilePath&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;pages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;pdfToPng&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;localFilePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;viewportScale&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;1.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;outputFileMask&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;page&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;capped&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;pages&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// respect context window&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;capped&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;image&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;toString&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;base64&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="na"&gt;mimeType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;image/png&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;}));&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// gemma.provider.js — unified Ollama interface for all four operations&lt;/span&gt;
&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;generateContent&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;promptText&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;imageParts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;promptText&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;...(&lt;/span&gt;&lt;span class="nx"&gt;imageParts&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;images&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;imageParts&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;}),&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;http://127.0.0.1:11434/api/chat&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;OFFLINE_MODEL&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gemma4&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="na"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1200000&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="c1"&gt;// 20 minutes — consumer hardware varies significantly&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The provider abstraction means the service layer never needs to know whether it is talking to Gemini on Vertex AI or Gemma 4 on Ollama. One interface. Both providers.&lt;/p&gt;
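&lt;p&gt;A minimal sketch of that abstraction — the names are assumptions for illustration, not GradrAI's actual module layout:&lt;/p&gt;

```javascript
// Minimal sketch of the provider switch: the service layer calls
// generateContent without knowing which backend answers. Names are
// assumptions, not the actual GradrAI module layout.
function createAIProvider({ offline, gemmaProvider, geminiProvider }) {
  const impl = offline ? gemmaProvider : geminiProvider;
  return {
    generateContent: (args) => impl.generateContent(args),
  };
}
```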




&lt;h2&gt;
  
  
  Engineering for the Edge: What We Learned the Hard Way
&lt;/h2&gt;

&lt;p&gt;Building for offline on consumer hardware revealed challenges that clean architecture diagrams do not warn you about.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Schema drift from smaller models.&lt;/strong&gt; Gemma 4 occasionally returned exam questions in a split format — separate &lt;code&gt;mcq&lt;/code&gt; and &lt;code&gt;essay&lt;/code&gt; arrays — rather than the unified root array the application expects. We built a normalisation layer that detects output shape and remaps non-conformant field names before they reach the persistence layer. The service never crashes on a slightly malformed response; it corrects and continues.&lt;/p&gt;
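&lt;p&gt;The shape detection can be sketched like this — field names are assumptions, and the real normalisation layer handles more variants than this:&lt;/p&gt;

```javascript
// Sketch of the normalisation idea: if the model returns split mcq/essay
// arrays instead of the unified questions array, remap before persisting.
// Field names are assumptions about the schema described above.
function normaliseExamOutput(raw) {
  if (Array.isArray(raw.questions)) return raw; // already conformant
  const tag = (items, type) => (items || []).map((q) => ({ ...q, type }));
  return { questions: tag(raw.mcq, "mcq").concat(tag(raw.essay, "essay")) };
}
```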

&lt;p&gt;&lt;strong&gt;Crash recovery for long inference.&lt;/strong&gt; On a standard teacher's laptop, the full pipeline — topic extraction, exam generation, marking guide — takes between 5 and 15 minutes. A power cut or accidental app close at minute 7 is a catastrophic user experience. Every generation stage now writes to a local recovery cache. On relaunch, the app detects the backup and lets the teacher restore instantly without re-running inference.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Room collision in air-gapped networks.&lt;/strong&gt; With multiple exams in a ready state, students were being routed to the oldest session by default — meaning valid access codes for the new exam were being rejected. We refined local session state to prioritise the most recently created session, which is almost always the one the teacher just generated.&lt;/p&gt;
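&lt;p&gt;The fix reduces to a sort, sketched here with an assumed session shape (the status and createdAt fields are mine, for illustration):&lt;/p&gt;

```javascript
// Sketch of the routing fix: among sessions in a ready state, prefer the
// most recently created one. Field names are assumptions for illustration.
function pickActiveSession(sessions) {
  const ready = sessions.filter((s) => s.status === "ready");
  ready.sort((a, b) => b.createdAt - a.createdAt);
  return ready[0] || null;
}
```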

&lt;p&gt;&lt;strong&gt;Sequential grading on slow hardware.&lt;/strong&gt; Grading ten essay questions locally is sequential. On slower machines this produced a silent hang in the UI that left students uncertain whether their submission had registered. We rebuilt the submission flow to include real-time status events pushed back to the student's device throughout the grading process.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Gemma 4 Specifically
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Native multimodal input eliminates a pipeline dependency.&lt;/strong&gt; Every other approach to offline PDF processing requires a separate OCR layer. Gemma 4's vision capability means one model handles document understanding, question generation, and essay grading. One model. One Ollama endpoint. The entire pipeline.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The 128K context window makes full syllabus context viable.&lt;/strong&gt; For topic extraction, an entire semester's lecture notes fit in a single call. For grading, the marking guide, question paper, and student answer fit together without chunking. This is the difference between a model that skims a document and one that reads it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The E4B model runs on the hardware that actually exists.&lt;/strong&gt; Most schools in our target market do not have server-grade machines. The teacher's laptop is the server. Gemma 4 E4B runs on consumer-grade laptops and high-end Android phones — including a mid-range Windows machine, the kind a teacher might already carry to class.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Structured output is reliable enough for production.&lt;/strong&gt; Our grading pipeline requires strict schema adherence on every response. Gemma 4's instruction following is consistent enough at this task that our validation layer catches edge cases rather than routine failures.&lt;/p&gt;




&lt;h2&gt;
  
  
  What the Offline Flow Looks Like Now
&lt;/h2&gt;

&lt;p&gt;A teacher with no internet can now:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the GradrAI desktop app. Ollama starts as a sidecar. Gemma 4 loads.&lt;/li&gt;
&lt;li&gt;Upload a PDF — lecture notes, a past paper, a textbook chapter.&lt;/li&gt;
&lt;li&gt;Gemma 4 extracts topics. The teacher adjusts weights and priorities in the UI.&lt;/li&gt;
&lt;li&gt;Configure the exam: MCQ count, essay count, difficulty, total marks.&lt;/li&gt;
&lt;li&gt;Gemma 4 generates the full question set and marking guide.&lt;/li&gt;
&lt;li&gt;Publish the exam to the local network. Students connect via the teacher's hotspot.&lt;/li&gt;
&lt;li&gt;MCQs graded instantly. Essays graded by Gemma 4 within seconds of submission.&lt;/li&gt;
&lt;li&gt;Results stored locally, synced to cloud when connectivity returns.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;No cloud token consumed. No data left the building. The student gets their feedback before the class ends.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why This Matters
&lt;/h2&gt;

&lt;p&gt;The shift toward Computer-Based Testing is accelerating across Africa. JAMB administers its UTME entirely via CBT to over 1.8 million candidates annually. Ghana's WAEC has been piloting CBT for WASSCE. Kenya's KNEC has published a national CBT roadmap.&lt;/p&gt;

&lt;p&gt;The infrastructure reality has not kept pace. Nigeria's internet penetration sits below 45%. Electricity supply in many states is unreliable, compounding the problem — no power means no router means no exam. The tools being built for this transition cannot assume the connectivity that developed-market EdTech takes for granted.&lt;/p&gt;

&lt;p&gt;GradrAI's offline-first architecture is not a feature for edge cases. It is the baseline requirement for the market we are actually building for. Gemma 4 is what made it technically feasible at the hardware level this market actually has.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Comes Next
&lt;/h2&gt;

&lt;p&gt;The next stage is extending Gemma 4 into SmartPrep — GradrAI's student-facing JAMB and WASSCE practice mode. The goal is an offline practice experience where Gemma 4 generates questions from curriculum content, grades responses, identifies knowledge gaps across a session, and produces a diagnostic summary entirely on-device.&lt;/p&gt;

&lt;p&gt;A student preparing for JAMB in a rural school, with no data, no tutor, and no prep centre nearby, practising with an AI that tells them exactly what they got wrong and why. That is the version of this product we are building toward.&lt;/p&gt;




&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub Repository&lt;/strong&gt;: &lt;em&gt;(to be added)&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Demo Video&lt;/strong&gt;: &lt;em&gt;(to be added)&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GradrAI Platform&lt;/strong&gt;: &lt;a href="https://gradrai.com" rel="noopener noreferrer"&gt;gradrai.com&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;What we built during this hackathon is not a demo. It is a production feature shipping to real schools. The students who take exams on GradrAI's desktop app this term will get their theory grades back before they leave the classroom — because Gemma 4 is running on the teacher's laptop while they wait.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>gemmachallenge</category>
      <category>gemma</category>
    </item>
    <item>
      <title>My Updated Dev Portfolio</title>
      <dc:creator>Fiewor John</dc:creator>
      <pubDate>Mon, 02 Feb 2026 06:53:54 +0000</pubDate>
      <link>https://dev.to/fiewor/my-updated-dev-portfolio-307k</link>
      <guid>https://dev.to/fiewor/my-updated-dev-portfolio-307k</guid>
      <description>&lt;h2&gt;
  
  
  New Year, New You Portfolio Challenge Submission
&lt;/h2&gt;

&lt;h2&gt;
  
  
  Portfolio
&lt;/h2&gt;

&lt;div class="ltag__cloud-run"&gt;
  &lt;iframe height="600px" src="https://portfolio-943768265988.us-central1.run.app"&gt;
  &lt;/iframe&gt;
&lt;/div&gt;




&lt;h2&gt;
  
  
  About Me
&lt;/h2&gt;

&lt;p&gt;I am a software engineer who builds full-stack web applications with an emphasis on pragmatic, maintainable code and strong developer experience. For this portfolio refresh, I aimed to present a concise, modern summary of my skills and projects while improving performance, accessibility, and responsiveness so the site better represents my current capabilities.&lt;/p&gt;

&lt;h2&gt;
  
  
  How I Built It
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Overview &amp;amp; Goals
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Modernize an old Create React App portfolio I last updated four years ago.
&lt;/li&gt;
&lt;li&gt;Improve developer experience (faster dev server, smaller builds).
&lt;/li&gt;
&lt;li&gt;Improve end-user experience (dark mode, responsive layout, clearer project highlights).
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Tools &amp;amp; Tech Stack
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Frontend:&lt;/strong&gt; React migrated from CRA → Vite (for faster dev feedback and shorter build times).
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build / Bundler:&lt;/strong&gt; Vite (ESM-first, fast HMR).
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hosting / Deployment:&lt;/strong&gt; Google Cloud Run (containerized, serverless deployment).
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Containerization:&lt;/strong&gt; Docker with multi-stage builds, small production image, gzip compression, SPA routing.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Google AI:&lt;/strong&gt; Antigravity — AI-first development environment to refactor, suggest improvements, and accelerate repetitive tasks.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Development practices:&lt;/strong&gt; prompt engineering best practices, incremental commits, dependency hygiene, CI/CD practices.
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Key Development Steps
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;CRA → Vite Migration&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Replaced CRA-specific scripts and configs with Vite-compatible equivalents.
&lt;/li&gt;
&lt;li&gt;Updated imports and adjusted environment variable usage to match Vite conventions.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Dependency Updates&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Upgraded major frontend dependencies to current stable releases (React 18 for compatibility).
&lt;/li&gt;
&lt;li&gt;Resolved peer dependency conflicts between React, React DOM, and testing libraries.
&lt;/li&gt;
&lt;li&gt;Removed deprecated or unmaintained packages.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Dark Mode&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;System-preference-aware theme with manual toggle persisted in &lt;code&gt;localStorage&lt;/code&gt;.
&lt;/li&gt;
&lt;li&gt;CSS variables for theme tokens for maintainability and compact implementation.
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Responsive &amp;amp; Accessibility Improvements&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adjusted layout breakpoints and touch targets for mobile devices.
&lt;/li&gt;
&lt;li&gt;Improved semantic HTML and ARIA attributes where appropriate.
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Containerization &amp;amp; Cloud Run Deployment&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multi-stage Docker build:

&lt;ul&gt;
&lt;li&gt;Node build stage for dependency installation and Vite production build.
&lt;/li&gt;
&lt;li&gt;Nginx runtime stage with gzip compression and SPA routing.
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Ensured cross-platform compatibility by building a &lt;code&gt;linux/amd64&lt;/code&gt; image for Cloud Run.
&lt;/li&gt;
&lt;li&gt;Pushed Docker image to Artifact Registry and deployed to Cloud Run.
&lt;/li&gt;
&lt;li&gt;Embedded the live Cloud Run deployment in this post.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
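&lt;p&gt;The dark mode rule from step 3 can be sketched as a small resolver, with storage and media-query access abstracted so the logic stays testable outside a browser — the function name is mine, not the portfolio's actual code:&lt;/p&gt;

```javascript
// Sketch of the theme rule described in step 3: an explicit stored choice
// wins; otherwise fall back to the system preference. In the browser,
// storedChoice would come from localStorage and systemPrefersDark from
// matchMedia("(prefers-color-scheme: dark)").matches.
function resolveTheme(storedChoice, systemPrefersDark) {
  if (storedChoice === "dark" || storedChoice === "light") return storedChoice;
  return systemPrefersDark ? "dark" : "light";
}
```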




&lt;h2&gt;
  
  
  How I Used Antigravity
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Performed targeted refactors (e.g., simplifying a large portfolio component).
&lt;/li&gt;
&lt;li&gt;Generated guided migration snippets for CRA → Vite.
&lt;/li&gt;
&lt;li&gt;Used iterative prompt → change → test cycles to keep changes minimal and reversible.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  What I'm Most Proud Of
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Successful migration from CRA → Vite with faster builds and improved developer feedback loop.
&lt;/li&gt;
&lt;li&gt;Dark mode implementation that respects system preference with a simple toggle.
&lt;/li&gt;
&lt;li&gt;Dependency hygiene and modernized build process.
&lt;/li&gt;
&lt;li&gt;Cloud Run deployment with a multi-stage Docker build and gzip, making the portfolio lightweight, fast, and embeddable.
&lt;/li&gt;
&lt;li&gt;AI-augmented development workflow significantly reduced time spent on repetitive refactors.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  What to Look for in the Demo
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Project cards linking to code samples and short implementation notes.
&lt;/li&gt;
&lt;li&gt;Clear, scannable “About / Skills / Contact” section.
&lt;/li&gt;
&lt;li&gt;Theme toggle and responsive behavior — try resizing the viewport or switching system dark/light mode.
&lt;/li&gt;
&lt;li&gt;Accessible semantic structure and keyboard navigability for main sections.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Thank you for reviewing my submission — I look forward to feedback from the Google AI team and the community.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>googleaichallenge</category>
      <category>portfolio</category>
      <category>gemini</category>
    </item>
    <item>
      <title>Using GitHub Actions to Battle Racism (or at least contribute to the process)!</title>
      <dc:creator>Fiewor John</dc:creator>
      <pubDate>Sun, 21 Nov 2021 13:38:30 +0000</pubDate>
      <link>https://dev.to/fiewor/using-github-actions-to-battle-racism-or-at-least-contribute-to-the-process-5fpf</link>
      <guid>https://dev.to/fiewor/using-github-actions-to-battle-racism-or-at-least-contribute-to-the-process-5fpf</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;GitHub Actions are event-driven workflows that are very useful for automating processes in the software development lifecycle. They're even more awesome because they live where all the fun stuff happens - GitHub! This takes issue resolution, collaboration, and deployment to a whole other level, and makes all three easier too.&lt;/p&gt;

&lt;p&gt;To learn more about GitHub Actions and how you can begin to utilise them, check their easy-to-follow &lt;a href="https://docs.github.com/en/actions"&gt;documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  My Workflow
&lt;/h3&gt;

&lt;p&gt;During Hacktoberfest, I created two greeting bots and one stale bot for IBM's Call for Code for Racial Justice Open Source projects.&lt;br&gt;
Basically, the greeting bots greet new contributors to the project with a message.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Greetings

on: [pull_request, issues]

jobs:
  greeting:
    runs-on: ubuntu-latest
    steps:
       ...
       issue-message: 'Thank you so much for contributing to our work!'
       pr-message: 'Thank you for your contribution! Someone will review it ASAP.'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The stale bot marks issues stale after a set number of days. You might be wondering why this would be useful. Well, it surfaces issues that have gone unattended for too long so that contributors can prioritize them.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Mark stale issues and pull requests

on: 
  schedule:
    - cron: "0 0 * * *"

jobs:
  stale:

    runs-on: ubuntu-latest

    steps:
      - uses: actions/stale@v1
        ....
          stale-issue-message: ':wave: Hi! This issue has been marked stale due to inactivity. If no further activity occurs within the next 7 days, it will automatically be closed.'
          ...
          stale-issue-label: 'Stale'
          exempt-issue-label: 'keep-open'
          remove-stale-when-updated: true
          stale-pr-label: 'Stale'
          labels-to-add-when-unstale: 'help-wanted'
          days-before-stale: 60
          days-before-close: 7
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These GitHub Actions bots are being used in two Call for Code for Racial Justice projects:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://github.com/Call-for-Code-for-Racial-Justice/Truth-Loop"&gt;Truth Loop&lt;/a&gt; - a solution that helps communities simply understand the policies, regulations and legislation that will impact them the most and allows them to share their experiences around how policies have impacted them or how proposed policies could impact them using short video testimonials.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://github.com/Call-for-Code-for-Racial-Justice/TakeTwo-Marker-ChromeExtension"&gt;TakeTwo-Marker-ChromeExtension&lt;/a&gt; which is 'a plugin to facilitate the capture and categorization of words and phrases that could be racially biased through a browser.'&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Submission Category:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Maintainer Must-Haves - since it makes the lives of open source maintainers easier.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Link to Code
&lt;/h3&gt;

&lt;p&gt;Stale Bot&lt;br&gt;
&lt;a href="https://github.com/Call-for-Code-for-Racial-Justice/Truth-Loop/pull/211/files"&gt;https://github.com/Call-for-Code-for-Racial-Justice/Truth-Loop/pull/211/files&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Greeting Bot&lt;br&gt;
TakeTwo-Marker-ChromeExtension&lt;br&gt;
&lt;a href="https://github.com/Call-for-Code-for-Racial-Justice/TakeTwo-Marker-ChromeExtension/pull/29/files"&gt;https://github.com/Call-for-Code-for-Racial-Justice/TakeTwo-Marker-ChromeExtension/pull/29/files&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Truth Loop&lt;br&gt;
&lt;a href="https://github.com/Call-for-Code-for-Racial-Justice/Truth-Loop/pull/195/files"&gt;https://github.com/Call-for-Code-for-Racial-Justice/Truth-Loop/pull/195/files&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Additional Resources / Info
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Here's the article by Call for Code in which I was featured:&lt;br&gt;
&lt;a href="https://medium.com/callforcode/hacktoberfest-fighting-racism-with-open-source-code-956559da7d6d"&gt;https://medium.com/callforcode/hacktoberfest-fighting-racism-with-open-source-code-956559da7d6d&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I was also a speaker at the Demo Day (my segment is somewhere between the 8:00 and 13:00 marks)&lt;br&gt;
&lt;a href="https://www.youtube.com/watch?v=7VWlc5JiZ7Q"&gt;https://www.youtube.com/watch?v=7VWlc5JiZ7Q&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Most importantly, join the call, there's a lot more awesome stuff to be done!&lt;br&gt;
&lt;a href="https://developer.ibm.com/callforcode/racial-justice/"&gt;https://developer.ibm.com/callforcode/racial-justice/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>actionshackathon21</category>
      <category>hacktoberfest</category>
      <category>github</category>
      <category>blacklivesmatter</category>
    </item>
  </channel>
</rss>
