This is a submission for the New Year, New You Portfolio Challenge Presented by Google AI
About Me
I’m Bryan Chense Simwayi, a Creative AI Technologist based in Zambia.
I don’t see AI as just a productivity tool or a code generator. I see it as a new interface for thought—a way to externalize cognition, perception, agency, and exploration. My work lives at the intersection of experimental AI systems, psychoacoustics, and human–AI collaboration.
For this challenge, I didn’t want to build a static brochure. I wanted to build a digital lab—a portfolio that behaves like a living system, not a document.
Portfolio
How I Built It
This portfolio is a fully functional, client-side React application with real AI integration and interactive systems.
The Stack
Google AI Studio: Used as the primary environment for prompting, iterating, and refining the "Neural Core" persona.
Gemini 2.0 Flash (via @google/genai SDK): Powers the live AI assistant embedded in the site.
Google Cloud Run: The app is containerized via Docker and deployed to Cloud Run to ensure scalability for the real-time visualizers.
Frontend: React 19, TypeScript, and Tailwind CSS (CDN).
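As one possible shape for the Cloud Run deployment above, a two-stage Dockerfile for a static React build might look like this. The build tool, the dist/ output folder, and the nginx port rewrite are assumptions for illustration, not the project's actual config:

```dockerfile
# Build stage: assumes a standard npm-based React build
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Serve stage: Cloud Run sends traffic to the port in $PORT (default 8080),
# so we rewrite nginx's default listen port from 80 to 8080.
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
RUN sed -i 's/80/8080/g' /etc/nginx/conf.d/default.conf
EXPOSE 8080
```

Because the AI calls run client-side via the SDK, the container only needs to serve static assets; Cloud Run's autoscaling then handles traffic spikes to the visualizer pages.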
The AI Assistant (“Neural Core”)
The assistant embedded in the bottom-right is not a mock UI. It is a fully functional agent that serves as an intelligent archive of my work.
Real Connection: It initializes a Gemini 2.0 Flash chat session directly in the browser.
Context Aware: It operates on a strict JSON knowledge graph of my bio, projects, and philosophy.
Function Calling: I exposed a navigate() tool to Gemini. If you ask it to "Take me to the Lab," it doesn't just tell you where to go; it actually executes the navigation function to route the app.
Voice Input: It utilizes the browser's native webkitSpeechRecognition API to allow for voice-to-text interaction.
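The function-calling flow described above can be sketched as a tool declaration plus a dispatcher. The declaration mirrors the shape the @google/genai SDK accepts under config.tools[].functionDeclarations (string literals stand in for the SDK's Type enum); the page names and the setRoute handler are illustrative assumptions, not the site's actual code:

```typescript
// Illustrative tool declaration in the shape the @google/genai SDK expects
// under config.tools[].functionDeclarations.
const navigateDeclaration = {
  name: "navigate",
  description: "Route the portfolio app to a named page.",
  parameters: {
    type: "OBJECT",
    properties: {
      page: { type: "STRING", description: "Target page, e.g. 'lab' or 'projects'" },
    },
    required: ["page"],
  },
};

type ToolCall = { name: string; args: { page?: string } };

// When Gemini responds with a functionCall part named "navigate", we execute
// the actual navigation instead of just describing it. setRoute is a
// hypothetical stand-in for the app's React navigation state setter.
function dispatchToolCall(call: ToolCall, setRoute: (page: string) => void): string {
  if (call.name === "navigate" && call.args.page) {
    setRoute(call.args.page);
    return `Navigated to ${call.args.page}`;
  }
  return "Unknown tool call";
}
```

A chat session created with ai.chats.create({ model: "gemini-2.0-flash", config: { tools: [{ functionDeclarations: [navigateDeclaration] }] } }) would then surface functionCall parts for this dispatcher to execute.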
The Lab (Interactive Audio Visualizer)
The Lab page is a real-time system powered by the Web Audio API and HTML5 Canvas.
It captures live microphone input and visualizes the frequency data (FFT) as a 3D particle sphere. The bass frequencies pulse at the bottom, mids ripple through the center, and treble sparks at the top. It serves as a metaphor for "Neural Weight Connection," making the portfolio a genuine interactive experiment.
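The band mapping above (bass at the bottom, mids in the center, treble at the top) can be sketched as a pure function over a frame of AnalyserNode frequency data. The bin boundaries here are assumptions for illustration, not the exact splits used on the site:

```typescript
// Splits an FFT frame (as produced by AnalyserNode.getByteFrequencyData)
// into average bass / mid / treble energy in [0, 1]. Boundaries are
// illustrative: with fftSize = 2048 at 44.1 kHz, each bin spans ~21.5 Hz.
function splitBands(bins: Uint8Array): { bass: number; mid: number; treble: number } {
  const avg = (from: number, to: number): number => {
    let sum = 0;
    for (let i = from; i < to; i++) sum += bins[i];
    return sum / ((to - from) * 255); // normalize 0..255 bytes to 0..1
  };
  const n = bins.length;
  return {
    bass: avg(0, Math.floor(n * 0.1)),                    // lowest ~10% of bins
    mid: avg(Math.floor(n * 0.1), Math.floor(n * 0.5)),
    treble: avg(Math.floor(n * 0.5), n),
  };
}
```

In the browser, each frame would come from analyser.getByteFrequencyData(bins) inside a requestAnimationFrame loop, with each band driving its region of the particle sphere.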
Generative Interface Elements
I avoided generic templates. All visuals are code-driven:
The background runs a particle simulation that changes behavior (Flow vs. Grid vs. Orbit) based on which page you are viewing.
Project cards generate procedural SVG visuals based on the project's ID and category—no static stock images are used.
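The procedural card visuals can be sketched as a deterministic mapping from project ID and category to an SVG string. The FNV-1a hash and the specific shapes are illustrative assumptions, not the actual generator:

```typescript
// Deterministically derives a small SVG from a project ID so every card
// gets a unique, stable visual without stock imagery. Hash and shape
// choices here are illustrative assumptions.
function hashId(id: string): number {
  let h = 0x811c9dc5; // FNV-1a 32-bit offset basis
  for (let i = 0; i < id.length; i++) {
    h ^= id.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, kept unsigned
  }
  return h;
}

function projectCardSvg(id: string, category: string): string {
  const h = hashId(id + category);
  const hue = h % 360;          // base color from the hash
  const circles = 3 + (h % 4);  // 3..6 circles per card
  let shapes = "";
  for (let i = 0; i < circles; i++) {
    const x = hashId(id + i) % 100;
    const y = hashId(category + i) % 100;
    shapes += `<circle cx="${x}" cy="${y}" r="${8 + i * 4}" fill="hsl(${hue} 70% ${40 + i * 8}%)" opacity="0.6"/>`;
  }
  return `<svg viewBox="0 0 100 100" xmlns="http://www.w3.org/2000/svg">${shapes}</svg>`;
}
```

Because the output depends only on the inputs, the same project always renders the same visual across visits, with no image assets to load.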
What I’m Most Proud Of
- Treating the portfolio as an engineering project. This isn't just layout and content. It includes stateful navigation logic, real-time audio analysis, and agentic AI integration. The portfolio itself is a case study.
- Research-style presentation. Each project page is structured around "Intent," "Exploration," and "Field Notes." This reflects how I actually work: through experimentation and observation rather than just feature checklists.
- Using Google AI Studio as a design partner. I didn't just use AI Studio to generate code. I treated it as a collaborator: using it to shape the tone of the assistant, refine the visual metaphors, and architect the system prompts that give the site its unique "personality."