Erica

AI Engineering and Building Systems: Reflections on a Month of AI Engineering with goose by Block

A month-long engineering journey powered by goose by Block, MCP, Anthropic's Sonnet 4.5, spatial intelligence with MediaPipe by Google, accessibility-driven UI design, and a comprehensive stack of modern AI tooling. Seventeen full-stack applications later, here's what I learned and why this workflow fundamentally transformed the way I build.


Reflections on a Month of AI Engineering with goose
Over the past month, I built seventeen full-stack applications: not prototypes, but complete, production-ready systems. Each project pushed my understanding of architecture, workflow optimization, and AI engineering capabilities beyond the previous one. I genuinely loved every moment of it. Designing interfaces, structuring state management, building MCP servers, orchestrating YAML recipes, experimenting with spatial intelligence, and refining accessibility patterns became a rhythm I craved each day. The momentum was relentless and exhilarating. Each application revealed another layer of what this workflow makes possible, and I'm walking away from this challenge more energized and inspired than when I began.


What goose Is and Why It Transformed My Workflow
goose isn't a replacement for my existing engineering workflow; it's a powerful augmentation. goose is a comprehensive engineering environment built on the Model Context Protocol (MCP). MCP provides a structured framework for defining tools, managing state, and delivering dynamic interfaces without the traditional server infrastructure overhead, and goose brings that protocol to life in remarkable ways.

goose Desktop
A live development environment featuring real-time UI previews, auto-visualizers, comprehensive tool logs, and resource inspectors. It feels like building inside a living, breathing system that responds to your intent.

goose CLI
A command-driven interface for running MCP servers, inspecting tools, testing workflows, and debugging state. It becomes the backbone of your development loop, providing granular control over every aspect of the build process.

YAML Recipes
A lightweight automation layer that enables you to chain tools, execute multi-step workflows, and define repeatable processes without writing additional code. It's automation that feels intuitive rather than cumbersome.
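A recipe is just a small YAML file that names a workflow and its inputs. As a rough sketch (the field names and structure below are illustrative assumptions, not the exact goose recipe schema, so check the goose documentation before copying):

```yaml
# Hypothetical recipe sketch; field names are illustrative assumptions.
version: "1.0.0"
title: accessibility-audit
description: Run an accessibility pass over a project's UI components
parameters:
  - key: project_dir
    input_type: string
    requirement: required
    description: Path to the project to audit
prompt: |
  Scan {{ project_dir }} for missing ARIA labels, heading-hierarchy gaps,
  and low-contrast text, then summarize the findings as a table.
```

Because recipes are plain files, they can be versioned, shared, and re-run, which is what makes the automation feel repeatable rather than ad hoc.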

MCP Auto Visualizers
These instantly transform structured tool outputs into visual interfaces: tables, trees, HTML, JSON, and UI fragments. No extra configuration required. The visualization happens automatically, letting you focus on the logic rather than the presentation layer.

Anthropic's Sonnet 4.5 and Goose API Integration
Throughout this challenge, I leveraged Sonnet 4.5 within goose to generate code, refactor architecture, build UI components, draft documentation, validate accessibility patterns, and prototype spatial intelligence workflows. The synergy between Sonnet 4.5's reasoning capabilities and goose's structured environment created a development experience that genuinely felt like pair programming with an engineer who comprehends the entire system architecture.

Why Goose Is a Game-Changer for AI Engineering
Let me be clear: goose fundamentally changed how I approach building software. This isn't hyperbole - it's the reality of working with a tool that eliminates friction at every level of the development process. Before goose, I was constantly context-switching between editors, terminals, browsers, documentation, and testing environments. Each switch broke my flow, interrupted my thinking, and slowed down the feedback loop that's essential to creative problem-solving.
goose eliminates that fragmentation entirely.

The Power of Unified Context
What makes goose exceptional is how it maintains unified context across every aspect of development. When I'm building an MCP server, goose isn't just running my code; it's visualizing my data structures in real-time, logging every tool interaction with full transparency, and letting me inspect state at any moment without breaking stride. The Desktop environment shows me exactly what's happening inside my application as it happens. There's no "build, refresh, check, debug" cycle. There's just immediate, continuous feedback.

This might sound like a minor convenience, but the impact is profound. When you can see your changes reflected instantly, when you can inspect tool outputs without switching windows, when you can debug state without adding console logs and restarting processes, you stay in flow. And flow is where the best engineering happens.

MCP: The Protocol That Changes Everything
The Model Context Protocol is the foundation that makes goose's magic possible, and it deserves recognition as a genuinely innovative approach to building AI-powered applications. MCP standardizes how tools communicate, how state is managed, and how interfaces are rendered. Instead of building custom APIs, managing server deployments, and coordinating complex microservice architectures, MCP gives you a clean, declarative way to define what your application does.

This isn't just simpler, it's fundamentally more expressive. With MCP servers, I can define complex tool chains, manage stateful workflows, and deliver dynamic UIs without the overhead that typically bogs down development. And because goose is built around MCP, everything just works together seamlessly. Tools talk to each other. State persists correctly. Visualizations appear automatically. The protocol eliminates an entire category of integration problems that usually consume hours of debugging time.
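Under the hood, MCP messages are JSON-RPC 2.0. A client invokes a tool with a `tools/call` request; the `get_weather` tool and its arguments below are hypothetical, but the envelope follows the protocol:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_weather",
    "arguments": { "city": "Oslo" }
  }
}
```

The server replies on the same `id` with a `content` array, which is exactly the kind of structured output goose's auto-visualizers can render:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "12°C, light rain" }]
  }
}
```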

Speed Without Sacrificing Quality
One of the biggest misconceptions about rapid development tools is that speed comes at the cost of quality. Goose proves that's false. The seventeen applications I built this month aren't prototypes or proof-of-concepts; they're production-quality systems with proper architecture, comprehensive accessibility, and thoughtful design. Goose didn't make me cut corners; it removed the tedious barriers that usually slow down good engineering.

When YAML recipes let me automate multi-step workflows, I'm not avoiding the complexity. I'm managing it more intelligently. When auto-visualizers render my data structures instantly, I'm not skipping the work of understanding my outputs. I'm seeing them more clearly. When Sonnet 4.5 integration helps me refactor code or generate documentation, I'm not replacing my engineering judgment. I'm augmenting it with AI that understands context and intent.
This is what modern AI engineering should feel like: powerful, expressive, and fast without compromising on craft.

A Development Experience That Feels Alive
The best way I can describe working with goose is that it feels alive. The environment responds to you. It anticipates what you need. It makes the invisible visible. When you're building with goose, you're not fighting your tools; you're collaborating with an environment that's designed to amplify your capabilities.

I've used a lot of development environments over the years. Some are powerful but clunky. Others are elegant but limited. Goose is the first tool I've encountered that manages to be both powerful and delightful. It handles complexity gracefully while staying out of your way. It gives you control without overwhelming you with configuration. It's opinionated about the right things and flexible about everything else.

The Future of Development Is Here
goose represents a fundamental shift in what's possible with AI-assisted development. It's not about replacing developers - it's about giving developers superpowers. It's about removing the friction that separates an idea from its implementation. It's about making the development process so smooth, so intuitive, so responsive that building software becomes as creative and expressive as any other art form.

This month proved to me that we're at an inflection point. The tools exist. The protocols exist. The AI models exist. What we're seeing with goose is all of these pieces coming together in a way that feels inevitable in hindsight but revolutionary in practice.

In December 2025, the Agentic AI Foundation (AAIF) launched under the umbrella of the Linux Foundation, with OpenAI, Block, and Anthropic as co-founding stewards and support from other platinum members including Google, Microsoft, Amazon Web Services (AWS), Bloomberg, and Cloudflare. If you're serious about AI engineering, if you want to build faster without sacrificing quality, if you want a development experience that feels empowering rather than exhausting, goose is a wonderful addition alongside the many other great AI tools. It's not just a tool. It's a glimpse into how we'll all be building software in the very near future with all these amazing technologies.

Source: Linux Foundation press release - https://www.linuxfoundation.org/press/linux-foundation-announces-the-formation-of-the-agentic-ai-foundation


MediaPipe: Building Spatial Intelligence into Applications
One of the most exciting aspects of this challenge was integrating MediaPipe by Google into my workflow. MediaPipe is Google's open-source framework for building multimodal machine learning pipelines. It provides pre-trained models for hand tracking, pose detection, face mesh recognition, object detection, and gesture recognition - all running efficiently in the browser or on device.

What makes MediaPipe particularly powerful is its accessibility. You don't need extensive ML expertise or server-side infrastructure to implement sophisticated computer vision features. The framework handles the complexity of real-time processing, letting developers focus on creating meaningful interactions.

Why MediaPipe Matters for Modern Applications
In my projects, I used MediaPipe to build spatial intelligence features that transform how users interact with applications. Hand tracking enables gesture-based controls without physical input devices. Pose detection opens possibilities for fitness tracking, accessibility tools, and interactive experiences. Face mesh recognition powers AR filters, emotion detection, and attention tracking.

These aren't just novel features - they represent a fundamental shift in human-computer interaction. As we move toward more natural, intuitive interfaces, spatial intelligence becomes essential. MediaPipe makes this technology accessible to developers at every level, democratizing what was once only available to large organizations with significant ML resources.
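To make the hand-tracking idea concrete, here is a minimal sketch of turning MediaPipe's hand-landmark output into a control signal. It assumes MediaPipe's 21-point hand model, where landmark 4 is the thumb tip and landmark 8 is the index-finger tip, with coordinates normalized to [0, 1]; the pinch threshold is a tuning assumption, not a MediaPipe constant:

```javascript
// Euclidean distance between two normalized {x, y, z} landmarks.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, (a.z ?? 0) - (b.z ?? 0));
}

// Treat the hand as "pinching" when the thumb tip (landmark 4) and
// index-finger tip (landmark 8) are closer than `threshold`.
// The 0.05 default is a tuning assumption, not a MediaPipe value.
function isPinching(landmarks, threshold = 0.05) {
  return distance(landmarks[4], landmarks[8]) < threshold;
}
```

In a real application this check would run on each detector result, debounced over a few frames so a single noisy frame doesn't toggle the gesture.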
Practical Applications I Built

Throughout the challenge, I integrated MediaPipe into several applications to explore gesture-based navigation, hands-free controls for accessibility, spatial UI interactions, and pose-based fitness tracking interfaces. Each implementation revealed new possibilities for creating more inclusive and intuitive user experiences.

The combination of MediaPipe with goose's rapid development environment meant I could prototype, test, and refine these spatial features quickly. What might have taken weeks in a traditional workflow became achievable in hours.


How This Work Applies to Real-World Engineering

The systems I built this month weren't abstract experiments - they map directly to real engineering challenges. MCP servers function like lightweight microservices. Dynamic HTML rendering mirrors internal dashboards and admin tools. YAML recipes reflect production automation pipelines. Sonnet 4.5 became my AI engineering partner for code generation, architecture decisions, and technical documentation. The accessibility work aligns with production UI standards and WCAG compliance. Even the spatial intelligence prototypes connect to emerging AR and multimodal interfaces that are reshaping how we interact with technology.

goose expanded my workflow, giving me a faster and more expressive way to build the same caliber of systems that real engineering teams depend on daily.


A Month of Technologies, Patterns, and Systems

Across seventeen projects, I worked extensively with:

  • MCP servers with custom tool layers, state engines, and rendering pipelines
  • Semantic HTML, WCAG guidelines, and ARIA accessibility patterns for inclusive design
  • Glassmorphism, gradients, and motion-aware UI for modern, polished interfaces
  • YAML automation recipes for workflow orchestration
  • MediaPipe and spatial intelligence for multimodal interactions
  • JavaScript and TypeScript full-stack patterns for robust application architecture
  • Organizational systems including architecture diagrams, planning documents, and reusable pattern libraries

Each project built on the foundation of the last. The workflow became progressively more structured, more expressive, and more enjoyable with every iteration.


Accessibility: Building for Everyone
Accessibility isn't just a feature I implement; it's a core value that shapes every decision I make as an engineer. Throughout this challenge, I prioritized WCAG (Web Content Accessibility Guidelines) and ARIA (Accessible Rich Internet Applications) standards in every single application I built. This wasn't an afterthought or a checklist item. It was foundational to my design process.

Why Accessibility Matters to Me
Technology should empower everyone, regardless of their abilities. When we build inaccessible applications, we're not just creating inconvenience; we're actively excluding people from participating in digital spaces. That's unacceptable to me. Accessibility means someone with a visual impairment can navigate my interface using a screen reader. It means someone with motor difficulties can use keyboard navigation instead of precise mouse movements. It means someone with cognitive differences can understand my interface without confusion.

The beautiful thing about accessible design is that it makes applications better for everyone. Clear semantic HTML improves SEO and code maintainability. Proper ARIA labels enhance usability across all devices. Keyboard navigation benefits power users. High contrast ratios help people in bright sunlight or low-light environments. When we design for accessibility, we design for flexibility and resilience.

WCAG and ARIA in Practice
In every application, I implemented:

  • Semantic HTML structure using proper heading hierarchies, landmark regions, and meaningful element choices
  • ARIA labels and roles to provide context for screen readers and assistive technologies
  • Keyboard navigation patterns ensuring every interactive element is reachable and operable without a mouse
  • Color contrast ratios meeting WCAG AA standards (4.5:1 for normal text, 3:1 for large text)
  • Focus indicators that are clearly visible and never removed without providing an alternative
  • Alt text for images that provides meaningful descriptions, not just decorative labels
  • Error identification and suggestions that help users understand and correct mistakes
  • Responsive layouts that work across screen sizes and zoom levels without breaking functionality
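Those contrast thresholds aren't arbitrary; they come from WCAG's relative-luminance formula, which is short enough to verify in a few lines of JavaScript:

```javascript
// Linearize one 8-bit sRGB channel per the WCAG 2.x definition
// of relative luminance.
function linearize(channel) {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] color (channels 0-255).
function relativeLuminance([r, g, b]) {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), always >= 1.
function contrastRatio(fg, bg) {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0"
```

A check like this is easy to wire into a test suite, so a theme change that drops body text below 4.5:1 fails in CI instead of shipping.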

The Real-World Impact
Accessibility isn't theoretical. It changes lives. I've seen firsthand how proper ARIA implementation allows someone using a screen reader to complete a task in seconds instead of minutes. I've watched keyboard navigation enable people with motor impairments to use interfaces that would otherwise be impossible for them. I've received feedback from users with cognitive differences who appreciated clear, consistent navigation patterns.
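For readers newer to these patterns, here is the kind of markup that work produces. The page structure and labels below are hypothetical, but every attribute is a standard HTML or ARIA feature:

```html
<nav aria-label="Primary">
  <ul>
    <li><a href="/projects" aria-current="page">Projects</a></li>
    <li><a href="/writing">Writing</a></li>
  </ul>
</nav>

<main>
  <h1>Projects</h1>

  <!-- Live region: screen readers announce async status changes. -->
  <p role="status" aria-live="polite" id="save-status"></p>

  <!-- Disclosure button wired to the region it toggles. -->
  <button type="button" aria-expanded="false" aria-controls="filters">
    Filters
  </button>
  <section id="filters" hidden aria-label="Project filters">…</section>
</main>
```

Because the structure is semantic (nav, main, heading hierarchy), keyboard and screen-reader users get navigation landmarks for free, before any ARIA is added.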

Every time I write semantic HTML, every time I add an ARIA label, every time I test keyboard navigation, I'm making a choice to include rather than exclude. And that matters deeply to me.

Moving Forward
As I continue building, accessibility remains non-negotiable. The tools are here. The guidelines are clear. The impact is measurable. There's no excuse for building inaccessible applications, and I'm committed to raising the standard in every project I touch. Technology should work for everyone, and it's our responsibility as engineers to make that happen.

Closing Reflection
While this marks the final chapter of the Advent of AI challenge and the conclusion of this particular phase, it represents only the continuation of an AI engineering journey I'm committed to pursuing every single day. The tools are here, the architecture is proven, and the momentum is undeniable. Now it's time to discover just how far this approach can scale.


What's Next for this AI Engineer

This month of intensive engineering was a foundation, not a finish line. Seventeen full-stack systems later, I'm stepping into the next phase of my AI engineering journey with unprecedented clarity and structural understanding. The projects I built during this challenge were deliberately diverse: deployed full-stack applications, MCP servers, UI engines, automation workflows, spatial intelligence prototypes, accessibility-driven interfaces, and organizational systems that now fundamentally shape how I approach software development.

The next step is taking this work beyond the challenge and into the broader AI engineering community. I'll be attending the Microsoft AI Tour conference, continuing to refine and evolve my workflow, and exploring how these patterns scale into larger, production-grade systems. My goal is to keep pushing the boundaries of AI engineering and AI-assisted development: deeper MCP integrations, richer UI architectures, more sophisticated spatial and multimodal experiments, and a more intentional approach to building tools that feel both cohesive and genuinely human-centered.

This challenge may be complete, but my AI engineering journey continues. I couldn't be more excited about what comes next.

This post is part of my AI engineering journey.

Follow along for more AI Engineering Building with Eri!
