AI helped draft this post, but I've edited it. These are meant to be quick posts for the Advent of AI. If I'm doing one of these each day, I don't have time to spend a couple of hours on each post.
The Advent of AI series leverages Goose, an open source AI agent. If you've never heard of it, check it out!
block/goose: an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
The Challenge: Democracy by AI
Day 12 threw a Winter Festival mascot crisis at me. The Festival Committee spent three hours arguing about whether their mascot should be a snowman, penguin, polar bear, ice fairy, or yeti. Classic committee deadlock.
The challenge was to use the Council of Mine extension to get nine AI personalities to debate and vote on the decision. This teaches you about MCP sampling, which I'll get to in a second.
What Actually Happened
I fired up Goose and tried to use Council of Mine. Except it didn't work. Even though the extension was installed and enabled, Goose couldn't see it. I had to disable and re-enable the extension multiple times before it finally showed up. Once I got it working, I ran multiple debates:
- Mascot choice (spoiler: the yeti won)
- Mascot name (went with "Yuki")
- Origin story (monks in the mountains)
- Personality traits (jovial, powerful, and wise)
- Festival duration (expanded to five days)
The council gave genuinely different perspectives each time. The Devil's Advocate consistently pushed back on popular choices. The Systems Thinker worried about scalability. The Optimist found upsides in everything.
The MCP Sampling Thing
Here's what makes this interesting beyond "AI voting on stuff." Council of Mine doesn't have its own LLM. It uses MCP sampling to ask the AI you're already connected to for help.
Angie Jones (@techgirl1908) wrote about MCP sampling recently. Check it out!
Normal MCP flow: You talk to Goose, Goose calls an MCP tool, the tool returns data.
MCP sampling flow: You talk to Goose, Goose calls an MCP tool, the tool asks Goose's LLM for help, the tool processes that response and returns it.
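On the wire, that sampling step is a JSON-RPC request flowing in the "wrong" direction: from the server back to the client. Per the MCP spec, the method is `sampling/createMessage`. The field values below are illustrative (a made-up debate prompt), but the shape follows the spec:

```json
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "sampling/createMessage",
  "params": {
    "messages": [
      {
        "role": "user",
        "content": {
          "type": "text",
          "text": "Should the Winter Festival mascot be a yeti?"
        }
      }
    ],
    "systemPrompt": "You are the Devil's Advocate. Challenge the popular option.",
    "maxTokens": 400
  }
}
```

The client forwards this to its configured model and returns the completion, which the server is then free to post-process before responding to the original tool call.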
The Council of Mine extension defines nine personality system prompts. When you start a debate, it makes nine separate sampling calls, one for each council member, with their distinct personality prepended to the prompt. Then it makes another nine calls for voting, and one more for synthesizing the results.
That's 19 LLM calls per debate, all going through whatever AI model you have configured in Goose.
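I haven't reproduced Council of Mine's internals here, but the call pattern is easy to sketch. Everything below (`Sampler`, `runDebate`, the personality names) is my own hypothetical naming for illustration, not the extension's actual code:

```typescript
// A sampling hook: in a real MCP server this would send a
// sampling/createMessage request back to the connected client.
type Sampler = (systemPrompt: string, userPrompt: string) => Promise<string>;

// Nine distinct personas, each with its own system prompt.
const PERSONALITIES = [
  "Optimist", "Pessimist", "Devil's Advocate", "Systems Thinker",
  "Pragmatist", "Visionary", "Historian", "Economist", "Mediator",
];

// 9 debate calls + 9 voting calls + 1 synthesis call = 19 total.
async function runDebate(topic: string, sample: Sampler): Promise<string> {
  // Round 1: each persona states a position.
  const positions = await Promise.all(
    PERSONALITIES.map((p) => sample(`You are the ${p}.`, `Debate: ${topic}`)),
  );
  // Round 2: each persona votes after seeing all positions.
  const votes = await Promise.all(
    PERSONALITIES.map((p) =>
      sample(
        `You are the ${p}.`,
        `Given these positions:\n${positions.join("\n")}\nCast your vote on: ${topic}`,
      ),
    ),
  );
  // Final call: a neutral synthesis of the votes.
  return sample(
    "You are a neutral moderator.",
    `Synthesize a final decision from these votes:\n${votes.join("\n")}`,
  );
}
```

The server never touches an API key; every one of those 19 calls is billed to, and routed through, whatever model the Goose user already configured.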
Why This Matters
Sampling lets MCP servers be intelligent without managing their own API keys, model selection, or LLM infrastructure. The server becomes an orchestrator, not an AI application.
You could build:
- A code review system with multiple expert viewpoints
- A documentation analyzer that explains concepts differently based on user level
- A search tool that intelligently filters and ranks results
- A debate simulator for product decisions (which is literally what I just used)
The Council of Mine repo has examples of how to structure sampling requests, handle different LLM response formats, and protect against prompt injection. If you're building MCP servers, it's worth studying how they handle the nine distinct personalities.
Build Your Own Sampling MCP Server?
Building your own MCP server with sampling was part of the bonus for this challenge. I didn't get around to it today due to time constraints, but I opened an issue in my TypeScript MCP template repo a couple of weeks ago. I should probably get on that.
Feature: Implement MCP Sampling Support (#55)
Implement a bidirectional sampling pattern that allows MCP servers to request AI-generated content from connected clients for intelligent judgment tasks like summarization, sentiment analysis, and content generation.
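If I ever close that issue, the heart of it would be something like the sketch below: a tool handler that, instead of returning raw data, asks the client's model to perform a judgment task (here, summarization) first. The names (`CreateMessage`, `summarizeTool`) are hypothetical; a real implementation would wire this through the MCP SDK's sampling request rather than a plain callback:

```typescript
// Hypothetical signature for the client-provided sampling hook.
type CreateMessage = (req: {
  systemPrompt: string;
  prompt: string;
  maxTokens: number;
}) => Promise<string>;

// A tool handler that uses sampling: the server orchestrates,
// the client's model does the intelligent part.
async function summarizeTool(
  rawText: string,
  createMessage: CreateMessage, // supplied by the client connection
): Promise<{ summary: string; originalLength: number }> {
  const summary = await createMessage({
    systemPrompt: "Summarize the input in one sentence.",
    prompt: rawText,
    maxTokens: 100,
  });
  return { summary, originalLength: rawText.length };
}
```

This is the "bidirectional" part of the issue: the tool call goes client-to-server as usual, but mid-handler the server turns around and makes its own request of the client.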
The Takeaway
MCP sampling is one of those features that opens up a new category of tools. You're not just exposing data anymore. You're orchestrating intelligent behavior using whatever AI the user brought with them.
Also, extensions still have rough edges. Keep the disable/enable trick in your back pocket.
If you want to try this yourself, the Council of Mine docs walk through installation. The MCP sampling guide explains the technical details. And the Advent of AI challenge has the full requirements.
If you want to stay in touch, all my socials are on nickyt.online.
Until the next one!
Photo by Laura Peruchi on Unsplash