DEV Community

Discussion on: MCP vs A2A: The Complete Guide to AI Agent Protocols in 2026

Leon

The granularity question is exactly what I've been wrestling with. I built an MCP server for browser automation with ~30 tools, and the answer I landed on was a layered protocol: 8 irreducible core operations (eval, pointer, keyboard, nav, wait, screenshot, run, capabilities) + 17 composed built-in operations that any AI client gets for free.

The key insight: the AI doesn't need 30 fine-grained tools if you give it a small, composable core + higher-level operations built from that core. click(target) is just eval(find) + pointer(x, y, 'click') — but the AI can call either depending on what it needs.
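To make the layering concrete, here is a minimal sketch of a primitive core plus composed operations built on top of it. All names here (eval_js, pointer, click, TOOLS) are illustrative stand-ins, not the actual server's API:

```python
# Hypothetical sketch: a small primitive core plus composed operations
# that are just documented chains of primitives. Both layers are exposed
# as tools, so the AI can call whichever level it needs.

def eval_js(expr):
    """Core primitive: evaluate an expression in the page, return a result."""
    # Stubbed for illustration; a real server would drive a browser here
    # and return the located element's coordinates.
    return {"x": 100, "y": 200}

def pointer(x, y, action):
    """Core primitive: perform a pointer action at (x, y)."""
    return f"{action}@({x},{y})"

def click(target):
    """Composed operation: find the target, then click it.

    This is just a documented alias for eval + pointer — it adds
    convenience, not new capability."""
    pos = eval_js(f"find({target!r})")
    return pointer(pos["x"], pos["y"], "click")

# Both layers registered side by side; composed ops reduce round-trips
# for common cases while the primitives keep the surface irreducible.
TOOLS = {
    "core": [eval_js, pointer],
    "composed": [click],
}
```

The point of the split is that composed operations are pure sugar over the core, so the AI never loses access to the fine-grained layer when it needs it.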

On A2A: I think MCP's "tool as function call" model wins for single-agent use cases. A2A adds value when agents need to negotiate capabilities with each other — but most real-world automation today is one agent talking to one tool server, not agent-to-agent coordination.

Alessandro Pireno

The layered protocol is the same design I landed on with DOMShell. 39 tools total, but structurally it is a small set of primitives (eval, cd, ls, find, text, click, type, scroll) plus composed operations that chain them (read, grep, extract_table, extract_links). The AI calls whichever level it needs, and the composed operations are just documented aliases for common primitive chains. The 8+17 split you describe is almost identical.

Where I ended up diverging: DOMShell exposes a filesystem metaphor on top of the accessibility tree rather than the raw DOM. You cd into a section, ls its children, grep for elements. That abstraction cut API calls by about 50% compared to coordinate-based approaches, because the agent navigates structure rather than pixels.

Agree on A2A. The single-agent-to-tool-server pattern is where 95% of real usage is today. A2A becomes interesting when you need agents to discover and negotiate capabilities, which is a different problem than tool execution.
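For readers curious what the filesystem metaphor looks like in practice, here is a toy sketch of cd/ls/grep over a nested tree. The tree contents, class, and method names are hypothetical illustrations, not DOMShell's actual implementation:

```python
# Illustrative sketch: navigating an accessibility-tree-like structure
# with filesystem verbs. The agent moves through named sections instead
# of addressing pixels or raw DOM nodes.

TREE = {
    "main": {
        "nav": {"link:Home": {}, "link:Docs": {}},
        "article": {"heading:Intro": {}, "button:Subscribe": {}},
    }
}

class Shell:
    def __init__(self, tree):
        self.tree = tree
        self.path = []  # current location in the tree

    def _node(self):
        node = self.tree
        for part in self.path:
            node = node[part]
        return node

    def cd(self, name):
        """Descend into a child section, like cd into a directory."""
        if name not in self._node():
            raise KeyError(name)
        self.path.append(name)

    def ls(self):
        """List the children of the current node."""
        return sorted(self._node().keys())

    def grep(self, pattern):
        """Find descendants under the current node whose name matches."""
        hits = []

        def walk(node, prefix):
            for name, child in node.items():
                if pattern in name:
                    hits.append("/".join(prefix + [name]))
                walk(child, prefix + [name])

        walk(self._node(), [])
        return hits
```

A typical exchange: cd into "main", ls to see ["article", "nav"], then grep "button" to locate "article/button:Subscribe" — each call returning structure the agent can reason over rather than coordinates.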