Claude Code has hit number one among developer tools. NVIDIA's Rubin platform inches closer to general availability. MCP crossed 97 million monthly SDK downloads. This week in AI moved fast, and the signals point in one direction: agentic systems are becoming the new baseline.
AI Coding Tools: Claude Code Tops Developer Charts
Claude Code has become the most-used AI coding tool in just eight months. A survey of nearly 1,000 software engineers published this week by The Pragmatic Engineer found that 95% of respondents use AI tools at least weekly. Claude Code leads the pack, followed by GitHub Copilot and Cursor.
Among the smallest companies surveyed, Claude Code holds a 75% usage rate. The tool, released by Anthropic in 2025, lets developers work directly in the command line to write, run, debug, and iterate on code without manual intervention at each step. Engineers now report doing 70% or more of their coding work with AI assistance.
OpenAI moved in parallel this week. On March 6, the company launched Codex Security in preview. The tool uses OpenAI's latest models to scan a codebase for real vulnerabilities while cutting false-positive noise. Beta tests found critical bugs, including a cross-tenant authentication flaw, that standard tools had missed. OpenAI reports more than 90% fewer false-positive findings compared to traditional scanners.
The Washington Post published a hands-on look at Claude Cowork this week, Anthropic's desktop tool for non-developers. A reader built a functional media-tracking website in minutes using plain language. The tool required no command-line knowledge. This signals that AI coding is no longer only for engineers.
The broader pattern is clear. AI generates as much as 30% of Microsoft's code and more than a quarter of Google's code, according to statements from each company's CEO. Meta's Mark Zuckerberg has said he wants most of Meta's code written by AI agents. Agentic coding is not a coming trend. It is the current state of the industry.
AI Processing: NVIDIA Rubin Locks In Cloud Partners
NVIDIA confirmed this week that the Rubin platform is in full production. Hardware will reach cloud partners in the second half of 2026. AWS, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure will be among the first to deploy Vera Rubin-based instances. CoreWeave, Lambda, Nebius, and Nscale are also in the initial rollout.
The Rubin platform promises up to a 10x reduction in inference token cost and a 4x reduction in GPUs needed to train Mixture-of-Experts models, compared to the current Blackwell platform. NVIDIA's Spectrum-X Ethernet Photonics switches deliver 5x improved power efficiency. A new Inference Context Memory Storage Platform, built on the BlueField-4 storage processor, targets agentic AI reasoning workloads specifically.
Microsoft will deploy NVIDIA Vera Rubin NVL72 rack-scale systems at its next-generation Fairwater AI superfactories, which will scale to hundreds of thousands of Rubin Superchips. Meta announced a separate multiyear partnership with NVIDIA covering millions of Blackwell and Rubin GPUs for training and inference. The partnership includes NVIDIA Grace CPUs, Spectrum-X Ethernet networking, and Confidential Computing for user data protection.
NVIDIA GTC 2026 opens on March 16 in San Jose. CEO Jensen Huang has said the company will reveal "several new chips the world has never seen." Speculation points to the Feynman architecture, designed to accelerate on-device agentic AI inference. Rubin and its HBM4 memory configuration are also expected to be detailed further.
The AI infrastructure arms race is not slowing. Gartner projects worldwide AI spending will reach $2.52 trillion in 2026. Hardware is the foundation, and NVIDIA is building most of it.
Standards and Protocols: MCP Crosses 97 Million Monthly Downloads
The Model Context Protocol (MCP) reached 97 million monthly SDK downloads in February 2026, covering both Python and TypeScript implementations. Every major AI provider now supports the protocol: Anthropic, OpenAI, Google, Microsoft, and Amazon.
Anthropic donated MCP to the Agentic AI Foundation (AAIF), a directed fund under the Linux Foundation, in December 2025. The AAIF was co-founded by Anthropic, Block, and OpenAI, with support from Google, Microsoft, AWS, Cloudflare, and Bloomberg. The foundation also houses OpenAI's AGENTS.md standard and Block's goose agent framework.
AGENTS.md has been adopted by more than 60,000 open-source projects. The standard gives AI coding agents a consistent way to understand project-specific instructions across different repositories and toolchains. Tools including Cursor, GitHub Copilot, Codex, Devin, and VS Code have all adopted it.
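An AGENTS.md file is plain Markdown placed at the repository root, and agents read it as free-form project instructions. A minimal, purely illustrative example (the commands and conventions below are hypothetical, not from any particular project) might look like:

```markdown
# AGENTS.md

## Setup
- Install dependencies with `npm install`.

## Testing
- Run `npm test` before committing; all tests must pass.

## Conventions
- Use TypeScript strict mode; avoid `any`.
- Prefer small, focused functions and descriptive names.
```

Because the format is just Markdown, the same file works unchanged across Cursor, Copilot, Codex, and other agents that support the standard.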
This week, Guideline launched a new MCP Server for media plan management. The server gives advertising agencies direct access to their media planning data through AI agents, removing the need to manually export reports across disconnected systems. The server uses read-only access for security. Industry analysts project that 75% of enterprise gateway vendors will integrate MCP capabilities by the end of 2026.
The companion Agent-to-Agent (A2A) protocol, originally created by Google and now under AAIF governance, handles agent-to-agent communication. MCP handles how an agent talks to tools. A2A handles how agents talk to each other. The two protocols are now used together in production agentic systems. As of March 2026, the "write once, use everywhere" promise of MCP is working in practice. A Postgres MCP server built for one AI client runs across every major AI platform.
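Under the hood, MCP's portability comes from a simple wire format: every client and server exchanges JSON-RPC 2.0 messages, so a tool call looks the same regardless of which AI platform issues it. The sketch below builds an MCP `tools/call` request by hand; the `query` tool name and SQL argument are illustrative stand-ins for what a Postgres-style MCP server might expose, not a real server's schema.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    MCP messages are plain JSON-RPC 2.0: a method name plus params.
    Any MCP-capable client can send this to any MCP server, which is
    what makes one server usable across every major AI platform.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical call against a Postgres-style MCP server's `query` tool.
msg = make_tool_call(1, "query", {"sql": "SELECT count(*) FROM orders"})
print(msg)
```

In practice, developers rarely construct these messages by hand; the official Python and TypeScript SDKs (the ones behind the 97 million monthly downloads) handle framing, transport, and capability negotiation, and expose tools as ordinary functions.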
The open standard question for agentic AI is largely settled. The ecosystem is consolidating around MCP and A2A as the foundational protocols for the next generation of AI systems.
Build Your Agentic Data Platform with Dremio
AI agents need fast, reliable access to data. Data teams building AI pipelines need tools that keep pace with model improvements without rebuilding infrastructure each quarter.
Dremio's Agentic Lakehouse Platform gives AI systems query access to Apache Iceberg data across clouds and on-premises sources through a single interface. The platform includes a native MCP server so AI agents can query your lakehouse directly using the same open protocol powering the rest of the agentic ecosystem.
Start your free 30-day trial at dremio.com/get-started
Want to go deeper on building AI-ready data architectures?
The agentic AI stack is being built right now. Your data layer should be ready for it.