Originally published on AI Tech Connect.
What's new in Gemini 3.1 Ultra

Two headline additions define the 3.1 release. The first is a 2-million-token context window, doubled from the 1M maximum available in earlier Gemini Ultra builds and twice the size of Claude Opus 4.7's 1M context. The second is a native sandboxed Code Execution tool: the model can write Python, execute it in an isolated environment, read the standard output, and iterate within the same conversation turn, with no external interpreter infrastructure required on your side.

Both features were announced at Google Cloud Next '26 alongside Google's TPU 8i chip, which Google claims delivers 80% better performance-per-dollar than the prior generation. That hardware improvement matters for production economics: it translates directly into lower per-token serving…
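To make the write-execute-read-iterate loop concrete, here is a minimal local sketch of what a sandboxed execution step looks like. This is not the hosted tool's API (the provider runs real isolation server-side); it is a toy stand-in using a subprocess, with `run_sandboxed` being a hypothetical helper name.

```python
import subprocess
import sys
import tempfile

def run_sandboxed(code: str, timeout: float = 5.0) -> tuple[str, str, int]:
    """Run a Python snippet in a separate process and capture its output.

    Toy stand-in for a hosted sandbox: real isolation (filesystem, network,
    CPU/memory limits) is enforced by the provider, not shown here.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run(
        [sys.executable, "-I", path],  # -I: isolated mode, ignores user site/env
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout, result.stderr, result.returncode

# One turn of the loop: the model emits code, the sandbox runs it,
# and the captured stdout is fed back for the model to iterate on.
stdout, stderr, rc = run_sandboxed("print(sum(range(10)))")
print(stdout.strip())  # → 45
```

In the hosted version, the feedback step happens inside a single conversation turn: the model sees the stdout (or traceback) and can revise its code before answering.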