
Orchid Files


How to Connect MiniMax-M2.7 to Cursor

MiniMax-M2.7 is a new Chinese frontier model from MiniMax. According to some benchmarks, it has almost caught up with Opus-4.6. However, based on my tests over the past few days, I've concluded that it doesn't even measure up to Sonnet-4. For simple tasks, everything is fine. But in a monorepo project structure with packages and apps, it takes a lot of iterations to complete a task. Even when the rules and skills spell out the project structure (where types, helpers, and ESLint configurations live), it still doesn't follow it. It's also very slow compared to Sonnet, and about 20 times slower than GPT-5.4. I asked the agent running MiniMax to copy the structure from another monorepo repository, and it took 4 hours of back-and-forth clarifying details before it finally completed the task. With Sonnet-4.6, this takes me about 15–30 minutes. Benchmarks and real-world performance differ greatly: a model that benchmarks close to Sonnet or even Opus may in practice trail them by a wide margin.

But you need to consider not only quality and speed, but also price. MiniMax is 10 times cheaper than Sonnet-4.6. A $10 subscription gives you 1,500 queries every 5 hours and 15,000 queries per week. I also noticed that MiniMax's planning capabilities are quite good — comparable to Sonnet or GPT. Therefore, for simple tasks or situations where speed isn't a priority, MiniMax can be a reasonable choice.

The documentation on the MiniMax website states that the model uses the OpenAI format and is compatible with most IDEs. But it turned out that if you connect it to Cursor, it doesn't separate the contents of <think>...</think> from the actual response; everything goes into a single stream. You see the model's thoughts and the response mixed together, and at some point it becomes unclear where its thoughts end and the response to the user begins. Working in this mode is extremely inconvenient.
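Separating the two after the fact is fiddly because a `<think>` or `</think>` tag can arrive split across two streamed chunks, so a simple string replace on each chunk isn't enough. Here is a minimal sketch in TypeScript of a stateful splitter for such a stream; this is illustrative only, not Ungate's actual code, and all names are mine:

```typescript
// Sketch: split a stream of text chunks into reasoning (<think>...</think>)
// and answer segments, even when a tag straddles a chunk boundary.
type Segment = { kind: "reasoning" | "answer"; text: string };

class ThinkSplitter {
  private buffer = "";
  private inThink = false;

  // Feed one streamed chunk; returns the segments that are safe to emit.
  push(chunk: string): Segment[] {
    this.buffer += chunk;
    const out: Segment[] = [];
    for (;;) {
      const tag = this.inThink ? "</think>" : "<think>";
      const idx = this.buffer.indexOf(tag);
      if (idx === -1) {
        // Hold back a trailing "<..." that could be the start of a split tag.
        let safe = this.buffer.length;
        const lt = this.buffer.lastIndexOf("<");
        if (lt !== -1 && tag.startsWith(this.buffer.slice(lt))) safe = lt;
        if (safe > 0) {
          out.push({
            kind: this.inThink ? "reasoning" : "answer",
            text: this.buffer.slice(0, safe),
          });
          this.buffer = this.buffer.slice(safe);
        }
        return out;
      }
      if (idx > 0) {
        out.push({
          kind: this.inThink ? "reasoning" : "answer",
          text: this.buffer.slice(0, idx),
        });
      }
      this.buffer = this.buffer.slice(idx + tag.length);
      this.inThink = !this.inThink; // toggle on every open/close tag
    }
  }

  // Emit whatever remains when the stream ends.
  flush(): Segment[] {
    if (!this.buffer) return [];
    const seg: Segment = {
      kind: this.inThink ? "reasoning" : "answer",
      text: this.buffer,
    };
    this.buffer = "";
    return [seg];
  }
}
```

The key design point is the holdback: when no complete tag is found, only text that cannot be the prefix of a tag is released, so nothing is shown to the user and then retracted.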

I've added MiniMax support to the Ungate extension for Cursor that I'm developing. It processes content from <think>...</think> blocks, and now Cursor correctly separates the model's reasoning from the response to the user. Working with MiniMax in Cursor is now the same as with other OpenAI-compatible models. I wrote more about the extension itself in a previous post: How to use a Claude Subscription in Cursor. To access the model, you need to add the custom model name MiniMax-M2.7 to Cursor. I've added a Base URL selector for MiniMax to the Ungate settings: China, Global, Custom. Also, the query analytics now distinguishes between Claude and MiniMax.
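Conceptually, once the reasoning is separated it can be re-emitted in the shape OpenAI-compatible clients expect, with reasoning in its own delta field. The sketch below uses the `reasoning_content` field name, a convention some OpenAI-compatible providers use for reasoning tokens; whether Ungate emits exactly this wire format is my assumption for illustration:

```typescript
// Illustrative only: repackage a separated segment as an OpenAI-style
// chat.completion.chunk SSE line, putting reasoning into a separate
// "reasoning_content" delta field instead of mixing it into "content".
type Segment = { kind: "reasoning" | "answer"; text: string };

function toSseChunk(seg: Segment, model = "MiniMax-M2.7"): string {
  const delta =
    seg.kind === "reasoning"
      ? { reasoning_content: seg.text }
      : { content: seg.text };
  const payload = {
    object: "chat.completion.chunk",
    model,
    choices: [{ index: 0, delta }],
  };
  return `data: ${JSON.stringify(payload)}\n\n`;
}
```

With the two channels separated like this, the editor can render the reasoning as a collapsible section and the answer as the actual reply.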
 

Ungate extension repository: https://github.com/orchidfiles/ungate

VSX Marketplace: https://open-vsx.org/extension/orchidfiles/ungate

Install from the terminal: cursor --install-extension orchidfiles.ungate
