
feat: add MiniMax as a first-class AI provider (M2.7 default)#61

Open
octo-patch wants to merge 2 commits into TraderAlice:master from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 16, 2026

Summary

Add MiniMax as a first-class AI provider via Vercel AI SDK OpenAI-compatible adapter, with M2.7 as default model.

Changes

  • Add MiniMax provider to model factory (OpenAI-compatible via @ai-sdk/openai)
  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model selection list, with MiniMax-M2.7 as the default
  • Add MiniMax-M2.5 and MiniMax-M2.5-highspeed as alternative models
  • Add MiniMax API key management in config and UI
  • API Base URL: https://api.minimax.io/v1
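
The provider wiring described above can be sketched with the AI SDK's OpenAI-compatible provider. This is an illustrative fragment, not the repo's actual model-factory.ts; the variable names are made up, and only the base URL and compatibility-mode/chat-completions details come from this PR:

```typescript
// Hypothetical sketch of the MiniMax provider case (names are illustrative).
import { createOpenAI } from "@ai-sdk/openai";

const minimax = createOpenAI({
  name: "minimax",
  baseURL: "https://api.minimax.io/v1",       // MiniMax's OpenAI-compatible API
  apiKey: process.env.MINIMAX_API_KEY,
  compatibility: "compatible",                 // third-party endpoint, not strict OpenAI
});

// Use the chat-completions endpoint, per the commit message below.
const model = minimax.chat("MiniMax-M2.7");
```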

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities.

Testing

  • All 766 existing unit tests pass
  • Integration tested with MiniMax API

Add MiniMax (https://www.minimax.io) support via the Vercel AI SDK's
OpenAI-compatible adapter. MiniMax offers MiniMax-M2.5 (204K context)
and MiniMax-M2.5-highspeed models through an OpenAI-compatible API.

Changes:
- model-factory.ts: add minimax provider case using @ai-sdk/openai
  with compatibility mode and chat completions endpoint
- config.ts: add minimax to apiKeys Zod schema
- config.ts (web routes): expose minimax API key status
- types.ts (UI): add minimax to ApiKeys type
- AIProviderPage.tsx: add MiniMax to provider list with model presets
- README.md: mention MiniMax in AI Provider and api-keys docs
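
A minimal sketch of the config.ts schema change, assuming the apiKeys field is a Zod object of optional string keys (the repo's actual schema shape may differ, and the sibling provider keys here are assumptions):

```typescript
// Hedged sketch: adds `minimax` alongside assumed existing provider keys.
import { z } from "zod";

const apiKeysSchema = z.object({
  openai: z.string().optional(),
  anthropic: z.string().optional(),
  minimax: z.string().optional(), // new: MiniMax API key
});

type ApiKeys = z.infer<typeof apiKeysSchema>;
```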

Tested with both MiniMax-M2.5 and MiniMax-M2.5-highspeed models.
All 766 existing tests continue to pass.
@octo-patch force-pushed the feature/add-minimax-provider branch from 22e116b to 8ad0645 on March 16, 2026 00:22
@luokerenx4
Contributor

Hi @octo-patch, thanks for the contribution! We appreciate the effort.

However, due to security considerations, we're unable to directly accept external PRs at this time.

That said, MiniMax support is something we're thinking about. We're currently considering two possible approaches:

  1. Configure MiniMax's Claude-compatible endpoint via Anthropic's Claude Agent SDK
  2. Configure MiniMax's Claude-compatible endpoint via the Vercel AI SDK's Claude provider
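
Option 2 might look roughly like the following. This is a sketch only: the Claude-compatible base URL is an assumption, not a documented MiniMax endpoint, and nothing here reflects a decided design:

```typescript
// Sketch of option 2: point the AI SDK's Anthropic provider at a
// Claude-compatible MiniMax endpoint. The baseURL is an assumption.
import { createAnthropic } from "@ai-sdk/anthropic";

const minimax = createAnthropic({
  baseURL: "https://api.minimax.io/anthropic", // assumed endpoint
  apiKey: process.env.MINIMAX_API_KEY,
});

const model = minimax("MiniMax-M2.7");
```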

We haven't decided yet how to best surface this in the frontend for quick switching between providers. Once we've figured that out, we'll close this PR and include your name in the commit as the original proposer.

Thanks again for your interest in the project!

@octo-patch
Author

Thanks for the response, @luokerenx4! Completely understand the security considerations. Great to hear that MiniMax support is on the roadmap — happy to close this PR if you'd prefer to handle it internally. Let me know if there's anything else I can help with!

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model (first in list)
- Keep all previous models as alternatives
@octo-patch changed the title from "feat: add MiniMax as a first-class AI provider" to "feat: add MiniMax as a first-class AI provider (M2.7 default)" on Mar 18, 2026