feat: add MiniMax as first-class LLM provider #74

octo-patch wants to merge 1 commit into dtyq:master from
Conversation
Add MiniMax AI as a built-in LLM service provider, leveraging its OpenAI-compatible API. MiniMax offers high-performance M2.7 and M2.5 series models with million-token context, tool calling, and deep thinking capabilities.

Changes:
- Register MiniMax in the `ProviderCode` enum with the `OpenAIModel` implementation
- Add the `MiniMaxLlm` template ID for provider-category mapping
- Add MiniMax provider initialization data with bilingual descriptions
- Add MiniMax to the frontend `ServiceProvider` enum with a default API URL
- Create the `LLMMiniMaxProvider` connectivity test class
- Add unit tests for the enum, template ID, initializer, and connectivity
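As an illustration, the frontend enum change listed above might look like the following sketch. Only the MiniMax entry and its URL come from this PR; the other members and URLs are assumptions for context, and the actual member names in `aiModel.ts` may differ.

```typescript
// Hypothetical sketch of the aiModel.ts change; only the MiniMax entry and
// its base URL are from this PR, the other members are assumed.
enum ServiceProvider {
  OpenAI = "OpenAI",
  DeepSeek = "DeepSeek",
  MiniMax = "MiniMax", // newly added provider
}

// Default API base URL per provider; MiniMax exposes an OpenAI-compatible API.
const ServiceProviderUrl: Record<ServiceProvider, string> = {
  [ServiceProvider.OpenAI]: "https://api.openai.com/v1",
  [ServiceProvider.DeepSeek]: "https://api.deepseek.com/v1",
  [ServiceProvider.MiniMax]: "https://api.minimax.io/v1",
};
```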
Cursor Bugbot has reviewed your changes and found 1 potential issue.
```php
        return Json::decode($response->getBody()->getContents());
    }
}
```
New connectivity test class duplicates existing DeepSeek provider
Low Severity
LLMMiniMaxProvider is an exact duplicate of LLMDeepSeekProvider — the only difference is the $apiBase URL string. All methods (connectivityTestByModel, fetchModels) have identical logic. Additionally, a grep for both class names shows neither is referenced anywhere in application code outside their own definitions and test files, suggesting these connectivity test classes may be unused dead code. A single parameterized class (or a base class accepting the API URL) would eliminate the duplication.
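The reviewer's suggestion can be sketched as follows, transposed to TypeScript for brevity (the real classes are PHP, and the class name here is illustrative): one class parameterized by its API base URL replaces the copy-pasted `LLMDeepSeekProvider` / `LLMMiniMaxProvider` pair.

```typescript
// Sketch of the parameterized-class idea from the review comment. The class
// name is hypothetical; only the base URLs come from the providers discussed.
class OpenAICompatibleConnectivityProvider {
  constructor(private readonly apiBase: string) {}

  // Connectivity testing and model listing both hit the same endpoint.
  modelsUrl(): string {
    return `${this.apiBase}/models`;
  }
}

// Each provider shrinks to a one-line instantiation instead of a copied class.
const deepSeek = new OpenAICompatibleConnectivityProvider("https://api.deepseek.com/v1");
const miniMax = new OpenAICompatibleConnectivityProvider("https://api.minimax.io/v1");
```

The same shape works in PHP as a base class whose constructor accepts the API URL, with each provider subclass (or factory call) supplying only its base URL.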


Summary
Add MiniMax as a built-in LLM service provider for Magic. MiniMax offers high-performance AI models (M2.7, M2.5 series) with million-token context windows, tool calling, and deep thinking capabilities, all served via an OpenAI-compatible API at https://api.minimax.io/v1.

Changes
Backend (PHP)
- ProviderCode.php: Register the `MiniMax` enum case with the `OpenAIModel` implementation
- ProviderTemplateId.php: Add the `MiniMaxLlm = '23'` template ID for provider-category mapping
- ServiceProviderInitializer.php: Add MiniMax provider initialization data with bilingual (EN/CN) descriptions
- LLMMiniMaxProvider.php: Connectivity test class following the existing DeepSeek pattern

Frontend (TypeScript)

- aiModel.ts: Add `MiniMax` to the `ServiceProvider` enum with a default API URL

Tests

- ProviderCodeMiniMaxTest.php: 12 unit tests for the enum, implementation, sort order, and template ID mapping
- LLMMiniMaxProviderTest.php: 5 unit tests for the connectivity test with mock HTTP (success, auth error, network error)
- ServiceProviderInitializerMiniMaxTest.php: 4 unit tests for provider data, translations, and sort order uniqueness

Integration Notes

- Reuses the `default` case in `ProviderConfigFactory` and `getImplementationConfig()` - no special adapter needed
- Models: `MiniMax-M2.7`, `MiniMax-M2.7-highspeed`, `MiniMax-M2.5`, `MiniMax-M2.5-highspeed`

Test Plan
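The connectivity check underlying the new tests can be sketched as below, assuming an OpenAI-style `GET /models` response. The function and helper names are illustrative, not identifiers from the PR (the real implementation is PHP).

```typescript
// Sketch only: mirrors what an OpenAI-compatible connectivity test does.
type ModelList = { data: { id: string }[] };

// Pure helper: pull model IDs out of an OpenAI-style GET /models body.
function extractModelIds(body: ModelList): string[] {
  return body.data.map((m) => m.id);
}

// A 200 from GET {apiBase}/models with the key in the Authorization header
// is treated as "reachable and key valid"; anything else fails the test.
async function checkConnectivity(apiBase: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${apiBase}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`connectivity test failed: HTTP ${res.status}`);
  return extractModelIds((await res.json()) as ModelList);
}
```

Splitting out the pure body-parsing helper keeps the network call thin and lets unit tests exercise the parsing with mock responses, matching the mock-HTTP approach the PR's tests describe.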
Note
Medium Risk
Adds a new third-party LLM provider option end-to-end (backend enums/templates/seed data plus connectivity check and frontend defaults), which could affect provider selection/order and configuration flows. Risk is mitigated by being largely additive and covered by new unit tests.
Overview
Adds MiniMax as a first-class `llm` service provider.

Backend registers `ProviderCode::MiniMax` (mapped to the existing `OpenAIModel` implementation), introduces a new `ProviderTemplateId::MiniMaxLlm`, seeds MiniMax into the default provider initialization data (with EN/CN translations), and adds an `LLMMiniMaxProvider` connectivity test that validates the API key by calling `GET https://api.minimax.io/v1/models`.

Frontend updates `AiModel.ServiceProvider` and `ServiceProviderUrl` to include MiniMax with the default base URL, and the PR adds unit tests covering the new enum/template mappings, seeded provider data, and connectivity test behavior.

Written by Cursor Bugbot for commit 04f6291.