Introduces src/utils/aiProvider.ts with getApiKey(), getBaseUrl(), and getModels() accessors that read from OPENAI_API_KEY, OPENAI_BASE_URL, and OPENAI_MODEL env vars, falling back to OPENROUTER_API_KEY and OpenRouter defaults for backwards compatibility.
Update aiMerge, PaneAnalyzer, slug, MergePane, and StatusDetector to use the centralized getApiKey/getBaseUrl/getModels accessors instead of hardcoding the OpenRouter URL, API key env var, and model list.
Generalize onboarding and shell config persistence to use OPENAI_API_KEY instead of OPENROUTER_API_KEY. Migrate legacy shell config blocks and onboarding state. Update all test files accordingly.
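The three accessors described above could look roughly like the following. This is a hedged sketch based on the commit summaries, not the actual `src/utils/aiProvider.ts` source; the default model list is a placeholder, and the function bodies are assumptions.

```typescript
// Sketch of src/utils/aiProvider.ts: centralized config accessors.
// Names come from the PR summary; bodies and defaults are assumptions.
const DEFAULT_BASE_URL = "https://openrouter.ai/api/v1";
const DEFAULT_MODELS = ["placeholder/default-model"]; // hypothetical fallback chain

export function getApiKey(): string | undefined {
  // Prefer the generic var; fall back to the legacy OpenRouter var.
  return process.env.OPENAI_API_KEY ?? process.env.OPENROUTER_API_KEY;
}

export function getBaseUrl(): string {
  return process.env.OPENAI_BASE_URL ?? DEFAULT_BASE_URL;
}

export function getModels(): string[] {
  const raw = process.env.OPENAI_MODEL;
  if (!raw) return DEFAULT_MODELS;
  // Comma-separated list, tried in order.
  return raw.split(",").map((m) => m.trim()).filter(Boolean);
}
```

Centralizing the env-var reads this way means the five call sites only change where they obtain the key, URL, and model list, not how they use them.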
This is a nice enhancement. But in some cases, connecting to these AI providers requires going through a proxy; I'm not sure whether these changes can connect through a proxy.
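One answer, as a hedged sketch rather than anything from the PR itself: because the base URL is now configurable, an OpenAI-compatible HTTP proxy (for example a corporate AI gateway) already works by pointing `OPENAI_BASE_URL` at it. The function below mirrors the PR's `getBaseUrl()` behavior; its body and the gateway URL are assumptions.

```typescript
// Assumed equivalent of the PR's getBaseUrl(): the proxy case is just
// another value of OPENAI_BASE_URL, so no code change is needed for
// an OpenAI-compatible reverse proxy.
function resolveBaseUrl(): string {
  return process.env.OPENAI_BASE_URL ?? "https://openrouter.ai/api/v1";
}

// Route all dmux AI calls through a hypothetical internal gateway:
process.env.OPENAI_BASE_URL = "https://ai-gateway.internal/v1";
console.log(resolveBaseUrl()); // prints the gateway URL set above
```

A network-level forward proxy (the conventional `HTTPS_PROXY` variable) is a different matter: that would need explicit support in the HTTP client layer, which this PR does not appear to add.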
Replaces all hardcoded OpenRouter API references with a centralized `aiProvider` module that supports any provider implementing the OpenAI API specification. Users can now point dmux at OpenAI, Ollama, LM Studio, vLLM, or any other compatible endpoint by setting standard environment variables.

Previously, `OPENROUTER_API_KEY`, the OpenRouter base URL, and a hardcoded model list were duplicated across five files (`aiMerge.ts`, `PaneAnalyzer.ts`, `slug.ts`, `MergePane.tsx`, `onboarding.ts`). This made it impossible to use a different provider without patching source code.

The new `src/utils/aiProvider.ts` exposes three config accessors consumed by all callers:

- `OPENAI_API_KEY`: API key (falls back to `OPENROUTER_API_KEY` for backwards compatibility)
- `OPENAI_BASE_URL`: provider base URL (default: `https://openrouter.ai/api/v1`)
- `OPENAI_MODEL`: comma-separated model list to try in order (overrides the default fallback chain)

Existing users with `OPENROUTER_API_KEY` set require no changes. The onboarding flow and shell config persistence (`aiApiKeySetup.ts`) were updated to write `OPENAI_API_KEY` going forward, and to migrate any existing `# >>> dmux openrouter >>>` shell config blocks automatically.
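The shell-config migration mentioned above could be sketched as follows. The `# >>> dmux openrouter >>>` marker comes from the PR text; the function name, the replacement regex, and the file handling are all assumptions for illustration.

```typescript
// Hypothetical sketch of migrating a legacy dmux shell-config block:
// rename the exported variable in place while preserving the stored key.
import { readFileSync, writeFileSync, existsSync } from "node:fs";

const LEGACY_MARKER = "# >>> dmux openrouter >>>"; // marker from the PR text

export function migrateShellConfig(path: string): void {
  if (!existsSync(path)) return;
  const content = readFileSync(path, "utf8");
  if (!content.includes(LEGACY_MARKER)) return; // nothing to migrate
  const migrated = content.replace(
    /\bOPENROUTER_API_KEY=/g,
    "OPENAI_API_KEY="
  );
  writeFileSync(path, migrated, "utf8");
}
```

A migration like this only rewrites files that contain the legacy marker, so users who never ran the old onboarding flow are untouched.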