fix(server): skip auth check when Codex CLI uses a custom model provider #649
Conversation
When the Codex CLI is configured with a custom model_provider in ~/.codex/config.toml (e.g. Portkey, Azure OpenAI proxy), authentication is handled via provider-specific environment variables rather than `codex login`. The `codex login status` probe would report 'not logged in' and t3code would treat this as a blocking error, even though the CLI works perfectly fine. This change reads the model_provider key from the Codex CLI config file at startup. When a non-OpenAI provider is detected, the auth probe is skipped and the provider health check returns ready with authStatus 'unknown' instead of erroring out. Fixes pingdotgg#644
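In outline, the change makes the health check branch on the configured provider before probing auth. A minimal TypeScript sketch of that decision (the real code is Effect-based and lives in the server; `decideHealth` and its parameters are hypothetical names for illustration):

```typescript
// Hypothetical sketch of the startup decision described above.
type AuthStatus = "authenticated" | "unauthenticated" | "unknown";
type Health = { status: "ready" | "error"; authStatus: AuthStatus; message?: string };

function decideHealth(modelProvider: string | undefined, loggedIn: boolean): Health {
  // A non-OpenAI model_provider authenticates via its own env vars,
  // so `codex login status` is not meaningful: skip the probe.
  if (modelProvider !== undefined && modelProvider !== "openai") {
    return {
      status: "ready",
      authStatus: "unknown",
      message: `auth probe skipped for custom provider "${modelProvider}"`,
    };
  }
  // Default OpenAI flow: the login probe result decides readiness.
  return loggedIn
    ? { status: "ready", authStatus: "authenticated" }
    : { status: "error", authStatus: "unauthenticated" };
}
```

With a Portkey-style config this returns `ready`/`unknown` without ever consulting the login probe, which is the behavior the PR describes.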
Apologies for tagging. This will be a massive unblocker for anyone using it in the workplace. It will allow us to collect more bug reports and potentially enter that market by unblocking them.
Force-pushed 9be9575 to 7325895
- make `readCodexConfigModelProvider` and `hasCustomModelProvider` Effect-based
- skip `codex login status` via Effect flow when custom model providers are configured
- refactor ProviderHealth tests to use scoped Effect Node services and temp `CODEX_HOME`
- replace Node assert import with `assert` from `@effect/vitest`
- keep test assertions aligned with the Effect Vitest test stack
Summary
Fixes #644
When the Codex CLI is configured with a custom `model_provider` in `~/.codex/config.toml` (e.g. Portkey, Azure OpenAI proxy, Ollama), t3code's startup health check incorrectly blocks the user from using the app. The `codex login status` probe reports "not logged in" because there is no `~/.codex/auth.json`, even though the CLI works perfectly: authentication is handled via provider-specific environment variables (e.g. `PORTKEY_API_KEY`, `AZURE_API_KEY`).
Problem
The provider health check in `ProviderHealth.ts` runs two sequential probes at server startup:
1. `codex --version`: checks the CLI is installed and above the minimum version
2. `codex login status`: checks OpenAI authentication status

For custom model providers, probe 2 always fails because:
- the user never runs `codex login`
- `codex login status` reports "not logged in" since there is no `auth.json`
- the health check reports a blocking error (`status: "error"`, `authStatus: "unauthenticated"`)
Solution
- `readCodexConfigModelProvider()`: reads the `model_provider` key from the Codex CLI config file (`$CODEX_HOME/config.toml` or `~/.codex/config.toml`) using a line-by-line scan of the top-level TOML section (no new dependency needed)
- `hasCustomModelProvider()`: returns `true` when `model_provider` is set to anything other than `"openai"`
- `checkCodexProviderStatus` skips the `codex login status` probe when a custom provider is detected, returning `status: "ready"` with `authStatus: "unknown"` and a descriptive message

The version check (probe 1) always runs regardless of provider configuration.
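The two helpers above might be sketched like this (a minimal, self-contained TypeScript approximation of the line-by-line scan; the real implementation is Effect-based and may differ in details):

```typescript
// Hypothetical sketch of the config scan described in this PR.
// Only the top-level TOML table is consulted: scanning stops at the
// first [section] header, so keys inside sections are ignored.
const TOP_LEVEL_KEY = /^\s*model_provider\s*=\s*(?:"([^"]*)"|'([^']*)')\s*(?:#.*)?$/;

function readModelProviderFromToml(contents: string): string | undefined {
  for (const line of contents.split("\n")) {
    if (/^\s*\[/.test(line)) return undefined; // entered a [section]: stop
    const match = TOP_LEVEL_KEY.exec(line);
    if (match) return match[1] ?? match[2]; // double- or single-quoted value
  }
  return undefined;
}

function hasCustomModelProvider(provider: string | undefined): boolean {
  return provider !== undefined && provider !== "openai";
}
```

For example, `readModelProviderFromToml('model_provider = "portkey"')` yields `"portkey"`, for which `hasCustomModelProvider` returns `true` and the auth probe is skipped.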
Why not change `ProviderKind`?
The issue mentions that `provider: "codex"` is hardcoded throughout. This is intentional: `ProviderKind` refers to the agent runtime (the Codex CLI vs. a future Claude Code), not the model API endpoint. The `model_provider` in `config.toml` controls which API backend the Codex CLI connects to internally; it is an implementation detail of the Codex adapter. The two concepts sit at different levels of abstraction.
Test coverage
- `checkCodexProviderStatus` tests now properly isolate `CODEX_HOME` via temp directories to prevent env leakage
- new tests exercise the `checkCodexProviderStatus` flow with custom providers (a Portkey config), confirming the auth probe is skipped and the spawner never receives `login status` args
- a case with `model_provider = "openai"` confirms the auth probe still runs
- tests for `readCodexConfigModelProvider` cover: missing file, missing key, top-level providers, section-scoped keys (ignored), comments/whitespace, and both single- and double-quoted TOML values
- tests for `hasCustomModelProvider` cover: no config, no key, openai, portkey, azure, ollama, and arbitrary custom proxy names
Verification
- `bun lint` passes (zero new warnings/errors)
- `bun typecheck` passes across all 7 packages
- `bun run test`: all 26 ProviderHealth tests pass; no regressions in other test files
Note
Skip auth probe in `checkCodexProviderStatus` when the Codex CLI uses a custom model provider:
- adds `readCodexConfigModelProvider` to parse the top-level `model_provider` key from `$CODEX_HOME/config.toml` (falling back to `~/.codex/config.toml`) using a regex scan that ignores comments and TOML sections
- adds `hasCustomModelProvider`, which returns `true` when the provider is set and not in the new `OPENAI_AUTH_PROVIDERS` set (currently only `'openai'`)
- when a custom provider is detected, `checkCodexProviderStatus` skips the `codex login status` command and returns `ready` with `authStatus: 'unknown'` and a skip message instead
- `checkCodexProviderStatus` now requires `FileSystem` and `Path` in its Effect environment

Macroscope summarized 70bcd06.
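The temp-directory isolation of `CODEX_HOME` that the test coverage mentions might look roughly like this (a hedged sketch using plain Node APIs; the actual tests use scoped Effect Node services instead):

```typescript
import { mkdtempSync, writeFileSync, readFileSync, existsSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Point CODEX_HOME at a throwaway directory so the test never touches the
// developer's real ~/.codex/config.toml.
const codexHome = mkdtempSync(join(tmpdir(), "codex-test-"));
writeFileSync(join(codexHome, "config.toml"), 'model_provider = "portkey"\n');

const previous = process.env.CODEX_HOME;
process.env.CODEX_HOME = codexHome;
let configSeen = "";
try {
  // Stand-in for running checkCodexProviderStatus: just demonstrate that the
  // config under the temp CODEX_HOME is the one that would be read.
  configSeen = readFileSync(join(codexHome, "config.toml"), "utf8");
} finally {
  // Restore the environment and remove the temp directory so nothing leaks
  // into other tests.
  if (previous === undefined) delete process.env.CODEX_HOME;
  else process.env.CODEX_HOME = previous;
  rmSync(codexHome, { recursive: true, force: true });
}
```

The try/finally restore is what prevents the env leakage the PR's test refactor addresses.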