OpenAI

Use this provider when your team already runs on OpenAI and you want the vendor to stay explicit in your Loong config rather than disappearing behind a generic compatibility route.

At A Glance

| Field | Value |
| --- | --- |
| Built-in kind | `openai` |
| Provider group | Direct Hosted Providers |
| Protocol family | `openai_chat_completions` |
| Feature family | `openai_compatible` |
| Auth scheme | bearer |
| Credential envs | `OPENAI_CODEX_OAUTH_TOKEN`, `OPENAI_OAUTH_ACCESS_TOKEN`, `OPENAI_API_KEY` |
| Aliases | `openai_compatible` |
| Default base URL | https://api.openai.com |
| Request endpoint | https://api.openai.com/v1/chat/completions |
| Models endpoint | https://api.openai.com/v1/models |
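
The credential envs above are listed by the descriptor; a minimal sketch of resolving a bearer token by checking them in the order shown (treating that listing order as the precedence, which is an assumption here):

```python
import os

# Candidate credential environment variables, in the order the
# descriptor lists them (precedence order is an assumption).
CREDENTIAL_ENVS = [
    "OPENAI_CODEX_OAUTH_TOKEN",
    "OPENAI_OAUTH_ACCESS_TOKEN",
    "OPENAI_API_KEY",
]

def resolve_bearer_token(env=os.environ):
    """Return the first non-empty credential, or None if none are set."""
    for name in CREDENTIAL_ENVS:
        value = env.get(name)
        if value:
            return value
    return None
```

Whichever value wins is sent as `Authorization: Bearer <token>`, matching the bearer auth scheme above.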

Minimal Config

```toml
active_provider = "openai"

[providers.openai]
kind = "openai"
api_key = { env = "OPENAI_API_KEY" }
model = "auto"
```

Verify It

```shell
loong doctor
loong list-models
loong ask --message "Say hello and name the active provider."
```

If `list-models` is unreliable for this account or region, pin an explicit `provider.model` or add `preferred_models` instead of leaving recovery implicit.
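
A variant of the minimal config that pins models as suggested above. This is a sketch: the `preferred_models` key follows the wording in this section and is an assumption, and the model names are illustrative only.

```toml
active_provider = "openai"

[providers.openai]
kind = "openai"
api_key = { env = "OPENAI_API_KEY" }
# Pin an explicit model instead of "auto"...
model = "gpt-4o-mini"
# ...or list fallbacks in order of preference.
preferred_models = ["gpt-4o-mini", "gpt-4o"]
```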

Auth And Routing Contract

| Contract | Value |
| --- | --- |
| Auth optional | no |
| Model probe auth optional | no |
| Default API key env | `OPENAI_API_KEY` |
| OAuth env | `OPENAI_CODEX_OAUTH_TOKEN` |
| Primary request route | https://api.openai.com/v1/chat/completions |
| Primary model-catalog route | https://api.openai.com/v1/models |
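
For reference, a hedged sketch of what a request against the primary route looks like. It builds the request without sending it; the model name `gpt-4o-mini` is illustrative only (the minimal config above uses `model = "auto"`).

```python
import json
import urllib.request

def build_chat_request(api_key, model="gpt-4o-mini", message="Say hello."):
    """Build (but do not send) a POST against the primary request route."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={
            # Bearer auth scheme, per the contract above.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending it is then a matter of `urllib.request.urlopen(req)` (or any HTTP client) with a real credential in place of the test key.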

Operator Notes

  • OpenAI can use either the normal API key path or the OAuth token path exposed through the descriptor contract.