
# LM Studio

LM Studio is a local or self-hosted server, so the model boundary stays operator-owned. This page covers the built-in `lm_studio` provider kind.

## At A Glance

| Field | Value |
| --- | --- |
| Built-in kind | `lm_studio` |
| Provider group | Local And Self-Hosted Providers |
| Protocol family | `openai_chat_completions` |
| Feature family | `openai_compatible` |
| Auth scheme | `bearer` |
| Credential envs | none required by default |
| Aliases | `lmstudio`, `lm-studio` |
| Default base URL | `http://127.0.0.1:1234` |
| Request endpoint | `http://127.0.0.1:1234/v1/chat/completions` |
| Models endpoint | `http://127.0.0.1:1234/v1/models` |

## Minimal Config

```toml
active_provider = "lm_studio"

[providers.lm_studio]
kind = "lm_studio"
base_url = "http://127.0.0.1:1234"
model = "auto"
```
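Because the protocol family is `openai_chat_completions`, the wire request behind an ask can be sketched directly. A minimal Python sketch, using only the standard library; the `chat_request` helper is hypothetical (not part of loong) and only builds the request without sending it:

```python
import json
import urllib.request

def chat_request(base_url, model, message):
    """Build the OpenAI-compatible chat-completions request that a client
    would POST to an LM Studio server for the config above."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request("http://127.0.0.1:1234", "auto", "Say hello.")
print(req.full_url)  # http://127.0.0.1:1234/v1/chat/completions
```

Sending the built request with `urllib.request.urlopen(req)` would return a standard chat-completions JSON body when an LM Studio server is listening on the default port.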

## Verify It

```shell
loong doctor
loong list-models
loong ask --message "Say hello and name the active provider."
```

If `list-models` is unreliable in this environment, pin an explicit `provider.model` or add `preferred_models` instead of leaving recovery implicit.
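The explicit pinning that the note above recommends can be expressed in the same config table. A hedged sketch, assuming `model` and `preferred_models` accept the forms shown; the model names are illustrative placeholders, not a statement of what your server hosts:

```toml
active_provider = "lm_studio"

[providers.lm_studio]
kind = "lm_studio"
base_url = "http://127.0.0.1:1234"
# Pin one model explicitly instead of "auto"
# (name is hypothetical; use one your server actually loads)...
model = "example-7b-instruct"
# ...or list fallbacks in preference order.
preferred_models = ["example-7b-instruct", "example-8b-chat"]
```

Pinning trades convenience for determinism: startup no longer depends on a successful catalog probe to pick a model.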

## Auth And Routing Contract

| Contract | Value |
| --- | --- |
| Auth optional | yes |
| Model probe auth optional | yes |
| Default API key env | none |
| OAuth env | none |
| Primary request route | `http://127.0.0.1:1234/v1/chat/completions` |
| Primary model-catalog route | `http://127.0.0.1:1234/v1/models` |
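Since auth is optional for both requests and model probes, a catalog probe only needs a bearer header when a key happens to be configured. A minimal sketch of that conditional, using the standard library; `model_probe_request` is a hypothetical helper that builds the probe without sending it:

```python
import urllib.request

def model_probe_request(base_url, api_key=None):
    """Build a GET request for the /v1/models catalog route.

    The bearer header is attached only when a key is supplied, mirroring
    the contract above: auth is optional for requests and probes alike.
    """
    headers = {}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models", headers=headers
    )

# Anonymous probe against the default local endpoint.
req = model_probe_request("http://127.0.0.1:1234")
print(req.full_url)  # http://127.0.0.1:1234/v1/models
```

Passing `api_key="..."` would add `Authorization: Bearer ...`; omitting it sends a bare request, which a default LM Studio server accepts.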

## Operator Notes

- Authentication is optional by default on this profile, which is useful for local or self-hosted servers during first setup.