# LLM Integrations

## LiteLLM Integration
This MCP server features comprehensive LiteLLM integration, allowing you to use any supported LLM provider with a simple model-string configuration.
- Unified Configuration: Use a single `CUA_MODEL_NAME` environment variable with a model string
- Automatic Provider Detection: The agent automatically detects the provider and capabilities from the model string
- Extensive Provider Support: Works with Anthropic, OpenAI, local models, and any LiteLLM-compatible provider
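As a minimal configuration sketch, selecting a model is just a matter of exporting the variable before starting the server; the `CUA_MODEL_NAME` variable and the model string come from this page, while how you launch the server itself depends on your setup:

```shell
# Select the LLM via a single LiteLLM-style model string.
# The launch command for the MCP server is whatever your setup uses;
# only the environment variable below is documented here.
export CUA_MODEL_NAME="anthropic/claude-sonnet-4-5-20250929"
```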
Model String Examples:
- Anthropic: `"anthropic/claude-sonnet-4-5-20250929"`
- OpenAI: `"openai/computer-use-preview"`
- UI-TARS: `"huggingface-local/ByteDance-Seed/UI-TARS-1.5-7B"`
- Omni + Any LiteLLM: `"omniparser+litellm/gpt-4o"`, `"omniparser+litellm/claude-3-haiku"`, `"omniparser+ollama_chat/gemma3"`
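To make the model-string convention concrete, here is a minimal sketch of how such a string decomposes into its parts. This is an illustration, not the agent's actual detection code; in particular, treating the `omniparser+` prefix as a separate screen-parser component is an assumption based on the examples above.

```python
def parse_model_string(model: str) -> dict:
    """Split a LiteLLM-style model string into parser, provider, and model.

    Illustrative only -- the real agent's provider detection may differ.
    """
    parser = None
    # Assumption: a "something+" prefix (e.g. "omniparser+") names a
    # screen parser placed in front of the LLM provider.
    if "+" in model:
        parser, model = model.split("+", 1)
    # The provider is the segment before the first "/".
    provider, _, name = model.partition("/")
    return {"parser": parser, "provider": provider, "model": name}

# Examples from this page:
print(parse_model_string("anthropic/claude-sonnet-4-5-20250929"))
# {'parser': None, 'provider': 'anthropic', 'model': 'claude-sonnet-4-5-20250929'}
print(parse_model_string("omniparser+ollama_chat/gemma3"))
# {'parser': 'omniparser', 'provider': 'ollama_chat', 'model': 'gemma3'}
```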