Supported Model Providers
CUA VLM Router (Recommended)
Use CUA's cloud inference API for intelligent routing and cost optimization with a single API key. CUA manages all provider infrastructure and credentials for you.
```python
model="cua/anthropic/claude-sonnet-4.5"  # Claude Sonnet 4.5 (recommended)
model="cua/anthropic/claude-haiku-4.5"   # Claude Haiku 4.5 (faster)
```

Benefits:
- Single API key for multiple providers
- Cost tracking and optimization
- Fully managed infrastructure (no provider keys to manage)
Learn more about CUA VLM Router →
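The `cua/` prefix tells the agent to route the request through CUA's cloud API; the rest of the string names the underlying provider and model. A minimal sketch of how such a string decomposes (the `split_cua_model` helper is hypothetical, for illustration only; the real routing happens server-side):

```python
def split_cua_model(model: str) -> tuple[str, str]:
    """Split a 'cua/<provider>/<model>' string into (provider, model).

    Hypothetical helper for illustration only; CUA's cloud API does
    the actual routing.
    """
    if not model.startswith("cua/"):
        raise ValueError(f"not a CUA VLM Router model string: {model!r}")
    provider, _, name = model.removeprefix("cua/").partition("/")
    return provider, name

provider, name = split_cua_model("cua/anthropic/claude-sonnet-4.5")
print(provider, name)  # anthropic claude-sonnet-4.5
```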
Anthropic Claude (Computer Use API - BYOK)
Direct access to Anthropic's Claude models using your own Anthropic API key (BYOK - Bring Your Own Key).
```python
model="anthropic/claude-3-7-sonnet-20250219"
model="anthropic/claude-opus-4-20250514"
model="anthropic/claude-sonnet-4-20250514"
```

Setup: Set the ANTHROPIC_API_KEY environment variable to your Anthropic API key.
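Because BYOK providers read credentials from the environment, a small guard that fails fast with a clear message can save debugging time. A sketch (the `require_api_key` helper is an illustrative assumption, not part of any CUA or Anthropic SDK):

```python
import os

def require_api_key(var: str) -> str:
    """Return the API key stored in environment variable `var`,
    raising a clear error if it is missing or empty.

    Illustrative helper, not part of any SDK.
    """
    key = os.environ.get(var, "").strip()
    if not key:
        raise RuntimeError(
            f"{var} is not set; export it before running the agent, "
            f"e.g. export {var}=sk-..."
        )
    return key

# Usage (assumes the variable is exported in your shell):
# api_key = require_api_key("ANTHROPIC_API_KEY")
```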
OpenAI Computer Use Preview (BYOK)
Direct access to OpenAI's computer use models using your own OpenAI API key (BYOK).
```python
model="openai/computer-use-preview"
```

Setup: Set the OPENAI_API_KEY environment variable to your OpenAI API key.
UI-TARS (Local or Hugging Face Inference)
Run UI-TARS models locally for privacy and offline use.
```python
model="huggingface-local/ByteDance-Seed/UI-TARS-1.5-7B"
model="ollama_chat/0000/ui-tars-1.5-7b"
```

Omniparser + Any LLM
Combine Omniparser for UI understanding with any LLM provider.
```python
model="omniparser+ollama_chat/mistral-small3.2"
model="omniparser+vertex_ai/gemini-pro"
model="omniparser+anthropic/claude-sonnet-4-5-20250929"
model="omniparser+openai/gpt-4o"
```