Pricing that scales with you

Pay only for what you use - credits for compute and AI inference

Free

Free - 10 credits to start

Try before you commit - perfect for testing and evaluation

  • Linux Small sandbox only
  • Claude Haiku 4.5 only
  • Community Discord support
  • MIT License open source
Get Started
Most Popular

Pro

$10+

Flexible credits for cloud compute and Computer-Use VLMs, with managed cloud environments and inference.

  • Cloud Linux and Windows Environments
  • CUA LLM Inference Provider
  • Cuazar Playground UI
  • Priority support via Slack
100 credits per dollar

Enterprise

Custom monthly

Scalable cloud containers tailored for large teams and organizations.

  • Everything in Pro
  • 24/7 support
  • HIPAA, SOC Type 1/2 Reports
Book a Demo

What do credits buy?

Credits are our unified currency for both compute time and AI inference

How many credits do you need per month?

For example, 10k credits per month (approximately $100/month) corresponds to roughly 833 hours of compute time on a Medium Linux instance, or about 33.3M tokens of VLM inference on Claude Sonnet. The calculator covers monthly budgets from 100 to 100k credits.
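
As a rough sketch of the conversion (Python, with illustrative names), the Pro plan's rate of 100 credits per dollar maps a monthly credit budget directly to dollars; the tables below give the per-hour and per-token rates that determine how far those credits go.

```python
# Minimal sketch: convert a monthly credit budget to dollars.
# Assumes the Pro plan rate of 100 credits per dollar; the function name is illustrative.
CREDITS_PER_DOLLAR = 100

def monthly_cost_dollars(monthly_credits: int) -> float:
    """Approximate monthly spend in dollars for a given credit budget."""
    return monthly_credits / CREDITS_PER_DOLLAR

print(monthly_cost_dollars(10_000))  # 10k credits/month -> 100.0 dollars/month
```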

Sandbox Compute

Per hour for cloud environments

  • Linux Small: 5 credits/hour
  • Linux Medium: 9 credits/hour
  • Linux Large: 24 credits/hour
  • Windows Small: 8 credits/hour
  • Windows Medium: 15 credits/hour
  • Windows Large: 31 credits/hour

Billed per minute. Sessions include full desktop access with persistent storage.
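
Because billing is per minute, a session's credit cost is simply the hourly rate prorated by its duration. A minimal sketch of that proration (Python; the dictionary keys and function name are our own, rates are taken from the table above):

```python
# Per-minute proration for sandbox sessions (sketch; rates from the table above).
SANDBOX_CREDITS_PER_HOUR = {
    "linux-small": 5, "linux-medium": 9, "linux-large": 24,
    "windows-small": 8, "windows-medium": 15, "windows-large": 31,
}

def session_credits(instance: str, minutes: float) -> float:
    """Credits for a session billed per minute at the instance's hourly rate."""
    return SANDBOX_CREDITS_PER_HOUR[instance] / 60 * minutes

print(session_credits("linux-medium", 90))  # 90-minute session -> 13.5 credits
```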

VLM Inference

Per million tokens processed

  • Claude Haiku 4.5: ~435 credits / 1M tokens
  • Claude Sonnet 4.5: ~1305 credits / 1M tokens
  • Qwen 3 VL 235B: ~1566 credits / 1M tokens
  • Claude Opus 4.5: ~2175 credits / 1M tokens
  • ByteDance UI-TARS-2: ~2610 credits / 1M tokens

Input and output tokens combined. Actual costs vary by model and usage.
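
For inference, an estimate is the model's rate applied to combined input and output tokens. A sketch along the same lines (Python; model keys and function name are illustrative, rates are the approximate figures above):

```python
# Inference credit estimate (sketch; approximate per-1M-token rates from the table above).
VLM_CREDITS_PER_1M_TOKENS = {
    "claude-haiku-4.5": 435,
    "claude-sonnet-4.5": 1305,
    "qwen-3-vl-235b": 1566,
    "claude-opus-4.5": 2175,
    "bytedance-ui-tars-2": 2610,
}

def inference_credits(model: str, input_tokens: int, output_tokens: int) -> float:
    """Approximate credits for a request; input and output tokens are combined."""
    total_tokens = input_tokens + output_tokens
    return VLM_CREDITS_PER_1M_TOKENS[model] * total_tokens / 1_000_000

# e.g. 800k input + 200k output tokens on Claude Sonnet 4.5 -> ~1305 credits
print(inference_credits("claude-sonnet-4.5", 800_000, 200_000))
```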