HOSTOPENCLAW
Deploy a real OpenClaw agent on your own infrastructure in 90 seconds. One command.
MIT licensed
One bootstrapper, six targets
Ship to wherever you already pay.
One command picks the right buildpack for your provider. No vendor lock-in. Move between them whenever you want.
Why self-host
Real infrastructure.
Not a managed black box.
Self-hosted OpenClaw is the same engine that powers RunLobster — minus the hosting bill, plus full source access.
Full Linux, no sandbox
Real Ubuntu container. apt, brew, git, any package, any language. Your agent has the same machine you would give a junior developer.
Real Chromium browser
Logs into anything you can. LinkedIn, internal tools, legacy CRMs. Watch it click through pages live from your dashboard.
Persistent memory
Built-in SQLite + filesystem so your agent remembers context between conversations. Doesn't forget who you are between restarts.
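As a purely illustrative sketch of what on-disk SQLite buys you — the table name and schema below are assumptions for the example, not OpenClaw's actual memory layout:

```shell
# Illustrative only: the "memory" table and its schema are assumptions,
# not OpenClaw's real store. The point: rows written to an on-disk
# SQLite file survive process restarts.
sqlite3 memory.db "CREATE TABLE IF NOT EXISTS memory (key TEXT PRIMARY KEY, value TEXT);"
sqlite3 memory.db "INSERT OR REPLACE INTO memory VALUES ('user_name', 'Ada');"
# ...the agent process can restart here; the file persists...
sqlite3 memory.db "SELECT value FROM memory WHERE key = 'user_name';"   # prints: Ada
```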
Built-in scheduler
Cron at the language level. Schedule jobs in plain English ("check Stripe at 7am every weekday") and they run forever.
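For intuition, that plain-English schedule corresponds to a standard cron expression. How OpenClaw represents it internally is not shown here; the mapping below is just ordinary cron semantics:

```shell
# "check stripe at 7am every weekday" in standard cron notation:
#   minute  hour  day-of-month  month  day-of-week (1-5 = Mon-Fri)
echo "0 7 * * 1-5"
```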
Bring your own model
ANTHROPIC_API_KEY, OPENAI_API_KEY, or your own LLM endpoint. Switch providers without changing code. No vendor markup.
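Switching providers really is just an environment change. A minimal sketch — the key values are placeholders, and "the runtime picks whichever key is set" is our reading of "switch providers without changing code":

```shell
# Placeholders, not real credentials. Set the key for the provider you
# want; swapping providers means swapping which variable is exported.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
# export OPENAI_API_KEY="sk-placeholder"   # ...or switch to OpenAI instead
echo "active: ${OPENAI_API_KEY:+openai}${ANTHROPIC_API_KEY:+anthropic}"
```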
Open source, MIT
Read every line, fork it, modify it, run it forever. No telemetry, no phone-home, no usage caps. Your infrastructure, your rules.
Architecture
Six layers.
Each one swappable.
Every layer is independent: swap your model without touching the runtime, and move hosting providers without rewriting your agent.
Your provider
Vercel, Railway, Fly.io, AWS, or your own VPS. The container layer.
- 1 vCPU / 1 GB RAM baseline
- Auto-scale on demand
- Bring-your-own region
Compute container
An isolated Ubuntu environment with Linux, browser, and persistent disk.
- Ubuntu 24.04 LTS
- Headless Chromium
- SQLite + filesystem
OpenClaw runtime
The agent loop, memory, scheduler, and tool router. MIT licensed.
- Plan → act → observe loop
- Persistent memory store
- Cron-based scheduler
Model provider
Bring your own key. Anthropic, OpenAI, or self-hosted Llama.
- Anthropic Claude
- OpenAI GPT
- Self-hosted Llama 4
Tools layer
Browser, file system, terminal, and 3,000+ integrations via Composio.
- Real Chromium
- Full Linux access
- Composio integrations
Your interface
Web dashboard, CLI, REST API, and webhooks. You pick the surface.
- Web dashboard at /
- CLI: openclaw chat
- REST API + webhooks
Quickstart
Three commands.
Pick your provider.
# 1. Deploy with one command (npx runs the bootstrapper, no install step)
npx hostopenclaw deploy --provider vercel
# 2. Add your model key
vercel env add ANTHROPIC_API_KEY
# 3. Talk to your agent
openclaw chat
Average time to first agent run: 90 seconds
Used by teams shipping in production
Wall of love
Engineers who shipped
on their own terms.
“I ran the npx command on a Saturday morning. By the time my coffee was done I had a working OpenClaw agent at a real URL. The whole thing felt illegal.”
“We needed to keep our agent infrastructure inside our own AWS for compliance reasons. HostOpenClaw was the only thing that let us self-host without giving up the polish of a managed service.”
“The architecture doc actually explains the trade-offs instead of hand-waving. I knew exactly what I was getting into before I deployed. Felt like reading a real engineering doc, not marketing.”
“Switched our model provider from Anthropic to OpenAI in 30 seconds. One env var change, no rebuild. This is what model-agnostic actually looks like.”
“Open source means I read the runtime before deploying it. Found one thing I wanted to change, opened a PR, it was merged the next day. That trust loop is why I picked OpenClaw over the closed alternatives.”
“We're running the same agent in three regions across Vercel, Fly, and our on-prem cluster. Same code, same behavior, three providers. This is the dream and it actually works.”
FAQ
The questions engineers ask.
Pricing
Pick your operator.
Both tiers run the same engine. The only question is who you want managing the infrastructure.
Run on your own infra. Pay your own provider. Own everything.
- MIT licensed source code
- Bring your own model key
- Bring your own provider
- All 3,000+ integrations
- Real browser, real Linux
- Built-in scheduler
- Community support
We host it. You use it. Free tier for starters, paid plans for higher limits.
- Everything in Self-hosted
- No infra to manage
- Polished web dashboard
- Auto-scale + auto-update
- Priority support
- Pre-built agent roles
- Single-tenant compute
One command away
Your agent. Your infra.
Live in 90 seconds.
Open a terminal. Paste the command. Get back a real URL. We'll be on your side the whole time.
MIT licensed · Zero telemetry · Bring your own model