Open source · Apache 2.0

AI agents that
actually run anywhere

18MB binary. Four LLM providers. Six chat adapters. Multi-agent orchestration. Zero dependencies. From your terminal to your phone.

Get Started View on GitHub
marsclaw
$ marsclaw "add error handling to main.go"

⚡ read_file main.go
✓ read_file
⚡ edit_file main.go
✓ edit_file

Added error wrapping with fmt.Errorf to all three return paths in main().

── gemini-2.5-flash │ 1.2K in / 523 out │ $0.003 session ──
18MB
Binary size
<50MB
RAM usage
<200ms
Startup time
0
CVEs

Everything you need

A complete agent runtime in a single binary

🚀

Multi-Agent Orchestration

Four patterns: Supervisor, Pipeline, Parallel, and Debate. Run complex workflows with specialized agents working together.

🔒

Security First

Per-tool danger levels with approval workflows. Credential scanning. Path-traversal guards. Secrets never leak.

🛠️

Built-in Tools

Read, write, edit files. Execute shell commands. Search code. Git integration. Your agent can do anything your CLI can.

🕑

Scheduled Tasks

Run agents overnight on cron schedules. PR reviews at 9 AM. Security scans at 2 AM. Daily briefings for your team.

🧠

Persistent Memory

Three-tier memory system — episodic, semantic, procedural. SQLite-backed. Your agent remembers across sessions.

🔌

MCP Client

Connect to any MCP tool server. Discover tools automatically. Extend your agent with any capability.

🌐

Web UI

Built-in web interface. Access from any browser, any device. Deploy once, use from your phone at night.

💻

Context Engineering

Smart history trimming, tool result truncation, token budget allocation. Follows Anthropic's production guidelines.

💰

Cost Tracking

Microdollar accounting with daily/monthly budgets. Know exactly what each agent costs. No surprise bills.

Access from anywhere

Six ways to talk to your agents

💻
CLI
🌐
Web UI
✈️
Telegram
💬
Discord
🗨️
Slack
📲
WhatsApp

Four LLM providers

Use the right model for each task

Anthropic

Claude Sonnet, Opus, Haiku

API key

Google Gemini

Gemini 2.5 Flash & Pro

GCP credits

OpenAI

GPT-4o, GPT-4o-mini

API key

Ollama

Llama, Mistral, any local model

Free & offline

How we compare

MarsClaw vs the alternatives

                    OpenClaw      PicoClaw   ZeroClaw   MarsClaw
Language            TypeScript    Go         Rust       Go
Binary              npm install   8MB        3.4MB      18MB
Memory              200MB+        <20MB      <15MB      <50MB
CVEs                512+          0          0          0
Multi-agent         —             —          —          4 patterns
Providers           1             1          1          4
Web UI              —             —          —          ✓
Chat adapters       —             —          —          6
MCP client          —             —          —          ✓
Scheduled tasks     —             —          —          ✓
Persistent memory   —             —          —          ✓
Cost tracking       —             —          —          ✓
Offline mode        —             —          —          ✓

Get started in seconds

One command to install. One command to run.

# Install
go install github.com/marsstein/marsclaw/cmd/marsclaw@latest

# Set your API key (pick one)
export GEMINI_API_KEY="..."
export ANTHROPIC_API_KEY="..."
export OPENAI_API_KEY="..."

# Run
marsclaw "explain this codebase"

# Or start the Web UI
marsclaw serve

# Or use offline with Ollama (free)
marsclaw -m llama3.1 "review this code"

Ready to run agents from anywhere?

Open source. Apache 2.0. Built by Marsstein.

Star on GitHub Contribute