# Harness

Run any coding agent from a single CLI. Harness spawns the selected agent as a subprocess, translates its native streaming output into a unified NDJSON event stream, and writes it to stdout. Write one integration, and it works with any supported agent backend.
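For example, a run might emit events like these on stdout, one JSON object per line (the field names shown here are illustrative, not the exact schema):

```jsonl
{"type":"SessionStart"}
{"type":"TextDelta","text":"Reading the project layout..."}
{"type":"ToolStart","name":"read_file"}
{"type":"ToolEnd","name":"read_file"}
{"type":"Result","status":"success"}
```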
## Supported agents

| Agent | Binary | Status |
|---|---|---|
| Claude Code | `claude` | Supported |
| OpenAI Codex | `codex` | Supported |
| OpenCode | `opencode` | Supported |
| Cursor | `agent` | Supported |
## Installation

Recommended: curl installer.

```sh
curl -fsSL https://harness.lol/install.sh | sh
```

Via crates.io:

```sh
cargo install harness-cli
```

Build from source:

```sh
git clone https://github.com/ayshptk/harness-cli.git
cd harness-cli
cargo build --release
# Binary at target/release/harness
```

## Quick start

```sh
# Run Claude Code with a prompt
harness run --agent claude --prompt "explain this codebase"

# Use a model alias
harness run --agent claude --model sonnet --prompt "fix the bug"

# Dry run: print the resolved command without executing it
harness run --agent claude --model sonnet --prompt "hello" --dry-run

# Pipe the prompt from stdin
echo "explain this codebase" | harness run --agent claude

# List available agents
harness list

# Check whether an agent is installed
harness check claude --capabilities
```

## Model aliases

Harness maps human-friendly model names to the exact IDs each agent expects:
```sh
# List all known models
harness models list

# Resolve an alias for a specific agent
harness models resolve opus --agent claude
# → claude-opus-4-6

# Update the registry cache
harness models update
```

Built-in aliases include `opus`. You can add your own in `harness.toml`, and the cached registry at `~/.harness/models.toml` is auto-updated from GitHub.
## Configuration

Place a `harness.toml` in your project root (or any parent directory):

```toml
default_agent = "claude"
default_model = "sonnet"
default_permissions = "full-access"
default_timeout_secs = 300

[agents.claude]
model = "opus"
extra_args = ["--verbose"]

[models.my-model]
description = "My custom model"
provider = "anthropic"
claude = "my-custom-model-id"
```

## Event stream

Every agent's output is translated into a common NDJSON format with 8 event types:
- `SessionStart` — session initialized
- `TextDelta` — streaming text chunk
- `Message` — complete message
- `ToolStart` — tool invocation beginning
- `ToolEnd` — tool invocation complete
- `UsageDelta` — incremental token usage and cost update
- `Result` — run finished
- `Error` — error occurred
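Because every backend is normalized to this format, an integration only needs to read stdout line by line and parse each line as JSON. A minimal Python sketch of such a consumer (the payload field names like `text` and `message` are assumptions for illustration; in practice you would read lines from the stdout of a `harness run ...` subprocess rather than a sample string):

```python
import json

# Sample NDJSON stream; in practice, read line by line from the
# stdout of a `harness run ...` subprocess. Field names are illustrative.
sample = """\
{"type": "SessionStart"}
{"type": "TextDelta", "text": "Hello, "}
{"type": "TextDelta", "text": "world."}
{"type": "UsageDelta", "input_tokens": 12, "output_tokens": 4}
{"type": "Result", "status": "success"}
"""

def handle_stream(lines):
    """Dispatch each NDJSON event by its type, accumulating streamed text."""
    text_parts = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        kind = event["type"]
        if kind == "TextDelta":
            text_parts.append(event["text"])
        elif kind == "Error":
            raise RuntimeError(event.get("message", "agent error"))
    return "".join(text_parts)

print(handle_stream(sample.splitlines()))
# → Hello, world.
```

The same loop works for any supported agent, which is the point of the unified stream: the dispatch on `type` is written once, regardless of which backend produced the events.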
Full docs at [harness.lol](https://harness.lol).

## License

MIT