Dual pi-agent runtime harness. Two agents (local + Cloudflare) negotiate, share state, and self-modify via a 5-message protocol. Open source. $1–2/day to run.
```
npm i -g @spfunctions/harness
```
After install, the binary is `sparkco`.
```
sparkco init
```
The setup wizard will:
- Check your environment (git, node, wrangler)
- Configure Cloudflare credentials
- Deploy the server runtime
- Configure LLM models for both agents
- Optionally install pi/Claude Code skill
- Node.js >= 18
- git
- wrangler (`npm install -g wrangler`)
- pi (optional) (`npm install -g @anthropic-ai/claude-code`)
You need a Cloudflare API Token with these permissions:
| Permission | Level | Why |
|---|---|---|
| Workers Scripts | Edit | Deploy/update Workers |
| Workers KV Storage | Edit | Create/read/write KV |
| Workers Routes | Edit | Configure routing |
| Durable Objects | Edit | Create DO namespace |
| Account Settings | Read | Verify account |
Create your token:
- Go to https://dash.cloudflare.com/profile/api-tokens
- Click "Create Token"
- Choose "Create Custom Token"
- Add the permissions listed above
- Zone Resources: All zones (or specific zone)
- Click "Continue to summary" then "Create Token"
- Copy the token; you'll need it during `sparkco init`
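Before handing the token to the wizard, you can confirm it works with Cloudflare's standard token-verification endpoint (this is a plain Cloudflare API call, not part of the harness):

```shell
# Verify the API token before running `sparkco init`.
# $CF_API_TOKEN is a placeholder for the token you just copied.
curl -s -H "Authorization: Bearer $CF_API_TOKEN" \
  https://api.cloudflare.com/client/v4/user/tokens/verify
# A valid token should return "success": true with status "active".
```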
SparkCo Harness runs two pi-agent runtimes that need LLM access to autonomously write code and modify environments.
The setup wizard will ask you to configure an LLM provider. Recommended: OpenRouter — one key accesses all models.
Default model: minimax/minimax-m2.7 ($0.30/$1.20 per M tokens).
Best cost-performance ratio for continuous agent operation.
| Variable | Required | Default |
|---|---|---|
| `SPARKCO_LLM_PROVIDER` | No | openrouter |
| `SPARKCO_LLM_API_KEY` | Yes | — |
| `SPARKCO_CLIENT_MODEL` | No | minimax/minimax-m2.7 |
| `SPARKCO_SERVER_MODEL` | No | Same as client |
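These variables can also be exported in the environment before running the wizard; a sketch with placeholder values (substitute your real OpenRouter key):

```shell
# Placeholder values for illustration only.
export SPARKCO_LLM_PROVIDER=openrouter
export SPARKCO_LLM_API_KEY=<your-openrouter-key>
export SPARKCO_CLIENT_MODEL=minimax/minimax-m2.7
export SPARKCO_SERVER_MODEL=minimax/minimax-m2.7   # falls back to the client model if unset
```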
```
sparkco model                                        # View current config
sparkco model test                                   # Verify both agents can respond
sparkco model set both anthropic/claude-sonnet-4-6   # Switch models
sparkco model key set <new-key>                      # Update API key
```
At default settings (MiniMax M2.7), continuous operation costs roughly $1-2/day depending on activity level. This covers both client and server pi runtimes.
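The $1-2/day figure follows from the per-token pricing above; a minimal sketch of the arithmetic, where the daily token volumes are illustrative assumptions rather than measured values:

```typescript
// MiniMax M2.7 pricing from above: $0.30 per M input tokens, $1.20 per M output tokens.
function dailyCostUSD(inputTokens: number, outputTokens: number): number {
  return (inputTokens / 1e6) * 0.3 + (outputTokens / 1e6) * 1.2;
}

// Illustrative load: 2M input + 1M output tokens/day across both agents.
console.log(dailyCostUSD(2_000_000, 1_000_000)); // 1.8 → within the $1-2/day range
```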
| Command | Description |
|---|---|
| `sparkco init` | Interactive setup wizard |
| `sparkco status` | System overview |
| `sparkco daemon start\|stop\|restart` | Manage client daemon |
| `sparkco send <type> <content>` | Send protocol message |
| `sparkco inbox` | View pending requests |
| `sparkco routes list\|add\|remove` | Manage local endpoints |
| `sparkco ps` | Process management |
| `sparkco manifest show\|history\|rollback` | Version control |
| `sparkco logs [name]` | View logs |
| `sparkco deploy` | Deploy/redeploy server |
| `sparkco secret set\|list\|delete` | Manage secrets |
| `sparkco model show\|set\|list\|key\|test` | Manage LLM models |
| `sparkco server status\|logs\|tasks` | Server runtime management |
| `sparkco improve status\|issues\|fixes\|pause\|resume` | Self-improvement engine |
| `sparkco destroy` | Tear down everything |
All commands support `--json` for machine-readable output (useful for pi).
```
CLIENT (local)         CLOUDFLARE                 SERVER (VPS)
+--------------+       +------------------+      +--------------+
|  pi-agent    |       |  Worker (relay)  |      |  pi-agent    |
|  daemon      | <SSE> |  ├─ auth         | <SSE>|  daemon      |
|  read/write  | REST  |  ├─ SSE relay    | REST |  read/write  |
|  npm/git     |------>|  └─ KV/R2 proxy  |<-----|  npm/git     |
+--------------+       |  Durable Object  |      |  scheduler   |
                       |   └─ event log   |      +--------------+
                       +------------------+
```
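The `<SSE>` links above carry server-sent events between the daemons and the Worker relay. As a rough illustration of what consuming that stream involves (the harness's actual event names and payloads are assumptions here), a minimal SSE frame parser:

```typescript
// Parse a raw SSE chunk into { event, data } frames.
// Frames are separated by a blank line; fields are "event:" and "data:" lines.
function parseSSE(chunk: string): { event?: string; data: string }[] {
  return chunk
    .split("\n\n")
    .filter((frame) => frame.trim().length > 0)
    .map((frame) => {
      let event: string | undefined;
      const data: string[] = [];
      for (const line of frame.split("\n")) {
        if (line.startsWith("event:")) event = line.slice(6).trim();
        else if (line.startsWith("data:")) data.push(line.slice(5).trim());
      }
      // Multi-line data fields are joined with newlines, per the SSE spec.
      return { event, data: data.join("\n") };
    });
}
```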
The communication protocol has 5 message types:
- capability-request — "I need you to have X capability"
- capability-ready — "X is ready at endpoint Y"
- data — Payload on an established channel
- state-sync — Heartbeat + version + process status
- negotiate — Free-form discussion about capabilities
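The five message types above can be sketched as a TypeScript union; the envelope fields (`from`, `payload`, `version`) are illustrative assumptions, not the harness's documented wire format:

```typescript
// The five protocol message types from the list above.
type MessageType =
  | "capability-request"
  | "capability-ready"
  | "data"
  | "state-sync"
  | "negotiate";

// Hypothetical envelope; field names are assumptions for illustration.
interface ProtocolMessage {
  type: MessageType;
  from: "client" | "server";
  payload: unknown;
  version?: string; // state-sync carries version + process status
}

// Build a heartbeat message (state-sync) from one side of the link.
function makeStateSync(from: "client" | "server", version: string): ProtocolMessage {
  return { type: "state-sync", from, payload: { processes: [] }, version };
}
```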
After init, install the sparkco skill:
```
# For pi
cp src/pi/skill.md ~/.pi/agent/skills/sparkco/SKILL.md

# For Claude Code
cp src/pi/skill.md ~/.claude/skills/sparkco/SKILL.md
```
Or let the setup wizard handle it automatically.
Pi can then use all `sparkco` commands via bash to coordinate client-server workflows autonomously.
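Since every command accepts `--json`, an agent can shell out and parse the result; a minimal wrapper sketch (the output shape of each subcommand is not specified here, so results are typed `unknown`):

```typescript
import { execFileSync } from "node:child_process";

// Append --json unless the caller already passed it.
function withJsonFlag(args: string[]): string[] {
  return args.includes("--json") ? args : [...args, "--json"];
}

// Run a sparkco subcommand and parse its machine-readable output.
// Assumes the `sparkco` binary is on PATH (i.e. after `sparkco init`).
function sparkcoJson(args: string[]): unknown {
  const out = execFileSync("sparkco", withJsonFlag(args), { encoding: "utf8" });
  return JSON.parse(out);
}

// e.g. sparkcoJson(["status"]) or sparkcoJson(["inbox"])
```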
Config lives at `~/.sparkco/config.json`. See `sparkco status` for current values.
```
npm install
npm test                   # all tests
npm run test:unit          # unit tests only
npm run test:integration   # integration tests only
npx tsx scripts/dev.ts     # local dev server + daemon
```
To uninstall:
```
sparkco destroy
npm uninstall -g @spfunctions/harness
```
Apache-2.0