services: add Surf - pay-per-use research API for AI agents #406

Open
tenequm wants to merge 1 commit into tempoxyz:main from cascade-protocol:feat/surf-inference

Conversation


@tenequm tenequm commented Mar 21, 2026

Service: Surf

URL: https://surf.cascade.fyi
Integration: First-party (native MPP, not proxied)
Payment: Tempo USDC
Intent: Charge (data endpoints) + Session (inference streaming)

What it does

Unified pay-per-use API for AI agents. Twitter, Reddit, web data, and LLM inference - no signup, no API keys. 10 composite REST endpoints + MCP server with 9 tools.
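To illustrate the no-signup, no-API-key model, the sketch below builds (without sending) the same unauthenticated request used in the "Verify it's live" section further down; with no payment attached, the service answers HTTP 402. Only the URL, method, header, and `ref` body field come from this PR, and everything else is illustrative.

```python
import json
import urllib.request

# Build (but don't send) the unauthenticated request from the verify
# section. Without an attached payment, Surf responds with HTTP 402.
body = json.dumps({"ref": "cascade_fyi"}).encode()
req = urllib.request.Request(
    "https://surf.cascade.fyi/api/v1/twitter/user",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.get_method(), req.full_url)
# POST https://surf.cascade.fyi/api/v1/twitter/user
```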

Endpoints

| Route | Description | Pricing |
| --- | --- | --- |
| `POST /api/v1/twitter/search` | Search tweets with advanced operators | $0.005 |
| `POST /api/v1/twitter/tweet` | Fetch tweet with full thread context | $0.005 |
| `POST /api/v1/twitter/user` | Fetch user profile with recent tweets | $0.005 |
| `POST /api/v1/reddit/search` | Search Reddit posts | $0.005 |
| `POST /api/v1/reddit/post` | Fetch post with comments | $0.005 |
| `POST /api/v1/reddit/subreddit` | Subreddit info and top posts | $0.005 |
| `POST /api/v1/reddit/user` | User profile with recent activity | $0.005 |
| `POST /api/v1/web/search` | AI-powered web search | $0.01 |
| `POST /api/v1/web/crawl` | Crawl URL to markdown or HTML | $0.002 |
| `POST /api/v1/inference/completions` | Chat completions (15 models) | Dynamic |
| `GET /api/v1/inference/models` | List available models | Free |
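Since the data endpoints have flat per-call prices, batch costs compose by simple multiplication. A minimal budgeting sketch (prices copied from the table; the dictionary keys are shorthand labels, not literal API routes):

```python
# Per-call USDC prices from the endpoint table above.
# Keys are shorthand labels, not literal API routes.
PRICE_USD = {
    "twitter/search": 0.005,
    "twitter/tweet": 0.005,
    "twitter/user": 0.005,
    "reddit/search": 0.005,
    "reddit/post": 0.005,
    "reddit/subreddit": 0.005,
    "reddit/user": 0.005,
    "web/search": 0.01,
    "web/crawl": 0.002,
}

def estimate_cost(calls: dict[str, int]) -> float:
    """Estimated USDC cost for a batch of data-endpoint calls.

    Inference pricing is dynamic (per the table), so it is excluded.
    """
    return round(sum(PRICE_USD[ep] * n for ep, n in calls.items()), 6)

# e.g. 100 tweet searches + 20 web searches + 50 crawls
print(estimate_cost({"twitter/search": 100, "web/search": 20, "web/crawl": 50}))  # 0.8
```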

Why it's novel

  • Bundled data + compute: Four domains (Twitter, Reddit, Web, Inference) under one API, one wallet. No competitor bundles multiple data sources with inference under a single payment model.
  • Composite enriched data: Each call returns what would normally require multiple API requests - thread context, engagement metrics, topic extraction, auto-crawled article content.
  • MCP server: Unified MCP at /mcp with tool filtering via ?tools= query param for direct AI agent tool calls.
  • Dual-protocol: Same endpoints handle both x402 (Solana/Base USDC) and MPP (Tempo USDC).
  • First-party: MPP built into the service, not proxied through mpp.tempo.xyz.

Checklist

  • Service is live and accepting MPP payments
  • schemas/services.ts updated
  • pnpm check:types passes

Verify it's live

```shell
# 402 challenge on a data endpoint
curl -s https://surf.cascade.fyi/api/v1/twitter/user \
  -X POST -H "Content-Type: application/json" \
  -d '{"ref":"cascade_fyi"}' \
  -o /dev/null -w "%{http_code}"
# Returns: 402

# Free endpoint
curl -s https://surf.cascade.fyi/api/v1/inference/models | head -c 200

# OpenAPI spec
curl -s https://surf.cascade.fyi/openapi.json | head -c 200
```


vercel bot commented Mar 21, 2026

@tenequm is attempting to deploy a commit to the Tempo Team on Vercel.

A member of the Team first needs to authorize it.

Unified pay-per-use research and inference API for AI agents at
surf.cascade.fyi. 10 composite REST endpoints (Twitter, Reddit, Web,
Inference) + MCP server. First-party MPP, no signup or API keys.
@tenequm force-pushed the feat/surf-inference branch from 7bea038 to 7cb4038 on March 26, 2026 at 23:00
@tenequm changed the title from "services: add Surf Inference - OpenAI-compatible LLM inference with streaming sessions" to "services: add Surf - pay-per-use research API for AI agents" on Mar 26, 2026
