Make any AI agent world-aware in one line. Cached, ~800-token snapshot of 30,000+ prediction markets — ready for system-prompt injection. Zero config, zero dependencies, no API key.
```ts
import { getWorldContext } from 'agent-world-awareness'

const context = await getWorldContext()
// markdown ready to drop into any LLM system prompt
```

LLMs don't know what's happening today. Web search returns narratives and opinions; you have to weigh hundreds of contradictory headlines. Prediction markets return calibrated probabilities backed by real money — the closest thing to ground truth about uncertain future events.
This package gives your agent:
- What's happening — regime summary, market movers, key narrative-shaping events
- How certain — uncertainty index (0-100), momentum, geopolitical risk, activity
- What's changing — incremental delta since the agent's last call (cheap polling)
A single `getWorldContext()` call returns ~800 tokens of structured markdown, already cached for 15 minutes so multiple agents/tools share one network call.
```sh
npm install agent-world-awareness
```

Zero runtime dependencies. ESM and CJS, full TypeScript types.
```ts
import OpenAI from 'openai'
import { getWorldContext } from 'agent-world-awareness'

const openai = new OpenAI()

const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'system',
      content: `You are a market intelligence assistant.

## Current world state (prediction markets, ~15min stale)
${await getWorldContext()}

Use this context to ground every claim in real-money probabilities.`,
    },
    { role: 'user', content: "What's the highest geopolitical risk right now?" },
  ],
})
```

The same context works as a "system context provider" inside any agent framework. For richer per-tool integrations (multiple tools, tool calling, etc.) see the sister packages below.
```ts
import { getWorldChanges, refreshContext } from 'agent-world-awareness'

// Every minute: poll for changes, only refetch full context if there were changes
setInterval(async () => {
  const delta = await getWorldChanges('1m')
  if (delta.changes.length > 0) {
    console.log('World changed:', delta.changes)
    await refreshContext()
  }
}, 60_000)
```

```ts
import { getWorldSignals } from 'agent-world-awareness'

const s = await getWorldSignals()
// {
//   uncertainty: 22,   // 0-100
//   geopolitical: 0,   // 0-100
//   momentum: -0.08,   // -1 to +1
//   activity: 99,      // 0-100
// }
```

`getWorldContext()`: fetch `/api/agent/world?format=markdown` and return the result as a string.
Cached in memory for 15 minutes.
`getWorldSignals()`: fetch `/api/public/index` and return the four numeric signals. Cached in memory for 15 minutes (separate cache from the context).
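One common use of the four signals is gating agent behaviour, for example adding a caution note when uncertainty runs hot. A sketch under assumed thresholds (the `WorldSignals` shape mirrors the example above; the policy and cutoffs are illustrative, not part of this package):

```ts
interface WorldSignals {
  uncertainty: number  // 0-100
  geopolitical: number // 0-100
  momentum: number     // -1 to +1
  activity: number     // 0-100
}

// Hypothetical policy: translate raw signals into a short advisory for the agent.
function adviseAgent(s: WorldSignals): string {
  const notes: string[] = []
  if (s.uncertainty > 60) notes.push('high uncertainty: hedge predictions')
  if (s.geopolitical > 50) notes.push('elevated geopolitical risk')
  if (Math.abs(s.momentum) > 0.5) notes.push('fast-moving regime: refresh context often')
  return notes.length ? notes.join('; ') : 'calm regime: defaults are fine'
}
```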
`getWorldChanges(since)`: fetch the incremental delta since `since`. Not cached — always fresh.
```ts
interface WorldDelta {
  from: string      // ISO-8601 start
  to: string        // ISO-8601 end
  changes: string[] // one-line change descriptions
  markdown: string  // pretty markdown rendering
}
```

`since` accepts a relative duration (`'30m'`, `'1h'`, `'6h'`, `'24h'`) or an ISO-8601 timestamp. Defaults to `'1h'` on the server.
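Conceptually, resolving a relative duration like `'30m'` to an absolute timestamp is simple clock arithmetic. A sketch of what that resolution might look like (the `resolveSince` helper is illustrative, not an export of this package):

```ts
// Hypothetical: turn '30m' / '1h' / '6h' / '24h' (or an ISO-8601 string) into a Date.
function resolveSince(since: string, now: Date = new Date()): Date {
  const m = /^(\d+)([mh])$/.exec(since)
  if (m) {
    // minutes or hours back from "now"
    const ms = Number(m[1]) * (m[2] === 'm' ? 60_000 : 3_600_000)
    return new Date(now.getTime() - ms)
  }
  return new Date(since) // otherwise assume an ISO-8601 timestamp
}
```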
`refreshContext()`: force-refresh the context cache. Returns the new value.
Returns `true` if the cached context is older than `maxAgeMs` (defaults to the configured TTL, 15 minutes).
Override the cache TTL for both context and signals. Pass `0` to disable caching.
Drop both caches.
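The cache behaviour described above boils down to a timestamped value with a TTL check. A minimal sketch of those semantics (this illustrates the idea only; it is not the package's actual implementation, and the class name is made up):

```ts
// Illustrative TTL cache: fetch once, serve from memory until the TTL elapses.
class TtlCache<T> {
  private value?: T
  private fetchedAt = 0

  constructor(private ttlMs: number, private fetcher: () => Promise<T>) {}

  // Stale when the cached value is older than maxAgeMs (default: the TTL).
  isStale(maxAgeMs = this.ttlMs): boolean {
    return Date.now() - this.fetchedAt > maxAgeMs
  }

  async get(): Promise<T> {
    if (this.value !== undefined && !this.isStale()) {
      return this.value // cache hit: no network call
    }
    const fresh = await this.fetcher()
    this.value = fresh
    this.fetchedAt = Date.now()
    return fresh
  }

  clear(): void {
    this.value = undefined
    this.fetchedAt = 0
  }
}
```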
All fetchers throw `Error("SimpleFunctions API error <status> for <url>")` on non-2xx responses, and `getWorldSignals` throws `"...malformed index payload"` if the response is missing expected fields.
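Since every fetcher can throw, long-running agents usually want graceful degradation rather than a crash. A sketch of that pattern (only the error message format comes from the docs above; the helpers are illustrative):

```ts
// Illustrative: the documented non-2xx error, reconstructed as a helper.
function apiError(status: number, url: string): Error {
  return new Error(`SimpleFunctions API error ${status} for ${url}`)
}

// Guarded usage pattern: fall back to an empty context instead of crashing the agent.
async function safeContext(fetchContext: () => Promise<string>): Promise<string> {
  try {
    return await fetchContext()
  } catch (err) {
    console.warn('world context unavailable:', err)
    return '' // degrade gracefully: the agent runs without world awareness
  }
}
```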
Want richer integration with your specific agent framework?
| Stack | Package |
|---|---|
| Vercel AI SDK | `vercel-ai-prediction-markets` |
| LangChain / LangGraph | `langchain-prediction-markets` |
| OpenAI Agents SDK / function calling | `openai-agents-prediction-markets` |
| CrewAI (Python) | `crewai-prediction-markets` |
| MCP / Claude / Cursor | `simplefunctions-cli` |
| Just one number / one label | `prediction-market-uncertainty`, `prediction-market-regime` |
```sh
npm test
```

14 tests, all fetch-mocked — no network required.
MIT — built by SimpleFunctions.