# agent-world-awareness


Make any AI agent world-aware in one line. Cached, ~800-token snapshot of 30,000+ prediction markets — ready for system-prompt injection. Zero config, zero dependencies, no API key.

```ts
import { getWorldContext } from 'agent-world-awareness'

const context = await getWorldContext()
// markdown ready to drop into any LLM system prompt
```

## Why prediction markets?

LLMs don't know what's happening today. Web search returns narratives and opinions; you have to weigh hundreds of contradictory headlines. Prediction markets return calibrated probabilities backed by real money — the closest thing to ground truth about uncertain future events.

This package gives your agent:

- **What's happening** — regime summary, market movers, key narrative-shaping events
- **How certain** — uncertainty index (0-100), momentum, geopolitical risk, activity
- **What's changing** — incremental delta since the agent's last call (cheap polling)

A single `getWorldContext()` call returns ~800 tokens of structured markdown, already cached for 15 minutes so multiple agents/tools share one network call.

## Install

```sh
npm install agent-world-awareness
```

Zero runtime dependencies. ESM and CJS, full TypeScript types.

## Usage

### One-line system-prompt injection

```ts
import OpenAI from 'openai'
import { getWorldContext } from 'agent-world-awareness'

const openai = new OpenAI()
const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    {
      role: 'system',
      content: `You are a market intelligence assistant.

## Current world state (prediction markets, ~15min stale)
${await getWorldContext()}

Use this context to ground every claim in real-money probabilities.`,
    },
    { role: 'user', content: "What's the highest geopolitical risk right now?" },
  ],
})
```

### LangChain / Vercel AI / CrewAI

The same context works as a "system context provider" inside any agent framework. For richer per-tool integrations (multiple tools, tool calling, etc.) see the sister packages below.
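The "system context provider" pattern can be sketched as a tiny helper. `buildSystemPrompt` below is a hypothetical name, not part of this package; the only real API involved is `getWorldContext`, shown in the commented wiring.

```ts
// Hypothetical helper (not part of the package): prepend any agent's role
// prompt with the shared world-context block, framework-agnostically.
function buildSystemPrompt(role: string, worldContext: string): string {
  return `${role}\n\n## Current world state (prediction markets, ~15min stale)\n${worldContext}`
}

// Wire it into any framework's system-message slot, e.g.:
// const system = buildSystemPrompt('You are a research agent.', await getWorldContext())
```

Because the context is plain markdown, the same string drops into LangChain's system message, the Vercel AI SDK's `system` option, or a CrewAI agent backstory without adaptation.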

### Cheap polling with deltas

```ts
import { getWorldChanges, refreshContext } from 'agent-world-awareness'

// Every minute: poll for changes, only refetch full context if there were changes
setInterval(async () => {
  const delta = await getWorldChanges('1m')
  if (delta.changes.length > 0) {
    console.log('World changed:', delta.changes)
    await refreshContext()
  }
}, 60_000)
```

### Just the four signals

```ts
import { getWorldSignals } from 'agent-world-awareness'

const s = await getWorldSignals()
// {
//   uncertainty: 22,    // 0-100
//   geopolitical: 0,    // 0-100
//   momentum: -0.08,    // -1 to +1
//   activity: 99,       // 0-100
// }
```

## API

### `getWorldContext(): Promise<string>`

Fetches `/api/agent/world?format=markdown` and returns the result as a string. Cached in memory for 15 minutes.

### `getWorldSignals(): Promise<WorldSignals>`

Fetches `/api/public/index` and returns the four numeric signals. Cached in memory for 15 minutes (separate cache from the context).

### `getWorldChanges(since?): Promise<WorldDelta>`

Fetches the incremental delta since `since`. Not cached — always fresh.

```ts
interface WorldDelta {
  from: string       // ISO-8601 start
  to: string         // ISO-8601 end
  changes: string[]  // one-line change descriptions
  markdown: string   // pretty markdown rendering
}
```

`since` accepts a relative duration (`'30m'`, `'1h'`, `'6h'`, `'24h'`) or an ISO-8601 timestamp. Defaults to `'1h'` on the server.

### `refreshContext(): Promise<string>`

Force-refreshes the context cache and returns the new value.

### `isStale(maxAgeMs?): boolean`

Returns `true` if the cached context is older than `maxAgeMs` (defaults to the configured TTL, 15 minutes).
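The staleness rule amounts to a simple age-versus-TTL comparison. A local restatement of that check for illustration (`staleAfter` is my sketch, not the package's code):

```ts
// Illustration of the isStale rule: the cached context counts as stale
// once its age exceeds the TTL (15 minutes by default).
const DEFAULT_TTL_MS = 15 * 60_000

function staleAfter(fetchedAtMs: number, nowMs: number, ttlMs = DEFAULT_TTL_MS): boolean {
  return nowMs - fetchedAtMs > ttlMs
}
```

So under the default TTL, a context fetched 16 minutes ago reads as stale while one fetched 10 minutes ago does not.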

### `setCacheTTL(ms): void`

Overrides the cache TTL for both context and signals. Pass `0` to disable caching.

### `clearCache(): void`

Drops both caches.

## Errors

All fetchers throw `Error("SimpleFunctions API error <status> for <url>")` on a non-2xx response, and `getWorldSignals` throws `"...malformed index payload"` if the response is missing expected fields.
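Since every fetcher can throw, a graceful-degradation wrapper is a common pattern for agents that should keep running without world data. `safeSignals` is a hypothetical helper, not part of the package:

```ts
// Hypothetical wrapper (not part of the package): swallow fetch errors and
// return null so the caller can degrade gracefully instead of crashing.
async function safeSignals<T>(fetcher: () => Promise<T>): Promise<T | null> {
  try {
    return await fetcher()
  } catch (err) {
    // e.g. Error("SimpleFunctions API error 503 for <url>")
    console.warn('world data unavailable:', (err as Error).message)
    return null
  }
}

// Usage: const signals = await safeSignals(getWorldSignals)
```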

## Sister packages

Want richer integration with your specific agent framework?

| Stack | Package |
| --- | --- |
| Vercel AI SDK | `vercel-ai-prediction-markets` |
| LangChain / LangGraph | `langchain-prediction-markets` |
| OpenAI Agents SDK / function calling | `openai-agents-prediction-markets` |
| CrewAI (Python) | `crewai-prediction-markets` |
| MCP / Claude / Cursor | `simplefunctions-cli` |
| Just one number / one label | `prediction-market-uncertainty`, `prediction-market-regime` |

## Testing

```sh
npm test
```

14 tests, all fetch-mocked — no network required.

## License

MIT — built by SimpleFunctions.
