Project Stranger is an emergent social simulation engine running on Bun and Groq.
It creates a "petri dish" for synthetic personalities. Instead of scripting a conversation, the engine generates two random human personas with specific professions, places them in a virtual bar, and observes what happens when they interact without supervision.
The simulation runs in three distinct phases:
The engine queries the LLM with temperature: 1.0 to generate two unique personas based on specified domains (professions).
- Example A: "Marcus, a doctor"
- Example B: "Elena, a software engineer"
Personas are kept simple (name + profession only) so personality, quirks, and details emerge naturally through conversation.
A third agent (The Bartender/Moderator) makes a casual icebreaker comment as a real bartender would - without knowing anything about the customers except surface-level observations.
- Bartender: "Tough day, or just unwinding after work, huh?"
The two agents enter a conversational loop.
- They maintain their own private memory context.
- They are strictly instructed not to act like AI.
- Responses are kept very short (1-2 sentences max) for natural bar conversation.
- Topics drift naturally; agents don't narrate or describe what others say.
- Professional backgrounds influence the conversation organically without being explicitly stated.
- Runtime: Bun (Native TypeScript support, high performance)
- Inference: Groq (Fast cloud-based LLM API)
- Model: Defaults to
llama-3.3-70b-versatile(Configurable via environment)
Get your Groq API key from console.groq.com
Create a .env file (copy from .env.example):
GROQ_API_KEY=your_actual_groq_api_key
MODEL=llama-3.3-70b-versatile
AGENT_A_DOMAIN=doctor
AGENT_B_DOMAIN=software engineer
LOOP_ROUNDS=10bun installbun startor
bun devThe simulation will:
- Generate two personas with specified professions (or random if not set)
- Have the bartender make a casual icebreaker
- Run the configured number of conversation rounds (default: 10)
- Save a complete transcript to
transcripts/session_[timestamp].json
project-stranger/
├── src/
│ ├── types.ts # TypeScript interfaces and types (Role, TranscriptEntry, SimulationLog)
│ ├── agents.ts # Agent and Moderator classes with LLM interaction
│ └── index.ts # Main simulation runner and orchestration
├── transcripts/ # Generated conversation logs (auto-created)
├── .env # Environment variables (create from .env.example)
├── .env.example # Example environment configuration
├── package.json # Dependencies and scripts
├── tsconfig.json # TypeScript configuration
└── README.md # This file
Each simulation produces a JSON transcript with:
- Metadata: Start time, personas, introduction context
- Transcript: Complete conversation with timestamps and speaker roles
Example output location: transcripts/session_1738281234567.json
- Model: Set the
MODELenvironment variable (see Groq models) - Professions: Change
AGENT_A_DOMAINandAGENT_B_DOMAINin.env - Rounds: Adjust
LOOP_ROUNDSin.env(default: 10) - Response Length: Modify
max_tokensinsrc/agents.ts(currently: 50 for natural brevity) - Prompts: Modify the
PROMPTSobject insrc/agents.tsto change agent behavior