
Memory implementation#640

Draft
pratik-fractl wants to merge 39 commits into main from memory

Conversation

@pratik-fractl
Contributor

No description provided.

Signed-off-by: Pratik Karki <pratik@fractl.io>
Introduce a comprehensive memory system for Agentlang that enables
AI agents to maintain persistent knowledge graphs, extract facts from
conversations, and retrieve relevant context for improved responses.

Core Memory Features:
- Knowledge graph storage using Neo4j for entities, relationships, and
facts
- Automatic fact extraction from agent conversations using LLM
- Context building for relevant memory retrieval during agent
interactions
- Conversation processing with streaming support for large documents
- Entity deduplication with semantic similarity matching
- Document processing pipeline for ingesting external knowledge

Neo4j Integration:
- Graph database abstraction supporting both Node.js and browser
environments
- Configurable Neo4j connection (bolt://localhost:7687 by default)
- Automatic schema creation for nodes, relationships, and indexes
- Support for graph visualization with color schemes

Examples Added:
example/dracula/ - Full knowledge graph demo using Bram Stoker's
Dracula
   - Processes ~870KB novel text (~15,475 lines)
   - Demonstrates streaming chunk processing for memory efficiency
   - Shows entity extraction, relationship mapping, and contextual
    queries
   - Includes MEMORY.md with detailed documentation

diff --git a/example/custom_config/config.al b/example/custom_config/config.al
index a351bf2..d4b8d30 100644
--- a/example/custom_config/config.al
+++ b/example/custom_config/config.al
@@ -26,8 +26,8 @@
         "name": "llm01",
         "service": "openai",
         "config": {
-          "model": "gpt-4.1",
-          "maxTokens": 200,
+          "model": "gpt-5.2",
+          "maxTokens": 100000,
           "temperature": 0.7
         }
       }
@@ -37,7 +37,7 @@
         "name": "llm02",
         "service": "openai",
         "config": {
-          "model": "gpt-4.0"
+          "model": "gpt-4o"
         }
       }
     },
diff --git a/example/dracula/MEMORY.md b/example/dracula/MEMORY.md
new file mode 100644
index 0000000..b860bfe
--- /dev/null
+++ b/example/dracula/MEMORY.md
@@ -0,0 +1,986 @@
+# Dracula Knowledge Graph - Memory System Documentation
+
+## Overview
+
+This document describes how the Dracula example uses Agentlang's knowledge graph memory system to process Bram Stoker's novel and enable intelligent query responses.
+
+**Key Insight:** Unlike simple text search, the system extracts **entities** (characters, locations), **relationships** (TRAVELS_TO, IMPRISONS), and **embeddings** (vector representations), then stores them in both SQL (with vectors) and Neo4j (graph database) for hybrid retrieval.
+
+---
+
+## Architecture
+
+```
+┌─────────────────────────────────────────────────────────────────────────────┐
+│                         KNOWLEDGE GRAPH FLOW                                │
+├─────────────────────────────────────────────────────────────────────────────┤
+│                                                                             │
+│  Input: dracula.txt (~870KB, public domain)                                │
+│       ↓                                                                     │
+│  ┌────────────────────────────────────────────────────────────────────┐     │
+│  │  INGESTION PIPELINE                                                │     │
+│  │  ┌──────────────┐   ┌──────────────┐   ┌──────────────┐         │     │
+│  │  │ Text Chunk   │ → │ Entity       │ → │ Generate     │         │     │
+│  │  │ (1000+200)   │   │ Extraction   │   │ Embeddings   │         │     │
+│  │  └──────────────┘   │ (LLM)        │   │ (1536-dim)   │         │     │
+│  │                     └──────┬───────┘   └──────┬───────┘         │     │
+│  │                            │                  │                  │     │
+│  │                     ┌──────┴──────────┬───────┴──────────┐         │     │
+│  │                     ↓                 ↓                  ↓         │     │
+│  │              ┌────────────┐   ┌────────────┐   ┌────────────┐    │     │
+│  │              │ Deduplicate│   │ Create     │   │ Create     │    │     │
+│  │              │ (exact +   │   │ Nodes      │   │ Edges      │    │     │
+│  │              │ semantic)  │   │            │   │            │    │     │
+│  │              └────────────┘   └────────────┘   └────────────┘    │     │
+│  └────────────────────────────────────────────────────────────────────┘     │
+│       ↓                                                                     │
+│  ┌────────────────────────────────────────────────────────────────────┐     │
+│  │  HYBRID STORAGE                                                    │     │
+│  │                                                                    │     │
+│  │   ┌──────────────────┐            ┌──────────────────┐             │     │
+│  │   │  SQLite          │            │  Neo4j           │             │     │
+│  │   │  (dracula.db)    │            │  (Graph DB)      │             │     │
+│  │   ├──────────────────┤            ├──────────────────┤             │     │
+│  │   │ • KnowledgeNode  │            │ • Nodes          │             │     │
+│  │   │ • KnowledgeEdge  │◄──────────►│ • Edges          │             │     │
+│  │   │ • Vectors        │   Sync     │ • Traversal      │             │     │
+│  │   └──────────────────┘            └──────────────────┘             │     │
+│  │                                                                    │     │
+│  └────────────────────────────────────────────────────────────────────┘     │
+│       ↓                                                                     │
+│  ┌────────────────────────────────────────────────────────────────────┐     │
+│  │  QUERY RETRIEVAL                                                  │     │
+│  │  ┌──────────────┐   ┌──────────────┐   ┌──────────────┐         │     │
+│  │  │ Extract      │ → │ Vector       │ → │ Graph        │         │     │
+│  │  │ Entities     │   │ Search       │   │ Expansion    │         │     │
+│  │  │ from Query   │   │ (Semantic)   │   │ (BFS 2-hop)  │         │     │
+│  │  └──────────────┘   └──────┬───────┘   └──────┬───────┘         │     │
+│  │                            │                  │                  │     │
+│  │                     ┌──────┴──────────────────┴──────────┐         │     │
+│  │                     ↓                                      ↓         │     │
+│  │              ┌────────────────────────────────────────────────┐    │     │
+│  │              │ Build Structured Context                       │    │     │
+│  │              │ - Entities by type                             │    │     │
+│  │              │ - Relationship chains                          │    │     │
+│  │              │ - Instance data                                │    │     │
+│  │              └────────────────────────────────────────────────┘    │     │
+│  └────────────────────────────────────────────────────────────────────┘     │
+└─────────────────────────────────────────────────────────────────────────────┘
+```
+
+---
+
+## Startup Flow
+
+When you run `agentlang run example/dracula`:
+
+### Phase 1: CLI Initialization
+
+```typescript
+// src/cli/main.ts
+export const runModule = async (fileName: string) => {
+  // 1. Initialize SQLite database
+  await initDatabase(config?.store); // dracula.db
+
+  // 2. Load module (src/core.al)
+  await load(fileName, undefined, async appSpec => {
+    await runPostInitTasks(appSpec, config); // ← Triggers document processing
+  });
+};
+```
+
+**Storage after load:**
+
+```sql
+-- Table: agentlang.ai/Document
+-- id | title   | url
+-- ----|---------|-------------------------------
+-- 1  | dracula | ./example/dracula/docs/dracula.txt
+
+-- Table: Dracula/dracula (Agent)
+-- name    | documents | llm
+-- --------|-----------|-------
+-- dracula | dracula   | llm01
+```
+
+### Phase 2: Document Pre-Processing (BLOCKING)
+
+```typescript
+// src/cli/main.ts
+export async function runPostInitTasks(appSpec, config) {
+  // Initialize Knowledge Service
+  const knowledgeService = getKnowledgeService();
+  await knowledgeService.init(); // Connects to Neo4j
+
+  // PRE-PROCESS DOCUMENTS (blocks startup)
+  await preProcessAgentDocuments(knowledgeService);
+}
+```
+
+```typescript
+// src/cli/main.ts
+async function preProcessAgentDocuments(knowledgeService) {
+  // Find all agents with documents
+  const agents = await parseAndEvaluateStatement(`{agentlang.ai/Agent? {}}`);
+
+  for (const agent of agents) {
+    const docTitles = agent.documents.split(','); // ["dracula"]
+
+    // Create session for processing
+    const session = await knowledgeService.getOrCreateSession(
+      agent.name, // "dracula"
+      'system', // System user
+      `${agent.moduleName}/${agent.name}` // "Dracula/dracula"
+    );
+
+    // Process documents (THIS IS THE KEY)
+    await knowledgeService.maybeProcessAgentDocuments(
+      session,
+      docTitles, // ["dracula"]
+      undefined,
+      agent.llm // "gpt-4o"
+    );
+  }
+}
+```
+
+---
+
+## Knowledge Ingestion
+
+### Step 1: Document Loading
+
+```typescript
+// src/runtime/knowledge/service.ts
+async maybeProcessAgentDocuments(session, documentTitles, env, llmName) {
+  // Check if already processed
+  const existing = await parseAndEvaluateStatement(`
+    {agentlang.knowledge/KnowledgeNode {
+      agentId? "${session.agentId}",
+      sourceType? "DOCUMENT"
+    }} @limit 1
+  `);
+
+  if (existing.length > 0) {
+    logger.info('Documents already processed, skipping');
+    return;
+  }
+
+  // Fetch documents
+  const allDocs = await parseAndEvaluateStatement(
+    `{agentlang.ai/Document? {}}`
+  );
+
+  // Load content from file
+  const matchingDocs = [];
+  for (const doc of allDocs) {
+    if (documentTitles.includes(doc.lookup('title'))) {
+      const content = await loadDocumentContent(doc.lookup('url'));
+      // content = "Count Dracula is the protagonist..."
+      matchingDocs.push({name: doc.lookup('title'), content});
+    }
+  }
+
+  // Process
+  await this.processDocuments(matchingDocs, containerTag, userId, agentId, env, llmName);
+
+  // Sync to Neo4j
+  await this.syncToNeo4j(containerTag);
+}
+```
+
+### Step 2: Text Chunking
+
+```typescript
+// src/runtime/knowledge/document-processor.ts
+async processDocument(document, containerTag, userId, agentId, env, llmName) {
+  const chunker = new TextChunker(1000, 200);  // chunkSize=1000, overlap=200
+
+  // Generate chunks
+  const chunks = [];
+  for (const chunk of chunker.streamChunks(document.content)) {
+    chunks.push(chunk);
+  }
+  // ~870 chunks for Dracula
+
+  // Group into mega-batches (up to 50K chars)
+  const megaBatches = groupIntoMegaBatches(chunks);
+  // ~18 mega-batches for Dracula
+}
+```
+
+**TextChunker logic** (from `src/runtime/embeddings/chunker.js`):
+
+- Splits text at word boundaries
+- Maintains 200-character overlap between chunks
+- Preserves context across chunk boundaries
+
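The overlap behavior described above can be sketched in a few lines. This is an illustrative, standalone version, not the project's actual `TextChunker` implementation:

```typescript
// Illustrative chunker: fixed-size windows, trimmed to a word boundary,
// with `overlap` characters of shared context between adjacent chunks.
function* streamChunks(text: string, chunkSize = 1000, overlap = 200): Generator<string> {
  let start = 0;
  while (start < text.length) {
    let end = Math.min(start + chunkSize, text.length);
    if (end < text.length) {
      const lastSpace = text.lastIndexOf(' ', end);
      if (lastSpace > start) end = lastSpace; // don't split a word
    }
    yield text.slice(start, end);
    if (end === text.length) break;
    start = Math.max(end - overlap, start + 1); // step back for overlap
  }
}

const chunks = [...streamChunks('a '.repeat(2000), 1000, 200)];
// Adjacent chunks share their boundary region:
// chunks[0].slice(-200) === chunks[1].slice(0, 200)
```

The overlap means an entity mentioned right at a chunk boundary still appears whole in at least one chunk.
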
+### Step 3: Entity Extraction via LLM
+
+```typescript
+// src/runtime/knowledge/document-processor.ts
+const entityResults = await runParallel(
+  megaBatches,
+  async (batch, i) => {
+    const entities = await this.extractor.extractEntitiesFromBatch(
+      batch,
+      env,
+      llmName, // "gpt-4o"
+      MAX_ENTITIES_PER_BATCH // 50
+    );
+    return entities;
+  },
+  LLM_CONCURRENCY // 5 concurrent
+);
+```
+
+**LLM Prompt** (from `src/runtime/knowledge/prompts.ts`):
+
+```typescript
+export const MEGABATCH_ENTITY_PROMPT = `You are a knowledge graph entity extraction system...
+
+CRITICAL RULES:
+- Focus on CENTRAL, RECURRING entities
+- Each entity must be a proper name
+- Provide a salience score from 1-5 (5 = most central)
+- Estimate how many times each entity is mentioned
+
+Return JSON:
+{
+  "entities": [
+    {
+      "name": "Entity name",
+      "type": "Person|Organization|Location|Product|Concept|Event|Role",
+      "description": "Brief description",
+      "salience": 5,
+      "mentions": 10
+    }
+  ]
+}`;
+```
+
+**LLM Response for Dracula batch:**
+
+```json
+{
+  "entities": [
+    {
+      "name": "Count Dracula",
+      "type": "Person",
+      "description": "The protagonist, an ancient vampire",
+      "salience": 5,
+      "mentions": 156
+    },
+    {
+      "name": "Jonathan Harker",
+      "type": "Person",
+      "description": "Young English lawyer",
+      "salience": 5,
+      "mentions": 89
+    },
+    {
+      "name": "Transylvania",
+      "type": "Location",
+      "description": "Region in Romania where Dracula lives",
+      "salience": 4,
+      "mentions": 45
+    }
+  ]
+}
+```
+
+### Step 4: Type Normalization
+
+```typescript
+// src/runtime/knowledge/extractor.ts
+function normalizeEntityType(type: string): string {
+  const normalized = type.trim().toLowerCase();
+  switch (normalized) {
+    case 'person':
+    case 'character':
+    case 'animal':
+    case 'creature':
+      return 'Person';
+    case 'location':
+    case 'place':
+    case 'setting':
+      return 'Location';
+    case 'product':
+    case 'artifact':
+    case 'object':
+      return 'Product';
+    // ... other mappings
+    default:
+      return 'Concept';
+  }
+}
+```
+
+### Step 5: Select Core Entities
+
+```typescript
+// src/runtime/knowledge/document-processor.ts
+const coreEntities = selectCoreEntities(
+  candidateEntities,
+  CORE_ENTITY_LIMIT, // 60 max entities
+  CORE_ENTITY_PER_TYPE, // 20 per type
+  MIN_ENTITY_MENTIONS, // 2 minimum
+  MIN_ENTITY_SALIENCE, // 3.0 minimum
+  MAX_DOCUMENT_NODES // 1000
+);
+
+// Filters by:
+// - Score: mentions * 2 + salience
+// - Per-type limit: max 20 per type
+// - Minimum thresholds
+```
+
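The scoring and per-type capping described in the comments can be sketched as follows. This is a hypothetical standalone version; the real `selectCoreEntities` in `document-processor.ts` may differ in detail:

```typescript
interface Candidate { name: string; type: string; salience: number; mentions: number; }

// Keep candidates above the mention/salience floors, rank by
// mentions * 2 + salience, cap each type, then apply the global limit.
function selectCore(cands: Candidate[], limit = 60, perType = 20,
                    minMentions = 2, minSalience = 3.0): Candidate[] {
  const perTypeCount = new Map<string, number>();
  return cands
    .filter(c => c.mentions >= minMentions && c.salience >= minSalience)
    .sort((a, b) => (b.mentions * 2 + b.salience) - (a.mentions * 2 + a.salience))
    .filter(c => {
      const n = perTypeCount.get(c.type) ?? 0;
      if (n >= perType) return false;
      perTypeCount.set(c.type, n + 1);
      return true;
    })
    .slice(0, limit);
}

const picked = selectCore([
  { name: 'Count Dracula', type: 'Person', salience: 5, mentions: 156 },
  { name: 'Minor Maid', type: 'Person', salience: 1, mentions: 1 },
  { name: 'Transylvania', type: 'Location', salience: 4, mentions: 45 },
]);
// Minor Maid is filtered out; Count Dracula ranks first.
```
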
+### Step 6: Create Nodes with Embeddings
+
+```typescript
+// src/runtime/knowledge/deduplicator.ts
+async findOrCreateNodesBatch(entities, containerTag, userId, sourceType, sourceId, sourceChunks, agentId) {
+  const results = [];
+  const needsEmbedding = [];
+
+  // Step 1: Check for duplicates
+  for (const entity of entities) {
+    const existing = await this.findNodeByExactName(entity.name, containerTag);
+    if (existing) {
+      results.push(await this.mergeNode(existing, entity));
+      continue;
+    }
+
+    const similar = await this.findSimilarNode(entity, containerTag);
+    if (similar) {
+      results.push(await this.mergeNode(similar, entity));
+      continue;
+    }
+
+    needsEmbedding.push({entity, index: results.length});
+  }
+
+  // Step 2: Batch embed ALL new entities (ONE API call)
+  if (needsEmbedding.length > 0) {
+    const embeddingService = new EmbeddingService(embeddingConfig);
+
+    const textsToEmbed = needsEmbedding.map(item =>
+      `${item.entity.name} ${item.entity.type} ${item.entity.description || ''}`.trim()
+    );
+    // ["Count Dracula Person ancient vampire", "Jonathan Harker Person lawyer", ...]
+
+    const embeddings = await embeddingService.embedTexts(textsToEmbed);
+    // Returns: [[0.023, -0.156, 0.089, ...], ...] (1536 dimensions each)
+
+    // Step 3: Create nodes with embeddings
+    for (let i = 0; i < needsEmbedding.length; i++) {
+      const {entity, index} = needsEmbedding[i];
+      const node = await this.createNewNode(
+        entity,
+        containerTag,
+        userId,
+        sourceType,
+        sourceId,
+        sourceChunks,
+        agentId,
+        embeddings[i]  // ← EMBEDDING STORED
+      );
+      results[index] = node;
+    }
+  }
+
+  return results;
+}
+```
+
+**Storage:**
+
+```sql
+-- Table: agentlang.knowledge/KnowledgeNode
+-- id         | name            | type     | description
+-- -----------|-----------------|----------|--------------------------
+-- uuid-abc   | Count Dracula   | Person   | Ancient vampire
+-- uuid-def   | Jonathan Harker | Person   | Young lawyer
+-- uuid-ghi   | Transylvania    | Location | Region in Romania
+-- uuid-jkl   | Castle Dracula  | Location | Ancient fortress
+-- uuid-mno   | The Demeter     | Product  | Russian sailing ship
+
+-- Table: agentlang.knowledge/KnowledgeNode_vector (pgvector/sqlite-vec)
+-- id       | embedding (1536-dim vector)
+-- ---------|----------------------------------
+-- uuid-abc | [0.023, -0.156, 0.089, 0.234, ...]
+-- uuid-def | [-0.045, 0.234, 0.123, -0.078, ...]
+-- uuid-ghi | [0.156, -0.089, 0.045, 0.312, ...]
+```
+
+### Step 7: Relationship Extraction
+
+```typescript
+// src/runtime/knowledge/document-processor.ts
+const relResults = await runParallel(
+  megaBatches,
+  async (batch, i) => {
+    const relationships = await this.extractor.extractRelationshipsFromBatch(
+      batch,
+      coreEntityNames, // ["Count Dracula", "Jonathan Harker", ...]
+      env,
+      llmName,
+      MAX_RELATIONSHIPS_PER_BATCH // 100
+    );
+    return relationships;
+  },
+  LLM_CONCURRENCY
+);
+```
+
+**LLM Prompt** (from `src/runtime/knowledge/prompts.ts`):
+
+```typescript
+export const MEGABATCH_RELATIONSHIP_PROMPT = `You are a knowledge graph relationship extraction system.
+
+CRITICAL RULES:
+- ONLY use entity names from the provided list
+- Extract only relationships explicitly stated or strongly implied
+- Use descriptive relationship types in UPPER_SNAKE_CASE
+
+Return JSON:
+{
+  "relationships": [
+    {
+      "source": "Entity name",
+      "target": "Entity name",
+      "type": "RELATIONSHIP_TYPE"
+    }
+  ]
+}
+
+Common relationship types: KNOWS, MEETS, FOLLOWS, WORKS_FOR, LOCATED_IN, HAS_ROLE, BELONGS_TO, TRAVELS_TO`;
+```
+
+**LLM Response:**
+
+```json
+{
+  "relationships": [
+    { "source": "Count Dracula", "target": "Jonathan Harker", "type": "IMPRISONS" },
+    { "source": "Count Dracula", "target": "Transylvania", "type": "RESIDES_AT" },
+    { "source": "Count Dracula", "target": "The Demeter", "type": "TRAVELS_ON" },
+    { "source": "The Demeter", "target": "England", "type": "ARRIVES_AT" },
+    { "source": "Jonathan Harker", "target": "Mina Harker", "type": "MARRIED_TO" }
+  ]
+}
+```
+
+### Step 8: Create Edges
+
+```typescript
+// src/runtime/knowledge/document-processor.ts
+for (const rel of candidateRelationships.values()) {
+  const edge = await this.createEdgeFromCandidate(
+    rel,
+    coreNodeMap,
+    containerTag,
+    userId,
+    agentId
+  );
+}
+```
+
+**SQL storage:**
+
+```sql
+-- Table: agentlang.knowledge/KnowledgeEdge
+-- id         | sourceId | targetId | relType     | weight
+-- -----------|----------|----------|-------------|--------
+-- uuid-edge1 | uuid-abc | uuid-def | IMPRISONS   | 3.0
+-- uuid-edge2 | uuid-abc | uuid-ghi | RESIDES_AT  | 5.0
+-- uuid-edge3 | uuid-abc | uuid-mno | TRAVELS_ON  | 2.0
+-- uuid-edge4 | uuid-mno | uuid-pqr | ARRIVES_AT  | 4.0
+-- uuid-edge5 | uuid-def | uuid-stu | MARRIED_TO  | 5.0
+```
+### Step 9: Sync to Neo4j
+
+```typescript
+// src/runtime/knowledge/service.ts
+async syncToNeo4j(containerTag: string): Promise<void> {
+  // 1. Fetch nodes from SQL
+  const nodes = await parseAndEvaluateStatement(`
+    {agentlang.knowledge/KnowledgeNode {
+      containerTag? "${containerTag}",
+      isLatest? true
+    }}
+  `);
+
+  // 2. Upsert to Neo4j
+  for (const inst of nodes) {
+    const node = {
+      id: inst.lookup('id'),
+      name: inst.lookup('name'),
+      type: inst.lookup('type'),
+      description: inst.lookup('description')
+    };
+    await this.graphDb.upsertNode(node);
+    // Cypher: MERGE (n:KnowledgeNode {id: $id}) SET n.name = $name, ...
+  }
+
+  // 3. Sync edges
+  const edges = await parseAndEvaluateStatement(`
+    {agentlang.knowledge/KnowledgeEdge {
+      containerTag? "${containerTag}"
+    }}
+  `);
+
+  for (const inst of edges) {
+    const edge = {
+      sourceId: inst.lookup('sourceId'),
+      targetId: inst.lookup('targetId'),
+      relationship: inst.lookup('relType'),
+      weight: inst.lookup('weight')
+    };
+    await this.graphDb.upsertEdge(edge);
+    // Cypher: MERGE (a)-[r:RELTYPE]->(b) SET r.weight = $weight
+  }
+}
+```
+
+**Neo4j after sync:**
+
+```cypher
+// Nodes
+(:KnowledgeNode {name: "Count Dracula", type: "Person"})
+(:KnowledgeNode {name: "Jonathan Harker", type: "Person"})
+(:KnowledgeNode {name: "Transylvania", type: "Location"})
+(:KnowledgeNode {name: "The Demeter", type: "Product"})
+(:KnowledgeNode {name: "England", type: "Location"})
+
+// Relationships
+(:KnowledgeNode {name: "Count Dracula"})-[:IMPRISONS]->(:KnowledgeNode {name: "Jonathan Harker"})
+(:KnowledgeNode {name: "Count Dracula"})-[:RESIDES_AT]->(:KnowledgeNode {name: "Transylvania"})
+(:KnowledgeNode {name: "Count Dracula"})-[:TRAVELS_ON]->(:KnowledgeNode {name: "The Demeter"})
+(:KnowledgeNode {name: "The Demeter"})-[:ARRIVES_AT]->(:KnowledgeNode {name: "England"})
+```
+
+---
+
+## Knowledge Retrieval
+
+When user asks: **"How does Dracula travel from Transylvania to England?"**
+
+### Phase 1: Query Processing
+
+```typescript
+// src/runtime/modules/ai.ts
+async handleMessage(message, env) {
+  // 1. Get knowledge session
+  const knowledgeService = getKnowledgeService();
+  const knowledgeSession = await knowledgeService.getOrCreateSession(
+    this.name,      // "dracula"
+    userId,         // user's ID
+    agentFqName     // "Dracula/dracula"
+  );
+
+  // 2. Build context from knowledge graph
+  const knowledgeContext = await knowledgeService.buildContext(
+    message,                    // "How does Dracula travel..."
+    knowledgeSession.containerTag,  // "Dracula/dracula:user:xyz"
+    knowledgeSession.userId,
+    knowledgeSession.agentId
+  );
+
+  // 3. Add context to LLM prompt
+  const finalMsg = knowledgeContext.contextString
+    ? `${message}\n\n${knowledgeContext.contextString}`
+    : message;
+
+  // 4. Send to LLM
+  const response = await llm.complete(finalMsg);
+}
+```
+
+### Phase 2: Context Building
+
+```typescript
+// src/runtime/knowledge/context-builder.ts
+async buildContext(query, containerTag, userId, extraContainerTags) {
+  // Step 1: Extract entity candidates from query
+  const candidates = extractEntityCandidates(query);
+  // Returns: ["Dracula", "Transylvania", "England"]
+
+  // Step 2: Find exact matches
+  let seedNodes = [];
+  for (const candidate of candidates) {
+    const matches = await this.findExactMatches(candidate, containerTag);
+    seedNodes = mergeNodes(seedNodes, matches, MAX_SEED_NODES);
+  }
+  // seedNodes = [Dracula, Transylvania, England]
+
+  // Step 3: Vector search for semantic matches (if needed)
+  if (seedNodes.length < MAX_SEED_NODES) {
+    const vectorMatches = await this.vectorSearchNodes(query, containerTag);
+    seedNodes = mergeNodes(seedNodes, vectorMatches, MAX_SEED_NODES);
+  }
+
+  // Step 4: Graph expansion (BFS)
+  let expandedNodes = seedNodes;
+  let expandedEdges = [];
+
+  if (this.graphDb.isConnected()) {
+    const seedIds = seedNodes.map(n => n.id);
+    const expanded = await this.graphDb.expandGraph(
+      seedIds,
+      MAX_GRAPH_DEPTH,  // 2 hops
+      containerTag
+    );
+    expandedNodes = expanded.nodes;
+    expandedEdges = expanded.edges;
+  }
+
+  // Step 5: Fetch instance data (if any)
+  const instanceData = await this.fetchInstanceData(expandedNodes);
+
+  // Step 6: Format context
+  const contextString = this.formatContext(
+    expandedNodes,
+    expandedEdges,
+    instanceData
+  );
+
+  return {
+    entities: expandedNodes,
+    relationships: expandedEdges,
+    instanceData,
+    contextString
+  };
+}
+```
+
+### Phase 3: Vector Search
+
+```typescript
+// src/runtime/knowledge/context-builder.ts
+private async vectorSearchNodes(query: string, containerTag: string): Promise<GraphNode[]> {
+  // Passing the full query string as name? triggers a vector search in
+  // the sqldb resolver (the text is embedded and matched by similarity)
+  const result = await parseAndEvaluateStatement(`
+    {agentlang.knowledge/KnowledgeNode {
+      containerTag? "${containerTag}",
+      name? "${query}"
+    }, @limit ${MAX_SEED_NODES}}
+  `);
+
+  return result.map(instanceToGraphNode);
+}
+```
+
+**Behind the scenes** (from `src/runtime/resolvers/sqldb/impl.ts`):
+
+```typescript
+// Embed query
+const embeddingService = new EmbeddingService(embeddingConfig);
+const queryVec = await embeddingService.embedQuery(
+  'How does Dracula travel from Transylvania to England?'
+);
+// Returns: [0.034, -0.128, 0.067, ...]
+
+// Search in vector store
+const rslt = await vectorStoreSearch(tableName, queryVec, 10, ctx);
+// SQL: SELECT id FROM KnowledgeNode_vector
+//      ORDER BY embedding <-> '[0.034, -0.128, ...]'  -- cosine distance
+//      LIMIT 10
+```
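
The distance ranking above is based on cosine similarity between the query vector and each stored embedding, which is just a normalized dot product:

```typescript
// Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal.
// Vector stores typically rank by a distance derived from this value.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

cosineSimilarity([1, 0], [1, 0]); // → 1
cosineSimilarity([1, 0], [0, 1]); // → 0
```
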
+
+### Phase 4: Graph Expansion
+
+```typescript
+// src/runtime/graph/neo4j.ts
+async expandGraph(seedIds: string[], maxDepth: number, containerTag: string) {
+  const session = this.driver.session();
+
+  try {
+    const result = await session.run(`
+      MATCH (seed:KnowledgeNode)
+      WHERE seed.id IN $seedIds AND seed.containerTag = $containerTag
+
+      // Use APOC for subgraph expansion
+      CALL apoc.path.subgraphAll(seed, {
+        maxLevel: $maxDepth,     // 2 hops
+        labelFilter: 'KnowledgeNode'
+      })
+      YIELD nodes, relationships
+
+      // Flatten the subgraph into rows; the empty-list guard keeps
+      // isolated seed nodes from being dropped by UNWIND
+      UNWIND nodes AS n
+      UNWIND (CASE WHEN size(relationships) = 0
+              THEN [null] ELSE relationships END) AS r
+      RETURN DISTINCT n,
+             startNode(r).id AS srcId,
+             endNode(r).id AS tgtId,
+             type(r) AS relType,
+             r.weight AS weight
+    `, { seedIds, maxDepth: 2, containerTag });
+
+    // Process results into deduplicated nodes and edges
+    // (the UNWIND cross product repeats each pair)
+    const nodeMap = new Map();
+    const edgeMap = new Map();
+
+    for (const record of result.records) {
+      const n = record.get('n');
+      if (n) {
+        nodeMap.set(n.properties.id, this.recordToNode(n));
+      }
+
+      const srcId = record.get('srcId');
+      const tgtId = record.get('tgtId');
+      if (srcId && tgtId) {
+        edgeMap.set(`${srcId}|${record.get('relType')}|${tgtId}`, {
+          sourceId: srcId,
+          targetId: tgtId,
+          relationship: record.get('relType'),
+          weight: record.get('weight') ?? 1.0
+        });
+      }
+    }
+
+    return {
+      nodes: Array.from(nodeMap.values()),
+      edges: Array.from(edgeMap.values())
+    };
+  } finally {
+    await session.close();
+  }
+}
+```
+
+**Graph traversal for "Dracula travel Transylvania England":**
+
+```
+Seed: [Dracula]
+
+Depth 0: Dracula
+           │
+           ▼
+Depth 1: RESIDES_AT → Transylvania
+         IMPRISONS   → Jonathan Harker
+         TRAVELS_ON  → The Demeter
+           │
+           ▼
+Depth 2: (from The Demeter)
+         ARRIVES_AT  → England
+         DEPARTS_FROM→ Transylvania
+
+Result: Dracula, Transylvania, Jonathan Harker, The Demeter, England
+Edges: RESIDES_AT, IMPRISONS, TRAVELS_ON, ARRIVES_AT, DEPARTS_FROM
+```
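
The traversal above amounts to a bounded breadth-first search. A directed in-memory sketch for intuition (the real expansion runs inside Neo4j via APOC and follows edges in both directions; the `Whitby`/`CONTAINS` edge is made-up illustration data):

```typescript
interface Edge { sourceId: string; targetId: string; relationship: string; }

// Expand outward from the seeds for at most `maxDepth` hops.
function expand(seeds: string[], edges: Edge[], maxDepth: number): Set<string> {
  const adj = new Map<string, string[]>();
  for (const e of edges) {
    if (!adj.has(e.sourceId)) adj.set(e.sourceId, []);
    adj.get(e.sourceId)!.push(e.targetId);
  }
  const visited = new Set(seeds);
  let frontier = seeds;
  for (let depth = 0; depth < maxDepth; depth++) {
    const next: string[] = [];
    for (const id of frontier) {
      for (const nbr of adj.get(id) ?? []) {
        if (!visited.has(nbr)) { visited.add(nbr); next.push(nbr); }
      }
    }
    frontier = next;
  }
  return visited;
}

const edges: Edge[] = [
  { sourceId: 'Dracula', targetId: 'Demeter', relationship: 'TRAVELS_ON' },
  { sourceId: 'Demeter', targetId: 'England', relationship: 'ARRIVES_AT' },
  { sourceId: 'England', targetId: 'Whitby', relationship: 'CONTAINS' },
];
const reached = expand(['Dracula'], edges, 2);
// → Dracula, Demeter, England (Whitby is 3 hops away, so excluded)
```

The 2-hop bound keeps the context window small while still connecting "Dracula" to "England" through the intermediate ship node.
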
+
+### Phase 5: Context Formatting
+
+```typescript
+// src/runtime/knowledge/context-builder.ts
+formatContext(nodes, edges, instances) {
+  let context = '## Knowledge Graph Context\n\n';
+
+  // Entities by type
+  context += '### Relevant Entities\n';
+  const byType = this.groupByType(nodes);
+  for (const [type, entities] of byType) {
+    context += `\n**${type}s:**\n`;
+    for (const entity of entities) {
+      context += `- ${entity.name}`;
+      if (entity.description) {
+        context += `: ${entity.description}`;
+      }
+      context += '\n';
+    }
+  }
+
+  // Relationships
+  context += '\n### Relationships\n';
+  for (const edge of edges) {
+    const source = nodes.find(n => n.id === edge.sourceId);
+    const target = nodes.find(n => n.id === edge.targetId);
+    if (source && target) {
+      context += `- ${source.name} ${edge.relationship} ${target.name}\n`;
+    }
+  }
+
+  return context;
+}
+```
+
+**Generated Context:**
+
+```markdown
+## Knowledge Graph Context
+
+### Relevant Entities
+
+**Persons:**
+
+- Count Dracula: Ancient vampire, Transylvanian nobleman
+- Jonathan Harker: Young English lawyer
+
+**Locations:**
+
+- Transylvania: Region in Romania where Dracula lives
+- England: Destination country
+
+**Products:**
+
+- The Demeter: Russian sailing ship
+
+### Relationships
+
+- Count Dracula RESIDES_AT Transylvania
+- Count Dracula IMPRISONS Jonathan Harker
+- Count Dracula TRAVELS_ON The Demeter
+- The Demeter ARRIVES_AT England
+```
+
+### Phase 6: LLM Response
+
+The LLM receives:
+
+```
+User: How does Dracula travel from Transylvania to England?
+
+## Knowledge Graph Context
+
+### Relevant Entities
+**Persons:**
+- Count Dracula: Ancient vampire, Transylvanian nobleman
+...
+
+### Relationships
+- Count Dracula TRAVELS_ON The Demeter
+- The Demeter ARRIVES_AT England
+...
+```
+
+**LLM Response:**
+
+> "Count Dracula travels from Transylvania to England aboard the Russian sailing ship **The Demeter**. He boards the ship at a port in Transylvania and arrives at **Whitby**, a port town in Yorkshire, England. During the voyage, Dracula uses his supernatural abilities to transform into mist, allowing him to move undetected among the crew and cargo."
+
+---
+
+## Key Configuration
+
+### Environment Variables
+
+```bash
+# Required
+export AGENTLANG_OPENAI_KEY=sk-...
+
+# Optional: Neo4j
+export GRAPH_DB_URI=bolt://localhost:7687
+export GRAPH_DB_USER=neo4j
+export GRAPH_DB_PASSWORD=password
+
+# Optional: Processing tuning
+export KG_CHUNK_SIZE=1000
+export KG_CHUNK_OVERLAP=200
+export KG_MEGA_BATCH_CHARS=50000
+export KG_CORE_ENTITY_LIMIT=60
+export KG_MIN_ENTITY_SALIENCE=3
+export KG_MAX_SEED_NODES=5
+export KG_MAX_GRAPH_DEPTH=2
+```
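
These knobs presumably resolve to numbers with built-in defaults. A sketch of that pattern (variable names from the list above; the parsing logic is hypothetical, not the project's actual config code):

```typescript
// Read a numeric tuning knob from the environment, falling back to a default.
function envNumber(name: string, fallback: number): number {
  const raw = process.env[name];
  const parsed = raw === undefined ? NaN : Number(raw);
  return Number.isFinite(parsed) ? parsed : fallback;
}

const CHUNK_SIZE = envNumber('KG_CHUNK_SIZE', 1000);
const CHUNK_OVERLAP = envNumber('KG_CHUNK_OVERLAP', 200);
const MAX_GRAPH_DEPTH = envNumber('KG_MAX_GRAPH_DEPTH', 2);
```
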
+
+### Database Schema
+
+**SQLite (dracula.db):**
+
+```sql
+-- KnowledgeNode: Main entity store
+CREATE TABLE KnowledgeNode (
+  id TEXT PRIMARY KEY,
+  name TEXT,
+  type TEXT,
+  description TEXT,
+  containerTag TEXT,
+  isLatest BOOLEAN,
+  confidence FLOAT,
+  createdAt DATETIME
+);
+
+-- KnowledgeNode_vector: Vector embeddings (sqlite-vec)
+CREATE VIRTUAL TABLE KnowledgeNode_vector USING vec0(
+  id TEXT PRIMARY KEY,
+  embedding FLOAT[1536]
+);
+
+-- KnowledgeEdge: Relationships
+CREATE TABLE KnowledgeEdge (
+  id TEXT PRIMARY KEY,
+  sourceId TEXT,
+  targetId TEXT,
+  relType TEXT,
+  weight FLOAT,
+  containerTag TEXT
+);
+
+-- KnowledgeSession: Session tracking
+CREATE TABLE KnowledgeSession (
+  id TEXT PRIMARY KEY,
+  agentId TEXT,
+  userId TEXT,
+  containerTag TEXT,
+  documentsProcessed BOOLEAN
+);
+```
+
+**Neo4j:**
+
+```cypher
+// Nodes
+(:KnowledgeNode {
+  id: "uuid",
+  name: "Count Dracula",
+  type: "Person",
+  description: "...",
+  containerTag: "Dracula/dracula:shared"
+})
+
+// Edges
+(:KnowledgeNode)-[:TRAVELS_ON {weight: 2.0}]->(:KnowledgeNode)
+```
+
+---
+
+## Performance Characteristics
+
+| Operation             | Time      | Details                            |
+| --------------------- | --------- | ---------------------------------- |
+| **First Startup**     | 3-5 min   | Document processing (18 LLM calls) |
+| **Subsequent Starts** | <1s       | Skips processing (already done)    |
+| **Query Context**     | 300-500ms | Vector search + graph expansion    |
+| **Neo4j Sync**        | 10-30s    | After document processing          |
+
+**Large Document Handling:**
+
+- Dracula: ~870KB → ~60 nodes, ~200 edges
+- Streaming chunk processing keeps memory constant
+- Parallel LLM calls (5 concurrent) for speed
+
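The concurrency-limited fan-out can be sketched with a small worker pool. This is a hypothetical shape for the `runParallel(items, fn, limit)` helper used earlier; the project's actual implementation may differ:

```typescript
// Run `fn` over `items` with at most `limit` calls in flight at once,
// preserving input order in the results array.
async function runParallel<T, R>(
  items: T[],
  fn: (item: T, i: number) => Promise<R>,
  limit: number
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  const workers = Array.from({ length: Math.min(limit, items.length) }, async () => {
    // The single-threaded event loop makes `next++` safe between awaits
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  });
  await Promise.all(workers);
  return results;
}
```

A shared cursor instead of pre-partitioned slices means fast batches don't leave a worker idle while a slow batch finishes.
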
+---
+
+## Code Reference Map
+
+| Component               | File                                          | Purpose               |
+| ----------------------- | --------------------------------------------- | --------------------- |
+| **CLI Entry**           | `src/cli/main.ts`                             | Startup orchestration |
+| **Document Processing** | `src/runtime/knowledge/service.ts`            | Main API              |
+| **Chunking**            | `src/runtime/knowledge/document-processor.ts` | Text processing       |
+| **Entity Extraction**   | `src/runtime/knowledge/extractor.ts`          | LLM calls             |
+| **Deduplication**       | `src/runtime/knowledge/deduplicator.ts`       | Semantic matching     |
+| **Context Builder**     | `src/runtime/knowledge/context-builder.ts`    | Query handling        |
+| **Neo4j Adapter**       | `src/runtime/graph/neo4j.ts`                  | Graph operations      |
+| **Embedding Service**   | `src/runtime/resolvers/sqldb/impl.ts`         | Vector generation     |
+| **LLM Prompts**         | `src/runtime/knowledge/prompts.ts`            | Extraction prompts    |
+
+---
+
+## Summary
+
+The Dracula example demonstrates:
+
+1. **Document Ingestion**: Text → Chunks → LLM extraction → Embeddings → SQL + Neo4j
+2. **Hybrid Storage**: SQLite (authoritative + vectors) + Neo4j (graph traversal)
+3. **Semantic Retrieval**: Exact match → Vector search → Graph expansion → Structured context
+4. **Intelligent Responses**: LLM answers using entity relationships, not just text
+
+**Result:** The agent doesn't just search for text—it **understands** the relationships between characters, locations, and events to provide contextually rich answers.
+
+---
diff --git a/example/dracula/README.md b/example/dracula/README.md
new file mode 100644
index 0000000..a87855d
--- /dev/null
+++ b/example/dracula/README.md
@@ -0,0 +1,222 @@
+# Dracula - Knowledge Graph Example
+
+This example demonstrates Agentlang's knowledge graph memory system by building a knowledge graph from "Dracula" by Bram Stoker (1897, public domain via [Project Gutenberg](https://www.gutenberg.org/ebooks/345)).
+
+When you first send a message, the system:
+
+1. Processes the document into chunks
+2. Extracts entities (characters, locations, events) and relationships using LLM
+3. Deduplicates entities using exact + semantic similarity (when embeddings enabled)
+4. Stores everything in Neo4j (graph database) + Agentlang entities (vector search)
+5. On subsequent queries, retrieves relevant context via vector search + graph traversal
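+
+Step 1 (chunking) amounts to a sliding window over the text. A minimal sketch, using the chunk size and overlap from this example (1000 chars, 200-char overlap); the function name is illustrative, not the actual API in `src/runtime/knowledge/document-processor.ts`:
+
+```typescript
+// Split text into overlapping fixed-size chunks. With size=1000 and
+// overlap=200, each chunk starts 800 chars after the previous one, so
+// entity mentions that straddle a boundary appear whole in one chunk.
+function chunkText(text: string, size = 1000, overlap = 200): string[] {
+  const chunks: string[] = [];
+  const step = size - overlap; // advance 800 chars per chunk
+  for (let start = 0; start < text.length; start += step) {
+    chunks.push(text.slice(start, start + size));
+    if (start + size >= text.length) break; // last chunk reached the end
+  }
+  return chunks;
+}
+```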
+
+## Prerequisites
+
+- Node.js >= 20
+- An OpenAI API key (for entity extraction and chat). You can switch to Anthropic in `src/core.al`.
+- Docker (for Neo4j; optional but recommended)
+
+## Setup
+
+### 1. Start Neo4j (Optional)
+
+Neo4j provides graph storage and traversal. Without it, the system still works using Agentlang's built-in vector search, but you won't get multi-hop graph traversal or the Neo4j Browser visualization.
+
+```bash
+docker run -d --name neo4j-dracula \
+  -p 7474:7474 -p 7687:7687 \
+  -e NEO4J_AUTH=neo4j/password \
+  neo4j:5
+```
+
+Set the environment variables:
+
+```bash
+export GRAPH_DB_URI=bolt://localhost:7687
+export GRAPH_DB_USER=neo4j
+export GRAPH_DB_PASSWORD=password
+```
+
+### 2. Set your API key
+
+```bash
+export AGENTLANG_OPENAI_KEY=sk-...
+```
+
+Or if using Anthropic (edit `src/core.al` to change `service "openai"` to `service "anthropic"`):
+
+```bash
+export AGENTLANG_ANTHROPIC_KEY=sk-ant-...
+```
+
+### 3. Build Agentlang
+
+From the agentlang root directory:
+
+```bash
+npm run build
+```
+
+### 4. Run the example
+
+```bash
+node ./bin/cli.js run example/dracula
+```
+
+## Performance Notes
+
+Dracula is a large novel (~870KB, about 6x the size of Alice in Wonderland). The streaming approach keeps memory constant regardless of size, but expect more LLM calls:
+
+- ~18 mega-batches → **18 LLM calls** for entity extraction + **18 for relationships** = **~36 total**
+- **First request will take 3-5 minutes** depending on LLM response times
+- Subsequent requests are fast (<1s)
+
+### Recommended settings for large documents
+
+```bash
+# Increase mega-batch size to reduce LLM calls further (halves to ~18 total)
+export KG_MEGA_BATCH_CHARS=100000
+
+# Higher salience threshold filters noise from a longer text
+export KG_MIN_ENTITY_SALIENCE=3
+export KG_MIN_ENTITY_MENTIONS=3
+
+# Run with exposed GC for memory efficiency
+node --expose-gc ./bin/cli.js run example/dracula
+```
+
+## Usage
+
+### Ask questions about Dracula
+
+```bash
+# Characters
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "Who is Count Dracula and what are his powers?"}' | jq .
+
+# Relationships
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "What is the relationship between Jonathan Harker and Mina?"}' | jq .
+
+# The hunt
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "Who joins Van Helsing in hunting Dracula and what role does each play?"}' | jq .
+
+# Lucy's fate
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "What happens to Lucy Westenra?"}' | jq .
+
+# Locations
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "Describe Castle Dracula and its significance in the story."}' | jq .
+
+# The journey
+curl -s -X POST http://localhost:8080/Dracula/ask \
+  -H 'Content-Type: application/json' \
+  -d '{"question": "How does Dracula travel from Transylvania to England?"}' | jq .
+```
+
+### Inspect the Knowledge Graph (Neo4j Browser)
+
+If you started Neo4j, open http://localhost:7474 in your browser and run:
+
+```cypher
+// See all nodes
+MATCH (n:KnowledgeNode) RETURN n LIMIT 50
+
+// See all relationships
+MATCH (n:KnowledgeNode)-[r]->(m:KnowledgeNode)
+RETURN n, r, m LIMIT 100
+
+// Find Dracula's relationships
+MATCH (n:KnowledgeNode {name: 'Count Dracula'})-[r]->(m)
+RETURN n, r, m
+
+// Find all characters
+MATCH (n:KnowledgeNode {type: 'Person'})
+RETURN n.name, n.description
+ORDER BY n.name
+
+// Find all locations
+MATCH (n:KnowledgeNode {type: 'Location'})
+RETURN n.name, n.description
+```
+
+## Expected Knowledge Graph
+
+After processing, the knowledge graph should contain entities like:
+
+```
+Characters (Person):
+├── Count Dracula - Ancient vampire, Transylvanian nobleman
+├── Jonathan Harker - Solicitor, travels to Castle Dracula
+├── Mina Harker (née Murray) - Jonathan's wife, resourceful and brave
+├── Abraham Van Helsing - Dutch professor, vampire expert
+├── Dr. John Seward - Psychiatrist, runs asylum near Carfax
+├── Arthur Holmwood (Lord Godalming) - Lucy's fiancé, nobleman
+├── Quincey Morris - American, Texan adventurer
+├── Lucy Westenra - Mina's friend, Dracula's victim
+├── R.M. Renfield - Seward's patient, eats insects for "life"
+├── Peter Hawkins - Jonathan's employer, solicitor
+├── Captain of the Demeter - Ship that carries Dracula to England
+└── The Three Vampire Women - Dracula's brides at the castle
+
+Locations:
+├── Castle Dracula - Ancient fortress in the Carpathian Mountains
+├── Transylvania - Region in Romania, Dracula's homeland
+├── Borgo Pass - Mountain pass near the castle
+├── Whitby - English coastal town where Dracula arrives
+├── Carfax Abbey - Dracula's English estate, next to asylum
+├── Dr. Seward's Asylum - Adjacent to Carfax
+├── Hillingham - Lucy's home
+├── Piccadilly - Dracula's London property
+├── The Demeter - Ship from Varna to Whitby
+└── Varna - Port city, Dracula's escape route
+
+Key Relationships:
+Jonathan Harker ──MARRIED_TO──> Mina Harker
+Count Dracula ──IMPRISONS──> Jonathan Harker
+Count Dracula ──FEEDS_ON──> Lucy Westenra
+Count Dracula ──FEEDS_ON──> Mina Harker
+Van Helsing ──LEADS──> Hunting Party
+Arthur Holmwood ──ENGAGED_TO──> Lucy Westenra
+Dr. Seward ──TREATS──> Renfield
+Renfield ──SERVES──> Count Dracula
+Count Dracula ──TRAVELS_ON──> The Demeter
+Count Dracula ──RESIDES_AT──> Castle Dracula
+Count Dracula ──PURCHASES──> Carfax Abbey
+Van Helsing ──DESTROYS──> Count Dracula
+```
+
+## Architecture
+
+```
+Document (dracula.txt, ~870KB)
+    ↓
+Chunking (1000 chars, 200 overlap)
+    ↓
+Entity Extraction (LLM per chunk)
+    ↓
+Semantic Deduplication (cosine similarity > 0.85)
+    ↓
+┌──────────────────┬───────────────────┐
+│  Neo4j           │  Agentlang        │
+│  (Graph DB)      │  (KnowledgeNode)  │
+│  - Nodes         │  - Vector search  │
+│  - Edges         │  - Instance data  │
+│  - BFS traversal │  - @fullTextSearch│
+└──────────────────┴───────────────────┘
+    ↓
+Query: "Who hunts Dracula?"
+    ↓
+Vector Search → Seed nodes (Dracula, Van Helsing)
+    ↓
+Graph Expansion (2-hop BFS) → Related entities
+    ↓
+Structured Context → LLM → Answer
+```
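+
+The semantic deduplication stage above compares entity embeddings by cosine similarity and merges pairs scoring above 0.85. A minimal sketch of that check (the threshold comes from this example; the function names are illustrative, not the actual `src/runtime/knowledge/deduplicator.ts` API):
+
+```typescript
+// Cosine similarity between two embedding vectors:
+// dot(a, b) / (|a| * |b|), in [-1, 1] for real-valued embeddings.
+function cosineSimilarity(a: number[], b: number[]): number {
+  let dot = 0, normA = 0, normB = 0;
+  for (let i = 0; i < a.length; i++) {
+    dot += a[i] * b[i];
+    normA += a[i] * a[i];
+    normB += b[i] * b[i];
+  }
+  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
+}
+
+// Two entities are treated as duplicates when their embeddings score
+// above the threshold (0.85 in this example).
+function isDuplicate(a: number[], b: number[], threshold = 0.85): boolean {
+  return cosineSimilarity(a, b) > threshold;
+}
+```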
diff --git a/example/dracula/config.al b/example/dracula/config.al
new file mode 100644
index 0000000..f46f50f
--- /dev/null
+++ b/example/dracula/config.al
@@ -0,0 +1,11 @@
+{
+  "agentlang": {
+    "service": {
+        "port": "#js parseInt(process.env.SERVICE_PORT || '8080')"
+    },
+    "store": {
+        "type": "sqlite",
+        "dbname": "dracula.db"
+    }
+   }
+}
diff --git a/example/dracula/docs/dracula.txt b/example/dracula/docs/dracula.txt
new file mode 100644
index 0000000..05acc01
--- /dev/null
+++ b/example/dracula/docs/dracula.txt
@@ -0,0 +1,15475 @@
+*** START OF THE PROJECT GUTENBERG EBOOK 345 ***
+
+
+
+
+                                DRACULA
+
+                                  _by_
+
+                              Bram Stoker
+
+                        [Illustration: colophon]
+
+                                NEW YORK
+
+                            GROSSET & DUNLAP
+
+                              _Publishers_
+
+      Copyright, 1897, in the United States of America, according
+                   to Act of Congress, by Bram Stoker
+
+                        [_All rights reserved._]
+
+                      PRINTED IN THE UNITED STATES
+                                   AT
+               THE COUNTRY LIFE PRESS, GARDEN CITY, N.Y.
+
+
+
+
+                                   TO
+
+                             MY DEAR FRIEND
+
+                               HOMMY-BEG
+
+
+
+
+Contents
+
+CHAPTER I. Jonathan Harker’s Journal
+CHAPTER II. Jonathan Harker’s Journal
+CHAPTER III. Jonathan Harker’s Journal
+CHAPTER IV. Jonathan Harker’s Journal
+CHAPTER V. Letters—Lucy and Mina
+CHAPTER VI. Mina Murray’s Journal
+CHAPTER VII. Cutting from “The Dailygraph,” 8 August
+CHAPTER VIII. Mina Murray’s Journal
+CHAPTER IX. Mina Murray’s Journal
+CHAPTER X. Mina Murray’s Journal
+CHAPTER XI. Lucy Westenra’s Diary
+CHAPTER XII. Dr. Seward’s Diary
+CHAPTER XIII. Dr. Seward’s Diary
+CHAPTER XIV. Mina Harker’s Journal
+CHAPTER XV. Dr. Seward’s Diary
+CHAPTER XVI. Dr. Seward’s Diary
+CHAPTER XVII. Dr. Seward’s Diary
+CHAPTER XVIII. Dr. Seward’s Diary
+CHAPTER XIX. Jonathan Harker’s Journal
+CHAPTER XX. Jonathan Harker’s Journal
+CHAPTER XXI. Dr. Seward’s Diary
+CHAPTER XXII. Jonathan Harker’s Journal
+CHAPTER XXIII. Dr. Seward’s Diary
+CHAPTER XXIV. Dr. Seward’s Phonograph Diary, spoken by Van Helsing
+CHAPTER XXV. Dr. Seward’s Diary
+CHAPTER XXVI. Dr. Seward’s Diary
+CHAPTER XXVII. Mina Harker’s Journal
+
+
+
+
+How these papers have been placed in sequence will be made manifest in
+the reading of them. All needless matters have been eliminated, so that
+a history almost at variance with the possibilities of later-day belief
+may stand forth as simple fact. There is throughout no statement of
+past things wherein memory may err, for all the records chosen are
+exactly contemporary, given from the standpoints and within the range
+of knowledge of those who made them.
+
+
+
+
+DRACULA
+
+
+
+
+CHAPTER I
+
+JONATHAN HARKER’S JOURNAL
+
+(_Kept in shorthand._)
+
+
+_3 May. Bistritz._--Left Munich at 8:35 P. M., on 1st May, arriving at
+Vienna early next morning; should have arrived at 6:46, but train was an
+hour late. Buda-Pesth seems a wonderful place, from the glimpse which I
+got of it from the train and the little I could walk through the
+streets. I feared to go very far from the station, as we had arrived
+late and would start as near the correct time as possible. The
+impression I had was that we were leaving the West and entering the
+East; the most western of splendid bridges over the Danube, which is
+here of noble width and depth, took us among the traditions of Turkish
+rule.
+
+We left in pretty good time, and came after nightfall to Klausenburgh.
+Here I stopped for the night at the Hotel Royale. I had for dinner, or
+rather supper, a chicken done up some way with red pepper, which was
+very good but thirsty. (_Mem._, get recipe for Mina.) I asked the
+waiter, and he said it was called “paprika hendl,” and that, as it was a
+national dish, I should be able to get it anywhere along the
+Carpathians. I found my smattering of German very useful here; indeed, I
+don’t know how I should be able to get on without it.
+
+Having had some time at my disposal when in London, I had visited the
+British Museum, and made search among the books and maps in the library
+regarding Transylvania; it had struck me that some foreknowledge of the
+country could hardly fail to have some importance in dealing with a
+nobleman of that country. I find that the district he named is in the
+extreme east of the country, just on the borders of three states,
+Transylvania, Moldavia and Bukovina, in the midst of the Carpathian
+mountains; one of the wildest and least known portions of Europe. I was
+not able to light on any map or work giving the exact locality of the
+Castle Dracula, as there are no maps of this country as yet to compare
+with our own Ordnance Survey maps; but I found that Bistritz, the post
+town named by Count Dracula, is a fairly well-known place. I shall enter
+here some of my notes, as they may refresh my memory when I talk over my
+travels with Mina.
+
+In the population of Transylvania there are four distinct nationalities:
+Saxons in the South, and mixed with them the Wallachs, who are the
+descendants of the Dacians; Magyars in the West, and Szekelys in the
+East and North. I am going among the latter, who claim to be descended
+from Attila and the Huns. This may be so, for when the Magyars conquered
+the country in the eleventh century they found the Huns settled in it. I
+read that every known superstition in the world is gathered into the
+horseshoe of the Carpathians, as if it were the centre of some sort of
+imaginative whirlpool; if so my stay may be very interesting. (_Mem._, I
+must ask the Count all about them.)
+
+I did not sleep well, though my bed was comfortable enough, for I had
+all sorts of queer dreams. There was a dog howling all night under my
+window, which may have had something to do with it; or it may have been
+the paprika, for I had to drink up all the water in my carafe, and was
+still thirsty. Towards morning I slept and was wakened by the continuous
+knocking at my door, so I guess I must have been sleeping soundly then.
+I had for breakfast more paprika, and a sort of porridge of maize flour
+which they said was “mamaliga,” and egg-plant stuffed with forcemeat, a
+very excellent dish, which they call “impletata.” (_Mem._, get recipe
+for this also.) I had to hurry breakfast, for the train started a little
+before eight, or rather it ought to have done so, for after rushing to
+the station at 7:30 I had to sit in the carriage for more than an hour
+before we began to move. It seems to me that the further east you go the
+more unpunctual are the trains. What ought they to be in China?
+
+All day long we seemed to dawdle through a country which was full of
+beauty of every kind. Sometimes we saw little towns or castles on the
+top of steep hills such as we see in old missals; sometimes we ran by
+rivers and streams which seemed from the wide stony margin on each side
+of them to be subject to great floods. It takes a lot of water, and
+running strong, to sweep the outside edge of a river clear. At every
+station there were groups of people, sometimes crowds, and in all sorts
+of attire. Some of them were just like the peasants at home or those I
+saw coming through France and Germany, with short jackets and round hats
+and home-made trousers; but others were very picturesque. The women
+looked pretty, except when you got near them, but they were very clumsy
+about the waist. They had all full white sleeves of some kind or other,
+and most of them had big belts with a lot of strips of something
+fluttering from them like the dresses in a ballet, but of course there
+were petticoats under them. The strangest figures we saw were the
+Slovaks, who were more barbarian than the rest, with their big cow-boy
+hats, great baggy dirty-white trousers, white linen shirts, and enormous
+heavy leather belts, nearly a foot wide, all studded over with brass
+nails. They wore high boots, with their trousers tucked into them, and
+had long black hair and heavy black moustaches. They are very
+picturesque, but do not look prepossessing. On the stage they would be
+set down at once as some old Oriental band of brigands. They are,
+however, I am told, very harmless and rather wanting in natural
+self-assertion.
+
+It was on the dark side of twilight when we got to Bistritz, which is a
+very interesting old place. Being practically on the frontier--for the
+Borgo Pass leads from it into Bukovina--it has had a very stormy
+existence, and it certainly shows marks of it. Fifty years ago a series
+of great fires took place, which made terrible havoc on five separate
+occasions. At the very beginning of the seventeenth century it underwent
+a siege of three weeks and lost 13,000 people, the casualties of war
+proper being assisted by famine and disease.
+
+Count Dracula had directed me to go to the Golden Krone Hotel, which I
+found, to my great delight, to be thoroughly old-fashioned, for of
+course I wanted to see all I could of the ways of the country. I was
+evidently expected, for when I got near the door I faced a
+cheery-looking elderly woman in the usual peasant dress--white
+undergarment with long double apron, front, and back, of coloured stuff
+fitting almost too tight for modesty. When I came close she bowed and
+said, “The Herr Englishman?” “Yes,” I said, “Jonathan Harker.” She
+smiled, and gave some message to an elderly man in white shirt-sleeves,
+who had followed her to the door. He went, but immediately returned with
+a letter:--
+
+     “My Friend.--Welcome to the Carpathians. I am anxiously expecting
+     you. Sleep well to-night. At three to-morrow the diligence will
+     start for Bukovina; a place on it is kept for you. At the Borgo
+     Pass my carriage will await you and will bring you to me. I trust
+     that your journey from London has been a happy one, and that you
+     will enjoy your stay in my beautiful land.
+
+“Your friend,
+
+“DRACULA.”
+
+
+_4 May._--I found that my landlord had got a letter from the Count,
+directing him to secure the best place on the coach for me; but on
+making inquiries as to details he seemed somewhat reticent, and
+pretended that he could not understand my German. This could not be
+true, because up to then he had understood it perfectly; at least, he
+answered my questions exactly as if he did. He and his wife, the old
+lady who had received me, looked at each other in a frightened sort of
+way. He mumbled out that the money had been sent in a letter, and that
+was all he knew. When I asked him if he knew Count Dracula, and could
+tell me anything of his castle, both he and his wife crossed themselves,
+and, saying that they knew nothing at all, simply refused to speak
+further. It was so near the time of starting that I had no time to ask
+any one else, for it was all very mysterious and not by any means
+comforting.
+
+Just before I was leaving, the old lady came up to my room and said in a
+very hysterical way:
+
+“Must you go? Oh! young Herr, must you go?” She was in such an excited
+state that she seemed to have lost her grip of what German she knew, and
+mixed it all up with some other language which I did not know at all. I
+was just able to follow her by asking many questions. When I told her
+that I must go at once, and that I was engaged on important business,
+she asked again:
+
+“Do you know what day it is?” I answered that it was the fourth of May.
+She shook her head as she said again:
+
+“Oh, yes! I know that! I know that, but do you know what day it is?” On
+my saying that I did not understand, she went on:
+
+“It is the eve of St. George’s Day. Do you not know that to-night, when
+the clock strikes midnight, all the evil things in the world will have
+full sway? Do you know where you are going, and what you are going to?”
+She was in such evident distress that I tried to comfort her, but
+without effect. Finally she went down on her knees and implored me not
+to go; at least to wait a day or two before starting. It was all very
+ridiculous but I did not feel comfortable. However, there was business
+to be done, and I could allow nothing to interfere with it. I therefore
+tried to raise her up, and said, as gravely as I could, that I thanked
+her, but my duty was imperative, and that I must go. She then rose and
+dried her eyes, and taking a crucifix from her neck offered it to me. I
+did not know what to do, for, as an English Churchman, I have been
+taught to regard such things as in some measure idolatrous, and yet it
+seemed so ungracious to refuse an old lady meaning so well and in such a
+state of mind. She saw, I suppose, the doubt in my face, for she put the
+rosary round my neck, and said, “For your mother’s sake,” and went out
+of the room. I am writing up this part of the diary whilst I am waiting
+for the coach, which is, of course, late; and the crucifix is still
+round my neck. Whether it is the old lady’s fear, or the many ghostly
+traditions of this place, or the crucifix itself, I do not know, but I
+am not feeling nearly as easy in my mind as usual. If this book should
+ever reach Mina before I do, let it bring my good-bye. Here comes the
+coach!
+
+       *       *       *       *       *
+
+_5 May. The Castle._--The grey of the morning has passed, and the sun is
+high over the distant horizon, which seems jagged, whether with trees or
+hills I know not, for it is so far off that big things and little are
+mixed. I am not sleepy, and, as I am not to be called till I awake,
+naturally I write till sleep comes. There are many odd things to put
+down, and, lest who reads them may fancy that I dined too well before I
+left Bistritz, let me put down my dinner exactly. I dined on what they
+called “robber steak”--bits of bacon, onion, and beef, seasoned with red
+pepper, and strung on sticks and roasted over the fire, in the simple
+style of the London cat’s meat! The wine was Golden Mediasch, which
+produces a queer sting on the tongue, which is, however, not
+disagreeable. I had only a couple of glasses of this, and nothing else.
+
+When I got on the coach the driver had not taken his seat, and I saw him
+talking with the landlady. They were evidently talking of me, for every
+now and then they looked at me, and some of the people who were sitting
+on the bench outside the door--which they call by a name meaning
+“word-bearer”--came and listened, and then looked at me, most of them
+pityingly. I could hear a lot of words often repeated, queer words, for
+there were many nationalities in the crowd; so I quietly got my polyglot
+dictionary from my bag and looked them out. I must say they were not
+cheering to me, for amongst them were “Ordog”--Satan, “pokol”--hell,
+“stregoica”--witch, “vrolok” and “vlkoslak”--both of which mean the same
+thing, one being Slovak and the other Servian for something that is
+either were-wolf or vampire. (_Mem._, I must ask the Count about these
+superstitions)
+
+When we started, the crowd round the inn door, which had by this time
+swelled to a considerable size, all made the sign of the cross and
+pointed two fingers towards me. With some difficulty I got a
+fellow-passenger to tell me what they meant; he would not answer at
+first, but on learning that I was English, he explained that it was a
+charm or guard against the evil eye. This was not very pleasant for me,
+just starting for an unknown place to meet an unknown man; but every one
+seemed so kind-hearted, and so sorrowful, and so sympathetic that I
+could not but be touched. I shall never forget the last glimpse which I
+had of the inn-yard and its crowd of picturesque figures, all crossing
+themselves, as they stood round the wide archway, with its background of
+rich foliage of oleander and orange trees in green tubs clustered in the
+centre of the yard. Then our driver, whose wide linen drawers covered
+the whole front of the box-seat--“gotza” they call them--cracked his big
+whip over his four small horses, which ran abreast, and we set off on
+our journey.
+
+I soon lost sight and recollection of ghostly fears in the beauty of the
+scene as we drove along, although had I known the language, or rather
+languages, which my fellow-passengers were speaking, I might not have
+been able to throw them off so easily. Before us lay a green sloping
+land full of forests and woods, with here and there steep hills, crowned
+with clumps of trees or with farmhouses, the blank gable end to the
+road. There was everywhere a bewildering mass of fruit blossom--apple,
+plum, pear, cherry; and as we drove by I could see the green grass under
+the trees spangled with the fallen petals. In and out amongst these
+green hills of what they call here the “Mittel Land” ran the road,
+losing itself as it swept round the grassy curve, or was shut out by the
+straggling ends of pine woods, which here and there ran down the
+hillsides like tongues of flame. The road was rugged, but still we
+seemed to fly over it with a feverish haste. I could not understand then
+what the haste meant, but the driver was evidently bent on losing no
+time in reaching Borgo Prund. I was told that this road is in summertime
+excellent, but that it had not yet been put in order after the winter
+snows. In this respect it is different from the general run of roads in
+the Carpathians, for it is an old tradition that they are not to be kept
+in too good order. Of old the Hospadars would not repair them, lest the
+Turk should think that they were preparing to bring in foreign troops,
+and so hasten the war which was always really at loading point.
+
+Beyond the green swelling hills of the Mittel Land rose mighty slopes
+of forest up to the lofty steeps of the Carpathians themselves. Right
+and left of us they towered, with the afternoon sun falling full upon
+them and bringing out all the glorious colours of this beautiful range,
+deep blue and purple in the shadows of the peaks, green and brown where
+grass and rock mingled, and an endless perspective of jagged rock and
+pointed crags, till these were themselves lost in the distance, where
+the snowy peaks rose grandly. Here and there seemed mighty rifts in the
+mountains, through which, as the sun began to sink, we saw now and again
+the white gleam of falling water. One of my companions touched my arm as
+we swept round the base of a hill and opened up the lofty, snow-covered
+peak of a mountain, which seemed, as we wound on our serpentine way, to
+be right before us:--
+
+“Look! Isten szek!”--“God’s seat!”--and he crossed himself reverently.
+
+As we wound on our endless way, and the sun sank lower and lower behind
+us, the shadows of the evening began to creep round us. This was
+emphasised by the fact that the snowy mountain-top still held the
+sunset, and seemed to glow out with a delicate cool pink. Here and there
+we passed Cszeks and Slovaks, all in picturesque attire, but I noticed
+that goitre was painfully prevalent. By the roadside were many crosses,
+and as we swept by, my companions all crossed themselves. Here and there
+was a peasant man or woman kneeling before a shrine, who did not even
+turn round as we approached, but seemed in the self-surrender of
+devotion to have neither eyes nor ears for the outer world. There were
+many things new to me: for instance, hay-ricks in the trees, and here
+and there very beautiful masses of weeping birch, their white stems
+shining like silver through the delicate green of the leaves. Now and
+again we passed a leiter-wagon--the ordinary peasant’s cart--with its
+long, snake-like vertebra, calculated to suit the inequalities of the
+road. On this were sure to be seated quite a group of home-coming
+peasants, the Cszeks with their white, and the Slovaks with their
+coloured, sheepskins, the latter carrying lance-fashion their long
+staves, with axe at end. As the evening fell it began to get very cold,
+and the growing twilight seemed to merge into one dark mistiness the
+gloom of the trees, oak, beech, and pine, though in the valleys which
+ran deep between the spurs of the hills, as we ascended through the
+Pass, the dark firs stood out here and there against the background of
+late-lying snow. Sometimes, as the road was cut through the pine woods
+that seemed in the darkness to be closing down upon us, great masses of
+greyness, which here and there bestrewed the trees, produced a
+peculiarly weird and solemn effect, which carried on the thoughts and
+grim fancies engendered earlier in the evening, when the falling sunset
+threw into strange relief the ghost-like clouds which amongst the
+Carpathians seem to wind ceaselessly through the valleys. Sometimes the
+hills were so steep that, despite our driver’s haste, the horses could
+only go slowly. I wished to get down and walk up them, as we do at home,
+but the driver would not hear of it. “No, no,” he said; “you must not
+walk here; the dogs are too fierce”; and then he added, with what he
+evidently meant for grim pleasantry--for he looked round to catch the
+approving smile of the rest--“and you may have enough of such matters
+before you go to sleep.” The only stop he would make was a moment’s
+pause to light his lamps.
+
+When it grew dark there seemed to be some excitement amongst the
+passengers, and they kept speaking to him, one after the other, as
+though urging him to further speed. He lashed the horses unmercifully
+with his long whip, and with wild cries of encouragement urged them on
+to further exertions. Then through the darkness I could see a sort…
diff --git a/example/personal_assistant/README.md b/example/personal_assistant/README.md
deleted file mode 100644
index 5259d4a..0000000
--- a/example/personal_assistant/README.md
+++ /dev/null
@@ -1,4 +0,0 @@
-# Personal Assistant with Automatic Memory
-
-This example demonstrates **Agentlang's built-in memory system**. The assistant automatically remembers conversations,
-extracts facts, and retrieves relevant context - all without any memory configuration.
diff --git a/example/personal_assistant/config.al b/example/personal_assistant/config.al
deleted file mode 100644
index 5407748..0000000
--- a/example/personal_assistant/config.al
+++ /dev/null
@@ -1,11 +0,0 @@
-{
-  "agentlang": {
-    "service": {
-        "port": "#js parseInt(process.env.SERVICE_PORT || '8080')"
-    },
-    "store": {
-        "type": "sqlite",
-        "dbname": "pa.db"
-    }
-   }
-}
diff --git a/example/personal_assistant/docs/company_handbook.txt b/example/personal_assistant/docs/company_handbook.txt
deleted file mode 100644
index 4d8cf36..0000000
--- a/example/personal_assistant/docs/company_handbook.txt
+++ /dev/null
@@ -1,49 +0,0 @@
-ACME Corporation Employee Handbook
-==================================
-
-Welcome to ACME Corporation!
-
-Company Values
---------------
-1. Innovation: We encourage creative thinking and new ideas
-2. Integrity: We act with honesty and transparency
-3. Collaboration: We work together to achieve goals
-4. Excellence: We strive for the highest quality
-
-Work Hours
-----------
-- Standard hours: 9:00 AM - 5:00 PM
-- Flexible work arrangements available upon manager approval
-- Core hours (required presence): 10:00 AM - 3:00 PM
-
-Leave Policy
-------------
-- Annual leave: 20 days per year
-- Sick leave: 10 days per year
-- Parental leave: 12 weeks paid
-- Bereavement leave: 5 days
-
-Benefits
---------
-- Health insurance (medical, dental, vision)
-- 401(k) with 6% company match
-- Professional development budget: $2,000/year
-- Home office stipend: $500 one-time
-
-Remote Work Policy
-------------------
-- Hybrid work model: 3 days office, 2 days remote
-- Remote-first positions available for certain roles
-- Equipment provided for home office setup
-
-Expense Reimbursement
---------------------
-- Travel expenses must be pre-approved
-- Per diem for meals: $75/day domestic, $100/day international
-- Receipts required for expenses over $25
-
-Contact Information
--------------------
-- HR Department: hr@acme.com
-- IT Support: it-help@acme.com
-- Facilities: facilities@acme.com
diff --git a/example/personal_assistant/docs/project_guidelines.txt b/example/personal_assistant/docs/project_guidelines.txt
deleted file mode 100644
index c5b8b6f..0000000
--- a/example/personal_assistant/docs/project_guidelines.txt
+++ /dev/null
@@ -1,45 +0,0 @@
-Project Management Guidelines
-=============================
-
-Project Phases
---------------
-1. Initiation: Define scope, objectives, stakeholders
-2. Planning: Create timeline, allocate resources, identify risks
-3. Execution: Implement plan, manage team, track progress
-4. Monitoring: Review KPIs, adjust as needed, status reports
-5. Closure: Deliver results, document lessons learned
-
-Meeting Guidelines
-------------------
-- Daily standups: 15 minutes max, async option available
-- Sprint planning: Every 2 weeks, 2 hours
-- Retrospectives: End of each sprint, 1 hour
-- All hands: Monthly, first Tuesday
-
-Communication Channels
-----------------------
-- Urgent issues: Slack #urgent or phone call
-- Project updates: Slack project channels
-- Formal decisions: Email with "DECISION:" prefix
-- Documentation: Confluence wiki
-
-Task Prioritization
--------------------
-- P0 (Critical): Immediate attention, same-day resolution
-- P1 (High): Within 24 hours
-- P2 (Medium): Within 1 week
-- P3 (Low): As bandwidth permits
-
-Code Review Standards
---------------------
-- All changes require at least 1 approval
-- Critical changes need 2 approvals
-- PR description must include: what, why, how to test
-- CI must pass before merge
-
-Deployment Process
-------------------
-- Staging deployment: Any time with approval
-- Production deployment: Tuesdays and Thursdays only
-- Hotfixes: Emergency procedure with VP approval
-- Rollback plan required for all deployments
diff --git a/example/personal_assistant/package.json b/example/personal_assistant/package.json
deleted file mode 100644
index f7e63d8..0000000
--- a/example/personal_assistant/package.json
+++ /dev/null
@@ -1,4 +0,0 @@
-{
-  "name": "PersonalAssistant",
-  "version": "0.0.1"
-}
diff --git a/example/personal_assistant/src/core.al b/example/personal_assistant/src/core.al
deleted file mode 100644
index 567ffa1..0000000
--- a/example/personal_assistant/src/core.al
+++ /dev/null
@@ -1,104 +0,0 @@
-module PersonalAssistant
-
-{agentlang.ai/LLM {
-  name "llm01",
-  service "anthropic",
-  config {"model": "claude-sonnet-4-5"}
-  }, @upsert}
-
- {agentlang.ai/doc {
-   title "company handbook",
-   url "./example/personal_assistant/docs/company_handbook.txt"}}
-
-{agentlang.ai/doc {
-    title "project guidelines",
-    url "./example/personal_assistant/docs/project_guidelines.txt"}}
-
-// ---- The Assistant Agent ----
-// This agent has memory AUTOMATICALLY. Every conversation is stored,
-// facts are extracted, and context is retrieved for future queries.
-
-@public agent assistant {
-    instruction `You are a friendly and helpful personal assistant with a great memory.
-    Additionally, refer to the documents provided to answer user's query if it matches things there.
-
-Your key traits:
-1. REMEMBER everything the user tells you - their name, preferences, interests, and past conversations
-2. Be warm, conversational, and personable
-3. Reference past conversations naturally ("As you mentioned before..." or "I remember you said...")
-4. Ask follow-up questions to learn more about the user
-5. Be helpful with any task - from simple questions to complex planning
-
-When chatting:
-- Greet users warmly, especially if you remember their name
-- Make connections between current and past conversations
-- Show genuine interest in what the user shares
-- Be concise but engaging
-- If you don't know something, say so honestly
-
-Examples of good responses:
-- "Hi Sarah! Great to hear from you again. How did that project deadline go?"
-- "I remember you mentioned you love hiking. Have you been on any trails lately?"
-- "Based on our conversation yesterday about your Python learning, here are some resources..."
-
-Your memory is automatic - every conversation is stored and relevant context is retrieved.`,
-    documents ["company handbook", "project guidelines"],
-    llm "llm01"
-}
-
-@public workflow chat {
-    {assistant {message chat.message}}
-}
-
-// =============================================================================
-// Personal Assistant with Memory - Usage Guide
-// =============================================================================
-//
-// Start the app:
-//   node ./bin/cli.js run example/personal_assistant
-//
-// Then watch the console output for memory visualization!
-//
-// Example conversation flow:
-//
-// 1. Introduce yourself:
-//    curl -X POST http://localhost:8080/PersonalAssistant/chat \
-//      -H 'Content-Type: application/json' \
-//      -d '{"message": "Hi! My name is Alex and I love programming in Python."}'
-//
-//    Console will show:
-//    ┌─────────────────────────────────────────────────────────────┐
-//    │ MEMORY ADDED                                                │
-//    │ Type: EPISODE                                               │
-//    │ Content: User: Hi! My name is Alex...                       │
-//    └─────────────────────────────────────────────────────────────┘
-//
-// 2. Test memory recall:
-//    curl -X POST http://localhost:8080/PersonalAssistant/chat \
-//      -H 'Content-Type: application/json' \
-//      -d '{"message": "What do you remember about me?"}'
-//
-//    Console will show memory retrieval:
-//    ┌─────────────────────────────────────────────────────────────┐
-//    │ MEMORY RETRIEVAL                                            │
-//    │ Vector search found: 2 memories                             │
-//    │ Retrieved memories:                                         │
-//    │   [EPISODE] User: Hi! My name is Alex...                    │
-//    └─────────────────────────────────────────────────────────────┘
-//
-// 3. Add more context:
-//    curl -X POST http://localhost:8080/PersonalAssistant/chat \
-//      -H 'Content-Type: application/json' \
-//      -d '{"message": "I work at a startup and our tech stack is Python and React."}'
-//
-// 4. Ask about your interests later:
-//    curl -X POST http://localhost:8080/PersonalAssistant/chat \
-//      -H 'Content-Type: application/json' \
-//      -d '{"message": "What programming languages have I mentioned?"}'
-//
-// The graph visualization shows:
-// - Node additions when memories are stored
-// - Edge connections between related memories
-// - Memory retrieval during context building
-//
-// =============================================================================
diff --git a/src/runtime/modules/ai.ts b/src/runtime/modules/ai.ts
index 007421f..28c8bc5 100644
--- a/src/runtime/modules/ai.ts
+++ b/src/runtime/modules/ai.ts
@@ -1189,7 +1189,10 @@ Only return a pure JSON object with no extra text, annotations etc.`;
                 if (docContext.length + entry.length > AgentInstance.MAX_DOC_CONTEXT_CHARS) {
                   const remaining = AgentInstance.MAX_DOC_CONTEXT_CHARS - docContext.length;
                   if (remaining > 200) {
-                    docContext += (docContext ? '\n\n---\n\n' : '') + entry.substring(0, remaining) + '\n...[truncated]';
+                    docContext +=
+                      (docContext ? '\n\n---\n\n' : '') +
+                      entry.substring(0, remaining) +
+                      '\n...[truncated]';
                   }
                   break;
                 }
@@ -1227,7 +1230,8 @@ Only return a pure JSON object with no extra text, annotations etc.`;
               if (docContext.length + entry.length > AgentInstance.MAX_DOC_CONTEXT_CHARS) {
                 const remaining = AgentInstance.MAX_DOC_CONTEXT_CHARS - docContext.length;
                 if (remaining > 200) {
-                  docContext += (docContext ? '\n\n' : '') + entry.substring(0, remaining) + '\n...[truncated]';
+                  docContext +=
+                    (docContext ? '\n\n' : '') + entry.substring(0, remaining) + '\n...[truncated]';
                 }
                 break;
               }
Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/package-lock.json b/package-lock.json
index a4a7734..63bee1a 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -2097,7 +2097,6 @@
       "resolved": "https://registry.npmjs.org/@langchain/core/-/core-1.1.24.tgz",
       "integrity": "sha512-u6l0dmMHN/2PCsY6stXoh9CH1OTlVR5Gjz0JjT1XRPuidAlu3kTq4ivW95xCog/PRhiAsCh6GCEC4/PqhNrcgQ==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@cfworker/json-schema": "^4.0.2",
         "ansi-styles": "^5.0.0",
@@ -3595,7 +3594,6 @@
       "resolved": "https://registry.npmjs.org/@types/express/-/express-5.0.6.tgz",
       "integrity": "sha512-sKYVuV7Sv9fbPIt/442koC7+IIwK5olP1KWeD88e/idgoJqDm3JV/YUiPwkoKK92ylff2MGxSz1CSjsXelx0YA==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@types/body-parser": "*",
         "@types/express-serve-static-core": "^5.0.0",
@@ -3741,7 +3739,6 @@
       "integrity": "sha512-BtE0k6cjwjLZoZixN0t5AKP0kSzlGu7FctRXYuPAm//aaiZhmfq1JwdYpYr1brzEspYyFeF+8XF5j2VK6oalrA==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@typescript-eslint/scope-manager": "8.54.0",
         "@typescript-eslint/types": "8.54.0",
@@ -4080,7 +4077,6 @@
       "integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "bin": {
         "acorn": "bin/acorn"
       },
@@ -4307,7 +4303,8 @@
       "version": "0.4.0",
       "resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
       "integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
-      "license": "MIT"
+      "license": "MIT",
+      "peer": true
     },
     "node_modules/available-typed-arrays": {
       "version": "1.0.7",
@@ -4412,7 +4409,6 @@
       "integrity": "sha512-8VYKM3MjCa9WcaSAI3hzwhmyHVlH8tiGFwf0RlTsZPWJ1I5MkzjiudCo4KC4DxOaL/53A5B1sI/IbldNFDbsKA==",
       "hasInstallScript": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "bindings": "^1.5.0",
         "prebuild-install": "^7.1.1"
@@ -4893,7 +4889,6 @@
       "resolved": "https://registry.npmjs.org/chevrotain/-/chevrotain-11.1.1.tgz",
       "integrity": "sha512-f0yv5CPKaFxfsPTBzX7vGuim4oIC1/gcS7LUGdBSwl2dU6+FON6LVUksdOo1qJjoUvXNn45urgh8C+0a24pACQ==",
       "license": "Apache-2.0",
-      "peer": true,
       "dependencies": {
         "@chevrotain/cst-dts-gen": "11.1.1",
         "@chevrotain/gast": "11.1.1",
@@ -5060,6 +5055,7 @@
       "resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
       "integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
       "license": "MIT",
+      "peer": true,
       "dependencies": {
         "delayed-stream": "~1.0.0"
       },
@@ -5523,6 +5519,7 @@
       "resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
       "integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
       "license": "MIT",
+      "peer": true,
       "engines": {
         "node": ">=0.4.0"
       }
@@ -5727,6 +5724,7 @@
       "resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
       "integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
       "license": "MIT",
+      "peer": true,
       "dependencies": {
         "es-errors": "^1.3.0",
         "get-intrinsic": "^1.2.6",
@@ -5813,7 +5811,6 @@
       "integrity": "sha512-LEyamqS7W5HB3ujJyvi0HQK/dtVINZvd5mAAp9eT5S/ujByGjiZLCzPcHVzuXbpJDJF/cxwHlfceVUDZ2lnSTw==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@eslint-community/eslint-utils": "^4.8.0",
         "@eslint-community/regexpp": "^4.12.1",
@@ -6207,7 +6204,6 @@
       "resolved": "https://registry.npmjs.org/express/-/express-5.2.1.tgz",
       "integrity": "sha512-hIS4idWWai69NezIdRt2xFVofaF4j+6INOpJlVOLDO8zXGpUVEVzIYk12UUi2JzjEzWL3IOAxcTubgz9Po0yXw==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "accepts": "^2.0.0",
         "body-parser": "^2.2.1",
@@ -6498,6 +6494,7 @@
       "resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.5.tgz",
       "integrity": "sha512-8RipRLol37bNs2bhoV67fiTEvdTrbMUYcFTiy3+wuuOnUog2QBHCZWXDRijWQfAkhBj2Uf5UnVaiWwA5vdd82w==",
       "license": "MIT",
+      "peer": true,
       "dependencies": {
         "asynckit": "^0.4.0",
         "combined-stream": "^1.0.8",
@@ -6514,6 +6511,7 @@
       "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
       "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
       "license": "MIT",
+      "peer": true,
       "engines": {
         "node": ">= 0.6"
       }
@@ -6523,6 +6521,7 @@
       "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
       "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
       "license": "MIT",
+      "peer": true,
       "dependencies": {
         "mime-db": "1.52.0"
       },
@@ -6986,7 +6985,6 @@
       "resolved": "https://registry.npmjs.org/hono/-/hono-4.11.9.tgz",
       "integrity": "sha512-Eaw2YTGM6WOxA6CXbckaEvslr2Ne4NFsKrvc0v97JD5awbmeBLO5w9Ho9L9kmKonrwF9RJlW6BxT1PVv/agBHQ==",
       "license": "MIT",
-      "peer": true,
       "engines": {
         "node": ">=16.9.0"
       }
@@ -7536,7 +7534,6 @@
       "resolved": "https://registry.npmjs.org/js-yaml/-/js-yaml-4.1.1.tgz",
       "integrity": "sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "argparse": "^2.0.1"
       },
@@ -8544,7 +8541,6 @@
       "resolved": "https://registry.npmjs.org/openai/-/openai-6.21.0.tgz",
       "integrity": "sha512-26dQFi76dB8IiN/WKGQOV+yKKTTlRCxQjoi2WLt0kMcH8pvxVyvfdBDkld5GTl7W1qvBpwVOtFcsqktj3fBRpA==",
       "license": "Apache-2.0",
-      "peer": true,
       "bin": {
         "openai": "bin/cli"
       },
@@ -8904,7 +8900,6 @@
       "resolved": "https://registry.npmjs.org/pg/-/pg-8.18.0.tgz",
       "integrity": "sha512-xqrUDL1b9MbkydY/s+VZ6v+xiMUmOUk7SS9d/1kpyQxoJ6U9AO1oIJyUWVZojbfe5Cc/oluutcgFG4L9RDP1iQ==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "pg-connection-string": "^2.11.0",
         "pg-pool": "^3.11.0",
@@ -9217,7 +9212,8 @@
       "version": "1.1.0",
       "resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
       "integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
-      "license": "MIT"
+      "license": "MIT",
+      "peer": true
     },
     "node_modules/pstree.remy": {
       "version": "1.1.8",
@@ -9544,7 +9540,6 @@
       "integrity": "sha512-oQL6lgK3e2QZeQ7gcgIkS2YZPg5slw37hYufJ3edKlfQSGGm8ICoxswK15ntSzF/a8+h7ekRy7k7oWc3BQ7y8A==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@types/estree": "1.0.8"
       },
@@ -10019,8 +10014,7 @@
       "version": "1.13.0",
       "resolved": "https://registry.npmjs.org/sql.js/-/sql.js-1.13.0.tgz",
       "integrity": "sha512-RJbVP1HRDlUUXahJ7VMTcu9Rm1Nzw+EBpoPr94vnbD4LwR715F3CcxE2G2k45PewcaZ57pjetYa+LoSJLAASgA==",
-      "license": "MIT",
-      "peer": true
+      "license": "MIT"
     },
     "node_modules/sqlite-vec": {
       "version": "0.1.7-alpha.2",
@@ -10406,7 +10400,6 @@
       "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "engines": {
         "node": ">=12"
       },
@@ -10727,7 +10720,6 @@
       "integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==",
       "dev": true,
       "license": "Apache-2.0",
-      "peer": true,
       "bin": {
         "tsc": "bin/tsc",
         "tsserver": "bin/tsserver"
@@ -10795,7 +10787,6 @@
       "integrity": "sha512-4z2nCSBfVIMnbuu8uinj+f0o4qOeggYJLbjpPHka3KH1om7e+H9yLKTYgksTaHcGco+NClhhY2vyO3HsMH1RGw==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@typescript-eslint/scope-manager": "8.55.0",
         "@typescript-eslint/types": "8.55.0",
@@ -11146,7 +11137,6 @@
       "integrity": "sha512-w+N7Hifpc3gRjZ63vYBXA56dvvRlNWRczTdmCBBa+CotUzAPf5b7YMdMR/8CQoeYE5LX3W4wj6RYTgonm1b9DA==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "esbuild": "^0.27.0",
         "fdir": "^6.5.0",
@@ -11257,7 +11247,6 @@
       "integrity": "sha512-5gTmgEY/sqK6gFXLIsQNH19lWb4ebPDLA4SdLP7dsWkIXHWlG66oPuVvXSGFPppYZz8ZDZq0dYYrbHfBCVUb1Q==",
       "dev": true,
       "license": "MIT",
-      "peer": true,
       "engines": {
         "node": ">=12"
       },
@@ -11558,7 +11547,6 @@
       "resolved": "https://registry.npmjs.org/winston/-/winston-3.19.0.tgz",
       "integrity": "sha512-LZNJgPzfKR+/J3cHkxcpHKpKKvGfDZVPS4hfJCc4cCG0CgYzvlD6yE/S3CIL/Yt91ak327YCpiF/0MyeZHEHKA==",
       "license": "MIT",
-      "peer": true,
       "dependencies": {
         "@colors/colors": "^1.6.0",
         "@dabh/diagnostics": "^2.0.8",
@@ -11794,7 +11782,6 @@
       "resolved": "https://registry.npmjs.org/zod/-/zod-4.3.6.tgz",
       "integrity": "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==",
       "license": "MIT",
-      "peer": true,
       "funding": {
         "url": "https://github.com/sponsors/colinhacks"
       }
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index ea80067..a5c8c18 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -202,6 +202,8 @@ importers:

   example/customer_support_system: {}

+  example/dracula: {}
+
   example/employee_chat: {}

   example/joins: {}
@@ -210,8 +212,6 @@ importers:

   example/monitoring: {}

-  example/personal_assistant: {}
-
   example/pets: {}

   example/sdr: {}
Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/src/runtime/embeddings/provider.ts b/src/runtime/embeddings/provider.ts
index 05f60b4..5027ae4 100644
--- a/src/runtime/embeddings/provider.ts
+++ b/src/runtime/embeddings/provider.ts
@@ -24,11 +24,11 @@ export abstract class EmbeddingProvider {
   abstract getProviderName(): string;

   async embedText(text: string): Promise<number[]> {
-    logger.info(`[EMBEDDING-PROVIDER] embedText called (${text.length} chars)`);
+    logger.debug(`[EMBEDDING-PROVIDER] embedText called (${text.length} chars)`);
     const startTime = Date.now();
     const result = await this.embeddings.embedQuery(text);
     const duration = Date.now() - startTime;
-    logger.info(
+    logger.debug(
       `[EMBEDDING-PROVIDER] embedText completed in ${duration}ms (${result.length} dimensions)`
     );
     return result;
@@ -37,11 +37,11 @@ export abstract class EmbeddingProvider {
   async embedTexts(texts: string[]): Promise<number[][]> {
     if (texts.length === 0) return [];
     if (texts.length === 1) return [await this.embedText(texts[0])];
-    logger.info(`[EMBEDDING-PROVIDER] embedTexts called (${texts.length} texts)`);
+    logger.debug(`[EMBEDDING-PROVIDER] embedTexts called (${texts.length} texts)`);
     const startTime = Date.now();
     const results = await this.embeddings.embedDocuments(texts);
     const duration = Date.now() - startTime;
-    logger.info(
+    logger.debug(
       `[EMBEDDING-PROVIDER] embedTexts completed in ${duration}ms (${texts.length} texts, ${results[0]?.length || 0} dimensions)`
     );
     return results;
Signed-off-by: Pratik Karki <pratik@fractl.io>
BREAKING CHANGES:
- Neo4j label renamed from `KnowledgeNode` to `KnowledgeEntity`
- Node field `type` renamed to `entityType` across graph nodes
- Node field `containerTag` renamed to `__tenant__` in Neo4j storage
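
Because these renames break existing Neo4j data, deployments with prior graphs would need a one-time migration along these lines (a sketch based on the renames listed above; not part of this PR):

```cypher
// Hypothetical migration: relabel nodes and move renamed properties.
MATCH (n:KnowledgeNode)
SET n:KnowledgeEntity,
    n.entityType = n.type,
    n.__tenant__ = n.containerTag
REMOVE n:KnowledgeNode, n.type, n.containerTag;
```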

Features:
- Add `LanceDBVectorStore` implementation with full `VectorStore`
 interface support (init, add, search, delete, exists, close)
- Add `VectorStore` abstraction types:
  `VectorRecord`, `SearchResult`, `VectorStore`, `VectorStoreConfig`
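
The abstraction can be pictured as the sketch below. Only the type names and the init/add/search/delete/exists/close method set come from this commit message; every signature, field, and the in-memory implementation are illustrative assumptions, not the PR's actual code.

```typescript
// Hypothetical shape of the VectorStore abstraction; field and method
// signatures are assumptions for illustration only.
interface VectorRecord {
  id: string;
  vector: number[];
  metadata?: Record<string, unknown>;
}

interface SearchResult {
  record: VectorRecord;
  score: number; // higher = more similar
}

interface VectorStore {
  init(): Promise<void>;
  add(records: VectorRecord[]): Promise<void>;
  search(query: number[], topK: number): Promise<SearchResult[]>;
  delete(ids: string[]): Promise<void>;
  exists(id: string): Promise<boolean>;
  close(): Promise<void>;
}

// Minimal in-memory implementation using cosine similarity,
// standing in for the LanceDB-backed store.
class InMemoryVectorStore implements VectorStore {
  private records = new Map<string, VectorRecord>();

  async init(): Promise<void> {}

  async add(records: VectorRecord[]): Promise<void> {
    for (const r of records) this.records.set(r.id, r);
  }

  async search(query: number[], topK: number): Promise<SearchResult[]> {
    const cosine = (a: number[], b: number[]): number => {
      let dot = 0, na = 0, nb = 0;
      for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        na += a[i] * a[i];
        nb += b[i] * b[i];
      }
      return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
    };
    return [...this.records.values()]
      .map((record) => ({ record, score: cosine(query, record.vector) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, topK);
  }

  async delete(ids: string[]): Promise<void> {
    for (const id of ids) this.records.delete(id);
  }

  async exists(id: string): Promise<boolean> {
    return this.records.has(id);
  }

  async close(): Promise<void> {}
}
```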

Refactors:
- Update all Cypher queries in Neo4j adapter to use new label and field
 names
- Propagate `entityType` field rename through knowledge pipeline:
  context-builder, conversation-processor, deduplicator,
  document-processor, extractor, graph-visualizer, knowledge service,
  modules, state, utils, and interpreter
- Update SQL DB resolver and embeddings (OpenAI) to align with schema
 changes
- Update example configs

CI:
- Update integration, publish, and release-please workflow files
- Migrate from pnpm to npm

Remove pnpm and only keep npm

Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/.github/workflows/integration.yml b/.github/workflows/integration.yml
index 581601b..dc54833 100644
--- a/.github/workflows/integration.yml
+++ b/.github/workflows/integration.yml
@@ -30,28 +30,20 @@ jobs:
       - name: Checkout code
         uses: actions/checkout@v4

-      - name: Setup pnpm
-        uses: pnpm/action-setup@v4
-
       - name: Setup Node.js
         uses: actions/setup-node@v4
         with:
           node-version: 22.x
-          cache: 'pnpm'
+          cache: 'npm'

       - name: Install dependencies
-        run: |
-          pnpm install --frozen-lockfile
-          # Auto-approve build scripts for esbuild and sqlite3 if needed
-          if pnpm approve-builds --help >/dev/null 2>&1; then
-            echo -e "\ny\ny\n" | pnpm approve-builds || true
-          fi
+        run: npm ci

       - name: Generate Langium files
-        run: pnpm run langium:generate
+        run: npm run langium:generate

       - name: Build project
-        run: pnpm run build
+        run: npm run build

       - name: Run Blog Example App
         env:
diff --git a/.github/workflows/publish.yml b/.github/workflows/publish.yml
index 92bf60c..061f111 100644
--- a/.github/workflows/publish.yml
+++ b/.github/workflows/publish.yml
@@ -34,46 +34,37 @@ jobs:
       - name: Checkout code
         uses: actions/checkout@v6

-      # Step 2: Setup pnpm
-      - name: Setup pnpm
-        uses: pnpm/action-setup@v4
-
-      # Step 3: Setup Node.js
+      # Step 2: Setup Node.js
       - name: Setup Node.js
         uses: actions/setup-node@v6
         with:
           node-version: '22'
-          cache: 'pnpm'
+          cache: 'npm'
           registry-url: 'https://registry.npmjs.org/'

-      # Step 4: Install dependencies
+      # Step 3: Install dependencies
       - name: Install dependencies
-        run: |
-          pnpm install --frozen-lockfile
-          # Auto-approve build scripts for esbuild and sqlite3 if needed
-          if pnpm approve-builds --help >/dev/null 2>&1; then
-            echo -e "\ny\ny\n" | pnpm approve-builds || true
-          fi
+        run: npm ci

-      # Step 5: Generate Langium files
+      # Step 4: Generate Langium files
       - name: Generate Langium files
-        run: pnpm run langium:generate
+        run: npm run langium:generate

-      # Step 6: Build package
+      # Step 5: Build package
       - name: Build package
-        run: pnpm run build
+        run: npm run build

-      # Step 6.5: Rebuild better-sqlite3 for correct native bindings
+      # Step 6: Rebuild better-sqlite3 for correct native bindings
       - name: Rebuild better-sqlite3 for correct native bindings
         run: npm rebuild better-sqlite3

       # Step 7: Run tests
       - name: Run tests
-        run: NODE_ENV=test pnpm test
+        run: NODE_ENV=test npm test

       # Step 8: Publish to npm Registry
       - name: Publish to npm
-        run: pnpm publish --no-git-checks
+        run: npm publish
         env:
           NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}

diff --git a/.github/workflows/release-please.yml b/.github/workflows/release-please.yml
index e4585ab..5551f6d 100644
--- a/.github/workflows/release-please.yml
+++ b/.github/workflows/release-please.yml
@@ -148,23 +148,15 @@ jobs:
           node-version: '22'
           cache: 'npm'

-      - name: Update npm lock file
+      - name: Update npm lock file
         run: |
           npm install
           echo "npm lock file updated"

-      - name: Setup pnpm
-        uses: pnpm/action-setup@v4
-
-      - name: Update pnpm lock file
-        run: |
-          pnpm install --no-frozen-lockfile
-          echo "pnpm lock file updated"
-
       - name: Format code
         run: |
           if grep -q '"format"' package.json; then
-            pnpm run format || npm run format || true
+            npm run format || true
             echo "Code formatted"
           else
             echo "No format script found, skipping"
@@ -173,7 +165,7 @@ jobs:
       - name: Lint code
         run: |
           if grep -q '"lint"' package.json; then
-            pnpm run lint || npm run lint || true
+            npm run lint || true
             echo "Code linted"
           else
             echo "No lint script found, skipping"
@@ -189,7 +181,7 @@ jobs:

             - Update package.json to ${{ steps.version.outputs.version }}
             - Update CHANGELOG.md with commits since ${{ steps.prev_tag.outputs.prev_tag }}
-            - Update package-lock.json and pnpm-lock.yaml
+            - Update package-lock.json
           branch: release/${{ steps.version.outputs.version }}
           delete-branch: true
           title: "Release ${{ steps.version.outputs.version }}"
@@ -202,7 +194,6 @@ jobs:
             - ✅ Updated `package.json` version to `${{ steps.version.outputs.version }}`
             - ✅ Updated `CHANGELOG.md` with all commits since `${{ steps.prev_tag.outputs.prev_tag }}`
             - ✅ Updated `package-lock.json` (npm)
-            - ✅ Updated `pnpm-lock.yaml` (pnpm)

             ### Commits included in this release:

diff --git a/README.md b/README.md
index fe634b3..a19b746 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,3 @@
-
 <div align="center">

 <p>
@@ -25,18 +24,20 @@
 <a href="https://github.com/agentlang-ai/agentlang/tree/main/example"><img src="https://img.shields.io/badge/Examples-Page-yellow?logo=homepage&logoColor=yellow&style=for-the-badge"></a>

 [![Node Version](https://img.shields.io/badge/node-%3E%3D20.0.0-brightgreen?logo=node.js)](https://nodejs.org) [![CI](https://github.com/agentlang-ai/agentlang/actions/workflows/ci.yml/badge.svg)](https://github.com/agentlang-ai/agentlang/actions/workflows/ci.yml)![License](https://img.shields.io/badge/License-Sustainable%20Use%20v1.0-blue.svg) [![npm downloads](https://img.shields.io/npm/dm/agentlang.svg)](https://www.npmjs.com/package/agentlang)
+
 <hr>

 ## Agentlang - Team as Code
+
 </div>

 Agentlang is a declarative DSL (built on TypeScript) for creating AI Agents and full-stack Agentic Apps. With Agentlang, you define, version, run, mentor, and monitor teams of AI agents, along with the app infrastructure they need: data model, workflows, RBAC, integrations, and UI. We refer to this approach - bringing together AI agent development and App development into a single coherent discipline - as Team-as-Code (our riff on IaC).

-* **For Devs and Non-Devs:** Code and vibe-code in your IDE, focusing on the business-logic of your app, not wiring. Alternatively, you can build, run, mentor and monitor your AI Team in Studio - our visual-builder (coming soon). Switch back-and-forth between the two modes seamlessly.
+- **For Devs and Non-Devs:** Code and vibe-code in your IDE, focusing on the business-logic of your app, not wiring. Alternatively, you can build, run, mentor and monitor your AI Team in Studio - our visual-builder (coming soon). Switch back-and-forth between the two modes seamlessly.

-* **Robust Integrations:** The Agentlang runtime ships with native integrations for LLMs, databases, vector DBs, and auth providers. Our connector architecture is built for the enterprise, with a rapidly growing catalog for systems like Salesforce, ServiceNow, HubSpot, Snowflake, and more. Also, because Agentlang compiles to Node.js (and runs in the browser), you can use any existing JavaScript library out of the box.
+- **Robust Integrations:** The Agentlang runtime ships with native integrations for LLMs, databases, vector DBs, and auth providers. Our connector architecture is built for the enterprise, with a rapidly growing catalog for systems like Salesforce, ServiceNow, HubSpot, Snowflake, and more. Also, because Agentlang compiles to Node.js (and runs in the browser), you can use any existing JavaScript library out of the box.

-* **Production-grade**: Under the hood, it’s all modern TypeScript—strong typing, tooling, testing, and CI/CD-friendly workflows—built for enterprise-class reliability, governance, and scale.
+- **Production-grade**: Under the hood, it’s all modern TypeScript—strong typing, tooling, testing, and CI/CD-friendly workflows—built for enterprise-class reliability, governance, and scale.

 Agentlang introduces two foundational innovations: [Agentic Reliability Modeling](#-agentic-reliability-modeling) and [AgentLang Ontology](#agentlang-ontology)

@@ -132,7 +133,6 @@ workflow ticketInProgress {
 }
 ```

-
 ### ✨ First-class AI Agents

 Agents and many concepts agents use are built-in language constructs.
@@ -328,8 +328,8 @@ What makes this model special is how seamlessly an agent can interact with it

 To get started with Agentlang Ontology, please see the [Agentlang Tutorial](https://docs.fractl.io/app) or explore the following example applications:

- * [Car Dealership](https://github.com/agentlang-ai/agentlang/tree/main/example/car_dealership)
- * [Customer Support System](https://github.com/agentlang-ai/agentlang/tree/main/example/customer_support_system)
+- [Car Dealership](https://github.com/agentlang-ai/agentlang/tree/main/example/car_dealership)
+- [Customer Support System](https://github.com/agentlang-ai/agentlang/tree/main/example/customer_support_system)

 ## 🚀 Getting Started

@@ -363,14 +363,7 @@ For contributors who want to build and develop Agentlang itself:
 ```shell
 # Install dependencies
 npm install
-
-OR
-
-# Install pnpm: https://pnpm.io/installation
-# Use pnpm
-pnpm install
 ```
-**Note**: If pnpm shows build script warnings, run `pnpm approve-builds` and approve esbuild and sqlite3.

 ### ⚡ Build

diff --git a/example/custom_config/README.md b/example/custom_config/README.md
index 54e02b1..b2a3825 100644
--- a/example/custom_config/README.md
+++ b/example/custom_config/README.md
@@ -1,6 +1,7 @@
 # Resolver Configuration Example — Using Entities as Config Providers

 This example demonstrates:
+
 1. How an Agentlang application can expose configuration values to JavaScript resolvers using **entities** and **`fetchConfig`**.
 2. How to configure documents from various sources: local files, S3, HTTPS, and the secure **Document Service**.

@@ -12,10 +13,10 @@ A resolver may require external configuration—such as API keys, endpoints, or

 In this module:

-* The `Config` entity is used to **store resolver configuration** (server URL and API key).
-* The configuration is **initialized via `config.al`**, not created through workflows.
-* The resolver (`ChatResolver`) uses the JavaScript function `createChatMessage` to send or simulate sending chat messages.
-* The resolver retrieves configuration using the Agentlang runtime helper **`fetchConfig()`**.
+- The `Config` entity is used to **store resolver configuration** (server URL and API key).
+- The configuration is **initialized via `config.al`**, not created through workflows.
+- The resolver (`ChatResolver`) uses the JavaScript function `createChatMessage` to send or simulate sending chat messages.
+- The resolver retrieves configuration using the Agentlang runtime helper **`fetchConfig()`**.

 This architecture cleanly separates **configuration**, **data**, and **resolver behavior**, enabling maintainable and secure integrations.

@@ -47,33 +48,33 @@ resolver ChatResolver ["custom_config.core/ChatMessage"] {

 ### Key Features

-* `ChatMessage` is resolved by `ChatResolver`, which implements the **create** handler.
-* The resolver pulls configuration from the `Config` entity using `fetchConfig('custom_config.core/Config')`.
+- `ChatMessage` is resolved by `ChatResolver`, which implements the **create** handler.
+- The resolver pulls configuration from the `Config` entity using `fetchConfig('custom_config.core/Config')`.

 ---

 ## 3. Resolver Logic (`resolver.js`)

 ```js
-import { fetchConfig } from "../../../out/runtime/api.js"
+import { fetchConfig } from '../../../out/runtime/api.js';

 export async function createChatMessage(_, inst) {
-    const config = await fetchConfig('custom_config.core/Config')
-    console.log(`Connecting to chat server ${config.server} using key ${config.key}`)
+  const config = await fetchConfig('custom_config.core/Config');
+  console.log(`Connecting to chat server ${config.server} using key ${config.key}`);

-    const to = inst.lookup('to')
-    const message = inst.lookup('message')
-    console.log(`To: ${to}, Body: ${message}`)
+  const to = inst.lookup('to');
+  const message = inst.lookup('message');
+  console.log(`To: ${to}, Body: ${message}`);

-    return inst
+  return inst;
 }
 ```

 ### How it Works

-* **`fetchConfig(entityName)`** loads the config entity from the service configuration.
-* The resolver reads attributes from the `ChatMessage` instance via `inst.lookup()`.
-* The function returns the instance, completing the creation process.
+- **`fetchConfig(entityName)`** loads the config entity from the service configuration.
+- The resolver reads attributes from the `ChatMessage` instance via `inst.lookup()`.
+- The function returns the instance, completing the creation process.
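The contract above can be exercised in isolation with a stub. The sketch below is illustrative only: `fetchConfigStub`, its registry, and the sample values are hypothetical stand-ins, not the actual Agentlang runtime API.

```typescript
// Hypothetical stand-in for the runtime's fetchConfig, showing the contract the
// resolver relies on: config-entity attributes come back as plain object fields.
const configStore: Record<string, Record<string, string>> = {
  'custom_config.core/Config': { server: 'https://chat.example.com', key: 'secret-key' },
};

async function fetchConfigStub(entityName: string): Promise<Record<string, string>> {
  const entry = configStore[entityName];
  if (!entry) throw new Error(`No config registered for ${entityName}`);
  return entry;
}

// Usage mirrors the resolver above:
const config = await fetchConfigStub('custom_config.core/Config');
console.log(`Connecting to chat server ${config.server} using key ${config.key}`);
```

A stub like this lets resolver logic be tested without starting the full runtime.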

 ---

@@ -83,11 +84,11 @@ The `Chat` package declares that it expects configuration for the `Config` entit

 ```json
 {
-    "name": "Chat",
-    "version": "0.0.1",
-    "agentlang": {
-        "config": ["custom_config.core/Config"]
-    }
+  "name": "Chat",
+  "version": "0.0.1",
+  "agentlang": {
+    "config": ["custom_config.core/Config"]
+  }
 }
 ```

@@ -100,6 +101,7 @@ This tells Agentlang that the configuration must be provided in **config.al**.
 This example also demonstrates various ways to configure documents for agents:

 ### 5.1 Local File
+
 ```json
 {
   "agentlang.ai/doc": {
@@ -110,6 +112,7 @@ This example also demonstrates various ways to configure documents for agents:
 ```

 ### 5.2 S3 Storage
+
 ```json
 {
   "agentlang.ai/doc": {
@@ -128,6 +131,7 @@ This example also demonstrates various ways to configure documents for agents:
 ```

 ### 5.3 HTTPS URL
+
 ```json
 {
   "agentlang.ai/doc": {
@@ -142,6 +146,7 @@ This example also demonstrates various ways to configure documents for agents:
 The secure way to access documents uploaded via Studio:

 **Option A: Direct URL (with document-service:// protocol)**
+
 ```json
 {
   "agentlang.ai/doc": {
@@ -165,6 +170,7 @@ The secure way to access documents uploaded via Studio:
 ```

 **Option B: Lookup by Title**
+
 ```json
 {
   "agentlang.ai/doc": {
@@ -208,15 +214,77 @@ The secure way to access documents uploaded via Studio:
     "store": {
         "type": "sqlite",
         "dbname": "cc.db"
+    },
+    "vectorStore": {
+        "type": "lancedb",
+        "dbname": "./data/vector-store/cc-vectors.lance"
+    },
+    "knowledgeGraph": {
+        "enabled": true
     }
 }
 ```

 ### Notes

-* The `Config` entity is populated at startup using values in this file.
-* These values are immediately accessible to resolvers via `fetchConfig`.
-* This design avoids hard-coding sensitive values (API keys, credentials).
+- The `Config` entity is populated at startup using values in this file.
+- These values are immediately accessible to resolvers via `fetchConfig`.
+- This design avoids hard-coding sensitive values (API keys, credentials).
+
+---
+
+## 6.1 Vector Store Configuration
+
+The `vectorStore` configuration controls how document embeddings are stored and searched:
+
+### LanceDB (Default for SQLite)
+
+```json
+{
+  "vectorStore": {
+    "type": "lancedb",
+    "dbname": "./data/vector-store/cc-vectors.lance"
+  }
+}
+```
+
+### pgvector (for PostgreSQL)
+
+```json
+{
+  "vectorStore": {
+    "type": "pgvector"
+  }
+}
+```
+
+- **Note**: `vectorStore` has no `enabled` field; its presence in the config enables it.
+- **LanceDB**: Embedded vector database (no separate server required). Uses `dbname` as the file path.
+- **pgvector**: PostgreSQL extension. Shares the main PostgreSQL connection, so no `dbname` is needed in `vectorStore`.
+- If `vectorStore` is omitted, the default follows the main `store` type: LanceDB for SQLite, pgvector for PostgreSQL.
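The selection rule described in these notes can be sketched as a small resolver. The type names and helper function below are illustrative, not the actual runtime API:

```typescript
// Sketch of the documented fallback: an explicit vectorStore wins; otherwise
// the vector store is derived from the main store type.
type StoreType = 'sqlite' | 'postgres';
type VectorStoreType = 'lancedb' | 'pgvector';

interface AppConfig {
  store: { type: StoreType; dbname?: string };
  vectorStore?: { type: VectorStoreType; dbname?: string };
}

function resolveVectorStore(config: AppConfig): VectorStoreType {
  if (config.vectorStore) return config.vectorStore.type;
  return config.store.type === 'sqlite' ? 'lancedb' : 'pgvector';
}

console.log(resolveVectorStore({ store: { type: 'sqlite', dbname: 'cc.db' } })); // 'lancedb'
```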
+
+---
+
+## 6.2 Knowledge Graph Configuration
+
+The `knowledgeGraph` configuration enables Neo4j integration for graph-based document relationships:
+
+```json
+{
+  "knowledgeGraph": {
+    "enabled": true,
+    "neo4j": {
+      "uri": "#js process.env.GRAPH_DB_URI || 'bolt://localhost:7687'",
+      "user": "#js process.env.GRAPH_DB_USER || 'neo4j'",
+      "password": "#js process.env.GRAPH_DB_PASSWORD || 'password'"
+    }
+  }
+}
+```
+
+- When enabled, document entities and relationships are synced to Neo4j.
+- Enables graph traversal queries (e.g., "find all documents related to X").
+- Set `enabled: false` to disable Neo4j integration.
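As a sketch of the traversal queries this enables, the helper below builds a parameterized Cypher query over the `KnowledgeEntity` label used elsewhere in this PR. The helper itself and its signature are hypothetical, not part of the runtime:

```typescript
// Builds a "find all entities related to X" traversal query up to maxHops
// relationship steps away. Returns the query text plus its parameters so the
// caller can pass both to a Neo4j session.
function relatedEntitiesQuery(name: string, maxHops = 2): { query: string; params: Record<string, unknown> } {
  return {
    query:
      `MATCH (a:KnowledgeEntity {name: $name})-[*1..${maxHops}]-(b:KnowledgeEntity) ` +
      `RETURN DISTINCT b.name AS name`,
    params: { name },
  };
}
```

With the real driver this would be executed as `session.run(q.query, q.params)`, matching the pattern used in `src/runtime/graph/neo4j.ts`.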

 ---

@@ -257,4 +325,4 @@ This example shows:

 ### ✔ Clean separation between configuration, data, and resolver logic

-This pattern is ideal for any resolver that integrates with external systems—chat services, APIs, payment gateways, databases, etc.
\ No newline at end of file
+This pattern is ideal for any resolver that integrates with external systems—chat services, APIs, payment gateways, databases, etc.
diff --git a/example/custom_config/config.al b/example/custom_config/config.al
index d4b8d30..c432d25 100644
--- a/example/custom_config/config.al
+++ b/example/custom_config/config.al
@@ -7,6 +7,13 @@
       "type": "sqlite",
       "dbname": "cc.db"
     },
+    "vectorStore": {
+      "type": "lancedb",
+      "dbname": "./data/vector-store/cc-vectors.lance"
+    },
+    "knowledgeGraph": {
+      "enabled": true
+    },
     "retry": [
       {
         "name": "classifyRetry",
diff --git a/example/dracula/config.al b/example/dracula/config.al
index f46f50f..9d0caf5 100644
--- a/example/dracula/config.al
+++ b/example/dracula/config.al
@@ -6,6 +6,12 @@
     "store": {
         "type": "sqlite",
         "dbname": "dracula.db"
+    },
+    "vectorStore": {
+      "type": "lancedb"
+    },
+    "knowledgeGraph": {
+      "enabled": true
     }
    }
 }
diff --git a/package-lock.json b/package-lock.json
index 63bee1a..791fb7e 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -14,6 +14,7 @@
         "@aws-sdk/client-s3": "^3.975.0",
         "@aws-sdk/credential-providers": "^3.975.0",
         "@isomorphic-git/lightning-fs": "^4.6.2",
+        "@lancedb/lancedb": "^0.15.0",
         "@langchain/anthropic": "^1.3.12",
         "@langchain/core": "^1.1.17",
         "@langchain/openai": "^1.2.3",
@@ -2076,6 +2077,168 @@
       "dev": true,
       "license": "MIT"
     },
+    "node_modules/@lancedb/lancedb": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb/-/lancedb-0.15.0.tgz",
+      "integrity": "sha512-qm3GXLA17/nFGUwrOEuFNW0Qg2gvCtp+yAs6qoCM6vftIreqzp8d4Hio6eG/YojS9XqPnR2q+zIeIFy12Ywvxg==",
+      "cpu": [
+        "x64",
+        "arm64"
+      ],
+      "license": "Apache 2.0",
+      "os": [
+        "darwin",
+        "linux",
+        "win32"
+      ],
+      "dependencies": {
+        "reflect-metadata": "^0.2.2"
+      },
+      "engines": {
+        "node": ">= 18"
+      },
+      "optionalDependencies": {
+        "@lancedb/lancedb-darwin-arm64": "0.15.0",
+        "@lancedb/lancedb-darwin-x64": "0.15.0",
+        "@lancedb/lancedb-linux-arm64-gnu": "0.15.0",
+        "@lancedb/lancedb-linux-arm64-musl": "0.15.0",
+        "@lancedb/lancedb-linux-x64-gnu": "0.15.0",
+        "@lancedb/lancedb-linux-x64-musl": "0.15.0",
+        "@lancedb/lancedb-win32-arm64-msvc": "0.15.0",
+        "@lancedb/lancedb-win32-x64-msvc": "0.15.0"
+      },
+      "peerDependencies": {
+        "apache-arrow": ">=15.0.0 <=18.1.0"
+      }
+    },
+    "node_modules/@lancedb/lancedb-darwin-arm64": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-darwin-arm64/-/lancedb-darwin-arm64-0.15.0.tgz",
+      "integrity": "sha512-e6eiS1dUdSx3G3JXFEn5bk6I26GR7UM2QwQ1YMrTsg7IvGDqKmXc/s5j4jpJH0mzm7rwqh+OAILPIjr7DoUCDA==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-darwin-x64": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-darwin-x64/-/lancedb-darwin-x64-0.15.0.tgz",
+      "integrity": "sha512-kEgigrqKf954egDbUdIp86tjVfFmTCTcq2Hydw/WLc+LI++46aeT2MsJv0CQpkNFMfh/T2G18FsDYLKH0zTaow==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-linux-arm64-gnu": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-linux-arm64-gnu/-/lancedb-linux-arm64-gnu-0.15.0.tgz",
+      "integrity": "sha512-TnpbBT9kaSYQqastJ+S5jm4S5ZYBx18X8PHQ1ic3yMIdPTjCWauj+owDovOpiXK9ucjmi/FnUp8bKNxGnlqmEg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-linux-arm64-musl": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-linux-arm64-musl/-/lancedb-linux-arm64-musl-0.15.0.tgz",
+      "integrity": "sha512-fe8LnC9YKbLgEJiLQhyVj+xz1d1RgWKs+rLSYPxaD3xQBo3kMC94Esq+xfrdNkSFvPgchRTvBA9jDYJjJL8rcg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-linux-x64-gnu": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-linux-x64-gnu/-/lancedb-linux-x64-gnu-0.15.0.tgz",
+      "integrity": "sha512-0lKEc3M06ax3RozBbxHuNN9qWqhJUiKDnRC3ttsbmo4VrOUBvAO3fKoaRkjZhAA8q4+EdhZnCaQZezsk60f7Ag==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-linux-x64-musl": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-linux-x64-musl/-/lancedb-linux-x64-musl-0.15.0.tgz",
+      "integrity": "sha512-ls+ikV7vWyVnqVT7bMmuqfGCwVR5JzPIfJ5iZ4rkjU4iTIQRpY7u/cTe9rGKt/+psliji8x6PPZHpfdGXHmleQ==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-win32-arm64-msvc": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-win32-arm64-msvc/-/lancedb-win32-arm64-msvc-0.15.0.tgz",
+      "integrity": "sha512-C30A+nDaJ4jhjN76hRcp28Eq+G48SR9wO3i1zGm0ZAEcRV1t9O1fAp6g18IPT65Qyu/hXJBgBdVHtent+qg9Ng==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "win32"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/@lancedb/lancedb-win32-x64-msvc": {
+      "version": "0.15.0",
+      "resolved": "https://registry.npmjs.org/@lancedb/lancedb-win32-x64-msvc/-/lancedb-win32-x64-msvc-0.15.0.tgz",
+      "integrity": "sha512-amXzIAxqrHyp+c9TpIDI8ze1uCqWC6HXQIoXkoMQrBXoUUo8tJORH2yGAsa3TSgjZDDjg0HPA33dYLhOLk1m8g==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "Apache 2.0",
+      "optional": true,
+      "os": [
+        "win32"
+      ],
+      "engines": {
+        "node": ">= 18"
+      }
+    },
     "node_modules/@langchain/anthropic": {
       "version": "1.3.17",
       "resolved": "https://registry.npmjs.org/@langchain/anthropic/-/anthropic-1.3.17.tgz",
@@ -3528,6 +3691,16 @@
       "dev": true,
       "license": "MIT"
     },
+    "node_modules/@swc/helpers": {
+      "version": "0.5.18",
+      "resolved": "https://registry.npmjs.org/@swc/helpers/-/helpers-0.5.18.tgz",
+      "integrity": "sha512-TXTnIcNJQEKwThMMqBXsZ4VGAza6bvN4pa41Rkqoio6QBKMvo+5lexeTMScGCIxtzgQJzElcvIltani+adC5PQ==",
+      "license": "Apache-2.0",
+      "peer": true,
+      "dependencies": {
+        "tslib": "^2.8.0"
+      }
+    },
     "node_modules/@types/body-parser": {
       "version": "1.19.6",
       "resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.6.tgz",
@@ -3549,6 +3722,20 @@
         "assertion-error": "^2.0.1"
       }
     },
+    "node_modules/@types/command-line-args": {
+      "version": "5.2.3",
+      "resolved": "https://registry.npmjs.org/@types/command-line-args/-/command-line-args-5.2.3.tgz",
+      "integrity": "sha512-uv0aG6R0Y8WHZLTamZwtfsDLVRnOa+n+n5rEvFWL5Na5gZ8V2Teab/duDPFzIIIhs9qizDpcavCusCLJZu62Kw==",
+      "license": "MIT",
+      "peer": true
+    },
+    "node_modules/@types/command-line-usage": {
+      "version": "5.0.4",
+      "resolved": "https://registry.npmjs.org/@types/command-line-usage/-/command-line-usage-5.0.4.tgz",
+      "integrity": "sha512-BwR5KP3Es/CSht0xqBcUXS3qCAUVXwpRKsV2+arxeb65atasuXG9LykC9Ab10Cw3s2raH92ZqOeILaQbsB2ACg==",
+      "license": "MIT",
+      "peer": true
+    },
     "node_modules/@types/connect": {
       "version": "3.4.38",
       "resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.38.tgz",
@@ -4229,6 +4416,44 @@
         "node": ">= 8"
       }
     },
+    "node_modules/apache-arrow": {
+      "version": "18.1.0",
+      "resolved": "https://registry.npmjs.org/apache-arrow/-/apache-arrow-18.1.0.tgz",
+      "integrity": "sha512-v/ShMp57iBnBp4lDgV8Jx3d3Q5/Hac25FWmQ98eMahUiHPXcvwIMKJD0hBIgclm/FCG+LwPkAKtkRO1O/W0YGg==",
+      "license": "Apache-2.0",
+      "peer": true,
+      "dependencies": {
+        "@swc/helpers": "^0.5.11",
+        "@types/command-line-args": "^5.2.3",
+        "@types/command-line-usage": "^5.0.4",
+        "@types/node": "^20.13.0",
+        "command-line-args": "^5.2.1",
+        "command-line-usage": "^7.0.1",
+        "flatbuffers": "^24.3.25",
+        "json-bignum": "^0.0.3",
+        "tslib": "^2.6.2"
+      },
+      "bin": {
+        "arrow2csv": "bin/arrow2csv.js"
+      }
+    },
+    "node_modules/apache-arrow/node_modules/@types/node": {
+      "version": "20.19.33",
+      "resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.33.tgz",
+      "integrity": "sha512-Rs1bVAIdBs5gbTIKza/tgpMuG1k3U/UMJLWecIMxNdJFDMzcM5LOiLVRYh3PilWEYDIeUDv7bpiHPLPsbydGcw==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "undici-types": "~6.21.0"
+      }
+    },
+    "node_modules/apache-arrow/node_modules/undici-types": {
+      "version": "6.21.0",
+      "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz",
+      "integrity": "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==",
+      "license": "MIT",
+      "peer": true
+    },
     "node_modules/app-root-path": {
       "version": "3.1.0",
       "resolved": "https://registry.npmjs.org/app-root-path/-/app-root-path-3.1.0.tgz",
@@ -4250,6 +4475,16 @@
       "integrity": "sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==",
       "license": "Python-2.0"
     },
+    "node_modules/array-back": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/array-back/-/array-back-3.1.0.tgz",
+      "integrity": "sha512-TkuxA4UCOvxuDK6NZYXCalszEzj+TLszyASooky+i742l9TqsOdYCMJJupxRic61hwquNtppB3hgcuq9SVSH1Q==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=6"
+      }
+    },
     "node_modules/asn1.js": {
       "version": "4.10.1",
       "resolved": "https://registry.npmjs.org/asn1.js/-/asn1.js-4.10.1.tgz",
@@ -4884,6 +5119,88 @@
         "url": "https://github.com/chalk/chalk?sponsor=1"
       }
     },
+    "node_modules/chalk-template": {
+      "version": "0.4.0",
+      "resolved": "https://registry.npmjs.org/chalk-template/-/chalk-template-0.4.0.tgz",
+      "integrity": "sha512-/ghrgmhfY8RaSdeo43hNXxpoHAtxdbskUHjPpfqUWGttFgycUhYPGx3YZBCnUCvOa7Doivn1IZec3DEGFoMgLg==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "chalk": "^4.1.2"
+      },
+      "engines": {
+        "node": ">=12"
+      },
+      "funding": {
+        "url": "https://github.com/chalk/chalk-template?sponsor=1"
+      }
+    },
+    "node_modules/chalk-template/node_modules/ansi-styles": {
+      "version": "4.3.0",
+      "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
+      "integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "color-convert": "^2.0.1"
+      },
+      "engines": {
+        "node": ">=8"
+      },
+      "funding": {
+        "url": "https://github.com/chalk/ansi-styles?sponsor=1"
+      }
+    },
+    "node_modules/chalk-template/node_modules/chalk": {
+      "version": "4.1.2",
+      "resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
+      "integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "ansi-styles": "^4.1.0",
+        "supports-color": "^7.1.0"
+      },
+      "engines": {
+        "node": ">=10"
+      },
+      "funding": {
+        "url": "https://github.com/chalk/chalk?sponsor=1"
+      }
+    },
+    "node_modules/chalk-template/node_modules/color-convert": {
+      "version": "2.0.1",
+      "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
+      "integrity": "sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "color-name": "~1.1.4"
+      },
+      "engines": {
+        "node": ">=7.0.0"
+      }
+    },
+    "node_modules/chalk-template/node_modules/color-name": {
+      "version": "1.1.4",
+      "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.4.tgz",
+      "integrity": "sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==",
+      "license": "MIT",
+      "peer": true
+    },
+    "node_modules/chalk-template/node_modules/supports-color": {
+      "version": "7.2.0",
+      "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
+      "integrity": "sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "has-flag": "^4.0.0"
+      },
+      "engines": {
+        "node": ">=8"
+      }
+    },
     "node_modules/chevrotain": {
       "version": "11.1.1",
       "resolved": "https://registry.npmjs.org/chevrotain/-/chevrotain-11.1.1.tgz",
@@ -5063,6 +5380,58 @@
         "node": ">= 0.8"
       }
     },
+    "node_modules/command-line-args": {
+      "version": "5.2.1",
+      "resolved": "https://registry.npmjs.org/command-line-args/-/command-line-args-5.2.1.tgz",
+      "integrity": "sha512-H4UfQhZyakIjC74I9d34fGYDwk3XpSr17QhEd0Q3I9Xq1CETHo4Hcuo87WyWHpAF1aSLjLRf5lD9ZGX2qStUvg==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "array-back": "^3.1.0",
+        "find-replace": "^3.0.0",
+        "lodash.camelcase": "^4.3.0",
+        "typical": "^4.0.0"
+      },
+      "engines": {
+        "node": ">=4.0.0"
+      }
+    },
+    "node_modules/command-line-usage": {
+      "version": "7.0.3",
+      "resolved": "https://registry.npmjs.org/command-line-usage/-/command-line-usage-7.0.3.tgz",
+      "integrity": "sha512-PqMLy5+YGwhMh1wS04mVG44oqDsgyLRSKJBdOo1bnYhMKBW65gZF1dRp2OZRhiTjgUHljy99qkO7bsctLaw35Q==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "array-back": "^6.2.2",
+        "chalk-template": "^0.4.0",
+        "table-layout": "^4.1.0",
+        "typical": "^7.1.1"
+      },
+      "engines": {
+        "node": ">=12.20.0"
+      }
+    },
+    "node_modules/command-line-usage/node_modules/array-back": {
+      "version": "6.2.2",
+      "resolved": "https://registry.npmjs.org/array-back/-/array-back-6.2.2.tgz",
+      "integrity": "sha512-gUAZ7HPyb4SJczXAMUXMGAvI976JoK3qEx9v1FTmeYuJj0IBiaKttG1ydtGKdkfqWkIkouke7nG8ufGy77+Cvw==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=12.17"
+      }
+    },
+    "node_modules/command-line-usage/node_modules/typical": {
+      "version": "7.3.0",
+      "resolved": "https://registry.npmjs.org/typical/-/typical-7.3.0.tgz",
+      "integrity": "sha512-ya4mg/30vm+DOWfBg4YK3j2WD6TWtRkCbasOJr40CseYENzCUby/7rIvXA99JGsQHeNxLbnXdyLLxKSv3tauFw==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=12.17"
+      }
+    },
     "node_modules/commander": {
       "version": "14.0.3",
       "resolved": "https://registry.npmjs.org/commander/-/commander-14.0.3.tgz",
@@ -6394,6 +6763,19 @@
         "url": "https://opencollective.com/express"
       }
     },
+    "node_modules/find-replace": {
+      "version": "3.0.0",
+      "resolved": "https://registry.npmjs.org/find-replace/-/find-replace-3.0.0.tgz",
+      "integrity": "sha512-6Tb2myMioCAgv5kfvP5/PkZZ/ntTpVK39fHY7WkWBgvbeE+VHd/tZuZ4mrC+bxh4cfOZeYKVPaJIZtZXV7GNCQ==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "array-back": "^3.0.1"
+      },
+      "engines": {
+        "node": ">=4.0.0"
+      }
+    },
     "node_modules/find-up": {
       "version": "5.0.0",
       "resolved": "https://registry.npmjs.org/find-up/-/find-up-5.0.0.tgz",
@@ -6425,6 +6807,13 @@
         "node": ">=16"
       }
     },
+    "node_modules/flatbuffers": {
+      "version": "24.12.23",
+      "resolved": "https://registry.npmjs.org/flatbuffers/-/flatbuffers-24.12.23.tgz",
+      "integrity": "sha512-dLVCAISd5mhls514keQzmEG6QHmUUsNuWsb4tFafIUwvvgDjXhtfAYSKOzt5SWOy+qByV5pbsDZ+Vb7HUOBEdA==",
+      "license": "Apache-2.0",
+      "peer": true
+    },
     "node_modules/flatted": {
       "version": "3.3.3",
       "resolved": "https://registry.npmjs.org/flatted/-/flatted-3.3.3.tgz",
@@ -7541,6 +7930,15 @@
         "js-yaml": "bin/js-yaml.js"
       }
     },
+    "node_modules/json-bignum": {
+      "version": "0.0.3",
+      "resolved": "https://registry.npmjs.org/json-bignum/-/json-bignum-0.0.3.tgz",
+      "integrity": "sha512-2WHyXj3OfHSgNyuzDbSxI1w2jgw5gkWSWhS7Qg4bWXx1nLk3jnbwfUeS0PSba3IzpTUWdHxBieELUzXRjQB2zg==",
+      "peer": true,
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/json-buffer": {
       "version": "3.0.1",
       "resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.1.tgz",
@@ -7820,6 +8218,13 @@
       "integrity": "sha512-kVI48u3PZr38HdYz98UmfPnXl2DXrpdctLrFLCd3kOx1xUkOmpFPx7gCWWM5MPkL/fD8zb+Ph0QzjGFs4+hHWg==",
       "license": "MIT"
     },
+    "node_modules/lodash.camelcase": {
+      "version": "4.3.0",
+      "resolved": "https://registry.npmjs.org/lodash.camelcase/-/lodash.camelcase-4.3.0.tgz",
+      "integrity": "sha512-TwuEnCnxbc3rAvhf/LbG7tJUDzhqXyFnv3dtzLOPgCG/hODL7WFnsbwktkD7yUV0RrreP/l1PALq/YSg6VvjlA==",
+      "license": "MIT",
+      "peer": true
+    },
     "node_modules/lodash.merge": {
       "version": "4.6.2",
       "resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz",
@@ -10295,6 +10700,30 @@
         "url": "https://github.com/sponsors/ljharb"
       }
     },
+    "node_modules/table-layout": {
+      "version": "4.1.1",
+      "resolved": "https://registry.npmjs.org/table-layout/-/table-layout-4.1.1.tgz",
+      "integrity": "sha512-iK5/YhZxq5GO5z8wb0bY1317uDF3Zjpha0QFFLA8/trAoiLbQD0HUbMesEaxyzUgDxi2QlcbM8IvqOlEjgoXBA==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "array-back": "^6.2.2",
+        "wordwrapjs": "^5.1.0"
+      },
+      "engines": {
+        "node": ">=12.17"
+      }
+    },
+    "node_modules/table-layout/node_modules/array-back": {
+      "version": "6.2.2",
+      "resolved": "https://registry.npmjs.org/array-back/-/array-back-6.2.2.tgz",
+      "integrity": "sha512-gUAZ7HPyb4SJczXAMUXMGAvI976JoK3qEx9v1FTmeYuJj0IBiaKttG1ydtGKdkfqWkIkouke7nG8ufGy77+Cvw==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=12.17"
+      }
+    },
     "node_modules/tar-fs": {
       "version": "2.1.4",
       "resolved": "https://registry.npmjs.org/tar-fs/-/tar-fs-2.1.4.tgz",
@@ -10985,6 +11414,16 @@
         "url": "https://opencollective.com/eslint"
       }
     },
+    "node_modules/typical": {
+      "version": "4.0.0",
+      "resolved": "https://registry.npmjs.org/typical/-/typical-4.0.0.tgz",
+      "integrity": "sha512-VAH4IvQ7BDFYglMd7BPRDfLgxZZX4O4TFcRDA6EN5X7erNJJq+McIEp8np9aVtxrCJ6qx4GTYVfOWNjcqwZgRw==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=8"
+      }
+    },
     "node_modules/uglify-js": {
       "version": "3.19.3",
       "resolved": "https://registry.npmjs.org/uglify-js/-/uglify-js-3.19.3.tgz",
@@ -11612,6 +12051,16 @@
       "integrity": "sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==",
       "license": "MIT"
     },
+    "node_modules/wordwrapjs": {
+      "version": "5.1.1",
+      "resolved": "https://registry.npmjs.org/wordwrapjs/-/wordwrapjs-5.1.1.tgz",
+      "integrity": "sha512-0yweIbkINJodk27gX9LBGMzyQdBDan3s/dEAiwBOj+Mf0PPyWL6/rikalkv8EeD0E8jm4o5RXEOrFTP3NXbhJg==",
+      "license": "MIT",
+      "peer": true,
+      "engines": {
+        "node": ">=12.17"
+      }
+    },
     "node_modules/wrap-ansi": {
       "version": "7.0.0",
       "resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
diff --git a/package.json b/package.json
index 9969bfe..c38800b 100644
--- a/package.json
+++ b/package.json
@@ -59,6 +59,7 @@
     "@langchain/anthropic": "^1.3.12",
     "@langchain/core": "^1.1.17",
     "@langchain/openai": "^1.2.3",
+    "@lancedb/lancedb": "^0.15.0",
     "@modelcontextprotocol/sdk": "^1.25.3",
     "@types/multer": "^2.0.0",
     "amazon-cognito-identity-js": "^6.3.16",
@@ -187,6 +188,5 @@
     "*.{json,md,yml,yaml}": [
       "prettier --write"
     ]
-  },
-  "packageManager": "pnpm@10.28.2"
+  }
 }
diff --git a/src/runtime/embeddings/openai.ts b/src/runtime/embeddings/openai.ts
index 07539bf..f8f1d57 100644
--- a/src/runtime/embeddings/openai.ts
+++ b/src/runtime/embeddings/openai.ts
@@ -15,7 +15,7 @@ export class OpenAIEmbeddingProvider extends EmbeddingProvider {
   constructor(config?: EmbeddingProviderConfig) {
     super(config || {});
     this.openaiConfig = (this.config as OpenAIEmbeddingConfig) || {};
-    logger.info(
+    logger.debug(
       `[OPENAI-EMBEDDING] Provider created with model: ${this.openaiConfig.model || 'default'}`
     );
   }
diff --git a/src/runtime/graph/database.ts b/src/runtime/graph/database.ts
index 590140b..ddd8873 100644
--- a/src/runtime/graph/database.ts
+++ b/src/runtime/graph/database.ts
@@ -5,7 +5,7 @@ export interface GraphDatabase {
   disconnect(): Promise<void>;
   isConnected(): boolean;

-  // Node operations (synced from Agentlang KnowledgeNode)
+  // Node operations (synced from Agentlang KnowledgeEntity)
   createNode(node: GraphNode): Promise<string>;
   upsertNode(node: GraphNode): Promise<string>;
   findNodeById(id: string): Promise<GraphNode | null>;
diff --git a/src/runtime/graph/neo4j.ts b/src/runtime/graph/neo4j.ts
index 0d23afd..16a937b 100644
--- a/src/runtime/graph/neo4j.ts
+++ b/src/runtime/graph/neo4j.ts
@@ -69,18 +69,18 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        `CREATE (n:KnowledgeNode {
-          id: $id, name: $name, type: $type, description: $description,
+        `CREATE (n:KnowledgeEntity {
+          id: $id, name: $name, entityType: $entityType, description: $description,
           sourceType: $sourceType, sourceId: $sourceId, sourceChunk: $sourceChunk,
           instanceId: $instanceId, instanceType: $instanceType,
-          containerTag: $containerTag, userId: $userId, agentId: $agentId,
+          __tenant__: $containerTag, userId: $userId, agentId: $agentId,
           confidence: $confidence, isLatest: $isLatest,
           createdAt: datetime(), updatedAt: datetime()
         }) RETURN n.id AS id`,
         {
           id: node.id,
           name: node.name,
-          type: node.type,
+          entityType: node.entityType,
           description: node.description || null,
           sourceType: node.sourceType,
           sourceId: node.sourceId || null,
@@ -104,11 +104,11 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        `MERGE (n:KnowledgeNode {id: $id})
-         SET n.name = $name, n.type = $type, n.description = $description,
+        `MERGE (n:KnowledgeEntity {id: $id})
+         SET n.name = $name, n.entityType = $entityType, n.description = $description,
              n.sourceType = $sourceType, n.sourceId = $sourceId, n.sourceChunk = $sourceChunk,
              n.instanceId = $instanceId, n.instanceType = $instanceType,
-             n.containerTag = $containerTag, n.userId = $userId, n.agentId = $agentId,
+             n.__tenant__ = $containerTag, n.userId = $userId, n.agentId = $agentId,
              n.confidence = $confidence, n.isLatest = $isLatest,
              n.updatedAt = datetime(),
              n.createdAt = coalesce(n.createdAt, datetime())
@@ -116,7 +116,7 @@ export class Neo4jDatabase implements GraphDatabase {
         {
           id: node.id,
           name: node.name,
-          type: node.type,
+          entityType: node.entityType,
           description: node.description || null,
           sourceType: node.sourceType,
           sourceId: node.sourceId || null,
@@ -139,7 +139,7 @@ export class Neo4jDatabase implements GraphDatabase {
   async findNodeById(id: string): Promise<GraphNode | null> {
     const session = this.driver.session();
     try {
-      const result = await session.run('MATCH (n:KnowledgeNode {id: $id}) RETURN n', { id });
+      const result = await session.run('MATCH (n:KnowledgeEntity {id: $id}) RETURN n', { id });
       if (result.records.length === 0) return null;
       return this.recordToNode(result.records[0].get('n'));
     } finally {
@@ -151,7 +151,7 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        'MATCH (n:KnowledgeNode {containerTag: $containerTag}) RETURN n',
+        'MATCH (n:KnowledgeEntity {__tenant__: $containerTag}) RETURN n',
         { containerTag }
       );
       return result.records.map((r: any) => this.recordToNode(r.get('n')));
@@ -170,7 +170,7 @@ export class Neo4jDatabase implements GraphDatabase {
       if (!setClause) return;

       await session.run(
-        `MATCH (n:KnowledgeNode {id: $id}) SET ${setClause}, n.updatedAt = datetime()`,
+        `MATCH (n:KnowledgeEntity {id: $id}) SET ${setClause}, n.updatedAt = datetime()`,
         { id, ...updates }
       );
     } finally {
@@ -181,7 +181,7 @@ export class Neo4jDatabase implements GraphDatabase {
   async deleteNode(id: string): Promise<void> {
     const session = this.driver.session();
     try {
-      await session.run('MATCH (n:KnowledgeNode {id: $id}) DETACH DELETE n', { id });
+      await session.run('MATCH (n:KnowledgeEntity {id: $id}) DETACH DELETE n', { id });
     } finally {
       await session.close();
     }
@@ -192,7 +192,7 @@ export class Neo4jDatabase implements GraphDatabase {
     try {
       const edgeId = edge.id || crypto.randomUUID();
       await session.run(
-        `MATCH (a:KnowledgeNode {id: $sourceId}), (b:KnowledgeNode {id: $targetId})
+        `MATCH (a:KnowledgeEntity {id: $sourceId}), (b:KnowledgeEntity {id: $targetId})
          CREATE (a)-[r:${sanitizeCypherLabel(edge.relationship)} {
            id: $edgeId, weight: $weight, sourceType: $sourceType
          }]->(b)
@@ -216,7 +216,7 @@ export class Neo4jDatabase implements GraphDatabase {
     try {
       const edgeId = edge.id || crypto.randomUUID();
       await session.run(
-        `MATCH (a:KnowledgeNode {id: $sourceId}), (b:KnowledgeNode {id: $targetId})
+        `MATCH (a:KnowledgeEntity {id: $sourceId}), (b:KnowledgeEntity {id: $targetId})
          MERGE (a)-[r:${sanitizeCypherLabel(edge.relationship)}]->(b)
          SET r.id = $edgeId, r.weight = $weight, r.sourceType = $sourceType
          RETURN r`,
@@ -238,7 +238,7 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        `MATCH (a:KnowledgeNode {id: $nodeId})-[r]->(b:KnowledgeNode)
+        `MATCH (a:KnowledgeEntity {id: $nodeId})-[r]->(b:KnowledgeEntity)
          RETURN a.id AS sourceId, b.id AS targetId, type(r) AS relationship,
                 r.weight AS weight, r.sourceType AS sourceType, r.id AS id`,
         { nodeId }
@@ -260,7 +260,7 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        `MATCH (a:KnowledgeNode)-[r]->(b:KnowledgeNode {id: $nodeId})
+        `MATCH (a:KnowledgeEntity)-[r]->(b:KnowledgeEntity {id: $nodeId})
          RETURN a.id AS sourceId, b.id AS targetId, type(r) AS relationship,
                 r.weight AS weight, r.sourceType AS sourceType, r.id AS id`,
         { nodeId }
@@ -281,7 +281,7 @@ export class Neo4jDatabase implements GraphDatabase {
   async deleteEdgesForNode(nodeId: string): Promise<void> {
     const session = this.driver.session();
     try {
-      await session.run('MATCH (n:KnowledgeNode {id: $nodeId})-[r]-() DELETE r', { nodeId });
+      await session.run('MATCH (n:KnowledgeEntity {id: $nodeId})-[r]-() DELETE r', { nodeId });
     } finally {
       await session.close();
     }
@@ -295,8 +295,8 @@ export class Neo4jDatabase implements GraphDatabase {
     const session = this.driver.session();
     try {
       const result = await session.run(
-        `MATCH (seed:KnowledgeNode)
-         WHERE seed.id IN $seedIds AND seed.containerTag = $containerTag
+        `MATCH (seed:KnowledgeEntity)
+         WHERE seed.id IN $seedIds AND seed.__tenant__ = $containerTag
-         CALL apoc.path.subgraphAll(seed, {maxLevel: $maxDepth, labelFilter: 'KnowledgeNode'})
+         CALL apoc.path.subgraphAll(seed, {maxLevel: $maxDepth, labelFilter: 'KnowledgeEntity'})
          YIELD nodes, relationships
          UNWIND nodes AS n
@@ -349,9 +349,9 @@ export class Neo4jDatabase implements GraphDatabase {

     for (let depth = 0; depth <= maxDepth && currentIds.length > 0; depth++) {
       const result = await session.run(
-        `MATCH (n:KnowledgeNode)
-         WHERE n.id IN $ids AND n.containerTag = $containerTag
-         OPTIONAL MATCH (n)-[r]->(m:KnowledgeNode {containerTag: $containerTag})
+        `MATCH (n:KnowledgeEntity)
+         WHERE n.id IN $ids AND n.__tenant__ = $containerTag
+         OPTIONAL MATCH (n)-[r]->(m:KnowledgeEntity {__tenant__: $containerTag})
          RETURN n, m, r, type(r) AS relType, r.weight AS weight`,
         { ids: currentIds, containerTag }
       );
@@ -389,7 +389,7 @@ export class Neo4jDatabase implements GraphDatabase {
   async clearContainer(containerTag: string): Promise<void> {
     const session = this.driver.session();
     try {
-      await session.run('MATCH (n:KnowledgeNode {containerTag: $containerTag}) DETACH DELETE n', {
+      await session.run('MATCH (n:KnowledgeEntity {__tenant__: $containerTag}) DETACH DELETE n', {
         containerTag,
       });
     } finally {
@@ -400,13 +400,13 @@ export class Neo4jDatabase implements GraphDatabase {
   async getStats(containerTag?: string): Promise<{ nodeCount: number; edgeCount: number }> {
     const session = this.driver.session();
     try {
-      const whereClause = containerTag ? 'WHERE n.containerTag = $containerTag' : '';
+      const whereClause = containerTag ? 'WHERE n.__tenant__ = $containerTag' : '';
       const nodeResult = await session.run(
-        `MATCH (n:KnowledgeNode) ${whereClause} RETURN count(n) AS cnt`,
+        `MATCH (n:KnowledgeEntity) ${whereClause} RETURN count(n) AS cnt`,
         containerTag ? { containerTag } : {}
       );
       const edgeResult = await session.run(
-        `MATCH (n:KnowledgeNode)${whereClause ? ' ' + whereClause : ''}-[r]->() RETURN count(r) AS cnt`,
+        `MATCH (n:KnowledgeEntity)-[r]->()${whereClause ? ' ' + whereClause : ''} RETURN count(r) AS cnt`,
         containerTag ? { containerTag } : {}
       );
       return {
@@ -423,14 +423,14 @@ export class Neo4jDatabase implements GraphDatabase {
     return {
       id: props.id,
       name: props.name,
-      type: props.type,
+      entityType: props.entityType,
       description: props.description,
       sourceType: props.sourceType,
       sourceId: props.sourceId,
       sourceChunk: props.sourceChunk,
       instanceId: props.instanceId,
       instanceType: props.instanceType,
-      containerTag: props.containerTag,
+      containerTag: props.__tenant__,
       userId: props.userId,
       agentId: props.agentId,
       confidence: props.confidence ?? 1.0,
diff --git a/src/runtime/graph/types.ts b/src/runtime/graph/types.ts
index f16656f..4a4eb2c 100644
--- a/src/runtime/graph/types.ts
+++ b/src/runtime/graph/types.ts
@@ -3,7 +3,7 @@ export type SourceType = 'DOCUMENT' | 'CONVERSATION' | 'INSTANCE' | 'DERIVED';
 export interface GraphNode {
   id: string;
   name: string;
-  type: string; // Person, Organization, Location, Product, Concept, Event, Role
+  entityType: string; // Person, Organization, Location, Product, Concept, Event, Role
   description?: string;
   embedding?: number[];
   sourceType: SourceType;
@@ -64,7 +64,7 @@ export interface ExtractionResult {

 export interface ExtractedEntity {
   name: string;
-  type: string;
+  entityType: string;
   description?: string;
   // 1-5 scale where 5 = most central to the text
   salience?: number;
diff --git a/src/runtime/knowledge/context-builder.ts b/src/runtime/knowledge/context-builder.ts
index d4968ff..fed560d 100644
--- a/src/runtime/knowledge/context-builder.ts
+++ b/src/runtime/knowledge/context-builder.ts
@@ -23,6 +23,7 @@ export class ContextBuilder {
     query: string,
     containerTag: string,
     _userId: string,
+    tenantId: string,
     extraContainerTags?: string[]
   ): Promise<KnowledgeContext> {
     const startTime = Date.now();
@@ -40,7 +41,7 @@ export class ContextBuilder {
       let exactMatches: GraphNode[] = [];
       for (const tag of containerTags) {
         if (exactMatches.length >= MAX_SEED_NODES) break;
-        const matches = await this.findExactMatches(query, tag);
+        const matches = await this.findExactMatches(query, tag, tenantId);
         exactMatches = mergeNodes(exactMatches, matches, MAX_SEED_NODES);
       }

@@ -49,7 +50,7 @@ export class ContextBuilder {
       if (seedNodes.length < MAX_SEED_NODES) {
         for (const tag of containerTags) {
           if (seedNodes.length >= MAX_SEED_NODES) break;
-          const vectorSeeds = await this.vectorSearchNodes(query, tag);
+          const vectorSeeds = await this.vectorSearchNodes(query, tag, tenantId);
           seedNodes = mergeNodes(seedNodes, vectorSeeds, MAX_SEED_NODES);
         }
       }
@@ -133,13 +134,17 @@ export class ContextBuilder {
     }
   }

-  private async vectorSearchNodes(query: string, containerTag: string): Promise<GraphNode[]> {
+  private async vectorSearchNodes(
+    query: string,
+    containerTag: string,
+    tenantId: string
+  ): Promise<GraphNode[]> {
     try {
       // Use Agentlang's fullTextSearch via content? query
       // Limit to MAX_SEED_NODES to prevent memory issues
       const result: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeNode {
-          containerTag? "${escapeString(containerTag)}",
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {
+          __tenant__? "${escapeString(tenantId)}",
           name? "${escapeString(query)}"},
           @limit ${MAX_SEED_NODES}}`,
         undefined
@@ -154,7 +159,11 @@ export class ContextBuilder {
     }
   }

-  private async findExactMatches(query: string, containerTag: string): Promise<GraphNode[]> {
+  private async findExactMatches(
+    query: string,
+    containerTag: string,
+    tenantId: string
+  ): Promise<GraphNode[]> {
     const candidates = extractEntityCandidates(query);
     if (candidates.length === 0) return [];

@@ -165,8 +174,8 @@ export class ContextBuilder {
       if (results.length >= MAX_SEED_NODES) break;
       try {
         const result: Instance[] = await parseAndEvaluateStatement(
-          `{${CoreKnowledgeModuleName}/KnowledgeNode {` +
-            `containerTag? "${escapeString(containerTag)}", ` +
+          `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
+            `__tenant__? "${escapeString(tenantId)}", ` +
             `name? "${escapeString(candidate)}"}, ` +
             `@limit ${MAX_SEED_NODES}}`,
           undefined
@@ -240,8 +249,8 @@ export class ContextBuilder {
     context += '### Relevant Entities\n';
     const byType = new Map<string, GraphNode[]>();
     for (const node of nodes) {
-      if (!byType.has(node.type)) byType.set(node.type, []);
-      byType.get(node.type)!.push(node);
+      if (!byType.has(node.entityType)) byType.set(node.entityType, []);
+      byType.get(node.entityType)!.push(node);
     }
     for (const [type, entities] of byType) {
       context += `\n**${type}s:**\n`;
diff --git a/src/runtime/knowledge/conversation-processor.ts b/src/runtime/knowledge/conversation-processor.ts
index 1e0eaf1..e96b237 100644
--- a/src/runtime/knowledge/conversation-processor.ts
+++ b/src/runtime/knowledge/conversation-processor.ts
@@ -31,7 +31,7 @@ export class ConversationProcessor {
   private async buildExistingContext(containerTag: string): Promise<string> {
     try {
       const result: import('../module.js').Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeNode {containerTag? "${escapeString(containerTag)}", isLatest? true}, @limit 20}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {containerTag? "${escapeString(containerTag)}", isLatest? true}, @limit 20}`,
         undefined
       );

@@ -40,9 +40,9 @@ export class ConversationProcessor {
       const lines: string[] = [];
       for (const inst of result) {
         const name = inst.lookup('name') as string;
-        const type = inst.lookup('type') as string;
+        const entityType = inst.lookup('entityType') as string;
         const desc = inst.lookup('description') as string | undefined;
-        lines.push(`- ${name} (${type})${desc ? ': ' + desc.substring(0, 100) : ''}`);
+        lines.push(`- ${name} (${entityType})${desc ? ': ' + desc.substring(0, 100) : ''}`);
       }
       return lines.join('\n');
     } catch (err) {
@@ -101,6 +101,7 @@ export class ConversationProcessor {
           entity,
           containerTag,
           userId,
+          containerTag,
           'CONVERSATION',
           sessionId,
           `User: ${userMessage}\nAssistant: ${assistantResponse}`,
diff --git a/src/runtime/knowledge/deduplicator.ts b/src/runtime/knowledge/deduplicator.ts
index dbcb39e..7544e2e 100644
--- a/src/runtime/knowledge/deduplicator.ts
+++ b/src/runtime/knowledge/deduplicator.ts
@@ -91,6 +91,7 @@ export class SemanticDeduplicator {
     entity: ExtractedEntity,
     containerTag: string,
     userId: string,
+    tenantId: string,
     sourceType: SourceType,
     sourceId?: string,
     sourceChunk?: string,
@@ -101,7 +102,7 @@ export class SemanticDeduplicator {
     entity.name = normalizedName; // Use normalized name

     // Step 1: Check for exact name match (fast, no memory overhead)
-    const existingNode = await this.findNodeByExactName(normalizedName, containerTag);
+    const existingNode = await this.findNodeByExactName(normalizedName, containerTag, tenantId);

     if (existingNode) {
       logger.debug(`[KNOWLEDGE] Found existing node "${existingNode.name}" for "${entity.name}"`);
@@ -109,7 +110,7 @@ export class SemanticDeduplicator {
     }

     // Step 1b: Semantic similarity search (vector search + string similarity)
-    const similarNode = await this.findSimilarNode(entity, containerTag);
+    const similarNode = await this.findSimilarNode(entity, containerTag, tenantId);
     if (similarNode) {
       logger.debug(`[KNOWLEDGE] Found similar node "${similarNode.name}" for "${entity.name}"`);
       return await this.mergeNode(similarNode, entity);
@@ -122,7 +123,8 @@ export class SemanticDeduplicator {
     const embeddingService = this.getEmbeddingService();
     if (embeddingService) {
       try {
-        const embeddingText = `${entity.name} ${entity.type} ${entity.description || ''}`.trim();
+        const embeddingText =
+          `${entity.name} ${entity.entityType} ${entity.description || ''}`.trim();
         embedding = await embeddingService.embedText(embeddingText);
         logger.debug(
           `[KNOWLEDGE] Generated embedding for "${entity.name}" (${embedding.length} dimensions)`
@@ -155,6 +157,7 @@ export class SemanticDeduplicator {
     entities: ExtractedEntity[],
     containerTag: string,
     userId: string,
+    tenantId: string,
     sourceType: SourceType,
     sourceId?: string,
     sourceChunks?: Map<string, string>,
@@ -171,7 +174,7 @@ export class SemanticDeduplicator {
       entity.name = this.normalizeNodeName(entity.name);

       // Check exact name match
-      const existingNode = await this.findNodeByExactName(entity.name, containerTag);
+      const existingNode = await this.findNodeByExactName(entity.name, containerTag, tenantId);
       if (existingNode) {
         logger.debug(`[KNOWLEDGE] Batch: found existing node "${existingNode.name}"`);
         results.push(await this.mergeNode(existingNode, entity));
@@ -179,7 +182,7 @@ export class SemanticDeduplicator {
       }

       // Check similarity match
-      const similarNode = await this.findSimilarNode(entity, containerTag);
+      const similarNode = await this.findSimilarNode(entity, containerTag, tenantId);
       if (similarNode) {
         logger.debug(`[KNOWLEDGE] Batch: found similar node "${similarNode.name}"`);
         results.push(await this.mergeNode(similarNode, entity));
@@ -203,7 +206,7 @@ export class SemanticDeduplicator {
     if (embeddingService) {
       try {
         const textsToEmbed = needsEmbedding.map(item =>
-          `${item.entity.name} ${item.entity.type} ${item.entity.description || ''}`.trim()
+          `${item.entity.name} ${item.entity.entityType} ${item.entity.description || ''}`.trim()
         );
         logger.info(
           `[KNOWLEDGE] Batch: generating embeddings for ${textsToEmbed.length} new entities in 1 API call`
@@ -249,13 +252,14 @@ export class SemanticDeduplicator {
    */
   private async findNodeByExactName(
     normalizedName: string,
-    containerTag: string
+    containerTag: string,
+    tenantId: string
   ): Promise<GraphNode | null> {
     try {
       // Query directly by name instead of scanning recent nodes
       // This is much more efficient and prevents duplicates
       const result: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeNode {name? "${escapeString(normalizedName)}", containerTag? "${escapeString(containerTag)}", isLatest? true}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {name? "${escapeString(normalizedName)}", containerTag? "${escapeString(containerTag)}", __tenant__? "${escapeString(tenantId)}", isLatest? true}}`,
         undefined
       );

@@ -271,14 +275,16 @@ export class SemanticDeduplicator {

   private async findSimilarNode(
     entity: ExtractedEntity,
-    containerTag: string
+    containerTag: string,
+    tenantId: string
   ): Promise<GraphNode | null> {
     try {
       // Search using entity name only — do NOT pollute with type/description
       // which degrades full-text search quality
       const result: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeNode {` +
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
           `containerTag? "${escapeString(containerTag)}", ` +
+          `__tenant__? "${escapeString(tenantId)}", ` +
           `isLatest? true, ` +
           `name? "${escapeString(entity.name)}"}, ` +
           `@limit ${MAX_SIMILAR_CANDIDATES}}`,
@@ -294,7 +300,7 @@ export class SemanticDeduplicator {
         const candidate = instanceToGraphNode(inst);
         const candidateName = normalizeForMatch(candidate.name);
         const nameScore = nameSimilarity(queryName, candidateName);
-        const typeCompatible = isTypeCompatible(entity.type, candidate.type);
+        const typeCompatible = isTypeCompatible(entity.entityType, candidate.entityType);
         const score = typeCompatible ? nameScore : nameScore * 0.7;
         if (!bestMatch || score > bestMatch.score) {
           bestMatch = { node: candidate, score };
@@ -342,8 +348,8 @@ export class SemanticDeduplicator {
       updates.confidence = newConfidence;
     }

-    if (entity.type && shouldPreferType(entity.type, existing.type)) {
-      updates.type = entity.type;
+    if (entity.entityType && shouldPreferType(entity.entityType, existing.entityType)) {
+      updates.entityType = entity.entityType;
     }

     if (Object.keys(updates).length > 0) {
@@ -357,7 +363,7 @@ export class SemanticDeduplicator {
         }
         if (setClauses.length > 0) {
           await parseAndEvaluateStatement(
-            `{${CoreKnowledgeModuleName}/KnowledgeNode {id "${existing.id}", ${setClauses.join(', ')}}, @upsert}`,
+            `{${CoreKnowledgeModuleName}/KnowledgeEntity {id "${existing.id}", ${setClauses.join(', ')}}, @upsert}`,
             undefined
           );
         }
@@ -380,7 +386,7 @@ export class SemanticDeduplicator {
   ): Promise<GraphNode> {
     try {
       await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeNode {id "${existingId}", isLatest false}, @upsert}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {id "${existingId}", isLatest false}, @upsert}`,
         undefined
       );
       logger.info(`[KNOWLEDGE] Superseded node ${existingId.substring(0, 8)}... with new info`);
@@ -394,7 +400,7 @@ export class SemanticDeduplicator {
     if (embeddingService2) {
       try {
         const embeddingText =
-          `${replacement.name} ${replacement.type} ${replacement.description || ''}`.trim();
+          `${replacement.name} ${replacement.entityType} ${replacement.description || ''}`.trim();
         embedding = await embeddingService2.embedText(embeddingText);
       } catch (err) {
         logger.debug(`[KNOWLEDGE] Embedding generation failed for replacement: ${err}`);
@@ -416,7 +422,7 @@ export class SemanticDeduplicator {
   async removeNode(nodeId: string): Promise<void> {
     try {
       await parseAndEvaluateStatement(
-        `purge {${CoreKnowledgeModuleName}/KnowledgeNode {id? "${nodeId}"}}`,
+        `purge {${CoreKnowledgeModuleName}/KnowledgeEntity {id? "${nodeId}"}}`,
         undefined
       );
       // Also remove edges referencing this node
@@ -450,9 +456,9 @@ export class SemanticDeduplicator {
   ): Promise<GraphNode> {
     // Create in Agentlang first to get a UUID
     let query =
-      `{${CoreKnowledgeModuleName}/KnowledgeNode {` +
+      `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
       `name "${escapeString(entity.name)}",…
Signed-off-by: Pratik Karki <pratik@fractl.io>

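The hunks above rename the Neo4j label from `KnowledgeNode` to `KnowledgeEntity` and store the container tag under the `__tenant__` node property, while callers keep reading it as `containerTag`. A minimal sketch of that read-side translation (a simplified stand-in for `recordToNode`, with types trimmed to the fields touched by the rename):

```typescript
// Simplified stand-in for Neo4jDatabase.recordToNode: maps raw node
// properties as stored in Neo4j back to the GraphNode shape.
interface GraphNodeLite {
  id: string;
  name: string;
  entityType: string;
  containerTag: string;
  confidence: number;
}

function recordToNodeLite(props: Record<string, unknown>): GraphNodeLite {
  return {
    id: props.id as string,
    name: props.name as string,
    // Stored as `entityType` after the rename away from `type`
    entityType: props.entityType as string,
    // Stored under `__tenant__`, surfaced to callers as `containerTag`
    containerTag: props.__tenant__ as string,
    // Default to 1.0 when absent, matching the `?? 1.0` in the diff
    confidence: (props.confidence as number) ?? 1.0,
  };
}

const node = recordToNodeLite({
  id: 'n1',
  name: 'Dracula',
  entityType: 'Person',
  __tenant__: 'tenant-a',
});
console.log(node.containerTag, node.confidence);
```

This keeps the storage-level rename invisible to the rest of the runtime: only the Neo4j adapter knows about `__tenant__`.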
diff --git a/src/runtime/graph/database.ts b/src/runtime/graph/database.ts
index aea7d26..ed5bf79 100644
--- a/src/runtime/graph/database.ts
+++ b/src/runtime/graph/database.ts
@@ -28,6 +28,7 @@ export interface GraphDatabase {
   ): Promise<GraphTraversalResult>;

   // Bulk operations
+  clearAll(): Promise<void>;
   clearContainer(__tenant__: string): Promise<void>;
   getStats(__tenant__?: string): Promise<{ nodeCount: number; edgeCount: number }>;
 }
diff --git a/src/runtime/graph/neo4j-browser.ts b/src/runtime/graph/neo4j-browser.ts
index b7d58d8..750f6d5 100644
--- a/src/runtime/graph/neo4j-browser.ts
+++ b/src/runtime/graph/neo4j-browser.ts
@@ -102,6 +102,10 @@ export class Neo4jBrowserDatabase implements GraphDatabase {
     throw new Error('Neo4j Browser: expandGraph not yet implemented');
   }

+  async clearAll(): Promise<void> {
+    throw new Error('Neo4j Browser: clearAll not yet implemented');
+  }
+
   async clearContainer(__tenant__: string): Promise<void> {
     throw new Error('Neo4j Browser: clearContainer not yet implemented');
   }
diff --git a/src/runtime/graph/neo4j.ts b/src/runtime/graph/neo4j.ts
index ce0ea9c..676ffc3 100644
--- a/src/runtime/graph/neo4j.ts
+++ b/src/runtime/graph/neo4j.ts
@@ -73,7 +73,7 @@ export class Neo4jDatabase implements GraphDatabase {
           id: $id, name: $name, entityType: $entityType, description: $description,
           sourceType: $sourceType, sourceId: $sourceId, sourceChunk: $sourceChunk,
           instanceId: $instanceId, instanceType: $instanceType,
-          __tenant__: $containerTag, userId: $userId, agentId: $agentId,
+          __tenant__: $containerTag, agentId: $agentId,
           confidence: $confidence, isLatest: $isLatest,
           createdAt: datetime(), updatedAt: datetime()
         }) RETURN n.id AS id`,
@@ -88,7 +88,6 @@ export class Neo4jDatabase implements GraphDatabase {
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
           containerTag: node.__tenant__,
-          userId: node.userId,
           agentId: node.agentId || null,
           confidence: node.confidence,
           isLatest: node.isLatest,
@@ -108,7 +107,7 @@ export class Neo4jDatabase implements GraphDatabase {
          SET n.name = $name, n.entityType = $entityType, n.description = $description,
              n.sourceType = $sourceType, n.sourceId = $sourceId, n.sourceChunk = $sourceChunk,
              n.instanceId = $instanceId, n.instanceType = $instanceType,
-              n.__tenant__ = $containerTag, n.userId = $userId, n.agentId = $agentId,
+             n.__tenant__ = $containerTag, n.agentId = $agentId,
              n.confidence = $confidence, n.isLatest = $isLatest,
              n.updatedAt = datetime(),
              n.createdAt = coalesce(n.createdAt, datetime())
@@ -124,7 +123,6 @@ export class Neo4jDatabase implements GraphDatabase {
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
           containerTag: node.__tenant__,
-          userId: node.userId,
           agentId: node.agentId || null,
           confidence: node.confidence,
           isLatest: node.isLatest,
@@ -385,6 +383,16 @@ export class Neo4jDatabase implements GraphDatabase {
     return { nodes: Array.from(nodeMap.values()), edges };
   }

+  async clearAll(): Promise<void> {
+    const session = this.driver.session();
+    try {
+      await session.run('MATCH (n:KnowledgeEntity) DETACH DELETE n');
+      logger.info('[GRAPH] Cleared all Neo4j knowledge data');
+    } finally {
+      await session.close();
+    }
+  }
+
   async clearContainer(__tenant__: string): Promise<void> {
     const session = this.driver.session();
     try {
diff --git a/src/runtime/graph/types.ts b/src/runtime/graph/types.ts
index 3a51b59..cb1bcc7 100644
--- a/src/runtime/graph/types.ts
+++ b/src/runtime/graph/types.ts
@@ -12,7 +12,6 @@ export interface GraphNode {
   instanceId?: string;
   instanceType?: string;
   __tenant__: string;
-  userId: string;
   agentId?: string;
   confidence: number;
   createdAt: Date;
diff --git a/src/runtime/knowledge/context-builder.ts b/src/runtime/knowledge/context-builder.ts
index 5717f6a..16ab8fc 100644
--- a/src/runtime/knowledge/context-builder.ts
+++ b/src/runtime/knowledge/context-builder.ts
@@ -22,7 +22,6 @@ export class ContextBuilder {
   async buildContext(
     query: string,
     containerTag: string,
-    _userId: string,
     tenantId: string,
     extraContainerTags?: string[]
   ): Promise<KnowledgeContext> {
diff --git a/src/runtime/knowledge/conversation-processor.ts b/src/runtime/knowledge/conversation-processor.ts
index e96b237..9d12cad 100644
--- a/src/runtime/knowledge/conversation-processor.ts
+++ b/src/runtime/knowledge/conversation-processor.ts
@@ -12,9 +12,6 @@ import type {
 } from '../graph/types.js';
 import type { Environment } from '../interpreter.js';

-const MAX_CONVERSATION_NODES = parseInt(process.env.KG_MAX_CONVERSATION_NODES || '10', 10);
-const MAX_CONVERSATION_EDGES = parseInt(process.env.KG_MAX_CONVERSATION_EDGES || '20', 10);
-
 export class ConversationProcessor {
   private extractor: EntityExtractor;
   private deduplicator: SemanticDeduplicator;
@@ -55,7 +52,6 @@ export class ConversationProcessor {
     userMessage: string,
     assistantResponse: string,
     containerTag: string,
-    userId: string,
     sessionId: string,
     agentId?: string,
     env?: Environment,
@@ -88,19 +84,10 @@ export class ConversationProcessor {

       const nodeMap = new Map<string, GraphNode>();
       for (const entity of extraction.entities) {
-        // Limit nodes to prevent memory issues
-        if (allNodes.length >= MAX_CONVERSATION_NODES) {
-          logger.warn(
-            `[KNOWLEDGE] Reached max conversation nodes limit (${MAX_CONVERSATION_NODES})`
-          );
-          break;
-        }
-
         const beforeCount = allNodes.length;
         const node = await this.deduplicator.findOrCreateNode(
           entity,
           containerTag,
-          userId,
           containerTag,
           'CONVERSATION',
           sessionId,
@@ -119,17 +106,10 @@ export class ConversationProcessor {
       }

       for (const rel of extraction.relationships) {
-        if (allEdges.length >= MAX_CONVERSATION_EDGES) {
-          logger.warn(
-            `[KNOWLEDGE] Reached max conversation edges limit (${MAX_CONVERSATION_EDGES})`
-          );
-          break;
-        }
         const edge = await this.createEdgeFromRelationship(
           rel,
           nodeMap,
           containerTag,
-          userId,
           sessionId,
           agentId
         );
@@ -155,7 +135,6 @@ export class ConversationProcessor {
     rel: ExtractedRelationship,
     nodeMap: Map<string, GraphNode>,
     containerTag: string,
-    userId: string,
     sessionId: string,
     agentId?: string
   ): Promise<GraphEdge | null> {
@@ -214,8 +193,7 @@ export class ConversationProcessor {
           `relType "${escapeString(edge.relationship)}", ` +
           `weight ${edge.weight}, ` +
           `sourceType "CONVERSATION", ` +
-          `containerTag "${escapeString(containerTag)}", ` +
-          `userId "${escapeString(userId)}"` +
+          `__tenant__ "${escapeString(containerTag)}"` +
           (agentId ? `, agentId "${escapeString(agentId)}"` : '') +
           `}}`,
         undefined
diff --git a/src/runtime/knowledge/deduplicator.ts b/src/runtime/knowledge/deduplicator.ts
index 7544e2e..1bfa0be 100644
--- a/src/runtime/knowledge/deduplicator.ts
+++ b/src/runtime/knowledge/deduplicator.ts
@@ -90,7 +90,6 @@ export class SemanticDeduplicator {
   async findOrCreateNode(
     entity: ExtractedEntity,
     containerTag: string,
-    userId: string,
     tenantId: string,
     sourceType: SourceType,
     sourceId?: string,
@@ -139,7 +138,6 @@ export class SemanticDeduplicator {
     return await this.createNewNode(
       entity,
       containerTag,
-      userId,
       sourceType,
       sourceId,
       sourceChunk,
@@ -156,7 +154,6 @@ export class SemanticDeduplicator {
   async findOrCreateNodesBatch(
     entities: ExtractedEntity[],
     containerTag: string,
-    userId: string,
     tenantId: string,
     sourceType: SourceType,
     sourceId?: string,
@@ -228,7 +225,6 @@ export class SemanticDeduplicator {
       const node = await this.createNewNode(
         entity,
         containerTag,
-        userId,
         sourceType,
         sourceId,
         sourceChunk,
@@ -378,7 +374,6 @@ export class SemanticDeduplicator {
     existingId: string,
     replacement: ExtractedEntity,
     containerTag: string,
-    userId: string,
     sourceType: SourceType,
     sourceId?: string,
     sourceChunk?: string,
@@ -410,7 +405,6 @@ export class SemanticDeduplicator {
     return await this.createNewNode(
       replacement,
       containerTag,
-      userId,
       sourceType,
       sourceId,
       sourceChunk,
@@ -447,7 +441,6 @@ export class SemanticDeduplicator {
   private async createNewNode(
     entity: ExtractedEntity,
     containerTag: string,
-    userId: string,
     sourceType: SourceType,
     sourceId?: string,
     sourceChunk?: string,
@@ -460,8 +453,7 @@ export class SemanticDeduplicator {
       `name "${escapeString(entity.name)}", ` +
       `entityType "${escapeString(entity.entityType)}", ` +
       `sourceType "${sourceType}", ` +
-      `containerTag "${containerTag}", ` +
-      `userId "${userId}", ` +
+      `__tenant__ "${containerTag}", ` +
       `isLatest true, ` +
       `confidence 1.0`;
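The hunk above drops `userId` from the generated Agentlang statement and writes the tenant under `__tenant__`. A sketch of the resulting statement shape (the `escapeString` here is an illustrative stand-in, not the runtime's implementation, and the module name is hardcoded for the example):

```typescript
// Illustrative stand-in for the statement builder in createNewNode
// after the userId removal.
const CoreKnowledgeModuleName = 'Agentlang.Core.Knowledge'; // assumed name

function escapeString(s: string): string {
  // Minimal escaping for embedding in a double-quoted literal
  return s.replace(/\\/g, '\\\\').replace(/"/g, '\\"');
}

function buildCreateEntityStatement(
  name: string,
  entityType: string,
  sourceType: string,
  containerTag: string
): string {
  return (
    `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
    `name "${escapeString(name)}", ` +
    `entityType "${escapeString(entityType)}", ` +
    `sourceType "${sourceType}", ` +
    // Tenant is now carried by __tenant__; userId is no longer stored.
    `__tenant__ "${escapeString(containerTag)}", ` +
    `isLatest true, ` +
    `confidence 1.0`
  );
}

console.log(buildCreateEntityStatement('Dracula', 'Person', 'DOCUMENT', 'tenant-a'));
```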

diff --git a/src/runtime/knowledge/document-processor.ts b/src/runtime/knowledge/document-processor.ts
index 32b22fe..41913eb 100644
--- a/src/runtime/knowledge/document-processor.ts
+++ b/src/runtime/knowledge/document-processor.ts
@@ -19,8 +19,6 @@ import { findProviderForLLM } from '../modules/ai.js';

 const DEFAULT_CHUNK_SIZE = parseInt(process.env.KG_CHUNK_SIZE || '1000', 10);
 const DEFAULT_CHUNK_OVERLAP = parseInt(process.env.KG_CHUNK_OVERLAP || '200', 10);
-const MAX_DOCUMENT_NODES = parseInt(process.env.KG_MAX_DOCUMENT_NODES || '1000', 10);
-const MAX_DOCUMENT_EDGES = parseInt(process.env.KG_MAX_DOCUMENT_EDGES || '2000', 10);
 const MAX_MEGA_BATCH_CHARS = parseInt(process.env.KG_MEGA_BATCH_CHARS || '50000', 10);
 const MAX_ENTITIES_PER_BATCH = parseInt(process.env.KG_MAX_ENTITIES_PER_BATCH || '50', 10);
 const MAX_RELATIONSHIPS_PER_BATCH = parseInt(
@@ -76,7 +74,6 @@ export class DocumentProcessor {
   async processDocument(
     document: Document,
     containerTag: string,
-    userId: string,
     tenantId: string,
     agentId?: string,
     env?: Environment,
@@ -158,8 +155,7 @@ export class DocumentProcessor {
       CORE_ENTITY_LIMIT,
       CORE_ENTITY_PER_TYPE,
       MIN_ENTITY_MENTIONS,
-      MIN_ENTITY_SALIENCE,
-      MAX_DOCUMENT_NODES
+      MIN_ENTITY_SALIENCE
     );

     if (coreEntities.length === 0) {
@@ -186,7 +182,6 @@ export class DocumentProcessor {
     const nodes = await this.deduplicator.findOrCreateNodesBatch(
       entitiesToCreate,
       containerTag,
-      userId,
       tenantId,
       'DOCUMENT',
       document.name,
@@ -245,14 +240,7 @@ export class DocumentProcessor {

     // Create edges from accumulated relationships
     for (const rel of candidateRelationships.values()) {
-      if (edgesCreated >= MAX_DOCUMENT_EDGES) break;
-      const edge = await this.createEdgeFromCandidate(
-        rel,
-        coreNodeMap,
-        containerTag,
-        userId,
-        agentId
-      );
+      const edge = await this.createEdgeFromCandidate(rel, coreNodeMap, containerTag, agentId);
       if (edge) {
         edgesCreated++;
       }
@@ -342,7 +330,6 @@ export class DocumentProcessor {
     rel: RelationshipCandidate,
     nodeMap: Map<string, GraphNode>,
     containerTag: string,
-    userId: string,
     agentId?: string
   ): Promise<GraphEdge | null> {
     const sourceNode = nodeMap.get(rel.sourceKey);
@@ -402,8 +389,7 @@ export class DocumentProcessor {
           `relType "${escapeString(edge.relationship)}", ` +
           `weight ${edge.weight}, ` +
           `sourceType "DOCUMENT", ` +
-          `containerTag "${escapeString(containerTag)}", ` +
-          `userId "${escapeString(userId)}"` +
+          `__tenant__ "${escapeString(containerTag)}"` +
           (agentId ? `, agentId "${escapeString(agentId)}"` : '') +
           `}}`,
         undefined
@@ -445,8 +431,7 @@ function selectCoreEntities(
   coreLimit: number,
   perTypeLimit: number,
   minMentions: number,
-  minSalience: number,
-  maxNodes: number
+  minSalience: number
 ): CandidateScore[] {
   const scored = Array.from(candidates.values()).map(candidate => {
     const salienceAvg = candidate.salienceSum / Math.max(1, candidate.salienceCount);
@@ -489,9 +474,5 @@ function selectCoreEntities(
     }
   }

-  let results = Array.from(selected.values());
-  if (results.length > maxNodes) {
-    results = results.sort((a, b) => b.score - a.score).slice(0, maxNodes);
-  }
-  return results;
+  return Array.from(selected.values());
 }
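For reviewers, the selection logic this hunk simplifies can be hard to follow from the diff alone. Below is a condensed, hypothetical sketch of the scoring-and-capping shape of `selectCoreEntities` (field names and the exact score formula are assumptions, not the shipped code); the key change in the diff is that the final `MAX_DOCUMENT_NODES` truncation is gone, so all candidates surviving the mention/salience filters are returned.

```typescript
// Hypothetical sketch of core-entity selection; the real implementation
// in document-processor.ts may weight scores differently.
interface Candidate {
  name: string;
  type: string;
  mentions: number;
  salienceSum: number;
  salienceCount: number;
}

interface CandidateScore extends Candidate {
  score: number;
}

function selectCoreEntities(
  candidates: Candidate[],
  coreLimit: number,
  perTypeLimit: number,
  minMentions: number,
  minSalience: number
): CandidateScore[] {
  // Score = average salience, boosted by how often the entity recurs.
  const scored: CandidateScore[] = candidates.map(c => {
    const salienceAvg = c.salienceSum / Math.max(1, c.salienceCount);
    return { ...c, score: salienceAvg * Math.log2(1 + c.mentions) };
  });

  // Drop rare or low-salience mentions, then rank the survivors.
  const eligible = scored
    .filter(c => c.mentions >= minMentions)
    .filter(c => c.salienceSum / Math.max(1, c.salienceCount) >= minSalience)
    .sort((a, b) => b.score - a.score);

  // Cap per entity type so one type cannot dominate the graph.
  const perType = new Map<string, number>();
  const selected: CandidateScore[] = [];
  for (const c of eligible) {
    if (selected.length >= coreLimit) break;
    const used = perType.get(c.type) || 0;
    if (used >= perTypeLimit) continue;
    perType.set(c.type, used + 1);
    selected.push(c);
  }
  // The removed MAX_DOCUMENT_NODES cap would have truncated `selected`
  // here; after this diff, all survivors are returned.
  return selected;
}
```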
diff --git a/src/runtime/knowledge/prompts.ts b/src/runtime/knowledge/prompts.ts
index d79a23f..b4e6729 100644
--- a/src/runtime/knowledge/prompts.ts
+++ b/src/runtime/knowledge/prompts.ts
@@ -1,10 +1,10 @@
-export const ENTITY_EXTRACTION_PROMPT = `You are a knowledge graph entity extraction system for long-form documents.
-Extract ONLY the most important named entities (people, places, organizations, named events, artifacts, specific concepts) and their relationships.
+export const ENTITY_EXTRACTION_PROMPT = `You are a knowledge graph entity extraction system for enterprise documents and knowledge bases.
+Extract ONLY the most important named entities (customers, products, features, teams, vendors, systems, contracts, policies, integrations) and their relationships.

 CRITICAL RULES:
 - Prefer central, recurring entities over minor mentions.
 - Each entity must be a proper name or a specific named concept (no generic nouns).
-- Entity names must be specific and meaningful (e.g., "John Smith" not "J", "Federal Reserve" not "the bank").
+- Entity names must be specific and meaningful (e.g., "Acme Corp" not "the client", "Salesforce CRM" not "the system").
 - Do NOT extract: single letters, articles (the, a, an), pronouns (he, she, it), prepositions (in, on, at), or common words.
 - Return at most the number of entities/relationships requested in the user message. If unsure, return fewer.
 - Provide a salience score from 1-5 (5 = most central).
@@ -15,7 +15,7 @@ Return JSON with this exact structure:
   "entities": [
     {
       "name": "Entity name",
-      "type": "Person|Organization|Location|Product|Concept|Event|Role",
+      "type": "Customer|Product|Feature|Contract|Policy|Integration|Team|Vendor|System|Person|Organization|Role",
       "description": "Brief description",
       "salience": 1
     }
@@ -30,18 +30,20 @@ Return JSON with this exact structure:
 }

 Guidelines:
-- Use consistent types: Person, Organization, Location, Product, Concept, Event, Role
-- Relationship types: FOLLOWS, MEETS, VISITS, WORKS_FOR, LOCATED_IN, HAS_ROLE, etc.
+- Use consistent types: Customer, Product, Feature, Contract, Policy, Integration, Team, Vendor, System, Person, Organization, Role
+- Relationship types: OWNS, USES, INTEGRATES_WITH, DEPENDS_ON, HAS_SLA, WORKS_FOR, MANAGES, SUBSCRIBES_TO, SUPPORTS, BELONGS_TO, etc.
 - Only extract what is explicitly stated or strongly implied
 - Be concise in descriptions`;

-export const FACT_EXTRACTION_PROMPT = `You are a fact extraction system. Analyze the conversation and extract important facts.
+export const FACT_EXTRACTION_PROMPT = `You are a fact extraction system for enterprise knowledge. Analyze the input and extract important business facts.

 Extract facts in these categories:
-- user_preference: User preferences, likes, dislikes
-- user_fact: Facts about the user (name, role, company, etc.)
-- instance_reference: References to specific instances/entities mentioned
-- inferred: Derived insights from the conversation
+- business_rule: Business rules, policies, SLAs, compliance requirements
+- product_detail: Product capabilities, features, limitations, pricing tiers
+- customer_insight: Customer requirements, feedback, account details, contract terms
+- operational_fact: Process details, team responsibilities, system configurations
+- decision: Key decisions, approvals, action items from meetings or documents
+- inferred: Derived insights from the source material

 Respond in this JSON format:
 {
@@ -49,7 +51,7 @@ Respond in this JSON format:
     {
       "content": "Clear factual statement",
       "type": "FACT",
-      "category": "user_fact",
+      "category": "business_rule",
       "confidence": 0.9
     }
   ]
@@ -61,23 +63,23 @@ Rules:
 3. Assign confidence scores (0.0-1.0)
 4. If no facts can be extracted, return: {"facts": []}`;

-export const CONVERSATION_ENTITY_PROMPT = `You are a knowledge graph entity extraction system analyzing a conversation turn.
+export const CONVERSATION_ENTITY_PROMPT = `You are a knowledge graph entity extraction system analyzing an enterprise interaction turn (e.g., support ticket, meeting note, client communication, or internal discussion).

 ### Step 1: Classify Turn Intent
 Determine the turn_type:
 - "QUERY": The user is asking a question or requesting information. Do NOT extract entities from questions — return empty arrays.
-- "UPDATE": The user or assistant states new facts, corrections, or assertions that should be remembered.
+- "UPDATE": The user or assistant states new facts, corrections, or assertions that should be remembered (e.g., contract changes, product updates, customer requirements).
 - "MIXED": The user both asks questions AND provides new factual information. Only extract entities from the factual assertions, not from the questions.

 ### Step 2: Coreference Resolution (for UPDATE/MIXED only)
-- Resolve ALL references (pronouns, aliases, short names, nicknames) to their canonical full name.
-  - Example: "he", "Dr. Turing" → "Alan Turing"
+- Resolve ALL references (pronouns, aliases, short names, abbreviations) to their canonical full name.
+  - Example: "they", "SF team" → "Salesforce Integration Team"
 - If an entity matches one in the existing knowledge below, use the EXACT same name.

 ### Step 3: Update Classification (for UPDATE/MIXED only)
 For each entity, classify its update_type:
 - "new": Entity not seen before in existing knowledge
-- "update": Entity exists but this conversation provides corrected/changed information (e.g., new role, moved to new city, changed company)
+- "update": Entity exists but this interaction provides corrected/changed information (e.g., updated contract terms, changed product tier, new vendor contact)
 - "supplement": Entity exists and this adds additional information about a different aspect

 ### Existing Knowledge (if any)
@@ -87,7 +89,7 @@ Return JSON:
 {
   "turn_type": "QUERY|UPDATE|MIXED",
   "entities": [
-    { "name": "Entity name", "type": "Person|Organization|Location|Product|Concept|Event|Role", "description": "Current description based on this conversation", "update_type": "new|update|supplement" }
+    { "name": "Entity name", "type": "Customer|Product|Feature|Contract|Policy|Integration|Team|Vendor|System|Person|Organization|Role", "description": "Current description based on this interaction", "update_type": "new|update|supplement" }
   ],
   "relationships": [
     { "source": "Entity name", "target": "Entity name", "type": "RELATIONSHIP_TYPE" }
@@ -97,9 +99,10 @@ Return JSON:
 IMPORTANT: If turn_type is "QUERY", return empty entities and relationships arrays.
 If no entities can be extracted, return: {"turn_type": "QUERY", "entities": [], "relationships": []}`;

-export const RELATIONSHIP_EXTRACTION_PROMPT = `You are a knowledge graph relationship extraction system.
-You will be given a list of allowed entities and a text passage. Extract relationships ONLY between the allowed entities.
+export const RELATIONSHIP_EXTRACTION_PROMPT = `You are a knowledge graph relationship extraction system for enterprise knowledge bases.
+You will be given a list of allowed entities and a text passage from enterprise documentation (product docs, client records, contracts, support tickets, meeting notes, etc.). Extract relationships ONLY between the allowed entities.
 Do NOT invent entities or relationships that are not supported by the text.
+Use enterprise relationship types: OWNS, USES, INTEGRATES_WITH, DEPENDS_ON, HAS_SLA, MANAGES, SUBSCRIBES_TO, SUPPORTS, BELONGS_TO, CONTRACTS_WITH, ESCALATED_TO, ASSIGNED_TO, etc.

 Return JSON:
 {
@@ -111,13 +114,13 @@ Return JSON:

 If no relationships can be extracted, return: {"entities": [], "relationships": []}`;

-export const MEGABATCH_ENTITY_PROMPT = `You are a knowledge graph entity extraction system processing a large section of a document.
+export const MEGABATCH_ENTITY_PROMPT = `You are a knowledge graph entity extraction system processing a large section of enterprise documentation (product specs, client records, contracts, support tickets, meeting transcripts, etc.).
 Extract the most important named entities from the entire text below.

 CRITICAL RULES:
 - Focus on CENTRAL, RECURRING entities — not minor one-off mentions.
 - Each entity must be a proper name or a specific named concept (no generic nouns).
-- Entity names must be specific and meaningful (e.g., "John Smith" not "J", "Federal Reserve" not "the bank").
+- Entity names must be specific and meaningful (e.g., "Acme Corp" not "the client", "Salesforce CRM" not "the system").
 - Do NOT extract: single letters, articles, pronouns, prepositions, or common words.
 - Estimate how many times each entity is mentioned in the text (mentions field).
 - Provide a salience score from 1-5 (5 = most central to the text).
@@ -128,7 +131,7 @@ Return JSON with this exact structure:
   "entities": [
     {
       "name": "Entity name",
-      "type": "Person|Organization|Location|Product|Concept|Event|Role",
+      "type": "Customer|Product|Feature|Contract|Policy|Integration|Team|Vendor|System|Person|Organization|Role",
       "description": "Brief description of the entity's role in the text",
       "salience": 5,
       "mentions": 10
@@ -137,12 +140,12 @@ Return JSON with this exact structure:
 }

 Guidelines:
-- Use consistent types: Person, Organization, Location, Product, Concept, Event, Role
+- Use consistent types: Customer, Product, Feature, Contract, Policy, Integration, Team, Vendor, System, Person, Organization, Role
 - Be concise in descriptions
 - Return at most the limit specified in the user message`;

-export const MEGABATCH_RELATIONSHIP_PROMPT = `You are a knowledge graph relationship extraction system.
-You will be given a list of known entities and a large section of text.
+export const MEGABATCH_RELATIONSHIP_PROMPT = `You are a knowledge graph relationship extraction system for enterprise knowledge bases.
+You will be given a list of known entities and a large section of enterprise documentation (product docs, client records, contracts, support tickets, meeting notes, etc.).
 Extract relationships ONLY between the listed entities that are supported by the text.

 CRITICAL RULES:
@@ -161,5 +164,5 @@ Return JSON with this exact structure:
   ]
 }

-Common relationship types: KNOWS, MEETS, FOLLOWS, TALKS_TO, WORKS_FOR, LOCATED_IN, HAS_ROLE, BELONGS_TO, PART_OF, OWNS, CREATES, TRANSFORMS_INTO
+Common relationship types: OWNS, USES, INTEGRATES_WITH, DEPENDS_ON, HAS_SLA, WORKS_FOR, MANAGES, SUBSCRIBES_TO, SUPPORTS, BELONGS_TO, CONTRACTS_WITH, ESCALATED_TO, ASSIGNED_TO, PART_OF
 If no relationships can be extracted, return: {"relationships": []}`;
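All of the prompts above pin an exact JSON response shape, but the caller still has to defend against an LLM that does not comply. The following is a minimal hypothetical validator for the entity-extraction shape these prompts request (the function and type names are assumptions, not the project's actual parser in `extractor.ts`):

```typescript
// Hypothetical validator for the entity-extraction JSON the prompts above
// specify; the real extractor may apply additional filtering.
interface ExtractedEntity {
  name: string;
  type: string;
  description?: string;
  salience: number;
}

function parseEntityResponse(raw: string): ExtractedEntity[] {
  let parsed: unknown;
  try {
    parsed = JSON.parse(raw);
  } catch {
    return []; // the prompt promises JSON, but LLM output can still be malformed
  }
  const entities = (parsed as { entities?: unknown }).entities;
  if (!Array.isArray(entities)) return [];
  return entities.filter((e): e is ExtractedEntity => {
    if (typeof e !== 'object' || e === null) return false;
    const o = e as Record<string, unknown>;
    return (
      typeof o.name === 'string' &&
      o.name.trim().length > 1 && // drop single letters, per the prompt rules
      typeof o.type === 'string' &&
      typeof o.salience === 'number' &&
      o.salience >= 1 &&
      o.salience <= 5
    );
  });
}
```

Returning an empty array on malformed input mirrors the prompts' own fallback convention (`{"entities": []}`), so downstream batch code never has to branch on parse failure.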
diff --git a/src/runtime/knowledge/service.ts b/src/runtime/knowledge/service.ts
index ea6b801..d197f06 100644
--- a/src/runtime/knowledge/service.ts
+++ b/src/runtime/knowledge/service.ts
@@ -194,7 +194,6 @@ export class KnowledgeService {
   async processDocument(
     document: Document,
     containerTag: string,
-    userId: string,
     agentId?: string,
     env?: Environment,
     llmName?: string
@@ -203,7 +202,6 @@ export class KnowledgeService {
     return this.documentProcessor.processDocument(
       document,
       containerTag,
-      userId,
       containerTag, // tenantId
       agentId,
       env,
@@ -214,7 +212,6 @@ export class KnowledgeService {
   async processDocuments(
     documents: Document[],
     containerTag: string,
-    userId: string,
     agentId?: string,
     env?: Environment,
     llmName?: string
@@ -231,7 +228,7 @@ export class KnowledgeService {

     for (const doc of documents) {
       logger.info(`[KNOWLEDGE] Processing document: ${doc.name} (${doc.content.length} chars)`);
-      const result = await this.processDocument(doc, containerTag, userId, agentId, env, llmName);
+      const result = await this.processDocument(doc, containerTag, agentId, env, llmName);
       logger.info(
         `[KNOWLEDGE] Document ${doc.name} processed: ${result.nodesCreated} nodes, ${result.edgesCreated} edges`
       );
@@ -253,14 +250,12 @@ export class KnowledgeService {
   async buildContext(
     query: string,
     containerTag: string,
-    userId: string,
     agentId?: string
   ): Promise<KnowledgeContext> {
     if (!this.enabled) {
       return { entities: [], relationships: [], instanceData: [], contextString: '' };
     }
-    const extraTags = await this.resolveDocumentContainerTags(agentId, containerTag, userId);
-    return this.contextBuilder.buildContext(query, containerTag, userId, containerTag, extraTags);
+    return this.contextBuilder.buildContext(query, containerTag, containerTag);
   }

   buildContextString(context: KnowledgeContext): string {
@@ -300,7 +295,6 @@ export class KnowledgeService {
         userMessage,
         assistantResponse,
         session.containerTag,
-        session.userId,
         session.sessionId,
         session.agentId,
         env,
@@ -443,7 +437,6 @@ export class KnowledgeService {
       const result = await this.processDocuments(
         matchingDocs,
         session.containerTag,
-        session.userId,
         session.agentId,
         env,
         llmName
@@ -516,7 +509,6 @@ export class KnowledgeService {
             instanceId: inst.lookup('instanceId') as string | undefined,
             instanceType: inst.lookup('instanceType') as string | undefined,
             __tenant__: inst.lookup('__tenant__') as string,
-            userId: inst.lookup('userId') as string,
             agentId: inst.lookup('agentId') as string | undefined,
             confidence: (inst.lookup('confidence') as number) || 1.0,
             createdAt: new Date(),
@@ -562,13 +554,18 @@ export class KnowledgeService {
   }

   /**
-   * Discover all containerTags that have knowledge data and sync each to Neo4j.
-   * Called on startup so Neo4j always reflects the authoritative entity store.
+   * Full rebuild of Neo4j from SQLite/LanceDB on startup.
+   * Clears all Neo4j data first, then syncs each agent container.
+   * SQLite/LanceDB is the source of truth; Neo4j is a derived view.
    */
   private async syncAllContainersToNeo4j(): Promise<void> {
     if (!this.graphDb.isConnected()) return;

     try {
+      // Clear all existing Neo4j data — full rebuild from source of truth
+      await this.graphDb.clearAll();
+      logger.info('[KNOWLEDGE] Cleared all Neo4j data for full rebuild');
+
       const allNodes: Instance[] = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeEntity {isLatest? true}}`,
         undefined
@@ -586,7 +583,7 @@ export class KnowledgeService {
       }

       logger.info(
-        `[KNOWLEDGE] Syncing ${allNodes.length} nodes across ${containerTags.size} container(s) to Neo4j`
+        `[KNOWLEDGE] Rebuilding Neo4j: ${allNodes.length} nodes across ${containerTags.size} agent container(s)`
       );

       for (const tag of containerTags) {
@@ -613,7 +610,6 @@ export class KnowledgeService {
       existingNodeId,
       replacement,
       session.containerTag,
-      session.userId,
       'CONVERSATION',
       session.sessionId,
       undefined,
@@ -646,7 +642,6 @@ export class KnowledgeService {
       confidence?: number;
     }>,
     containerTag: string,
-    userId: string,
     env?: Environment
   ): Promise<Array<{ id: string; name: string; entityType: string }>> {
     if (!this.enabled || entities.length === 0) {
@@ -663,8 +658,7 @@ export class KnowledgeService {
             `name "${escapeString(entity.name)}", ` +
             `entityType "${escapeString(entity.entityType)}", ` +
             `sourceType "${entity.sourceType}", ` +
-            `containerTag "${escapeString(containerTag)}", ` +
-            `userId "${userId}", ` +
+            `__tenant__ "${escapeString(containerTag)}", ` +
             `isLatest true, ` +
             `confidence ${entity.confidence || 1.0}` +
             (entity.description ? `, description "${escapeString(entity.description)}"` : '') +
@@ -708,7 +702,6 @@ export class KnowledgeService {
       sourceType: SourceType;
       containerTag: string;
     }>,
-    userId: string,
     env?: Environment
   ): Promise<Array<{ id: string; sourceId: string; targetId: string }>> {
     if (!this.enabled || edges.length === 0) {
@@ -727,8 +720,7 @@ export class KnowledgeService {
             `relType "${escapeString(edge.relType)}", ` +
             `weight ${edge.weight || 1.0}, ` +
             `sourceType "${edge.sourceType}", ` +
-            `containerTag "${escapeString(edge.containerTag)}", ` +
-            `userId "${userId}"` +
+            `__tenant__ "${escapeString(edge.containerTag)}"` +
             `}}`;

           const result = await parseAndEvaluateStatement(query, undefined, env);
@@ -824,7 +816,6 @@ export class KnowledgeService {
         await this.deduplicator.findOrCreateNode(
           { name: fact.content, entityType: 'Fact', description: fact.category },
           session.containerTag,
-          session.userId,
           session.containerTag,
           'CONVERSATION',
           session.sessionId,
@@ -836,15 +827,6 @@ export class KnowledgeService {
       }
     }
   }
-
-  private async resolveDocumentContainerTags(
-    _agentId: string | undefined,
-    _containerTag: string,
-    _userId: string
-  ): Promise<string[]> {
-    // Agent-level isolation: all knowledge in single container, no need to resolve multiple tags
-    return [];
-  }
 }

 export function getKnowledgeService(): KnowledgeService {
diff --git a/src/runtime/knowledge/utils.ts b/src/runtime/knowledge/utils.ts
index dfc05cf..e2928e5 100644
--- a/src/runtime/knowledge/utils.ts
+++ b/src/runtime/knowledge/utils.ts
@@ -32,7 +32,6 @@ export function instanceToGraphNode(inst: Instance): GraphNode {
     instanceId: inst.lookup('instanceId') as string | undefined,
     instanceType: inst.lookup('instanceType') as string | undefined,
     __tenant__: inst.lookup('__tenant__') as string,
-    userId: inst.lookup('userId') as string,
     agentId: inst.lookup('agentId') as string | undefined,
     confidence: (inst.lookup('confidence') as number) || 1.0,
     createdAt: new Date(),
diff --git a/src/runtime/modules/knowledge.ts b/src/runtime/modules/knowledge.ts
index b769b6c..4fa3502 100644
--- a/src/runtime/modules/knowledge.ts
+++ b/src/runtime/modules/knowledge.ts
@@ -15,7 +15,6 @@ entity KnowledgeEntity {
     instanceId String @optional,
     instanceType String @optional,
     __tenant__ String,
-    userId String @optional,
     agentId String @optional,
     confidence Float @default(1.0),
     isLatest Boolean @default(true),
@@ -31,7 +30,6 @@ entity KnowledgeEdge {
     weight Float @default(1.0),
     sourceType @enum("DOCUMENT", "CONVERSATION", "INSTANCE", "DERIVED") @optional,
     __tenant__ String,
-    userId String @optional,
     agentId String @optional,
     createdAt DateTime @default(now())
 }
@@ -64,7 +62,6 @@ entity DocumentGroup {
     name String,
     __tenant__ String,
     agentId String @optional,
-    userId String @optional,
     createdAt DateTime @default(now())
 }

@@ -77,7 +74,6 @@ entity Document {
     documentGroupId String @optional,
     __tenant__ String,
     agentId String @optional,
-    userId String @optional,
     createdAt DateTime @default(now()),
     @meta {"fullTextSearch": ["title", "content"]}
 }
diff --git a/src/runtime/resolvers/sqldb/database.ts b/src/runtime/resolvers/sqldb/database.ts
index 0e8f7cb..86901c7 100644
--- a/src/runtime/resolvers/sqldb/database.ts
+++ b/src/runtime/resolvers/sqldb/database.ts
@@ -261,13 +261,25 @@ function makeSqliteDataSource(
     try {
       const driver = ds.driver as any;
       const db = driver.databaseConnection || driver.nativeDatabase;
-      // Enable WAL mode for better concurrent read performance
+      // Enable WAL mode and additional pragmas for better write performance
       if (db?.pragma) {
         db.pragma('journal_mode = WAL');
-        logger.info('SQLite WAL mode enabled');
+
+        const syncMode = process.env.SQLITE_SYNC_MODE || 'NORMAL';
+        const busyTimeout = process.env.SQLITE_BUSY_TIMEOUT || '5000';
+        const cacheSize = process.env.SQLITE_CACHE_SIZE || '-20000';
+
+        db.pragma(`synchronous = ${syncMode}`);
+        db.pragma(`busy_timeout = ${busyTimeout}`);
+        db.pragma(`cache_size = ${cacheSize}`);
+        db.pragma('temp_store = MEMORY');
+
+        logger.info(
+          `SQLite pragmas enabled: WAL mode, synchronous=${syncMode}, busy_timeout=${busyTimeout}, cache_size=${cacheSize}, temp_store=MEMORY`
+        );
       }
     } catch (err: any) {
-      logger.warn(`Failed to enable WAL mode: ${err.message}.`);
+      logger.warn(`Failed to enable SQLite pragmas: ${err.message}.`);
     }
     return res;
   };
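The pragma block above can be factored into a pure function, which makes the defaults easy to test without opening a database. A sketch (defaults mirror the diff: `NORMAL` synchronous mode, a 5-second busy timeout, and a roughly 20 MB page cache expressed as a negative kibibyte value, per SQLite's `cache_size` convention):

```typescript
// Sketch of the pragma resolution added in the diff above. Each returned
// string would be passed to better-sqlite3's db.pragma(...), as the patch does.
function resolveSqlitePragmas(env: Record<string, string | undefined>): string[] {
  const syncMode = env.SQLITE_SYNC_MODE || 'NORMAL';
  const busyTimeout = env.SQLITE_BUSY_TIMEOUT || '5000';
  const cacheSize = env.SQLITE_CACHE_SIZE || '-20000'; // negative = size in KiB, not pages
  return [
    'journal_mode = WAL',
    `synchronous = ${syncMode}`,
    `busy_timeout = ${busyTimeout}`,
    `cache_size = ${cacheSize}`,
    'temp_store = MEMORY',
  ];
}
```

Passing `process.env` at the call site keeps the behavior identical to the patch while letting tests supply overrides explicitly.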
diff --git a/test/runtime/memory.test.ts b/test/runtime/memory.test.ts
index ca044e6..3f2fa84 100644
--- a/test/runtime/memory.test.ts
+++ b/test/runtime/memory.test.ts
@@ -60,8 +60,7 @@ describe('Knowledge Graph Memory System', () => {
           name: 'Alice',
           entityType: 'Person',
           description: 'Main character',
-          containerTag: 'test',
-          userId: 'u1',
+          __tenant__: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -73,8 +72,7 @@ describe('Knowledge Graph Memory System', () => {
           name: 'Wonderland',
           entityType: 'Location',
           description: 'Fantasy world',
-          containerTag: 'test',
-          userId: 'u1',
+          __tenant__: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -312,18 +310,17 @@ describe('Knowledge Graph Memory System', () => {
       expect(session.sessionId).toBeDefined();
       expect(session.userId).toBe('test-user-001');
       expect(session.agentId).toBe('supportAgent');
-      expect(session.containerTag).toBe(`${agentFqName}:test-user-001`);
+      expect(session.containerTag).toBe(agentFqName);

       const context = await knowledgeService.buildContext(
         'Hello',
-        session.containerTag,
-        session.userId
+        session.containerTag
       );
       expect(context).toBeDefined();
       expect(typeof context.contextString).toBe('string');
     });

-    test('isolates sessions between users', async () => {
+    test('shares knowledge container across users (agent-level isolation)', async () => {
       await doInternModule('KGSessionIsolation', `entity Data { id Int @id }`);
       const knowledgeService = getKnowledgeService();
       const agentFqName = 'KGSessionIsolation/testAgent';
@@ -339,12 +336,14 @@ describe('Knowledge Graph Memory System', () => {
         agentFqName
       );

-      expect(session1.containerTag).not.toBe(session2.containerTag);
-      expect(session1.containerTag).toContain('user-alpha');
-      expect(session2.containerTag).toContain('user-beta');
+      // Both users share the same agent-level container
+      expect(session1.containerTag).toBe(session2.containerTag);
+      expect(session1.containerTag).toBe(agentFqName);
+      // Sessions are still per-user
+      expect(session1.sessionId).not.toBe(session2.sessionId);
     });

-    test('formats containerTag as agentFqName:userId', async () => {
+    test('formats containerTag as agentFqName (agent-only)', async () => {
       await doInternModule('KGFormatTest', `entity X { id Int @id }`);
       const knowledgeService = getKnowledgeService();
       const agentFqName = 'KGFormatTest/testAgent';
@@ -355,7 +354,7 @@ describe('Knowledge Graph Memory System', () => {
         agentFqName
       );

-      expect(session.containerTag).toBe('KGFormatTest/testAgent:test-user');
+      expect(session.containerTag).toBe('KGFormatTest/testAgent');
     });
   });
 });
Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/src/runtime/knowledge/extractor.ts b/src/runtime/knowledge/extractor.ts
index 1355d05..0c4a0ff 100644
--- a/src/runtime/knowledge/extractor.ts
+++ b/src/runtime/knowledge/extractor.ts
@@ -248,19 +248,21 @@ const INVALID_ENTITY_PATTERNS = [
   /^[a-zA-Z]\d?$/, // Single letter + optional digit (A1, B2, etc.)
   /^[\p{P}\s]+$/u, // Pure punctuation
   /^\d+$/, // Numbers only
-  /^chapter\s+[ivxlcdm]+$/i, // Chapter I, Chapter IV, etc.
 ];

 const INVALID_ENTITY_NAMES = new Set([
-  'chapter',
-  'chapters',
   'section',
   'sections',
   'page',
   'pages',
-  'volume',
-  'book',
   'part',
+  'paragraph',
+  'paragraphs',
+  'header',
+  'footer',
+  'appendix',
+  'introduction',
+  'conclusion',
 ]);

 // Common grammatical categories to skip
@@ -336,35 +338,50 @@ function normalizeEntityType(type: string): string {
   const normalized = type.trim().toLowerCase();
   switch (normalized) {
     case 'person':
-    case 'character':
-    case 'animal':
-    case 'creature':
+    case 'individual':
+    case 'contact':
       return 'Person';
     case 'organization':
     case 'org':
     case 'company':
     case 'institution':
+    case 'business':
       return 'Organization';
     case 'location':
     case 'place':
-    case 'setting':
+    case 'address':
+    case 'site':
       return 'Location';
     case 'event':
     case 'occasion':
+    case 'meeting':
       return 'Event';
     case 'role':
     case 'title':
+    case 'position':
       return 'Role';
     case 'product':
-    case 'artifact':
-    case 'object':
+    case 'item':
+    case 'sku':
       return 'Product';
+    case 'customer':
+    case 'client':
+    case 'account':
+      return 'Customer';
+    case 'contract':
+    case 'agreement':
+    case 'deal':
+      return 'Contract';
+    case 'policy':
+    case 'procedure':
+    case 'guideline':
+      return 'Policy';
     case 'concept':
     case 'idea':
     case 'topic':
       return 'Concept';
     default:
-      return 'Concept';
+      return normalized.charAt(0).toUpperCase() + normalized.slice(1);
   }
 }
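The behavioral change worth flagging in this hunk is the fallback: unknown types are no longer collapsed to `Concept` but title-cased and kept as-is. A table-driven equivalent of the switch (a sketch for review, not the shipped code; the alias lists mirror the diff):

```typescript
// Sketch equivalent of the normalizeEntityType switch above, as a lookup
// table. The fallback title-cases unrecognized types instead of collapsing
// them to "Concept", so novel enterprise types survive normalization.
const TYPE_ALIASES: Record<string, string> = {
  person: 'Person', individual: 'Person', contact: 'Person',
  organization: 'Organization', org: 'Organization', company: 'Organization',
  institution: 'Organization', business: 'Organization',
  location: 'Location', place: 'Location', address: 'Location', site: 'Location',
  event: 'Event', occasion: 'Event', meeting: 'Event',
  role: 'Role', title: 'Role', position: 'Role',
  product: 'Product', item: 'Product', sku: 'Product',
  customer: 'Customer', client: 'Customer', account: 'Customer',
  contract: 'Contract', agreement: 'Contract', deal: 'Contract',
  policy: 'Policy', procedure: 'Policy', guideline: 'Policy',
  concept: 'Concept', idea: 'Concept', topic: 'Concept',
};

function normalizeEntityType(type: string): string {
  const normalized = type.trim().toLowerCase();
  return (
    TYPE_ALIASES[normalized] ||
    normalized.charAt(0).toUpperCase() + normalized.slice(1)
  );
}
```

One consequence of the new fallback: an LLM that invents type strings can now introduce arbitrary node types into the graph, where the old code funneled everything unrecognized into `Concept`.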

diff --git a/src/runtime/knowledge/service.ts b/src/runtime/knowledge/service.ts
index 6915871..0b9a65d 100644
--- a/src/runtime/knowledge/service.ts
+++ b/src/runtime/knowledge/service.ts
@@ -11,6 +11,9 @@ import { parseAndEvaluateStatement, Environment } from '../interpreter.js';
 import { escapeString } from './utils.js';
 import type { Instance } from '../module.js';
 import type { Document, KnowledgeContext, ProcessingResult, SourceType } from '../graph/types.js';
+import { insertRows, DbContext } from '../resolvers/sqldb/database.js';
+import { DefaultAuthInfo } from '../resolvers/authinfo.js';
+import crypto from 'crypto';

 // Maximum number of session messages to retain per session (prevents database bloat)
 const MAX_KNOWLEDGE_SESSION_MESSAGES = parseInt(
@@ -628,7 +631,7 @@ export class KnowledgeService {

   /**
    * Batch insert multiple knowledge entities in a single transaction.
-   * Significantly faster than individual inserts for large document processing.
+   * Uses database-level bulk insert for significantly better performance.
    */
   async batchCreateEntities(
     entities: Array<{
@@ -649,49 +652,68 @@ export class KnowledgeService {
     }

     const results: Array<{ id: string; name: string; entityType: string }> = [];
+    const tableName = 'agentlang_knowledge_KnowledgeEntity';

     try {
-      await (env || new Environment()).callInTransaction(async () => {
-        for (const entity of entities) {
-          const query =
-            `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
-            `name "${escapeString(entity.name)}", ` +
-            `entityType "${escapeString(entity.entityType)}", ` +
-            `sourceType "${entity.sourceType}", ` +
-            `__tenant__ "${escapeString(containerTag)}", ` +
-            `isLatest true, ` +
-            `confidence ${entity.confidence || 1.0}` +
-            (entity.description ? `, description "${escapeString(entity.description)}"` : '') +
-            (entity.sourceId ? `, sourceId "${escapeString(entity.sourceId)}"` : '') +
-            (entity.sourceChunk
-              ? `, sourceChunk "${escapeString(entity.sourceChunk.substring(0, 500))}"`
-              : '') +
-            (entity.agentId ? `, agentId "${escapeString(entity.agentId)}"` : '') +
-            `}}`;
-
-          const result = await parseAndEvaluateStatement(query, undefined, env);
-          const inst = Array.isArray(result) ? result[0] : result;
-          if (inst) {
-            results.push({
-              id: inst.lookup('id') as string,
-              name: inst.lookup('name') as string,
-              entityType: inst.lookup('entityType') as string,
-            });
-          }
+      const activeEnv = env || new Environment();
+
+      await activeEnv.callInTransaction(async () => {
+        // Build array of row objects for bulk insert
+        const rows = entities.map(entity => {
+          const id = crypto.randomUUID();
+          const path = `${CoreKnowledgeModuleName}$KnowledgeEntity/${id}`;
+
+          return {
+            id,
+            name: entity.name,
+            entityType: entity.entityType,
+            description: entity.description || null,
+            sourceType: entity.sourceType,
+            sourceId: entity.sourceId || null,
+            sourceChunk: entity.sourceChunk ? entity.sourceChunk.substring(0, 500) : null,
+            instanceId: null,
+            instanceType: null,
+            __tenant__: containerTag,
+            agentId: entity.agentId || null,
+            confidence: entity.confidence || 1.0,
+            isLatest: true,
+            embedding: null,
+            __path__: path,
+          };
+        });
+
+        // Create DbContext with current transaction
+        const ctx = new DbContext(
+          `${CoreKnowledgeModuleName}/KnowledgeEntity`,
+          DefaultAuthInfo,
+          activeEnv,
+          activeEnv.getActiveTransactions().get('sqldb')
+        );
+
+        // Perform bulk insert
+        await insertRows(tableName, rows, ctx, false);
+
+        // Build results from inserted rows
+        for (let i = 0; i < rows.length; i++) {
+          results.push({
+            id: rows[i].id,
+            name: rows[i].name,
+            entityType: rows[i].entityType,
+          });
         }
       });

-      logger.info(`[KNOWLEDGE] Batch created ${results.length} entities in single transaction`);
+      logger.info(`[KNOWLEDGE] Bulk inserted ${results.length} entities in single transaction`);
       return results;
     } catch (err) {
-      logger.error(`[KNOWLEDGE] Batch entity creation failed: ${err}`);
+      logger.error(`[KNOWLEDGE] Bulk entity creation failed: ${err}`);
       throw err;
     }
   }

   /**
    * Batch insert multiple knowledge edges in a single transaction.
-   * Significantly faster than individual inserts for large document processing.
+   * Uses database-level bulk insert for significantly better performance.
    */
   async batchCreateEdges(
     edges: Array<{
@@ -709,36 +731,56 @@ export class KnowledgeService {
     }

     const results: Array<{ id: string; sourceId: string; targetId: string }> = [];
+    const tableName = 'agentlang_knowledge_KnowledgeEdge';

     try {
-      await (env || new Environment()).callInTransaction(async () => {
-        for (const edge of edges) {
-          const query =
-            `{${CoreKnowledgeModuleName}/KnowledgeEdge {` +
-            `sourceId "${edge.sourceId}", ` +
-            `targetId "${edge.targetId}", ` +
-            `relType "${escapeString(edge.relType)}", ` +
-            `weight ${edge.weight || 1.0}, ` +
-            `sourceType "${edge.sourceType}", ` +
-            `__tenant__ "${escapeString(edge.containerTag)}"` +
-            `}}`;
-
-          const result = await parseAndEvaluateStatement(query, undefined, env);
-          const inst = Array.isArray(result) ? result[0] : result;
-          if (inst) {
-            results.push({
-              id: inst.lookup('id') as string,
-              sourceId: inst.lookup('sourceId') as string,
-              targetId: inst.lookup('targetId') as string,
-            });
-          }
+      const activeEnv = env || new Environment();
+
+      await activeEnv.callInTransaction(async () => {
+        // Build array of row objects for bulk insert
+        const rows = edges.map(edge => {
+          const id = crypto.randomUUID();
+          const path = `${CoreKnowledgeModuleName}$KnowledgeEdge/${id}`;
+
+          return {
+            id,
+            sourceId: edge.sourceId,
+            targetId: edge.targetId,
+            relType: edge.relType,
+            weight: edge.weight || 1.0,
+            sourceType: edge.sourceType,
+            __tenant__: edge.containerTag,
+            agentId: null,
+            createdAt: new Date().toISOString(),
+            __path__: path,
+          };
+        });
+
+        // Create DbContext with current transaction
+        const ctx = new DbContext(
+          `${CoreKnowledgeModuleName}/KnowledgeEdge`,
+          DefaultAuthInfo,
+          activeEnv,
+          activeEnv.getActiveTransactions().get('sqldb')
+        );
+
+        // Perform bulk insert
+        await insertRows(tableName, rows, ctx, false);
+
+        // Build results from inserted rows
+        for (let i = 0; i < rows.length; i++) {
+          results.push({
+            id: rows[i].id,
+            sourceId: rows[i].sourceId,
+            targetId: rows[i].targetId,
+          });
         }
       });

-      logger.info(`[KNOWLEDGE] Batch created ${results.length} edges in single transaction`);
+      logger.info(`[KNOWLEDGE] Bulk inserted ${results.length} edges in single transaction`);
       return results;
     } catch (err) {
-      logger.error(`[KNOWLEDGE] Batch edge creation failed: ${err}`);
+      logger.error(`[KNOWLEDGE] Bulk edge creation failed: ${err}`);
       throw err;
     }
   }
diff --git a/src/runtime/knowledge/utils.ts b/src/runtime/knowledge/utils.ts
index e2928e5..711ad0a 100644
--- a/src/runtime/knowledge/utils.ts
+++ b/src/runtime/knowledge/utils.ts
@@ -2,12 +2,15 @@ import type { GraphNode, SourceType } from '../graph/types.js';
 import type { Instance } from '../module.js';

 export const TYPE_PRIORITY: Record<string, number> = {
-  Person: 6,
-  Organization: 5,
+  Customer: 10,
+  Contract: 9,
+  Person: 8,
+  Organization: 7,
+  Product: 6,
   Location: 5,
   Event: 4,
-  Role: 3,
-  Product: 2,
+  Policy: 3,
+  Role: 2,
   Concept: 1,
 };

diff --git a/src/runtime/resolvers/sqldb/database.ts b/src/runtime/resolvers/sqldb/database.ts
index 86901c7..155c5e8 100644
--- a/src/runtime/resolvers/sqldb/database.ts
+++ b/src/runtime/resolvers/sqldb/database.ts
@@ -604,7 +604,9 @@ export async function vectorStoreSearch(
         logger.warn(`[VECTOR] LanceDB store not found for ${tableName}`);
         return [];
       }
-      const results = await store.search(searchVec, tenantId, limit);
+      // Extract agentId from resourceFqName for agent-level filtering
+      const agentId = ctx.resourceFqName || undefined;
+      const results = await store.search(searchVec, tenantId, agentId, limit);
       return results.map(r => ({ id: r.id }));
     }

diff --git a/src/runtime/resolvers/vector/lancedb-store.ts b/src/runtime/resolvers/vector/lancedb-store.ts
index 64156da..594f651 100644
--- a/src/runtime/resolvers/vector/lancedb-store.ts
+++ b/src/runtime/resolvers/vector/lancedb-store.ts
@@ -93,6 +93,7 @@ export class LanceDBVectorStore implements VectorStore {
   async search(
     embedding: number[],
     tenantId?: string,
+    agentId?: string,
     limit: number = 10
   ): Promise<SearchResult[]> {
     if (!this.table) {
@@ -102,8 +103,23 @@ export class LanceDBVectorStore implements VectorStore {
     try {
       let query = this.table.vectorSearch(embedding).limit(limit);

+      // Build filter conditions for agent-level isolation
+      const filters: string[] = [];
+
       if (tenantId) {
-        query = query.where(`tenantId = '${tenantId}'`);
+        // Escape single quotes; this builds a literal filter string,
+        // not a true driver-parameterized query, so it only mitigates injection
+        const escapedTenantId = tenantId.replace(/'/g, "''");
+        filters.push(`tenantId = '${escapedTenantId}'`);
+      }
+
+      if (agentId) {
+        // Add agent-level filtering for strict agent isolation
+        const escapedAgentId = agentId.replace(/'/g, "''");
+        filters.push(`agentId = '${escapedAgentId}'`);
+      }
+
+      if (filters.length > 0) {
+        query = query.where(filters.join(' AND '));
       }

       const results = await query.toArray();
diff --git a/src/runtime/resolvers/vector/types.ts b/src/runtime/resolvers/vector/types.ts
index 5ff9077..e6faa3f 100644
--- a/src/runtime/resolvers/vector/types.ts
+++ b/src/runtime/resolvers/vector/types.ts
@@ -20,7 +20,12 @@ export interface VectorStore {
   init(): Promise<void>;
   addEmbedding(record: VectorRecord): Promise<void>;
   addEmbeddings(records: VectorRecord[]): Promise<void>;
-  search(embedding: number[], tenantId?: string, limit?: number): Promise<SearchResult[]>;
+  search(
+    embedding: number[],
+    tenantId?: string,
+    agentId?: string,
+    limit?: number
+  ): Promise<SearchResult[]>;
   delete(id: string): Promise<void>;
   exists(id: string): Promise<boolean>;
   close(): Promise<void>;
Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/src/runtime/defs.ts b/src/runtime/defs.ts
index 930955b..fdaa39d 100644
--- a/src/runtime/defs.ts
+++ b/src/runtime/defs.ts
@@ -4,7 +4,8 @@ export const PathAttributeName: string = '__path__';
 export const PathAttributeNameQuery: string = '__path__?';
 export const ParentAttributeName: string = '__parent__';
 export const DeletedFlagAttributeName: string = '__is_deleted__';
-export const TenantAttributeName: string = '__tenant__';
+export const AgentIdAttributeName: string = 'agentId';
+export const TenantAttributeName: string = 'agentId';

 export function isPathAttribute(n: string): boolean {
   return n.startsWith(PathAttributeName);
diff --git a/src/runtime/graph/database.ts b/src/runtime/graph/database.ts
index ed5bf79..d31b23a 100644
--- a/src/runtime/graph/database.ts
+++ b/src/runtime/graph/database.ts
@@ -9,7 +9,7 @@ export interface GraphDatabase {
   createNode(node: GraphNode): Promise<string>;
   upsertNode(node: GraphNode): Promise<string>;
   findNodeById(id: string): Promise<GraphNode | null>;
-  findNodesByContainer(__tenant__: string): Promise<GraphNode[]>;
+  findNodesByContainer(agentId: string): Promise<GraphNode[]>;
   updateNode(id: string, updates: Partial<GraphNode>): Promise<void>;
   deleteNode(id: string): Promise<void>;

@@ -21,14 +21,10 @@ export interface GraphDatabase {
   deleteEdgesForNode(nodeId: string): Promise<void>;

   // Traversal
-  expandGraph(
-    seedIds: string[],
-    maxDepth: number,
-    __tenant__: string
-  ): Promise<GraphTraversalResult>;
+  expandGraph(seedIds: string[], maxDepth: number, agentId: string): Promise<GraphTraversalResult>;

   // Bulk operations
   clearAll(): Promise<void>;
-  clearContainer(__tenant__: string): Promise<void>;
-  getStats(__tenant__?: string): Promise<{ nodeCount: number; edgeCount: number }>;
+  clearContainer(agentId: string): Promise<void>;
+  getStats(agentId?: string): Promise<{ nodeCount: number; edgeCount: number }>;
 }
diff --git a/src/runtime/graph/neo4j-browser.ts b/src/runtime/graph/neo4j-browser.ts
index 750f6d5..6e7cfc8 100644
--- a/src/runtime/graph/neo4j-browser.ts
+++ b/src/runtime/graph/neo4j-browser.ts
@@ -62,7 +62,7 @@ export class Neo4jBrowserDatabase implements GraphDatabase {
     throw new Error('Neo4j Browser: findNodeById not yet implemented');
   }

-  async findNodesByContainer(__tenant__: string): Promise<GraphNode[]> {
+  async findNodesByContainer(_agentId: string): Promise<GraphNode[]> {
     throw new Error('Neo4j Browser: findNodesByContainer not yet implemented');
   }

@@ -97,7 +97,7 @@ export class Neo4jBrowserDatabase implements GraphDatabase {
   async expandGraph(
     _seedIds: string[],
     _maxDepth: number,
-    __tenant__: string
+    _agentId: string
   ): Promise<GraphTraversalResult> {
     throw new Error('Neo4j Browser: expandGraph not yet implemented');
   }
@@ -106,11 +106,11 @@ export class Neo4jBrowserDatabase implements GraphDatabase {
     throw new Error('Neo4j Browser: clearAll not yet implemented');
   }

-  async clearContainer(__tenant__: string): Promise<void> {
+  async clearContainer(_agentId: string): Promise<void> {
     throw new Error('Neo4j Browser: clearContainer not yet implemented');
   }

-  async getStats(__tenant__?: string): Promise<{ nodeCount: number; edgeCount: number }> {
+  async getStats(_agentId?: string): Promise<{ nodeCount: number; edgeCount: number }> {
     throw new Error('Neo4j Browser: getStats not yet implemented');
   }
 }
diff --git a/src/runtime/graph/neo4j.ts b/src/runtime/graph/neo4j.ts
index b23a28c..2aa8171 100644
--- a/src/runtime/graph/neo4j.ts
+++ b/src/runtime/graph/neo4j.ts
@@ -73,7 +73,7 @@ export class Neo4jDatabase implements GraphDatabase {
           id: $id, name: $name, entityType: $entityType, description: $description,
           sourceType: $sourceType, sourceId: $sourceId, sourceChunk: $sourceChunk,
           instanceId: $instanceId, instanceType: $instanceType,
-          __tenant__: $containerTag, agentId: $agentId,
+          agentId: $agentId,
           confidence: $confidence, isLatest: $isLatest,
           createdAt: datetime(), updatedAt: datetime()
         }) RETURN n.id AS id`,
@@ -87,7 +87,7 @@ export class Neo4jDatabase implements GraphDatabase {
           sourceChunk: node.sourceChunk || null,
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
-          containerTag: node.__tenant__,
-          agentId: node.agentId || null,
+          agentId: node.agentId,
           confidence: node.confidence,
           isLatest: node.isLatest,
@@ -107,7 +107,7 @@ export class Neo4jDatabase implements GraphDatabase {
          SET n.name = $name, n.entityType = $entityType, n.description = $description,
              n.sourceType = $sourceType, n.sourceId = $sourceId, n.sourceChunk = $sourceChunk,
              n.instanceId = $instanceId, n.instanceType = $instanceType,
-             n.__tenant__ = $containerTag, n.agentId = $agentId,
+             n.agentId = $agentId,
              n.confidence = $confidence, n.isLatest = $isLatest,
              n.updatedAt = datetime(),
              n.createdAt = coalesce(n.createdAt, datetime())
@@ -122,7 +122,7 @@ export class Neo4jDatabase implements GraphDatabase {
           sourceChunk: node.sourceChunk || null,
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
-          containerTag: node.__tenant__,
-          agentId: node.agentId || null,
+          agentId: node.agentId,
           confidence: node.confidence,
           isLatest: node.isLatest,
@@ -145,11 +145,11 @@ export class Neo4jDatabase implements GraphDatabase {
     }
   }

-  async findNodesByContainer(__tenant__: string): Promise<GraphNode[]> {
+  async findNodesByContainer(agentId: string): Promise<GraphNode[]> {
     const session = this.driver.session();
     try {
-      const result = await session.run('MATCH (n:KnowledgeEntity {__tenant__: $tenant}) RETURN n', {
-        tenant: __tenant__,
+      const result = await session.run('MATCH (n:KnowledgeEntity {agentId: $tenant}) RETURN n', {
+        tenant: agentId,
       });
       return result.records.map((r: any) => this.recordToNode(r.get('n')));
     } finally {
@@ -287,20 +287,20 @@ export class Neo4jDatabase implements GraphDatabase {
   async expandGraph(
     seedIds: string[],
     maxDepth: number,
-    __tenant__: string
+    agentId: string
   ): Promise<GraphTraversalResult> {
     const session = this.driver.session();
     try {
       const result = await session.run(
         `MATCH (seed:KnowledgeEntity)
-         WHERE seed.id IN $seedIds AND seed.__tenant__ = $tenant
+         WHERE seed.id IN $seedIds AND seed.agentId = $tenant
          CALL apoc.path.subgraphAll(seed, {maxLevel: $maxDepth, labelFilter: 'KnowledgeEntity'})
          YIELD nodes, relationships
          UNWIND nodes AS n
          UNWIND relationships AS r
          RETURN DISTINCT n, startNode(r).id AS srcId, endNode(r).id AS tgtId,
                 type(r) AS relType, r.weight AS weight`,
-        { seedIds, maxDepth, tenant: __tenant__ }
+        { seedIds, maxDepth, tenant: agentId }
       );

       const nodeMap = new Map<string, GraphNode>();
@@ -327,7 +327,7 @@ export class Neo4jDatabase implements GraphDatabase {
       return { nodes: Array.from(nodeMap.values()), edges };
     } catch {
       // Fallback to manual BFS if APOC is not available
-      return await this.expandGraphBFS(seedIds, maxDepth, __tenant__, session);
+      return await this.expandGraphBFS(seedIds, maxDepth, agentId, session);
     } finally {
       await session.close();
     }
@@ -336,7 +336,7 @@ export class Neo4jDatabase implements GraphDatabase {
   private async expandGraphBFS(
     seedIds: string[],
     maxDepth: number,
-    __tenant__: string,
+    agentId: string,
     session: any
   ): Promise<GraphTraversalResult> {
     const visited = new Set<string>();
@@ -347,10 +347,10 @@ export class Neo4jDatabase implements GraphDatabase {
     for (let depth = 0; depth <= maxDepth && currentIds.length > 0; depth++) {
       const result = await session.run(
         `MATCH (n:KnowledgeEntity)
-         WHERE n.id IN $ids AND n.__tenant__ = $tenant
-         OPTIONAL MATCH (n)-[r]->(m:KnowledgeEntity {__tenant__: $tenant})
+         WHERE n.id IN $ids AND n.agentId = $tenant
+         OPTIONAL MATCH (n)-[r]->(m:KnowledgeEntity {agentId: $tenant})
          RETURN n, m, r, type(r) AS relType, r.weight AS weight`,
-        { ids: currentIds, tenant: __tenant__ }
+        { ids: currentIds, tenant: agentId }
       );

       const nextIds: string[] = [];
@@ -393,28 +393,28 @@ export class Neo4jDatabase implements GraphDatabase {
     }
   }

-  async clearContainer(__tenant__: string): Promise<void> {
+  async clearContainer(agentId: string): Promise<void> {
     const session = this.driver.session();
     try {
-      await session.run('MATCH (n:KnowledgeEntity {__tenant__: $tenant}) DETACH DELETE n', {
-        tenant: __tenant__,
+      await session.run('MATCH (n:KnowledgeEntity {agentId: $tenant}) DETACH DELETE n', {
+        tenant: agentId,
       });
     } finally {
       await session.close();
     }
   }

-  async getStats(__tenant__?: string): Promise<{ nodeCount: number; edgeCount: number }> {
+  async getStats(agentId?: string): Promise<{ nodeCount: number; edgeCount: number }> {
     const session = this.driver.session();
     try {
-      const whereClause = __tenant__ ? 'WHERE n.__tenant__ = $tenant' : '';
+      const whereClause = agentId ? 'WHERE n.agentId = $tenant' : '';
       const nodeResult = await session.run(
         `MATCH (n:KnowledgeEntity) ${whereClause} RETURN count(n) AS cnt`,
-        __tenant__ ? { tenant: __tenant__ } : {}
+        agentId ? { tenant: agentId } : {}
       );
       const edgeResult = await session.run(
         `MATCH (n:KnowledgeEntity)${whereClause ? ' ' + whereClause : ''}-[r]->() RETURN count(r) AS cnt`,
-        __tenant__ ? { tenant: __tenant__ } : {}
+        agentId ? { tenant: agentId } : {}
       );
       return {
         nodeCount: nodeResult.records[0]?.get('cnt')?.toNumber?.() ?? 0,
@@ -437,7 +437,7 @@ export class Neo4jDatabase implements GraphDatabase {
       sourceChunk: props.sourceChunk,
       instanceId: props.instanceId,
       instanceType: props.instanceType,
-      __tenant__: props.__tenant__,
-      agentId: props.agentId,
+      agentId: props.agentId,
       confidence: props.confidence ?? 1.0,
       createdAt: props.createdAt ? new Date(props.createdAt) : new Date(),
diff --git a/src/runtime/graph/types.ts b/src/runtime/graph/types.ts
index cb1bcc7..9fcd134 100644
--- a/src/runtime/graph/types.ts
+++ b/src/runtime/graph/types.ts
@@ -11,8 +11,7 @@ export interface GraphNode {
   sourceChunk?: string;
   instanceId?: string;
   instanceType?: string;
-  __tenant__: string;
-  agentId?: string;
+  agentId: string;
   confidence: number;
   createdAt: Date;
   updatedAt: Date;
diff --git a/src/runtime/knowledge/context-builder.ts b/src/runtime/knowledge/context-builder.ts
index 16ab8fc..2217279 100644
--- a/src/runtime/knowledge/context-builder.ts
+++ b/src/runtime/knowledge/context-builder.ts
@@ -143,7 +143,7 @@ export class ContextBuilder {
       // Limit to MAX_SEED_NODES to prevent memory issues
       const result: Instance[] = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeEntity {
-          __tenant__? "${escapeString(tenantId)}",
+          agentId? "${escapeString(tenantId)}",
           name? "${escapeString(query)}"},
           @limit ${MAX_SEED_NODES}}`,
         undefined
@@ -174,7 +174,7 @@ export class ContextBuilder {
       try {
         const result: Instance[] = await parseAndEvaluateStatement(
           `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
-            `__tenant__? "${escapeString(tenantId)}", ` +
+            `agentId? "${escapeString(tenantId)}", ` +
             `name? "${escapeString(candidate)}"}, ` +
             `@limit ${MAX_SEED_NODES}}`,
           undefined
@@ -410,7 +410,7 @@ function mergeEdges(existing: GraphEdge[], incoming: GraphEdge[], limit: number)
 function groupNodesByContainer(nodes: GraphNode[]): Map<string, GraphNode[]> {
   const grouped = new Map<string, GraphNode[]>();
   for (const node of nodes) {
-    const tag = node.__tenant__ || '';
+    const tag = node.agentId || '';
     if (!grouped.has(tag)) grouped.set(tag, []);
     grouped.get(tag)!.push(node);
   }
diff --git a/src/runtime/knowledge/conversation-processor.ts b/src/runtime/knowledge/conversation-processor.ts
index 9d12cad..218ed28 100644
--- a/src/runtime/knowledge/conversation-processor.ts
+++ b/src/runtime/knowledge/conversation-processor.ts
@@ -193,7 +193,7 @@ export class ConversationProcessor {
           `relType "${escapeString(edge.relationship)}", ` +
           `weight ${edge.weight}, ` +
           `sourceType "CONVERSATION", ` +
-          `__tenant__ "${escapeString(containerTag)}"` +
+          `agentId "${escapeString(containerTag)}"` +
           (agentId ? `, agentId "${escapeString(agentId)}"` : '') +
           `}}`,
         undefined
diff --git a/src/runtime/knowledge/deduplicator.ts b/src/runtime/knowledge/deduplicator.ts
index 1bfa0be..1ecdba6 100644
--- a/src/runtime/knowledge/deduplicator.ts
+++ b/src/runtime/knowledge/deduplicator.ts
@@ -255,7 +255,7 @@ export class SemanticDeduplicator {
       // Query directly by name instead of scanning recent nodes
       // This is much more efficient and prevents duplicates
       const result: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEntity {name? "${escapeString(normalizedName)}", containerTag? "${escapeString(containerTag)}", __tenant__? "${escapeString(tenantId)}", isLatest? true}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {name? "${escapeString(normalizedName)}", containerTag? "${escapeString(containerTag)}", agentId? "${escapeString(tenantId)}", isLatest? true}}`,
         undefined
       );

@@ -280,7 +280,7 @@ export class SemanticDeduplicator {
       const result: Instance[] = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
           `containerTag? "${escapeString(containerTag)}", ` +
-          `__tenant__? "${escapeString(tenantId)}", ` +
+          `agentId? "${escapeString(tenantId)}", ` +
           `isLatest? true, ` +
           `name? "${escapeString(entity.name)}"}, ` +
           `@limit ${MAX_SIMILAR_CANDIDATES}}`,
@@ -453,7 +453,7 @@ export class SemanticDeduplicator {
       `name "${escapeString(entity.name)}", ` +
       `entityType "${escapeString(entity.entityType)}", ` +
       `sourceType "${sourceType}", ` +
-      `__tenant__ "${containerTag}", ` +
+      `agentId "${containerTag}", ` +
       `isLatest true, ` +
       `confidence 1.0`;

diff --git a/src/runtime/knowledge/document-processor.ts b/src/runtime/knowledge/document-processor.ts
index 41913eb..617f38d 100644
--- a/src/runtime/knowledge/document-processor.ts
+++ b/src/runtime/knowledge/document-processor.ts
@@ -389,7 +389,7 @@ export class DocumentProcessor {
           `relType "${escapeString(edge.relationship)}", ` +
           `weight ${edge.weight}, ` +
           `sourceType "DOCUMENT", ` +
-          `__tenant__ "${escapeString(containerTag)}"` +
+          `agentId "${escapeString(containerTag)}"` +
           (agentId ? `, agentId "${escapeString(agentId)}"` : '') +
           `}}`,
         undefined
diff --git a/src/runtime/knowledge/service.ts b/src/runtime/knowledge/service.ts
index 0b9a65d..42f8ee2 100644
--- a/src/runtime/knowledge/service.ts
+++ b/src/runtime/knowledge/service.ts
@@ -23,9 +23,7 @@ const MAX_KNOWLEDGE_SESSION_MESSAGES = parseInt(

 export interface KnowledgeSessionContext {
   sessionId: string;
-  userId: string;
   agentId: string;
-  containerTag: string; // Now agent-only: "agentFqName" (no user suffix)
 }

 let knowledgeServiceInstance: KnowledgeService | null = null;
@@ -108,24 +106,19 @@ export class KnowledgeService {

   // --- Session Management ---

-  async getOrCreateSession(
-    agentId: string,
-    userId: string,
-    agentFqName: string
-  ): Promise<KnowledgeSessionContext> {
+  async getOrCreateSession(agentFqName: string): Promise<KnowledgeSessionContext> {
     if (!this.enabled) {
       throw new Error('Knowledge base is disabled: no embedding API key configured');
     }

-    // Agent-level isolation only - all users share the same knowledge graph
-    const containerTag = agentFqName;
+    const agentId = agentFqName;

     try {
       const result = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeSession {
-          agentId? "${escapeString(agentId)}",
-          userId? "${escapeString(userId)}",
-          __tenant__? "${escapeString(containerTag)}"}}`,
+          agentId? "${escapeString(agentId)}"}}`,
         undefined
       );

@@ -133,17 +126,15 @@ export class KnowledgeService {
         const session = result[0];
         return {
           sessionId: session.lookup('id'),
-          userId: session.lookup('userId'),
           agentId: session.lookup('agentId'),
-          containerTag: session.lookup('__tenant__'),
         };
       }

-      logger.info(`[KNOWLEDGE] Creating new session for agentId=${agentId}, userId=${userId}`);
+      logger.info(`[KNOWLEDGE] Creating new session for agentId=${agentId}`);
       const sessionStatement = `{${CoreKnowledgeModuleName}/KnowledgeSession {
           agentId "${escapeString(agentId)}",
-          userId "${escapeString(userId)}",
-          __tenant__ "${escapeString(containerTag)}",
           messages "[]",
           createdAt now(),
           lastActivity now()}}`;
@@ -161,8 +152,8 @@ export class KnowledgeService {
       const queryResult = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeSession {
-          agentId? "${escapeString(agentId)}",
-          userId? "${escapeString(userId)}",
-          __tenant__? "${escapeString(containerTag)}"}}`,
+          agentId? "${escapeString(agentId)}"}}`,
         undefined
       );

@@ -171,9 +162,7 @@ export class KnowledgeService {
         logger.info(`[KNOWLEDGE] Session found after creation: ${session.lookup('id')}`);
         return {
           sessionId: session.lookup('id'),
-          userId: session.lookup('userId'),
           agentId: session.lookup('agentId'),
-          containerTag: session.lookup('__tenant__'),
         };
       }

@@ -489,7 +478,7 @@ export class KnowledgeService {
       await this.graphDb.clearContainer(containerTag);

       const nodeResults: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEntity {          __tenant__? "${escapeString(containerTag)}", isLatest? true}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {agentId? "${escapeString(agentId)}", isLatest? true}}`,
         undefined
       );

@@ -511,7 +500,7 @@ export class KnowledgeService {
             sourceChunk: inst.lookup('sourceChunk') as string | undefined,
             instanceId: inst.lookup('instanceId') as string | undefined,
             instanceType: inst.lookup('instanceType') as string | undefined,
-            __tenant__: inst.lookup('__tenant__') as string,
-            agentId: inst.lookup('agentId') as string | undefined,
+            agentId: inst.lookup('agentId') as string,
             confidence: (inst.lookup('confidence') as number) || 1.0,
             createdAt: new Date(),
@@ -526,7 +515,7 @@ export class KnowledgeService {
       }

       const edgeResults: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEdge {__tenant__? "${escapeString(containerTag)}"}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEdge {agentId? "${escapeString(agentId)}"}}`,
         undefined
       );

@@ -581,7 +570,7 @@ export class KnowledgeService {

       const containerTags = new Set<string>();
       for (const inst of allNodes) {
-        const tag = inst.lookup('__tenant__') as string;
+        const tag = inst.lookup('agentId') as string;
         if (tag) containerTags.add(tag);
       }

@@ -673,7 +662,7 @@ export class KnowledgeService {
             sourceChunk: entity.sourceChunk ? entity.sourceChunk.substring(0, 500) : null,
             instanceId: null,
             instanceType: null,
-            __tenant__: containerTag,
-            agentId: entity.agentId || null,
+            agentId: entity.agentId || containerTag,
             confidence: entity.confidence || 1.0,
             isLatest: true,
@@ -749,7 +738,7 @@ export class KnowledgeService {
             relType: edge.relType,
             weight: edge.weight || 1.0,
             sourceType: edge.sourceType,
-            __tenant__: edge.containerTag,
-            agentId: null,
+            agentId: edge.containerTag,
             createdAt: new Date().toISOString(),
             __path__: path,
diff --git a/src/runtime/knowledge/utils.ts b/src/runtime/knowledge/utils.ts
index 711ad0a..f0014cd 100644
--- a/src/runtime/knowledge/utils.ts
+++ b/src/runtime/knowledge/utils.ts
@@ -34,7 +34,7 @@ export function instanceToGraphNode(inst: Instance): GraphNode {
     sourceChunk: inst.lookup('sourceChunk') as string | undefined,
     instanceId: inst.lookup('instanceId') as string | undefined,
     instanceType: inst.lookup('instanceType') as string | undefined,
-    __tenant__: inst.lookup('__tenant__') as string,
-    agentId: inst.lookup('agentId') as string | undefined,
+    agentId: inst.lookup('agentId') as string,
     confidence: (inst.lookup('confidence') as number) || 1.0,
     createdAt: new Date(),
diff --git a/src/runtime/modules/knowledge.ts b/src/runtime/modules/knowledge.ts
index 4fa3502..a5de76c 100644
--- a/src/runtime/modules/knowledge.ts
+++ b/src/runtime/modules/knowledge.ts
@@ -14,8 +14,7 @@ entity KnowledgeEntity {
     sourceChunk String @optional,
     instanceId String @optional,
     instanceType String @optional,
-    __tenant__ String,
-    agentId String @optional,
+    agentId String,
     confidence Float @default(1.0),
     isLatest Boolean @default(true),
     embedding String @optional,
@@ -29,16 +28,13 @@ entity KnowledgeEdge {
     relType String,
     weight Float @default(1.0),
     sourceType @enum("DOCUMENT", "CONVERSATION", "INSTANCE", "DERIVED") @optional,
-    __tenant__ String,
-    agentId String @optional,
+    agentId String,
     createdAt DateTime @default(now())
 }

 entity KnowledgeSession {
     id UUID @id @default(uuid()),
-    userId String @optional,
     agentId String,
-    __tenant__ String,
     messages String,
     contextNodeIds String @optional,
     activeInstances String @optional,
@@ -60,8 +56,7 @@ entity SessionMessage {
 entity DocumentGroup {
     id UUID @id @default(uuid()),
     name String,
-    __tenant__ String,
-    agentId String @optional,
+    agentId String,
     createdAt DateTime @default(now())
 }

@@ -72,8 +67,7 @@ entity Document {
     sourceType @enum("FILE", "URL", "TEXT", "CONVERSATION"),
     sourceUrl String @optional,
     documentGroupId String @optional,
-    __tenant__ String,
-    agentId String @optional,
+    agentId String,
     createdAt DateTime @default(now()),
     @meta {"fullTextSearch": ["title", "content"]}
 }
diff --git a/src/runtime/resolvers/sqldb/database.ts b/src/runtime/resolvers/sqldb/database.ts
index 155c5e8..9dfd646 100644
--- a/src/runtime/resolvers/sqldb/database.ts
+++ b/src/runtime/resolvers/sqldb/database.ts
@@ -532,7 +532,7 @@ export async function addRowForFullTextSearch(
       await qb
         .insert()
         .into(vecTableName)
-        .values([{ id: id, embedding: pgvector.toSql(vect), __tenant__: tenantId }])
+        .values([{ id: id, embedding: pgvector.toSql(vect), agentId: tenantId }])
         .execute();
     }
     logger.info(`[VECTOR] Successfully saved embedding to ${vecTableName} for ${id}`);
@@ -872,7 +872,7 @@ async function createLimitedOwnership(
       r: perms.has(RbacPermissionFlag.READ),
       d: perms.has(RbacPermissionFlag.DELETE),
       u: perms.has(RbacPermissionFlag.UPDATE),
-      __tenant__: tenantId,
+      agentId: tenantId,
     });
   });
   const tname = ownersTable(tableName);
diff --git a/package-lock.json b/package-lock.json
index 791fb7e..f4bdfb4 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -27,6 +27,8 @@
         "buffer": "^6.0.3",
         "chalk": "~5.6.2",
         "commander": "~14.0.2",
+        "date-fns": "^4.1.0",
+        "date-fns-tz": "^3.2.0",
         "deep-object-diff": "^1.1.9",
         "express": "^5.2.1",
         "handlebars": "^4.7.8",
@@ -45,6 +47,7 @@
         "vscode-languageserver": "^9.0.1",
         "winston": "^3.19.0",
         "winston-daily-rotate-file": "^5.0.0",
+        "xlsx": "^0.18.5",
         "zod": "^4.3.6"
       },
       "bin": {
@@ -4281,6 +4284,15 @@
         "acorn": "^6.0.0 || ^7.0.0 || ^8.0.0"
       }
     },
+    "node_modules/adler-32": {
+      "version": "1.3.1",
+      "resolved": "https://registry.npmjs.org/adler-32/-/adler-32-1.3.1.tgz",
+      "integrity": "sha512-ynZ4w/nUUv5rrsR8UUGoe1VC9hZj6V5hU9Qw1HlMDJGEJw5S7TfTErWTjMys6M7vr0YWcPqs3qAr4ss0nDfP+A==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/ajv": {
       "version": "8.17.1",
       "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
@@ -5097,6 +5109,19 @@
         "url": "https://github.com/sponsors/sindresorhus"
       }
     },
+    "node_modules/cfb": {
+      "version": "1.2.2",
+      "resolved": "https://registry.npmjs.org/cfb/-/cfb-1.2.2.tgz",
+      "integrity": "sha512-KfdUZsSOw19/ObEWasvBP/Ac4reZvAGauZhs6S/gqNhXhI7cKwvlH7ulj+dOEYnca4bm4SGo8C1bTAQvnTjgQA==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "adler-32": "~1.3.0",
+        "crc-32": "~1.2.0"
+      },
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/chai": {
       "version": "6.2.2",
       "resolved": "https://registry.npmjs.org/chai/-/chai-6.2.2.tgz",
@@ -5321,6 +5346,15 @@
         "node": ">=12"
       }
     },
+    "node_modules/codepage": {
+      "version": "1.15.0",
+      "resolved": "https://registry.npmjs.org/codepage/-/codepage-1.15.0.tgz",
+      "integrity": "sha512-3g6NUTPd/YtuuGrhMnOMRjFc+LJw/bnMp3+0r/Wcz3IXUuCosKRJvMphm5+Q+bvTVGcJJuRvVLuYba+WojaFaA==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/color": {
       "version": "5.0.3",
       "resolved": "https://registry.npmjs.org/color/-/color-5.0.3.tgz",
@@ -5650,6 +5684,18 @@
         "node": ">= 0.4.0"
       }
     },
+    "node_modules/crc-32": {
+      "version": "1.2.2",
+      "resolved": "https://registry.npmjs.org/crc-32/-/crc-32-1.2.2.tgz",
+      "integrity": "sha512-ROmzCKrTnOwybPcJApAA6WBWij23HVfGVNKqqrZpuyZOHqK2CwHSvpGuyt/UNNvaIjEd8X5IFGp4Mh+Ie1IHJQ==",
+      "license": "Apache-2.0",
+      "bin": {
+        "crc32": "bin/crc32.njs"
+      },
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/create-ecdh": {
       "version": "4.0.4",
       "resolved": "https://registry.npmjs.org/create-ecdh/-/create-ecdh-4.0.4.tgz",
@@ -6927,6 +6973,15 @@
         "node": ">= 0.6"
       }
     },
+    "node_modules/frac": {
+      "version": "1.1.2",
+      "resolved": "https://registry.npmjs.org/frac/-/frac-1.1.2.tgz",
+      "integrity": "sha512-w/XBfkibaTl3YDqASwfDUqkna4Z2p9cFSr1aHDt0WoMTECnRfBOv2WArlZILlqgWlmdIlALXGpM2AOhEk5W3IA==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/fresh": {
       "version": "2.0.0",
       "resolved": "https://registry.npmjs.org/fresh/-/fresh-2.0.0.tgz",
@@ -10499,6 +10554,18 @@
         "win32"
       ]
     },
+    "node_modules/ssf": {
+      "version": "0.11.2",
+      "resolved": "https://registry.npmjs.org/ssf/-/ssf-0.11.2.tgz",
+      "integrity": "sha512-+idbmIXoYET47hH+d7dfm2epdOMUDjqcB4648sTZ+t2JwoyBFL/insLfB/racrDmsKB3diwsDA696pZMieAC5g==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "frac": "~1.1.2"
+      },
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/stack-trace": {
       "version": "0.0.10",
       "resolved": "https://registry.npmjs.org/stack-trace/-/stack-trace-0.0.10.tgz",
@@ -12035,6 +12102,24 @@
         "node": ">= 12.0.0"
       }
     },
+    "node_modules/wmf": {
+      "version": "1.0.2",
+      "resolved": "https://registry.npmjs.org/wmf/-/wmf-1.0.2.tgz",
+      "integrity": "sha512-/p9K7bEh0Dj6WbXg4JG0xvLQmIadrner1bi45VMJTfnbVHsc7yIajZyoSoK60/dtVBs12Fm6WkUI5/3WAVsNMw==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
+    "node_modules/word": {
+      "version": "0.3.0",
+      "resolved": "https://registry.npmjs.org/word/-/word-0.3.0.tgz",
+      "integrity": "sha512-OELeY0Q61OXpdUfTp+oweA/vtLVg5VDOXh+3he3PNzLGG/y0oylSOC1xRVj0+l4vQ3tj/bB1HVHv1ocXkQceFA==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/word-wrap": {
       "version": "1.2.5",
       "resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",
@@ -12168,6 +12253,27 @@
       "integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==",
       "license": "ISC"
     },
+    "node_modules/xlsx": {
+      "version": "0.18.5",
+      "resolved": "https://registry.npmjs.org/xlsx/-/xlsx-0.18.5.tgz",
+      "integrity": "sha512-dmg3LCjBPHZnQp5/F/+nnTa+miPJxUXB6vtk42YjBBKayDNagxGEeIdWApkYPOf3Z3pm3k62Knjzp7lMeTEtFQ==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "adler-32": "~1.3.0",
+        "cfb": "~1.2.1",
+        "codepage": "~1.15.0",
+        "crc-32": "~1.2.1",
+        "ssf": "~0.11.2",
+        "wmf": "~1.0.1",
+        "word": "~0.3.0"
+      },
+      "bin": {
+        "xlsx": "bin/xlsx.njs"
+      },
+      "engines": {
+        "node": ">=0.8"
+      }
+    },
     "node_modules/xtend": {
       "version": "4.0.2",
       "resolved": "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz",
diff --git a/package.json b/package.json
index c38800b..9e529d2 100644
--- a/package.json
+++ b/package.json
@@ -56,10 +56,10 @@
     "@aws-sdk/client-s3": "^3.975.0",
     "@aws-sdk/credential-providers": "^3.975.0",
     "@isomorphic-git/lightning-fs": "^4.6.2",
+    "@lancedb/lancedb": "^0.15.0",
     "@langchain/anthropic": "^1.3.12",
     "@langchain/core": "^1.1.17",
     "@langchain/openai": "^1.2.3",
-    "@lancedb/lancedb": "^0.15.0",
     "@modelcontextprotocol/sdk": "^1.25.3",
     "@types/multer": "^2.0.0",
     "amazon-cognito-identity-js": "^6.3.16",
@@ -69,6 +69,8 @@
     "buffer": "^6.0.3",
     "chalk": "~5.6.2",
     "commander": "~14.0.2",
+    "date-fns": "^4.1.0",
+    "date-fns-tz": "^3.2.0",
     "deep-object-diff": "^1.1.9",
     "express": "^5.2.1",
     "handlebars": "^4.7.8",
@@ -87,6 +89,7 @@
     "vscode-languageserver": "^9.0.1",
     "winston": "^3.19.0",
     "winston-daily-rotate-file": "^5.0.0",
+    "xlsx": "^0.18.5",
     "zod": "^4.3.6"
   },
   "overrides": {
Signed-off-by: Pratik Karki <pratik@fractl.io>

diff --git a/dracula.db b/dracula.db
deleted file mode 100644
index 5f33f24..0000000
Binary files a/dracula.db and /dev/null differ
diff --git a/dracula.db-shm b/dracula.db-shm
deleted file mode 100644
index 080441b..0000000
Binary files a/dracula.db-shm and /dev/null differ
diff --git a/dracula.db-wal b/dracula.db-wal
deleted file mode 100644
index ff8a8c4..0000000
Binary files a/dracula.db-wal and /dev/null differ
diff --git a/example/custom_config/config.al b/example/custom_config/config.al
index c432d25..75d9ba7 100644
--- a/example/custom_config/config.al
+++ b/example/custom_config/config.al
@@ -12,7 +12,12 @@
       "dbname": "./data/vector-store/cc-vectors.lance"
     },
     "knowledgeGraph": {
-      "enabled": true
+      "enabled": true,
+      "neo4j": {
+        "uri": "#js process.env.GRAPH_DB_URI || 'bolt://localhost:7687'",
+        "user": "#js process.env.GRAPH_DB_USER || 'neo4j'",
+        "password": "#js process.env.GRAPH_DB_PASSWORD || 'password'"
+      }
     },
     "retry": [
       {
diff --git a/example/dracula/config.al b/example/dracula/config.al
index 9d0caf5..51fd2f4 100644
--- a/example/dracula/config.al
+++ b/example/dracula/config.al
@@ -11,7 +11,12 @@
       "type": "lancedb"
     },
     "knowledgeGraph": {
-      "enabled": true
+      "enabled": true,
+      "neo4j": {
+        "uri": "#js process.env.GRAPH_DB_URI || 'bolt://localhost:7687'",
+        "user": "#js process.env.GRAPH_DB_USER || 'neo4j'",
+        "password": "#js process.env.GRAPH_DB_PASSWORD || 'password'"
+      }
     }
    }
 }
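The two config changes above use `#js` expressions so each Neo4j setting resolves in the order: explicit config value, then environment variable, then built-in default — the same order the `KnowledgeService` constructor applies. A minimal sketch of that resolution order (the helper name `resolveNeo4jSetting` is illustrative, and `env` stands in for `process.env` so the sketch needs no Node-specific typings):

```typescript
// Hypothetical helper mirroring the fallback order used in the config above:
// explicit config value -> environment variable -> built-in default.
function resolveNeo4jSetting(
  configValue: string | undefined,
  env: Record<string, string | undefined>,
  envVar: string,
  fallback: string
): string {
  return configValue ?? env[envVar] ?? fallback;
}

// No config value and no GRAPH_DB_URI set: the built-in default wins.
const uri = resolveNeo4jSetting(undefined, {}, 'GRAPH_DB_URI', 'bolt://localhost:7687');

// An explicit config value takes precedence over the environment.
const override = resolveNeo4jSetting(
  'bolt://graph.internal:7687',
  { GRAPH_DB_URI: 'bolt://other:7687' },
  'GRAPH_DB_URI',
  'bolt://localhost:7687'
);

console.log(uri, override); // bolt://localhost:7687 bolt://graph.internal:7687
```

This keeps a bare `"enabled": true` working out of the box against a local Neo4j while letting deployments override via environment variables.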
diff --git a/src/runtime/graph/neo4j.ts b/src/runtime/graph/neo4j.ts
index 2aa8171..7af0cbc 100644
--- a/src/runtime/graph/neo4j.ts
+++ b/src/runtime/graph/neo4j.ts
@@ -73,7 +73,7 @@ export class Neo4jDatabase implements GraphDatabase {
           id: $id, name: $name, entityType: $entityType, description: $description,
           sourceType: $sourceType, sourceId: $sourceId, sourceChunk: $sourceChunk,
           instanceId: $instanceId, instanceType: $instanceType,
-          agentId: $agentId, agentId: $agentId,
+          agentId: $agentId,
           confidence: $confidence, isLatest: $isLatest,
           createdAt: datetime(), updatedAt: datetime()
         }) RETURN n.id AS id`,
@@ -88,7 +88,6 @@ export class Neo4jDatabase implements GraphDatabase {
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
           agentId: node.agentId,
-          agentId: node.agentId || null,
           confidence: node.confidence,
           isLatest: node.isLatest,
         }
@@ -107,7 +106,7 @@ export class Neo4jDatabase implements GraphDatabase {
          SET n.name = $name, n.entityType = $entityType, n.description = $description,
              n.sourceType = $sourceType, n.sourceId = $sourceId, n.sourceChunk = $sourceChunk,
              n.instanceId = $instanceId, n.instanceType = $instanceType,
-             n.agentId = $agentId, n.agentId = $agentId,
+             n.agentId = $agentId,
              n.confidence = $confidence, n.isLatest = $isLatest,
              n.updatedAt = datetime(),
              n.createdAt = coalesce(n.createdAt, datetime())
@@ -123,7 +122,6 @@ export class Neo4jDatabase implements GraphDatabase {
           instanceId: node.instanceId || null,
           instanceType: node.instanceType || null,
           agentId: node.agentId,
-          agentId: node.agentId || null,
           confidence: node.confidence,
           isLatest: node.isLatest,
         }
@@ -438,7 +436,6 @@ export class Neo4jDatabase implements GraphDatabase {
       instanceId: props.instanceId,
       instanceType: props.instanceType,
       agentId: props.agentId,
-      agentId: props.agentId,
       confidence: props.confidence ?? 1.0,
       createdAt: props.createdAt ? new Date(props.createdAt) : new Date(),
       updatedAt: props.updatedAt ? new Date(props.updatedAt) : new Date(),
diff --git a/src/runtime/knowledge/conversation-processor.ts b/src/runtime/knowledge/conversation-processor.ts
index 218ed28..d769702 100644
--- a/src/runtime/knowledge/conversation-processor.ts
+++ b/src/runtime/knowledge/conversation-processor.ts
@@ -88,11 +88,9 @@ export class ConversationProcessor {
         const node = await this.deduplicator.findOrCreateNode(
           entity,
           containerTag,
-          containerTag,
           'CONVERSATION',
           sessionId,
-          `User: ${userMessage}\nAssistant: ${assistantResponse}`,
-          agentId
+          `User: ${userMessage}\nAssistant: ${assistantResponse}`
         );
         const canonicalKey = entity.name.toLowerCase();
         nodeMap.set(canonicalKey, node);
diff --git a/src/runtime/knowledge/deduplicator.ts b/src/runtime/knowledge/deduplicator.ts
index 1ecdba6..a0f7d47 100644
--- a/src/runtime/knowledge/deduplicator.ts
+++ b/src/runtime/knowledge/deduplicator.ts
@@ -89,19 +89,17 @@ export class SemanticDeduplicator {
    */
   async findOrCreateNode(
     entity: ExtractedEntity,
-    containerTag: string,
-    tenantId: string,
+    agentId: string,
     sourceType: SourceType,
     sourceId?: string,
-    sourceChunk?: string,
-    agentId?: string
+    sourceChunk?: string
   ): Promise<GraphNode> {
     // Normalize the entity name
     const normalizedName = this.normalizeNodeName(entity.name);
     entity.name = normalizedName; // Use normalized name

     // Step 1: Check for exact name match (fast, no memory overhead)
-    const existingNode = await this.findNodeByExactName(normalizedName, containerTag, tenantId);
+    const existingNode = await this.findNodeByExactName(normalizedName, agentId);

     if (existingNode) {
       logger.debug(`[KNOWLEDGE] Found existing node "${existingNode.name}" for "${entity.name}"`);
@@ -109,7 +107,7 @@ export class SemanticDeduplicator {
     }

     // Step 1b: Semantic similarity search (vector search + string similarity)
-    const similarNode = await this.findSimilarNode(entity, containerTag, tenantId);
+    const similarNode = await this.findSimilarNode(entity, agentId);
     if (similarNode) {
       logger.debug(`[KNOWLEDGE] Found similar node "${similarNode.name}" for "${entity.name}"`);
       return await this.mergeNode(similarNode, entity);
@@ -135,15 +133,7 @@ export class SemanticDeduplicator {
     }

     // Step 3: Create new node (embedding will be stored by the resolver)
-    return await this.createNewNode(
-      entity,
-      containerTag,
-      sourceType,
-      sourceId,
-      sourceChunk,
-      agentId,
-      embedding
-    );
+    return await this.createNewNode(entity, agentId, sourceType, sourceId, sourceChunk, embedding);
   }

   /**
@@ -153,12 +143,10 @@ export class SemanticDeduplicator {
    */
   async findOrCreateNodesBatch(
     entities: ExtractedEntity[],
-    containerTag: string,
-    tenantId: string,
+    agentId: string,
     sourceType: SourceType,
     sourceId?: string,
-    sourceChunks?: Map<string, string>,
-    agentId?: string
+    sourceChunks?: Map<string, string>
   ): Promise<GraphNode[]> {
     if (entities.length === 0) return [];

@@ -171,7 +159,7 @@ export class SemanticDeduplicator {
       entity.name = this.normalizeNodeName(entity.name);

       // Check exact name match
-      const existingNode = await this.findNodeByExactName(entity.name, containerTag, tenantId);
+      const existingNode = await this.findNodeByExactName(entity.name, agentId!);
       if (existingNode) {
         logger.debug(`[KNOWLEDGE] Batch: found existing node "${existingNode.name}"`);
         results.push(await this.mergeNode(existingNode, entity));
@@ -179,7 +167,7 @@ export class SemanticDeduplicator {
       }

       // Check similarity match
-      const similarNode = await this.findSimilarNode(entity, containerTag, tenantId);
+      const similarNode = await this.findSimilarNode(entity, agentId!);
       if (similarNode) {
         logger.debug(`[KNOWLEDGE] Batch: found similar node "${similarNode.name}"`);
         results.push(await this.mergeNode(similarNode, entity));
@@ -224,11 +212,10 @@ export class SemanticDeduplicator {
       const embedding = embeddings[i] || null;
       const node = await this.createNewNode(
         entity,
-        containerTag,
+        agentId!,
         sourceType,
         sourceId,
         sourceChunk,
-        agentId,
         embedding
       );
       results[index] = node;
@@ -248,14 +235,13 @@ export class SemanticDeduplicator {
    */
   private async findNodeByExactName(
     normalizedName: string,
-    containerTag: string,
-    tenantId: string
+    agentId: string
   ): Promise<GraphNode | null> {
     try {
       // Query directly by name instead of scanning recent nodes
       // This is much more efficient and prevents duplicates
       const result: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEntity {name? "${escapeString(normalizedName)}", containerTag? "${escapeString(containerTag)}", agentId? "${escapeString(tenantId)}", isLatest? true}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {name? "${escapeString(normalizedName)}", agentId? "${escapeString(agentId)}", isLatest? true}}`,
         undefined
       );

@@ -271,16 +257,14 @@ export class SemanticDeduplicator {

   private async findSimilarNode(
     entity: ExtractedEntity,
-    containerTag: string,
-    tenantId: string
+    agentId: string
   ): Promise<GraphNode | null> {
     try {
       // Search using entity name only — do NOT pollute with type/description
       // which degrades full-text search quality
       const result: Instance[] = await parseAndEvaluateStatement(
         `{${CoreKnowledgeModuleName}/KnowledgeEntity {` +
-          `containerTag? "${escapeString(containerTag)}", ` +
-          `agentId? "${escapeString(tenantId)}", ` +
+          `agentId? "${escapeString(agentId)}", ` +
           `isLatest? true, ` +
           `name? "${escapeString(entity.name)}"}, ` +
           `@limit ${MAX_SIMILAR_CANDIDATES}}`,
@@ -373,11 +357,10 @@ export class SemanticDeduplicator {
   async supersedeNode(
     existingId: string,
     replacement: ExtractedEntity,
-    containerTag: string,
+    agentId: string,
     sourceType: SourceType,
     sourceId?: string,
-    sourceChunk?: string,
-    agentId?: string
+    sourceChunk?: string
   ): Promise<GraphNode> {
     try {
       await parseAndEvaluateStatement(
@@ -404,11 +387,10 @@ export class SemanticDeduplicator {

     return await this.createNewNode(
       replacement,
-      containerTag,
+      agentId!,
       sourceType,
       sourceId,
       sourceChunk,
-      agentId,
       embedding
     );
   }
@@ -440,11 +422,10 @@ export class SemanticDeduplicator {

   private async createNewNode(
     entity: ExtractedEntity,
-    containerTag: string,
+    agentId: string,
     sourceType: SourceType,
     sourceId?: string,
     sourceChunk?: string,
-    agentId?: string,
     embedding?: number[] | null
   ): Promise<GraphNode> {
     // Create in Agentlang first to get a UUID
@@ -453,7 +434,7 @@ export class SemanticDeduplicator {
       `name "${escapeString(entity.name)}", ` +
       `entityType "${escapeString(entity.entityType)}", ` +
       `sourceType "${sourceType}", ` +
-      `agentId "${containerTag}", ` +
+      `agentId "${escapeString(agentId)}", ` +
       `isLatest true, ` +
       `confidence 1.0`;

@@ -466,9 +447,6 @@ export class SemanticDeduplicator {
     if (sourceChunk) {
       query += `, sourceChunk "${escapeString(sourceChunk.substring(0, 500))}"`;
     }
-    if (agentId) {
-      query += `, agentId "${escapeString(agentId)}"`;
-    }
     if (embedding && embedding.length > 0) {
       query += `, embedding "${escapeString(JSON.stringify(embedding))}"`;
     }
diff --git a/src/runtime/knowledge/document-processor.ts b/src/runtime/knowledge/document-processor.ts
index 617f38d..f7a1909 100644
--- a/src/runtime/knowledge/document-processor.ts
+++ b/src/runtime/knowledge/document-processor.ts
@@ -73,9 +73,7 @@ export class DocumentProcessor {

   async processDocument(
     document: Document,
-    containerTag: string,
-    tenantId: string,
-    agentId?: string,
+    agentId: string,
     env?: Environment,
     llmName?: string
   ): Promise<ProcessingResult> {
@@ -181,12 +179,10 @@ export class DocumentProcessor {

     const nodes = await this.deduplicator.findOrCreateNodesBatch(
       entitiesToCreate,
-      containerTag,
-      tenantId,
+      agentId,
       'DOCUMENT',
       document.name,
-      sourceChunks,
-      agentId
+      sourceChunks
     );

     for (let i = 0; i < coreEntities.length; i++) {
@@ -240,7 +236,7 @@ export class DocumentProcessor {

     // Create edges from accumulated relationships
     for (const rel of candidateRelationships.values()) {
-      const edge = await this.createEdgeFromCandidate(rel, coreNodeMap, containerTag, agentId);
+      const edge = await this.createEdgeFromCandidate(rel, coreNodeMap, agentId);
       if (edge) {
         edgesCreated++;
       }
@@ -329,8 +325,7 @@ export class DocumentProcessor {
   private async createEdgeFromCandidate(
     rel: RelationshipCandidate,
     nodeMap: Map<string, GraphNode>,
-    containerTag: string,
-    agentId?: string
+    agentId: string
   ): Promise<GraphEdge | null> {
     const sourceNode = nodeMap.get(rel.sourceKey);
     const targetNode = nodeMap.get(rel.targetKey);
@@ -389,8 +384,7 @@ export class DocumentProcessor {
           `relType "${escapeString(edge.relationship)}", ` +
           `weight ${edge.weight}, ` +
           `sourceType "DOCUMENT", ` +
-          `agentId "${escapeString(containerTag)}"` +
-          (agentId ? `, agentId "${escapeString(agentId)}"` : '') +
+          `agentId "${escapeString(agentId)}"` +
           `}}`,
         undefined
       );
diff --git a/src/runtime/knowledge/service.ts b/src/runtime/knowledge/service.ts
index 42f8ee2..d604a19 100644
--- a/src/runtime/knowledge/service.ts
+++ b/src/runtime/knowledge/service.ts
@@ -13,6 +13,7 @@ import type { Instance } from '../module.js';
 import type { Document, KnowledgeContext, ProcessingResult, SourceType } from '../graph/types.js';
 import { insertRows, DbContext } from '../resolvers/sqldb/database.js';
 import { DefaultAuthInfo } from '../resolvers/authinfo.js';
+import { isKnowledgeGraphEnabled, getKnowledgeGraphConfig } from '../state.js';
 import crypto from 'crypto';

 // Maximum number of session messages to retain per session (prevents database bloat)
@@ -40,7 +41,21 @@ export class KnowledgeService {
   private readonly enabled: boolean;

   constructor(graphDb?: GraphDatabase, embeddingConfig?: any) {
-    this.graphDb = graphDb || new Neo4jDatabase();
+    // Check if knowledge graph is enabled via config
+    const configEnabled = isKnowledgeGraphEnabled();
+    const kgConfig = getKnowledgeGraphConfig();
+
+    // Create Neo4jDatabase with config values if available
+    // Config values fall back to environment variables, then to defaults
+    if (graphDb) {
+      this.graphDb = graphDb;
+    } else {
+      const neo4jConfig = kgConfig?.neo4j;
+      const uri = neo4jConfig?.uri || process.env.GRAPH_DB_URI || 'bolt://localhost:7687';
+      const user = neo4jConfig?.user || process.env.GRAPH_DB_USER || 'neo4j';
+      const password = neo4jConfig?.password || process.env.GRAPH_DB_PASSWORD || 'password';
+      this.graphDb = new Neo4jDatabase(uri, user, password);
+    }

     // Ensure embedding config has API key from environment
     const resolvedEmbeddingConfig = embeddingConfig || {
@@ -61,7 +76,13 @@ export class KnowledgeService {
       process.env.OPENAI_API_KEY
     );

-    if (!hasApiKey) {
+    // Knowledge graph is enabled only if config says so AND we have an API key
+    if (!configEnabled) {
+      logger.info(
+        '[KNOWLEDGE] Knowledge graph is disabled in config. Set knowledgeGraph.enabled to true to enable.'
+      );
+      this.enabled = false;
+    } else if (!hasApiKey) {
       logger.warn(
         '[KNOWLEDGE] No embedding API key configured (AGENTLANG_OPENAI_KEY or OPENAI_API_KEY). ' +
           'Knowledge base features are disabled. Set an API key to enable knowledge graph extraction, ' +
@@ -191,14 +212,7 @@ export class KnowledgeService {
     llmName?: string
   ): Promise<ProcessingResult> {
     if (!this.enabled) return KnowledgeService.EMPTY_RESULT;
-    return this.documentProcessor.processDocument(
-      document,
-      containerTag,
-      containerTag, // tenantId
-      agentId,
-      env,
-      llmName
-    );
+    return this.documentProcessor.processDocument(document, containerTag, env, llmName);
   }

   async processDocuments(
@@ -247,7 +261,7 @@ export class KnowledgeService {
     if (!this.enabled) {
       return { entities: [], relationships: [], instanceData: [], contextString: '' };
     }
-    return this.contextBuilder.buildContext(query, containerTag, containerTag);
+    return this.contextBuilder.buildContext(query, containerTag, containerTag, []);
   }

   buildContextString(context: KnowledgeContext): string {
@@ -286,7 +300,7 @@ export class KnowledgeService {
       const entityResult = await this.conversationProcessor.processMessage(
         userMessage,
         assistantResponse,
-        session.containerTag,
+        session.agentId,
         session.sessionId,
         session.agentId,
         env,
@@ -428,7 +442,7 @@ export class KnowledgeService {
       // All knowledge is agent-level (shared across users)
       const result = await this.processDocuments(
         matchingDocs,
-        session.containerTag,
+        session.agentId,
         session.agentId,
         env,
         llmName
@@ -440,7 +454,7 @@ export class KnowledgeService {
       );

       // Sync to Neo4j after document processing
-      await this.syncToNeo4j(session.containerTag);
+      await this.syncToNeo4j(session.agentId);

       this.processedAgents.add(session.agentId);

@@ -478,7 +492,7 @@ export class KnowledgeService {
       await this.graphDb.clearContainer(containerTag);

       const nodeResults: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEntity {          agentId? "${escapeString(agentId)}", isLatest? true}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEntity {          agentId? "${escapeString(containerTag)}", isLatest? true}}`,
         undefined
       );

@@ -501,7 +515,6 @@ export class KnowledgeService {
             instanceId: inst.lookup('instanceId') as string | undefined,
             instanceType: inst.lookup('instanceType') as string | undefined,
             agentId: inst.lookup('agentId') as string,
-            agentId: inst.lookup('agentId') as string | undefined,
             confidence: (inst.lookup('confidence') as number) || 1.0,
             createdAt: new Date(),
             updatedAt: new Date(),
@@ -515,7 +528,7 @@ export class KnowledgeService {
       }

       const edgeResults: Instance[] = await parseAndEvaluateStatement(
-        `{${CoreKnowledgeModuleName}/KnowledgeEdge {agentId? "${escapeString(agentId)}"}}`,
+        `{${CoreKnowledgeModuleName}/KnowledgeEdge {agentId? "${escapeString(containerTag)}"}}`,
         undefined
       );

@@ -601,11 +614,9 @@ export class KnowledgeService {
     await this.deduplicator.supersedeNode(
       existingNodeId,
       replacement,
-      session.containerTag,
+      session.agentId,
       'CONVERSATION',
-      session.sessionId,
-      undefined,
-      session.agentId
+      session.sessionId
     );
   }

@@ -662,7 +673,6 @@ export class KnowledgeService {
             sourceChunk: entity.sourceChunk ? entity.sourceChunk.substring(0, 500) : null,
             instanceId: null,
             instanceType: null,
-            agentId: containerTag,
             agentId: entity.agentId || null,
             confidence: entity.confidence || 1.0,
             isLatest: true,
@@ -711,7 +721,7 @@ export class KnowledgeService {
       relType: string;
       weight?: number;
       sourceType: SourceType;
-      containerTag: string;
+      agentId: string;
     }>,
     env?: Environment
   ): Promise<Array<{ id: string; sourceId: string; targetId: string }>> {
@@ -738,8 +748,7 @@ export class KnowledgeService {
             relType: edge.relType,
             weight: edge.weight || 1.0,
             sourceType: edge.sourceType,
-            agentId: edge.containerTag,
-            agentId: null,
+            agentId: edge.agentId,
             createdAt: new Date().toISOString(),
             __path__: path,
           };
@@ -846,12 +855,10 @@ export class KnowledgeService {
       try {
         await this.deduplicator.findOrCreateNode(
           { name: fact.content, entityType: 'Fact', description: fact.category },
-          session.containerTag,
-          session.containerTag,
+          session.agentId,
           'CONVERSATION',
           session.sessionId,
-          undefined,
-          session.agentId
+          undefined
         );
       } catch (err) {
         logger.debug(`[KNOWLEDGE] Failed to store fact: ${err}`);
diff --git a/src/runtime/knowledge/utils.ts b/src/runtime/knowledge/utils.ts
index f0014cd..9770f78 100644
--- a/src/runtime/knowledge/utils.ts
+++ b/src/runtime/knowledge/utils.ts
@@ -35,7 +35,6 @@ export function instanceToGraphNode(inst: Instance): GraphNode {
     instanceId: inst.lookup('instanceId') as string | undefined,
     instanceType: inst.lookup('instanceType') as string | undefined,
     agentId: inst.lookup('agentId') as string,
-    agentId: inst.lookup('agentId') as string | undefined,
     confidence: (inst.lookup('confidence') as number) || 1.0,
     createdAt: new Date(),
     updatedAt: new Date(),
diff --git a/src/runtime/modules/core.ts b/src/runtime/modules/core.ts
index b37bffc..a75a762 100644
--- a/src/runtime/modules/core.ts
+++ b/src/runtime/modules/core.ts
@@ -1,6 +1,7 @@
 import { default as ai, normalizeGeneratedCode } from './ai.js';
 import { default as auth } from './auth.js';
 import { default as files } from './files.js';
+import { default as knowledge } from './knowledge.js';
 import { default as mcp } from './mcp.js';
 import {
   DefaultModuleName,
@@ -187,6 +188,7 @@ export function registerCoreModules() {
     { def: auth, name: makeCoreModuleName('auth') },
     { def: ai, name: makeCoreModuleName('ai') },
     { def: files, name: makeCoreModuleName('files') },
+    { def: knowledge, name: makeCoreModuleName('knowledge') },
     { def: mcp, name: mcpn },
   ];

diff --git a/src/runtime/resolvers/sqldb/impl.ts b/src/runtime/resolvers/sqldb/impl.ts
index 1e7ddb5..33d0e58 100644
--- a/src/runtime/resolvers/sqldb/impl.ts
+++ b/src/runtime/resolvers/sqldb/impl.ts
@@ -101,6 +101,10 @@ export class EmbeddingService {
     return await this.provider.embedText(query);
   }

+  async embedTexts(texts: string[]): Promise<number[][]> {
+    return await this.provider.embedTexts(texts);
+  }
+
   private averageEmbeddings(embeddings: number[][]): number[] {
     if (embeddings.length === 0) return [];
     const dimension = embeddings[0].length;
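The new `embedTexts` method exposes the provider's batch path so callers can embed N strings in one round trip instead of N separate `embedText` calls — which matters for the batch node-creation path in the deduplicator. A sketch with a mock provider standing in for the real embedding backend (the provider interface here is assumed from the diff, not verified against the actual `EmbeddingProvider` type):

```typescript
// Assumed provider shape: single-text and batch embedding.
interface EmbeddingProvider {
  embedText(text: string): Promise<number[]>;
  embedTexts(texts: string[]): Promise<number[][]>;
}

// Mock provider: a toy 1-dimensional "embedding" (string length).
const mockProvider: EmbeddingProvider = {
  async embedText(text) {
    return [text.length];
  },
  async embedTexts(texts) {
    // One call covers the whole batch, preserving input order.
    return texts.map((t) => [t.length]);
  },
};

// Mirrors the delegation added to EmbeddingService above.
class EmbeddingServiceSketch {
  constructor(private provider: EmbeddingProvider) {}

  async embedTexts(texts: string[]): Promise<number[][]> {
    return await this.provider.embedTexts(texts);
  }
}

async function demo(): Promise<number[][]> {
  const svc = new EmbeddingServiceSketch(mockProvider);
  return await svc.embedTexts(['dracula', 'mina']);
}

demo().then((vectors) => console.log(vectors.length, vectors[0][0], vectors[1][0])); // 2 7 4
```

Batching keeps the embedding call count constant per document chunk rather than linear in the number of extracted entities.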
diff --git a/src/runtime/state.ts b/src/runtime/state.ts
index b1ff754..9280327 100644
--- a/src/runtime/state.ts
+++ b/src/runtime/state.ts
@@ -182,6 +182,14 @@ export function isMonitoringEnabled(): boolean {
   return internalMonitoringEnabled || AppConfig?.monitoring?.enabled === true;
 }

+export function isKnowledgeGraphEnabled(): boolean {
+  return AppConfig?.knowledgeGraph?.enabled === true;
+}
+
+export function getKnowledgeGraphConfig(): Config['knowledgeGraph'] | undefined {
+  return AppConfig?.knowledgeGraph;
+}
+
 type TtlCacheEntry<T> = {
   value: T;
   expireTime: number;
diff --git a/test/runtime/memory.test.ts b/test/runtime/memory.test.ts
index 7bee310..52d4902 100644
--- a/test/runtime/memory.test.ts
+++ b/test/runtime/memory.test.ts
@@ -60,7 +60,7 @@ describe('Knowledge Graph Memory System', () => {
           name: 'Alice',
           entityType: 'Person',
           description: 'Main character',
-          __tenant__: 'test',
+          agentId: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -72,7 +72,7 @@ describe('Knowledge Graph Memory System', () => {
           name: 'Wonderland',
           entityType: 'Location',
           description: 'Fantasy world',
-          __tenant__: 'test',
+          agentId: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -98,7 +98,7 @@ describe('Knowledge Graph Memory System', () => {
           id: '1',
           name: 'Alice',
           entityType: 'Person',
-          __tenant__: 'test',
+          agentId: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -109,7 +109,7 @@ describe('Knowledge Graph Memory System', () => {
           id: '2',
           name: 'White Rabbit',
           entityType: 'Person',
-          __tenant__: 'test',
+          agentId: 'test',
           confidence: 1.0,
           sourceType: 'DOCUMENT',
           isLatest: true,
@@ -138,7 +138,7 @@ describe('Knowledge Graph Memory System', () => {
           entityType: 'Product',
           instanceId: 'order-123',
           instanceType: 'Shop/Order',
-          __tenant__: 'test',
+          agentId: 'test',
           confidence: 1.0,
           sourceType: 'INSTANCE',
           isLatest: true,
@@ -298,57 +298,39 @@ describe('Knowledge Graph Memory System', () => {
       const knowledgeService = getKnowledgeService();
       const agentFqName = 'KnowledgeTestApp/supportAgent';

-      const session = await knowledgeService.getOrCreateSession(
-        'supportAgent',
-        'test-user-001',
-        agentFqName
-      );
+      const session = await knowledgeService.getOrCreateSession(agentFqName);
       expect(session).toBeDefined();
       expect(session.sessionId).toBeDefined();
-      expect(session.userId).toBe('test-user-001');
-      expect(session.agentId).toBe('supportAgent');
-      expect(session.containerTag).toBe(agentFqName);
+      expect(session.agentId).toBe(agentFqName);

-      const context = await knowledgeService.buildContext('Hello', session.containerTag);
+      const context = await knowledgeService.buildContext('Hello', agentFqName);
       expect(context).toBeDefined();
       expect(typeof context.contextString).toBe('string');
     });

-    test('shares knowledge container across users (agent-level isolation)', async () => {
+    test('shares knowledge container across sessions (agent-level isolation)', async () => {
       await doInternModule('KGSessionIsolation', `entity Data { id Int @id }`);
       const knowledgeService = getKnowledgeService();
       const agentFqName = 'KGSessionIsolation/testAgent';

-      const session1 = await knowledgeService.getOrCreateSession(
-        'testAgent',
-        'user-alpha',
-        agentFqName
-      );
-      const session2 = await knowledgeService.getOrCreateSession(
-        'testAgent',
-        'user-beta',
-        agentFqName
-      );
+      const session1 = await knowledgeService.getOrCreateSession(agentFqName);
+      const session2 = await knowledgeService.getOrCreateSession(agentFqName);

-      // Both users share the same agent-level container
-      expect(session1.containerTag).toBe(session2.containerTag);
-      expect(session1.containerTag).toBe(agentFqName);
-      // Sessions are still per-user
+      // Both sessions share the same agent-level container
+      expect(session1.agentId).toBe(agentFqName);
+      expect(session2.agentId).toBe(agentFqName);
+      // Sessions are still unique
       expect(session1.sessionId).not.toBe(session2.sessionId);
     });

-    test('formats containerTag as agentFqName (agent-only)', async () => {
+    test('uses agentFqName for agent-level isolation', async () => {
       await doInternModule('KGFormatTest', `entity X { id Int @id }`);
       const knowledgeService = getKnowledgeService();
       const agentFqName = 'KGFormatTest/testAgent';

-      const session = await knowledgeService.getOrCreateSession(
-        'testAgent',
-        'test-user',
-        agentFqName
-      );
+      const session = await knowledgeService.getOrCreateSession(agentFqName);

-      expect(session.containerTag).toBe('KGFormatTest/testAgent');
+      expect(session.agentId).toBe('KGFormatTest/testAgent');
     });
   });
 });
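The test changes above reflect the simplified session API: `getOrCreateSession` now takes only the agent's fully qualified name, and per-user `containerTag`/`userId` fields are gone in favor of agent-level isolation via `agentId`. The sketch below illustrates the resulting contract with a minimal stand-in service; `makeKnowledgeService` and the `Session` shape here are illustrative stand-ins, not the actual `getKnowledgeService()` implementation.

```typescript
// Illustrative Session shape, matching the fields asserted in the tests above.
type Session = { sessionId: string; agentId: string };

// Hypothetical stand-in for the real knowledge service, for illustration only.
function makeKnowledgeService() {
  let counter = 0;
  return {
    async getOrCreateSession(agentFqName: string): Promise<Session> {
      // Each call yields a fresh session, but every session for the same
      // agent shares one agent-level knowledge container (agentId).
      return { sessionId: `session-${++counter}`, agentId: agentFqName };
    },
  };
}

async function demo() {
  const svc = makeKnowledgeService();
  const agentFqName = 'MyApp/supportAgent';

  const s1 = await svc.getOrCreateSession(agentFqName);
  const s2 = await svc.getOrCreateSession(agentFqName);

  // Agent-level isolation: same container, distinct sessions.
  console.log(s1.agentId === s2.agentId); // true
  console.log(s1.sessionId !== s2.sessionId); // true
}

demo();
```

The design trade-off shown in the tests: knowledge is shared across all sessions of one agent (no per-user partitioning), while `sessionId` remains unique per conversation.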
Signed-off-by: Pratik Karki <pratik@fractl.io>