╔══════════════════════════════════════════════════════════════════╗
║                                                                  ║
║        🌟 AI Robot - Intelligent Group Chat Assistant 🤖         ║
║                                                                  ║
║         智能群聊助手 - Smart QQ/WeChat Group Chat AI Bot         ║
║                                                                  ║
╚══════════════════════════════════════════════════════════════════╝
🚀 Plug-and-Play QQ Group AI Bot | Multi-Platform API Support | One-Click Launch | Ready to Use
- Project Overview
- Architecture
- Features
- System Requirements
- Installation
- Quick Start Guide
- Configuration
- Protocol Documentation
- Advanced Usage
- Troubleshooting
- Contributing
- License
- Changelog
AI Robot is a sophisticated, production-ready intelligent group chat assistant for the QQ platform, with WeChat support planned. Built with modern technologies (Electron, Vue 3, TypeScript), it provides seamless AI-powered conversations with support for multiple LLM providers.
| Feature | Description |
|---|---|
| 🎯 Zero-Configuration | Download → Install → API Key → Launch |
| 🌐 Multi-Platform AI | Alibaba Cloud, DeepSeek, Zhipu, Moonshot, OpenAI, Google, Ollama |
| 💬 Smart Conversations | @ mentions, command triggers, auto-reply in private chats |
| 🖼️ Vision Support | Image recognition with multimodal AI (Pro version) |
| 🎤 Voice Reply | Text-to-speech responses (Pro version) |
| 🔌 Plugin System | Extensible plugin architecture |
| 💾 Session Persistence | SQLite storage, survives restarts |
| 🎨 Modern UI | Electron + Vue 3, smooth and beautiful |
| 🌙 Dark Theme | Eye-friendly dark interface |
AI Robot follows a modular, adapter-based architecture that separates concerns and enables easy extensibility.
┌─────────────────────────────────────────────────────────────────┐
│                            AI Robot                             │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────┐    ┌─────────────┐    ┌─────────────┐          │
│  │  QQ Adapter │    │ IM Adapters │    │   WeChat    │          │
│  │  (NapCatQQ) │    │  (Platform) │    │  (Future)   │          │
│  └──────┬──────┘    └──────┬──────┘    └──────┬──────┘          │
│         │                  │                  │                 │
│         └──────────────────┼──────────────────┘                 │
│                            ▼                                    │
│                  ┌─────────────────┐                            │
│                  │   Core Layer    │                            │
│                  │  - IM Handler   │                            │
│                  │  - LLM Selector │                            │
│                  │  - Storage      │                            │
│                  │  - Plugin       │                            │
│                  └────────┬────────┘                            │
│                           │                                     │
│        ┌──────────────────┼──────────────────┐                  │
│        ▼                  ▼                  ▼                  │
│  ┌─────────────┐    ┌─────────────┐    ┌─────────────┐          │
│  │   Alibaba   │    │   Ollama    │    │   Future    │          │
│  │   Adapter   │    │   Adapter   │    │  Providers  │          │
│  └─────────────┘    └─────────────┘    └─────────────┘          │
│                                                                 │
│  ┌─────────────┐    ┌─────────────┐                             │
│  │  Setup UI   │    │   SQLite    │                             │
│  │  (Console)  │    │   Storage   │                             │
│  └─────────────┘    └─────────────┘                             │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
| Component | Location | Responsibility |
|---|---|---|
| apps/server | apps/server/ | Main service entry, HTTP/WebSocket server, component initialization |
| apps/setup-ui | apps/setup-ui/ | Visual console, environment detection, configuration wizard |
| packages/core | packages/core/ | Core logic: IM handling, LLM routing, storage, plugins |
| packages/qq-adapter | packages/qq-adapter/ | NapCatQQ WebSocket integration |
| packages/wechat-adapter | packages/wechat-adapter/ | WeChat HTTP adapter (reserved) |
| packages/alibaba-adapter | packages/alibaba-adapter/ | Alibaba Cloud/DashScope API |
| packages/ollama-adapter | packages/ollama-adapter/ | Local Ollama model integration |
| packages/sqlite-storage | packages/sqlite-storage/ | SQLite session persistence |
| packages/doctor | packages/doctor/ | Environment diagnostics |
| Capability | Description | Status |
|---|---|---|
| Multi-turn Conversation | Context-aware dialogue with session memory | ✅ |
| System Prompts | Customizable AI personality and behavior | ✅ |
| Temperature Control | Adjust response creativity | ✅ |
| Token Limits | Configure max response length | ✅ |
| Vision (Multimodal) | Image understanding and description | 👑 Pro |
| Voice Synthesis | Text-to-speech responses | 👑 Pro |
| Feature | Description |
|---|---|
| @ Mention Trigger | Bot responds when mentioned in groups |
| Command Prefix | Use /ai or custom prefix to trigger |
| Private Auto-Reply | Automatic responses in private chats |
| Group Smart Reply | Intelligent group conversation handling |
| Message Quoting | Reply to specific messages |
| Option | Type | Default | Description |
|---|---|---|---|
| LLM_PROVIDER | string | alibaba | AI provider selection |
| ALIBABA_API_KEY | string | - | Alibaba Cloud API key |
| ALIBABA_MODEL | string | qwen-plus | Model selection |
| OLLAMA_BASE_URL | string | http://localhost:11434 | Ollama server URL |
| SESSION_STORAGE | string | sqlite | Storage backend |
| SESSION_MAX_MESSAGES | number | 100 | Max messages per session |
| CHAT_PREFIX | string | /ai | Command trigger prefix |
| PRIVATE_AUTO_REPLY | boolean | true | Auto-reply in private chats |
| GROUP_AI_TRIGGER | string | both | Group trigger mode: at, prefix, both |
| Platform | Requirement |
|---|---|
| Windows | Windows 10 64-bit |
| macOS | macOS 10.15 (Catalina) |
| Linux | Ubuntu 20.04 LTS |
| Platform | Recommendation |
|---|---|
| Windows | Windows 11 64-bit |
| macOS | macOS 12+ (Monterey) |
| Linux | Ubuntu 22.04 LTS |
| Software | Version | Required For |
|---|---|---|
| Node.js | 18.0.0+ | Source build |
| pnpm | 8.0.0+ | Source build |
| NapCatQQ | Latest | QQ integration |
| Ollama | Latest | Local models (optional) |
- Visit the Releases page
- Download the installer for your platform
- Install and launch the application
- Select your AI platform and enter your API Key
- Start NapCatQQ and scan QR code to login
- Click "Start Bot" to begin
# Clone repository
git clone https://github.com/badhope/ai-robot.git
cd ai-robot
# Install dependencies
pnpm install
# Development mode
pnpm dev
# Build for production
pnpm build:win # Windows
pnpm build:mac # macOS
pnpm build:linux # Linux
# Build image
docker build -t ai-robot .
# Configure environment
cp .env.example .env
# Edit .env with your API keys
# Run container
cd deployments
docker-compose up -d
📖 How to obtain API keys
Alibaba Cloud (Recommended)
- Visit Alibaba Cloud Bailian
- Login or register an account
- Enable DashScope service
- Navigate to "API-KEY Management" → "Create API Key"
DeepSeek
- Visit DeepSeek Platform
- Register an account
- Go to "API Keys" → "Create API Key"
Zhipu AI
- Visit Zhipu Open Platform
- Register an account
- Go to "API Keys" → "Add API Key"
OpenAI
- Visit OpenAI Platform
- Create an account
- Generate an API key from the dashboard
📖 NapCatQQ Setup Guide
- Download NapCatQQ
- Extract and run the application
- Scan QR code with your bot QQ account
- Ensure the WebSocket port is set to 3001
For detailed instructions, see NapCatQQ Guide
# Run the application
pnpm dev
# Or use the desktop app
# Simply click "Start Bot" in the UI
In a QQ group:
@YourBot Hello, how are you?
Or use the command prefix:
/ai What's the weather today?
Create a .env file in the project root:
# ====================
# Server Configuration
# ====================
APP_HOST=0.0.0.0
APP_PORT=3000
# ====================
# QQ Configuration
# ====================
QQ_ENABLED=true
QQ_HTTP_PORT=3001
QQ_WS_URL=ws://localhost:3001
QQ_NUMBER=123456789
QQ_TOKEN=
# ====================
# AI Provider Configuration
# ====================
LLM_PROVIDER=alibaba
# Alibaba Cloud (Default)
ALIBABA_API_KEY=your-api-key-here
ALIBABA_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
ALIBABA_MODEL=qwen-plus
ALIBABA_TIMEOUT=120000
# Ollama (Local Mode)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=qwen2.5:7b
OLLAMA_TIMEOUT=120000
# ====================
# Session Storage
# ====================
SESSION_STORAGE=sqlite
SQLITE_DB_PATH=./data/sessions.db
SESSION_MAX_MESSAGES=100
# ====================
# Trigger Rules
# ====================
CHAT_PREFIX=/ai
PRIVATE_AUTO_REPLY=true
GROUP_AI_TRIGGER=both
# ====================
# Logging
# ====================
LOG_LEVEL=info
| Platform | Features | Free Tier | Rating |
|---|---|---|---|
| 🇨🇳 Alibaba Cloud | Fast, stable | Yes | ⭐⭐⭐⭐⭐ |
| 🇨🇳 DeepSeek | Cost-effective | Yes | ⭐⭐⭐⭐⭐ |
| 🇨🇳 Zhipu AI | Chinese LLM | Yes | ⭐⭐⭐⭐ |
| 🇨🇳 Moonshot | Long context | Yes | ⭐⭐⭐⭐ |
| 🌍 OpenAI | GPT-4 | Limited | ⭐⭐⭐⭐ |
| 🌍 Google Gemini | Large free tier | Yes | ⭐⭐⭐⭐ |
| 💻 Local Ollama | Completely free | Unlimited | ⭐⭐⭐ (GPU required) |
AI Robot implements multiple protocols for communication between components, external services, and messaging platforms. This section documents all protocols used in the system.
The IM (Instant Messaging) Platform Protocol defines the standard interface for integrating different messaging platforms (QQ, WeChat, etc.) into AI Robot.
interface IMAdapter {
name: string;
platform: 'wechat' | 'qq' | 'mock';
start(): Promise<void>;
stop(): Promise<void>;
sendReply(event: ChatMessageEvent, reply: ChatReply): Promise<void>;
onMessage(handler: (event: ChatMessageEvent) => Promise<void>): void;
}
interface ChatMessageEvent {
platform: Platform;
chatType: 'private' | 'group';
messageId: string;
senderId: string;
senderName?: string;
roomId?: string;
roomName?: string;
text: string;
mentions?: string[];
isAt?: boolean;
replyToMessageId?: string;
timestamp: number;
raw?: unknown;
}
interface ChatReply {
text: string;
replyToMessageId?: string;
}
| Adapter | Protocol | Transport | Port |
|---|---|---|---|
| NapCatQQ | WebSocket | WS | 3001 |
| WeChat | HTTP | REST | Configurable |
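The IMAdapter contract above can also be satisfied by a small in-memory mock, which is presumably what the 'mock' platform value is for. Below is a minimal sketch with the protocol types inlined for self-containment; it is an illustration, not the project's actual test code:

```typescript
// Minimal inlined shapes from the IM Platform Protocol above.
type ChatMessageEvent = {
  platform: string; chatType: 'private' | 'group';
  messageId: string; senderId: string; text: string; timestamp: number;
};
type ChatReply = { text: string; replyToMessageId?: string };

// A mock adapter: incoming messages are injected via emit() instead of a socket,
// and outgoing replies are recorded instead of delivered.
class MockAdapter {
  name = 'mock';
  platform = 'mock' as const;
  sent: ChatReply[] = [];
  private handler?: (event: ChatMessageEvent) => Promise<void>;

  async start(): Promise<void> {}
  async stop(): Promise<void> {}

  onMessage(handler: (event: ChatMessageEvent) => Promise<void>): void {
    this.handler = handler;
  }

  async sendReply(_event: ChatMessageEvent, reply: ChatReply): Promise<void> {
    this.sent.push(reply); // record for inspection
  }

  // Test hook: simulate an incoming platform message.
  async emit(event: ChatMessageEvent): Promise<void> {
    if (this.handler) await this.handler(event);
  }
}
```

Registering an echo handler and calling emit() exercises the whole message round trip without a live QQ connection.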
import { NapCatQQAdapter } from '@ai-robot/qq-adapter';
const adapter = new NapCatQQAdapter({
httpPort: 3001,
wsUrl: 'ws://localhost:3001',
qqNumber: '123456789',
});
await adapter.start();
adapter.onMessage(async (event) => {
console.log(`Message from ${event.senderName}: ${event.text}`);
await adapter.sendReply(event, { text: 'Hello!' });
});
The LLM (Large Language Model) Provider Protocol defines the standard interface for integrating different AI providers into AI Robot.
interface LLMProvider {
name: string;
kind: 'local' | 'remote' | 'experimental';
generate(input: LLMGenerateRequest): Promise<LLMGenerateResponse>;
healthCheck(): Promise<boolean>;
listModels?(): Promise<string[]>;
}
interface LLMGenerateRequest {
model?: string;
systemPrompt?: string;
messages: Array<{
role: 'system' | 'user' | 'assistant';
content: string;
}>;
temperature?: number;
topP?: number;
maxTokens?: number;
metadata?: Record<string, unknown>;
}
interface LLMGenerateResponse {
provider: string;
model: string;
content: string;
usage?: {
promptTokens?: number;
completionTokens?: number;
totalTokens?: number;
};
raw?: unknown;
}
| Provider | Kind | API Endpoint | Authentication |
|---|---|---|---|
| Alibaba Cloud | remote | dashscope.aliyuncs.com | Bearer Token |
| Ollama | local | localhost:11434 | None |
| OpenAI | remote | api.openai.com | Bearer Token |
| DeepSeek | remote | api.deepseek.com | Bearer Token |
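Implementing the LLMProvider interface takes very little code. The sketch below is a hypothetical canned-response provider (not shipped with the project) that satisfies the interface shape and is handy for wiring tests:

```typescript
// Inlined request/response shapes from the LLM Provider Protocol above.
type Msg = { role: 'system' | 'user' | 'assistant'; content: string };
type GenReq = { model?: string; messages: Msg[]; temperature?: number };
type GenRes = { provider: string; model: string; content: string };

// A trivial provider that echoes the last user message back.
class EchoProvider {
  name = 'echo';
  kind = 'experimental' as const;

  async generate(input: GenReq): Promise<GenRes> {
    const last = input.messages.filter((m) => m.role === 'user').pop();
    return {
      provider: this.name,
      model: input.model ?? 'echo-1', // hypothetical default model name
      content: last ? last.content : '',
    };
  }

  async healthCheck(): Promise<boolean> {
    return true; // nothing to probe locally
  }
}
```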
import { AlibabaProvider } from '@ai-robot/alibaba-adapter';
const provider = new AlibabaProvider({
apiKey: 'sk-xxx',
model: 'qwen-plus',
});
const response = await provider.generate({
messages: [
{ role: 'user', content: 'Hello!' }
],
temperature: 0.7,
});
console.log(response.content);
NapCatQQ uses WebSocket for real-time bidirectional communication between the QQ client and AI Robot.
| Parameter | Value |
|---|---|
| Protocol | WebSocket |
| Default URL | ws://localhost:3001 |
| Message Format | JSON |
| Encoding | UTF-8 |
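The incoming JSON shown below maps naturally onto the ChatMessageEvent interface from the IM Platform Protocol. The sketch here infers that mapping from the samples in this section rather than from the adapter source, so field handling (notably the seconds-to-milliseconds conversion and the bot's own QQ number) is an assumption:

```typescript
// Shapes inferred from the NapCatQQ message samples in this section.
type Segment = { type: string; text?: string; data?: { qq?: number } };
type NapCatEvent = {
  message_type: 'group' | 'private';
  user_id: number; group_id?: number; message_id: number;
  sender?: { nickname?: string; card?: string };
  message: Segment[]; raw_message: string; time: number;
};

const SELF_QQ = 987654321; // the bot's own QQ number (example value)

function toChatMessageEvent(e: NapCatEvent) {
  // Concatenate only the text segments; "at" segments drive the isAt flag.
  const text = e.message
    .filter((s) => s.type === 'text')
    .map((s) => s.text ?? '')
    .join('')
    .trim();
  const isAt = e.message.some((s) => s.type === 'at' && s.data?.qq === SELF_QQ);
  return {
    platform: 'qq',
    chatType: e.message_type === 'group' ? 'group' : 'private',
    messageId: String(e.message_id),
    senderId: String(e.user_id),
    senderName: e.sender?.card || e.sender?.nickname, // group card wins over nickname
    roomId: e.group_id !== undefined ? String(e.group_id) : undefined,
    text,
    isAt,
    timestamp: e.time * 1000, // NapCatQQ sends seconds; protocol uses ms (assumption)
    raw: e,
  };
}
```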
Incoming Message (from NapCatQQ)
{
"post_type": "message",
"message_type": "group",
"sub_type": "normal",
"user_id": 123456789,
"group_id": 987654321,
"group_name": "Test Group",
"sender": {
"nickname": "User",
"card": "Card Name"
},
"message_id": 12345,
"message": [
{ "type": "text", "text": "Hello" },
{ "type": "at", "data": { "qq": 987654321 } }
],
"raw_message": "Hello @bot",
"time": 1234567890
}
Outgoing Message (to NapCatQQ)
{
"message_type": "group",
"group_id": 987654321,
"message": "Reply text here"
}
Reconnection strategy:
| Parameter | Value |
|---|---|
| Max Attempts | 10 |
| Base Delay | 1000ms |
| Max Delay | 30000ms |
| Backoff | Exponential |
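With those parameters, the reconnect delay presumably doubles on each attempt and is capped at the maximum; a small sketch of that schedule (the exact formula is an assumption consistent with the table):

```typescript
// Exponential backoff matching the table above:
// base 1000 ms, cap 30000 ms, up to 10 attempts.
const BASE_MS = 1000;
const MAX_MS = 30000;
const MAX_ATTEMPTS = 10;

// attempt is 0-based: 1000, 2000, 4000, ... capped at 30000 ms.
function reconnectDelay(attempt: number): number {
  return Math.min(BASE_MS * 2 ** attempt, MAX_MS);
}

// The full retry schedule for one disconnect.
function schedule(): number[] {
  return Array.from({ length: MAX_ATTEMPTS }, (_, i) => reconnectDelay(i));
}
```

Production reconnect loops usually add random jitter on top of this; whether AI Robot does is not documented here.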
Alibaba Cloud DashScope API provides access to Qwen series language models.
| Environment | URL |
|---|---|
| Production | https://dashscope.aliyuncs.com/compatible-mode/v1 |
| Chat Completions | /chat/completions |
Authorization: Bearer sk-xxxxxxxxxxxxxxxx
Content-Type: application/json
{
"model": "qwen-plus",
"messages": [
{ "role": "system", "content": "You are a helpful assistant." },
{ "role": "user", "content": "Hello!" }
],
"temperature": 0.7,
"max_tokens": 2048
}
{
"id": "chatcmpl-xxx",
"object": "chat.completion",
"created": 1234567890,
"model": "qwen-plus",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "Hello! How can I help you?"
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 10,
"completion_tokens": 20,
"total_tokens": 30
}
}
| Model | Context | Description |
|---|---|---|
| qwen-turbo | 8K | Fast, cost-effective |
| qwen-plus | 32K | Balanced performance |
| qwen-max | 32K | Best quality |
| qwen-long | 1M | Long context |
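Outside the bundled AlibabaProvider, the compatible-mode endpoint can be called with plain fetch. The sketch below only builds the request object (no network call is made); the header and body shapes follow the request example earlier in this section:

```typescript
// Build a DashScope compatible-mode chat request (OpenAI-style schema).
type Msg = { role: 'system' | 'user' | 'assistant'; content: string };

const BASE_URL = 'https://dashscope.aliyuncs.com/compatible-mode/v1';

function buildChatRequest(apiKey: string, model: string, messages: Msg[]) {
  return {
    url: `${BASE_URL}/chat/completions`,
    init: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${apiKey}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ model, messages, temperature: 0.7, max_tokens: 2048 }),
    },
  };
}

// Sending is then: const data = await (await fetch(url, init)).json();
// and the reply text lives at data.choices[0].message.content.
```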
Ollama provides a local REST API for running open-source language models.
| Endpoint | URL |
|---|---|
| Base URL | http://localhost:11434 |
| Chat API | /api/chat |
| Models List | /api/tags |
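A request to /api/chat mirrors the JSON shown below. This builder sketch makes no network call and assumes a local server on the default port:

```typescript
// Build a non-streaming Ollama chat request against the local API.
type Msg = { role: 'system' | 'user' | 'assistant'; content: string };

function buildOllamaRequest(
  model: string,
  messages: Msg[],
  baseUrl = 'http://localhost:11434',
) {
  return {
    url: `${baseUrl}/api/chat`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model,
        messages,
        stream: false, // one complete JSON object instead of NDJSON chunks
        options: { temperature: 0.7, top_p: 0.9, num_predict: 512 },
      }),
    },
  };
}
```

With stream set to false the response arrives as a single JSON object whose message.content field holds the reply, as in the response example in this section.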
{
"model": "qwen2.5:7b",
"messages": [
{ "role": "user", "content": "Hello!" }
],
"stream": false,
"options": {
"temperature": 0.7,
"top_p": 0.9,
"num_predict": 512
}
}
{
"model": "qwen2.5:7b",
"created_at": "2024-01-01T00:00:00Z",
"message": {
"role": "assistant",
"content": "Hello! How can I help you?"
},
"done": true,
"total_duration": 1000000000,
"prompt_eval_count": 10,
"eval_count": 20
}The Session Storage Protocol defines how conversation history is persisted and retrieved.
interface SessionStore {
getSession(sessionId: string): Promise<SessionMessage[]>;
appendMessage(sessionId: string, message: SessionMessage): Promise<void>;
clearSession(sessionId: string): Promise<void>;
}
interface SessionMessage {
role: 'system' | 'user' | 'assistant';
content: string;
timestamp?: number;
}
CREATE TABLE sessions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id TEXT NOT NULL,
role TEXT NOT NULL,
content TEXT NOT NULL,
timestamp INTEGER NOT NULL,
created_at INTEGER DEFAULT (strftime('%s', 'now'))
);
CREATE INDEX idx_session_id ON sessions(session_id);
CREATE INDEX idx_timestamp ON sessions(session_id, timestamp);
import { SQLiteSessionStore } from '@ai-robot/sqlite-storage';
const store = new SQLiteSessionStore({
dbPath: './data/sessions.db',
maxMessages: 100,
});
// Get session history
const messages = await store.getSession('group_123456');
// Append message
await store.appendMessage('group_123456', {
role: 'user',
content: 'Hello!',
timestamp: Date.now(),
});
The HTTP API provides REST endpoints for external integrations and the setup UI.
| Method | Path | Description |
|---|---|---|
| GET | /health | Health check |
| POST | /send | Send message |
| GET | /status | Bot status |
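These endpoints can be exercised from any HTTP client. The sketch below builds the /send call shown in the raw request that follows; the server address is an assumption based on the default APP_PORT=3000:

```typescript
// Build a POST /send request for the bot's HTTP API.
function buildSendRequest(
  groupId: number,
  message: string,
  base = 'http://localhost:3000', // assumed from the default APP_PORT=3000
) {
  return {
    url: `${base}/send`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      // Payload shape follows the example request in this section.
      body: JSON.stringify({ message_type: 'group', group_id: groupId, message }),
    },
  };
}

// Then: await fetch(url, init);
```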
POST /send HTTP/1.1
Content-Type: application/json
{
"message_type": "group",
"group_id": 123456789,
"message": "Hello from HTTP API"
}
The Plugin Protocol enables extensibility through custom message processors.
interface Plugin {
name: string;
priority: number;
process(event: ChatMessageEvent, context: PluginContext): Promise<PluginResult | null>;
}
interface PluginContext {
provider: LLMProvider;
storage: SessionStore;
config: Record<string, unknown>;
}
interface PluginResult {
handled: boolean;
reply?: string;
}
Create custom prompts in the prompts/ directory:
prompts/
├── default/
│ ├── friendly.txt
│ └── tech-expert.txt
├── group/
│ ├── active.txt
│ └── concise.txt
└── helper/
└── assistant.txt
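Returning to the Plugin Protocol defined earlier: a plugin can short-circuit the LLM entirely. Below is a hypothetical /ping plugin, simplified to the event and result shapes (the context parameter and registration mechanics are omitted because they are not documented here):

```typescript
// Minimal inlined shapes from the Plugin Protocol.
type EventLike = { text: string };
type PluginResult = { handled: boolean; reply?: string };

// Replies "pong" to "/ping" and lets everything else fall through (null).
const pingPlugin = {
  name: 'ping',
  priority: 100, // higher values assumed to run earlier (assumption)
  async process(event: EventLike): Promise<PluginResult | null> {
    if (event.text.trim() === '/ping') {
      return { handled: true, reply: 'pong' };
    }
    return null; // not ours; the next plugin or the LLM handles it
  },
};
```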
# In configuration
LLM_PROVIDER=ollama # Switch to local model
OLLAMA_MODEL=llama3:8b
# Only respond to @ mentions
GROUP_AI_TRIGGER=at
# Only respond to prefix commands
GROUP_AI_TRIGGER=prefix
# Respond to both (default)
GROUP_AI_TRIGGER=both
pnpm doctor
This checks:
- Node.js version
- Configuration files
- API connectivity
- NapCatQQ connection
- SQLite status
🔧 Bot not responding
- Verify NapCatQQ is running
- Check the WebSocket connection (ws://localhost:3001)
- Ensure QQ_ENABLED=true in .env
- Verify the API key is configured
- Check that the bot is mentioned correctly with @
🔧 API connection failed
- Verify API key is correct
- Check account has available credits
- Test network connectivity: ping dashscope.aliyuncs.com
- Verify the base URL is correct
🔧 SQLite errors
- Check directory permissions
- Verify disk space available
- Delete and recreate the database: rm data/sessions.db
🔧 Ollama connection issues
- Verify Ollama is installed: ollama --version
- Check the Ollama service is running: ollama serve
- Verify the model is downloaded: ollama list
- Check configuration: LLM_PROVIDER=ollama and OLLAMA_BASE_URL=http://localhost:11434
| Error | Meaning | Solution |
|---|---|---|
| 401 Unauthorized | Invalid API key | Check API key configuration |
| Connection refused | Service not running | Start the required service |
| WebSocket closed | NapCatQQ disconnected | Restart NapCatQQ |
| Model not found | Model doesn't exist | Download the model or check the name |
| Out of credit | Insufficient balance | Add credits to the account |
We welcome all contributions! Please follow these guidelines:
- Fork the repository
- Create a feature branch: git checkout -b feature/AmazingFeature
- Make your changes
- Run tests: pnpm test
- Commit: git commit -m 'Add AmazingFeature'
- Push: git push origin feature/AmazingFeature
- Open a Pull Request
# Install dependencies
pnpm install
# Run in development
pnpm dev
# Run linter
pnpm lint
# Type check
pnpm typecheck
# Build
pnpm build
- Use TypeScript for all new code
- Follow existing code conventions
- Add appropriate comments
- Update documentation
See CONTRIBUTING.md for detailed guidelines.
This project uses a dual-license model:
| Component | License |
|---|---|
| Open Source Parts | MIT License |
| Commercial Parts | Commercial License |
See LICENSE for details.
See CHANGELOG.md for version history.
[2.0.0] - 2024-03-26
- 🎉 New Electron desktop application architecture
- 🎨 Vue 3 + TypeScript modern UI
- 🌟 Star theme UI design
- 📱 Simple and Expert mode toggle
- 🔧 Visual configuration wizard
- 🧠 Multi-platform AI API support
- 💻 Local Ollama model support
- 🔌 Plugin system architecture
- 👑 Pro version features
- 📊 Conversation statistics
- 🌙 Dark theme support
| Channel | Link |
|---|---|
| 📧 Email | contact@ai-robot.dev |
| 💬 QQ Group | 123456789 |
| 🌐 Website | https://ai-robot.dev |
| 📖 Documentation | https://docs.ai-robot.dev |
| 🐛 Issues | GitHub Issues |
Made with ❤️ by AI Robot Team
⭐ If this project helps you, please give it a Star ⭐