Introduction
Create your own AI coding assistant! This extension lets you chat with DeepSeek R1 (a powerful open-source AI model) directly in VS Code using Ollama (a tool to run AI models locally). No subscriptions, no internet connection required!
Install Tools:
- VS Code
- Node.js & npm (v18+)
- Ollama:

```shell
# Mac/Linux
curl -fsSL https://ollama.ai/install.sh | sh

# Windows: download the installer from https://ollama.ai/download
```
Install Extension Tools:

```shell
npm install -g yo generator-code
```
Download DeepSeek R1 Model: choose a model according to your system's capability (the smallest is 1.5b).

```shell
ollama run deepseek-r1:1.5b
```
Generate Project:

```shell
npx yo code
```

- Choose TypeScript.
- Name: `deepseek-assistant`
- Accept defaults for the other options.
Project Structure:
- `src/extension.ts`: Main code
- `package.json`: Extension configuration
Install Ollama SDK:

```shell
npm install ollama
```
Update `extension.ts`:

```typescript
import * as vscode from 'vscode';
import ollama from 'ollama';

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand('deepseek.startChat', () => {
      openChatPanel();
    })
  );
}

function openChatPanel() {
  // Create a chat panel
  const panel = vscode.window.createWebviewPanel(
    'deepseekChat',
    'DeepSeek Assistant',
    vscode.ViewColumn.One,
    { enableScripts: true } // required, or the webview's <script> will not run
  );

  // HTML for the chat UI
  panel.webview.html = getWebviewContent();

  // Handle messages from the UI
  panel.webview.onDidReceiveMessage(async (message) => {
    if (message.command === 'chat') {
      const response = await ollama.chat({
        model: 'deepseek-r1:1.5b', // must match the tag you pulled
        messages: [{ role: 'user', content: message.text }],
        stream: true,
      });

      // Stream the response to the UI
      for await (const chunk of response) {
        panel.webview.postMessage({
          command: 'chatResponse',
          text: chunk.message.content,
        });
      }
    }
  });
}

function getWebviewContent(): string {
  return `<!DOCTYPE html>
<html>
<body>
  <textarea id="input" rows="5" cols="50"></textarea>
  <button onclick="sendPrompt()">Ask</button>
  <div id="response"></div>
  <script>
    const vscode = acquireVsCodeApi();
    function sendPrompt() {
      const input = document.getElementById('input').value;
      vscode.postMessage({ command: 'chat', text: input });
    }
    window.addEventListener('message', (event) => {
      document.getElementById('response').innerText += event.data.text;
    });
  </script>
</body>
</html>`;
}
```
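One quirk of DeepSeek R1 models: they emit their chain-of-thought inside `<think>…</think>` tags before the final answer. If you'd rather hide that reasoning in the chat panel, a small filter can be applied to the accumulated text before rendering. This is a sketch, not part of the tutorial's code; the function name `stripThinking` is my own:

```typescript
// Strips DeepSeek R1's <think>…</think> reasoning blocks from model output.
// Also handles a still-open <think> block (common while streaming) by
// dropping everything from the opening tag onward.
function stripThinking(text: string): string {
  const withoutClosed = text.replace(/<think>[\s\S]*?<\/think>/g, '');
  const openTag = withoutClosed.indexOf('<think>');
  return (openTag === -1 ? withoutClosed : withoutClosed.slice(0, openTag)).trim();
}
```

You would accumulate the streamed chunks into one string and run it through this filter each time before updating the webview, since a `<think>` block can span many chunks.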
Update `package.json`:

```json
"contributes": {
  "commands": [{
    "command": "deepseek.startChat",
    "title": "Start DeepSeek Chat"
  }]
}
```
Register Command in `extension.ts`: the command handler should open the chat panel. Do not call `activate(context)` from inside the handler; VS Code invokes `activate` itself, and re-running it would register the command a second time.

```typescript
context.subscriptions.push(
  vscode.commands.registerCommand('deepseek.startChat', () => {
    openChatPanel(); // a helper containing the createWebviewPanel logic
  })
);
```
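The webview and the extension talk through plain `postMessage` objects. As an optional hardening step (not in the tutorial's code; the type names here are my own), you can give that protocol explicit types and a runtime guard, so a malformed payload can't slip into the handler:

```typescript
// Typed message protocol between the webview and the extension.
type ChatRequest = { command: 'chat'; text: string };
type ChatResponse = { command: 'chatResponse'; text: string };
type ChatMessage = ChatRequest | ChatResponse;

// Narrows an unknown payload from postMessage to a ChatMessage.
function isChatMessage(value: unknown): value is ChatMessage {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    (v.command === 'chat' || v.command === 'chatResponse') &&
    typeof v.text === 'string'
  );
}
```

Inside `onDidReceiveMessage`, you would check `isChatMessage(message)` before switching on `message.command`.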
Open Debugger:
- Press `F5` in VS Code.
- A new VS Code window opens with your extension installed.
Run Command:
- Press `Ctrl+Shift+P` and type "Start DeepSeek Chat".
- Type a question (e.g., "Explain binary search") and click Ask.
Troubleshooting:
- Ollama not running? Start it with `ollama serve`.
- Model missing? Run `ollama pull deepseek-r1:1.5b` (the same tag the extension uses).
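The extension can also check for these failure modes itself: Ollama exposes a local HTTP API, and `GET http://localhost:11434/api/tags` lists the pulled models as `{ "models": [{ "name": … }] }`. Here is a sketch of a check the extension could run on startup; the helper name `hasModel` is my own:

```typescript
// Shape of Ollama's /api/tags response (only the field we need).
type TagsResponse = { models: Array<{ name: string }> };

// Returns true if the given model tag appears in the parsed tags response.
function hasModel(tags: TagsResponse, model: string): boolean {
  return tags.models.some((m) => m.name === model);
}

// Usage sketch (assumes Ollama is listening on its default port 11434):
//   const res = await fetch('http://localhost:11434/api/tags');
//   const tags = (await res.json()) as TagsResponse;
//   if (!hasModel(tags, 'deepseek-r1:1.5b')) {
//     vscode.window.showErrorMessage('Run: ollama pull deepseek-r1:1.5b');
//   }
// A failed fetch here usually means Ollama itself is not running.
```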
Package Extension:

```shell
npm install -g @vscode/vsce
vsce package
```

- Output: `deepseek-assistant-0.0.1.vsix`
Publish to Marketplace:
Follow the official VS Code Publishing Guide.

Next steps:
- Customize the UI with CSS for better visuals.
- Add support for code suggestions or file editing.
- Explore other models like Llama 3 or Mistral.
You’ve built a private, open-source AI coding assistant!