Supercharge your AI assistant with professional-grade Rust intelligence.
Standard AI tools only see your files as text. Rust MCP acts as a semantic bridge, giving LLMs (like Claude, GPT-4, or local models) the same deep understanding that rust-analyzer provides to your IDE. It enables AI tools to work with Rust code idiomatically through Language Server Protocol capabilities, avoiding brittle string manipulation and providing intelligent code analysis.
> **Warning**
> This project is currently in Alpha. It is functional and useful, but expect breaking changes and occasional rough edges. However, most core tools are fully tested and ready for production-level development tasks. We welcome feedback and contributions!
While other MCP servers provide basic file access, Rust MCP focuses on Semantic Context:
- Context-Aware Refactoring: Rename symbols reliably using code snippets instead of brittle line numbers.
- Smart Navigation: Find real definitions and references, not just text matches.
- Compiler-Grade Feedback: Get actual `cargo check` diagnostics directly in your chat.
- Efficiency: Outline large files with `document_symbols` to save LLM tokens.
- Build: `cargo build --release`
- Configure: See the Configuration section below.
- Use: Through AI assistants with natural language prompts like "Show me the definition of the `main` function".
- `get_hover` - Get symbol signature and documentation.
- `get_symbol_source` - Get the source code of a specific symbol.
- `document_symbols` - Get file structure (outline); recommended for large files.
- `find_definition` - Navigate to symbol definitions.
- `find_references` - Find all uses of a symbol.
- `get_diagnostics` - Get compiler errors/warnings for a specific file.
- `workspace_symbols` - Search project symbols.
- `get_type_hierarchy` - Get type relationships for symbols.
- `rename_symbol` - Rename with scope awareness (context-aware).
- `extract_function` - (Experimental) Extract code into functions.
- `inline_function` - (Experimental) Inline function calls.
- `run_cargo_check` - Execute `cargo check` with full error parsing.
- `apply_clippy_suggestions` - (In Progress) Apply clippy automatic fixes.
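For a sense of how clients reach these tools: MCP uses JSON-RPC, so an invocation of `get_diagnostics` might look like the sketch below. The argument name `file_path` is hypothetical — check the tool's actual schema as reported by the server.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_diagnostics",
    "arguments": { "file_path": "src/main.rs" }
  }
}
```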
- Rust toolchain (1.70+)
- `rust-analyzer` installed (defaults to `~/.cargo/bin/rust-analyzer`)
- An MCP-compatible client (Claude Desktop, Gemini CLI, Roo-Code, etc.)
```shell
git clone https://github.com/dexwritescode/rust-mcp
cd rust-mcp
cargo build --release
```

The server binary will be available at `target/release/rustmcp`.
Detailed guides for setting up Rust MCP with various clients:
- Claude Desktop
- Gemini CLI
- Roo-Code (VS Code)
- ESP32 & Custom Toolchains
- Other MCP Clients (Cursor, etc.)
- Environment Variables
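As a rough illustration (see the client-specific guides above for the authoritative steps), most MCP clients accept an entry like the following under the conventional `mcpServers` key; adjust the command path to your checkout:

```json
{
  "mcpServers": {
    "rust-mcp": {
      "command": "/path/to/rust-mcp/target/release/rustmcp"
    }
  }
}
```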
These examples show how you can interact with your AI assistant once Rust MCP is configured.
- "Find the definition of the `AppState` struct and show me its source code."
- "What does the `handle_request` function do? Show me its signature and documentation."
- "Where else is the `UserRegistry` trait used or implemented in this workspace?"
- "Give me a high-level outline of `src/lib.rs` so I can understand its structure."
- "I want to rename the internal field `count` to `total_processed`. Here is the code block where it's defined..."
- "Please rename the `Storage` trait to `DataStore` across the entire project."
- "Help me rename this local variable `idx` to `index` to improve readability."
- "Run a full `cargo check` and tell me if my recent changes introduced any new warnings."
- "Check the current file for any borrow checker errors."
- "Show me the type hierarchy for `MyCustomError` to see which traits it implements."
The server is built with a modular architecture:
- `src/analyzer/` - `rust-analyzer` LSP client integration.
- `src/server/` - MCP server implementation and tool handlers.
- `src/tools/` - Modular tool logic.
To keep the server responsive, compilation helpers run with guardrails (30s timeout, 1MB output cap). See Compiler Safety for more info.
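The output cap can be sketched roughly as follows. This is an illustration of the idea only, not the server's actual implementation; `cap_output` is a hypothetical helper.

```rust
/// Truncate captured tool output to a byte cap, respecting UTF-8 boundaries.
/// (Illustrative sketch; the real guardrail lives in the server internals.)
fn cap_output(mut s: String, cap: usize) -> String {
    if s.len() > cap {
        let mut end = cap;
        // Back up to a valid char boundary so truncate() cannot panic.
        while !s.is_char_boundary(end) {
            end -= 1;
        }
        s.truncate(end);
        s.push_str("\n[output truncated]");
    }
    s
}

fn main() {
    // Simulate an oversized `cargo check` capture and cap it at 1 MiB.
    let long = "e".repeat(2_000_000);
    let capped = cap_output(long, 1024 * 1024);
    assert!(capped.len() <= 1024 * 1024 + "\n[output truncated]".len());
    println!("capped to {} bytes", capped.len());
}
```

A similar wrapper around the child process handles the 30-second timeout, so a hung `cargo check` cannot stall the chat session.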
See the Troubleshooting section in individual guides or check that rust-analyzer is installed and accessible.
- Fork the repository
- Create a feature branch
- Implement your changes with tests
- Submit a pull request
Built with ❤️ for the Rust & AI ecosystem.