This repository is for building a private LLM agent that interfaces with an MCP client and server.
The MCP server is designed to connect only to my private Obsidian knowledge base, which is organized using Markdown files.
For details on the MCP server, see this Medium article: How I built a local MCP server to connect Obsidian with AI.
For details on building the sLLM agent and MCP client, see this Medium article: How I built a Tool-calling Llama Agent with a Custom MCP Server.
This repository includes the following components:
- MCP Client
- MCP Server
- LLM Agent
The MCP server, named knowledge-vault, manages Markdown files that serve as topic-specific knowledge notes. It provides the following tools:
- `list_knowledges()`: list the names and URIs of all knowledge notes in the vault
- `get_knowledge_by_uri(uri: str)`: get the contents of the knowledge resource identified by `uri`
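The logic behind these two tools can be sketched as plain functions over a directory of Markdown files; the real server would register them with an MCP framework. The vault location and the `knowledge://` URI scheme below are assumptions for illustration.

```python
from pathlib import Path

VAULT_DIR = Path("vault")  # assumed location of the Obsidian vault


def list_knowledges() -> list[dict]:
    """List the name and URI of every Markdown note in the vault."""
    return [
        {"name": p.stem, "uri": f"knowledge://{p.stem}"}
        for p in sorted(VAULT_DIR.glob("*.md"))
    ]


def get_knowledge_by_uri(uri: str) -> str:
    """Return the contents of the note identified by its URI."""
    name = uri.removeprefix("knowledge://")
    return (VAULT_DIR / f"{name}.md").read_text(encoding="utf-8")
```

Exposing notes as named resources with stable URIs lets the agent first browse what exists, then fetch only the notes it needs.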
This repository also contains a simple LLM agent implementation. It currently uses the Llama 3.2 model and leverages the MCP client to retrieve relevant knowledge context.
The agent can be used via a chat interface built with Streamlit. Please note that it is a prototype and may contain bugs.
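One way the agent might use the MCP client is to list the available notes, pick the ones relevant to the user's question, and inline their contents into the prompt before generation. This is a hedged sketch, not the repository's actual implementation; the function name and the naive name-matching heuristic are assumptions.

```python
def build_context_prompt(question: str, list_tool, get_tool, max_notes: int = 3) -> str:
    """Naively select notes whose name appears in the question and
    inline their contents as context for the LLM."""
    notes = list_tool()  # e.g. the MCP list_knowledges tool
    relevant = [n for n in notes if n["name"].lower() in question.lower()][:max_notes]
    context = "\n\n".join(get_tool(n["uri"]) for n in relevant)
    return (
        "Use the following notes to answer.\n\n"
        f"{context}\n\nQuestion: {question}"
    )
```

A production agent would likely let the model itself decide which tools to call rather than matching note names by substring.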
The screenshots below show the LLM loading and parameter settings, as well as the interactive chat view.
Install the dependencies with `pip install -r requirements.txt`.

Recent changes:
- Changed the model format from `.gguf` with LlamaCPP to Hugging Face's `transformers` library.
- Changed the generation prompt to work better across a wider range of models.
- Added a `config.json` file to improve code modularity and support additional `mcpServers`.
- Changed the tool-call format from a Python list to JSON to fix parsing bugs.
- Added a weather server written in Python (inspired by isdaniel/mcp_weather_server).
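The switch from a Python-list tool-call format to JSON suggests the agent now extracts tool calls from the model's raw output with `json.loads` rather than evaluating Python literals. A minimal sketch, assuming a tool-call shape like `[{"name": ..., "arguments": {...}}]` (the function name and field names are assumptions):

```python
import json
import re


def parse_tool_calls(model_output: str) -> list[dict]:
    """Extract a JSON array of tool calls from the model's raw text.

    Returns [] when no well-formed JSON array of calls is found,
    so malformed generations degrade gracefully instead of crashing.
    """
    match = re.search(r"\[.*\]", model_output, re.DOTALL)
    if not match:
        return []
    try:
        calls = json.loads(match.group(0))
    except json.JSONDecodeError:
        return []
    return [c for c in calls if isinstance(c, dict) and "name" in c]
```

Parsing with `json.loads` avoids the injection risk of `eval` and the fragility of `ast.literal_eval` on model output that is valid JSON but not a valid Python literal.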

