A lightweight knowledge base assistant using MCP with LLM integration. Features a streamlined server-client architecture combining custom tools with a knowledge base, all accessible via SSE transport. Ideal for building simple AI-powered knowledge assistants.
Updated May 11, 2025
MCP Knowledge Base
A simple MCP client and server example.
Requirements
- Python 3.9 or higher
- Poetry for dependency management
- OpenAI API key
Setup
1. Install dependencies using Poetry:

```bash
poetry install
```

2. Create a `.env` file in the project root or parent directory with your OpenAI API key:

```
OPENAI_API_KEY=your_api_key_here
```

Project Structure
- `server.py`: MCP server implementation with tools
- `client-sse.py`: MCP client implementation with LLM capabilities
- `data/kb.json`: knowledge base data with MCP-related Q&A
- `pyproject.toml`: Poetry configuration file
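The exact schema of `data/kb.json` is not shown in this README; a plausible shape is a list of question/answer pairs. The sketch below (with hypothetical field names) loads such a structure and looks up an answer, which is roughly what a knowledge-base tool on the server would do:

```python
import json
from typing import Optional

# Hypothetical kb.json content: the real schema in data/kb.json may differ.
kb = json.loads("""
{
  "questions": [
    {"question": "What is MCP?",
     "answer": "MCP (Model Context Protocol) connects LLMs to tools and data."},
    {"question": "What is the difference between stdio and SSE transports?",
     "answer": "stdio runs the server as a subprocess; SSE talks to a running HTTP server."}
  ]
}
""")

def lookup(question: str) -> Optional[str]:
    """Return the stored answer for a matching question, else None."""
    for entry in kb["questions"]:
        if entry["question"].lower() == question.lower():
            return entry["answer"]
    return None

print(lookup("What is MCP?"))  # prints the stored answer for that question
```

In practice the server would expose this lookup (or simply return the whole file) as an MCP tool for the LLM to call.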
Running the Application
1. Start the server:
```bash
poetry run python server.py
```

2. In a separate terminal, run the client:

```bash
poetry run python client-sse.py
```

Using the Client
The client has two modes:
1. Direct tool calls:
- Uncomment the `asyncio.run(test_direct_tool_calls())` line in `client-sse.py`
- This calls the tools directly, without using an LLM
2. LLM-powered interactions (default):
- Uses OpenAI to interpret queries and call appropriate tools
- Ask questions like "What is MCP?" or "What is the difference between stdio and SSE transports?"
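The direct-call mode is just an asyncio entry point. A minimal stand-alone sketch of that pattern, with a stub coroutine in place of the real MCP session (which would need the running SSE server), looks like:

```python
import asyncio

# Stub standing in for a real MCP tool call; the actual client would
# await something like session.call_tool(...) over the SSE connection.
async def get_knowledge_base() -> str:
    await asyncio.sleep(0)  # simulate the network round-trip
    return "Q: What is MCP?\nA: A protocol for connecting LLMs to tools."

async def test_direct_tool_calls() -> None:
    # Invoke the tool directly, bypassing the LLM entirely.
    result = await get_knowledge_base()
    print(result.splitlines()[0])  # prints "Q: What is MCP?"

asyncio.run(test_direct_tool_calls())
```

This is useful for checking that the server's tools work before layering the OpenAI-driven interpretation on top.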
Customizing
- Add new tools to `server.py` by creating additional functions with the `@mcp.tool()` decorator
- Modify the knowledge base by updating `data/kb.json`
- Change the OpenAI model by modifying the `model` parameter in the `MCPClient` class
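To illustrate the decorator pattern, here is a sketch using a local stand-in for `@mcp.tool()` so it runs without the MCP SDK installed; the real FastMCP decorator additionally derives the tool's schema from the function signature and docstring and registers it with the server:

```python
# Stand-in registry mimicking what @mcp.tool() does in server.py.
TOOLS = {}

def tool():
    def register(fn):
        TOOLS[fn.__name__] = fn  # the real decorator registers with the MCP server
        return fn
    return register

@tool()
def add_numbers(a: int, b: int) -> int:
    """Add two numbers (example of a new custom tool)."""
    return a + b

print(TOOLS["add_numbers"](2, 3))  # -> 5
```

In `server.py` the same shape applies: define a plain function, decorate it with `@mcp.tool()`, and the server exposes it to clients automatically.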