Track MCP

The world's largest repository of Model Context Protocol servers. Discover, explore, and submit MCP tools.

© 2026 TrackMCP. All rights reserved.

Built with ❤️ by Krishna Goyal

    Mcp Ollama Agent

    A TypeScript example showcasing the integration of Ollama with Model Context Protocol (MCP) servers. This project provides an interactive command-line interface for an AI agent that can use tools from multiple MCP servers.

    27 stars
    TypeScript
    Updated Nov 4, 2025
    agent
    mcp-agent
    mcp-client
    mcp-server
    ollama
    ollama-tools
    tool-agent
    typescript

    Table of Contents

    • ✨ Features
    • 🚀 Getting Started
    • ⚙️ Configuration
    • 💡 Example Usage
    • System Prompts
    • 🤝 Contributing

    Documentation

    TypeScript MCP Agent with Ollama Integration

    This project demonstrates integration between Model Context Protocol (MCP) servers and Ollama, allowing AI models to interact with various tools through a unified interface.

    ✨ Features

    • Supports multiple MCP servers (both uvx and npx servers tested)
    • Built-in support for file system operations and web research
    • Easy configuration through `mcp-config.json`, similar in format to `claude_desktop_config.json`
    • Interactive chat interface with Ollama integration, designed to work with any MCP tool
    • Standalone demo mode for testing web and filesystem tools without an LLM

    🚀 Getting Started

    1. Prerequisites:

    • Node.js (version 18 or higher)
    • Ollama installed and running
    • Install the MCP tools globally that you want to use:

    ```bash
    # For filesystem operations
    npm install -g @modelcontextprotocol/server-filesystem

    # For web research
    npm install -g @mzxrai/mcp-webresearch
    ```

    2. Clone and install:

    ```bash
    git clone https://github.com/ausboss/mcp-ollama-agent.git
    cd mcp-ollama-agent
    npm install
    ```

    3. **Configure your tools and a tool-capable Ollama model in `mcp-config.json`:**

    ```json
    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["@modelcontextprotocol/server-filesystem", "./"]
        },
        "webresearch": {
          "command": "npx",
          "args": ["-y", "@mzxrai/mcp-webresearch"]
        }
      },
      "ollama": {
        "host": "http://localhost:11434",
        "model": "qwen2.5:latest"
      }
    }
    ```
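
    For reference, here is a minimal sketch of loading and sanity-checking this config before starting any servers. The interfaces and the `loadConfig` helper are inferred from the sample above, not taken from the project's source:

    ```typescript
    import { readFileSync } from "node:fs";

    // Shapes inferred from the sample mcp-config.json above.
    interface ServerEntry {
      command: string; // e.g. "npx" or "uvx"
      args: string[];
    }

    interface McpConfig {
      mcpServers: Record<string, ServerEntry>;
      ollama: { host: string; model: string };
    }

    // Load and validate the config before spawning any servers.
    export function loadConfig(path: string): McpConfig {
      const raw = JSON.parse(readFileSync(path, "utf8")) as McpConfig;
      if (!raw.mcpServers || Object.keys(raw.mcpServers).length === 0) {
        throw new Error("mcp-config.json must define at least one server");
      }
      for (const [name, entry] of Object.entries(raw.mcpServers)) {
        if (!entry.command || !Array.isArray(entry.args)) {
          throw new Error(`server "${name}" needs a command and an args array`);
        }
      }
      return raw;
    }
    ```

    Failing fast here gives a clearer error than letting a malformed entry surface later as a cryptic spawn failure.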

    4. Run the demo to test filesystem and webresearch tools without an LLM:

    ```bash
    npx tsx ./src/demo.ts
    ```

    5. Or start the chat interface with Ollama:

    ```bash
    npm start
    ```

    ⚙️ Configuration

    • MCP Servers: Add any MCP-compatible server to the mcpServers section
    • Ollama: Configure host and model (must support function calling)
    • Supports both Python (uvx) and Node.js (npx) MCP servers
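
    Both server types work because each entry boils down to a child process speaking MCP over stdio. A sketch of how one configured server could be launched (the `ServerEntry` shape mirrors the config above; the real project uses the MCP SDK's stdio transport rather than a raw spawn):

    ```typescript
    import { spawn, type ChildProcess } from "node:child_process";

    // One entry from the "mcpServers" section of mcp-config.json.
    interface ServerEntry {
      command: string; // "npx" for Node.js servers, "uvx" for Python servers
      args: string[];
    }

    // Start one MCP server; MCP messages flow over its stdin/stdout,
    // while stderr is passed through for server-side diagnostics.
    export function startServer(entry: ServerEntry): ChildProcess {
      return spawn(entry.command, entry.args, { stdio: ["pipe", "pipe", "inherit"] });
    }
    ```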

    💡 Example Usage

    This example uses the model `qwen2.5:latest`:

    ```text
    Chat started. Type "exit" to end the conversation.
    You: can you use your list directory tool to see whats in test-directory then use your read file tool to read it to me?
    Model is using tools to help answer...
    Using tool: list_directory
    With arguments: { path: 'test-directory' }
    Tool result: [ { type: 'text', text: '[FILE] test.txt' } ]
    Assistant:
    Model is using tools to help answer...
    Using tool: read_file
    With arguments: { path: 'test-directory/test.txt' }
    Tool result: [ { type: 'text', text: 'rosebud' } ]
    Assistant: The content of the file `test.txt` in the `test-directory` is:
    rosebud
    You: thanks
    Assistant: You're welcome! If you have any other requests or need further assistance, feel free to ask.
    ```
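
    The transcript above follows a standard tool-call loop: send the conversation to the model, execute any tool calls it returns, feed the results back, and repeat until the model answers in plain text. A minimal sketch of that loop, with the model and tool dispatch abstracted behind function types (the real project wires these to the `ollama` client and the connected MCP servers):

    ```typescript
    interface ToolCall { name: string; args: Record<string, unknown> }
    interface ModelReply { content: string; toolCalls: ToolCall[] }
    type Message = { role: "user" | "assistant" | "tool"; content: string };

    // Stand-ins for the real Ollama chat call and MCP tool dispatch.
    type ChatFn = (messages: Message[]) => ModelReply;
    type ToolFn = (call: ToolCall) => string;

    // Core agent loop: keep executing tools until the model stops requesting them.
    export function runAgent(chat: ChatFn, runTool: ToolFn, userInput: string): string {
      const messages: Message[] = [{ role: "user", content: userInput }];
      for (let turn = 0; turn < 10; turn++) { // cap turns to avoid infinite loops
        const reply = chat(messages);
        if (reply.toolCalls.length === 0) return reply.content;
        for (const call of reply.toolCalls) {
          messages.push({ role: "tool", content: runTool(call) });
        }
      }
      throw new Error("agent exceeded turn limit");
    }
    ```

    The turn cap matters in practice: local models occasionally loop on the same tool call, and an unbounded loop would hang the chat.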

    System Prompts

    Some local models may need help with tool selection. Customize the system prompt in ChatManager.ts to improve tool usage.
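
    Such a customization might look like the following. The prompt wording and the `buildMessages` helper are illustrative; the actual hook lives in `ChatManager.ts`:

    ```typescript
    // Example system prompt nudging a local model toward correct tool use.
    const SYSTEM_PROMPT = `You are a helpful assistant with access to tools.
    When a task involves files or the web, call the matching tool instead of guessing.
    Answer in plain text once you have the tool results you need.`;

    type Role = "system" | "user" | "assistant";
    interface ChatMessage { role: Role; content: string }

    // Prepend the system prompt so it applies to every turn of the conversation.
    export function buildMessages(history: ChatMessage[]): ChatMessage[] {
      return [{ role: "system", content: SYSTEM_PROMPT }, ...history];
    }
    ```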

    🤝 Contributing

    Contributions welcome! Feel free to submit issues or pull requests.

    Similar MCP

    Based on tags & features

    • Mcp Open Library (TypeScript, 42)
    • Metmuseum Mcp (TypeScript, 14)
    • Mcp Aoai Web Browsing (Python, 30)
    • Mcp Ipfs (TypeScript, 11)

    Trending MCP

    Most active this week

    • Playwright Mcp (TypeScript, 22.1k)
    • Serena (Python, 14.5k)
    • Mcp Playwright (TypeScript, 4.9k)
    • Mcp Server Cloudflare (TypeScript, 3.0k)

    View All MCP Servers
