Track MCP

The world's largest repository of Model Context Protocol servers. Discover, explore, and submit MCP tools.


© 2026 TrackMCP. All rights reserved.

Built with ❤️ by Krishna Goyal

    Simple Mcp Ollama Bridge

    A simple bridge from Ollama to a fetch-URL MCP server

    1 star
    Python
    Updated Jun 4, 2025

    Table of Contents

    • Quick Start
    • Additional Endpoint Support
    • Ollama
    • License
    • Contributing


    Documentation

    MCP LLM Bridge

    A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs like Ollama

    Read more about MCP by Anthropic here:

    • Resources
    • Prompts
    • Tools
    • Sampling

    Quick Start

```bash
# Install uv, clone the repo, and install the bridge into a virtualenv
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .
```
    
    Note: reactivate the environment if needed to use the keys in `.env`: `source .venv/bin/activate`
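The `.env` keys only matter if you use the OpenAI-backed configuration shown further down. A hypothetical sketch of such a file (the variable names are the ones read via `os.getenv` in the commented-out `LLMConfig`; the values are placeholders):

```shell
# Hypothetical .env at the repository root (placeholder values)
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4o
```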
    
    Then configure the bridge in [src/mcp_llm_bridge/main.py](src/mcp_llm_bridge/main.py)

```python
mcp_server_params=StdioServerParameters(
    command="uv",
    # CHANGE THIS: it needs to be an absolute directory! Add the MCP fetch server
    # at that directory (clone from https://github.com/modelcontextprotocol/servers/)
    args=["--directory", "~/llms/mcp/mc-server-fetch/servers/src/fetch", "run", "mcp-server-fetch"],
    env=None
),
# llm_config=LLMConfig(
#     api_key=os.getenv("OPENAI_API_KEY"),
#     model=os.getenv("OPENAI_MODEL", "gpt-4o"),
#     base_url=None
# ),
llm_config=LLMConfig(
    api_key="ollama",  # Can be any string for local testing
    model="llama3.2",
    base_url="http://localhost:11434/v1"  # Point to your local model's endpoint
),
)
```
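As the comment warns, the `--directory` value must be absolute: the argument is handed to the `uv` subprocess, so `~` is generally not expanded for you. A minimal sketch of resolving such a path with the standard library (the path is just the placeholder from the config above):

```python
from pathlib import Path

# Expand "~" and make the path absolute before handing it to uv's --directory.
fetch_dir = Path("~/llms/mcp/mc-server-fetch/servers/src/fetch").expanduser().resolve()

args = ["--directory", str(fetch_dir), "run", "mcp-server-fetch"]
print(args)
```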

    Additional Endpoint Support

    The bridge also works with any endpoint implementing the OpenAI API specification:

    Ollama

```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
```
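Since the endpoint only needs to implement the OpenAI API, the traffic the bridge sends it is an ordinary chat-completions request. A minimal sketch of that request body (field names follow the OpenAI chat completions format; the prompt text is made up, and the model name matches the config above):

```python
import json

# An OpenAI-compatible /v1/chat/completions request body, built as a plain dict
# and serialized to JSON, e.g. for POSTing to http://localhost:11434/v1.
payload = {
    "model": "mistral-nemo:12b-instruct-2407-q8_0",
    "messages": [
        {"role": "user", "content": "Fetch https://example.com and summarize it."},
    ],
}

body = json.dumps(payload)
print(body)
```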

    License

    [MIT](LICENSE.md)

    Contributing

    PRs welcome.

    Similar MCP

    Based on tags & features

    • Nebulablock Mcp Server · Python · 1
    • Chuk Mcp Linkedin · Python · 0
    • Pursuit Mcp · Python · 0
    • Hello Mcp · Python · 0

    Trending MCP

    Most active this week

    • Playwright Mcp · TypeScript · 22.1k
    • Serena · Python · 14.5k
    • Mcp Playwright · TypeScript · 4.9k
    • Mcp Server Cloudflare · TypeScript · 3.0k

    View All MCP Servers
