Track MCP

The world's largest repository of Model Context Protocol servers. Discover, explore, and submit MCP tools.


    1mcpserver

    17 stars
    Python
    Updated Oct 14, 2025

    Table of Contents

    • Demo video 🎥: https://youtu.be/W4EAmaTTb2A
    • Quick Setup
    • 1) Remote 🌍⚡💨
    • Configure your MCP client
    • 2) Local (prebuilt) 💻
    • 2A) Docker 🐳
    • 2B) npx 📦
    • 3) Local (from source) 🧩
    • Optional: grant file-system access 📁
    • Architecture 🧠
    • Quick Search ⚡
    • Deep Search 🌊
    • Change Log 🕒
    • Future 🔮
    • Credits 🙏
    • Troubleshooting 🧰
    • Star History


1 MCP Server 🚀

    MCP of MCPs — automatically discover and configure MCP servers on your machine (remote or local).

    After setup, you can usually just say:

“I want to perform [task]. Call the deep_search tool and follow the outlined steps.”

    The goal is that you only install this MCP server, and it handles the rest (searching servers, selecting servers, configuring servers, etc.).

    Demo video 🎥: https://youtu.be/W4EAmaTTb2A

    Quick Setup

    Choose one of the following:

    1. Remote (simplest & fastest ⚡💨)

    2. Local (prebuilt) — Docker, uvx, or npx

    3. Local (from source) — run this repo directly

    1) Remote 🌍⚡💨

    Use the hosted endpoint (recommended for the simplest setup).

    Docs + guided setup: https://mcp.1mcpserver.com/

    Configure your MCP client

    Add the following entry to your client config file:

    • Cursor: ./.cursor/mcp.json
• Gemini CLI: ./.gemini/settings.json (see Gemini docs)
    • Claude Desktop:
    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
    • Codex:
    • macOS: ~/.codex/config.toml
    • Windows: %USERPROFILE%\.codex\config.toml

    Remote config (JSON):

```json
{
  "mcpServers": {
    "1mcpserver": {
      "url": "https://mcp.1mcpserver.com/mcp/",
      "headers": {
        "Accept": "text/event-stream",
        "Cache-Control": "no-cache"
      }
    }
  }
}
```

If you already have other servers configured, just merge this entry under `mcpServers`. For example:

```json
{
  "mcpServers": {
    "1mcpserver": {
      "url": "https://mcp.1mcpserver.com/mcp/",
      "headers": {
        "Accept": "text/event-stream",
        "Cache-Control": "no-cache"
      }
    },
    "file-system": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```
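The merge can also be scripted. Here is a minimal sketch (the `merge_entry` helper and config path are illustrative, not part of this project) that inserts the 1mcpserver entry into an existing client config without disturbing other servers:

```python
import json
from pathlib import Path

# The entry from the Remote setup above.
ENTRY = {
    "1mcpserver": {
        "url": "https://mcp.1mcpserver.com/mcp/",
        "headers": {
            "Accept": "text/event-stream",
            "Cache-Control": "no-cache",
        },
    }
}

def merge_entry(config_path: str) -> dict:
    """Merge the 1mcpserver entry into an MCP client config file.

    Existing servers under "mcpServers" are preserved; only the
    "1mcpserver" key is added (or overwritten if already present).
    """
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {}).update(ENTRY)
    path.write_text(json.dumps(config, indent=2))
    return config
```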

    Tip: If your client supports it, move the config file into your home directory to apply globally.

    ---

    2) Local (prebuilt) 💻

    Use this when you want everything local, or when your MCP client only supports STDIO.

    2A) Docker 🐳

```bash
docker run -p 8080:8080 ghcr.io/particlefuture/1mcpserver:latest
```

Running on another host port (replace `<HOST_PORT>` with the port you want):

```bash
docker run -p <HOST_PORT>:8080 ghcr.io/particlefuture/1mcpserver:latest
```

Running with stdio instead of streamable-http (you might see some delay when connecting):

```bash
docker run --rm -i ghcr.io/particlefuture/1mcpserver:latest --local
```

```json
{
  "mcpServers": {
    "1mcpserver": {
      "url": "https://mcp.1mcpserver.com/mcp/"
    }
  }
}
```

    2B) npx 📦

```bash
npx -y @1mcpserver/1mcpserver
```

    ---

    3) Local (from source) 🧩

    Clone this repo and run directly.

```bash
git clone https://github.com/particlefuture/MCPDiscovery.git
cd MCPDiscovery
uv sync
uv run server.py --local
```

```json
{
  "mcpServers": {
    "1mcpserver": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/MCPDiscovery",
        "run",
        "server.py",
        "--local"
      ]
    }
  }
}
```

If your client supports remote URL servers, you can use the Remote setup instead.

    Optional: grant file-system access 📁

    If you want your LLM to have file-system access, add an MCP filesystem server and point it at the directory you want to allow:

```json
{
  "mcpServers": {
    "file-system": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "~/"]
    }
  }
}
```

    ---

    Architecture 🧠

    There are two search modes:

    Quick Search ⚡

    For explicit requests like: “I want an MCP server that handles payments.”

    Returns a shortlist of relevant MCP servers.
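The actual search runs on the hosted server; purely as an illustration of the query-to-shortlist shape, a shortlist could be built by keyword scoring over a catalog (the `CATALOG` entries and `quick_search` function here are invented for this sketch):

```python
# Illustrative sketch only: not the real implementation, just the
# "explicit request in, ranked shortlist out" idea of Quick Search.
CATALOG = [
    {"name": "stripe-mcp", "description": "handle payments and invoices"},
    {"name": "server-filesystem", "description": "read and write local files"},
    {"name": "playwright-mcp", "description": "browser automation and scraping"},
]

def quick_search(query: str, limit: int = 2) -> list[str]:
    """Score each catalog entry by word overlap with the query."""
    words = set(query.lower().split())
    scored = [
        (len(words & set(entry["description"].split())), entry["name"])
        for entry in CATALOG
    ]
    # Highest overlap first; drop entries with no overlap at all.
    ranked = [name for score, name in sorted(scored, reverse=True) if score > 0]
    return ranked[:limit]
```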

    Deep Search 🌊

    For higher-level or complex goals like: “Build a website that analyzes other websites.”

    The LLM breaks the goal into components/steps, finds MCP servers for each part, and if something is missing, it asks whether to:

    • ignore that part,
    • break it down further, or
    • implement it ourselves.

    Deep Search stages:

    1. Planning — identify servers, keys, and config changes

    2. Testing — verify servers (via test_server_template_code)

    3. Acting — execute the workflow using the configured servers

    ---

    Change Log 🕒

• July 31, 2025: Upgraded to 0.2.0; added agentic planning.
• Dec 12, 2025: Support for Gemini + Codex.
• Dec 13, 2025: Easier local setup with Docker and npm.

    Future 🔮

    • Better demo videos (new domain, narrated walkthrough)
    • Model Context Communication Protocol (MCCP): standard server-to-server messaging
    • Avoid calling tools with an internal_ prefix unless instructed
    • Improve MCP server database schema: server, description, url, config json, extra setup (docker/api key/etc)
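The proposed database schema could be modeled, for illustration, as a dataclass. The field names follow the bullet above; the types and the `ServerRecord` name are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class ServerRecord:
    """One row of the proposed MCP server database (types are guesses)."""
    server: str                                           # server name / identifier
    description: str                                      # short human-readable summary
    url: str                                              # repo or hosted endpoint
    config_json: dict = field(default_factory=dict)       # ready-to-merge client config
    extra_setup: list = field(default_factory=list)       # e.g. ["docker", "api key"]
```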

    Credits 🙏

    Data sources:

    • wong2/awesome-mcp-servers
    • metorial/mcp-containers
    • punkpeye/awesome-mcp-servers
    • modelcontextprotocol/servers

    Published to:

    • https://mcpservers.org/
    • https://glama.ai/mcp/servers

    Troubleshooting 🧰

• If you're using a venv and get ModuleNotFoundError even after installing: delete the venv and recreate it.

Please create an issue or contact me directly at zjia71@gatech.edu if you encounter any issue or frustration. I really hope the setup is as smooth as possible!

    Star History

    Star History Chart

Similar MCP

Based on tags & features

• Esp Rainmaker Mcp (Python · 9)
• Personalizationmcp (Python · 12)
• Mayamcp (Python · 27)
• Biothings Mcp (Python · 25)

Trending MCP

Most active this week

• Playwright Mcp (TypeScript · 22.1k)
• Serena (Python · 14.5k)
• Mcp Playwright (TypeScript · 4.9k)
• Mcp Server Cloudflare (TypeScript · 3.0k)
