
    Mcp Client Langchain Py

    Simple MCP Client CLI Implementation Using LangChain ReAct Agent / Python

    10 stars · Python · Updated Sep 26, 2025

    Tags: langchain, langchain-python, mcp, mcp-client, modelcontextprotocol, python, tool-call, tool-calling

    Table of Contents

    • Prerequisites
    • Quick Start
    • Features
    • Limitations
    • Usage
    • Basic Usage
    • With Options
    • Supported Model/API Providers
    • Configuration
    • Environment Variables
    • Popular MCP Servers to Try
    • Troubleshooting
    • Building from Source
    • Change Log
    • License
    • Contributing


    Documentation

    Simple MCP Client to Explore MCP Servers (MIT License, available on PyPI)

    Quickly test and explore MCP servers from the command line!

    A simple, text-based CLI client for Model Context Protocol (MCP) servers built with LangChain and Python.

    Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.

    Internally it uses a LangChain agent and the utility function convert_mcp_to_langchain_tools() from [langchain_mcp_tools](https://pypi.org/project/langchain-mcp-tools/). A TypeScript equivalent of this utility is available here.

    Prerequisites

    • Python 3.11+
    • [optional] [uv (uvx)](https://docs.astral.sh/uv/getting-started/installation/) installed to run Python package-based MCP servers
    • [optional] [npm 7+ (npx)](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm) to run Node.js package-based MCP servers
    • LLM API key(s) from OpenAI, Anthropic, Google AI Studio (for GenAI/Gemini), xAI, Cerebras, and/or Groq, as needed
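Whether the optional launchers are available can be checked programmatically. A minimal sketch (the function name `check_prereqs` is illustrative, not part of mcp-chat):

```python
import shutil
import sys

def check_prereqs():
    """Report whether the interpreter and the optional MCP-server launchers are usable."""
    return {
        "python>=3.11": sys.version_info >= (3, 11),  # required
        "uvx": shutil.which("uvx") is not None,       # optional: Python package-based servers
        "npx": shutil.which("npx") is not None,       # optional: Node.js package-based servers
    }

for name, ok in check_prereqs().items():
    print(f"{name}: {'OK' if ok else 'missing'}")
```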

    Quick Start

    • Install the mcp-chat tool. This can take up to a few minutes to complete:

    bash
    pip install mcp-chat
    • Configure the LLM and MCP server settings via the configuration file, llm_mcp_config.json5
    bash
    code llm_mcp_config.json5

    The following is a simple configuration for quick testing:

    json5
    {
        "llm": {
          "provider": "openai",       "model": "gpt-5-mini"
          // "provider": "anthropic",    "model": "claude-haiku-4-5"
          // "provider": "google_genai", "model": "gemini-2.5-flash"
          // "provider": "xai",          "model": "grok-4-1-fast-non-reasoning"
          // "provider": "cerebras",     "model": "gpt-oss-120b"
          // "provider": "groq",         "model": "openai/gpt-oss-20b"
        },
    
        "mcp_servers": {
          "us-weather": {  // US weather only
            "command": "npx", 
            "args": ["-y", "@h1deya/mcp-server-weather"]
          },
        },
    
        "example_queries": [
          "Tell me how LLMs work in a few sentences",
          "Are there any weather alerts in California?",
        ],
      }
    • Set up API keys
    bash
    echo "ANTHROPIC_API_KEY=sk-ant-...
      OPENAI_API_KEY=sk-proj-...
      GOOGLE_API_KEY=AI...
      XAI_API_KEY=xai-...
      CEREBRAS_API_KEY=csk-...
      GROQ_API_KEY=gsk_..." > .env
      
      code .env
    • Run the tool
    bash
    mcp-chat

    By default, it reads the configuration file llm_mcp_config.json5 from the current directory, then applies the environment variables specified in the .env file, as well as those already defined in the environment.
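The precedence described above (variables already defined in the environment win over values from .env) can be sketched with a minimal loader; `load_env_file` is a hypothetical helper, not mcp-chat's actual implementation:

```python
import os

def load_env_file(path):
    """Load KEY=VALUE lines from a .env file without overriding
    variables that are already set in the environment."""
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, value = line.split("=", 1)
            # setdefault: only assigns if the key is not already present
            os.environ.setdefault(key.strip(), value.strip())
```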

    Features

    • Easy setup: Works out of the box with popular MCP servers
    • Flexible configuration: JSON5 config with environment variable support
    • Multiple LLM/API providers: OpenAI, Anthropic, Google (GenAI), xAI, Cerebras, Groq
    • Command & URL servers: Support for both local and remote MCP servers
    • Local MCP Server logging: Save stdio MCP server logs with customizable log directory
    • Interactive testing: Example queries for the convenience of repeated testing

    Limitations

    • Tool Return Types: Currently, only text results of tool calls are supported. It uses LangChain's response_format: 'content' (the default) internally, which only supports text strings. While MCP tools can return multiple content types (text, images, etc.), this library currently filters and uses only text content.

    • MCP Features: Only MCP Tools are supported. Other MCP features like Resources, Prompts, and Sampling are not implemented.
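The text-only behavior can be illustrated by filtering an MCP-style tool result down to its text parts. A sketch under the assumption that results arrive as a list of `{"type": ..., ...}` dicts, as in MCP's content model; `extract_text` is hypothetical, not this library's actual code:

```python
def extract_text(content_blocks):
    """Keep only the text parts of an MCP tool result and join them,
    mirroring the text-only limitation described above."""
    texts = [
        block["text"]
        for block in content_blocks
        if block.get("type") == "text"
    ]
    return "\n".join(texts)

result = [
    {"type": "text", "text": "Alert: heat advisory"},
    {"type": "image", "data": "...", "mimeType": "image/png"},  # dropped
    {"type": "text", "text": "Source: NWS"},
]
print(extract_text(result))  # Alert: heat advisory\nSource: NWS
```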

    Usage

    Basic Usage

    bash
    mcp-chat

    By default, it reads the configuration file llm_mcp_config.json5 from the current directory, then applies the environment variables specified in the .env file, as well as those already defined in the environment.

    It outputs local MCP server logs to the current directory.

    With Options

    bash
    # Specify the config file to use
    mcp-chat --config my-config.json5
    
    # Store local (stdio) MCP server logs in specific directory
    mcp-chat --log-dir ./logs
    
    # Enable verbose logging
    mcp-chat --verbose
    
    # Show help
    mcp-chat --help

    Supported Model/API Providers

    • OpenAI: gpt-5-mini, gpt-5.2, etc.
    • Anthropic: claude-haiku-4-5, claude-3-5-haiku-latest, etc.
    • Google (GenAI): gemini-2.5-flash, gemini-3-flash-preview, etc.
    • xAI: grok-3-mini, grok-4-1-fast-non-reasoning, etc.
    • Cerebras: gpt-oss-120b, etc.
    • Groq: openai/gpt-oss-20b, openai/gpt-oss-120b, etc.

    Configuration

    Create an llm_mcp_config.json5 file:

    • The configuration file format for MCP servers follows the same structure as Claude for Desktop, with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.

    • The file format is JSON5, where comments and trailing commas are allowed.

    • The format is further extended to replace ${...} notations with the values of the corresponding environment variables.

    • Keep all credentials and private info in the .env file and refer to them with the ${...} notation as needed.

    json5
    {
      "llm": {
        "provider": "openai",       "model": "gpt-5-mini"
        // "provider": "anthropic",    "model": "claude-haiku-4-5"
        // "provider": "google_genai", "model": "gemini-2.5-flash"
        // "provider": "xai",          "model": "grok-4-1-fast-non-reasoning"
        // "provider": "cerebras",     "model": "gpt-oss-120b"
        // "provider": "groq",         "model": "openai/gpt-oss-20b"
      },
    
      "example_queries": [
        "Read and briefly summarize the LICENSE file in the current directory",
        "Fetch the raw HTML content from bbc.com and tell me the title",
        // "Search for 'news in California' and show the first hit",
        // "Tell me about my default GitHub profile",
        // "Tell me about my default Notion account",
      ],
    
      "mcp_servers": {
        // Local MCP server that uses `npx`
        // https://www.npmjs.com/package/@modelcontextprotocol/server-filesystem
        "filesystem": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-filesystem",
            "."  // path to a directory to allow access to
          ]
        },
    
        // Local MCP server that uses `uvx`
        // https://pypi.org/project/mcp-server-fetch/
        "fetch": {
          "command": "uvx",
          "args": [
            "mcp-server-fetch"
          ]
        },
    
        // Embedding the value of an environment variable
        // https://www.npmjs.com/package/@modelcontextprotocol/server-brave-search
        "brave-search": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-brave-search"
          ],
          "env": {
            "BRAVE_API_KEY": "${BRAVE_API_KEY}"
          }
        },
    
        // Example of remote MCP server authentication via Authorization header
        // https://github.com/github/github-mcp-server?tab=readme-ov-file#remote-github-mcp-server
        "github": {
          // To avoid auto protocol fallback, specify the protocol explicitly when using authentication
          "type": "http",
          "url": "https://api.githubcopilot.com/mcp/",
          "headers": {
            "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"
          }
        },
    
        // For remote MCP servers that require OAuth, consider using "mcp-remote"
        "notion": {
          "command": "npx",
          "args": ["-y", "mcp-remote", "https://mcp.notion.com/mcp"],
        },
      }
    }
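The ${...} extension can be sketched as a pass over the parsed configuration that substitutes environment variables into string values (`expand_env` is illustrative; mcp-chat's actual implementation may differ):

```python
import os
import re

_VAR = re.compile(r"\$\{(\w+)\}")

def expand_env(value):
    """Recursively replace ${VAR} in string values with os.environ['VAR'].
    Unknown variables are left untouched."""
    if isinstance(value, str):
        return _VAR.sub(lambda m: os.environ.get(m.group(1), m.group(0)), value)
    if isinstance(value, dict):
        return {k: expand_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [expand_env(v) for v in value]
    return value  # numbers, booleans, None pass through unchanged

os.environ["BRAVE_API_KEY"] = "BSA-demo"
cfg = {"env": {"BRAVE_API_KEY": "${BRAVE_API_KEY}"}}
print(expand_env(cfg))  # {'env': {'BRAVE_API_KEY': 'BSA-demo'}}
```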

    Environment Variables

    Create a .env file for API keys:

    bash
    OPENAI_API_KEY=sk-proj-...
    ANTHROPIC_API_KEY=sk-ant-...
    GOOGLE_API_KEY=AI...
    XAI_API_KEY=xai-...
    CEREBRAS_API_KEY=csk-...
    GROQ_API_KEY=gsk_...
    
    # Other services as needed
    GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
    BRAVE_API_KEY=BSA...

    Popular MCP Servers to Try

    There are quite a few useful MCP servers already available:

    • MCP Server Listing on the Official Site

    Troubleshooting

    • Make sure your configuration and .env files are correct, especially the spelling of the API keys
    • Check the local MCP server logs
    • Use the --verbose flag to view detailed logs
    • Refer to the Debugging section in the MCP documentation

    Building from Source

    See README_DEV.md for details.

    Change Log

    Can be found here

    License

    MIT License - see LICENSE file for details.

    Contributing

    Issues and pull requests welcome! This tool aims to make MCP server testing as simple as possible.
