Track MCP

The world's largest repository of Model Context Protocol servers. Discover, explore, and submit MCP tools.


© 2026 TrackMCP. All rights reserved.

Built with ❤️ by Krishna Goyal

DeepChat

🐬 DeepChat - a smart assistant that connects powerful AI to your personal world. A Model Context Protocol client that enhances AI assistants with powerful integrations.

    4,495 stars
    TypeScript
    Updated Oct 19, 2025
    agent
    ai
    ai-assistant
    ai-chat
    chat
    chatbot
    chatgpt
    claude
    cross-platform
    deepseek
    gemini
    llm-client
    mcp
    mcp-client
    openai-client
    tool-calling


    DeepChat - Powerful Open-Source AI Agent Platform

    DeepChat is a feature-rich open-source AI agent platform that unifies models, tools, and agents: multi-LLM chat, MCP tool calling, and ACP agent integration.

    📑 Table of Contents

    • 📑 Table of Contents
    • 🚀 Project Introduction
    • 💡 Why Choose DeepChat
    • 🔥 Main Features
    • 🧩 ACP Integration (Agent Client Protocol)
    • 🤖 Supported Model Providers
    • Compatible with any model provider in OpenAI/Gemini/Anthropic API format
    • 🔍 Use Cases
    • 📦 Quick Start
    • Download and Install
    • Configure Models
    • Start Conversations
    • 💻 Development Guide
    • Install Dependencies
    • Start Development
    • Build
• 👥 Community & Contribution
• ⭐ Star History
• 👨‍💻 Contributors
• 🙏🏻 Thanks
• 📃 License

    🚀 Project Introduction

    DeepChat is a powerful open-source AI agent platform that brings together models, tools, and agent runtimes in one desktop app. Whether you're using cloud APIs like OpenAI, Gemini, Anthropic, or locally deployed Ollama models, DeepChat delivers a smooth user experience.

    Beyond chat, DeepChat supports agentic workflows: rich tool calling via MCP (Model Context Protocol), and unique ACP (Agent Client Protocol) integration that lets you run ACP-compatible agents as first-class “models” with a dedicated workspace UI.

    💡 Why Choose DeepChat

    Compared to other AI tools, DeepChat offers the following unique advantages:

    • Unified Multi-Model Management: One application supports almost all mainstream LLMs, eliminating the need to switch between multiple apps
    • Seamless Local Model Integration: Built-in Ollama support allows you to manage and use local models without command-line operations
    • Agentic Protocol Ecosystem: Built-in MCP support enables tool calling (code execution, web access, etc.), and built-in ACP support connects external agents into DeepChat with a native workspace UX
• Powerful Search Enhancement: Multiple search engines make AI responses more accurate and timely, and unconventional web-search approaches can be customized quickly
    • Privacy-Focused: Local data storage and network proxy support reduce the risk of information leakage
    • Business-Friendly: Embraces open source under the Apache License 2.0, suitable for both commercial and personal use

    🔥 Main Features

• 🌐 Multiple Cloud LLM Provider Support: DeepSeek, OpenAI, Kimi, Grok, Gemini, Anthropic, and more
• 🏠 Local Model Deployment Support:
  • Integrated Ollama with comprehensive management capabilities
  • Download, deploy, and run Ollama models without any command-line operations
• 🚀 Rich and Easy-to-Use Chat Capabilities
  • Complete Markdown rendering, with code blocks powered by the industry-leading CodeMirror
  • Multi-window + multi-tab architecture supporting parallel sessions in every dimension: use large models the way you use a browser, with a non-blocking experience and excellent efficiency
  • Artifacts rendering for diverse result presentation, significantly reducing token consumption after MCP integration
  • Messages can be retried to generate multiple variations, and conversations can be forked freely, so there is always a suitable line of thought
  • Renders images, Mermaid diagrams, and other multi-modal content; supports GPT-4o, Gemini, and Grok text-to-image capabilities
  • Highlights external information sources, such as search results, within the content
• 🔍 Robust Search Extension Capabilities
  • Built-in integration with leading search APIs such as BoSearch and Brave Search via MCP mode, letting the model decide intelligently when to search
  • Supports mainstream search engines such as Google, Bing, Baidu, and Sogou Official Accounts search by simulating user web browsing, enabling the LLM to read search engines the way a human does
  • Supports reading any search engine: simply configure a search assistant model to connect search sources of all kinds, whether internal networks, API-less engines, or vertical-domain engines, as information sources for the model
• 🔧 Excellent MCP (Model Context Protocol) Support
  • Complete support for the three core MCP capabilities: Resources, Prompts, and Tools
  • Supports semantic workflows, enabling more complex and intelligent automation by understanding the meaning and context of tasks
  • Extremely user-friendly configuration interface
  • Aesthetically pleasing and clear tool-call display
  • Detailed tool-call debugging window with automatic formatting of tool parameters and return data
  • Built-in Node.js runtime environment; npx/node-style services require no extra configuration and work out of the box
  • Supports StreamableHTTP/SSE/Stdio transports
  • Supports inMemory services with built-in utilities such as code execution, web information retrieval, and file operations; ready for most common use cases without additional installation
  • Exposes visual-model capabilities as functions that any model can use, via the built-in MCP service
• 🤝 ACP (Agent Client Protocol) Agent Integration
  • Run ACP-compatible agents (built-in or custom commands) as selectable "models"
  • ACP workspace UI for structured plans, tool calls, and terminal output when provided by the agent
• 💻 Multi-Platform Support: Windows, macOS, Linux
• 🎨 Beautiful and User-Friendly Interface: user-oriented design with meticulously themed light and dark modes
• 🔗 Rich DeepLink Support: initiate conversations via links for seamless integration with other applications, and install MCP services with one click for simplicity and speed
• 🚑 Security-First Design: chat data and configuration data ship with reserved encryption interfaces and code-obfuscation capabilities
• 🛡️ Privacy Protection: supports hiding content during screen projection, network proxies, and other measures to reduce the risk of information leakage
• 💰 Business-Friendly:
  • Open source under the Apache License 2.0, worry-free for enterprise use
  • Enterprise integration requires only minimal configuration changes to enable the reserved encryption and obfuscation capabilities
  • Clear code structure: model providers and MCP services are highly decoupled and can be customized freely at minimal cost
  • Sound architecture with data interaction separated from UI behavior; makes full use of Electron's capabilities rather than being a simple web wrapper, delivering excellent performance

    For more details on how to use these features, see the User Guide.
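For a concrete sense of what MCP tool calling looks like on the wire, the sketch below builds the JSON-RPC 2.0 request envelope the MCP specification defines for `tools/call`. The tool name and arguments are hypothetical examples, not tools shipped with DeepChat.

```typescript
// Minimal sketch of the JSON-RPC 2.0 envelope MCP uses for tool calls.
// The tool name ("read_file") and its arguments are made-up examples.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Over the Stdio transport, each message travels as one JSON line.
const request = buildToolCall(1, "read_file", { path: "notes.md" });
console.log(JSON.stringify(request));
```

The same envelope is carried unchanged over the StreamableHTTP and SSE transports; only the framing around it differs.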

    🧩 ACP Integration (Agent Client Protocol)

    DeepChat has built-in support for Agent Client Protocol (ACP), allowing you to integrate external agent runtimes into DeepChat with a native UI. Once enabled, ACP agents appear as first-class entries in the model selector, so you can use coding agents and task agents directly inside DeepChat.

    Quick start:

    1. Open Settings → ACP Agents and enable ACP

    2. Enable a built-in ACP agent or add a custom ACP-compatible command

    3. Select the ACP agent in the model selector to start an agent session

To explore the ecosystem of compatible agents and clients, see https://agentclientprotocol.com/overview/clients.
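Like MCP, ACP is JSON-RPC-based: the client spawns the agent as a subprocess and exchanges messages over its stdio. As a rough, unofficial sketch of the handshake shape (method and field names follow the public ACP docs loosely; treat the details as illustrative and check the specification before relying on them):

```typescript
// Illustrative sketch only: ACP runs JSON-RPC over the agent's stdio.
// The exact initialize params are an assumption; consult
// agentclientprotocol.com for the authoritative message shapes.
function buildInitialize(id: number, protocolVersion: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "initialize",
    params: { protocolVersion },
  };
}

// The client writes this as one JSON line to the agent's stdin,
// then reads the agent's advertised capabilities from its stdout.
console.log(JSON.stringify(buildInitialize(0, 1)));
```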

    🤖 Supported Model Providers

    Compatible with any model provider in OpenAI/Gemini/Anthropic API format
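Concretely, "OpenAI API format" means the `POST /v1/chat/completions` request shape, so any provider exposing it can be added with just a base URL and an API key. A minimal sketch of such a request body (the model name and endpoint are placeholders, not DeepChat-specific values):

```typescript
// Sketch of an OpenAI-format chat completion request body.
// The model name is a placeholder; any OpenAI-compatible provider
// accepts this shape at its /v1/chat/completions endpoint.
function buildChatRequest(model: string, userMessage: string) {
  return {
    model,
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: userMessage },
    ],
  };
}

// Sending it is a plain HTTP POST (baseUrl/apiKey are placeholders):
// fetch(`${baseUrl}/v1/chat/completions`, {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${apiKey}`,
//   },
//   body: JSON.stringify(buildChatRequest("example-model", "Hello!")),
// });
console.log(JSON.stringify(buildChatRequest("example-model", "Hello!")));
```

Gemini- and Anthropic-format providers differ only in endpoint path and field names, which is why a single client can abstract over all three.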

    🔍 Use Cases

    DeepChat is suitable for various AI application scenarios:

    • Daily Assistant: Answering questions, providing suggestions, assisting with writing and creation
    • Development Aid: Code generation, debugging, technical problem solving
    • Learning Tool: Concept explanation, knowledge exploration, learning guidance
    • Content Creation: Copywriting, creative inspiration, content optimization
    • Data Analysis: Data interpretation, chart generation, report writing

    📦 Quick Start

    Download and Install

    You can install DeepChat using one of the following methods:

    Option 1: GitHub Releases

    Download the latest version for your system from the GitHub Releases page:

    • Windows: .exe installation file
    • macOS: .dmg installation file
    • Linux: .AppImage or .deb installation file

    Option 2: Official Website

    Download from the official website.

    Option 3: Homebrew (macOS only)

    For macOS users, you can install DeepChat using Homebrew:

    bash
    brew install --cask deepchat

    Configure Models

    1. Launch the DeepChat application

    2. Click the settings icon

    3. Select the "Model Providers" tab

    4. Add your API keys or configure local Ollama

    Start Conversations

    1. Click the "+" button to create a new conversation

    2. Select the model you want to use

    3. Start communicating with your AI assistant

    For a comprehensive guide on getting started and using all features, please refer to the User Guide.

    💻 Development Guide

    Please read the Contribution Guidelines

Windows and Linux packages are built by GitHub Actions.

    For Mac-related signing and packaging, please refer to the Mac Release Guide.

    Install Dependencies

bash
$ pnpm install
$ pnpm run installRuntime
# If you hit "No module named 'distutils'", install setuptools:
$ pip install setuptools

• For Windows: to allow non-admin users to create symlinks and hard links, enable Developer Mode in Settings or use an administrator account; otherwise pnpm operations will fail.

    Start Development

    bash
    $ pnpm run dev

    Build

    bash
    # For Windows
    $ pnpm run build:win
    
    # For macOS
    $ pnpm run build:mac
    
    # For Linux
    $ pnpm run build:linux
    
# Build for a specific architecture
    $ pnpm run build:win:x64
    $ pnpm run build:win:arm64
    $ pnpm run build:mac:x64
    $ pnpm run build:mac:arm64
    $ pnpm run build:linux:x64
    $ pnpm run build:linux:arm64

    For a more detailed guide on development, project structure, and architecture, please see the Developer Guide.

    👥 Community & Contribution

    DeepChat is an active open-source community project, and we welcome various forms of contribution:

    • 🐛 Report issues
    • 💡 Submit feature suggestions
    • 🔧 Submit code improvements
    • 📚 Improve documentation
    • 🌍 Help with translation

    Check the Contribution Guidelines to learn more about ways to participate in the project.

    ⭐ Star History

    Star History Chart

    👨‍💻 Contributors

Thank you for considering contributing to DeepChat! The contribution guide can be found in the Contribution Guidelines.

    🙏🏻 Thanks

    This project is built with the help of these awesome libraries:

    • Vue
    • Electron
    • Electron-Vite
    • oxlint

    📃 License

Apache License 2.0. See the LICENSE file.

    Similar MCP

    Based on tags & features

    • MC

      Mcp Server Browserbase

      TypeScript·
      2.7k
    • MC

      Mcp Open Library

      TypeScript·
      42
    • AN

      Anilist Mcp

      TypeScript·
      57
    • MC

      Mcp Ipfs

      TypeScript·
      11

    Trending MCP

    Most active this week

    • PL

      Playwright Mcp

      TypeScript·
      22.1k
    • SE

      Serena

      Python·
      14.5k
    • MC

      Mcp Playwright

      TypeScript·
      4.9k
    • MC

      Mcp Server Cloudflare

      TypeScript·
      3.0k
    View All MCP Servers
