
    Rust Sdk

    The official Rust SDK for the Model Context Protocol. Trusted by 2500+ developers.

    2,510 stars · Rust · Updated Nov 4, 2025


    RMCP

    An official Rust Model Context Protocol SDK implementation with tokio async runtime.

    Migrating to 1.x? See the migration guide for breaking changes and upgrade instructions.

    This repository contains the following crates:

    • rmcp: The core crate providing the RMCP protocol implementation - see rmcp
    • rmcp-macros: A procedural macro crate for generating RMCP tool implementations - see rmcp-macros

    For the full MCP specification, see modelcontextprotocol.io.

    Table of Contents

    • Usage
    • Resources
    • Prompts
    • Sampling
    • Roots
    • Logging
    • Completions
    • Notifications
    • Subscriptions
    • Examples
    • OAuth Support
    • Related Resources
    • Related Projects
    • Development

    Usage

    Import the crate

    toml
    rmcp = { version = "0.16.0", features = ["server"] }
    # or dev channel
    rmcp = { git = "https://github.com/modelcontextprotocol/rust-sdk", branch = "main" }

    Third-party Dependencies

    Basic dependencies:

    • tokio
    • serde

    JSON Schema generation (draft 2020-12):

    • schemars

    Build a Client

    Start a client

    rust, ignore
    use rmcp::{ServiceExt, transport::{TokioChildProcess, ConfigureCommandExt}};
    use tokio::process::Command;
    
    #[tokio::main]
    async fn main() -> Result<(), Box<dyn std::error::Error>> {
        let client = ().serve(TokioChildProcess::new(Command::new("npx").configure(|cmd| {
            cmd.arg("-y").arg("@modelcontextprotocol/server-everything");
        }))?).await?;
        Ok(())
    }
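Once `serve` returns, the handle can drive requests against the server. A hedged sketch of what you might do next (method names follow the peer API used elsewhere in this README; treat it as illustrative, not exhaustive):

```rust
// Sketch: using the connected client from the example above.
// `client` is the running service handle returned by `.serve(...)`.

// Inspect what the server reported during initialization.
let server_info = client.peer_info();
println!("connected: {server_info:#?}");

// List every tool the server advertises (pagination handled automatically).
let tools = client.list_all_tools().await?;
for tool in tools {
    println!("tool: {}", tool.name);
}

// Shut the connection down when finished.
client.cancel().await?;
```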

    Build a Server

    Build a transport

    rust, ignore
    use tokio::io::{stdin, stdout};
    let transport = (stdin(), stdout());

    Build a service

    You can easily build a service by using [ServerHandler](crates/rmcp/src/handler/server.rs) or [ClientHandler](crates/rmcp/src/handler/client.rs).

    rust, ignore
    let service = common::counter::Counter::new();

    Start the server

    rust, ignore
    // this call will finish the initialization process
    let server = service.serve(transport).await?;

    Interact with the server

    Once the server is initialized, you can send requests or notifications:

    rust, ignore
    // request
    let roots = server.list_roots().await?;
    
    // or send notification
    server.notify_cancelled(...).await?;

    Waiting for service shutdown

    rust, ignore
    let quit_reason = server.waiting().await?;
    // or cancel it
    let quit_reason = server.cancel().await?;

    ---

    Resources

    Resources let servers expose data (files, database records, API responses) that clients can read. Each resource is identified by a URI and returns content as text or binary (base64-encoded) data. Resource templates allow servers to declare URI patterns with dynamic parameters.

    MCP Spec: Resources
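To make the template idea concrete: matching a declared pattern such as `file:///logs/{date}` against an incoming URI is essentially prefix/suffix matching around the parameter. A self-contained sketch (the helper is illustrative, not part of rmcp):

```rust
// Hypothetical helper: match a URI template with a single "{param}" section
// against a concrete URI, extracting the parameter name and value.
fn match_template(template: &str, uri: &str) -> Option<(String, String)> {
    // Locate the "{param}" section declared by the template.
    let start = template.find('{')?;
    let end = template.find('}')?;
    let prefix = &template[..start];
    let name = &template[start + 1..end];
    let suffix = &template[end + 1..];
    // The URI must share the literal text around the parameter.
    let value = uri.strip_prefix(prefix)?.strip_suffix(suffix)?;
    if value.is_empty() {
        return None;
    }
    Some((name.to_string(), value.to_string()))
}

fn main() {
    let m = match_template("file:///logs/{date}", "file:///logs/2025-01-01");
    assert_eq!(m, Some(("date".to_string(), "2025-01-01".to_string())));
    assert_eq!(match_template("file:///logs/{date}", "memo://insights"), None);
}
```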

    Server-side

    Implement list_resources(), read_resource(), and optionally list_resource_templates() on the ServerHandler trait. Enable the resources capability in get_info().

    rust
    use rmcp::{
        ErrorData as McpError, RoleServer, ServerHandler, ServiceExt,
        model::*,
        service::RequestContext,
        transport::stdio,
    };
    use serde_json::json;
    
    #[derive(Clone)]
    struct MyServer;
    
    impl ServerHandler for MyServer {
        fn get_info(&self) -> ServerInfo {
            ServerInfo {
                capabilities: ServerCapabilities::builder()
                    .enable_resources()
                    .build(),
                ..Default::default()
            }
        }
    
        async fn list_resources(
            &self,
            _request: Option<PaginatedRequestParams>,
            _context: RequestContext<RoleServer>,
        ) -> Result<ListResourcesResult, McpError> {
            Ok(ListResourcesResult {
                resources: vec![
                    RawResource::new("file:///config.json", "config").no_annotation(),
                    RawResource::new("memo://insights", "insights").no_annotation(),
                ],
                next_cursor: None,
                meta: None,
            })
        }
    
        async fn read_resource(
            &self,
            request: ReadResourceRequestParams,
            _context: RequestContext<RoleServer>,
        ) -> Result<ReadResourceResult, McpError> {
            match request.uri.as_str() {
                "file:///config.json" => Ok(ReadResourceResult {
                    contents: vec![ResourceContents::text(r#"{"key": "value"}"#, &request.uri)],
                }),
                "memo://insights" => Ok(ReadResourceResult {
                    contents: vec![ResourceContents::text("Analysis results...", &request.uri)],
                }),
                _ => Err(McpError::resource_not_found(
                    "resource_not_found",
                    Some(json!({ "uri": request.uri })),
                )),
            }
        }
    
        async fn list_resource_templates(
            &self,
            _request: Option<PaginatedRequestParams>,
            _context: RequestContext<RoleServer>,
        ) -> Result<ListResourceTemplatesResult, McpError> {
            Ok(ListResourceTemplatesResult {
                resource_templates: vec![],
                next_cursor: None,
                meta: None,
            })
        }
    }

    Client-side

    rust
    use rmcp::model::ReadResourceRequestParams;
    
    // List all resources (handles pagination automatically)
    let resources = client.list_all_resources().await?;
    
    // Read a specific resource by URI
    let result = client.read_resource(ReadResourceRequestParams {
        meta: None,
        uri: "file:///config.json".into(),
    }).await?;
    
    // List resource templates
    let templates = client.list_all_resource_templates().await?;

    Notifications

    Servers can notify clients when the resource list changes or when a specific resource is updated:

    rust
    // Notify that the resource list has changed (clients should re-fetch)
    context.peer.notify_resource_list_changed().await?;
    
    // Notify that a specific resource was updated
    context.peer.notify_resource_updated(ResourceUpdatedNotificationParam {
        uri: "file:///config.json".into(),
    }).await?;

    Clients handle these via ClientHandler:

    rust
    impl ClientHandler for MyClient {
        async fn on_resource_list_changed(
            &self,
            _context: NotificationContext<RoleClient>,
        ) {
            // Re-fetch the resource list
        }
    
        async fn on_resource_updated(
            &self,
            params: ResourceUpdatedNotificationParam,
            _context: NotificationContext<RoleClient>,
        ) {
            // Re-read the updated resource at params.uri
        }
    }

    Example: [examples/servers/src/common/counter.rs](examples/servers/src/common/counter.rs) (server), [examples/clients/src/everything_stdio.rs](examples/clients/src/everything_stdio.rs) (client)

    ---

    Prompts

    Prompts are reusable message templates that servers expose to clients. They accept typed arguments and return conversation messages. The #[prompt] macro handles argument validation and routing automatically.

    MCP Spec: Prompts

    Server-side

    Use the #[prompt_router], #[prompt], and #[prompt_handler] macros to define prompts declaratively. Arguments are defined as structs deriving JsonSchema.

    rust
    use rmcp::{
        ErrorData as McpError, RoleServer, ServerHandler, ServiceExt,
        handler::server::{router::prompt::PromptRouter, wrapper::Parameters},
        model::*,
        prompt, prompt_handler, prompt_router,
        schemars::JsonSchema,
        service::RequestContext,
        transport::stdio,
    };
    use serde::{Deserialize, Serialize};
    
    #[derive(Debug, Serialize, Deserialize, JsonSchema)]
    pub struct CodeReviewArgs {
        #[schemars(description = "Programming language of the code")]
        pub language: String,
        #[schemars(description = "Focus areas for the review")]
        pub focus_areas: Option<Vec<String>>,
    }
    
    #[derive(Clone)]
    pub struct MyServer {
        prompt_router: PromptRouter<MyServer>,
    }
    
    #[prompt_router]
    impl MyServer {
        fn new() -> Self {
            Self { prompt_router: Self::prompt_router() }
        }
    
        /// Simple prompt without parameters
        #[prompt(name = "greeting", description = "A simple greeting")]
        async fn greeting(&self) -> Vec<PromptMessage> {
            vec![PromptMessage::new_text(
                PromptMessageRole::User,
                "Hello! How can you help me today?",
            )]
        }
    
        /// Prompt with typed arguments
        #[prompt(name = "code_review", description = "Review code in a given language")]
        async fn code_review(
            &self,
            Parameters(args): Parameters<CodeReviewArgs>,
        ) -> Result<GetPromptResult, McpError> {
            let focus = args.focus_areas
                .unwrap_or_else(|| vec!["correctness".into()]);
    
            Ok(GetPromptResult {
                description: Some(format!("Code review for {}", args.language)),
                messages: vec![
                    PromptMessage::new_text(
                        PromptMessageRole::User,
                        format!("Review my {} code. Focus on: {}", args.language, focus.join(", ")),
                    ),
                ],
            })
        }
    }
    
    #[prompt_handler]
    impl ServerHandler for MyServer {
        fn get_info(&self) -> ServerInfo {
            ServerInfo {
                capabilities: ServerCapabilities::builder().enable_prompts().build(),
                ..Default::default()
            }
        }
    }

    Prompt functions support several return types:

    • Vec<PromptMessage> -- simple message list
    • GetPromptResult -- messages with an optional description
    • Result<T, McpError> -- either of the above, with error handling

    Client-side

    rust
    use rmcp::model::GetPromptRequestParams;
    
    // List all prompts
    let prompts = client.list_all_prompts().await?;
    
    // Get a prompt with arguments
    let result = client.get_prompt(GetPromptRequestParams {
        meta: None,
        name: "code_review".into(),
        arguments: Some(rmcp::object!({
            "language": "Rust",
            "focus_areas": ["performance", "safety"]
        })),
    }).await?;

    Notifications

    rust
    // Server: notify that available prompts have changed
    context.peer.notify_prompt_list_changed().await?;

    Example: [examples/servers/src/prompt_stdio.rs](examples/servers/src/prompt_stdio.rs) (server), [examples/clients/src/everything_stdio.rs](examples/clients/src/everything_stdio.rs) (client)

    ---

    Sampling

    Sampling flips the usual direction: the server asks the client to run an LLM completion. The server sends a create_message request, the client processes it through its LLM, and returns the result.

    MCP Spec: Sampling

    Server-side (requesting sampling)

    Access the client's sampling capability through context.peer.create_message():

    rust
    use rmcp::model::*;
    
    // Inside a ServerHandler method (e.g., call_tool):
    let response = context.peer.create_message(CreateMessageRequestParams {
        meta: None,
        task: None,
        messages: vec![SamplingMessage::user_text("Explain this error: connection refused")],
        model_preferences: Some(ModelPreferences {
            hints: Some(vec![ModelHint { name: Some("claude".into()) }]),
            cost_priority: Some(0.3),
            speed_priority: Some(0.8),
            intelligence_priority: Some(0.7),
        }),
        system_prompt: Some("You are a helpful assistant.".into()),
        include_context: Some(ContextInclusion::None),
        temperature: Some(0.7),
        max_tokens: 150,
        stop_sequences: None,
        metadata: None,
        tools: None,
        tool_choice: None,
    }).await?;
    
    // Extract the response text
    let text = response.message.content
        .first()
        .and_then(|c| c.as_text())
        .map(|t| &t.text);

    Client-side (handling sampling)

    On the client side, implement ClientHandler::create_message(). This is where you'd call your actual LLM:

    rust
    use rmcp::{ClientHandler, model::*, service::{RequestContext, RoleClient}};
    
    #[derive(Clone, Default)]
    struct MyClient;
    
    impl ClientHandler for MyClient {
        async fn create_message(
            &self,
            params: CreateMessageRequestParams,
            _context: RequestContext<RoleClient>,
        ) -> Result<CreateMessageResult, McpError> {
            // Forward to your LLM, or return a mock response:
            let response_text = call_your_llm(&params.messages).await;
    
            Ok(CreateMessageResult {
                message: SamplingMessage::assistant_text(response_text),
                model: "my-model".into(),
                stop_reason: Some(CreateMessageResult::STOP_REASON_END_TURN.into()),
            })
        }
    }

    Example: [examples/servers/src/sampling_stdio.rs](examples/servers/src/sampling_stdio.rs) (server), [examples/clients/src/sampling_stdio.rs](examples/clients/src/sampling_stdio.rs) (client)

    ---

    Roots

    Roots tell servers which directories or projects the client is working in. A root is a URI (typically file://) pointing to a workspace or repository. Servers can query roots to know where to look for files and how to scope their work.

    MCP Spec: Roots

    Server-side

    Ask the client for its root list, and handle change notifications:

    rust
    use rmcp::{ServerHandler, model::*, service::{NotificationContext, RequestContext, RoleServer}};
    
    impl ServerHandler for MyServer {
        // Query the client for its roots
        async fn call_tool(
            &self,
            request: CallToolRequestParams,
            context: RequestContext<RoleServer>,
        ) -> Result<CallToolResult, McpError> {
            let roots = context.peer.list_roots().await?;
            // Use roots.roots to understand workspace boundaries
            // ...
        }
    
        // Called when the client's root list changes
        async fn on_roots_list_changed(
            &self,
            _context: NotificationContext<RoleServer>,
        ) {
            // Re-fetch roots to stay current
        }
    }

    Client-side

    Clients declare roots capability and implement list_roots():

    rust
    use rmcp::{ClientHandler, model::*, service::{RequestContext, RoleClient}};
    
    impl ClientHandler for MyClient {
        async fn list_roots(
            &self,
            _context: RequestContext<RoleClient>,
        ) -> Result<ListRootsResult, McpError> {
            Ok(ListRootsResult {
                roots: vec![
                    Root {
                        uri: "file:///home/user/project".into(),
                        name: Some("My Project".into()),
                    },
                ],
            })
        }
    }

    Clients notify the server when roots change:

    rust
    // After adding or removing a workspace root:
    client.notify_roots_list_changed().await?;

    ---

    Logging

    Servers can send structured log messages to clients. The client sets a minimum severity level, and the server sends messages through the peer notification interface.

    MCP Spec: Logging

    Server-side

    Enable the logging capability, handle level changes from the client, and send log messages via the peer:

    rust
    use rmcp::{ServerHandler, model::*, service::{RequestContext, RoleServer}};
    
    impl ServerHandler for MyServer {
        fn get_info(&self) -> ServerInfo {
            ServerInfo {
                capabilities: ServerCapabilities::builder()
                    .enable_logging()
                    .build(),
                ..Default::default()
            }
        }
    
        // Client sets the minimum log level
        async fn set_level(
            &self,
            request: SetLevelRequestParams,
            _context: RequestContext<RoleServer>,
        ) -> Result<(), McpError> {
            // Store request.level and filter future log messages accordingly
            Ok(())
        }
    }
    
    // Send a log message from any handler with access to the peer:
    context.peer.notify_logging_message(LoggingMessageNotificationParam {
        level: LoggingLevel::Info,
        logger: Some("my-server".into()),
        data: serde_json::json!({
            "message": "Processing completed",
            "items_processed": 42
        }),
    }).await?;

    Available log levels (from least to most severe): Debug, Info, Notice, Warning, Error, Critical, Alert, Emergency.
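The minimum-level filter that set_level implies is just an ordering comparison. A self-contained sketch of the idea, using a local enum that mirrors the spec's severity order (a real server would store rmcp's LoggingLevel instead):

```rust
// Local stand-in for the MCP severity scale, declared least to most severe
// so the derived ordering matches the spec. rmcp's model::LoggingLevel
// plays this role in a real server.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Level {
    Debug, Info, Notice, Warning, Error, Critical, Alert, Emergency,
}

struct LogFilter {
    minimum: Level, // last value received via set_level
}

impl LogFilter {
    // Only messages at or above the client-requested minimum are sent.
    fn should_send(&self, level: Level) -> bool {
        level >= self.minimum
    }
}

fn main() {
    let filter = LogFilter { minimum: Level::Warning };
    assert!(!filter.should_send(Level::Info));
    assert!(filter.should_send(Level::Warning));
    assert!(filter.should_send(Level::Emergency));
}
```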

    Client-side

    Clients handle incoming log messages via ClientHandler:

    rust
    impl ClientHandler for MyClient {
        async fn on_logging_message(
            &self,
            params: LoggingMessageNotificationParam,
            _context: NotificationContext<RoleClient>,
        ) {
            println!("[{}] {}: {}", params.level,
                params.logger.unwrap_or_default(), params.data);
        }
    }

    Clients can also set the server's log level:

    rust
    client.set_level(SetLevelRequestParams {
        level: LoggingLevel::Warning,
        meta: None,
    }).await?;

    ---

    Completions

    Completions give auto-completion suggestions for prompt or resource template arguments. As a user fills in arguments, the client can ask the server for suggestions based on what's already been entered.

    MCP Spec: Completions

    Server-side

    Enable the completions capability and implement the complete() handler. Use request.context to inspect previously filled arguments:

    rust
    use rmcp::{ErrorData as McpError, ServerHandler, model::*, service::RequestContext, RoleServer};
    
    impl ServerHandler for MyServer {
        fn get_info(&self) -> ServerInfo {
            ServerInfo {
                capabilities: ServerCapabilities::builder()
                    .enable_completions()
                    .enable_prompts()
                    .build(),
                ..Default::default()
            }
        }
    
        async fn complete(
            &self,
            request: CompleteRequestParams,
            _context: RequestContext<RoleServer>,
        ) -> Result<CompleteResult, McpError> {
            let values = match &request.r#ref {
                Reference::Prompt(prompt_ref) if prompt_ref.name == "sql_query" => {
                    match request.argument.name.as_str() {
                        "operation" => vec!["SELECT", "INSERT", "UPDATE", "DELETE"],
                        "table" => vec!["users", "orders", "products"],
                        "columns" => {
                            // Adapt suggestions based on previously filled arguments
                            if let Some(ctx) = &request.context {
                                if let Some(op) = ctx.get_argument("operation") {
                                    match op.to_uppercase().as_str() {
                                        "SELECT" | "UPDATE" => {
                                            vec!["id", "name", "email", "created_at"]
                                        }
                                        _ => vec![],
                                    }
                                } else { vec![] }
                            } else { vec![] }
                        }
                        _ => vec![],
                    }
                }
                _ => vec![],
            };
    
            // Filter by the user's partial input
            let filtered: Vec<String> = values.into_iter()
                .map(String::from)
                .filter(|v| v.to_lowercase().contains(&request.argument.value.to_lowercase()))
                .collect();
    
            Ok(CompleteResult {
                completion: CompletionInfo {
                    values: filtered,
                    total: None,
                    has_more: Some(false),
                },
            })
        }
    }

    Client-side

    rust
    use rmcp::model::*;
    
    let result = client.complete(CompleteRequestParams {
        meta: None,
        r#ref: Reference::Prompt(PromptReference {
            name: "sql_query".into(),
        }),
        argument: ArgumentInfo {
            name: "operation".into(),
            value: "SEL".into(),
        },
        context: None,
    }).await?;
    
    // result.completion.values contains suggestions like ["SELECT"]

    Example: [examples/servers/src/completion_stdio.rs](examples/servers/src/completion_stdio.rs)

    ---

    Notifications

    Notifications are fire-and-forget messages -- no response is expected. They cover progress updates, cancellation, and lifecycle events. Both sides can send and receive them.

    MCP Spec: Notifications

    Progress notifications

    Servers can report progress during long-running operations:

    rust
    use rmcp::model::*;
    
    // Inside a tool handler:
    for i in 0..total_items {
        process_item(i).await;
    
        context.peer.notify_progress(ProgressNotificationParam {
            progress_token: ProgressToken(NumberOrString::Number(i as i64)),
            progress: i as f64,
            total: Some(total_items as f64),
            message: Some(format!("Processing item {}/{}", i + 1, total_items)),
        }).await?;
    }

    Cancellation

    Either side can cancel an in-progress request:

    rust
    // Send a cancellation
    context.peer.notify_cancelled(CancelledNotificationParam {
        request_id: the_request_id,
        reason: Some("User requested cancellation".into()),
    }).await?;

    Handle cancellation in ServerHandler or ClientHandler:

    rust
    impl ServerHandler for MyServer {
        async fn on_cancelled(
            &self,
            params: CancelledNotificationParam,
            _context: NotificationContext<RoleServer>,
        ) {
            // Abort work for params.request_id
        }
    }

    Initialized notification

    Clients send initialized after the handshake completes:

    rust
    // Sent automatically by rmcp during the serve() handshake.
    // Servers handle it via:
    impl ServerHandler for MyServer {
        async fn on_initialized(
            &self,
            _context: NotificationContext<RoleServer>,
        ) {
            // Server is ready to receive requests
        }
    }

    List-changed notifications

    When available tools, prompts, or resources change, tell the client:

    rust
    context.peer.notify_tool_list_changed().await?;
    context.peer.notify_prompt_list_changed().await?;
    context.peer.notify_resource_list_changed().await?;

    Example: [examples/servers/src/common/progress_demo.rs](examples/servers/src/common/progress_demo.rs)

    ---

    Subscriptions

    Clients can subscribe to specific resources. When a subscribed resource changes, the server sends a notification and the client can re-read it.

    MCP Spec: Resources - Subscriptions

    Server-side

    Enable subscriptions in the resources capability and implement the subscribe() / unsubscribe() handlers:

    rust
    use rmcp::{ErrorData as McpError, ServerHandler, model::*, service::RequestContext, RoleServer};
    use std::sync::Arc;
    use tokio::sync::Mutex;
    use std::collections::HashSet;
    
    #[derive(Clone)]
    struct MyServer {
        subscriptions: Arc<Mutex<HashSet<String>>>,
    }
    
    impl ServerHandler for MyServer {
        fn get_info(&self) -> ServerInfo {
            ServerInfo {
                capabilities: ServerCapabilities::builder()
                    .enable_resources()
                    .enable_resources_subscribe()
                    .build(),
                ..Default::default()
            }
        }
    
        async fn subscribe(
            &self,
            request: SubscribeRequestParams,
            _context: RequestContext<RoleServer>,
        ) -> Result<(), McpError> {
            self.subscriptions.lock().await.insert(request.uri);
            Ok(())
        }
    
        async fn unsubscribe(
            &self,
            request: UnsubscribeRequestParams,
            _context: RequestContext<RoleServer>,
        ) -> Result<(), McpError> {
            self.subscriptions.lock().await.remove(&request.uri);
            Ok(())
        }
    }

    When a subscribed resource changes, notify the client:

    rust
    // Check if the resource has subscribers, then notify
    context.peer.notify_resource_updated(ResourceUpdatedNotificationParam {
        uri: "file:///config.json".into(),
    }).await?;
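The bookkeeping behind that notify call reduces to a set lookup: subscribe/unsubscribe mutate the set, and a change triggers a notification only if the URI is present. A self-contained sketch of that logic (a real server would call `context.peer.notify_resource_updated` for each hit, and would wrap the set in the async Mutex shown above):

```rust
use std::collections::HashSet;

// Minimal subscription registry; mirrors the HashSet used in the
// server example above, without the async Mutex for brevity.
struct Subscriptions {
    uris: HashSet<String>,
}

impl Subscriptions {
    fn new() -> Self {
        Self { uris: HashSet::new() }
    }
    fn subscribe(&mut self, uri: &str) {
        self.uris.insert(uri.to_string());
    }
    fn unsubscribe(&mut self, uri: &str) {
        self.uris.remove(uri);
    }
    // True when a change to `uri` should produce a resource-updated notification.
    fn should_notify(&self, uri: &str) -> bool {
        self.uris.contains(uri)
    }
}

fn main() {
    let mut subs = Subscriptions::new();
    subs.subscribe("file:///config.json");
    assert!(subs.should_notify("file:///config.json"));
    assert!(!subs.should_notify("memo://insights"));
    subs.unsubscribe("file:///config.json");
    assert!(!subs.should_notify("file:///config.json"));
}
```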

    Client-side

    rust
    use rmcp::model::*;
    
    // Subscribe to updates for a resource
    client.subscribe(SubscribeRequestParams {
        meta: None,
        uri: "file:///config.json".into(),
    }).await?;
    
    // Unsubscribe when no longer needed
    client.unsubscribe(UnsubscribeRequestParams {
        meta: None,
        uri: "file:///config.json".into(),
    }).await?;

    Handle update notifications in ClientHandler:

    rust
    impl ClientHandler for MyClient {
        async fn on_resource_updated(
            &self,
            params: ResourceUpdatedNotificationParam,
            _context: NotificationContext<RoleClient>,
        ) {
            // Re-read the resource at params.uri
        }
    }

    ---

    Examples

    See examples.

    OAuth Support

    See Oauth_support for details.

    Related Resources

    • MCP Specification
    • Schema

    Related Projects

    Extending rmcp

    • rmcp-actix-web - An actix_web backend for rmcp
    • rmcp-openapi - Transform OpenAPI definition endpoints into MCP tools

    Built with rmcp

    • goose - An open-source, extensible AI agent that goes beyond code suggestions
    • apollo-mcp-server - MCP server that connects AI agents to GraphQL APIs via Apollo GraphOS
    • rustfs-mcp - High-performance MCP server providing S3-compatible object storage operations for AI/LLM integration
    • containerd-mcp-server - A containerd-based MCP server implementation
    • rmcp-openapi-server - High-performance MCP server that exposes OpenAPI definition endpoints as MCP tools
    • nvim-mcp - An MCP server to interact with Neovim
    • terminator - AI-powered desktop automation MCP server with cross-platform support and >95% success rate
    • stakpak-agent - Security-hardened terminal agent for DevOps with MCP over mTLS, streaming, secret tokenization, and async task management
    • video-transcriber-mcp-rs - High-performance MCP server for transcribing videos from 1000+ platforms using whisper.cpp
    • NexusCore MCP - Advanced malware analysis & dynamic instrumentation MCP server with Frida integration and stealth unpacking capabilities
    • spreadsheet-mcp - Token-efficient MCP server for spreadsheet analysis with automatic region detection, recalculation, screenshot, and editing support for LLM agents
    • hyper-mcp - A fast, secure MCP server that extends its capabilities through WebAssembly (WASM) plugins
    • rudof-mcp - RDF validation and data processing MCP server with ShEx/SHACL validation, SPARQL queries, and format conversion. Supports stdio and streamable HTTP transports with full MCP capabilities (tools, prompts, resources, logging, completions, tasks)
    • McpMux - Desktop app to configure MCP servers once at McpMux, connect every AI client (Cursor, Claude Desktop, VS Code, Windsurf) through a single encrypted local gateway with Spaces for project organization, FeatureSets to switch toolsets per client, and a built-in server registry

    Development

    Tips for Contributors

    See docs/CONTRIBUTE.MD for tips on contributing.

    Using Dev Container

    See docs/DEVCONTAINER.md for instructions on developing inside a Dev Container.
