
    Mcp Wolframalpha

    A Python-powered Model Context Protocol (MCP) server and client that uses the Wolfram Alpha API.

    49 stars · Python · Updated Oct 29, 2025

    Table of Contents

    • Features
    • Installation
    • Clone the Repo
    • Set Up Environment Variables
    • Install Requirements
    • Configuration
    • Client Usage Example
    • Run with Gradio UI
    • Docker
    • UI
    • Run as CLI Tool
    • Docker
    • Contact



    MCP Wolfram Alpha (Server + Client)

    Seamlessly integrate Wolfram Alpha into your chat applications.

    This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.

    Included is an MCP-Client example utilizing Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interactions with Wolfram Alpha’s knowledge engine.


    ---

    Features

    • Wolfram|Alpha integration for math, science, and data queries.
    • Modular architecture: easily extendable to support additional APIs and functionality.
    • Multi-client support: seamlessly handles interactions from multiple clients or interfaces.
    • MCP client example using Gemini (via LangChain).
    • UI support: a user-friendly Gradio web interface for interacting with Google AI (Gemini) and the Wolfram Alpha MCP server.
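    To illustrate the kind of request such a server issues under the hood, here is a minimal sketch of building a Wolfram|Alpha Short Answers query URL. The helper name `build_query_url` is hypothetical and not part of this project's code; the actual endpoint the server uses may differ.

```python
from urllib.parse import urlencode

# Wolfram|Alpha Short Answers endpoint (returns a plain-text answer).
WOLFRAM_SHORT_ANSWERS = "https://api.wolframalpha.com/v1/result"

def build_query_url(appid: str, question: str) -> str:
    """Build a GET URL for a plain-text Wolfram|Alpha answer (hypothetical helper)."""
    # urlencode percent-escapes the question, e.g. "x^2" -> "x%5E2".
    return f"{WOLFRAM_SHORT_ANSWERS}?{urlencode({'appid': appid, 'i': question})}"

url = build_query_url("DEMO-APPID", "integrate x^2")
```

    Fetching that URL with the app ID from `WOLFRAM_API_KEY` returns a short textual answer, which the server can relay back to the chat client.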

    ---

    Installation

    Clone the Repo

    bash
    git clone https://github.com/ricocf/mcp-wolframalpha.git
    cd mcp-wolframalpha

    Set Up Environment Variables

    Create a .env file based on the example:

    • WOLFRAM_API_KEY=your_wolframalpha_appid
    • GeminiAPI=your_google_gemini_api_key (optional; only needed for the client methods below)
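    A minimal .env might look like the following (placeholder values; the variable names are the ones listed above):

```
# .env — replace the placeholders with your own keys
WOLFRAM_API_KEY=XXXXXX-XXXXXXXXXX
GeminiAPI=your_google_gemini_api_key
```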

    Install Requirements

    With pip:

    bash
    pip install -r requirements.txt

    Or install the dependencies with uv (ensure [uv](https://github.com/astral-sh/uv) is installed):

    bash
    uv sync

    Configuration

    To use with the VSCode MCP Server:

    1. Create a configuration file at .vscode/mcp.json in your project root.

    2. Use the example provided in configs/vscode_mcp.json as a template.

    3. For more details, refer to the VSCode MCP Server Guide.
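    Based on that template, a .vscode/mcp.json might look like the following. This is a sketch only: configs/vscode_mcp.json in the repo is authoritative, and the server path is a placeholder you must adjust.

```
{
  "servers": {
    "WolframAlphaServer": {
      "type": "stdio",
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```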

    To use with Claude Desktop:

    json
    {
      "mcpServers": {
        "WolframAlphaServer": {
          "command": "python3",
          "args": [
            "/path/to/src/core/server.py"
          ]
        }
      }
    }

    Client Usage Example

    This project includes an LLM client that communicates with the MCP server.

    Run with Gradio UI

    • Required: GeminiAPI
    • Provides a local web interface for interacting with Google AI (Gemini) and Wolfram Alpha.
    • To launch the web UI from the command line:
    bash
    python main.py --ui

    Docker

    To build and run the client inside a Docker container:

    bash
    docker build -t wolframalphaui -f .devops/ui.Dockerfile .
    docker run wolframalphaui

    UI

    • Intuitive Gradio interface for interacting with both Google AI (Gemini) and the Wolfram Alpha MCP server.
    • Lets users switch between Wolfram Alpha and Google AI (Gemini), and browse query history.

    Run as CLI Tool

    • Required: GeminiAPI
    • To run the client directly from the command line:
    bash
    python main.py

    Docker

    To build and run the client inside a Docker container:

    bash
    docker build -t wolframalpha -f .devops/llm.Dockerfile .
    docker run -it wolframalpha

    Contact

    Feel free to send feedback. The e-mail address is revealed by running this in a shell:

    sh
    printf "\x61\x6b\x61\x6c\x61\x72\x69\x63\x31\x40\x6f\x75\x74\x6c\x6f\x6f\x6b\x2e\x63\x6f\x6d\x0a"

    Similar MCP

    Based on tags & features

    • Aseprite Mcp (Python, 92 stars)
    • Isaac Sim Mcp (Python, 83 stars)
    • Mayamcp (Python, 27 stars)
    • Biothings Mcp (Python, 25 stars)

    Trending MCP

    Most active this week

    • Playwright Mcp (TypeScript, 22.1k stars)
    • Serena (Python, 14.5k stars)
    • Mcp Playwright (TypeScript, 4.9k stars)
    • Mcp Server Cloudflare (TypeScript, 3.0k stars)
