Track MCP

The world's largest repository of Model Context Protocol servers. Discover, explore, and submit MCP tools.


© 2026 TrackMCP. All rights reserved.

Built with ❤️ by Krishna Goyal

    Gpt2099.nu

    a Nushell cross.stream extension to interact with LLMs and MCP servers

    17 stars
    Nushell
    Updated Nov 1, 2025
    anthropic
    code-generation
    cross-stream
    gemini
    llm
    model-context-protocol
    nushell

    Table of Contents

    • Features
    • Getting started
    • Step 1.
    • Step 2.
    • Step 3.
    • Step 4.
    • Step 5.
    • Step 6.
    • Documentation
    • Reference Documentation
    • FAQ
    • Original intro


Documentation · gpt2099 Discord

A Nushell scriptable MCP client with editable context threads stored in cross.stream.

Features

• Consistent API Across Models: Connect to Anthropic, Cerebras, Cohere, Gemini, and OpenAI through a single, simple interface. (Add providers easily.)
• Persistent, Editable Conversations: Conversation threads are saved across sessions. Review, edit, and control your own context window — no black-box history.
• Flexible Tool Integration: Connect to MCP servers to extend functionality. gpt2099 already rivals Claude Code for local file editing, but with full provider independence and deeper flexibility.
• Document Support: Upload and reference documents (PDFs, images, text files) directly in conversations with automatic content-type detection and optional caching.

Built on cross.stream for event-driven processing, gpt2099 brings modern AI directly into your Nushell workflow — fully scriptable, fully inspectable, all in the terminal.

    https://github.com/user-attachments/assets/1254aaa1-2ca2-46b5-96e8-b5e466c735bd

    "lady on the track" provided by mobygratis

    Getting started

Step 1.

First, install and configure [cross.stream](https://github.com/cablehead/xs). Once set up, you'll have the full cross.stream ecosystem of tools for editing and working with your context windows.

• https://cablehead.github.io/xs/getting-started/installation/

After this step you should be able to run:

```nushell
"as easy as" | .append abc123
.last abc123 | .cas
```
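For reference, a commented sketch of what that round-trip does (my reading of the xs commands, offered as an assumption rather than authoritative documentation):

```nushell
# Append the piped string as a new frame on the topic "abc123".
"as easy as" | .append abc123
# Fetch the most recent frame on "abc123" and resolve its content hash
# through the content-addressable store, recovering the original string.
.last abc123 | .cas
```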

Step 2.

It really is easy from here.

```nushell
overlay use -pr ./gpt
```

Step 3.

Initialize the cross.stream command that performs the actual LLM call. This appends the command to your event stream so later gpt invocations can use it:

```nushell
gpt init
```

Step 4.

Enable your preferred provider. This stores the API key for later use:

```nushell
gpt provider enable
```

Step 5.

Set up a milli alias for a lightweight model (try OpenAI's gpt-4.1-mini or Anthropic's claude-3-5-haiku-20241022):

```nushell
gpt provider ptr milli --set
```

Step 6.

Give it a spin:

```nushell
"hola" | gpt -p milli
```

Documentation

• **Commands Reference** - Complete command syntax and options
• **How-To Guides** - Task-oriented workflows:
  • Configure Providers - Set up AI providers and model aliases
  • Work with Documents - Register and use documents in conversations
  • Manage Conversations - Threading, bookmarking, and continuation
  • Use MCP Servers - Extend functionality with external tools
  • Generate Code Context - Create structured context from Git repositories

Reference Documentation

• **Provider API** - Technical specification for implementing providers
• **Schemas** - Complete data structure reference for all gpt2099 schemas

    FAQ

    • Why does the name include 2099? What else would you call the future?

    Original intro

    This is how the project looked, 4 hours into its inception:

    https://github.com/user-attachments/assets/768cc655-a892-47cc-bf64-8b5f61c41f35

    Similar MCP

    Based on tags & features

• Fal Mcp Server · Python · 8
• Anilist Mcp · TypeScript · 57
• Biomcp · Python · 327
• Metmuseum Mcp · TypeScript · 14

    Trending MCP

    Most active this week

• Playwright Mcp · TypeScript · 22.1k
• Serena · Python · 14.5k
• Mcp Playwright · TypeScript · 4.9k
• Mcp Server Cloudflare · TypeScript · 3.0k
