# gpt2099

a Nushell cross.stream extension to interact with LLMs and MCP servers

Fully scriptable from Nushell, with conversations stored in cross.stream.
## Features
- **Consistent API Across Models**: Connect to Anthropic, Cerebras, Cohere, Gemini, and OpenAI through a single, simple interface. (Add providers easily.)
- **Persistent, Editable Conversations**: Conversations are saved across sessions. Review, edit, and control your own context window, with no black-box history.
- **Flexible Tool Integration**: Connect to MCP servers to extend functionality. gpt2099 already rivals Claude Code for local file editing, but with full provider independence and deeper flexibility.
- **Document Support**: Upload and reference documents (PDFs, images, text files) directly in conversations, with automatic content-type detection and optional caching.
Built on cross.stream for event-driven processing, gpt2099
brings modern AI directly into your Nushell workflow — fully scriptable, fully inspectable, all in
the terminal.
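"Fully inspectable" follows directly from the storage model: each conversation turn is an ordinary event in the cross.stream store, so the same primitives used in the Getting started section below (`.append`, `.last`, `.cas`) work on it. A minimal sketch, assuming a configured cross.stream store; the topic name `demo` is illustrative, not a topic gpt2099 itself uses:

```nushell
# Append two events to an illustrative topic.
"first draft" | .append demo
"second draft" | .append demo

# .last returns the most recent frame for the topic;
# .cas resolves its content from the content-addressable store.
.last demo | .cas   # => "second draft"
```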
https://github.com/user-attachments/assets/1254aaa1-2ca2-46b5-96e8-b5e466c735bd
"lady on the track" provided by mobygratis
## Getting started

### Step 1.
First, install and configure [cross.stream](https://github.com/cablehead/xs). Once set up, you'll
have the full [cross.stream](https://github.com/cablehead/xs) ecosystem of tools for editing and
working with your context windows.
- https://cablehead.github.io/xs/getting-started/installation/
After this step you should be able to run:
```nushell
"as easy as" | .append abc123
.last abc123 | .cas
```

### Step 2.
It really is easy from here.
```nushell
overlay use -pr ./gpt
```

### Step 3.
Initialize the cross.stream command that performs the actual LLM call. This appends the command to
your event stream so later gpt invocations can use it:
```nushell
gpt init
```

### Step 4.
Enable your preferred provider. This stores the API key for later use:
```nushell
gpt provider enable
```

### Step 5.
Set up a `milli` alias for a lightweight model (try OpenAI's gpt-4.1-mini or Anthropic's claude-3-5-haiku-20241022):
```nushell
gpt provider ptr milli --set
```

### Step 6.
Give it a spin:
```nushell
"hola" | gpt -p milli
```

## Documentation
- **Commands Reference** - Complete command syntax and options
- **How-To Guides** - Task-oriented workflows:
  - Configure Providers - Set up AI providers and model aliases
  - Work with Documents - Register and use documents in conversations
  - Manage Conversations - Threading, bookmarking, and continuation
  - Use MCP Servers - Extend functionality with external tools
  - Generate Code Context - Create structured context from Git repositories
### Reference Documentation

- **Provider API** - Technical specification for implementing providers
- **Schemas** - Complete data structure reference for all gpt2099 schemas
## FAQ
- Why does the name include 2099? What else would you call the future?
## Original intro
This is how the project looked, 4 hours into its inception:
https://github.com/user-attachments/assets/768cc655-a892-47cc-bf64-8b5f61c41f35