
Crush: Why Charm Built a Terminal AI Assistant Around LSP Instead of Simple File Context


Hook

Most AI coding assistants feed your LLM raw file contents and hope for the best. Crush wires in the same Language Server Protocol that powers VS Code’s IntelliSense—giving models semantic understanding instead of glorified grep results.

Context

The explosion of AI coding assistants in 2023-2024 created a paradox: while LLMs got smarter, their integration into development workflows remained surprisingly primitive. Many tools operate with limited awareness of actual codebase structure, seeing files as text blobs rather than interconnected systems with type hierarchies and semantic relationships.

Charm’s Crush takes a different approach. Rather than building yet another AI wrapper that processes source files as plain text, it positions itself as an orchestrator between your existing development infrastructure and AI providers. It’s built for developers who already live in the terminal and use Language Server Protocol-powered editors, offering AI that appears to understand code through the same infrastructure their tools use. With 21,803 stars, Crush represents a bet that terminal-native workflows aren’t dead; they just needed better AI integration.

Technical Insight

System architecture (auto-generated diagram): a Terminal User drives the Charm TUI Layer, which hands input to a Session Manager. The Session Manager routes code queries through an LSP Client (to language servers such as gopls and nil) and tool calls through an MCP Server Interface, folds the semantic info and extension results into an enriched prompt, and sends it via a Multi-Provider LLM Router to backends such as the OpenAI and Anthropic APIs. AI responses flow back through the session layer and render in the TUI.

Crush’s architecture stands on three pillars: LSP integration for code intelligence, MCP (Model Context Protocol) for extensibility, and session-based context management.

LSP Integration: Semantic Context Over String Matching

Crush integrates with Language Server Protocol servers—the same infrastructure that powers IDE features like “Go to Definition” and “Find References.” The README shows LSP configuration in the Nix module example:

lsp = {
  go = { command = "gopls"; enabled = true; };
  nix = { command = "nil"; enabled = true; };
};

Each language gets its own LSP server configuration. For Go, that’s gopls. For Nix, it’s nil. This architecture means Crush can potentially leverage the same semantic understanding your editor uses—type information, call hierarchies, and symbol relationships rather than just text matching.

This architectural choice has an important implication: Crush’s code understanding quality likely scales with the LSP ecosystem’s maturity. Languages with robust LSP servers (Go, TypeScript, Rust, Python) should provide richer context. This puts Crush on the same improvement path as the broader editor ecosystem.
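To make the mechanism concrete, here is a minimal sketch, not Crush’s actual code, of the JSON-RPC message an LSP client sends to resolve a symbol’s definition. The file URI and position are hypothetical; only the message shape follows the LSP specification.

```python
import json

def definition_request(request_id: int, file_uri: str, line: int, character: int) -> str:
    """Build a textDocument/definition request per the LSP spec.

    A client like Crush can send this over stdio to a server such as gopls,
    then feed the resolved definition's source back into the model's prompt.
    """
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "textDocument/definition",
        "params": {
            "textDocument": {"uri": file_uri},
            # LSP positions are zero-based for both line and character.
            "position": {"line": line, "character": character},
        },
    }
    body = json.dumps(msg)
    # LSP frames each message with a Content-Length header.
    return f"Content-Length: {len(body)}\r\n\r\n{body}"

frame = definition_request(1, "file:///app/main.go", 41, 8)
```

The payoff is that the answer comes back as a precise file location resolved by the type checker, rather than whatever a text search happens to match.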

Multi-Provider Architecture

Crush supports multiple LLM providers through OpenAI and Anthropic-compatible APIs. The README states it allows mid-session model switching while preserving context. The Nix configuration example shows provider setup:

providers = {
  openai = {
    id = "openai";
    name = "OpenAI";
    base_url = "https://api.openai.com/v1";
    type = "openai";
    api_key = "sk-fake123456789abcdef...";
    models = [
      {
        id = "gpt-4";
        name = "GPT-4";
      }
    ];
  };
};

This reveals Crush’s provider abstraction layer. Each provider gets a normalized interface (base_url, type, api_key, models) regardless of whether it’s OpenAI, Anthropic, Groq, or OpenRouter. The README lists environment variables for numerous providers including Anthropic, OpenAI, Groq, OpenRouter, Gemini, Azure OpenAI, AWS Bedrock, and others.
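A normalized schema like this makes routing a lookup problem. The sketch below is an illustration of the idea, not Crush’s internals: provider records share one shape, and the API dialect (`type`) decides which endpoint to call. All names and paths here are assumptions apart from the public OpenAI and Anthropic URLs.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    """Normalized provider record, mirroring the shape of the documented config."""
    id: str
    base_url: str
    type: str     # API dialect: "openai" or "anthropic"
    api_key: str

PROVIDERS = {
    "openai": Provider("openai", "https://api.openai.com/v1", "openai", "sk-..."),
    "anthropic": Provider("anthropic", "https://api.anthropic.com", "anthropic", "sk-ant-..."),
}

def chat_endpoint(provider_id: str) -> str:
    """Resolve the chat URL from the provider's API dialect, not its vendor."""
    p = PROVIDERS[provider_id]
    path = "/chat/completions" if p.type == "openai" else "/v1/messages"
    return p.base_url.rstrip("/") + path
```

Because dispatch keys on the dialect rather than the vendor, adding a Groq or OpenRouter entry is just another row in the table.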

MCP: Extension Protocol Support

The README states Crush supports the Model Context Protocol (MCP) with three transport types: http, stdio, and sse (Server-Sent Events). MCP is described as a way to “add capabilities” to Crush, positioning it as extensible beyond built-in LSP and file system capabilities.
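A transport-tagged config naturally suggests dispatch like the following sketch. The three `type` values match the transports the README names; the handling and config keys (`command`, `url`) are illustrative assumptions, not Crush’s implementation.

```python
def describe_mcp_server(cfg: dict) -> str:
    """Pick a connection strategy from an MCP server entry's transport type."""
    transport = cfg["type"]
    if transport == "stdio":
        # Spawn the server as a child process; speak JSON-RPC over its pipes.
        return f"spawn: {cfg['command']}"
    if transport == "http":
        # POST JSON-RPC messages to the server's HTTP endpoint.
        return f"post: {cfg['url']}"
    if transport == "sse":
        # Hold open a Server-Sent Events stream for server-pushed messages.
        return f"stream: {cfg['url']}"
    raise ValueError(f"unknown MCP transport: {transport}")
```

The stdio transport mirrors how LSP servers are typically launched, which keeps the two extension mechanisms operationally similar.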

Session Management: Multi-Project Context

Crush maintains “multiple work sessions and contexts per project” according to the features list. This appears designed to prevent context pollution between different projects and enable async workflows across repositories.
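The isolation property can be modeled in a few lines. This is a toy sketch of the concept, assuming nothing about Crush’s actual storage: each (project, session) pair owns its own message history, so only that history ever reaches the model.

```python
from collections import defaultdict

class SessionStore:
    """Toy model of per-project session isolation."""

    def __init__(self):
        self._histories = defaultdict(list)

    def append(self, project: str, session: str, message: str) -> None:
        self._histories[(project, session)].append(message)

    def context(self, project: str, session: str) -> list:
        # Only this session's messages are assembled into the prompt,
        # so context from one repo never bleeds into another.
        return list(self._histories[(project, session)])

store = SessionStore()
store.append("repo-a", "refactor", "rename the request handler")
store.append("repo-b", "bugfix", "fix the nil deref in parser")
```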

Gotcha

Cloud-Only Provider List

The README’s environment variable table lists only cloud-based API providers: Anthropic, OpenAI, Groq, OpenRouter, Gemini, Azure, AWS Bedrock, and others. There’s no mention of local model runtimes like Ollama or llama.cpp in the documented configuration. While Crush theoretically supports any OpenAI-compatible API (which could include locally hosted servers), this isn’t explicitly documented.
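Because the documented schema keys on API dialect rather than vendor, a local OpenAI-compatible server could in principle be slotted in the same way. The entry below is a hypothetical illustration of that assumption, shown as a plain dict mirroring the config shape; whether Crush accepts it is untested, and the Ollama URL is only an example of a common local endpoint.

```python
# Hypothetical provider entry; local runtimes are NOT in the documented list.
local_provider = {
    "id": "ollama",
    "name": "Ollama (local)",
    "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    "type": "openai",                         # reuse the OpenAI dialect
    "api_key": "unused",                      # local servers typically ignore the key
    "models": [{"id": "llama3", "name": "Llama 3"}],
}
```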

Manual LSP Configuration

Crush doesn’t appear to automatically discover or configure LSP servers based on the README examples. You’re responsible for installing language-specific LSP servers (gopls, rust-analyzer, etc.) and providing the configuration. For experienced developers already running LSP servers in their editors, this is straightforward. The README shows explicit configuration examples but doesn’t describe auto-detection capabilities.

Terminal-First Design

Crush is explicitly positioned as a terminal application (“Your new coding bestie, now available in your favourite terminal”). The README emphasizes terminal support across platforms but doesn’t describe GUI features or integrated editing capabilities. It appears designed as a conversational tool rather than a direct code manipulation interface.

Verdict

Use Crush if: You work primarily in the terminal, you’re comfortable configuring LSP servers for your languages, you work across multiple projects and value isolated conversation contexts, you want flexibility to switch between LLM providers, or you value extensibility through standard protocols (LSP and MCP). Crush appears designed for terminal-native developers who treat AI as a composable tool in their workflow.

Consider alternatives if: You prefer GUI-based AI coding tools, you want a solution that handles LSP configuration automatically, you need offline operation without setting up your own API endpoints, or you’re looking for an all-in-one solution rather than an orchestrator that integrates with existing tools. Crush positions itself as infrastructure that works with your existing development setup rather than replacing it.
