# LLM Context

Share code context with LLMs via Model Context Protocol or clipboard.

Reduce friction when providing context to LLMs. Share relevant project files instantly through smart selection and rule-based filtering.
## The Problem

Getting project context into LLM chats is tedious:

- Manually copying/pasting files takes forever
- It's hard to identify which files are relevant
- Including too much hits context limits; too little misses important details
- AI requests for additional files require manual fetching
- The whole process repeats for every conversation
## The Solution

```shell
lc-select    # Smart file selection
lc-context   # Instant formatted context
# Paste and work; the AI can access additional files seamlessly
```

Result: from "I need to share my project" to productive AI collaboration in seconds.
Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7 and 4.0), as well as Groks (3 and 4), using LLM Context itself to share code during development. All code in the repository is heavily human-curated (by me 😇, @restlessronin).
## Installation

```shell
uv tool install "llm-context>=0.5.0"
```
## Quick Start

### Basic Usage

```shell
# One-time setup
cd your-project
lc-init

# Daily usage
lc-select
lc-context
```
### MCP Integration (Recommended)

Add the server to your MCP client configuration:

```json
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```

With MCP, AI can access additional files directly during conversations.
## Project Customization

```shell
# Create a project-specific filter rule
cat > .llm-context/rules/flt-repo-base.md << 'EOF'
---
compose:
  filters: [lc/flt-base]
gitignores:
  full-files: ["*.md", "/tests", "/node_modules"]
---
EOF

# Customize the main development rule
cat > .llm-context/rules/prm-code.md << 'EOF'
---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
  excerpters: [lc/exc-base]
---
Additional project-specific guidelines and context.
EOF
```
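Each rule file created above is markdown with a YAML-style frontmatter block between `---` markers. As a rough illustration of that layout (the `split_frontmatter` helper is hypothetical, not llm-context's API; the tool's real parser is internal):

```python
# Minimal sketch of the rule-file layout: YAML-style frontmatter
# between "---" markers, followed by a free-form markdown body.
# split_frontmatter is a hypothetical helper, not llm-context's API.

def split_frontmatter(text: str) -> tuple[str, str]:
    """Split a rule file into (frontmatter, body) strings."""
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return "", text  # no frontmatter block
    for i, line in enumerate(lines[1:], start=1):
        if line.strip() == "---":
            return "\n".join(lines[1:i]), "\n".join(lines[i + 1:])
    return "", text  # unterminated frontmatter

rule = """---
compose:
  filters: [flt-repo-base]
---
Additional project-specific guidelines and context.
"""
meta, body = split_frontmatter(rule)
```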
## Core Commands

| Command | Purpose |
|---|---|
| `lc-init` | Initialize project configuration |
| `lc-select` | Select files based on current rule |
| `lc-context` | Generate and copy context |
| `lc-context -nt` | Generate context for non-MCP environments |
| `lc-set-rule <name>` | Switch between rules |
| `lc-missing` | Handle file and context requests (non-MCP) |
## Rule System

Rules use a systematic five-category structure:

- Prompt Rules (`prm-`): Generate project contexts (e.g., `lc/prm-developer`, `lc/prm-rule-create`)
- Filter Rules (`flt-`): Control file inclusion (e.g., `lc/flt-base`, `lc/flt-no-files`)
- Instruction Rules (`ins-`): Provide guidelines (e.g., `lc/ins-developer`, `lc/ins-rule-framework`)
- Style Rules (`sty-`): Enforce coding standards (e.g., `lc/sty-python`, `lc/sty-code`)
- Excerpt Rules (`exc-`): Configure extractions for context reduction (e.g., `lc/exc-base`)
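The `compose:` key lets one rule build on another. One way to picture filter composition is as merging gitignore patterns from composed parents before the rule's own; a hedged sketch (the rule data and depth-first merge order here are illustrative assumptions, not the tool's documented behavior):

```python
# Hedged sketch of filter composition: collect "full-files" ignore
# patterns depth-first, composed parents before the rule's own.
# Rule contents and merge order are illustrative assumptions.

RULES = {
    "lc/flt-base": {"gitignores": {"full-files": [".git", "*.lock"]}},
    "flt-repo-base": {
        "compose": {"filters": ["lc/flt-base"]},
        "gitignores": {"full-files": ["*.md", "/tests"]},
    },
}

def resolve_patterns(name: str, rules: dict) -> list[str]:
    """Flatten a filter rule and its composed parents into one pattern list."""
    rule = rules[name]
    patterns: list[str] = []
    for parent in rule.get("compose", {}).get("filters", []):
        patterns += resolve_patterns(parent, rules)
    patterns += rule.get("gitignores", {}).get("full-files", [])
    return patterns
```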
### Example Rule

```markdown
---
description: "Debug API authentication issues"
compose:
  filters: [lc/flt-no-files]
  excerpters: [lc/exc-base]
also-include:
  full-files: ["/src/auth/**", "/tests/auth/**"]
---
Focus on authentication system and related tests.
```
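The `also-include` globs above pull specific paths back in even when a filter excludes everything else. Their matching can be approximated with Python's `fnmatch` (an approximation only; the tool's actual matcher may follow gitignore semantics, which differ in detail):

```python
# Approximate matching of "also-include" globs with fnmatch.
# fnmatch's "*" crosses "/" boundaries, so "/src/auth/**" behaves like
# "everything under /src/auth". Real gitignore semantics may differ.
from fnmatch import fnmatch

def matches_any(relpath: str, patterns: list[str]) -> bool:
    """Check a repo-relative path against root-anchored glob patterns."""
    return any(fnmatch("/" + relpath, pat) for pat in patterns)

patterns = ["/src/auth/**", "/tests/auth/**"]
```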
## Workflow Patterns

### Daily Development

```shell
lc-set-rule lc/prm-developer
lc-select
lc-context
# AI can review changes and access additional files as needed
```

### Focused Tasks

```shell
# Let AI help create minimal context
lc-set-rule lc/prm-rule-create
lc-context -nt
# Work with AI to create a task-specific rule using the tmp-prm- prefix
```
## MCP Benefits
- Code review: AI examines your changes for completeness/correctness
- Additional files: AI accesses initially excluded files when needed
- Change tracking: See what's been modified during conversations
- Zero friction: No manual file operations during development discussions
## Key Features
- Smart File Selection: Rules automatically include/exclude appropriate files
- Instant Context Generation: Formatted context copied to clipboard in seconds
- MCP Integration: AI can access additional files without manual intervention
- Systematic Rule Organization: Five-category system for clear rule composition
- AI-Assisted Rule Creation: Let AI help create minimal context for specific tasks
- Code Excerpting: Extracts significant content to reduce context size while preserving structure
## Learn More

- User Guide: complete documentation
- Design Philosophy
- Real-world Examples

## License

Apache License, Version 2.0. See LICENSE for details.