Share code context with LLMs via Model Context Protocol or clipboard.
LLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages .gitignore
patterns for smart file selection and provides both a streamlined clipboard workflow using the command line and direct LLM integration through the Model Context Protocol (MCP).
Note: This project was developed in collaboration with several Claude Sonnets - 3.5, 3.6 and 3.7 (and more recently Grok-3 as well), using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).
We've switched to a Markdown (+ YAML front matter)-based rules system replacing the previous TOML/YAML-based profiles. This is a breaking change that affects configuration. See the User Guide for details on the new rule format and how to use it.
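For illustration only, a rule is a Markdown file whose YAML front matter carries the configuration and whose body holds free-form guidance. The field names and path below are hypothetical placeholders, not the documented schema; consult the User Guide for the actual keys.

```markdown
---
# Hypothetical front-matter field, shown only to illustrate the shape of a rule
# (assumed location: .llm-context/rules/<name>.md)
description: "Focus on the core library, skipping tests"
---

Free-form instructions for the LLM go in the Markdown body.
```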
For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: LLM Context: Harnessing Vanilla AI Chats for Development
To see LLM Context in action with real-world examples and workflows, read: Full Context Magic - When AI Finally Understands Your Entire Project
Install LLM Context using uv:
uv tool install "llm-context>=0.3.0"
To upgrade to the latest version:
uv tool upgrade llm-context
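To confirm that the install or upgrade took effect, you can list the executables uv manages; `uv tool list` is a standard uv command, and the exact entries shown will depend on your environment.

```shell
# Show installed uv-managed tools and their entry points
uv tool list
```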
Warning: LLM Context is under active development. Updates may overwrite configuration files prefixed with lc-. We recommend all configuration files be version controlled for this reason.
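Since updates can overwrite the lc- prefixed files, one way to follow this recommendation is to commit the configuration directory to git. The sketch below assumes the configuration lives in .llm-context/, matching the path referenced later in this README.

```shell
# Track LLM Context configuration so overwritten lc- files can be diffed or restored
git add .llm-context
git commit -m "Track llm-context configuration"
```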
Add to 'claude_desktop_config.json':
{
"mcpServers": {
"CyberChitta": {
"command": "uvx",
"args": ["--from", "llm-context", "lc-mcp"]
}
}
}
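If you have already installed llm-context globally with uv tool install (so that lc-mcp is on your PATH), pointing the server entry directly at that executable should also work. This variant is an assumption rather than the documented configuration; the uvx form above is the one shown in this README.

```json
{
  "mcpServers": {
    "CyberChitta": {
      "command": "lc-mcp"
    }
  }
}
```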
Once configured, you can start working with your project in two simple ways:
Say: "I would like to work with my project" Claude will ask you for the project root path.
Or directly specify: "I would like to work with my project /path/to/your/project" Claude will automatically load the project context.
For optimal results, combine initial context through Claude's Project Knowledge UI with dynamic code access via MCP. This provides both comprehensive understanding and access to latest changes. See Full Context Magic for details and examples.
For the command-line workflow:
- Run lc-init to set up the project configuration (only needed once).
- Run lc-sel-files to select the files to include.
- Optionally review the selection in .llm-context/curr_ctx.yaml.
- Run lc-context to generate and copy the context (with optional flags: -p for prompt, -u for user notes); use lc-context -p to include instructions.
- When the LLM requests additional files, run lc-clip-files.

These steps are strung together in the example session below.
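A minimal sketch of such a session, assuming the commands are run from the project root; the comments paraphrase the command descriptions given in this README.

```shell
# One-time initialization of the project configuration
lc-init

# Select files for inclusion under the current rule
lc-sel-files

# Optionally inspect what was selected
cat .llm-context/curr_ctx.yaml

# Generate the context and copy it, including prompt instructions (-p)
# and user notes (-u)
lc-context -p -u

# When the LLM asks for additional files, copy its request and run:
lc-clip-files
```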
Core commands:
- lc-init: Initialize project configuration
- lc-set-rule <n>: Switch rules (system rules are prefixed with "lc-")
- lc-sel-files: Select files for inclusion
- lc-sel-outlines: Select files for outline generation
- lc-context [-p] [-u] [-f FILE]: Generate and copy context
  - -p: Include prompt instructions
  - -u: Include user notes
  - -f FILE: Write to output file
- lc-prompt: Generate project instructions for LLMs
- lc-clip-files: Process LLM file requests
- lc-changed: List files modified since last context generation
- lc-outlines: Generate outlines for code files
- lc-clip-implementations: Extract code implementations requested by LLMs (doesn't support C/C++)
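The lc-context flags listed above can be combined; for example (context.md is an arbitrary output filename chosen for illustration):

```shell
# Prompt instructions plus user notes, copied to the clipboard
lc-context -p -u

# Write the generated context to an output file via -f
lc-context -p -f context.md

# See which files changed since the context was last generated
lc-changed
```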
LLM Context provides advanced features for customizing how project content is captured and presented, including smart file selection driven by .gitignore patterns, code outlines, and implementation extraction via the lc-clip-implementations command. See our User Guide for detailed documentation of these features.
Check out our comprehensive list of alternatives - the sheer number of tools tackling this problem demonstrates its importance to the developer community.
LLM Context evolves from a lineage of AI-assisted development tools.
I am grateful for the open-source community's innovations and the AI assistance that have shaped this project's evolution.
I am grateful for the help of Claude-3.5-Sonnet in the development of this project.
This project is licensed under the Apache License, Version 2.0. See the LICENSE file for details.