Context-Fabric
Corpus search and linguistic analysis for AI Agents
Production-ready corpus analysis for the age of AI
AI agents running advanced grammatical queries via the Model Context Protocol
Overview
Context-Fabric brings corpus analysis into the AI era. Built on the proven Text-Fabric data model, it introduces a memory-mapped architecture enabling parallel processing for production deployments—REST APIs, multi-worker services, and AI agent tools via MCP.
- Built for Production — Memory-mapped arrays enable true parallelization. Multiple workers share data instead of duplicating it.
- AI-Native — MCP server exposes corpus operations to Claude, GPT, and other LLM-powered tools.
- Powerful Data Model — Standoff annotation, graph traversal, pattern search, and arbitrary feature annotations.
- Dramatic Efficiency — 3.5x faster loads, 65% less memory in single process, 62% less with parallel workers.
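The data model behind these features can be pictured in a few lines. The sketch below is an illustrative toy, not Context-Fabric's implementation: in the Text-Fabric model it inherits, nodes are plain integers, words occupy "slots", and every annotation is a standoff feature mapping node to value (the feature names `otype`, `oslots`, and `sp` follow Text-Fabric conventions; the values are invented).

```python
# Toy standoff model: the text itself is never modified;
# all information lives in maps keyed by integer nodes.
slots = [1, 2, 3]                       # word nodes (the text)
otype = {1: "word", 2: "word", 3: "word", 4: "phrase"}
oslots = {4: (1, 2)}                    # phrase 4 spans words 1-2
sp = {1: "verb", 2: "subs", 3: "subs"}  # part-of-speech feature

def feature_value(feature, node):
    """Standoff lookup: annotations point at nodes, not at text."""
    return feature.get(node)

verbs = [n for n in slots if feature_value(sp, n) == "verb"]
print(verbs)  # [1]
```

Because features are independent maps, arbitrary new annotations can be layered onto a corpus without touching the text or the existing features.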
MCP Server for AI Agents
Context-Fabric includes cfabric-mcp, a Model Context Protocol server that exposes corpus operations to AI agents:
# Start the MCP server
cfabric-mcp --corpus /path/to/bhsa
# Or with SSE transport for remote clients
cfabric-mcp --corpus /path/to/bhsa --sse 8000
The server provides 10 tools for discovery, search, and data access—designed for iterative, token-efficient agent workflows.
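To use the server from an MCP-capable client, register it in the client's configuration. The snippet below assumes the standard `mcpServers` JSON format used by clients such as Claude Desktop; the server name `context-fabric` and the corpus path are illustrative.

```json
{
  "mcpServers": {
    "context-fabric": {
      "command": "cfabric-mcp",
      "args": ["--corpus", "/path/to/bhsa"]
    }
  }
}
```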
Memory Efficiency
Text-Fabric loads entire corpora into memory—effective for single-user research, but each parallel worker duplicates that memory footprint. Context-Fabric's memory-mapped arrays change the equation:
| Scenario | Memory Reduction |
|---|---|
| Single process | 65% less |
| 4 workers (spawn) | 62% less |
| 4 workers (fork) | 62% less |
Mean reduction across 10 corpora. Memory measured as total RSS after loading from cache.
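Why mapping helps can be sketched with the standard library alone. This is an illustrative analogy, not Context-Fabric's code: a feature array is written to disk once, and each worker memory-maps the same file read-only, so the OS shares the pages between processes instead of each worker deserializing a private copy.

```python
import mmap
import os
import struct
import tempfile
from multiprocessing import Process, Queue

def write_feature_cache(path, values):
    """Write a flat array of unsigned 32-bit ints as the 'cache'."""
    with open(path, "wb") as f:
        f.write(struct.pack(f"{len(values)}I", *values))

def read_feature(path, index, out):
    """Worker: map the shared cache and read one value, zero-copy."""
    with open(path, "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        # Pages are shared by the OS across processes, not duplicated.
        out.put(struct.unpack_from("I", mm, index * 4)[0])
        mm.close()

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "feature.bin")
    write_feature_cache(path, [10, 20, 30, 40])
    queue = Queue()
    workers = [Process(target=read_feature, args=(path, i, queue))
               for i in (1, 3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(sorted(queue.get() for _ in workers))  # [20, 40]
```

With a pickled in-memory store, each of the four workers would pay the full footprint again; with a mapped cache, the marginal cost per extra worker is close to zero.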
Installation
# Core library
pip install context-fabric
# With MCP server
pip install context-fabric[mcp]
Quick Start
from cfabric.core import Fabric

# Load a corpus
CF = Fabric(locations='path/to/corpus')
api = CF.load('feature1 feature2')

# Navigate nodes
for node in api.N.walk():
    print(api.F.feature1.v(node))

# Traverse structure
embedders = api.L.u(node)  # nodes containing this node
embedded = api.L.d(node)   # nodes within this node

# Search patterns (indentation expresses embedding)
results = api.S.search('''
clause
  phrase function=Pred
    word sp=verb
''')
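What the Quick Start search template expresses can be pictured with a toy containment check. Each indented template line must be embedded in the line above it; the node names and slot spans below are invented for illustration, and Context-Fabric's S API performs the real matching.

```python
# Toy spans: node -> (first_slot, last_slot), illustrative values.
spans = {
    "clause1": (1, 5),  # clause covers word slots 1-5
    "phrase1": (2, 3),  # phrase covers slots 2-3
    "word1": (2, 2),    # a single word at slot 2
}

def contains(outer, inner):
    """True when inner's slot span lies within outer's."""
    (a, b), (c, d) = spans[outer], spans[inner]
    return a <= c and d <= b

# clause > phrase > word, mirroring the three template lines
match = contains("clause1", "phrase1") and contains("phrase1", "word1")
print(match)  # True
```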
Core API
| API | Purpose |
|---|---|
| N | Walk nodes in canonical order |
| F | Access node features |
| E | Access edge features |
| L | Navigate locality (up/down the hierarchy) |
| T | Retrieve text representations |
| S | Search with structural templates |
Performance
Context-Fabric trades one-time compilation cost for dramatic runtime efficiency. Compile once, benefit forever.
| Metric | Mean Improvement |
|---|---|
| Load time | 3.5x faster |
| Memory (single) | 65% less |
| Memory (spawn) | 62% less |
| Memory (fork) | 62% less |
Mean across 10 corpora. The larger cache enables memory-mapped access—no deserialization, instant loads, shared memory across workers.
Run benchmarks yourself:
pip install context-fabric[benchmarks]
cfabric-bench memory --corpus path/to/corpus
Packages
| Package | Description |
|---|---|
| context-fabric | Core graph engine |
| cfabric-mcp | MCP server for AI agents |
| cfabric-benchmarks | Performance benchmarking suite |
Links
Citation
If you use Context-Fabric in your research, please cite:
Kingham, Cody. "Carrying Text-Fabric Forward: Context-Fabric and the Scalable Corpus Ecosystem." January 2026.
Authors
Context-Fabric by Cody Kingham, built on Text-Fabric by Dirk Roorda.
License
MIT