A knowledge management tool for code repositories using vector embeddings, powered by a local Ollama service. It helps you maintain and query knowledge about your codebase through embedding-based retrieval.
First, you need to build the distribution files:
# Clone the repository
git clone https://github.com/yourusername/code-knowledge-tool.git
cd code-knowledge-tool
# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate
# Install build tools
python -m pip install --upgrade pip build
# Build the package
python -m build
This will create two files in the dist/ directory:
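The exact filenames depend on the version, but for 0.1.0 they should look like the listing below (the wheel name matches the install command further down; the tarball is the conventional source distribution):

ls dist/
# code_knowledge_tool-0.1.0-py3-none-any.whl   (wheel, installed below)
# code_knowledge_tool-0.1.0.tar.gz             (source distribution)

Before installing the package, make sure Ollama is available locally, since the tool relies on it for embeddings: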
# Install Ollama (if not already installed)
curl https://ollama.ai/install.sh | sh
# Start Ollama service
ollama serve
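Depending on how the tool is configured, you may also need an embedding model available in Ollama. The model name below is only an example, so substitute whichever model your configuration expects:

# Pull an embedding model (example name; adjust to your configuration)
ollama pull nomic-embed-text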
# Navigate to where you built the package
cd /path/to/code_knowledge_tool
# Install from the wheel file
pip install dist/code_knowledge_tool-0.1.0-py3-none-any.whl
This option is best if you want to modify the tool or contribute to its development:
# Assuming you're already in the code-knowledge-tool directory
# and have activated your virtual environment
# Install in editable mode with development dependencies
pip install -e ".[dev]"
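Whichever option you choose, a quick sanity check is to import the package and launch the server module directly; the module path is the same one used in the MCP client configuration below, and the exact startup output may vary:

# Confirm the package imports cleanly
python -c "import code_knowledge_tool; print(code_knowledge_tool.__name__)"

# Launch the MCP server directly (Ctrl+C to stop)
python -m code_knowledge_tool.mcp_tool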
For Cline (VSCode):
# Open the settings file
open ~/Library/Application\ Support/Code/User/globalStorage/rooveterinaryinc.roo-cline/settings/cline_mcp_settings.json
Add this configuration:
{
  "mcpServers": {
    "code_knowledge": {
      "command": "python",
      "args": ["-m", "code_knowledge_tool.mcp_tool"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    }
  }
}
For RooCode:
# Open the settings file
open ~/Library/Application\ Support/RooCode/roocode_config.json
Add the same configuration as above.
This tool can serve as your project's memory bank and RAG context provider. To set this up:
cp clinerules_template.md /path/to/your/project/.clinerules
The template includes comprehensive setup and usage instructions; see clinerules_template.md for the full configuration details.
The project follows an integration-first testing approach, focusing on end-to-end functionality and MCP contract compliance. The test suite consists of MCP contract tests and package build tests.
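As a rough illustration of what contract compliance means, the sketch below sends a single MCP initialize request to the server and expects a JSON-RPC response on stdout. It assumes the server follows the standard newline-delimited stdio transport and is only a manual smoke test; the real contract tests live in tests/integration/:

# Send an initialize request to the server's stdin and read the reply
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' | python -m code_knowledge_tool.mcp_tool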
To run the tests:
# Install test dependencies
pip install -e ".[dev]"
# Run all tests
pytest
# Run specific test suites
pytest tests/integration/test_mcp_contract.py -v # MCP functionality
pytest tests/integration/test_package_build.py -v # Installation verification
Test Environment Requirements:
# Ensure Ollama is running
ollama serve
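If you're not sure whether Ollama is already running, you can probe its API on the default local port (11434 is Ollama's default; adjust if yours differs):

# Should return a JSON list of installed models if the service is up
curl http://localhost:11434/api/tags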
The tests use a temporary directory (test_knowledge_store) that is cleaned up automatically between test runs.
For more details on the testing strategy and patterns, see the documentation in docs/.
If you want to make this package available through pip (i.e., pip install code-knowledge-tool), you would need to:
# Install the publishing tool
pip install twine

# Upload the built distributions to PyPI
twine upload dist/*
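A couple of optional safeguards if you do publish: twine check validates the built metadata, and TestPyPI lets you rehearse the upload before the real thing:

# Validate the built distributions
twine check dist/*

# Optional dry run against TestPyPI
twine upload --repository testpypi dist/*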
However, for now, use the local build and installation methods described above.
MIT License