A knowledge management tool for code repositories using vector embeddings, powered by a local Ollama service. It helps you maintain and query knowledge about your codebase using embedding techniques.
First, you need to build the distribution files:
# Clone the repository
git clone https://github.com/yourusername/code-knowledge-tool.git
cd code-knowledge-tool
# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate
# Install build tools
python -m pip install --upgrade pip build
# Build the package
python -m build
This will create two files in the dist/ directory: code_knowledge_tool-0.1.0-py3-none-any.whl (the wheel) and code_knowledge_tool-0.1.0.tar.gz (the source distribution).
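You can confirm both artifacts were produced before installing:
# List the build artifacts
ls dist/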
The tool requires a running Ollama service to generate embeddings. Install and start it before using the tool:
# Install Ollama (if not already installed)
curl https://ollama.ai/install.sh | sh
# Start Ollama service
ollama serve
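Ollama listens on http://localhost:11434 by default, so a quick way to confirm the service is reachable:
# List locally available models via the Ollama API
curl http://localhost:11434/api/tags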
With Ollama running, install the package from the wheel you built:
# Navigate to where you built the package
cd /path/to/code_knowledge_tool
# Install from the wheel file
pip install dist/code_knowledge_tool-0.1.0-py3-none-any.whl
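A quick, optional sanity check that the install succeeded (the import name code_knowledge_tool matches the module used by the MCP configuration below):
# Confirm the package metadata is visible to pip
pip show code-knowledge-tool
# Confirm the module can be imported
python -c "import code_knowledge_tool"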
Alternatively, install the package in editable mode if you want to modify the tool or contribute to its development:
# Assuming you're already in the code-knowledge-tool directory
# and have activated your virtual environment
# Install in editable mode with development dependencies
pip install -e ".[dev]"
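With the development dependencies installed, running the test suite is a straightforward smoke test for the editable install (it assumes the Ollama service is already running, as described under testing below):
# Run the full test suite as a smoke test
pytest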
To register the tool as an MCP server in Cline (VSCode):
# Open the settings file
open ~/Library/Application\ Support/Code/User/globalStorage/rooveterinaryinc.roo-cline/settings/cline_mcp_settings.json
Add this configuration:
{
  "mcpServers": {
    "code_knowledge": {
      "command": "python",
      "args": ["-m", "code_knowledge_tool.mcp_tool"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    }
  }
}
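To verify the server starts with this configuration, you can run the same command manually, substituting your actual project path for ${workspaceFolder} (the path below is just a placeholder; stop the server with Ctrl+C):
# Launch the MCP server the same way the editor would
PYTHONPATH=/path/to/your/project python -m code_knowledge_tool.mcp_tool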
For RooCode:
# Open the settings file
open ~/Library/Application\ Support/RooCode/roocode_config.json
Add the same configuration as above.
This tool can serve as your project's memory bank and RAG context provider. To set this up:
cp clinerules_template.md /path/to/your/project/.clinerules
The template includes comprehensive instructions for using the tool as a memory bank and RAG context provider; see clinerules_template.md for the full configuration and usage details.
The project follows an integration-first testing approach, focusing on end-to-end functionality and MCP contract compliance. The test suite consists of:
- MCP Contract Tests
- Package Build Tests
To run the tests:
# Install test dependencies
pip install -e ".[dev]"
# Run all tests
pytest
# Run specific test suites
pytest tests/integration/test_mcp_contract.py -v # MCP functionality
pytest tests/integration/test_package_build.py -v # Installation verification
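While iterating, standard pytest flags can help narrow a run, for example:
# Stop at the first failure
pytest -x
# Re-run only the tests that failed last time
pytest --lf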
Test Environment Requirements:
# Ensure Ollama is running
ollama serve
The tests use a temporary directory (test_knowledge_store) that is cleaned up automatically between test runs.
For more details on the testing strategy and patterns, see the documentation in docs/.
If you want to make this package available through pip (i.e., pip install code-knowledge-tool), you would need to upload the built distributions to PyPI:
pip install twine
twine upload dist/*
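Before uploading, it can help to validate the archives and optionally do a trial upload to TestPyPI (twine's --repository flag selects the target index):
# Check that the distribution metadata renders correctly
twine check dist/*
# Optional: upload to TestPyPI first
twine upload --repository testpypi dist/*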
However, for now, use the local build and installation methods described above.
MIT License