An MCP server that dynamically loads tools from an external JSON file whose path is configured via an environment variable (`TOOLS_PATH`).
After cloning the repository, install the dependencies:

```bash
yarn install
```

Then add a `tools.json` file to the root of the project, for example:
```json
{
  "tools": [
    {
      "name": "architecture_info",
      "description": "Obtains mandatory information about the architecture of frontend application projects",
      "inputSchema": {},
      "plugin": {
        "name": "file",
        "args": {
          "path": "/path/to/folder/public/architecture.md"
        }
      }
    },
    {
      "name": "search_tasks",
      "description": "Before executing this function, you must retrieve the project architecture information from 'architecture_info'; that information is mandatory and must be respected. Then find the task in question, analyze what needs to be done, and implement it in the project according to the architecture and requirements. Do not invent anything beyond what is required.",
      "inputSchema": {},
      "plugin": {
        "name": "file",
        "args": {
          "path": "/path/to/folder/public/tasks.txt"
        }
      }
    },
    {
      "name": "optimize_prompt",
      "description": "Generates a final, structured prompt for the AI model based on the provided context sections and instructions. This tool should be called after all relevant data has been collected. The result is intended to be used as the FINAL prompt for the AI. Clients must use the returned prompt as the input for the AI model.",
      "inputSchema": {
        "type": "object",
        "properties": {
          "sections": {
            "type": "array",
            "items": {
              "type": "object",
              "properties": {
                "title": { "type": "string" },
                "content": { "type": "string" }
              },
              "required": ["title", "content"]
            }
          },
          "instructions": { "type": "string" }
        },
        "required": ["sections"]
      },
      "plugin": {
        "name": "promptOptimizer",
        "args": {}
      }
    }
  ]
}
```
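For illustration, here is a minimal TypeScript sketch of how such a plugin-based dispatch could work. The interfaces and the `callTool` helper are hypothetical names for this sketch, not the project's actual code: a `file` plugin would simply return the contents of the configured file, while `promptOptimizer` would assemble the collected sections into a final prompt.

```typescript
import { readFile } from "node:fs/promises";

// Shape of one entry in tools.json (mirrors the example above).
interface ToolConfig {
  name: string;
  description: string;
  inputSchema: object;
  plugin: { name: string; args: Record<string, unknown> };
}

// A plugin receives the tool's static args plus the caller's input.
type Plugin = (args: Record<string, unknown>, input: any) => Promise<string>;

const plugins: Record<string, Plugin> = {
  // "file" ignores the caller's input and returns the configured file's contents.
  file: async (args) => readFile(String(args.path), "utf-8"),

  // "promptOptimizer" joins the collected sections and the optional
  // instructions into a single structured prompt string.
  promptOptimizer: async (_args, input) => {
    const sections = (input.sections as { title: string; content: string }[])
      .map((s) => `## ${s.title}\n${s.content}`)
      .join("\n\n");
    return input.instructions ? `${sections}\n\n${input.instructions}` : sections;
  },
};

// Dispatch a tool call to the plugin named in its configuration.
async function callTool(tool: ToolConfig, input: any): Promise<string> {
  const plugin = plugins[tool.plugin.name];
  if (!plugin) throw new Error(`Unknown plugin: ${tool.plugin.name}`);
  return plugin(tool.plugin.args, input);
}
```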
To build the project, run:

```bash
yarn build
```
To register the server with an MCP client (for example, Claude Desktop), point the client at the entry point and pass the path to your `tools.json` via the `TOOLS_PATH` environment variable:

```json
{
  "mcpServers": {
    "mcp-assistant-local": {
      "command": "npx",
      "args": [
        "tsx",
        "/path/to/folder/src/index.ts"
      ],
      "env": {
        "TOOLS_PATH": "/path/to/folder/tools.json"
      }
    }
  }
}
```
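At startup the server resolves its tool definitions from the file named by `TOOLS_PATH`. As a rough sketch of that step, assuming the server exits if the variable is missing (hypothetical code, not the project's actual entry point):

```typescript
import { readFileSync } from "node:fs";

// Read the tool definitions from the file named by TOOLS_PATH.
const toolsPath = process.env.TOOLS_PATH;
if (!toolsPath) {
  // Log to stderr: stdio-based MCP servers reserve stdout for the protocol.
  console.error("TOOLS_PATH environment variable is not set");
  process.exit(1);
}

const { tools } = JSON.parse(readFileSync(toolsPath, "utf-8"));
console.error(`Loaded ${tools.length} tool(s) from ${toolsPath}`);
```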
This MCP server is licensed under the MIT License: you are free to use, modify, and distribute the software, subject to its terms and conditions. For more details, see the LICENSE file in the project repository.