Kanka MCP Server

A minimal Model Context Protocol (MCP) proxy for the Kanka REST API, built with Node.js and Express, for managing entries on the Kanka platform.
Prerequisites
- Node.js 20+.
- A Kanka API token.
Install
```bash
npm install
```
MCP Installation Rules (client-side)
This section follows the same installation logic used in the official MCP docs and official MCP servers:

- Use explicit `command`, `args`, and `env` in your client config.
- Use absolute paths for local scripts/binaries (avoid relative paths).
- Keep secrets in `env` (for example `KANKA_API_TOKEN`), not hardcoded in code/config files.
- Pick one transport for each use case:
  - local STDIO (`node index.js --stdio`)
  - remote Streamable HTTP (`/mcp`)
- Restart the MCP client after editing its MCP config.
Option A: Local STDIO server (recommended for desktop clients)
Use a standard MCP `mcpServers` entry (the same structure used in official examples):
```json
{
  "mcpServers": {
    "kanka": {
      "command": "node",
      "args": ["<ABSOLUTE_PATH_TO_REPO>/index.js", "--stdio"],
      "env": {
        "KANKA_API_TOKEN": "<YOUR_KANKA_TOKEN>"
      }
    }
  }
}
```
Notes:

- On Windows, prefer forward slashes in paths (`C:/...`) or escaped backslashes (`C:\\...`).
- Optional OAuth env vars are supported too: `KANKA_CLIENT_ID`, `KANKA_CLIENT_SECRET`, `KANKA_REDIRECT_URI`.
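For orientation, a minimal sketch of the first JSON-RPC message an MCP client writes to the spawned process's stdin after launching it with this config. The field values (`clientInfo`, capabilities) are illustrative assumptions, not taken from this server's code; real clients delegate this framing to an MCP SDK:

```javascript
// Sketch: the JSON-RPC "initialize" message that opens the MCP
// STDIO handshake. Values below are illustrative placeholders.
const initialize = {
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-11-25",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.0.1" },
  },
};

// STDIO transports exchange newline-delimited JSON messages.
const wire = JSON.stringify(initialize) + "\n";
console.log(wire.trim());
```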
Option B: Streamable HTTP server
Start the server with `PORT` set:

```bash
# Bash
PORT=5000 npm start
```

```powershell
# PowerShell
$env:PORT = "5000"
npm start
```
Then connect an MCP client to `http://127.0.0.1:5000/mcp`.
Auth can be provided using:

- `Authorization: Bearer <token>` (preferred), or
- `?token=<token>` on the first initialization request.
Run modes
- STDIO (CLI/IDE): `node index.js --stdio` or `npm start -- --stdio`. This is also the default when `PORT` is unset.
- HTTP / Streamable MCP: `PORT=5000 npm start` (defaults to `5000`). You can pass `?token=<your_token>` on the first call if you do not want to rely on the env var.
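The mode selection above can be sketched as a small function. This is a hypothetical reconstruction of the rule as documented (`--stdio` forces STDIO; otherwise `PORT` selects HTTP; an unset `PORT` falls back to STDIO), not the server's actual implementation:

```javascript
// Hypothetical sketch of the run-mode rule described above.
function chooseTransport(argv, env) {
  if (argv.includes("--stdio")) return { mode: "stdio" };       // forced STDIO
  if (env.PORT) return { mode: "http", port: Number(env.PORT) || 5000 };
  return { mode: "stdio" };                                     // PORT unset: default STDIO
}

console.log(chooseTransport(["--stdio"], {}));        // forced STDIO
console.log(chooseTransport([], { PORT: "5000" }));   // HTTP mode
console.log(chooseTransport([], {}));                 // default STDIO
```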
OAuth helper endpoints
- `GET /oauth/login`: redirects to Kanka for OAuth consent (requires `KANKA_CLIENT_ID` and `KANKA_REDIRECT_URI`).
- `GET /oauth/callback`: exchanges the returned `code` for an `access_token` and `refresh_token` and returns the payload.
- `GET /.well-known/oauth-authorization-server`: OAuth metadata for MCP clients.
- `GET /oauth/authorize`: starts the OAuth flow (proxying through Kanka at `app.kanka.io`).
- `POST /oauth/token`: exchanges authorization codes (and refresh tokens) for access tokens via `app.kanka.io`.

You can also override Kanka OAuth settings per request by passing `kanka_client_id`, `kanka_client_secret`, `kanka_redirect_uri`, and/or `scope` as query parameters (authorize/login) or form fields (token). When omitted, no scope is sent to Kanka (recommended).
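As an example of the per-request overrides, here is a sketch that assembles an authorize URL with `URLSearchParams`. The base URL and client values are placeholders for illustration:

```javascript
// Sketch: building a /oauth/authorize URL with per-request overrides.
function buildAuthorizeUrl(base, overrides = {}) {
  const url = new URL("/oauth/authorize", base);
  const keys = ["kanka_client_id", "kanka_client_secret", "kanka_redirect_uri", "scope"];
  for (const key of keys) {
    if (overrides[key]) url.searchParams.set(key, overrides[key]);
  }
  return url.toString();
}

const example = buildAuthorizeUrl("http://127.0.0.1:5000", {
  kanka_client_id: "my-client-id",
  kanka_redirect_uri: "http://127.0.0.1:5000/oauth/callback",
  // scope intentionally omitted: when absent, no scope is sent to Kanka.
});
console.log(example);
```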
MCP endpoints
The server exposes MCP-compatible transports. Clients handle initialization and tool calls; no custom JSON endpoints are required.
Streamable HTTP (recommended, protocol 2025-11-25):

- `GET /mcp` (or `/` when the client expects an SSE stream) for the SSE stream (send `Authorization: Bearer <token>` or `?token=<token>`)
- `POST /mcp` for JSON-RPC requests (send `Authorization: Bearer <token>` or `?token=<token>` on the first initialize call if not using the env var)
- `DELETE /mcp` to terminate a session

Deprecated HTTP+SSE fallback (protocol 2024-11-05):

- `GET /sse` to open the SSE stream (send `Authorization: Bearer <token>` or `?token=<token>`)
- `POST /message?sessionId=<id>` to send JSON-RPC
- `POST /messages?sessionId=<id>` as an alias for legacy clients
Token handling:
- Set `KANKA_API_TOKEN` in the environment for a default token.
- Supply `Authorization: Bearer <token>` (preferred) or `?token=<token>` when initiating HTTP/SSE sessions if you prefer per-session tokens.
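The precedence implied above (per-session header first, then the query parameter, then the environment default) can be sketched as a small resolver. The function name and exact order are a hypothetical illustration, not the server's actual code:

```javascript
// Hypothetical sketch of token resolution: Authorization header,
// then the ?token= query parameter, then KANKA_API_TOKEN.
function resolveToken(headers, query, env) {
  const auth = headers["authorization"];
  if (auth && auth.startsWith("Bearer ")) return auth.slice("Bearer ".length);
  if (query.token) return query.token;
  return env.KANKA_API_TOKEN || null;
}

console.log(resolveToken({ authorization: "Bearer abc123" }, {}, {}));       // "abc123"
console.log(resolveToken({}, { token: "qrs" }, { KANKA_API_TOKEN: "env" })); // "qrs"
console.log(resolveToken({}, {}, { KANKA_API_TOKEN: "env" }));               // "env"
```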
Contributing
Community contributions are welcome.
- Read CONTRIBUTING.md for workflow, coding standards, and PR checklist.
- Review CODE_OF_CONDUCT.md before participating.
- Use the provided GitHub issue templates and PR template to keep reports and reviews consistent.
Local quality checks
```bash
npm run lint
npm run format:check
npm test
```