Vectorize MCP server for advanced retrieval, Private Deep Research, Anything-to-Markdown file extraction and text chunking.
A Model Context Protocol (MCP) server implementation that integrates with Vectorize for advanced vector retrieval and text extraction.
To run the server with npx, set your Vectorize organization ID, access token, and pipeline ID as environment variables, then start it:
export VECTORIZE_ORG_ID=YOUR_ORG_ID
export VECTORIZE_TOKEN=YOUR_TOKEN
export VECTORIZE_PIPELINE_ID=YOUR_PIPELINE_ID
npx -y @vectorize-io/vectorize-mcp-server@latest
VS Code supports one-click installation via the install buttons in the project README. To install manually, add the following JSON block to your User Settings (JSON) file in VS Code: press Ctrl + Shift + P and run Preferences: Open User Settings (JSON).
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "org_id",
        "description": "Vectorize Organization ID"
      },
      {
        "type": "promptString",
        "id": "token",
        "description": "Vectorize Token",
        "password": true
      },
      {
        "type": "promptString",
        "id": "pipeline_id",
        "description": "Vectorize Pipeline ID"
      }
    ],
    "servers": {
      "vectorize": {
        "command": "npx",
        "args": ["-y", "@vectorize-io/vectorize-mcp-server@latest"],
        "env": {
          "VECTORIZE_ORG_ID": "${input:org_id}",
          "VECTORIZE_TOKEN": "${input:token}",
          "VECTORIZE_PIPELINE_ID": "${input:pipeline_id}"
        }
      }
    }
  }
}
Optionally, you can add the following to a file called .vscode/mcp.json
in your workspace to share the configuration with others:
{
  "inputs": [
    {
      "type": "promptString",
      "id": "org_id",
      "description": "Vectorize Organization ID"
    },
    {
      "type": "promptString",
      "id": "token",
      "description": "Vectorize Token",
      "password": true
    },
    {
      "type": "promptString",
      "id": "pipeline_id",
      "description": "Vectorize Pipeline ID"
    }
  ],
  "servers": {
    "vectorize": {
      "command": "npx",
      "args": ["-y", "@vectorize-io/vectorize-mcp-server@latest"],
      "env": {
        "VECTORIZE_ORG_ID": "${input:org_id}",
        "VECTORIZE_TOKEN": "${input:token}",
        "VECTORIZE_PIPELINE_ID": "${input:pipeline_id}"
      }
    }
  }
}
For MCP clients that use an mcpServers configuration (for example, Claude Desktop, Cursor, or Windsurf), add the server to the client's configuration file:
{
  "mcpServers": {
    "vectorize": {
      "command": "npx",
      "args": ["-y", "@vectorize-io/vectorize-mcp-server@latest"],
      "env": {
        "VECTORIZE_ORG_ID": "your-org-id",
        "VECTORIZE_TOKEN": "your-token",
        "VECTORIZE_PIPELINE_ID": "your-pipeline-id"
      }
    }
  }
}
Perform vector search and retrieve documents (see official API):
{
  "name": "retrieve",
  "arguments": {
    "question": "Financial health of the company",
    "k": 5
  }
}
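As a rough sketch of how a client could launch the server and call this tool programmatically, assuming the @modelcontextprotocol/sdk TypeScript client (credentials and the client name are placeholders):
// Sketch: launching the server over stdio and calling the retrieve tool.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@vectorize-io/vectorize-mcp-server@latest"],
  // Depending on your setup, you may also need to pass through your shell
  // environment (e.g., PATH) alongside these variables.
  env: {
    VECTORIZE_ORG_ID: "your-org-id",
    VECTORIZE_TOKEN: "your-token",
    VECTORIZE_PIPELINE_ID: "your-pipeline-id",
  },
});

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Same arguments as the JSON example above.
const result = await client.callTool({
  name: "retrieve",
  arguments: { question: "Financial health of the company", k: 5 },
});
console.log(result.content);

await client.close();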
Extract text from a document and chunk it into Markdown format (see official API):
{
  "name": "extract",
  "arguments": {
    "base64document": "base64-encoded-document",
    "contentType": "application/pdf"
  }
}
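The base64document value is the raw file encoded as base64. A minimal Node.js sketch for assembling the tool arguments from a local file (the path ./report.pdf is illustrative):
// Sketch: encoding a local PDF and building the extract tool arguments.
import { readFile } from "node:fs/promises";

const pdfBytes = await readFile("./report.pdf"); // illustrative path
const extractCall = {
  name: "extract",
  arguments: {
    base64document: pdfBytes.toString("base64"),
    contentType: "application/pdf",
  },
};
// Pass extractCall to client.callTool(...) as in the retrieval sketch above.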
Generate a Private Deep Research report from your pipeline (see official API):
{
  "name": "deep-research",
  "arguments": {
    "query": "Generate a financial status report about the company",
    "webSearch": true
  }
}
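The same client pattern from the retrieval sketch applies here; reusing that connected client (assumed, not shown again):
const research = await client.callTool({
  name: "deep-research",
  arguments: {
    query: "Generate a financial status report about the company",
    webSearch: true, // enable web search for the report; see the official API for details
  },
});
console.log(research.content); // generated research report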
For local development, install dependencies and run the dev server:
npm install
npm run dev
To release a new version, update the version in package.json, then commit, tag, and push:
git commit -am "x.y.z"
git tag x.y.z
git push origin
git push origin --tags
An MCP server providing semantic search capabilities for APLCart data.
An MCP server for advanced research assistance, configurable via environment variables.
Provides an AI search tool to enhance AI model responses with real-time search results from various search engines using the Higress ai-search feature.
Local RAG (on-premises) with MCP server.
Search for news articles using the Naver News API. Requires Naver News API credentials.
A Model Context Protocol (MCP) server for the Open Library API that enables AI assistants to search for book and author information.
Search and download scientific articles from PubMed's E-utilities API.
Access the Wolfram Alpha API for computational knowledge and real-time data.
Search for videos and users, and retrieve danmaku, using the Bilibili API.
Semantic search for Hex package documentation. Requires local Elixir and Mix installation.