Pinecone Developer MCP Server
The Model Context Protocol (MCP) is a standard that allows coding assistants and other AI tools to interact with platforms like Pinecone. The Pinecone Developer MCP Server allows you to connect these tools with Pinecone projects and documentation.
Once connected, AI tools can:
- Search Pinecone documentation to answer questions accurately.
- Help you configure indexes based on your application's needs.
- Generate code informed by your index configuration and data, as well as Pinecone documentation and examples.
- Upsert and search for data in indexes, allowing you to test queries and evaluate results within your dev environment.
See the docs for more detailed information.
This MCP server is focused on improving the experience of developers working with Pinecone as part of their technology stack. It is intended for use with coding assistants. Pinecone also offers the Assistant MCP, which is designed to provide AI assistants with relevant context sourced from your knowledge base.
Setup
To configure the MCP server to access your Pinecone project, you will need to generate an API key using the Pinecone console. Without an API key, your AI tool will still be able to search documentation; however, it will not be able to manage or query your indexes.
The MCP server requires Node.js v18 or later. Ensure that node and npx are available in your PATH.
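As a quick sanity check, the following POSIX-shell sketch extracts the major version from a `node --version` string and compares it against the v18 requirement. The sample value is illustrative; substitute `version="$(node --version)"` on your machine:

```shell
# Sample version string; replace with: version="$(node --version)"
version="v18.19.0"
major="${version#v}"      # strip the leading "v"
major="${major%%.*}"      # keep only the major component
if [ "$major" -ge 18 ]; then
  echo "Node.js $version meets the v18+ requirement"
else
  echo "Node.js $version is too old; install v18 or later"
fi
```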
Next, you will need to configure your AI assistant to use the MCP server.
Configure Cursor
To add the Pinecone MCP server to a project, create a .cursor/mcp.json file in the project root (if it doesn't already exist) and add the following configuration:
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
You can check the status of the server in Cursor Settings > MCP.
To enable the server globally, add the configuration to the .cursor/mcp.json in your home directory instead.
It is recommended to use rules to instruct Cursor on proper usage of the MCP server. Check out the docs for some suggestions.
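A malformed mcp.json is a common reason the server never shows up in the MCP settings list. This hedged sketch writes a copy of the example configuration to a temporary file and checks its syntax with Python's stdlib JSON parser; the temp-file path and placeholder key are illustrative, and in practice you would point the check at your real .cursor/mcp.json:

```shell
# Write a copy of the example config and validate its JSON syntax (illustrative path)
cat > /tmp/mcp_check.json <<'EOF'
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": { "PINECONE_API_KEY": "<your pinecone api key>" }
    }
  }
}
EOF
python3 -m json.tool /tmp/mcp_check.json > /dev/null && echo "valid JSON"
```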
Configure Claude Desktop
In Claude Desktop, locate the claude_desktop_config.json file by navigating to Settings > Developer > Edit Config, then add the following configuration:
{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": ["-y", "@pinecone-database/mcp"],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}
Restart Claude Desktop. On the new chat screen, a hammer (MCP) icon should appear, indicating that the new MCP tools are available.
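If the hammer icon never appears, a quick first check is confirming the config file actually exists where Claude Desktop expects it. On macOS it typically lives at ~/Library/Application Support/Claude/claude_desktop_config.json (on Windows, under %APPDATA%\Claude); this sketch assumes the macOS location, so adjust the path for your OS:

```shell
# Default config location on macOS; adjust for your OS
cfg="$HOME/Library/Application Support/Claude/claude_desktop_config.json"
if [ -f "$cfg" ]; then
  echo "config found: $cfg"
else
  echo "config not found at $cfg"
fi
```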
Use as a Gemini CLI extension
To install this as a Gemini CLI extension, run the following command:
gemini extensions install https://github.com/pinecone-io/pinecone-mcp
You will need to provide your Pinecone API key in the PINECONE_API_KEY environment variable.
export PINECONE_API_KEY=<your pinecone api key>
When you run gemini and press Ctrl+T, pinecone should appear in the list of installed MCP servers.
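Before launching gemini, you can confirm the variable is actually visible to your shell. This is a minimal check assuming a POSIX shell; an exported but empty value counts as unset here:

```shell
# Report whether PINECONE_API_KEY is set and non-empty in the current shell
if [ -n "${PINECONE_API_KEY:-}" ]; then
  echo "PINECONE_API_KEY is set"
else
  echo "PINECONE_API_KEY is not set; export it before running gemini"
fi
```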
Usage
Once configured, your AI tool will automatically make use of the MCP to interact with Pinecone. You may be prompted for permission before a tool can be used.
Example prompts
Here are some prompts you can try with your AI assistant:
- "Search the Pinecone docs for information about metadata filtering"
- "List all my Pinecone indexes and describe their configurations"
- "Create a new index called 'my-docs' using the multilingual-e5-large model"
- "Upsert these documents into my index: [paste your documents]"
- "Search my index for records related to 'authentication best practices'"
- "What namespaces exist in my index, and how many records are in each?"
Tools
Pinecone Developer MCP Server provides the following tools for AI assistants to use:
- search-docs: Search the official Pinecone documentation.
- list-indexes: Lists all Pinecone indexes.
- describe-index: Describes the configuration of an index.
- describe-index-stats: Provides statistics about the data in the index, including the number of records and available namespaces.
- create-index-for-model: Creates a new index that uses an integrated inference model to embed text as vectors.
- upsert-records: Inserts or updates records in an index with integrated inference.
- search-records: Searches for records in an index based on a text query, using integrated inference for embedding. Has options for metadata filtering and reranking.
- cascading-search: Searches for records across multiple indexes, deduplicating and reranking the results.
- rerank-documents: Reranks a collection of records or text documents using a specialized reranking model.
Limitations
Only indexes with integrated inference are supported. Assistants, indexes without integrated inference, standalone embeddings, and vector search are not supported.
Troubleshooting
MCP server not appearing in your AI tool
- Ensure Node.js v18 or later is installed: node --version
- Verify npx is available in your PATH: which npx
- Check that your configuration file is in the correct location and has valid JSON syntax
- Restart your AI tool after making configuration changes
"Invalid API key" or authentication errors
- Verify your API key is correct in the Pinecone console
- Check that the PINECONE_API_KEY environment variable is set correctly in your MCP configuration
- Ensure there are no extra spaces or quotes around the API key value
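Stray whitespace or quote characters often sneak in when a key is pasted from another file. This sketch, with a hypothetical key value, shows how to spot and strip them:

```shell
# Hypothetical key with accidental surrounding spaces and quotes
key=' "pcsk_example_key" '
cleaned=$(printf '%s' "$key" | tr -d ' "')
if [ "$key" != "$cleaned" ]; then
  echo "key contained extra spaces or quotes; cleaned value: $cleaned"
fi
```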
Tools not working as expected
- The MCP server only supports indexes with integrated inference. If you're trying to use a serverless index without integrated inference, you'll need to create a new index with an integrated embedding model
- Check the MCP server logs for error messages. In Cursor, view logs in Cursor Settings > MCP
Connection issues
- If using a corporate network, ensure your firewall allows connections to api.pinecone.io
- Try running the server manually to see detailed error output: PINECONE_API_KEY=<your-key> npx @pinecone-database/mcp
Contributing
We welcome your collaboration in improving the developer MCP experience. Please submit issues in the GitHub issue tracker. Information about contributing can be found in CONTRIBUTING.md.