# MCP OpenAI Server
A Model Context Protocol (MCP) server that lets you seamlessly use OpenAI's models right from Claude.
## Features

- Direct integration with OpenAI's chat models
- Support for multiple models, including:
  - gpt-4o
  - gpt-4o-mini
  - o1-preview
  - o1-mini
- Simple message-passing interface
- Basic error handling
## Prerequisites

- Node.js >= 18 (includes `npm` and `npx`)
- Claude Desktop app
- OpenAI API key
## Installation

First, make sure you've got the Claude Desktop app installed and an OpenAI API key.

Add this entry to your `claude_desktop_config.json` (on macOS, you'll find it at `~/Library/Application\ Support/Claude/claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "mcp-openai": {
      "command": "npx",
      "args": ["-y", "@mzxrai/mcp-openai@latest"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here (get one from https://platform.openai.com/api-keys)"
      }
    }
  }
}
```
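A stray comma or missing brace in this file will stop Claude Desktop from loading the server. As a quick sketch (assuming `python3` is available), you can check a config snippet's JSON syntax before pasting it in:

```shell
# Validate a config snippet with Python's built-in JSON parser.
# Prints the pretty-printed JSON on success, or a parse error on failure.
echo '{
  "mcpServers": {
    "mcp-openai": {
      "command": "npx",
      "args": ["-y", "@mzxrai/mcp-openai@latest"],
      "env": { "OPENAI_API_KEY": "your-api-key-here" }
    }
  }
}' | python3 -m json.tool
```

The same command works on the real file, e.g. `python3 -m json.tool ~/Library/Application\ Support/Claude/claude_desktop_config.json`.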
This config lets Claude Desktop fire up the OpenAI MCP server whenever you need it.
## Usage

Just start chatting with Claude, and when you want to use OpenAI's models, ask Claude to use them.

For example, you can say:

> Can you ask o1 what it thinks about this problem?

or:

> What does gpt-4o think about this?
The server currently supports these models:
- gpt-4o (default)
- gpt-4o-mini
- o1-preview
- o1-mini
## Tools

- `openai_chat` - Sends messages to OpenAI's chat completion API
  - Arguments:
    - `messages`: array of messages (required)
    - `model`: which model to use (optional; defaults to `gpt-4o`)
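To make the argument shape concrete, a call to `openai_chat` might carry a payload like the following. The `messages` and `model` fields are the tool's documented arguments; the `role`/`content` structure follows OpenAI's standard chat message format, and the content itself is illustrative:

```json
{
  "messages": [
    { "role": "user", "content": "Summarize these notes in one sentence." }
  ],
  "model": "gpt-4o-mini"
}
```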
## Problems

This is alpha software, so it may have bugs. If you run into an issue, check Claude Desktop's MCP logs:

```shell
tail -n 20 -f ~/Library/Logs/Claude/mcp*.log
```
## Development

```shell
# Install dependencies
pnpm install

# Build the project
pnpm build

# Watch for changes
pnpm watch

# Run in development mode
pnpm dev
```
## Requirements
- Node.js >= 18
- OpenAI API key
## Verified Platforms
- macOS
- Linux
## License
MIT
## Author