Query OpenAI models directly from Claude using the MCP protocol.
Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "openai-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server_openai.server"],
      "env": {
        "PYTHONPATH": "C:/path/to/your/mcp-server-openai",
        "OPENAI_API_KEY": "your-key-here"
      }
    }
  }
}
```
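For context on what the `python -m src.mcp_server_openai.server` command launches, below is a minimal sketch of an MCP server module that exposes a single OpenAI-backed tool. It assumes the `mcp` Python SDK and the `openai` client; the tool name `ask-openai`, the default model, and the overall structure are illustrative assumptions, not the repository's actual implementation.

```python
# Minimal sketch of an MCP server exposing one OpenAI-backed tool.
# Illustrative only: tool name, model, and structure are assumptions,
# not the actual contents of src/mcp_server_openai/server.py.
import asyncio

import mcp.types as types
from mcp.server import Server
from mcp.server.stdio import stdio_server
from openai import AsyncOpenAI

server = Server("openai-server")
client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


@server.list_tools()
async def list_tools() -> list[types.Tool]:
    # Advertise a single tool that forwards a prompt to an OpenAI model.
    return [
        types.Tool(
            name="ask-openai",
            description="Send a prompt to an OpenAI model and return its reply",
            inputSchema={
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        )
    ]


@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Forward the prompt to the Chat Completions API and return the text.
    response = await client.chat.completions.create(
        model="gpt-4o-mini",  # assumed default model
        messages=[{"role": "user", "content": arguments["query"]}],
    )
    return [types.TextContent(type="text", text=response.choices[0].message.content)]


async def main():
    # Serve over stdio, which is how Claude Desktop launches MCP servers.
    async with stdio_server() as (read_stream, write_stream):
        await server.run(read_stream, write_stream, server.create_initialization_options())


if __name__ == "__main__":
    asyncio.run(main())
```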
Install from source:

```bash
git clone https://github.com/pierrebrunelle/mcp-server-openai
cd mcp-server-openai
pip install -e .
```
```bash
# Run tests from project root
pytest -v test_openai.py -s

# Sample test output:
Testing OpenAI API call...
OpenAI Response: Hello! I'm doing well, thank you for asking...
PASSED
```
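For reference, a test that produces output like the sample above could look roughly like the following. This is a sketch, not the repository's actual `test_openai.py`: the model name and prompt are assumptions, and the client is assumed to pick up `OPENAI_API_KEY` from the environment.

```python
# Hypothetical shape for test_openai.py; names and model are illustrative.
import os

import pytest
from openai import OpenAI


@pytest.mark.skipif("OPENAI_API_KEY" not in os.environ, reason="requires an OpenAI API key")
def test_openai_call():
    print("Testing OpenAI API call...")
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; the repository may use a different one
        messages=[{"role": "user", "content": "Hello, how are you?"}],
    )
    text = response.choices[0].message.content
    print(f"OpenAI Response: {text}")
    assert text  # the call should return a non-empty reply
```

The `-s` flag in the pytest command above disables output capturing, which is why the printed request and response text appear in the sample output.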
MIT License