o3-search-mcp (gpt-5, o4-mini support)
MCP server that enables the use of OpenAI's high-end models and their powerful web search capabilities. By registering it with any AI coding agent, the agent can autonomously consult with OpenAI models to solve complex problems.
(The server is named "o3" after the MCP package, but you can select gpt-5 or o4-mini as the model via the OPENAI_MODEL environment variable.)
Use Cases
🐛 When you're stuck debugging
o3's web search can scan a wide range of sources, including GitHub issues and Stack Overflow, significantly increasing the chances of resolving niche problems. Example prompts:
> I'm getting the following error on startup, please fix it. If it's too difficult, ask o3.
> [Paste error message here]
> The WebSocket connection isn't working. Please debug it. If you don't know how, ask o3.
📚 When you want to reference the latest library information
You can get answers from the powerful web search even when there's no well-organized documentation. Example prompts:
> I want to upgrade this library to v2. Proceed while consulting with o3.
> I was told this option for this library doesn't exist. It might have been removed. Ask o3 what to specify instead and replace it.
🧩 When tackling complex tasks
In addition to search, you can also use it as a sounding board for design. Example prompts:
> I want to create a collaborative editor, so please design it. Also, ask o3 for a design review and discuss if necessary.
Also, because it is provided as an MCP server, the AI agent can decide on its own to consult o3 whenever it deems it necessary, without any instructions from you. This dramatically expands the range of problems it can solve unaided!
Installation
npx (Recommended)
Claude Code:
```shell
# -s user installs in the user scope; omit it to install in the project scope.
# OPENAI_MODEL also accepts o4-mini and gpt-5.
claude mcp add o3 \
  -s user \
  -e OPENAI_MODEL=o3 \
  -e OPENAI_API_KEY=your-api-key \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- npx o3-search-mcp
```
json:
```json
{
  "mcpServers": {
    "o3-search": {
      "command": "npx",
      "args": ["o3-search-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: o3, o4-mini, gpt-5 (default: o3)
        "OPENAI_MODEL": "o3",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        // Optional: low, medium, high (default: medium)
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 300000)
        "OPENAI_API_TIMEOUT": "300000",
        // Optional: maximum number of retries (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
```
Local Setup
If you want to download the code and run it locally:
```shell
git clone git@github.com:yoshiko-pg/o3-search-mcp.git
cd o3-search-mcp
pnpm install
pnpm build
```
Claude Code:
```shell
# -s user installs in the user scope; omit it to install in the project scope.
# OPENAI_MODEL also accepts o4-mini and gpt-5.
claude mcp add o3 \
  -s user \
  -e OPENAI_API_KEY=your-api-key \
  -e OPENAI_MODEL=o3 \
  -e SEARCH_CONTEXT_SIZE=medium \
  -e REASONING_EFFORT=medium \
  -e OPENAI_API_TIMEOUT=300000 \
  -e OPENAI_MAX_RETRIES=3 \
  -- node /path/to/o3-search-mcp/build/index.js
```
json:
```json
{
  "mcpServers": {
    "o3-search": {
      "command": "node",
      "args": ["/path/to/o3-search-mcp/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        // Optional: o3, o4-mini, gpt-5 (default: o3)
        "OPENAI_MODEL": "o3",
        // Optional: low, medium, high (default: medium)
        "SEARCH_CONTEXT_SIZE": "medium",
        // Optional: low, medium, high (default: medium)
        "REASONING_EFFORT": "medium",
        // Optional: API timeout in milliseconds (default: 300000)
        "OPENAI_API_TIMEOUT": "300000",
        // Optional: maximum number of retries (default: 3)
        "OPENAI_MAX_RETRIES": "3"
      }
    }
  }
}
```
Environment Variables
| Environment Variable | Required | Default | Description |
|---|---|---|---|
| OPENAI_API_KEY | Yes | - | Your OpenAI API key |
| OPENAI_MODEL | No | o3 | Model to use: o3, o4-mini, or gpt-5 |
| SEARCH_CONTEXT_SIZE | No | medium | Search context size: low, medium, or high |
| REASONING_EFFORT | No | medium | Reasoning effort level: low, medium, or high |
| OPENAI_API_TIMEOUT | No | 300000 | API request timeout in milliseconds (300000 = 5 minutes) |
| OPENAI_MAX_RETRIES | No | 3 | Maximum retries for failed requests; the SDK automatically retries on rate limits (429), server errors (5xx), and connection errors |
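As a rough sketch of how a server might read these variables and apply the defaults above (this is illustrative only, not the actual o3-search-mcp source; the `loadConfig` helper and its shape are assumptions):

```typescript
// Hypothetical config loader mirroring the table above.
// Unset variables fall back to the documented defaults.
type Level = "low" | "medium" | "high";

type Config = {
  apiKey: string;
  model: string;          // o3, o4-mini, or gpt-5
  searchContextSize: Level;
  reasoningEffort: Level;
  timeoutMs: number;      // default 300000 (5 minutes)
  maxRetries: number;     // default 3
};

function loadConfig(env: Record<string, string | undefined>): Config {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) throw new Error("OPENAI_API_KEY is required");
  // Anything other than an explicit low/high falls back to medium.
  const level = (v: string | undefined): Level =>
    v === "low" || v === "high" ? v : "medium";
  return {
    apiKey,
    model: env.OPENAI_MODEL ?? "o3",
    searchContextSize: level(env.SEARCH_CONTEXT_SIZE),
    reasoningEffort: level(env.REASONING_EFFORT),
    timeoutMs: Number(env.OPENAI_API_TIMEOUT ?? "300000"),
    maxRetries: Number(env.OPENAI_MAX_RETRIES ?? "3"),
  };
}
```

With only OPENAI_API_KEY set, this yields the same defaults the table documents (o3, medium, medium, 300000, 3).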
Notes
To use the o3 model via the OpenAI API, your account must be at usage tier 4 or higher, or your organization must be verified. If you register an API key that is not yet enabled for o3, calls made through this MCP server will fail with an error. Reference: https://help.openai.com/en/articles/10362446-api-access-to-o1-o3-and-o4-models
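One way to check in advance whether your key can access o3 is to request the model's metadata from the OpenAI API (`GET /v1/models/{model}` returns 200 when the model is enabled for your key). A minimal sketch, assuming a hypothetical `modelProbe` helper; the live request is shown commented out since it needs a real key:

```typescript
// Build the request for OpenAI's "retrieve model" endpoint.
// A 200 response means the model is enabled for your key;
// a 404 or permission error means it is not.
function modelProbe(
  model: string,
  apiKey: string,
): { url: string; headers: Record<string, string> } {
  return {
    url: `https://api.openai.com/v1/models/${encodeURIComponent(model)}`,
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}

// Usage (requires a live key):
// const { url, headers } = modelProbe("o3", process.env.OPENAI_API_KEY!);
// const res = await fetch(url, { headers });
// console.log(res.ok ? "o3 is enabled" : `not enabled: ${res.status}`);
```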