# OpenAI WebSearch MCP Server 🔍
An MCP server that provides intelligent web search using OpenAI's reasoning models. Ideal for AI assistants that need up-to-date information backed by configurable reasoning.
## ✨ Features
- 🧠 Reasoning Model Support: Full compatibility with OpenAI's latest reasoning models (gpt-5, gpt-5-mini, gpt-5-nano, o3, o4-mini)
- ⚡ Smart Effort Control: Intelligent `reasoning_effort` defaults based on use case
- 🔄 Multi-Mode Search: Fast iterations with gpt-5-mini or deep research with gpt-5
- 🌍 Localized Results: Support for location-based search customization
- 📝 Rich Descriptions: Complete parameter documentation for easy integration
- 🔧 Flexible Configuration: Environment variable support for easy deployment
## 🚀 Quick Start

### One-Click Installation for Claude Desktop

```bash
OPENAI_API_KEY=sk-xxxx uvx --with openai-websearch-mcp openai-websearch-mcp-install
```

Replace `sk-xxxx` with your OpenAI API key from the OpenAI Platform.
## ⚙️ Configuration

### Claude Desktop

Add to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
```
### Cursor

Add to your MCP settings in Cursor:

- Open Cursor Settings (`Cmd/Ctrl + ,`)
- Search for "MCP" or go to Extensions → MCP
- Add server configuration:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
```
### Claude Code
Claude Code automatically detects MCP servers configured for Claude Desktop. Use the same configuration as above for Claude Desktop.
### Local Development
For local testing, use the absolute path to your virtual environment:
```json
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "/path/to/your/project/.venv/bin/python",
      "args": ["-m", "openai_websearch_mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini",
        "PYTHONPATH": "/path/to/your/project/src"
      }
    }
  }
}
```
## 🛠️ Available Tools

### openai_web_search

Intelligent web search with reasoning model support.

#### Parameters
| Parameter | Type | Description | Default |
|---|---|---|---|
| `input` | string | The search query or question to search for | Required |
| `model` | string | AI model to use. Supports `gpt-4o`, `gpt-4o-mini`, `gpt-5`, `gpt-5-mini`, `gpt-5-nano`, `o3`, `o4-mini` | `gpt-5-mini` |
| `reasoning_effort` | string | Reasoning effort level: `minimal`, `low`, `medium`, `high` | Smart default |
| `type` | string | Web search API version | `web_search_preview` |
| `search_context_size` | string | Context amount: `low`, `medium`, `high` | `medium` |
| `user_location` | object | Optional location for localized results | `null` |
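Most users will call this tool through their AI assistant, but you can also exercise it directly. The sketch below is a minimal, illustrative client using the official MCP Python SDK over stdio; the tool name and parameter names come from the table above, while the query value and environment are placeholders you should adapt to your setup.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the assistant configs above do.
    server = StdioServerParameters(
        command="uvx",
        args=["openai-websearch-mcp"],
        env={"OPENAI_API_KEY": "your-api-key-here"},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the tool with parameters from the table above.
            result = await session.call_tool(
                "openai_web_search",
                arguments={
                    "input": "Latest developments in AI reasoning models",
                    "model": "gpt-5-mini",
                    "reasoning_effort": "low",
                },
            )
            print(result.content)


asyncio.run(main())
```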
## 💬 Usage Examples

Once configured, simply ask your AI assistant to search for information using natural language:

### Quick Search
> "Search for the latest developments in AI reasoning models using openai_web_search"
### Deep Research
> "Use openai_web_search with gpt-5 and high reasoning effort to provide a comprehensive analysis of quantum computing breakthroughs"
### Localized Search
> "Search for local tech meetups in San Francisco this week using openai_web_search"
The AI assistant will automatically use the `openai_web_search` tool with appropriate parameters based on your request.
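For the localized example, the assistant would pass a `user_location` object. The exact shape this server expects is not documented here; the sketch below assumes the approximate-location fields used by OpenAI's web search options, so treat it as a starting point rather than a guaranteed schema.

```python
# Hypothetical arguments for the localized search above. The user_location
# fields mirror OpenAI's approximate-location options and may need adjusting
# to match this server's actual schema.
arguments = {
    "input": "Local tech meetups in San Francisco this week",
    "model": "gpt-5-mini",
    "search_context_size": "medium",
    "user_location": {
        "type": "approximate",
        "country": "US",
        "region": "California",
        "city": "San Francisco",
    },
}
```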
## 🤖 Model Selection Guide

### Quick Multi-Round Searches 🚀

- Recommended: `gpt-5-mini` with `reasoning_effort: "low"`
- Use Case: Fast iterations, real-time information, multiple quick queries
- Benefits: Lower latency, cost-effective for frequent searches
### Deep Research 🔬

- Recommended: `gpt-5` with `reasoning_effort: "medium"` or `"high"`
- Use Case: Comprehensive analysis, complex topics, detailed investigation
- Benefits: Multi-round reasoned results, no need for agent iterations
### Model Comparison
| Model | Reasoning | Default Effort | Best For |
|---|---|---|---|
| `gpt-4o` | ❌ | N/A | Standard search |
| `gpt-4o-mini` | ❌ | N/A | Basic queries |
| `gpt-5-mini` | ✅ | low | Fast iterations |
| `gpt-5` | ✅ | medium | Deep research |
| `gpt-5-nano` | ✅ | medium | Balanced approach |
| `o3` | ✅ | medium | Advanced reasoning |
| `o4-mini` | ✅ | medium | Efficient reasoning |
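For illustration, the table's defaults can be expressed as a small lookup. This is a sketch consistent with the table above, not the server's actual implementation, which may differ.

```python
# Illustrative only: derive the "smart default" reasoning_effort from the
# model name, matching the Model Comparison table above.
REASONING_MODELS = {"gpt-5", "gpt-5-mini", "gpt-5-nano", "o3", "o4-mini"}


def default_reasoning_effort(model: str) -> str | None:
    if model not in REASONING_MODELS:
        return None      # gpt-4o / gpt-4o-mini take no reasoning parameters
    if model == "gpt-5-mini":
        return "low"     # optimized for fast iterations
    return "medium"      # gpt-5, gpt-5-nano, o3, o4-mini
```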
## 📦 Installation

### Using uvx (Recommended)

```bash
# Install and run directly
uvx openai-websearch-mcp

# Or install globally as a uv tool
uv tool install openai-websearch-mcp
```
### Using pip

```bash
# Install from PyPI
pip install openai-websearch-mcp

# Run the server
python -m openai_websearch_mcp
```
### From Source

```bash
# Clone the repository
git clone https://github.com/yourusername/openai-websearch-mcp.git
cd openai-websearch-mcp

# Install dependencies
uv sync

# Run in development mode
uv run python -m openai_websearch_mcp
```
## 👩‍💻 Development

### Setup Development Environment

```bash
# Clone and set up
git clone https://github.com/yourusername/openai-websearch-mcp.git
cd openai-websearch-mcp

# Create virtual environment and install dependencies
uv sync

# Run tests
uv run python -m pytest

# Install in development mode
uv pip install -e .
```
### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `OPENAI_API_KEY` | Your OpenAI API key | Required |
| `OPENAI_DEFAULT_MODEL` | Default model to use | `gpt-5-mini` |
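As a sketch of how these variables are typically consumed (the server's actual internals may differ):

```python
import os

# OPENAI_API_KEY is required; OPENAI_DEFAULT_MODEL falls back to gpt-5-mini.
api_key = os.environ["OPENAI_API_KEY"]                        # KeyError if unset
default_model = os.environ.get("OPENAI_DEFAULT_MODEL", "gpt-5-mini")
```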
## 🐛 Debugging

### Using MCP Inspector

```bash
# For uvx installations
npx @modelcontextprotocol/inspector uvx openai-websearch-mcp

# For pip installations
npx @modelcontextprotocol/inspector python -m openai_websearch_mcp
```
### Common Issues

**Issue:** `Unsupported parameter: 'reasoning.effort'`
**Solution:** This occurs when a non-reasoning model (`gpt-4o`, `gpt-4o-mini`) is called with the `reasoning_effort` parameter. The server handles this automatically by applying reasoning parameters only to compatible models.

**Issue:** `No module named 'openai_websearch_mcp'`
**Solution:** Ensure the package is installed correctly and that your Python path includes the package location.
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
- 🤖 Generated with Claude Code
- 🔥 Powered by OpenAI's Web Search API
- 🛠️ Built on the Model Context Protocol