Allows AI models to interact with your Okta environment to manage and analyze resources, designed for IAM engineers, security teams, and administrators.
This release represents a complete architectural overhaul with these key improvements:
- tool_registry.py
- Reduced dependencies for a simpler, more maintainable codebase
- jwks_uri validation
Please read this section carefully before using Okta MCP Server.
When you make a request, the interaction happens directly between the LLM and the Okta MCP tools - the client application is no longer in the middle. All data returned by these tools (including complete user profiles, group memberships, etc.) is sent to and stored in the LLM's context during the entire transaction for that conversation.
Key Privacy Considerations:
MCP is designed for lightweight workflows similar to Zapier, not bulk data operations.
Recommendation: Limit requests to fewer than 100 entities per transaction. Avoid operations that require fetching large datasets or multiple API calls.
Examples:
❌ Avoid these types of requests:
✅ Better approaches:
💡 For larger data sets and complex queries: Consider using the Okta AI Agent. The agent is being enhanced with similar "actionable" features to handle larger datasets and more complex scenarios in the near future.
The HTTP transport modes (both Streamable HTTP and SSE) have significant security risks, and so does exposing the server through mcp-remote.
Best Practice: Only use the STDIO transport method (default mode) unless you have specific security controls in place and understand the risks.
The Okta MCP Server currently provides the following tools:
User Management
- list_okta_users - Retrieve users with filtering, search, and pagination options
- get_okta_user - Get detailed information about a specific user by ID or login
- list_okta_user_groups - List all groups that a specific user belongs to
- list_okta_user_applications - List all application links (assigned applications) for a specific user
- list_okta_user_factors - List all authentication factors enrolled for a specific user

Group Operations
- list_okta_groups - Retrieve groups with filtering, search, and pagination options
- get_okta_group - Get detailed information about a specific group
- list_okta_group_members - List all members of a specific group
- list_okta_assigned_applications_for_group - List all applications assigned to a specific group

Application Management
- list_okta_applications - Retrieve applications with filtering, search, and pagination options
- list_okta_application_users - List all users assigned to a specific application
- list_okta_application_group_assignments - List all groups assigned to a specific application

Policy & Network Management
- list_okta_policy_rules - List all rules for a specific policy with detailed conditions and actions
- get_okta_policy_rule - Get detailed information about a specific policy rule
- list_okta_network_zones - List all network zones with IP ranges and configuration details

System Log Events
- get_okta_event_logs - Retrieve Okta system log events with time-based filtering and search options

Date & Time Utilities
- get_current_time - Get current UTC time in ISO 8601 format
- parse_relative_time - Convert natural language time expressions to ISO 8601 format

Additional tools for applications, factors, policies, and more advanced operations are on the roadmap and will be added in future releases.
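For orientation, here is a minimal sketch of calling one of these tools programmatically with the official MCP Python SDK over STDIO (the same transport the bundled CLI client uses). The launch command, placeholder values, and the limit argument to list_okta_users are assumptions for illustration; adjust them to your checkout and inspect each tool's schema via list_tools.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command and placeholder values - adjust to your checkout/tenant.
server_params = StdioServerParameters(
    command="python",
    args=["main.py"],
    env={
        "OKTA_CLIENT_ORGURL": "https://your-org.okta.com",
        "OKTA_API_TOKEN": "your_api_token",
    },
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools listed above and their input schemas.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call a single tool; the "limit" argument is an assumption about its schema.
            result = await session.call_tool("list_okta_users", {"limit": 10})
            print(result.content)

asyncio.run(main())
```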
✅ Python 3.8+ installed on your machine
✅ Okta tenant with appropriate API access
✅ An MCP-compatible AI client (Claude Desktop, Microsoft Copilot Studio, etc.)
⚠️ Important Model Compatibility Note:
Not all AI models work with this MCP server. Testing has only been performed with:
- GPT-4.0
- Claude 3.7 Sonnet
- Google Gemini 2.5 Pro
You must use the latest model versions that explicitly support tool calling/function calling capabilities. Older models or models without tool calling support will not be able to interact with the Okta MCP Server.
The Okta MCP Server supports multiple AI providers through its flexible configuration system. This allows you to connect to various large language models based on your specific needs and existing access.
| Provider | Environment Variable | Description |
|---|---|---|
| OpenAI | AI_PROVIDER=openai | Connect to OpenAI API with models like GPT-4o. Requires an OpenAI API key. |
| Azure OpenAI | AI_PROVIDER=azure_openai | Use Azure-hosted OpenAI models with enhanced security and compliance features. |
| Anthropic | AI_PROVIDER=anthropic | Connect to Anthropic's Claude models (primarily tested with Claude 3.7 Sonnet). |
| Google Vertex AI | AI_PROVIDER=vertex_ai | Use Google's Gemini models via Vertex AI. Requires Google Cloud service account. |
| OpenAI Compatible | AI_PROVIDER=openai_compatible | Connect to any OpenAI API-compatible endpoint, such as Fireworks.ai, Ollama, or other providers that implement the OpenAI API specification. |
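As a rough sketch of how provider selection might look in the .env file you create during installation below: OKTA_CLIENT_ORGURL, OKTA_API_TOKEN, and AI_PROVIDER appear elsewhere in this document, but the credential variable (OPENAI_API_KEY here) is an assumption; check .env.sample for the exact keys your version expects.

```
# Okta tenant settings
OKTA_CLIENT_ORGURL=https://your-org.okta.com
OKTA_API_TOKEN=your_api_token

# LLM provider selection (see the table above)
AI_PROVIDER=openai
# Assumed credential variable name - confirm against .env.sample
OPENAI_API_KEY=your_openai_api_key
```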
# Clone the repository
git clone https://github.com/fctr-id/okta-mcp-server.git
cd okta-mcp-server
# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate # On Windows use: venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
⚠️ NOTICE: If you clone this repository anew or pull updates, always re-run pip install -r requirements.txt to ensure all dependencies are up-to-date.
Create a config file with your Okta settings:
To use the command-line client (no memory), follow the instructions below:
# Copy the sample config
cp .env.sample .env
# Edit the env with your settings
# Required: Okta domain and API token and LLM settings
cd clients
python mcp-cli-stdio-client.py
To use MCP hosts such as Claude Code, VS Code, etc., use the JSON config below.
The Okta MCP Server supports multiple transport protocols:
Add the following to your claude_desktop_config.json:
{
  "mcpServers": {
    "okta-mcp-server": {
      "command": "DIR/okta-mcp-server/venv/Scripts/python",
      "args": [
        "DIR/okta-mcp-server/main.py"
      ],
      "env": {
        "OKTA_CLIENT_ORGURL": "https://dev-1606.okta.com",
        "OKTA_API_TOKEN": "OKTA_API_TOKEN"
      }
    }
  }
}
Replace DIR with your absolute directory path and OKTA_API_TOKEN with your actual token. (On macOS/Linux, the virtual environment's interpreter is venv/bin/python rather than venv/Scripts/python.)
Streamable HTTP Transport (Current Standard) - Modern HTTP-based transport with advanced features:
Starting the Streamable HTTP Server:
# Start server with explicit risk acknowledgment
python main.py --http --iunderstandtherisks
# Server will start on http://localhost:3000/mcp
# Connect using streamable HTTP compatible clients
Features:
For Streamable HTTP Client Testing:
cd clients
python mcp-cli-streamable-client.py
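If you prefer to script against the HTTP endpoint rather than use the bundled client, a minimal sketch using the MCP Python SDK's streamable HTTP client might look like this (assuming a recent SDK release that provides mcp.client.streamable_http); the URL matches the default endpoint shown above.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Endpoint started by: python main.py --http --iunderstandtherisks
    async with streamablehttp_client("http://localhost:3000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the available Okta tools over the HTTP transport.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```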
⚠️ EXTREMELY DANGEROUS - READ CAREFULLY
For MCP clients that don't natively support remote connections, you can use mcp-remote via NPX:
Prerequisites:
Setup:
# 1. Install mcp-remote globally
npm install -g @anthropic/mcp-remote
# 2. Start your Okta MCP Server in HTTP mode
python main.py --http --iunderstandtherisks
# 3. Configure your MCP client (e.g., Claude Desktop)
Claude Desktop Configuration:
{
  "mcpServers": {
    "okta-mcp-server": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:3000/mcp"
      ],
      "env": {
        "OKTA_CLIENT_ORGURL": "https://dev-1606.okta.com",
        "OKTA_API_TOKEN": "your_actual_api_token"
      }
    }
  }
}
🚨 CRITICAL SECURITY WARNINGS:
When might you need this approach:
⚠️ DEPRECATED: SSE transport is deprecated and not recommended for new implementations.
# Run in SSE mode (requires explicit risk acknowledgment)
python main.py --sse --iunderstandtherisks
The Okta MCP Server provides Docker images for all transport types, offering containerized deployment options.
STDIO Transport (Recommended): For Claude Desktop or other MCP clients, configure them to use the Docker container:
{
  "mcpServers": {
    "okta-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "OKTA_CLIENT_ORGURL",
        "-e", "OKTA_API_TOKEN",
        "fctrid/okta-mcp-server:stdio"
      ],
      "env": {
        "OKTA_CLIENT_ORGURL": "https://your-org.okta.com",
        "OKTA_API_TOKEN": "your_api_token"
      }
    }
  }
}
Streamable HTTP Transport (Current Standard):
# Start the HTTP container
docker run -d --name okta-mcp-http \
-p 3000:3000 \
-e OKTA_API_TOKEN=your_api_token \
-e OKTA_CLIENT_ORGURL=https://your-org.okta.com \
fctrid/okta-mcp-server:http
# Configure your MCP client to connect to http://localhost:3000/mcp
SSE Transport (Deprecated - Not Recommended):
# Start the SSE container (deprecated)
docker run -d --name okta-mcp-sse \
-p 3000:3000 \
-e OKTA_API_TOKEN=your_api_token \
-e OKTA_CLIENT_ORGURL=https://your-org.okta.com \
fctrid/okta-mcp-server:sse
# Configure your MCP client to connect to http://localhost:3000/sse
Building Images Locally:
# Build all variants
docker build --target stdio -t okta-mcp-server:stdio .
docker build --target http -t okta-mcp-server:http .
docker build --target sse -t okta-mcp-server:sse .
v0.1.0-BETA - Current (MAJOR ARCHITECTURAL OVERHAUL!)
v0.3.0 - Previous
Future plans include:
Before raising an issue, check:
Still having problems? Open an issue on GitHub or email support@fctr.io (response times may vary)
Have an idea or suggestion? Open a feature request on GitHub!
Interested in contributing? We'd love to have you! Contact info@fctr.io for collaboration opportunities.
Check out License.md for the fine print.
© 2025 Fctr Identity. All rights reserved. Made with ❤️ for the Okta and AI communities.