# OPENOCEAN-MCP Server

An MCP server for executing token swaps across multiple decentralized exchanges using OpenOcean's aggregation API.

## Overview
This project implements a Model Context Protocol (MCP) server to interact with decentralized exchanges (DEXs). It allows MCP-compatible clients (like AI assistants, IDE extensions, or custom applications) to access functionalities such as getting quotes for swaps and executing swaps across multiple chains.
This server is built using TypeScript and fastmcp.
## Features (MCP Tools)

The server exposes the following tools that MCP clients can utilize:
- `CHAIN_LIST`: Fetch the supported chain list.
  - Parameters: none
- `GAS_PRICE`: Fetch the current gas price.
  - Parameters: `chain` (string)
- `QUOTE`: Fetch a quote for a swap.
  - Parameters: `chain` (string), `inTokenAddress` (string), `outTokenAddress` (string), `amount` (string), `slippage` (string)
- `SWAP`: Build a swap transaction.
  - Parameters: `chain` (string), `inTokenAddress` (string), `outTokenAddress` (string), `amount` (string), `slippage` (string), `account` (string)
- `GET_TRANSACTION`: Fetch transaction info.
  - Parameters: `chain` (string), `hash` (string)
- `TOKEN_LIST`: Fetch the token list.
  - Parameters: `chain` (string)
- `DEX_LIST`: Fetch the DEX list.
  - Parameters: `chain` (string)
### Parameter breakdown

- `chain`: The chain code of the DEX.
- `inTokenAddress`: The address of the token you want to sell.
- `outTokenAddress`: The address of the token you want to buy.
- `amount`: The token amount, including decimals. For example, to swap 1 USDT (6 decimals), pass `1000000` (1 * 10^6).
- `slippage`: The acceptable slippage level as a percentage value in the range 0.05 to 50; for example, `1` means 1% slippage.
- `account`: The user's wallet address.
- `hash`: The transaction hash from the OpenOcean contract on the blockchain.
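The `amount` convention above can be automated. The helper below is a hypothetical utility (not part of openocean-mcp) that converts a human-readable amount into base units given the token's decimals:

```typescript
// Hypothetical helper (not part of openocean-mcp): convert a
// human-readable token amount to base units ("with decimals").
function toBaseUnits(amount: string, decimals: number): bigint {
  const [whole, frac = ""] = amount.split(".");
  // Right-pad the fractional part with zeros, then truncate to `decimals`.
  const padded = (frac + "0".repeat(decimals)).slice(0, decimals);
  return BigInt(whole + padded);
}

// 1 USDT with 6 decimals -> 1000000n
console.log(toBaseUnits("1", 6)); // 1000000n
```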
## Prerequisites
- Node.js (v18 or newer recommended)
- pnpm (See https://pnpm.io/installation)
## Installation

There are a few ways to use openocean-mcp:

1. **Using `pnpm dlx`** (recommended for most MCP client setups):

   You can run the server directly with `pnpm dlx` (pnpm's equivalent of `npx`) without a global installation. This is often the easiest way to integrate with MCP clients; see the "Running the Server with an MCP Client" section for examples.

2. **Global installation from npm (via pnpm):**

   Install the package globally to make the `openocean-mcp` command available system-wide:

   ```
   pnpm add -g openocean-mcp
   ```
3. **Building from source** (for development or custom modifications):

   - Clone the repository:

     ```
     git clone https://github.com/openocean-finance/openocean-mcp.git
     cd openocean-mcp
     ```

   - Install dependencies:

     ```
     pnpm install
     ```

   - Build the server (this compiles the TypeScript code to JavaScript in the `dist` directory):

     ```
     pnpm run build
     ```

   The `prepare` script also runs `pnpm run build`, so the project is built automatically when you clone it and run `pnpm install`.
## Configuration (Environment Variables)

This MCP server may require certain environment variables to be set by the MCP client that runs it. These are typically configured in the client's MCP server definition (e.g., in an `mcp.json` file for Cursor, or similar for other clients):

- Any necessary environment variables for wallet providers or API keys.
## Running the Server with an MCP Client

MCP clients (like AI assistants, IDE extensions, etc.) will run this server as a background process. You need to configure the client to tell it how to start your server.

Below is an example configuration snippet that an MCP client might use (e.g., in an `mcp_servers.json` or similar configuration file). This example runs the server from the published npm package via `pnpm dlx`:
```json
{
  "mcpServers": {
    "openocean-mcp-server": {
      "command": "pnpm",
      "args": ["dlx", "openocean-mcp"]
    }
  }
}
```
**Alternative if globally installed:**

If you have installed openocean-mcp globally (`pnpm add -g openocean-mcp`), you can simplify the command and args:
```json
{
  "mcpServers": {
    "openocean-mcp-server": {
      "command": "openocean-mcp",
      "args": []
    }
  }
}
```
- `command`: The executable to run.
  - For `pnpm dlx`: `"pnpm"` (with `"dlx"` as the first arg)
  - For global install: `"openocean-mcp"`
- `args`: An array of arguments to pass to the command.
  - For `pnpm dlx`: `["dlx", "openocean-mcp"]`
  - For global install: `[]`
- `env`: An object containing environment variables to be set when the server process starts. This is where you provide any necessary environment variables.
- `workingDirectory`: Generally not required when using the published package via `pnpm dlx` or a global install, as the package should handle its own paths correctly. If you were running from source (`node dist/index.js`), then setting `workingDirectory` to the project root would be important.
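Putting these fields together, a complete entry might look like the sketch below. The `WALLET_PRIVATE_KEY` variable name is a hypothetical placeholder; the actual variables (if any) depend on your wallet provider or API keys:

```json
{
  "mcpServers": {
    "openocean-mcp-server": {
      "command": "pnpm",
      "args": ["dlx", "openocean-mcp"],
      "env": {
        "WALLET_PRIVATE_KEY": "<your-key-here>"
      }
    }
  }
}
```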