IBM wxflows
Tool platform by IBM to build, test and deploy tools for any data source
Using watsonx.ai Flows Engine with Model Context Protocol (MCP)
Here's a step-by-step tutorial for setting up and deploying a project with wxflows, including installing necessary tools, deploying the app, and running it locally.
This example consists of the following pieces:
- MCP TypeScript SDK (mcp server)
- wxflows SDK (tools)
You can use any of the supported MCP clients.
This guide will walk you through installing the wxflows CLI, initializing and deploying a project, and running the application locally. We'll use the google_books and wikipedia tools as examples of tool calling with wxflows.
Before you start
Clone this repository and change into the example directory:
git clone https://github.com/IBM/wxflows.git
cd wxflows/examples/mcp/javascript
Step 1: Set up wxflows
Before you can start building AI applications using watsonx.ai Flows Engine:
- Sign up for a free account
- Download & install the Node.js CLI
- Authenticate your account
Step 2: Deploy a Flows Engine project
Move into the wxflows directory:
cd wxflows
There's already a wxflows project set up for you in this repository. It:
- Defines an endpoint api/mcp-example for the project.
- Imports the google_books tool with a description for searching books, specifying the fields books|book.
- Imports the wikipedia tool with a description for Wikipedia searches, specifying the fields search|page.
You can deploy this tool configuration to a Flows Engine endpoint by running:
wxflows deploy
This command deploys the endpoint and the tools defined in the project; these will be used by the wxflows SDK in your application.
Step 3: Set Up Environment Variables
From the project’s root directory, copy the sample environment file to create your .env file:
cp .env.sample .env
Edit the .env file and add your credentials, such as API keys and other required environment variables. Ensure the credentials are correct to allow the tools to authenticate and interact with external services.
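Based on the environment variables referenced later in this guide, the .env file will typically hold your Flows Engine credentials. A sketch with placeholder values (check .env.sample for the authoritative list of variables):

```shell
# .env — placeholder values; copy the real ones from your wxflows account
WXFLOWS_APIKEY="YOUR_WXFLOWS_APIKEY"
WXFLOWS_ENDPOINT="YOUR_WXFLOWS_ENDPOINT"
```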
Step 4: Install Dependencies in the Application
To run the application you need to install the necessary dependencies:
npm i
This command installs all required packages, including the @wxflows/sdk package and any dependencies specified in the project.
Step 5: Build the MCP server
Build the server by running:
npm run build
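To illustrate what the built server does, here is a minimal sketch of the JSON-RPC request/response shapes involved in listing tools. The tool names come from this project; the types, descriptions, and handler are illustrative stand-ins for the real @modelcontextprotocol/sdk plumbing, not the actual server code:

```typescript
// Illustrative model of an MCP-style "tools/list" exchange.
// The real server is built on the MCP TypeScript SDK and speaks
// JSON-RPC over stdio; this only sketches the message shapes.

type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string };
type ToolInfo = { name: string; description: string };

// The two tools this project deploys to the Flows Engine endpoint.
const tools: ToolInfo[] = [
  { name: "google_books", description: "Search for books" },
  { name: "wikipedia", description: "Search Wikipedia" },
];

function handleRequest(req: JsonRpcRequest) {
  if (req.method === "tools/list") {
    // A client (e.g. Claude Desktop) calls this to discover tools.
    return { jsonrpc: "2.0", id: req.id, result: { tools } };
  }
  return {
    jsonrpc: "2.0",
    id: req.id,
    error: { code: -32601, message: `Method not found: ${req.method}` },
  };
}

const res = handleRequest({ jsonrpc: "2.0", id: 1, method: "tools/list" });
console.log(JSON.stringify(res));
```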
Step 6: Use in an MCP client
Finally, you can use the MCP server in a client. To use it with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
"mcpServers": {
"wxflows-server": {
"command": "node",
"args": ["/path/to/wxflows-server/build/index.js"],
"env": {
"WXFLOWS_APIKEY": "YOUR_WXFLOWS_APIKEY",
"WXFLOWS_ENDPOINT": "YOUR_WXFLOWS_ENDPOINT"
}
}
}
}
You can now open Claude Desktop, and the tools from wxflows-server should appear in the list. You can then test the google_books and wikipedia tools through Claude Desktop.
Summary
You’ve now successfully set up, deployed, and run a wxflows project with the google_books and wikipedia tools. This setup provides a flexible environment for leveraging external tools for data retrieval, allowing you to further build and expand your app with wxflows. See the instructions in the tools directory to add more tools, or to create your own tools from databases, NoSQL stores, or REST and GraphQL APIs.
Support
Please reach out to us on Discord if you have any questions or want to share feedback. We'd love to hear from you!
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
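If the inspector package script isn't available in your copy of the project, the Inspector can also be launched directly with npx, pointing it at the built server (the path assumes the build output from step 5):

```shell
npx @modelcontextprotocol/inspector node build/index.js
```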