Petclinic MCP server
The Petclinic MCP server uses the Swagger Petstore v2 APIs (https://petstore.swagger.io/). It interacts with the Swagger Petstore API (similar to a "PetClinic") and exposes tools that can be used by OpenAI models.
It exposes the following capabilities:
- fetch_petsByStatus: finds pets by status. Available status values: available, pending, sold
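
For illustration, here is a minimal sketch of how such a tool can be implemented, assuming the server is built with FastMCP and httpx; apart from the tool name and the Petstore v2 base URL above, the details are illustrative and may differ from the actual petclinic_mcp_server.py:

import httpx
from mcp.server.fastmcp import FastMCP

PETSTORE_BASE_URL = "https://petstore.swagger.io/v2"

mcp = FastMCP("petclinic")

@mcp.tool()
async def fetch_petsByStatus(status: str) -> str:
    """Fetch pets from the Swagger Petstore filtered by status (available, pending, sold)."""
    async with httpx.AsyncClient() as client:
        # GET /pet/findByStatus?status=<status> on the Petstore v2 API
        response = await client.get(f"{PETSTORE_BASE_URL}/pet/findByStatus", params={"status": status})
        response.raise_for_status()
        return response.text

if __name__ == "__main__":
    # stdio for local use; switch to transport='sse' for a remote deployment
    mcp.run(transport='stdio')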

Prerequisites
- uv package manager
- Python
Running locally
- Tip: to avoid a remote server setup, use the stdio transport. Change line 39 of petclinic_mcp_server.py to use the stdio transport:
mcp.run(transport='stdio')
- Clone the project, navigate to the project directory, and initialize it with uv:
uv init
- Create a virtual environment and activate it:
uv venv
source .venv/bin/activate
- Install dependencies:
uv add mcp httpx
- Launch the MCP Inspector:
npx @modelcontextprotocol/inspector uv run petclinic_mcp_server.py
- Or launch the MCP server without the Inspector:
uv run petclinic_mcp_server.py
Configuration for Claude Desktop
You will need to supply a server configuration for your MCP client. Here's what the configuration looks like in claude_desktop_config.json:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/{your-project-path}/petclinic-mcp/"
      ]
    },
    "research": {
      "command": "/{your-uv-install-path}/uv",
      "args": [
        "--directory",
        "/{your-project-path}/petclinic-mcp/",
        "run",
        "petclinic_mcp_server.py"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
Deploy to Cloud Foundry
- Tip: to deploy the Petclinic MCP server as a remote server, use the SSE transport. Change line 39 of petclinic_mcp_server.py to use the SSE transport:
mcp.run(transport='sse')
- Log in to your Cloud Foundry account and push the application:
cf push -f manifest.yml
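
The repository's own manifest.yml is authoritative; as a rough sketch, a minimal manifest for a Python MCP server might look something like the following (the application name, memory size, buildpack, and start command are illustrative assumptions):

---
applications:
- name: petclinic-mcp-server
  memory: 256M
  buildpacks:
    - python_buildpack
  # assumes dependencies are declared in requirements.txt for the buildpack
  command: python petclinic_mcp_server.py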
Binding to MCP Agents
Model Context Protocol (MCP) servers are lightweight programs that expose specific capabilities to AI models through a standardized interface. These servers act as bridges between LLMs and external tools, data sources, or services, allowing your AI application to perform actions like searching databases, accessing files, or calling external APIs without complex custom integrations.
Create a user-provided service that provides the URL for an existing MCP server:
cf cups petclinic-mcp-server -p '{"mcpServiceURL":"https://your-petclinic-mcp-server.example.com"}'
Bind the MCP service to your application:
cf bind-service ai-tool-chat petclinic-mcp-server
Restart your application:
cf restart ai-tool-chat
Your chatbot will now register with the research MCP agent, and the LLM will be able to invoke the agent's capabilities when responding to chat requests.
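
For reference, here is a rough sketch of how an MCP client can discover and invoke this server's tools over SSE using the Python MCP SDK; the server URL is a placeholder, and the /sse path assumes the FastMCP default SSE endpoint:

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Connect to the remote MCP server's SSE endpoint
    async with sse_client("https://your-petclinic-mcp-server.example.com/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools exposed by the server
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Invoke the pet lookup tool
            result = await session.call_tool("fetch_petsByStatus", {"status": "available"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())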