A Node.js MCP server example for the OpenWeather API, requiring an API key.
Bring the Model Context Protocol (MCP) into action:
The MCP server in this example connects to the OpenWeather API.
Actions:
Generate an MCP server using Postman.
See the sections below for a description of the actions taken.
sudo snap install postman
postman
Log in with an account and choose the free plan.
In Postman:
The API requests in this example serve as the "tools" that an LLM can use.
In the ./src directory:
npm install
Start the MCP server and configure it with an API key to access the underlying API.
See the sections below for a description of the actions taken.
In the ./src directory:
node mcpServer.js
In this example, we use the OpenWeather API.
In OpenWeather:
Copy the API key into src/.env (not checked into this repo):
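The exact variable name depends on the server Postman generated; a typical src/.env could look like the sketch below (OPENWEATHER_API_KEY is an assumed name, check the generated mcpServer.js for the name it actually reads):

```shell
# src/.env — keep this file out of version control.
# The variable name is an assumption; match it to what mcpServer.js reads.
OPENWEATHER_API_KEY=your-api-key-here
```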
In the ./src directory:
node mcpServer.js
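Under the hood, each tool wraps an OpenWeather HTTP request. As an illustration (not the generated code), this sketch builds the request URL for OpenWeather's current-weather endpoint; buildWeatherUrl is a hypothetical helper:

```javascript
// Hypothetical helper illustrating the kind of request a tool issues.
// The real tool code is generated by Postman and lives in mcpServer.js.
function buildWeatherUrl(city, apiKey) {
  const base = "https://api.openweathermap.org/data/2.5/weather";
  const params = new URLSearchParams({ q: city, appid: apiKey, units: "metric" });
  return `${base}?${params}`;
}

console.log(buildWeatherUrl("Berlin", "demo-key"));
// https://api.openweathermap.org/data/2.5/weather?q=Berlin&appid=demo-key&units=metric
```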
Install and configure Cursor as an MCP client that makes the MCP server deployed above available to its models.
In this example, no specific model binding is selected (we just use the Cursor defaults).
See the sections below for a description of the actions taken.
Install:
Observe preconfigured models:
Add the running MCP server to Cursor.
Typically this configuration goes into ~/.cursor/mcp.json (global) or .cursor/mcp.json inside the project:
{
"mcpServers": {
"weather-mcp-agent": {
"command": "node",
"args": ["your-parent-dirs/mcp-nodejs/src/mcpServer.js"]
}
}
}
Observe a green dot, indicating that the MCP server is ready to use.
Sometimes toggling the disable/enable switch is additionally required.
Ask the models a question:
Based on the question, the LLM internally reasons and decides which tool to use in order to produce an answer.
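When the LLM decides on a tool, Cursor sends the MCP server a JSON-RPC 2.0 tools/call request over stdio. A sketch of what such a message could look like (the tool name get_current_weather and its arguments are assumptions; the generated server defines the actual names):

```javascript
// Shape of an MCP tools/call request (JSON-RPC 2.0).
// Tool name and arguments are hypothetical; the generated server defines them.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "get_current_weather", // assumed tool name
    arguments: { q: "Berlin" },  // assumed parameter
  },
};

console.log(JSON.stringify(toolCall));
```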
This example is based on an MCP server generated with Postman from OpenWeather API requests.