A starter project with setup instructions and example MCP servers, including a weather server.
This document outlines the steps to set up the `mcp` project environment.
Create a new conda environment named `mcp` with Python 3.12:

```bash
conda create -n mcp python=3.12 -y
```
Install the `uv` package manager using the following command:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
Install the `fastmcp` package using `uv` in the `mcp` environment:

```bash
conda run -n mcp uv pip install fastmcp
```
Confirm the `fastmcp` installation by running the following command:

```bash
conda run -n mcp fastmcp version
```
`mcp_hello.py` is a "hello world"-style MCP server. You can run the MCP Inspector for it using the following command:

```bash
conda run -n mcp fastmcp dev mcp_hello.py:mcp
```
Copy the session token printed in the CLI, open the provided link, paste the token into Configuration -> Proxy Session Token, and click Connect.

Click on Tools in the top menu bar. `hello_world` should be listed with a `name` parameter. Enter your name and click "Run Tool"; the tool should succeed and return a greeting.
`mcp_resources.py` defines MCP resources. You can run tests for these resources using the `--test` argument:

```bash
uv run mcp_resources.py --test
```
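The `--test` flag follows a common self-test convention: the script inspects its own arguments and runs inline checks instead of starting the server. A hedged sketch of the pattern (the check shown is illustrative, not the project's actual tests):

```python
import sys


def run_self_tests() -> None:
    """Run inline sanity checks; raises AssertionError on failure."""
    # Illustrative check only; the real script would exercise its MCP resources.
    assert "mcp".upper() == "MCP"
    print("All self-tests passed.")


if __name__ == "__main__":
    if "--test" in sys.argv:
        run_self_tests()
    # otherwise: start the MCP server as usual
```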
`mcp_weather.py` exposes a tool to get current weather data from OpenWeatherMap.
Before running, ensure you have set your `OPENWEATHER_API_KEY` in the `.env` file:

```
OPENWEATHER_API_KEY=YOUR_API_KEY_HERE
```
To run the weather server manually:

```bash
conda run -n mcp fastmcp dev mcp_weather.py:mcp
```
To run tests for the weather tool:

```bash
uv run mcp_weather.py --test
```
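Under the hood, a current-weather lookup is a single HTTP GET. A sketch of building that request URL — the endpoint and parameter names follow OpenWeatherMap's public current-weather API, while `units=metric` is an assumption about the script's defaults:

```python
from urllib.parse import urlencode

# OpenWeatherMap current-weather endpoint
OWM_ENDPOINT = "https://api.openweathermap.org/data/2.5/weather"


def build_weather_url(city: str, api_key: str, units: str = "metric") -> str:
    """Build the current-weather request URL for a given city."""
    query = urlencode({"q": city, "appid": api_key, "units": units})
    return f"{OWM_ENDPOINT}?{query}"
```

A tool like `get_current_weather` would fetch this URL (e.g. with `urllib.request` or `httpx`) and return the JSON payload.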
To allow the Gemini CLI to automatically start and connect to your `mcp_weather` server, you need to configure its `settings.json` file.

Locate `settings.json`. The file is typically found at `~/.gemini/settings.json` on Linux/macOS, or `%APPDATA%\gemini\settings.json` on Windows. If the file or directory does not exist, create them.
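The default locations can also be resolved programmatically; a small cross-platform helper (hypothetical, not part of the Gemini CLI itself):

```python
import os
import sys
from pathlib import Path


def gemini_settings_path() -> Path:
    """Return the default Gemini CLI settings.json location for this OS."""
    if sys.platform == "win32":
        return Path(os.environ["APPDATA"]) / "gemini" / "settings.json"
    return Path.home() / ".gemini" / "settings.json"
```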
Add an `mcpServers` entry. Add the following entry to the `mcpServers` section in your `settings.json` file, replacing `/mnt/d/Projects/_sandbox/mcp/` with the absolute path to your `mcp` project directory.
```json
{
  "mcpServers": {
    "weather_server": {
      "command": "uv",
      "args": [
        "run",
        "/mnt/d/Projects/_sandbox/mcp/mcp_weather.py"
      ],
      "cwd": "/mnt/d/Projects/_sandbox/mcp",
      "timeout": 10000
    }
  }
}
```
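Editing the file by hand works fine; equivalently, a hypothetical helper can merge the entry into an existing settings dict. The `weather_server` name and fields below mirror the JSON shown above:

```python
import json


def add_weather_server(settings: dict, project_dir: str) -> dict:
    """Merge a weather_server entry into a Gemini CLI settings dict."""
    servers = settings.setdefault("mcpServers", {})
    servers["weather_server"] = {
        "command": "uv",
        "args": ["run", f"{project_dir}/mcp_weather.py"],
        "cwd": project_dir,
        "timeout": 10000,
    }
    return settings


if __name__ == "__main__":
    # Print the merged settings as pretty JSON for inspection.
    print(json.dumps(add_weather_server({}, "/mnt/d/Projects/_sandbox/mcp"), indent=2))
```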
Once configured, when you run `gemini`, the CLI will automatically start your `mcp_weather.py` server and make its `get_current_weather` tool available to the Gemini model.