# MCP Agentic AI Crash Course with Python

A comprehensive crash course on Model Context Protocol (MCP) covering everything from basic concepts to building production-ready MCP servers and clients.

## šŸ“‹ Table of Contents

- [Overview](#overview)
- [What is MCP?](#what-is-mcp)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Project Structure](#project-structure)
- [Building MCP Server from Scratch](#building-mcp-server-from-scratch)
- [Running MCP Server](#running-mcp-server)
- [Integration Methods](#integration-methods)
- [Client Implementation](#client-implementation)
- [Docker Setup](#docker-setup)
- [Course Information](#course-information)
- [Development Tips](#development-tips)
- [Additional Resources](#additional-resources)
- [Contributing](#contributing)
- [License](#license)

## šŸŽÆ Overview

This crash course covers:
- **MCP Fundamentals**: Understanding Model Context Protocol architecture
- **Server Development**: Building MCP servers from scratch
- **Multiple Integration Methods**: MCP Inspector, Claude Desktop, Cursor IDE
- **Client Implementation**: Creating MCP clients with LLM integration
- **Production Deployment**: Docker setup for deployment

## šŸ”§ What is MCP?

**Model Context Protocol (MCP)** is a standardized way for AI assistants to connect with external services and data sources.

### Key Benefits:
- **Unified Protocol**: Like a USB-C cable for AI services - one protocol for multiple connections
- **Service Provider Managed**: Updates and maintenance handled by service providers
- **Reduced Code Complexity**: No need to write wrapper code for each service

### Architecture:

```
LLM/AI Assistant ↔ MCP Client ↔ MCP Protocol ↔ MCP Server ↔ External Services
```
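
Under the hood, the client and server exchange JSON-RPC 2.0 messages over a transport (STDIO or SSE). As a rough, simplified illustration (not an exact protocol trace; the id, arguments, and result shown are made up), a single tool call looks roughly like this on the wire:

```python
# Simplified sketch of the JSON-RPC messages behind one tool call.
# "tools/call" is the method name defined by the MCP specification;
# the id, tool arguments, and result text below are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_alerts", "arguments": {"state": "CA"}},
}

response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Alert: Heat Advisory ..."}]},
}
```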


## šŸ“‹ Prerequisites

- Python 3.11 or higher
- Basic understanding of Python and async programming
- Familiarity with APIs and HTTP requests
- Docker (for deployment)

## šŸš€ Installation

### 1. Set up UV (Python Package Manager)
```bash
# Install UV if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
```

### 2. Create Project Environment

```bash
# Initialize project
uv init MCP-crash-course
cd MCP-crash-course

# Create virtual environment
uv venv

# Activate environment (Windows)
.venv\Scripts\activate
# Activate environment (macOS/Linux)
source .venv/bin/activate
```

### 3. Install Dependencies

```bash
# Core MCP dependencies
uv add "mcp[cli]"
uv add httpx
uv add mcp-use

# For LLM integration
uv add langchain-groq

# For development
uv add fastapi uvicorn
```

šŸ“ Project Structure

MCP-crash-course/
ā”œā”€ā”€ server/
│   ā”œā”€ā”€ weather.py          # Main MCP server
│   ā”œā”€ā”€ server.py           # Production server with SSE
│   └── client-sse.py       # SSE client example
ā”œā”€ā”€ client.py               # MCP client implementation
ā”œā”€ā”€ weather.json            # Server configuration
ā”œā”€ā”€ requirements.txt        # Dependencies
ā”œā”€ā”€ Dockerfile             # Docker configuration
└── README.md              # This file

šŸ—ļø Building MCP Server from Scratch

1. Create Weather Service Server (server/weather.py)

from typing import Any
import httpx
from mcp.server.fastmcp import FastMCP

# Initialize MCP server
mcp = FastMCP("weather")

# Weather API configuration
WEATHER_API_BASE = "https://api.weather.gov"
USER_AGENT = "MCP-Weather-Server/1.0"

async def make_weather_request(url: str) -> dict[str, Any]:
    """Make request to weather API with proper error handling"""
    headers = {
        "User-Agent": USER_AGENT,
        "Accept": "application/json"
    }
    
    async with httpx.AsyncClient() as client:
        response = await client.get(url, headers=headers, timeout=30)
        response.raise_for_status()
        return response.json()

def format_alerts(response: dict[str, Any]) -> str:
    """Format weather alerts response"""
    if not response.get("features"):
        return "No weather alerts found for this state."
    
    alerts = []
    for feature in response["features"]:
        properties = feature.get("properties", {})
        alerts.append(f"Alert: {properties.get('headline', 'N/A')}")
    
    return "\n".join(alerts)

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get weather alerts for a US state (provide 2-character state code)"""
    url = f"{WEATHER_API_BASE}/alerts?area={state.upper()}"
    
    try:
        response = await make_weather_request(url)
        return format_alerts(response)
    except Exception as e:
        return f"Error fetching weather alerts: {str(e)}"

# Resource example
@mcp.resource("config://app")
async def get_app_config() -> str:
    """Get application configuration"""
    return "MCP Weather Server v1.0 - Provides weather alerts for US states"

if __name__ == "__main__":
    mcp.run()
```

## šŸš€ Running MCP Server

### Method 1: MCP Inspector (Development)

```bash
# Start MCP Inspector
uv run mcp dev server/weather.py

# Open the URL printed in the terminal (e.g. http://localhost:3000)
# Select STDIO transport and connect
```

### Method 2: Claude Desktop Integration

```bash
# Install server to Claude Desktop
uv run mcp install server/weather.py

# Configuration automatically added to Claude Desktop settings
```

### Method 3: Cursor IDE Integration

1. Open Cursor IDE
2. Go to File → Preferences → Cursor Settings
3. Navigate to MCP section
4. Add server configuration:

```json
{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["run", "server/weather.py"],
      "cwd": "/path/to/your/project"
    }
  }
}
```

## šŸ”— Integration Methods

### Configuration File (`weather.json`)

```json
{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["run", "server/weather.py"],
      "cwd": "/path/to/your/project"
    }
  }
}
```

### Transport Types

- **STDIO**: For local development and same-machine communication
- **SSE (Server-Sent Events)**: For production with separate client/server hosting

## šŸ’» Client Implementation

### Basic Client (`client.py`)

```python
import asyncio
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient

async def main():
    # Load server configuration
    client = MCPClient.from_config_file("weather.json")

    # Initialize LLM (or omit api_key and let ChatGroq read GROQ_API_KEY from the environment)
    llm = ChatGroq(
        model="llama-3.1-70b-versatile",
        api_key="your-groq-api-key"
    )

    # Create MCP Agent
    agent = MCPAgent(llm=llm, client=client)

    # Interactive loop
    while True:
        query = input("Ask about weather: ")
        if query.lower() in ['quit', 'exit']:
            break

        response = await agent.run(query)
        print(f"Response: {response}")

if __name__ == "__main__":
    asyncio.run(main())
```

### Running the Client

```bash
# Set your Groq API key
export GROQ_API_KEY="your-api-key-here"

# Run client
uv run client.py
```

## 🐳 Docker Setup

### Dockerfile

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install UV
RUN pip install uv

# Copy requirements and install dependencies
COPY requirements.txt .
RUN uv venv && uv pip install -r requirements.txt

# Copy application files
COPY server/ ./server/
COPY *.py ./
COPY *.json ./

# Expose port
EXPOSE 8000

# Run server with SSE transport so it listens on the exposed port
CMD ["uv", "run", "server/server.py", "--sse"]
```

### Building and Running

```bash
# Build Docker image
docker build -t mcp-server .

# Run container
docker run -p 8000:8000 mcp-server
```

### Production Server with SSE (`server/server.py`)

```python
import sys

from mcp.server.fastmcp import FastMCP

# Host/port for the SSE transport are configured on the FastMCP instance
mcp = FastMCP("weather", host="0.0.0.0", port=8000)

# Your weather server code (tools, resources) here...

if __name__ == "__main__":
    if "--sse" in sys.argv:
        # Run with SSE transport (HTTP server on the configured host/port)
        mcp.run(transport="sse")
    else:
        # Run with STDIO transport
        mcp.run()
```
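The project structure above lists `server/client-sse.py` as an SSE client example, but it isn't shown in the tutorial text. A minimal sketch using the official `mcp` SDK's SSE client could look like the following; the `http://localhost:8000/sse` endpoint (FastMCP's default SSE path) and the `get_alerts` tool name are assumptions based on the weather server above:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the SSE endpoint exposed by the production server
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])

            # Call the weather alerts tool directly
            result = await session.call_tool("get_alerts", {"state": "CA"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```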

## šŸŽ“ Course Information

This tutorial is part of the 2.0 Agentic AI and GenAI with MCP course:

- **Start Date**: May 10th, 2025
- **Schedule**: Every Saturday and Sunday, 3 hours per session
- **Focus**: Complete coverage of Agentic AI and Generative AI with MCP

## šŸ”§ Development Tips

### Environment Variables

Create a `.env` file:

```
GROQ_API_KEY=your-groq-api-key
WEATHER_API_KEY=your-weather-api-key  # if needed
```
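
If you want these values picked up automatically, one option (not shown in the tutorial) is to load the file with `python-dotenv` (`uv add python-dotenv`) at the top of `client.py`:

```python
# Load variables from .env into the process environment before creating the LLM client
from dotenv import load_dotenv

load_dotenv()
```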

### Testing Your Server

```bash
# Test with MCP Inspector
uv run mcp dev server/weather.py

# Test a specific tool
# In MCP Inspector: get_alerts("CA")
```
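
You can also exercise the tool logic without the Inspector. This is just a scratch sketch: it assumes you run it from the project root and relies on `@mcp.tool()` returning the original coroutine (which the official Python SDK does), so the function stays directly callable:

```python
# quick_test.py - call the tool coroutine directly, bypassing the MCP transport
import asyncio

from server.weather import get_alerts

if __name__ == "__main__":
    print(asyncio.run(get_alerts("CA")))
```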

### Debugging

- Use MCP Inspector for development and testing
- Check server logs for connection issues
- Verify JSON configuration syntax
- Ensure proper port configuration for SSE transport

## šŸ“š Additional Resources

- [Model Context Protocol documentation](https://modelcontextprotocol.io)
- [MCP Python SDK](https://github.com/modelcontextprotocol/python-sdk)

šŸ¤ Contributing

Feel free to submit issues and enhancement requests!

šŸ“„ License

This project is licensed under the MIT License.

