# LacyLights MCP

An MCP (Model Context Protocol) server that provides AI-powered theatrical lighting design capabilities for the LacyLights system.
## Tools

- `get_fixture_inventory` - Query available lighting fixtures and their capabilities
- `analyze_fixture_capabilities` - Analyze specific fixtures for color mixing, positioning, effects, etc.
- `generate_scene` - Generate lighting scenes based on script context and design preferences
- `analyze_script` - Extract lighting-relevant information from theatrical scripts
- `optimize_scene` - Optimize existing scenes for energy efficiency, dramatic impact, etc.
- `create_cue_sequence` - Create sequences of lighting cues from existing scenes
- `generate_act_cues` - Generate complete cue suggestions for theatrical acts
- `optimize_cue_timing` - Optimize cue timing for smooth transitions or dramatic effect
- `analyze_cue_structure` - Analyze and recommend improvements to cue lists

## Installation

```bash
npm install
cp .env.example .env
# Edit .env with your configuration
npm run build
```
The MCP server currently uses an in-memory pattern storage system for simplicity. If you want to use ChromaDB for persistent vector storage and more advanced RAG capabilities:

```bash
# Start ChromaDB with Docker
docker-compose up -d chromadb

# Verify it's running
curl http://localhost:8000/api/v2/heartbeat
```

Or, without Docker:

```bash
# Install ChromaDB
pip install chromadb

# Start the server
chroma run --host localhost --port 8000
```

Then update your `.env` file:

```bash
# Uncomment these lines in .env
CHROMA_HOST=localhost
CHROMA_PORT=8000
```
Note: The current implementation works without ChromaDB using built-in lighting patterns. ChromaDB enhances the system with vector similarity search for more sophisticated pattern matching.
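The fallback behavior described above can be sketched as a simple availability probe. This is illustrative only — `selectPatternStore` is a hypothetical helper, not a function in this repo; the heartbeat URL matches the `curl` check shown earlier:

```typescript
// Probe ChromaDB's heartbeat endpoint; fall back to the built-in
// in-memory patterns when it is unreachable (hypothetical helper).
async function selectPatternStore(
  host: string,
  port: number,
): Promise<"chromadb" | "in-memory"> {
  try {
    const res = await fetch(`http://${host}:${port}/api/v2/heartbeat`, {
      signal: AbortSignal.timeout(1000), // don't hang startup on a dead host
    });
    return res.ok ? "chromadb" : "in-memory";
  } catch {
    return "in-memory"; // ChromaDB not running: use built-in lighting patterns
  }
}

selectPatternStore("localhost", 8000).then((store) =>
  console.log(`pattern store: ${store}`),
);
```

With ChromaDB stopped, the probe times out or is refused and the in-memory store is selected.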
## Configuration

- `OPENAI_API_KEY` - OpenAI API key for AI-powered lighting generation
- `LACYLIGHTS_GRAPHQL_ENDPOINT` - GraphQL endpoint for your lacylights-node backend (default: `http://localhost:4000/graphql`)

Make sure your `lacylights-node` backend is running first, then:

```bash
# Start in development mode (with auto-reload)
npm run dev

# Or build and run in production mode
npm run build
npm start
```

You should see:

```
RAG service initialized with in-memory patterns
LacyLights MCP Server running on stdio
```
Optional environment variables:

- `CHROMA_HOST` - ChromaDB host for RAG functionality (default: `localhost`)
- `CHROMA_PORT` - ChromaDB port (default: `8000`)

```bash
# Development mode
npm run dev

# Production mode
npm start
```
Add this server to your Claude configuration:

```json
{
  "mcpServers": {
    "lacylights": {
      "command": "/usr/local/bin/node",
      "args": ["/Users/bernard/src/lacylights/lacylights-mcp/run-mcp.js"],
      "env": {
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "LACYLIGHTS_GRAPHQL_ENDPOINT": "http://localhost:4000/graphql"
      }
    }
  }
}
```
**Important**: If the above doesn't work, you may need to specify the exact path to your Node.js 14+ installation. You can find it with:

```bash
which node
```

**Note**: Use the absolute path to `run-mcp.js` in your configuration. This wrapper ensures proper CommonJS module loading.
## Example Usage
### Generate a Scene
Use the `generate_scene` tool to create lighting for a specific scene or dramatic moment.
### Analyze a Script
Use the `analyze_script` tool with the full text of Act 1 of Macbeth to extract lighting-relevant moments and suggested scenes.
### Create Cue Sequence
Use the `create_cue_sequence` tool to create a cue list for Act 1 using the scenes generated from script analysis.
## AI-Powered Features
### Script Analysis with RAG
- Analyzes theatrical scripts to extract lighting-relevant information
- Uses vector embeddings to match script contexts with lighting patterns
- Provides intelligent suggestions based on dramatic context
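The pattern-matching step above can be sketched with toy vectors. This is a sketch of the general technique, not the repo's `rag-service.ts`; the pattern names and 3-dimensional embeddings are made up, while the real service would use model-generated embeddings:

```typescript
// Match a script excerpt's embedding to the closest stored lighting pattern
// by cosine similarity (toy 3-d vectors; names are illustrative).
interface LightingPattern { name: string; embedding: number[]; }

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function bestPattern(query: number[], patterns: LightingPattern[]): LightingPattern {
  return patterns.reduce((best, p) =>
    cosineSimilarity(query, p.embedding) > cosineSimilarity(query, best.embedding)
      ? p
      : best,
  );
}

const patterns: LightingPattern[] = [
  { name: "ominous-night", embedding: [0.9, 0.1, 0.0] },
  { name: "warm-interior", embedding: [0.1, 0.9, 0.2] },
];
console.log(bestPattern([0.8, 0.2, 0.1], patterns).name); // "ominous-night"
```

ChromaDB performs this nearest-neighbor search at scale; the in-memory fallback only needs a linear scan like the one above.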
### Intelligent Scene Generation
- Creates detailed DMX values for fixtures based on artistic intent
- Considers fixture capabilities and positioning
- Applies color theory and lighting design principles
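As a minimal sketch of the "detailed DMX values" step, assuming a generic 3-channel RGB fixture (real channel layouts come from the fixture inventory, and this helper is not part of the repo):

```typescript
// Translate an artistic color and intensity into DMX channel values for a
// hypothetical RGB fixture patched at a given start address.
interface RgbColor { r: number; g: number; b: number; } // each 0..255

function rgbToDmx(
  startAddress: number,
  color: RgbColor,
  intensity: number, // 0..1 dimmer applied to all color channels
): Record<number, number> {
  const scale = (v: number) => Math.round(Math.min(255, Math.max(0, v * intensity)));
  return {
    [startAddress]: scale(color.r),
    [startAddress + 1]: scale(color.g),
    [startAddress + 2]: scale(color.b),
  };
}

// A deep blue "night" look at 60% intensity on a fixture at address 1:
// channels 1-3 get 12, 24, and 120 respectively.
console.log(rgbToDmx(1, { r: 20, g: 40, b: 200 }, 0.6));
```

A real implementation would branch on each fixture's channel map (intensity, color wheel, pan/tilt, etc.) rather than assuming RGB at fixed offsets.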
### Cue Timing Optimization
- Analyzes cue sequences for optimal timing
- Considers dramatic pacing and technical constraints
- Provides multiple optimization strategies
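One such strategy can be sketched as follows — the `Cue` fields and the clamping rule are assumptions for illustration, not the tool's actual algorithm:

```typescript
// One possible timing optimization: clamp each cue's fade-in so it
// completes before the next cue starts (fields are illustrative).
interface Cue { number: number; startTime: number; fadeIn: number; } // seconds

function clampFades(cues: Cue[]): Cue[] {
  const sorted = [...cues].sort((a, b) => a.startTime - b.startTime);
  return sorted.map((cue, i) => {
    const next = sorted[i + 1];
    if (!next) return cue; // last cue: nothing to overlap with
    const maxFade = next.startTime - cue.startTime;
    return { ...cue, fadeIn: Math.min(cue.fadeIn, maxFade) };
  });
}

const optimized = clampFades([
  { number: 1, startTime: 0, fadeIn: 8 }, // 8s fade would overlap cue 2
  { number: 2, startTime: 5, fadeIn: 3 },
]);
console.log(optimized[0].fadeIn); // 5
```

Other strategies might stretch fades for dramatic effect instead of shortening them; the tool reportedly offers several.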
## Development
### Project Structure
```
src/
├── tools/             # MCP tool implementations
│   ├── fixture-tools.ts
│   ├── scene-tools.ts
│   └── cue-tools.ts
├── services/          # Core services
│   ├── graphql-client.ts
│   ├── rag-service.ts
│   └── ai-lighting.ts
├── types/             # TypeScript type definitions
│   └── lighting.ts
└── index.ts           # MCP server entry point
```
### Adding New Tools
1. Create tool implementation in appropriate file under `src/tools/`
2. Add tool definition to `src/index.ts` in the `ListToolsRequestSchema` handler
3. Add tool handler in the `CallToolRequestSchema` handler
4. Update this README with tool documentation
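The steps above can be illustrated with a hypothetical tool definition — the name, description, and schema below are made up for this example and do not exist in the repo:

```typescript
// Hypothetical definition for a new "suggest_color_palette" tool.
// Objects of this shape are returned from the ListToolsRequestSchema
// handler; a matching case on request.params.name in the
// CallToolRequestSchema handler dispatches to the implementation.
const suggestColorPaletteTool = {
  name: "suggest_color_palette",
  description: "Suggest a color palette for a scene based on mood keywords",
  inputSchema: {
    type: "object",
    properties: {
      mood: { type: "string", description: "e.g. 'ominous' or 'celebratory'" },
      paletteSize: { type: "number", description: "How many colors to return" },
    },
    required: ["mood"],
  },
};

console.log(suggestColorPaletteTool.name); // suggest_color_palette
```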
### Testing
```bash
npm test
```
This MCP server is designed to work with the existing LacyLights system, acting as an AI layer that enhances it with intelligent automation and design assistance.
## Troubleshooting

- **Module import errors**
- **GraphQL connection errors**
  - Ensure the `lacylights-node` backend is running on port 4000
  - Check the `LACYLIGHTS_GRAPHQL_ENDPOINT` environment variable
- **OpenAI API errors**
  - Verify `OPENAI_API_KEY` is set in the `.env` file
- **MCP connection errors in Claude**
  - Use the `run-mcp.js` wrapper script, not `dist/index.js` directly
- **"Unexpected token ?" error**
  - Your Node.js version is too old; point `"command"` at a newer installation, e.g. `"command": "/opt/homebrew/bin/node"` (find yours with `which node`)
The simplified implementation uses in-memory pattern storage rather than a persistent vector database.

## License

MIT