# GW MCP Server

An MCP (Model Context Protocol) server providing tools to query Gravitational Wave (GW) data from GraceDB and GWOSC.
## What is MCP?
MCP (Model Context Protocol) allows AI assistants like Claude to use external tools. This server gives Claude the ability to query gravitational wave databases in real-time, so you can ask questions about GW events in natural language.
## Sample Questions
Once connected, you can ask Claude questions like:
- "What was the GPS time of GW150914?"
- "Show me all events in the GWTC-3 catalog"
- "Get strain data from the Hanford detector around GW150914"
- "Search for gravitational wave events with FAR less than 1e-10"
- "What files are available for a specific GraceDB event?"
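As an aside, the GPS time behind the first question can be sanity-checked with a few lines of plain Python. This is only an illustration, not part of the server; the 17-second leap-second offset is hard-coded and valid for 2015-era events like GW150914:

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)
LEAP_SECONDS = 17  # GPS-UTC offset as of September 2015

def gps_to_utc(gps_time: float) -> datetime:
    """Convert a GPS timestamp to UTC (valid while GPS-UTC = 17 s)."""
    return GPS_EPOCH + timedelta(seconds=gps_time - LEAP_SECONDS)

# GW150914 was detected at GPS time 1126259462.4
print(gps_to_utc(1126259462.4))  # 2015-09-14 09:50:45.4 UTC
```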
## Access Levels
| Source | Public Access | Authenticated Access |
|---|---|---|
| GWOSC | Full (strain data, catalogs) | N/A |
| GraceDB | Released events only | Full access to current observing run |
| GCN Kafka | No | Requires NASA Earthdata credentials |
This server works out-of-the-box with public data only.
## Installation

### Step 1: Create a Conda Environment

```bash
# Create a new conda environment with Python 3.11
conda create -n opticsGPT python=3.11 -y

# Activate the environment
conda activate opticsGPT
```
### Step 2: Install Dependencies

```bash
cd C:\Users\Asus\Desktop\OpticsGPT\GW_MCP

# Install from requirements.txt
pip install -r requirements.txt
```
### Step 3: Verify Installation

```bash
# Test GWOSC access (should print the GPS time of GW150914)
python -c "from gwosc.datasets import event_gps; print('GW150914 GPS:', event_gps('GW150914'))"
```
## Usage with Claude Desktop

Add the following to `claude_desktop_config.json` (create the file if it does not exist):

```json
{
  "mcpServers": {
    "GW-Data": {
      "command": "C:/Users/Asus/anaconda3/envs/mcp/python.exe",
      "args": ["C:/Users/Asus/Desktop/GW_MCP/server.py"]
    }
  }
}
```

**WARNING:** You must update the paths above to match your system:

- Change the Python executable path to the location of your conda environment
- Change the `server.py` path to where you cloned this repository

Find your Python path with:

```bash
conda activate opticsGPT && where python
```
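If you prefer not to hunt for paths by hand, a small helper script (hypothetical, not included in this repository) can print a config entry using the interpreter it is run with; it assumes `server.py` sits in the current directory:

```python
import json
import sys
from pathlib import Path

# Build a claude_desktop_config.json entry from this interpreter's path
# and an assumed server.py in the current directory (adjust as needed).
config = {
    "mcpServers": {
        "GW-Data": {
            "command": sys.executable,
            "args": [str(Path("server.py").resolve())],
        }
    }
}

print(json.dumps(config, indent=2))
```

Run it from your clone with the conda environment activated, then paste the output into `claude_desktop_config.json`.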
## Using with Other LLMs and Frameworks

### LangChain Integration

You can use this MCP server with LangChain via the `langchain-mcp-adapters` package:

```bash
pip install langchain-mcp-adapters
```

```python
from langchain_mcp_adapters.client import MCPClient
from langchain_openai import ChatOpenAI

# Connect to the MCP server
client = MCPClient(
    command="python",
    args=["path/to/GW_MCP/server.py"],
)

# Get tools from the MCP server
tools = client.get_tools()

# Use with any LangChain-compatible LLM
llm = ChatOpenAI(model="gpt-4")
llm_with_tools = llm.bind_tools(tools)
```
### Open Source LLMs

For open source LLMs (Ollama, LM Studio, etc.), you can:

- **Use MCP-compatible clients:** Some open source projects, such as MCP CLI, support connecting MCP servers to local LLMs.
- **Direct function calling:** Import the service classes directly in your Python code:

  ```python
  from services.gracedb_service import get_gracedb_service
  from services.gwosc_service import get_gwosc_service

  # Use the services directly
  gwosc = get_gwosc_service()
  gps_time = gwosc.get_event_gps("GW150914")
  print(f"GPS time: {gps_time}")
  ```

- **Build a REST API:** Wrap the services in a FastAPI/Flask server for any LLM that supports function calling via HTTP.
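As a rough illustration of the REST-API option, here is a stdlib-only sketch (a FastAPI/Flask version would look similar). The GWOSC service is replaced by a hard-coded stub so the example runs without network access; the route name `/event_gps` is an invention for this sketch:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

# Stub standing in for services.gwosc_service.get_gwosc_service();
# the real service queries GWOSC over the network.
KNOWN_EVENTS = {"GW150914": 1126259462.4}

class GWHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path == "/event_gps":
            name = parse_qs(url.query).get("name", [""])[0]
            gps = KNOWN_EVENTS.get(name)
            body = json.dumps({"event": name, "gps": gps}).encode()
            self.send_response(200 if gps else 404)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

def main(port: int = 8000) -> None:
    """Serve until interrupted."""
    HTTPServer(("127.0.0.1", port), GWHandler).serve_forever()
```

Call `main()` and query `http://127.0.0.1:8000/event_gps?name=GW150914`; any LLM framework that can issue HTTP function calls can then use the endpoint.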
## Available Tools
| Tool | Source | Description |
|---|---|---|
| search_gw_events | GraceDB | Search events by FAR, GPS time, pipeline |
| get_event_details | GraceDB | Get full metadata for an event |
| get_superevent | GraceDB | Get superevent information |
| get_event_files | GraceDB | List files (skymaps, PSD, etc.) |
| get_event_labels | GraceDB | Get event labels (DQV, PE_READY, etc.) |
| get_event_gps | GWOSC | Get GPS time for named event (GW150914, etc.) |
| get_catalog_events | GWOSC | Query GWTC-1/2/3 catalogs |
| fetch_strain_data | GWOSC | Get strain time-series by GPS range |
| fetch_event_strain | GWOSC | Get strain centered on a named event |
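The FAR values accepted by `search_gw_events` are false-alarm rates in Hz; a tiny helper (illustrative only, not part of the server) makes thresholds like `1e-10` easier to read:

```python
SECONDS_PER_YEAR = 365.25 * 86400  # Julian year

def far_to_years(far_hz: float) -> float:
    """Convert a false-alarm rate in Hz to 'one false alarm per N years'."""
    return 1.0 / (far_hz * SECONDS_PER_YEAR)

print(f"FAR 1e-10 Hz ~ one false alarm per {far_to_years(1e-10):.0f} years")
```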
## For LIGO/Virgo Collaboration Members

If you have LIGO credentials, you can access real-time alerts from the current observing run.

### How GraceDB Authentication Works

By default, the ligo-gracedb client searches for credentials in this order:

1. SciToken at `/tmp/bt_u${UID}`, or the path in the `SCITOKEN_FILE` environment variable
2. X.509 credentials from the `cred` parameter (cert/key pair or proxy file)
3. Environment variables: `X509_USER_CERT` + `X509_USER_KEY`
4. Environment variable: `X509_USER_PROXY`
5. Proxy from ligo-proxy-init: `/tmp/x509up_u${UID}`
6. Default location: `~/.globus/usercert.pem` and `~/.globus/userkey.pem`
7. No credentials (public access only)
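The search order above can be sketched as a small Python function. This is a simplification for illustration only: the real ligo-gracedb client also validates each candidate credential, and step 2 (the explicit `cred` parameter) is omitted because it is passed directly rather than discovered:

```python
import os

def find_credentials(uid, env=os.environ):
    """Simplified sketch of the ligo-gracedb credential search order."""
    # 1. SciToken file (default path or SCITOKEN_FILE override)
    scitoken = env.get("SCITOKEN_FILE", f"/tmp/bt_u{uid}")
    if os.path.exists(scitoken):
        return ("scitoken", scitoken)
    # 3. Cert/key pair from environment variables
    if "X509_USER_CERT" in env and "X509_USER_KEY" in env:
        return ("x509", (env["X509_USER_CERT"], env["X509_USER_KEY"]))
    # 4. Proxy path from environment variable
    if "X509_USER_PROXY" in env:
        return ("x509", env["X509_USER_PROXY"])
    # 5. Proxy created by ligo-proxy-init
    proxy = f"/tmp/x509up_u{uid}"
    if os.path.exists(proxy):
        return ("x509", proxy)
    # 6. Default ~/.globus location
    globus = os.path.expanduser("~/.globus")
    cert, key = f"{globus}/usercert.pem", f"{globus}/userkey.pem"
    if os.path.exists(cert) and os.path.exists(key):
        return ("x509", (cert, key))
    # 7. No credentials: public access only
    return None
```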
### Option 1: Using Environment Variables (Recommended)

Set these before running the MCP server:

```bash
# For a SciToken
export SCITOKEN_FILE=/path/to/your/scitoken

# OR for an X.509 certificate
export X509_USER_CERT=/path/to/usercert.pem
export X509_USER_KEY=/path/to/userkey.pem

# OR for a proxy file
export X509_USER_PROXY=/tmp/x509up_u${UID}
```
### Option 2: Modify the Service Code

Edit `src/gw_mcp_server/services/gracedb_service.py`:

```python
# For an X.509 cert/key pair:
self._client = GraceDb(cred=('/path/to/cert.pem', '/path/to/key.pem'))

# For a combined proxy file:
self._client = GraceDb(cred='/path/to/proxy.pem')

# To explicitly use only SciTokens:
self._client = GraceDb(use_auth='scitoken')

# To explicitly use only X.509:
self._client = GraceDb(use_auth='x509')
```
### Option 3: Force Public Access Only

To explicitly disable authentication attempts:

```python
self._client = GraceDb(force_noauth=True)
```
### Getting Credentials

- **ligo-proxy-init**: Run `ligo-proxy-init` to create a short-lived proxy from your certificate
- **htgettoken**: Use `htgettoken` to obtain a SciToken
- **CILogon**: Get certificates from https://cilogon.org
### Test Your Authentication

```python
from ligo.gracedb.rest import GraceDb

client = GraceDb()
client.show_credentials()  # Prints auth type and info

# Test access to current-run superevents
try:
    for se in client.superevents('category: Production', max_results=5):
        print(se['superevent_id'])
except Exception as e:
    print(f"Auth required: {e}")
```
### Useful GraceDb Client Options

```python
GraceDb(
    service_url='https://gracedb.ligo.org/api/',  # Production server
    # service_url='https://gracedb-playground.ligo.org/api/',  # Test server
    cred=None,              # Path to credentials
    force_noauth=False,     # Skip credential lookup
    fail_if_noauth=False,   # Fail if no credentials found
    reload_cred=False,      # Auto-reload expiring credentials
    reload_buffer=300,      # Seconds before expiry to reload
    use_auth='all',         # 'all', 'scitoken', or 'x509'
    retries=5,              # Max retries on server error
)
```

For full documentation, see https://ligo-gracedb.readthedocs.io/en/latest/
## Data Sources
| Source | URL | Auth Required |
|---|---|---|
| GWOSC | https://gwosc.org | No |
| GraceDB | https://gracedb.ligo.org | For current run |
| GCN | https://gcn.nasa.gov | Yes (Earthdata) |
## Citation

If you use this software in your research, please cite:

```bibtex
@software{gw_mcp,
  title={GW MCP Server: Gravitational Wave Data Access for AI Agents},
  author={Adam Zacharia Anil},
  year={2025},
  url={https://github.com/adamzacharia/GW_MCP}
}
```
## License
MIT