# Cernion Energy Tools

**Cernion Grid Intelligence** – 87+ specialized tools for German and European energy data. Direct AI access to Marktstammdatenregister (MaStR), ENTSO-E, Redispatch 2.0, and Grid Operations for utilities and datacenters.

**MicroService Agent System for Energy Markets**
A modular, scalable microservices platform built with Moleculer for developing energy market applications with AI integration (Google Gemini) and MCP (Model Context Protocol) support.
## Features

- **Moleculer Microservices Framework** – Fast, modern, and powerful microservices framework
- **API Gateway** – HTTP REST API with automatic route generation
- **AI Agent** – Natural-language query planner powered by Google Gemini: describe your energy data need in plain text and the agent generates, executes, and interprets a multi-step microservice plan automatically
- **Inhouse Data Sources** – Register, infer, cache, and discover internal utility datasets (CSV, REST, GeoJSON, XLSX, DOCX, Scraper) alongside public energy tools
- **Research Web App** – Built-in single-page application at `/app` for interactive, browser-based testing of the AI agent – no separate tooling required
- **Live CSV Export** – Every agent result exposes a parameterised GET endpoint (`/api/agent/session/:id/csv?param=value`) for zero-config integration with automation tools such as Microsoft Power Automate, Excel Power Query, or cron jobs
- **Datapoints** – Named, versioned, health-monitored data sources backed by embedded PouchDB. Promote any agent session to a managed datapoint, track refresh history and schema stability, and retrieve live data as JSON or CSV via `/api/datapoints`. See the health overview for a dashboard of all registered datapoints.
- **Snapshots** – Seal a group of datapoints as a consistent unit with SHA-256 provenance hashing. Create, validate (drift detection), list, and remove snapshots via `/api/datapoints/snapshot*` (v0.13)
- **OSM Geo Layer** – Grid infrastructure analysis via OpenStreetMap/Overpass: VNB assignment validation, nearby infrastructure, substation inventory, and grid topology (v0.10)
- **OEP Connector** – Read-only access to the Open Energy Platform (scenario data, NEP references, research datasets) via `/api/oep/*` (v0.12)
- **Grid Connection Validation** – Deterministic 6-step Netzanschluss pipeline (`POST /api/grid-connection/validate`): inventory → delta → capacity → EWK benchmark → Go/No-Go decision → audit trail. No LLM – identical inputs, identical findings. Reports sealed with PouchDB snapshots for EU AI Act Art. 12 compliance (v0.14)
- **Energy Sharing Validation** – Deterministic 6-step § 42c EnWG pipeline (`POST /api/energy-sharing/validate`): generator/consumer eligibility, MaLo validation, share-sum check, DV validation. Regulatory deadline: 01.06.2026 (v0.15)
- **MaStR Data Quality Audit** – 8-step portfolio quality audit (`POST /api/mastr-quality/audit`): registration completeness, capacity plausibility, NAP/MeLo connectivity, duplicate detection, geo spot-check. Weighted 0–100 score across 5 dimensions (v0.17)
- **Redispatch Ex-Post Audit** – 7-step Redispatch 2.0 settlement readiness audit (`POST /api/redispatch/audit`): portfolio assembly (Weg A/B), NAP/MeLo/DV checks, curtailment data, financial risk scoring (v0.18)
- **Dashboard API** – Read-only UI aggregator with 4 composite endpoints (`GET /api/dashboard/*`): VNB overview, market snapshot, quality summary, finding-codes reference. All upstream calls parallel via `Promise.allSettled`, graceful degradation, 5–15 min cache (v0.19)
- **OEO / OEMetadata** – Open Energy Ontology annotations on all 45+ REST endpoints, OEMetadata v2.0 export with optional JSON Schema validation (v0.11.4–v0.12)
- **Data Provenance** – SHA-256 provenance hashing on every datapoint refresh for EU AI Act Art. 12 compliance, plus explainability log for agent corrections (v0.11.5)
- **Prompt Scrubber** – Field-level PII masking with energy-domain allowlist before sending data to external LLMs (v0.11.5)
- **MCP Support** – Model Context Protocol SDK integration
- **OpenAPI Documentation** – Automatic API documentation at `/api/docs`
- **DSO/VNB Lookup** – VNBdigital search/lookup and BDEW → MaStR resolution
- **CLI Tool** – Command-line interface for calling microservices
- **Service Templates** – Ready-to-use skeleton service template
- **Hot Reload** – Automatic service reloading during development
- **Best Practices** – ESLint, Prettier, and structured project layout
## Documentation

- CHANGELOG.md - Release notes and notable changes
- MCP_TOOLS.md - MCP tool reference
- MCP_SERVICES.md - Microservice-to-tool mapping
- BEARER_TOKEN_AUTHENTICATION.md - Auth guide
- docs/BACKEND_CONTEXT.md - Backend architecture reference (services, PouchDB, finding codes, auth)
- llm.txt - Generated LLM context artifact (architecture + domain knowledge + cookbook + OpenAPI)
- docs/ui-contracts/ - Frontend ↔ backend API contracts (v0.20, 14 docs)
- docs/MAINTENANCE_MILESTONE_CHECKLIST.md - Pre-milestone quality/security gate checklist
- SECURITY.md - Security policy and disclosure
- CODE_OF_CONDUCT.md - Community guidelines
## CI/CD & Transparency

- Pull requests and pushes to `main` run automated quality checks (lint, build, unit coverage gates, integration discovery sanity, OpenAPI audit, security audits).
- Security analysis is continuously enforced with CodeQL.
- Version tags (`v*`) trigger a release pipeline (`release:check` + build + GitHub Release). `llm.txt` is validated in release checks and regenerated from source-of-truth files via `npm run generate:llm`.
- In maintenance CI, `llm.txt` sync is checked strictly when `CHANGELOG.md` changes.
- Coverage reports are uploaded and publicly visible via Codecov.
- Recommended repository setting: enable branch protection on `main` and require the `Maintenance CI` and `CodeQL` checks before merge.
## Quick Start

### Prerequisites

- Node.js 18+
- npm or yarn

### Installation

```bash
# Clone the repository
git clone https://github.com/energychain/cernion-energy-tools.git
cd cernion-energy-tools

# Install dependencies
npm install

# Copy environment variables
cp .env.example .env

# Edit .env and add your API keys (see Configuration section)
nano .env
```

### Running the Services

```bash
# Start all services
npm start

# Or use development mode with hot reload
npm run dev
```
The API Gateway will start on http://localhost:3000 by default.
| URL | Description |
|---|---|
| http://localhost:3000/app | Research Web App – AI agent UI for interactive testing |
| http://localhost:3000/api/docs | Swagger UI – full OpenAPI documentation |
| http://localhost:3000/api/openapi.json | Raw OpenAPI spec |
## Using the CLI

```bash
# Call a microservice action
npm run cli -- skeleton.hello --name=John

# Health check
npm run cli -- skeleton.health

# Get help
npm run cli -- --help
```
## Research Web App

The built-in web application at `/app` lets you explore all microservices using plain-text natural language – no curl, no Swagger form, no coding required.
### Workflow

1. **Describe your question** – type in plain English or German, e.g. "Alle PV-Anlagen im Netz der Enercity in Hannover"
2. **Review the plan** – the AI decomposes the question into a numbered sequence of microservice calls and shows you exactly which services will be called and with which parameters.
3. **Adjust parameters** – concrete values extracted from your query (dates, postal codes, MeLo IDs, operator names, …) appear as pre-filled, editable form fields. Change any value without re-generating the plan.
4. **Run & explore** – results appear in a sortable, filterable table. The raw JSON from every step is available for debugging.
5. **Share or automate** – a shareable URL and a Live CSV link are generated automatically (see below).
### Live CSV for Automation

Every completed analysis exposes a parameterised CSV endpoint:

```
GET /api/agent/session/<id>/csv?param1=value1&param2=value2
```

- The query re-runs live against the real data sources every time it is called – data is never stale.
- GET parameters override the saved values, so the same session URL can be reused with different dates, regions, or identifiers.
- The CSV URL updates in real time in the UI as you change any form field.

Power Automate / Excel Power Query example:

```
http://10.0.0.8:3900/api/agent/session/2a70e478-90ce-4fa5-b996-6f98efdba7cf/csv?startDate=2026-03-01
```

Point an HTTP request action or a Power Query Web data source at this URL. Change the `startDate` parameter to fetch a different reporting period – no re-analysis needed.

Other automation patterns:

- Schedule a cron job / GitHub Action to pull fresh CSVs daily
- Feed directly into pandas `read_csv(url)` in a Jupyter notebook
- Use as a data source in Grafana, Power BI, or any tool that accepts a CSV URL
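The parameter-override pattern above can also be scripted. The following sketch builds a Live CSV URL with overridden GET parameters using Node's standard `URL` API; the session id and `startDate` parameter are placeholders for whatever your own session exposes.

```javascript
// Sketch: build a Live CSV URL with overridden GET parameters.
// The session id and parameter names below are placeholders, not real ones.
function liveCsvUrl(baseUrl, sessionId, overrides = {}) {
  const url = new URL(`/api/agent/session/${sessionId}/csv`, baseUrl);
  for (const [key, value] of Object.entries(overrides)) {
    url.searchParams.set(key, value); // GET params override the saved values
  }
  return url.toString();
}

const url = liveCsvUrl('http://localhost:3000', 'example-session-id', {
  startDate: '2026-03-01',
});
console.log(url);
// http://localhost:3000/api/agent/session/example-session-id/csv?startDate=2026-03-01
```

A scheduled job can call such a helper with a rolling date to pull a fresh reporting period each day without re-running the analysis.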
## Creating New Services

### Using the Service Creator

```bash
# Create a new service interactively
npm run create

# Or specify a name directly
npm run create -- my-service
```

This creates a new service in `custom-services/` from the skeleton template and generates a matching test in `custom-tests/`.

Custom services are local-only and ignored by git. Core services shipped with the project live in `services/`.
### Manual Service Creation

1. Copy the skeleton template:

   ```bash
   cp templates/skeleton.service.js custom-services/my-service.service.js
   ```

2. Edit the service – change the `name` property, add actions, events, and methods.

3. Restart services:

   ```bash
   npm start
   ```
### Custom Services & Tests

- Custom services live in `custom-services/` and are loaded at startup.
- Custom tests live in `custom-tests/` and are excluded from release coverage.
- Run custom tests without global coverage thresholds: `npm run test:custom -- my-service.service.test.js`
## Project Structure

```
cernion-energy-tools/
├── services/                  # Core microservices (shipped with release)
│   ├── api.service.js         # API Gateway + Swagger UI
│   ├── agent.service.js       # AI agent – plan/execute/export
│   ├── assets.service.js      # MaStR installation assets
│   ├── datapoint.service.js   # Named datapoints + snapshots (v0.11–v0.13)
│   ├── osm-geo.service.js     # OSM geo layer (v0.10)
│   ├── oep.service.js         # Open Energy Platform (v0.12)
│   ├── datasource-registry.service.js
│   ├── datasource-connector.service.js
│   ├── datasource-cache.service.js
│   ├── datasource-discovery.service.js
│   ├── forecast.service.js
│   ├── gas-storage.service.js
│   ├── german-grid.service.js
│   ├── grid-operations.service.js
│   └── ...                    # See services/ for full list
├── src/
│   ├── app.html               # Research Web App (single-page)
│   ├── connectors/            # Built-in datasource connector plugins
│   ├── mcp-client.js          # Centralised MCP tool caller
│   ├── async-job-poller.js    # Async job polling
│   ├── prompt-scrubber.js     # PII masking for LLM prompts
│   ├── oeo-mappings.js        # OEO class mappings (~150 entries)
│   ├── validation-findings.js # Grid connection finding constants (v0.14)
│   └── oemetadata-builder.js  # OEMetadata v2.0 builder
├── custom-services/           # Local/custom services (git-ignored)
├── custom-connectors/         # Local/custom datasource plugins (git-ignored)
├── custom-tests/              # Local/custom tests (git-ignored)
├── templates/
│   └── skeleton.service.js
├── tests/                     # Core test suite
├── scripts/                   # Build / audit scripts
├── index.js                   # Main entry point
├── cli.js                     # CLI tool
├── create-service.js          # Interactive service creator
├── moleculer.config.js        # Moleculer configuration
├── .env.example               # Environment variables template
└── package.json
```
## Configuration

### Environment Variables

Copy `.env.example` to `.env` and edit:

| Variable | Default | Description |
|---|---|---|
| `PORT` | `3000` | API Gateway port |
| `LOG_LEVEL` | `info` | Logging level (`info`, `debug`, `warn`, `error`) |
| `GEMINI_API_KEY` | – | Google Gemini API key (required for AI agent) |
| `GEMINI_MODEL` | `gemini-3-pro-preview` | Gemini model name |
| `MCP_SERVER_URL` | – | MCP server URL |
| `CERNION_TOKEN` | – | Cernion MCP token (request at https://cernion.de/ or email [email protected]) |
| `NAMESPACE` | – | Moleculer namespace for service isolation |
| `TRANSPORTER` | – | Message transporter (NATS, Redis, MQTT, …) |
| `REQUEST_TIMEOUT_MS` | `900000` | Broker request timeout in ms |
| `RETRY_POLICY_ENABLED` | `false` | Enable broker-level retries for retryable errors |
| `CIRCUIT_BREAKER_ENABLED` | `false` | Enable circuit breaker protection |
| `BULKHEAD_ENABLED` | `false` | Enable bulkhead concurrency protection |
| `METRICS_ENABLED` | `false` | Enable Moleculer metrics collection |
| `TRACING_ENABLED` | `false` | Enable Moleculer tracing |
| `ASYNC_POLLER_DEBUG` | `false` | Enable verbose async job poller debug logging |
| `ASYNC_POLLER_LOG_MAX_CHARS` | `400` | Max chars for poller debug payload snippets |
| `DATASOURCE_MONGO_COLLECTION_REGISTRY` | `datasource_registry` | Collection name for datasource definitions |
| `DATASOURCE_MONGO_COLLECTION_CACHE` | `datasource_cache` | Collection name for cached datasource rows |
| `DATASOURCE_MONGO_COLLECTION_AUDIT` | `datasource_audit` | Collection name for privacy/audit records |
| `DATASOURCE_CONNECTOR_PLUGINS_DIR` | `src/connectors` | Built-in datasource connector directory |
| `DATASOURCE_CUSTOM_PLUGINS_DIR` | `custom-connectors` | Custom datasource connector directory |
| `DATASOURCE_MAX_INFER_SAMPLE_ROWS` | `200` | Max sample rows used for schema inference |
| `DATASOURCE_SCRAPER_TIMEOUT_MS` | `30000` | Timeout for scraper connector page loads |
| `DATASOURCE_DEFAULT_PRIVACY_CONTEXT` | `ai-agent` | Default privacy mode for datasource reads |
| `GRID_CONNECTION_DB_PATH` | `./.grid-connections` | PouchDB path for Netzanschluss validation reports (v0.14) |
For complete operational options (retry backoff, circuit-breaker thresholds, bulkhead queue limits), see .env.example.
## Inhouse Data Sources (v0.9)

The v0.9 datasource layer adds a second data plane next to MCP-backed public energy tools: internal utility and grid-operator data.

### Services

- `datasource-registry` – CRUD for source definitions, cache policy, Data Dictionary, dictionary version history, and schema inference drafts
- `datasource-connector` – plugin runtime for reading heterogeneous sources through built-in or custom connectors
- `datasource-cache` – privacy-aware cached row access, status inspection, refresh, invalidation, and DSGVO audit trail
- `datasource-discovery` – AI-ready inhouse source descriptors for the agent and future Logic Builder integrations

### Built-in connector plugins

- `csv` – delimited files from disk, including `.gz`
- `rest` – JSON/CSV HTTP endpoints
- `geojson` – feature flattening with centroid coordinates
- `xlsx` – spreadsheet row extraction via SheetJS
- `docx` – Word extraction scaffold (optional `mammoth` dependency)
- `scraper` – HTML/table extraction scaffold via `cheerio` or `puppeteer`

### Public REST endpoints

- `POST /api/datasources`
- `GET /api/datasources`
- `GET /api/datasources/:id`
- `PUT /api/datasources/:id`
- `DELETE /api/datasources/:id`
- `GET /api/datasources/:id/dictionary`
- `PUT /api/datasources/:id/dictionary`
- `GET /api/datasources/:id/dictionary/history`
- `GET /api/datasources/:id/dictionary/:version`
- `POST /api/datasources/:id/infer`
- `POST /api/datasources/:id/refresh`
- `GET /api/datasource-cache/:sourceId`
- `GET /api/datasource-cache/:sourceId/status`
- `GET /api/datasource-cache/:sourceId/audit`
- `POST /api/datasource-cache/:sourceId/refresh`
- `DELETE /api/datasource-cache/:sourceId`
- `GET /api/datasource-discovery`
- `GET /api/datasource-discovery/search?q=...`
- `GET /api/datasource-discovery/:sourceId/descriptor`
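To illustrate how these endpoints fit together, here is a sketch of a registration-then-refresh flow. The payload field names (`name`, `connector`, `location`, `cachePolicy`) are illustrative assumptions, not the verified request schema – consult the OpenAPI docs at `/api/docs` for the authoritative shapes.

```javascript
// Sketch only: payload fields are assumptions for illustration.
const registration = {
  method: 'POST',
  path: '/api/datasources',
  body: {
    name: 'substation-inventory',       // hypothetical source name
    connector: 'csv',                   // one of the built-in plugins
    location: './data/substations.csv', // hypothetical file path
    cachePolicy: { ttlSeconds: 3600 },  // hypothetical cache policy
  },
};

// A subsequent refresh targets the id returned by the registration call:
const refreshPath = (id) => `/api/datasources/${id}/refresh`;

console.log(registration.path, refreshPath('abc123'));
// /api/datasources /api/datasources/abc123/refresh
```

After registration, cached rows would be read via `GET /api/datasource-cache/:sourceId` and the source becomes discoverable to the agent through the `datasource-discovery` endpoints.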
### Current implementation status

- Implemented: service scaffolds, public REST exposure, OpenAPI tag grouping, in-memory cache/registry flow, connector loader, CSV/REST/GeoJSON/XLSX reads, discovery descriptors, and agent prompt integration
- Scaffolded with optional dependencies: `docx`, `scraper`
- Planned follow-up: persistent MongoDB backend, richer connector validation, and Logic Builder integration
## Moleculer Configuration

Edit `moleculer.config.js` to customise logger settings, transporter, cacher, circuit breaker, metrics, and tracing.
## Open Energy Ontology (OEO) Integration

Since v0.11.4 Cernion is annotated with machine-readable mappings to the Open Energy Ontology (v2.11.0).

| Layer | What it does |
|---|---|
| `src/oeo-mappings.js` | Static lookup (~150 entries): installation types, grid concepts, voltage levels, market types, ENTSO-E PSR codes, units. Includes German labels. |
| `x-oeo-class` in OpenAPI | Every REST endpoint carries `x-oeo-class` arrays linking to OEO class IRIs. |
| `semanticHints.oeoClasses` | Datasource discovery descriptors expose domain-level OEO annotations. |
| Classifier keyword boost | German OEO labels (e.g. "Solaranlage", "Stromnetz") enrich the heuristic scorer for German-language uploads. |
| `GET /api/datapoints/oeo-context` | JSON-LD `@context` document mapping datapoint fields to OEO IRIs. |
| `scripts/sync-oeo.js` | Validates mappings against upstream OEO releases. Run: `npm run sync:oeo`. |
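A JSON-LD `@context` of the kind served by `GET /api/datapoints/oeo-context` might look like the sketch below. The field names are hypothetical and the `OEO_XXXXX` IRI suffixes are placeholders (matching the tag convention used in this repo), not the actual mapping.

```json
{
  "@context": {
    "oeo": "http://openenergy-platform.org/ontology/oeo/",
    "capacityKw": { "@id": "oeo:OEO_XXXXX" },
    "energySource": { "@id": "oeo:OEO_XXXXX" }
  }
}
```

Consumers can merge this context into a datapoint's JSON payload to interpret its fields as OEO-annotated linked data.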
### Upstream dependency

The ontology is maintained by @OpenEnergyPlatform/ontology. All inline references are tagged with `// @OpenEnergyPlatform/ontology → OEO_XXXXX label` so that GitHub search surfaces our dependency to upstream maintainers.
## Available Scripts

| Script | Description |
|---|---|
| `npm start` | Start all services |
| `npm run dev` | Start with hot reload and REPL |
| `npm run cli` | Run CLI tool |
| `npm run create` | Create new service from template |
| `npm run lint` | Run ESLint |
| `npm run lint:fix` | Auto-fix ESLint issues |
| `npm run format` | Format code with Prettier |
| `npm test` | Run full test suite with coverage |
| `npm run test:unit` | Run unit/service tests with coverage thresholds |
| `npm run test:unit:ci` | CI-safe unit run (`--runInBand --forceExit`) |
| `npm run test:integration` | Run integration tests (`*.integration.test.js`) |
| `npm run test:e2e` | Run live end-to-end integration test (`assets.integration.test.js`) |
| `npm run test:custom` | Run custom tests (no coverage threshold) |
| `npm run test:watch` | Watch mode |
| `npm run audit:openapi` | Audit OpenAPI request/parameter quality |
| `npm run audit:security` | Run blocking dependency audit (critical severity) |
| `npm run audit:security:advisory` | Run advisory dependency audit (high+) |
| `npm run export:openapi` | Generate `openapi-export.json` with `x-ui-page` annotations |
| `npm run release:check` | Run core release gates (unit coverage, OpenAPI, critical security audit) |
| `npm run sync:oeo` | Validate/update OEO mappings from upstream release |
| `npm run sync:oemetadata` | Validate/update OEMetadata schema from upstream |
| `npm run build` | No-op passthrough for CI compatibility |
## Operational Profiles

- Local development: keep reliability toggles off (`RETRY_POLICY_ENABLED=false`, `CIRCUIT_BREAKER_ENABLED=false`, `BULKHEAD_ENABLED=false`).
- Production baseline: enable at least `CIRCUIT_BREAKER_ENABLED=true` and `BULKHEAD_ENABLED=true` after validation in staging.
- Incident debugging: temporarily enable `ASYNC_POLLER_DEBUG=true` with a conservative `ASYNC_POLLER_LOG_MAX_CHARS`.
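As a sketch, a production-baseline `.env` built from the variables documented above might look like this. The key values are placeholders, and the specific toggle choices should be validated in staging first:

```ini
# Production baseline – reliability toggles on, verbose debugging off
PORT=3000
LOG_LEVEL=info
GEMINI_API_KEY=<your-gemini-key>
CERNION_TOKEN=<your-cernion-token>
CIRCUIT_BREAKER_ENABLED=true
BULKHEAD_ENABLED=true
RETRY_POLICY_ENABLED=false
ASYNC_POLLER_DEBUG=false
```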
## Service Architecture

Each service follows this structure:

```javascript
module.exports = {
  name: 'service-name',
  settings: { /* service-specific settings */ },
  actions: {
    myAction: {
      rest: 'GET /my-action',
      params: { param1: { type: 'string' } },
      openapi: { summary: '…', tags: ['MyService'] },
      async handler(ctx) { /* … */ }
    }
  },
  events: { /* event handlers */ },
  methods: { /* internal methods */ },
  created() {}, async started() {}, async stopped() {}
};
```
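To get a feel for how such an action behaves, the handler can be exercised in isolation with a stubbed context. This is a test-style sketch, not how Moleculer invokes services in production – there the action is reached via `broker.call('service-name.myAction', { param1: '…' })` or the generated REST route.

```javascript
// A minimal service definition in the shape shown above.
const service = {
  name: 'service-name',
  actions: {
    myAction: {
      rest: 'GET /my-action',
      params: { param1: { type: 'string' } },
      // Echoes the parameter back – stands in for real business logic.
      async handler(ctx) {
        return { received: ctx.params.param1 };
      },
    },
  },
};

// Invoke the handler with a stubbed ctx, as a unit test might.
async function invoke(actionDef, params) {
  return actionDef.handler({ params });
}

invoke(service.actions.myAction, { param1: 'hello' }).then((result) => {
  console.log(result); // { received: 'hello' }
});
```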
## AI Agent

The agent service (`services/agent.service.js`) exposes four REST actions used by the Research Web App:

| Endpoint | Description |
|---|---|
| `POST /api/agent/analyze` | Generate a multi-step execution plan from a free-text query |
| `POST /api/agent/execute` | Run the plan and return results + an AI-generated summary |
| `GET /api/agent/session/:id` | Retrieve a saved session (shareable URL) |
| `GET /api/agent/session/:id/csv?…` | Re-run plan and download results as CSV |

### Parameter Extraction (RULE 5)

Every concrete value from the user's message (dates, postal codes, IDs, operator names, …) is automatically surfaced as an editable form field with the extracted value pre-filled. Structural parameters (format, limit, type, …) remain hardcoded. This makes every generated query a reusable template that can be adjusted without re-analysis.

### Robust Plan Execution

- `normalizePlan()` – normalises varying key names from the LLM (`useTool`/`args`/`label` → `action`/`params`/`description`)
- `resolveChainedRef()` – resolves `__step_N.fieldPath` references between steps, strips `{{…}}` wrappers
- `effectiveInputs` – seeds from `requiredInputs[].default`, overlaid by user-supplied values; overrides hardcoded step params for any declared `requiredInput` name
- Self-healing re-plan – if a step returns an empty result, the agent automatically retries with a re-generated plan (one attempt, guarded by a `repairAttempt` flag)
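The chained-reference convention can be illustrated with a simplified sketch. This is not the project's actual implementation; it only mirrors the described `__step_N.fieldPath` syntax and `{{…}}` wrapper stripping, and the example id is hypothetical.

```javascript
// Simplified sketch of resolving "__step_N.fieldPath" references between
// plan steps. Mirrors the convention described above, not the real code.
function resolveChainedRef(value, stepResults) {
  if (typeof value !== 'string') return value;
  // Strip optional {{…}} template wrappers first.
  const unwrapped = value.replace(/^\{\{\s*|\s*\}\}$/g, '');
  const match = unwrapped.match(/^__step_(\d+)\.(.+)$/);
  if (!match) return value; // not a chained reference – pass through
  const [, stepIndex, fieldPath] = match;
  // Walk the dotted field path into the referenced step's result.
  return fieldPath
    .split('.')
    .reduce((obj, key) => (obj == null ? undefined : obj[key]),
      stepResults[Number(stepIndex)]);
}

const stepResults = { 1: { operator: { mastrId: 'SNB123' } } };
console.log(resolveChainedRef('{{__step_1.operator.mastrId}}', stepResults));
// SNB123
```

In a plan, a later step can thus declare a parameter like `{{__step_1.operator.mastrId}}` and have it filled from an earlier step's output at execution time.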
## API Gateway

The API Gateway (`services/api.service.js`) provides:

- Base URL: `http://localhost:3000/api`
- Research Web App: `http://localhost:3000/app`
- API Docs: `http://localhost:3000/api/docs`
- OpenAPI spec: `http://localhost:3000/api/openapi.json`
## Release Checklist

- Update version in `package.json` and OpenAPI version in `services/api.service.js`
- Update `CHANGELOG.md`
- Run tests: `npm test` (must pass with coverage thresholds)
- Run lint: `npm run lint`
- Run OpenAPI audit: `npm run audit:openapi`
- Run dependency security audit: `npm run audit:security`
- Ensure `custom-services/`, `custom-tests/`, `.sessions/`, and `.env` are not committed
- Commit, tag, and push: see Release Process in copilot-instructions
## Contributing

Contributions are welcome! Please read CONTRIBUTING.md before submitting a pull request.

- Fork the repository
- Create a feature branch (`git checkout -b feat/my-feature`)
- Make your changes with tests
- Run `npm test` and `npm run lint`
- Submit a pull request
## Versioning

This project follows Semantic Versioning. See CHANGELOG.md for the full release history.

## Security

Please report security issues privately. See SECURITY.md for the responsible disclosure policy.

## Code of Conduct

Please follow our community guidelines in CODE_OF_CONDUCT.md.

## License

GPL-3.0 – see LICENSE for details.

## Support

- GitHub Issues: https://github.com/energychain/cernion-energy-tools/issues
- Cernion Token: https://cernion.de/ or email [email protected]

## Acknowledgments

- Moleculer – Microservices framework
- Google Gemini – AI plan generation
- MCP – Model Context Protocol
- Cernion – German energy data backend