Automatically integrate the Divvi referral SDK into JavaScript and TypeScript blockchain applications.
A Model Context Protocol (MCP) server that provides AI assistants with tools for working with the Divvi ecosystem. Currently focuses on automatic integration of the @divvi/referral-sdk into JavaScript/TypeScript blockchain applications, with more Divvi-related functionality planned for future releases.
Integrating referral tracking into blockchain applications typically requires understanding SDK documentation, modifying transaction flows, and maintaining integration code as protocols evolve. This MCP server eliminates that complexity by turning any AI assistant into a Divvi integration expert that can automatically implement referral tracking following current best practices.
Compatible with any JavaScript/TypeScript blockchain application, including:
The server currently provides an MCP tool (integrate_divvi_referral_sdk) that:
⚠️ Prerequisites: Before using this tool, you need to install and configure the MCP server. See the Installation section below.
The easiest way to get started is to simply ask your AI assistant:
"Integrate this dapp with Divvi"
Your AI assistant will guide you through providing the necessary configuration (your Divvi dapp address and campaign addresses) and handle the entire integration process.
For more specific control, you can provide the exact parameters:
"Integrate my dapp with Divvi using consumer address 0x1234..."
The AI assistant will:
To complete the integration, you'll need:
// The AI assistant will call this internally
integrate_divvi_referral_sdk({
consumerAddress: '0x1234567890123456789012345678901234567890',
})
Technical Note: This integration is implemented as an MCP "tool" rather than a "prompt" type. While the MCP specification includes a prompt type that would be more semantically appropriate for this use case, we chose the tool implementation for broader compatibility, as Cursor doesn't currently support MCP prompts.
The easiest way to use the Divvi MCP server is via npm:
Add to your claude_desktop_config.json:
{
"mcpServers": {
"divvi-mcp": {
"command": "npx",
"args": ["-y", "@divvi/mcp-server"],
"env": {}
}
}
}
Cursor supports MCP servers through its settings. Add the server configuration:

- Name: divvi-mcp
- Command: npx
- Args: ["-y", "@divvi/mcp-server"]
Alternatively, if Cursor uses a configuration file, add:
{
"mcpServers": {
"divvi-mcp": {
"command": "npx",
"args": ["-y", "@divvi/mcp-server"]
}
}
}
Configure according to your MCP client's documentation, using:

- Command: npx
- Args: ["-y", "@divvi/mcp-server"]
For development or if you prefer building from source:
Clone the repository:
git clone https://github.com/divvi-xyz/divvi-mcp.git
cd divvi-mcp
Install dependencies:
yarn install
Build the server:
yarn build
Configure your AI assistant to point to the local build:
Use node as the command and ["/path/to/divvi-mcp/dist/index.js"] as the args in your MCP client configuration.
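For example, a Claude Desktop entry pointing at the local build might look like the sketch below (adjust the path to wherever you cloned the repository):

```json
{
  "mcpServers": {
    "divvi-mcp": {
      "command": "node",
      "args": ["/path/to/divvi-mcp/dist/index.js"]
    }
  }
}
```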
The tool instructs AI agents to read the official SDK documentation from:
https://raw.githubusercontent.com/divvi-xyz/divvi-referral-sdk/refs/heads/main/README.md
This ensures the integration always follows the latest patterns and examples.
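For illustration only (this step happens inside the AI agent, not in your dapp), pulling that document is a single request:

```typescript
// Illustrative sketch: how an agent could fetch the latest SDK docs at integration time.
const DOCS_URL =
  'https://raw.githubusercontent.com/divvi-xyz/divvi-referral-sdk/refs/heads/main/README.md'

async function fetchSdkDocs(): Promise<string> {
  const response = await fetch(DOCS_URL)
  if (!response.ok) {
    throw new Error(`Failed to fetch SDK docs: ${response.status}`)
  }
  return response.text()
}
```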
AI agents analyze the target project to understand:
Following the official documentation, AI agents implement:
The integration ensures:
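To give a sense of the end result, here is a rough sketch of what a completed integration tends to look like in a viem-style dapp. The exported function names (getReferralTag, submitReferral) and their parameters are assumptions based on the SDK's documented flow; treat the README linked above as the source of truth.

```typescript
// Hedged sketch only: function names and signatures are assumptions,
// verify them against the official @divvi/referral-sdk README linked above.
import { getReferralTag, submitReferral } from '@divvi/referral-sdk'

async function sendTrackedTransaction(
  walletClient: any,        // e.g. a viem WalletClient (assumption)
  account: `0x${string}`,   // the user's address
  to: `0x${string}`,        // your contract
  calldata: `0x${string}`,  // the original transaction data
) {
  // 1. Build the referral tag linking this user to your registered consumer address.
  const referralTag = getReferralTag({
    user: account,
    consumer: '0x1234567890123456789012345678901234567890', // your Divvi dapp address
  })

  // 2. Append the tag to the calldata before sending the transaction.
  const txHash = await walletClient.sendTransaction({
    account,
    to,
    data: (calldata + referralTag.replace(/^0x/, '')) as `0x${string}`,
  })

  // 3. Report the transaction hash to Divvi so the referral is attributed.
  await submitReferral({ txHash, chainId: await walletClient.getChainId() })

  return txHash
}
```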
# Start in development mode
yarn dev
# Run tests
yarn test
# Type checking
yarn typecheck
# Linting
yarn lint
# Format code
yarn format
src/
├── index.ts        # Main MCP server implementation
├── index.test.ts   # Unit tests
└── ...
scripts/            # Build and utility scripts
dist/               # Compiled output
git checkout -b feature/my-feature
yarn test
integrate_divvi_referral_sdk
Provides instructions for integrating the @divvi/referral-sdk into a project.
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| consumerAddress | string | Yes | Your Divvi dapp wallet address (builder registration) |
Comprehensive integration instructions that guide AI agents through:
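For reference, an MCP client invokes the tool with a standard tools/call request. The wire format is handled by your client; the shape below follows the MCP specification and is shown only for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "integrate_divvi_referral_sdk",
    "arguments": {
      "consumerAddress": "0x1234567890123456789012345678901234567890"
    }
  }
}
```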
Additional tools for the Divvi ecosystem are planned for future releases. These may include:
Stay tuned for updates as we expand the server's capabilities!
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Built with ❤️ for the Divvi ecosystem