Loop MCP Server
An MCP (Model Context Protocol) server that enables LLMs to process arrays item by item with a specific task.
Overview
This MCP server provides tools for:
- Initializing an array with a task description
- Fetching items one by one or in batches for processing
- Storing results for each processed item or batch
- Retrieving all results (only after all items are processed)
- Optional result summarization
- Configurable batch size for efficient processing
Installation
npm install
Usage
Running the Server
npm start
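The server is intended to be launched by an MCP client, such as the bundled example-client.js or Claude Desktop (configured below); given the command/args launch configuration shown later, it presumably communicates over stdio rather than being used interactively.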
Available Tools
- initialize_array - Set up the array and task
  - array: The array of items to process
  - task: Description of what to do with each item
  - batchSize (optional): Number of items to process in each batch (default: 1)
- get_next_item - Get the next item to process
  - Returns: Current item, index, task, and remaining count
- get_next_batch - Get the next batch of items based on batch size
  - Returns: Array of items, indices, task, and remaining count
- store_result - Store the result of processing
  - result: The processing result (single value or array for batch processing)
- get_all_results - Get all results after completion
  - summarize (optional): Include a summary
  - Note: This will error if processing is not complete
- reset - Clear the current processing state
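The workflow examples below drive these tools through a small callTool helper, similar to the one example-client.js provides. The sketch that follows is a minimal, hypothetical version assuming the @modelcontextprotocol/sdk client API and a stdio connection to server.js; the real helper may additionally parse structured values (such as item.value and batch.items used in the examples) out of the tool response.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Connect to the loop server over stdio (assumed launch command).
const client = new Client({ name: 'loop-example-client', version: '1.0.0' }, { capabilities: {} });
await client.connect(new StdioClientTransport({ command: 'node', args: ['server.js'] }));

// Invoke a tool by name and return its first content block,
// which exposes the `text` field checked in the loops below.
async function callTool(name, args = {}) {
  const result = await client.callTool({ name, arguments: args });
  return result.content[0];
}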
Example Workflows
Single Item Processing
// 1. Initialize
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5],
  task: 'Square each number'
});

// 2. Process each item
while (true) {
  const item = await callTool('get_next_item');
  if (item.text === 'All items have been processed.') break;

  // Process the item (e.g., square it)
  const result = item.value * item.value;
  await callTool('store_result', { result });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
Batch Processing
// 1. Initialize with batch size
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
  task: 'Double each number',
  batchSize: 3
});

// 2. Process in batches
while (true) {
  const batch = await callTool('get_next_batch');
  if (batch.text === 'All items have been processed.') break;

  // Process the batch
  const results = batch.items.map(item => item * 2);
  await callTool('store_result', { result: results });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
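Batch processing reduces the number of tool-call round trips for large arrays. When storing a batch result, pass an array of results that corresponds (presumably in the same order) to the items returned by get_next_batch, as shown above.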
Running the Example
node example-client.js
Integration with Claude Desktop
Add to your Claude Desktop configuration:
{
  "mcpServers": {
    "loop-processor": {
      "command": "node",
      "args": ["/path/to/loop_mcp/server.js"]
    }
  }
}
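After saving the configuration, restart Claude Desktop so it launches the loop-processor server; its tools will then be available in conversations.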