# Loop MCP Server

An MCP (Model Context Protocol) server that enables LLMs to process arrays item by item with a specific task.
## Overview

This MCP server provides tools for:
- Initializing an array with a task description
- Fetching items one by one or in batches for processing
- Storing results for each processed item or batch
- Retrieving all results (only after all items are processed)
- Optional result summarization
- Configurable batch size for efficient processing
## Installation

```bash
npm install
```
## Usage

### Running the Server

```bash
npm start
```
### Available Tools

1. **initialize_array** - Set up the array and task
   - `array`: The array of items to process
   - `task`: Description of what to do with each item
   - `batchSize` (optional): Number of items to process in each batch (default: 1)
2. **get_next_item** - Get the next item to process
   - Returns: the current item, its index, the task, and the remaining count
3. **get_next_batch** - Get the next batch of items, sized by `batchSize`
   - Returns: an array of items, their indices, the task, and the remaining count
4. **store_result** - Store the result of processing
   - `result`: The processing result (a single value, or an array when batch processing)
5. **get_all_results** - Get all results after completion
   - `summarize` (optional): Include a summary of the results
   - Note: this call errors if processing is not yet complete
6. **reset** - Clear the current processing state
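Conceptually, these tools manipulate a small piece of server-side state: a cursor over the array plus an accumulator of results. The sketch below illustrates that state machine; the class and method names are hypothetical and do not reflect the actual `server.js` internals.

```javascript
// Illustrative sketch of the state behind the tools above (not the real server.js).
class LoopState {
  constructor() {
    this.reset();
  }

  // reset: clear all processing state
  reset() {
    this.array = [];
    this.task = '';
    this.batchSize = 1;
    this.index = 0;      // cursor into the array
    this.results = [];   // accumulated results
  }

  // initialize_array: store the items, the task, and an optional batch size
  initialize(array, task, batchSize = 1) {
    this.reset();
    this.array = array;
    this.task = task;
    this.batchSize = batchSize;
  }

  // get_next_batch: advance the cursor by up to batchSize items
  getNextBatch() {
    if (this.index >= this.array.length) return null; // all items processed
    const items = this.array.slice(this.index, this.index + this.batchSize);
    this.index += items.length;
    return { items, task: this.task, remaining: this.array.length - this.index };
  }

  // store_result: accept a single value or an array of batch results
  storeResult(result) {
    this.results.push(...(Array.isArray(result) ? result : [result]));
  }

  // get_all_results: only valid once every item has a stored result
  getAllResults() {
    if (this.index < this.array.length || this.results.length < this.array.length) {
      throw new Error('Processing is not complete');
    }
    return this.results;
  }
}
```

The guard in `getAllResults` mirrors the documented behavior of `get_all_results`: requesting results before every item has been processed is an error rather than a partial answer.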
## Example Workflows

### Single Item Processing

```javascript
// 1. Initialize
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5],
  task: 'Square each number'
});

// 2. Process each item
while (true) {
  const item = await callTool('get_next_item');
  if (item.text === 'All items have been processed.') break;

  // Process the item (e.g., square it)
  const result = item.value * item.value;
  await callTool('store_result', { result });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
```
### Batch Processing

```javascript
// 1. Initialize with a batch size
await callTool('initialize_array', {
  array: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
  task: 'Double each number',
  batchSize: 3
});

// 2. Process in batches
while (true) {
  const batch = await callTool('get_next_batch');
  if (batch.text === 'All items have been processed.') break;

  // Process the batch
  const results = batch.items.map(item => item * 2);
  await callTool('store_result', { result: results });
}

// 3. Get final results
const results = await callTool('get_all_results', { summarize: true });
```
### Running the Example

```bash
node example-client.js
```
## Integration with Claude Desktop

Add the server to your Claude Desktop configuration:

```json
{
  "mcpServers": {
    "loop-processor": {
      "command": "node",
      "args": ["/path/to/loop_mcp/server.js"]
    }
  }
}
```