✨ Refine Prompt ✨
Transform ordinary prompts into powerful, structured instructions for any LLM
Installation • Getting Started • How It Works • Features • Examples • License
📋 Overview
Refine Prompt is an intelligent prompt engineering tool that transforms ordinary prompts into powerful, structured instructions for any large language model (LLM). Using Claude's advanced capabilities, it enhances your prompts to produce exceptional results across all AI platforms.
🔑 Important: Your prompt must include the keyword "refine" to activate the enhancement process.
Example:
"refine, Create a function that calculates the factorial of a number"
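The activation requirement can be sketched as a simple case-insensitive keyword test. This is a hypothetical helper for illustration, not the server's actual implementation:

```typescript
// Hypothetical sketch of the "refine" activation check.
// The real server's logic may differ.
function hasRefineKeyword(prompt: string): boolean {
  return prompt.toLowerCase().includes("refine");
}

// hasRefineKeyword("refine, Create a factorial function") → true
// hasRefineKeyword("Create a factorial function")         → false
```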
🚀 Installation
📦 Via NPM
npm install refine-prompt
🛠️ Via Smithery
You can install this MCP server directly through Smithery by visiting: Smithery - Refine Prompt
📁 Via Local Repository
# Clone the repository
git clone https://github.com/felippefarias/refine-prompt.git
cd refine-prompt
# Install dependencies
npm install
🏁 Getting Started
🔑 API Key Setup
Refine Prompt requires an Anthropic API key to access Claude's advanced capabilities:
export ANTHROPIC_API_KEY=your_anthropic_api_key
Without an API key, the tool will display an error message.
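A startup check along these lines is plausible; the function name and error wording here are assumptions, not the server's actual code:

```typescript
// Hypothetical sketch of the startup API-key check.
// The server's actual error message may differ.
function getApiKey(env: Record<string, string | undefined>): string {
  const key = env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error(
      "ANTHROPIC_API_KEY is not set. Export it before starting the server."
    );
  }
  return key;
}
```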
⚙️ Running the Server
npm start
Or with MCP Inspector:
npx @modelcontextprotocol/inspector npm start
🔧 Tool: rewrite_prompt - The Power of Refinement
Transform your ideas into expertly crafted prompts. This powerful tool analyzes your input and generates a professionally engineered prompt that maximizes AI understanding and response quality.
📝 Parameters
| Parameter | Description | Required |
|---|---|---|
| prompt | Your raw prompt to refine (must include the "refine" keyword) | ✅ |
| language | For code-related prompts, the target programming language | ❌ |
📋 Example Usage
For general prompts
{
"name": "rewrite_prompt",
"arguments": {
"prompt": "refine, Summarize the main points of the article titled \"The Future of AI\""
}
}
For code-related prompts
{
"name": "rewrite_prompt",
"arguments": {
"prompt": "refine, Create a function to convert temperature between Celsius and Fahrenheit",
"language": "typescript"
}
}
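A client-side helper that assembles the tool-call payload shown above might look like the following. This is illustrative only; the helper and its types are not part of the package, and only the `name`, `prompt`, and `language` fields follow the parameter table above:

```typescript
interface RewritePromptArgs {
  prompt: string;    // must contain the "refine" keyword
  language?: string; // optional, for code-related prompts
}

// Builds the JSON payload for a rewrite_prompt tool call,
// enforcing the "refine" keyword requirement up front.
function buildToolCall(args: RewritePromptArgs) {
  if (!args.prompt.toLowerCase().includes("refine")) {
    throw new Error('Prompt must include the keyword "refine".');
  }
  return { name: "rewrite_prompt", arguments: args };
}
```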
🧠 How It Works
The server uses Claude 3.5 Sonnet by Anthropic to intelligently rewrite your prompts for better results. Every prompt must include the keyword "refine" to trigger the enhancement process.
It enhances your prompt by:
- 📐 Adding clear structure and context
- 📝 Specifying requirements and expectations
- 🔍 Including domain-specific considerations
- 🌐 Optimizing for any LLM understanding
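A request to Claude along these lines is plausible for the steps above. The model ID and system prompt are assumptions; the 0.2 temperature matches the feature table below:

```typescript
// Illustrative request body for Anthropic's Messages API
// (would be passed to client.messages.create from @anthropic-ai/sdk).
// The model ID and system prompt here are assumptions.
function buildRefineRequest(rawPrompt: string, language?: string) {
  const task = language
    ? `${rawPrompt}\n\nTarget programming language: ${language}`
    : rawPrompt;
  return {
    model: "claude-3-5-sonnet-latest", // assumed model ID
    max_tokens: 1024,
    temperature: 0.2, // low temperature for consistent output
    system:
      "Rewrite the user's prompt with clear structure, explicit requirements, " +
      "and domain-specific considerations so any LLM can follow it.", // assumed system prompt
    messages: [{ role: "user", content: task }],
  };
}
```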
✨ Features
| Feature | Description |
|---|---|
| 🤖 AI-Powered Refinement | Leverages Claude 3.5 Sonnet's advanced capabilities to transform your prompts |
| 🔑 Activation with Keywords | Simply include "refine" in your prompt to trigger the enhancement |
| 🌐 Universal Compatibility | Optimizes prompts for any type of task or domain |
| 💻 Code-Specific Intelligence | Provides specialized enhancements for programming tasks when language is specified |
| 🔄 Seamless Integration | Works flawlessly with any LLM-powered application or workflow |
| 🎯 Precision-Focused | Uses 0.2 temperature setting to ensure reliable, consistent output quality |
| 📊 Structural Clarity | Adds logical organization, clear instructions, and proper formatting |
⚙️ Configuration
🖥️ Usage with Claude Desktop
Add this to your claude_desktop_config.json:
NPX
{
"mcpServers": {
"refine-prompt": {
"command": "npx",
"args": [
"-y",
"refine-prompt"
]
}
}
}
Local Installation
# Clone the repository
git clone https://github.com/felippefarias/refine-prompt.git
cd refine-prompt
# Install dependencies
npm install
# Run the server
node index.js
📊 Examples
📝 General Prompt Enhancement
| Input | Arguments |
|---|---|
| Summarize the main points of the article titled "The Future of AI" | `{ "prompt": "refine, Summarize the main points of the article titled \"The Future of AI\"" }` |
💻 Code-Related Prompt Enhancement
| Input | Arguments |
|---|---|
| Create a function to convert temperature between Celsius and Fahrenheit | `{ "prompt": "refine, Create a function to convert temperature between Celsius and Fahrenheit", "language": "typescript" }` |
The tool will rewrite both prompts to be more structured and detailed for optimal results with any LLM.
🚀 Elevate Your AI Interactions
Refine Prompt bridges the gap between human thinking and AI understanding. By transforming your natural language instructions into expertly engineered prompts, it helps you unlock the full potential of any language model. Whether you're a developer, content creator, researcher, or AI enthusiast, Refine Prompt gives you the power to communicate with AI more effectively.
📄 License
Refine Prompt is licensed under the MIT License. You are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.