A Model Context Protocol server for the QA Sphere test management system.
This integration enables Large Language Models (LLMs) to interact directly with QA Sphere test cases, allowing you to discover, summarize, and chat about test cases. In AI-powered IDEs that support MCP, you can reference specific QA Sphere test cases within your development workflow.
You will need your QA Sphere tenant URL (e.g. example.eu2.qasphere.com) and an API key.
This server is compatible with any MCP client. Configuration instructions for popular clients are provided below.
Claude Desktop
Open Claude → Settings → Developer → Edit Config, which opens claude_desktop_config.json. Add the server under the mcpServers key using the configuration format shown below.
Cursor
Open Cursor Settings → Add new global MCP server, then register a server named qasphere with the command npx -y qasphere-mcp.
For any MCP client, use the following configuration format:
{
  "mcpServers": {
    "qasphere": {
      "command": "npx",
      "args": ["-y", "qasphere-mcp"],
      "env": {
        "QASPHERE_TENANT_URL": "your-company.region.qasphere.com",
        "QASPHERE_API_KEY": "your-api-key"
      }
    }
  }
}
Replace the placeholder values with your actual QA Sphere URL and API key.
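Note that QASPHERE_TENANT_URL is a bare hostname rather than a full URL. As a quick sanity check before wiring up a client, a small sketch (a hypothetical helper, not part of qasphere-mcp) that matches the tenant-URL pattern used in the examples above:

```python
import re

# Hypothetical sanity check: the tenant URL placeholders above follow a
# "<company>.<region>.qasphere.com" pattern (e.g. example.eu2.qasphere.com),
# with no scheme and no trailing slash.
TENANT_RE = re.compile(r"^[\w-]+\.[\w-]+\.qasphere\.com$")

def looks_like_tenant_url(url: str) -> bool:
    return TENANT_RE.match(url) is not None

print(looks_like_tenant_url("example.eu2.qasphere.com"))      # True
print(looks_like_tenant_url("https://example.qasphere.com"))  # False: scheme included
```

If the check fails, strip any https:// prefix and trailing path before setting the environment variable.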
This project is licensed under the MIT License - see the LICENSE file for details.
If you encounter any issues or need assistance, please file an issue on the GitHub repository.