ProjectFlow
A workflow management system for AI-assisted development, similar to Jira or Azure DevOps. It supports both API-driven interactions and the Model Context Protocol for seamless AI agent integration, with flexible storage via the file system (JSON) or PostgreSQL.
Features
- Hierarchical task management (Epics, Stories, Subtasks)
- NEW: Natural Language Chat Interface - interact with ProjectFlow using conversational commands
- REST API for programmatic access
- Model Context Protocol (MCP) support for AI agents
- Web interface for human users
- Flexible storage: File system (JSON) or PostgreSQL database
- Clean, modern UI with accessibility features
- Containerized deployment
Tech Stack
- Backend: Go 1.24
- Storage: File system (JSON) or PostgreSQL database
- Frontend: HTML templates, CSS, JavaScript
- Containerization: Docker/Podman
- Protocols: HTTP REST API + Model Context Protocol
Quick Start
Prerequisites
- Go 1.24 or later
- Docker/Podman (for containerized deployment)
Running Locally
1. Clone the repository:

   git clone https://github.com/aykay76/projectflow.git
   cd projectflow

2. Run the application:

   go run cmd/server/main.go

3. Open your browser and navigate to http://localhost:16191
💬 Natural Language Chat Interface
ProjectFlow now features an AI-powered chat interface that allows you to manage tasks and projects using natural language commands. Simply click the chat button (💬) in the header or use the keyboard shortcut ⌘+/ (Mac) or Ctrl+/ (Windows/Linux) to get started.
Quick Examples
Create a high priority task to fix the login bug
List all tasks in the PF project
Mark task PF-123 as done
Show me overdue tasks
Create a new project called "Website Redesign"
Getting Started with Chat
- Open the chat interface: Click the 💬 button in the header or press ⌘+/ (Ctrl+/ on Windows/Linux)
- Type your request: Use natural language to describe what you want to do
- Get instant results: The AI will interpret your request and perform the action
For detailed chat commands and examples, see the Chat Interface Guide.
LLM Configuration
The chat interface supports multiple LLM providers:
- Ollama: Use local LLM models for privacy and offline capability
- OpenAI GPT: Use OpenAI's GPT models for natural language understanding
- Groq: Fast cloud-based LLM inference
- Anthropic Claude: Leverage Anthropic's Claude AI for conversational interactions
Quick Setup with Ollama (Local LLM)
For the fastest setup with privacy and no API costs:
# Install Ollama
brew install ollama # macOS
# or curl -fsSL https://ollama.com/install.sh | sh # Linux
# Start Ollama and install a model
ollama serve &
ollama pull llama3.2
# Configure ProjectFlow
export LLM_PROVIDER=ollama
export LLM_OLLAMA_MODEL=llama3.2
./projectflow
See the Ollama Quick Start Guide for detailed setup instructions.
Cloud LLM Providers
For cloud-based LLMs, configure your API key:
# For OpenAI
export LLM_PROVIDER=openai
export LLM_API_KEY=your-openai-key
export LLM_MODEL=gpt-4
# For Groq
export LLM_PROVIDER=groq
export LLM_API_KEY=your-groq-key
export LLM_MODEL=llama-3.1-8b-instant
Environment Variables
Server Configuration:
- `PORT`: Server port (default: 16191)
- `SHUTDOWN_TIMEOUT`: Graceful shutdown timeout in seconds (default: 30)
- `LOG_LEVEL`: Logging level - DEBUG, INFO, WARN, ERROR (default: INFO)
- `LOG_FORMAT`: Log format - json or text (default: text)
Storage Configuration:
- `STORAGE_TYPE`: Storage backend - file or postgres (default: file)
File Storage:
- `DATA_DIR`: Directory for data storage (default: ./data)
PostgreSQL Storage:
- `DB_HOST`: Database host (default: localhost)
- `DB_PORT`: Database port (default: 5432)
- `DB_NAME`: Database name (default: projectflow)
- `DB_USER`: Database user (default: projectflow)
- `DB_PASSWORD`: Database password (required for postgres)
- `DB_SSL_MODE`: SSL mode - disable, require, verify-ca, verify-full, prefer, allow (default: prefer)
LLM Configuration (for Chat Interface):
- `LLM_PROVIDER`: LLM provider - ollama, groq, openai, disabled (default: disabled)
- `LLM_API_KEY`: API key for cloud LLM providers (required for groq, openai)
- `LLM_BASE_URL`: Custom base URL for the LLM provider (optional)
- `LLM_MODEL`: Model name to use (default varies by provider)
- `LLM_TIMEOUT`: Request timeout in seconds (default: 60)
- `LLM_MAX_TOKENS`: Maximum tokens per response (default: 1000)
Ollama-specific (for local LLM):
- `LLM_OLLAMA_HOST`: Ollama server URL (default: http://localhost:11434)
- `LLM_OLLAMA_MODEL`: Ollama model name (default: llama3.2)
For detailed PostgreSQL setup, see PostgreSQL Storage Documentation.
Using Docker
1. Build the image:

   podman build -t projectflow .

2. Run the container:

   podman run -p 16191:16191 -v $(pwd)/data:/app/data projectflow
API Documentation
Chat API
- `POST /api/chat` - Send a natural language message to the chat interface
- `GET /api/chat/history` - Retrieve conversation history
LLM API
- `GET /api/llm/info` - Get LLM provider information and status
- `GET /api/llm/health` - Check LLM provider health
- `POST /api/llm/chat` - Send direct messages to the LLM (bypasses ProjectFlow translation)
Chat Request/Response
Send Message:
POST /api/chat
{
"message": "Create a high priority task to fix the login bug",
"conversation_id": "optional-uuid"
}
Response:
{
"response": "I've created task PF-123: 'Fix login bug' with high priority.",
"actions_taken": ["create_task"],
"task_ids": ["PF-123"],
"conversation_id": "uuid",
"confidence": 0.95,
"intent": "create_task"
}
Get History:
GET /api/chat/history?conversation_id=uuid
{
"id": "uuid",
"messages": [
{
"id": "msg-uuid",
"role": "user",
"content": "Create a task...",
"timestamp": "2025-06-22T15:17:44.334579Z"
}
],
"created": "2025-06-22T15:17:44.334574Z",
"updated": "2025-06-22T15:17:44.334574Z"
}
LLM API Examples
Get LLM Info:
GET /api/llm/info
{
"enabled": true,
"provider": "ollama",
"model": "llama3.2",
"status": "healthy",
"timestamp": "2025-06-23T08:56:47.927Z",
"metadata": {
"host": "http://localhost:11434",
"version": "0.1.17"
}
}
Check LLM Health:
GET /api/llm/health
{
"healthy": true,
"status": "healthy",
"provider": "ollama",
"timestamp": "2025-06-23T08:56:47.927Z",
"duration_ms": 45,
"suggestions": []
}
Direct LLM Chat:
POST /api/llm/chat
{
"messages": [
{"role": "user", "content": "Hello!"}
],
"max_tokens": 1000,
"temperature": 0.7
}
Response:
{
"response": {
"choices": [
{
"message": {
"role": "assistant",
"content": "Hello! How can I help you today?"
},
"finish_reason": "stop"
}
]
},
"provider": "ollama",
"model": "llama3.2"
}
Tasks API
- `GET /api/tasks` - List all tasks
- `POST /api/tasks` - Create a new task
- `GET /api/tasks/{id}` - Get task by ID
- `PUT /api/tasks/{id}` - Update task
- `DELETE /api/tasks/{id}` - Delete task
- `GET /api/hierarchy` - Get tasks in hierarchical structure
Task Structure
{
"id": "string",
"title": "string",
"description": "string",
"status": "string",
"priority": "string",
"parent_id": "string",
"children": ["string"],
"created_at": "timestamp",
"updated_at": "timestamp"
}
Hierarchy Structure
The `/api/hierarchy` endpoint returns tasks in a nested structure:
[
{
"task": {
"id": "string",
"title": "string",
"description": "string",
"status": "string",
"priority": "string",
"type": "string",
"parent_id": "string",
"children": ["string"],
"created_at": "timestamp",
"updated_at": "timestamp"
},
"child_tasks": [
{
"task": { /* nested task */ },
"child_tasks": [ /* recursively nested */ ]
}
]
}
]
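The recursive shape above maps naturally onto a self-referential Go type. This is a sketch inferred from the example response, not the server's actual definition:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Task is a minimal subset of the task fields, enough for this sketch.
type Task struct {
	ID    string `json:"id"`
	Title string `json:"title"`
}

// HierarchyNode mirrors one element of the /api/hierarchy response:
// a task plus its recursively nested children.
type HierarchyNode struct {
	Task       Task            `json:"task"`
	ChildTasks []HierarchyNode `json:"child_tasks"`
}

// countTasks walks the tree and counts every task it contains.
func countTasks(nodes []HierarchyNode) int {
	n := 0
	for _, node := range nodes {
		n += 1 + countTasks(node.ChildTasks)
	}
	return n
}

func main() {
	raw := `[{"task":{"id":"epic-1","title":"Epic"},"child_tasks":[{"task":{"id":"story-1","title":"Story"},"child_tasks":[]}]}]`
	var tree []HierarchyNode
	if err := json.Unmarshal([]byte(raw), &tree); err != nil {
		panic(err)
	}
	fmt.Println(countTasks(tree)) // 2 tasks: the epic and its story
}
```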
Development
Project Structure
├── cmd/server/       # Application entry point
├── internal/
│   ├── handlers/     # HTTP handlers
│   ├── models/       # Data models
│   └── storage/      # Storage implementations
├── pkg/api/          # Public API definitions
├── web/
│   ├── templates/    # HTML templates
│   └── static/       # CSS, JS, images
├── data/             # Local data storage
└── Dockerfile        # Container definition
Running Tests
go test ./...
Building
go build -o bin/projectflow cmd/server/main.go
Model Context Protocol (MCP)
ProjectFlow includes a Model Context Protocol (MCP) server that enables AI agents to interact with tasks programmatically. This allows AI assistants to create, read, update, and delete tasks as part of their workflow.
MCP Server Setup
1. Start the MCP server:

   go run cmd/mcp-server/main.go

   The MCP server runs on port 3001 by default.

2. Configure your MCP client: Use the provided `mcp-config.json` file or configure manually:

   {
     "mcpServers": {
       "projectflow": {
         "command": "go",
         "args": ["run", "cmd/mcp-server/main.go"],
         "cwd": "/path/to/projectflow"
       }
     }
   }
Available MCP Tools
The MCP server provides these tools for task management:
- `list_tasks` - List all tasks with optional filtering
- `create_task` - Create a new task
- `get_task` - Get a specific task by ID
- `update_task` - Update an existing task
- `delete_task` - Delete a task
- `get_task_hierarchy` - Get tasks in hierarchical structure
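Over the wire, MCP tools are invoked with JSON-RPC 2.0 `tools/call` requests. A `create_task` invocation might look roughly like this; the argument names are illustrative, not taken from the server's published schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_task",
    "arguments": {
      "title": "Fix login bug",
      "priority": "high"
    }
  }
}
```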
Available MCP Resources
The MCP server exposes these resources:
- `tasks://all` - List of all tasks
- `tasks://hierarchy` - Hierarchical task structure
- `tasks://summary` - Project summary with statistics
Example Usage
# Start both servers
go run cmd/server/main.go & # HTTP server on :16191
go run cmd/mcp-server/main.go & # MCP server on :3001
# Use with MCP-compatible AI clients
# The AI can now create, manage, and query tasks programmatically
Integration with AI Agents
AI agents can use the MCP interface to:
- Create and manage development tasks
- Track project progress
- Generate reports and summaries
- Automate workflow processes
- Integrate with other development tools
For detailed MCP documentation, see docs/mcp.md.
Project Integration with VS Code
ProjectFlow can be seamlessly integrated into your VS Code projects, allowing you to store and manage tasks alongside your code in Git. This enables powerful AI-assisted development workflows where coding agents can create, update, and track development tasks directly within your project context.
Setup .vscode/mcp.json
Add a `.vscode/mcp.json` file to your project root to configure ProjectFlow as an MCP server:
{
"mcpServers": {
"projectflow": {
"command": "go",
"args": ["run", "cmd/mcp-server/main.go"],
"cwd": "/path/to/projectflow",
"env": {
"STORAGE_DIR": "./.projectflow/data"
}
}
}
}
Project-Specific Task Storage
When integrated with your project, ProjectFlow will store tasks in a `.projectflow/data/` directory within your project:
your-project/
├── .vscode/
│   └── mcp.json             # MCP configuration
├── .projectflow/
│   └── data/
│       └── tasks/           # Project-specific tasks
│           ├── epic-1.json  # Your development epics
│           ├── story-1.json # User stories
│           └── task-1.json  # Development tasks
├── src/                     # Your application code
├── README.md
└── .gitignore
Benefits of Project Integration
- Unified Version Control: Tasks are versioned alongside your code
- Context-Aware AI: Coding agents understand both code and task context
- Team Collaboration: Shared task management through Git
- Branch-Specific Tasks: Different branches can have different task states
- Automated Workflows: AI agents can create tasks from code analysis
Example Workflow
1. Initialize ProjectFlow in your project:

   mkdir -p .projectflow/data/projects
   echo ".projectflow/data/projects/*/*.json" >> .gitignore # Optional: exclude project and task files

2. Configure VS Code MCP:

   {
     "mcpServers": {
       "projectflow": {
         "command": "go",
         "args": ["run", "/path/to/projectflow/cmd/mcp-server/main.go"],
         "env": {
           "STORAGE_DIR": "./.projectflow/data"
         }
       }
     }
   }

3. Use with AI Coding Agents:
- AI agents can create tasks based on code analysis
- Track development progress alongside code changes
- Generate tasks from TODO comments in code
- Link tasks to specific commits or pull requests
Integration with Development Workflow
The ProjectFlow MCP integration enables powerful development workflows:
- Automated Task Creation: AI agents analyze code and create relevant tasks
- Progress Tracking: Link tasks to commits and pull requests
- Code Review Tasks: Generate review tasks for specific code changes
- Bug Tracking: Create and track bugs directly from code analysis
- Feature Planning: Plan features as hierarchical tasks (Epic → Story → Task)
Frontend Access
While the primary interface is through MCP and AI agents, you can still access the web frontend:
1. Start the ProjectFlow server pointing to your project's data:

   STORAGE_DIR=./.projectflow/data go run /path/to/projectflow/cmd/server/main.go

2. Open http://localhost:16191 to view and manage tasks in the web interface
Git Integration Best Practices
- Commit task changes: Include task updates in your commits
- Branch-specific tasks: Use different task states per branch
- Team synchronization: Pull task updates when syncing with team
- Task cleanup: Archive completed tasks periodically
Documentation
User Documentation
- User Guide - Comprehensive guide for end users
- Chat Interface Guide - Natural language commands and examples
- FAQ - Frequently asked questions and answers
Administrator Documentation
- Deployment Guide - Production deployment options
- LLM Setup Guide - Configure AI providers
- PostgreSQL Storage - Database setup and configuration
- Troubleshooting Guide - Common issues and solutions
Developer Documentation
- Developer Guide - Extending and customizing ProjectFlow
- MCP Documentation - Model Context Protocol integration
- Configuration Guide - Environment variables and settings
- In-App Help System - Frontend help implementation
Contributing
- Fork the repository
- Create a feature branch
- Make your changes with proper tests
- Submit a pull request
See our Developer Guide for detailed contribution guidelines.
License
MIT License - see LICENSE file for details.