SimpleChatJS
A lightweight, no-frills AI chat application with MCP support, built with pure JavaScript and Node.js. Designed for developers who value a simple architecture and direct transparency into their chat interface.
Philosophy
SimpleChatJS embraces a back-to-basics approach:
- Pure JavaScript Frontend - No React, Vue, or complex frameworks. Just vanilla JS, HTML, and CSS that any developer can understand and modify.
- Clean Architecture - Well-organized backend with clear separation of concerns. Easy to extend and maintain.
- OpenAI Compatible - Works with any OpenAI-compatible API including Ollama, providing flexibility in your AI provider choice.
- MCP Integration - Built-in support for Model Context Protocol, enabling powerful tool integrations.
- 1998-Style Simplicity - Edit files, refresh browser. No build tools, no compilation step, no complex deployment pipeline.
Features
Core Chat Functionality
- Multiple persistent chat sessions with SQLite storage
- Real-time streaming responses
- Clean, dark-mode interface
- Message history and chat management
Model Context Protocol (MCP) Support
- Connect to MCP servers for enhanced AI capabilities
- Tool execution with real-time feedback
- Configurable tool enabling/disabling
- Server-sent events for live tool status updates
Developer-Friendly
- No Build Tools - Direct file editing with immediate browser refresh
- Clear Code Organization - Frontend and backend properly separated into logical modules
- Comprehensive Logging - Built-in debug panels and structured logging
- Simple Deployment - Single command startup with included scripts
Advanced Features
- Conductor Mode - Multi-phase AI reasoning (experimental feature with ongoing improvements)
- Debug Data Separation - Technical debugging information separate from chat content
- Flexible API Configuration - Easy switching between different AI providers
Quick Start
Prerequisites
You'll need an AI API server running before starting SimpleChatJS. This could be:
- Ollama running locally (ollama serve)
- OpenAI API with your API key
- Any OpenAI-compatible API (LM Studio, vLLM, etc.)
Requirements not included (install separately)
- Node.js
Simple Setup
Just run start.bat (Windows) or start.sh (Mac/Linux):
- All Node dependencies will be installed automatically
- The server will start; connect to http://localhost:50505 in a browser
DIY Installation
- Clone or download this repository
- Install dependencies: npm install
- Start the application:
  - Windows: start.bat
  - Mac/Linux: ./start.sh
  - Or using npm: npm start
- Open your browser to http://localhost:50505
- Configure your API settings in the Settings panel:
  - Set your API URL (e.g., http://localhost:11434/v1 for Ollama)
  - Add your API key if required
  - Select your model
MCP Setup (Optional)
To enable tool integrations:
- Click the "MCP Config" button in the interface
- Configure your MCP servers in the JSON editor
- Click "Connect MCP Servers"
- Enable specific tools in the Settings panel
Example MCP configuration:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./"],
      "env": {}
    }
  }
}
Architecture
Frontend Structure
src/js/
├── app/ # Core application logic
├── chat/ # Chat functionality and conductor mode
├── tools/ # Tool handling and MCP integration
├── render/ # Message rendering and streaming
└── ui/ # User interface components
Backend Structure
backend/
├── server.js # Main entry point
├── config/ # Database and configuration
├── routes/ # API endpoints
├── services/ # Business logic
└── utils/ # Shared utilities
Data Flow
- Frontend sends requests to REST API endpoints
- Chat Service processes requests and streams responses
- MCP Service handles tool execution when needed
- Tool Events provide real-time updates via Server-Sent Events
- Database persists chat history and settings
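As a rough illustration of the tool-event step, a vanilla-JS client can subscribe to Server-Sent Events with the browser's built-in EventSource. The endpoint path and payload shape below are assumptions for illustration, not SimpleChatJS's actual route names.

// Illustrative only - '/api/tool-events' and the payload fields are assumptions.
const events = new EventSource('/api/tool-events');

events.onmessage = (e) => {
    const update = JSON.parse(e.data);   // e.g. { tool: 'filesystem', status: 'running' }
    console.log(`Tool ${update.tool}: ${update.status}`);
};

events.onerror = () => {
    console.warn('Tool event stream interrupted; the browser retries automatically.');
};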
API Compatibility
SimpleChatJS works with any API that follows OpenAI's chat completions format:
- Ollama - Local AI models
- OpenAI API - GPT models with API key
- LM Studio - Local API server
- vLLM - High-performance inference server
- Anthropic Claude - Via compatible proxies
- Custom APIs - Any service implementing the OpenAI format
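Because SimpleChatJS streams responses, the one capability every provider above needs is the stream: true variant of the chat completions call, where the reply arrives as data: chunks carrying incremental deltas. A rough sketch of reading that stream (Node 18+; the URL and model name are assumptions for a local Ollama setup):

// Rough sketch of a streaming chat completions call - not SimpleChatJS's own code.
// URL and model name are assumptions for a local Ollama setup; adjust for your provider.
async function streamChat() {
    const res = await fetch('http://localhost:11434/v1/chat/completions', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            model: 'llama3.2',   // illustrative model name
            stream: true,
            messages: [{ role: 'user', content: 'Count to three.' }]
        })
    });

    // The body arrives as "data: {...}" lines, each carrying a content delta.
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';
    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });
        const lines = buffer.split('\n');
        buffer = lines.pop();   // keep any partial line for the next chunk
        for (const line of lines) {
            if (!line.startsWith('data: ') || line.includes('[DONE]')) continue;
            const delta = JSON.parse(line.slice(6)).choices[0].delta;
            if (delta.content) process.stdout.write(delta.content);
        }
    }
}

streamChat().catch(console.error);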
Development
Making Changes
- Frontend Changes - Edit files in src/js/, refresh browser
- Backend Changes - Edit files in backend/, restart server
- Styling - Edit src/css/style.css, refresh browser
Adding Features
- New API Endpoints - Add to appropriate route file in backend/routes/
- New Services - Create service files in backend/services/
- Frontend Components - Add to appropriate directory in src/js/
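A new endpoint usually amounts to a small handler file, as in the sketch below. It assumes an Express-style router, which may or may not match how backend/server.js actually registers routes; the file name and path are hypothetical.

// backend/routes/example.js - hypothetical file name and route path.
// Assumes an Express-style router; adapt to however server.js actually wires routes.
const express = require('express');
const router = express.Router();

// GET /api/example - trivial payload showing the shape of a handler.
router.get('/api/example', (req, res) => {
    res.json({ ok: true, time: new Date().toISOString() });
});

module.exports = router;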
Debugging
- Enable debug panels in Settings
- Check browser console for frontend logs
- Monitor server terminal for backend logs
- Use the debug data viewer for AI request/response analysis
Known Limitations
- Conductor Mode is experimental and may have rough edges in complex scenarios
- Tool Execution timing can vary significantly based on tool complexity
- Browser Compatibility focused on modern browsers (Chrome, Firefox, Safari)
Contributing
SimpleChatJS values clean, readable code over complex frameworks. When contributing:
- Keep the vanilla JavaScript approach
- Maintain clear separation between frontend and backend
- Add appropriate logging for debugging
- Test with multiple AI providers when possible
License
MIT License - Feel free to use, modify, and distribute as needed.
Why SimpleChatJS?
In a world of complex frameworks and build tools, SimpleChatJS proves that powerful AI applications can be built with fundamental web technologies. It's designed for developers who want:
- Direct Control - No black-box frameworks
- Easy Customization - Modify any aspect without fighting abstractions
- Learning Clarity - Understand exactly how AI chat applications work
- Rapid Development - Change code, refresh browser, see results
SimpleChatJS is a workhorse, not a show pony. It gets the job done efficiently and lets you focus on what matters: building great AI experiences.