Where AI memories live forever - A decentralized semantic memory platform powered by blockchain and vector search
ArchiveNET is a revolutionary decentralized memory management platform that combines the power of AI embeddings with blockchain permanence. Built on Arweave, it provides enterprise-grade semantic search capabilities through advanced vector database technology, enabling applications to store, search, and retrieve contextual information with unprecedented permanence and accuracy.
ArchiveNET is a comprehensive monorepo consisting of four main components:
The world's first decentralized vector engine built on the Arweave blockchain, implementing the Hierarchical Navigable Small World (HNSW) algorithm for approximate nearest-neighbor search.
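At its core, HNSW search is a greedy walk through a proximity graph: start at an entry point and repeatedly hop to whichever neighbor is closest to the query, stopping when no neighbor improves. Below is a minimal single-layer sketch; the real engine maintains multiple layers and candidate lists, and all names and data here are illustrative, not ArchiveNET's actual API.

```javascript
// Euclidean distance between two vectors of equal length.
function euclidean(a, b) {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += (a[i] - b[i]) ** 2;
  return Math.sqrt(s);
}

// Greedy search on one graph layer: move to the closest neighbor
// until no neighbor is closer to the query than the current node.
function greedySearch(graph, vectors, entry, query) {
  let current = entry;
  let currentDist = euclidean(vectors[current], query);
  let improved = true;
  while (improved) {
    improved = false;
    for (const neighbor of graph[current]) {
      const d = euclidean(vectors[neighbor], query);
      if (d < currentDist) {
        current = neighbor;
        currentDist = d;
        improved = true;
      }
    }
  }
  return current;
}

// Tiny toy index: four 1-D points chained in a line.
const vectors = { a: [0], b: [1], c: [2], d: [3] };
const graph = { a: ["b"], b: ["a", "c"], c: ["b", "d"], d: ["c"] };
console.log(greedySearch(graph, vectors, "a", [2.9])); // "d"
```

HNSW runs this walk on progressively denser layers, which is what gives it logarithmic search behavior on large collections.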
A robust Express.js API service providing semantic memory management with AI-powered search capabilities.
A modern Next.js application providing an intuitive interface for memory management and search operations.
The central Model Context Protocol (MCP) server that orchestrates memory operations and provides intelligent context management.
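MCP clients such as Claude Desktop and Cursor talk to an MCP server over JSON-RPC 2.0, invoking server-exposed tools via `tools/call`. The request below is illustrative only: the tool name `search_memories` and its argument shape are hypothetical, not confirmed names from ArchiveNET's server.

```javascript
// Illustrative MCP tool-call request (JSON-RPC 2.0).
// "search_memories" and its arguments are hypothetical examples.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_memories",
    arguments: { query: "AI project discussions", limit: 5 },
  },
};

console.log(JSON.stringify(request, null, 2));
```

The server responds with a JSON-RPC result containing the tool's output, which the client then feeds back into the model's context.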
Clone the repository:

```bash
git clone https://github.com/s9swata/archivenet.git
cd archivenet
```

Start with Docker Compose:

```bash
docker-compose up -d
```
Manual setup (alternative):

```bash
# API setup
cd API
npm install
npm run build
npx drizzle-kit push
npm run dev
```

```bash
# Frontend setup (from the API directory, in a new terminal)
cd ../client
npm install
npm run dev
```
Create `.env` files in the respective directories:
`API/.env`:

```env
DATABASE_URL=your_postgres_url
REDIS_URL=redis://localhost:6379
JWT_SECRET=your_jwt_secret
ARWEAVE_WALLET_PATH=./data/wallet.json
```
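The API cannot start without these four values, so it helps to fail fast when one is missing. The helper below is a minimal sketch of such a startup check, not ArchiveNET's actual code; the stubbed `env` object mirrors the variables listed above.

```javascript
// Hypothetical startup check: verify required config keys are present.
function validateEnv(env, required) {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return required.map((name) => env[name]);
}

const required = ["DATABASE_URL", "REDIS_URL", "JWT_SECRET", "ARWEAVE_WALLET_PATH"];

// In the real API this would be process.env; stubbed here for illustration.
const env = {
  DATABASE_URL: "postgres://localhost/archivenet",
  REDIS_URL: "redis://localhost:6379",
  JWT_SECRET: "dev-secret",
  ARWEAVE_WALLET_PATH: "./data/wallet.json",
};

console.log(validateEnv(env, required).length); // 4
```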
```javascript
// Store a memory
const storeResponse = await fetch("/api/memories", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    content: "Project discussion about AI integration",
    metadata: { project: "AI-Platform", priority: "high" },
  }),
});
const memory = await storeResponse.json();

// Search memories
const searchResponse = await fetch("/api/memories/search", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    query: "AI project discussions",
    limit: 10,
  }),
});
const results = await searchResponse.json();
```
1. Create a feature branch (`git checkout -b feature/amazing-feature`)
2. Commit your changes (`git commit -m 'Add amazing feature'`)
3. Push the branch (`git push origin feature/amazing-feature`)

This project is licensed under the MIT License - see the LICENSE file for details.
Made with ❤️ for the decentralized AI future