Lingo.dev
Make your AI agent speak every language on the planet, using the Lingo.dev Localization Engine.
Open-source i18n toolkit for LLM-powered localization.
MCP • CLI • CI/CD • SDK • Compiler
Quick Start
| Tool | Use Case | Quick Command |
|---|---|---|
| MCP | AI-assisted i18n setup for React apps | Prompt: "Set up i18n" |
| CLI | Translate JSON, YAML, markdown, CSV, PO files | `npx lingo.dev@latest run` |
| CI/CD | Automated translation pipeline in GitHub Actions | `uses: lingodotdev/lingo.dev@main` |
| SDK | Runtime translation for dynamic content | `npm install lingo.dev` |
| Compiler | Build-time React localization without i18n wrappers | `withLingo()` plugin |
Lingo.dev MCP
Setting up i18n in React apps is notoriously error-prone - even for experienced developers. AI coding assistants make it worse: they hallucinate non-existent APIs, forget middleware configurations, break routing, or implement half a solution before getting lost. The problem is that i18n setup requires a precise sequence of coordinated changes across multiple files (routing, middleware, components, configuration), and LLMs struggle to maintain that context.
Lingo.dev MCP solves this by giving AI assistants structured access to framework-specific i18n knowledge. Instead of guessing, your assistant follows verified implementation patterns for Next.js, React Router, and TanStack Start.
Supported IDEs:
- Claude Code
- Cursor
- GitHub Copilot Agents
- Codex (OpenAI)
Supported frameworks:
- Next.js (App Router & Pages Router v13-16)
- TanStack Start (v1)
- React Router (v7)
Usage:
After configuring the MCP server in your IDE (see quickstart guides), prompt your assistant:
Set up i18n with the following locales: en, es, and pt-BR. The default locale is 'en'.
The assistant will:
- Configure locale-based routing (e.g., `/en`, `/es`, `/pt-BR`)
- Set up language switching components
- Implement automatic locale detection
- Generate necessary configuration files
Note: AI-assisted code generation is non-deterministic. Review generated code before committing.
Read the docs →
Lingo.dev CLI
Keeping translations in sync is tedious. You add a new string, forget to translate it, ship broken UI to international users. Or you send JSON files to translators, wait days, then manually merge their work back. Scaling to 10+ languages means managing hundreds of files that constantly drift out of sync.
Lingo.dev CLI automates this. Point it at your translation files, run one command, and every locale updates. A lockfile tracks what's already translated, so you only pay for new or changed content. Supports JSON, YAML, CSV, PO files, and markdown.
Setup:
Initialize the project:

```shell
npx lingo.dev@latest init
```

Run translations:

```shell
npx lingo.dev@latest run
```
How it works:
- Extracts translatable content from configured files
- Sends content to LLM provider for translation
- Writes translated content back to filesystem
- Creates an `i18n.lock` file to track completed translations (avoids redundant processing)
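The lockfile step above is what keeps repeat runs cheap: only new or changed strings are sent for translation. Here is an illustrative sketch of that technique using content hashing (this is not Lingo.dev's actual lockfile format, just the general idea):

```typescript
import { createHash } from "node:crypto";

// Map of translation key -> hash of its source text at last translation.
type Lockfile = Record<string, string>;

const hash = (text: string): string =>
  createHash("sha256").update(text).digest("hex");

// Return only the keys whose source text is new or has changed
// since the hash recorded in the lockfile.
function entriesToTranslate(
  source: Record<string, string>,
  lockfile: Lockfile,
): string[] {
  return Object.keys(source).filter(
    (key) => lockfile[key] !== hash(source[key]),
  );
}

// First run: nothing is recorded, so everything needs translation.
const source = { greeting: "Hello", farewell: "Goodbye" };
const lockfile: Lockfile = {};
console.log(entriesToTranslate(source, lockfile)); // ["greeting", "farewell"]

// After translating, record the hashes; later, only edited keys reprocess.
for (const key of Object.keys(source)) lockfile[key] = hash(source[key]);
source.greeting = "Hello!";
console.log(entriesToTranslate(source, lockfile)); // ["greeting"]
```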
Configuration:
The `init` command generates an `i18n.json` file. Configure locales and buckets:
```json
{
  "$schema": "https://lingo.dev/schema/i18n.json",
  "version": "1.10",
  "locale": {
    "source": "en",
    "targets": ["es", "fr", "de"]
  },
  "buckets": {
    "json": {
      "include": ["locales/[locale].json"]
    }
  }
}
```
The `provider` field is optional (defaults to Lingo.dev Engine). For custom LLM providers:

```json
{
  "provider": {
    "id": "openai",
    "model": "gpt-4o-mini",
    "prompt": "Translate from {source} to {target}"
  }
}
```
Supported LLM providers:
- Lingo.dev Engine (recommended)
- OpenAI
- Anthropic
- Mistral
- OpenRouter
- Ollama
Read the docs →
Lingo.dev CI/CD
Translations are the feature that's always "almost done." Engineers merge code without updating locales. QA catches missing translations in staging - or worse, users catch them in production. The root cause: translation is a manual step that's easy to skip under deadline pressure.
Lingo.dev CI/CD makes translations automatic. Every push triggers translation. Missing strings get filled before code reaches production. No discipline required - the pipeline handles it.
Supported platforms:
- GitHub Actions
- GitLab CI/CD
- Bitbucket Pipelines
GitHub Actions setup:
Create `.github/workflows/translate.yml`:

```yaml
name: Translate
on:
  push:
    branches: [main]
permissions:
  contents: write
jobs:
  translate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Lingo.dev
        uses: lingodotdev/lingo.dev@main
        with:
          api-key: ${{ secrets.LINGODOTDEV_API_KEY }}
```
Setup requirements:
- Add `LINGODOTDEV_API_KEY` to repository secrets (Settings > Secrets and variables > Actions)
- For PR workflows: enable "Allow GitHub Actions to create and approve pull requests" in Settings > Actions > General
Workflow options:
Commit translations directly:
```yaml
uses: lingodotdev/lingo.dev@main
with:
  api-key: ${{ secrets.LINGODOTDEV_API_KEY }}
```
Create pull requests with translations:
```yaml
uses: lingodotdev/lingo.dev@main
with:
  api-key: ${{ secrets.LINGODOTDEV_API_KEY }}
  pull-request: true
env:
  GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```
Available inputs:
| Input | Default | Description |
|---|---|---|
| `api-key` | (required) | Lingo.dev API key |
| `pull-request` | `false` | Create a PR instead of committing directly |
| `commit-message` | `"feat: update translations via @LingoDotDev"` | Custom commit message |
| `pull-request-title` | `"feat: update translations via @LingoDotDev"` | Custom PR title |
| `working-directory` | `"."` | Directory to run in |
| `parallel` | `false` | Enable parallel processing |
Read the docs →
Lingo.dev SDK
Static translation files work for UI labels, but what about user-generated content? Chat messages, product descriptions, support tickets - content that doesn't exist at build time can't be pre-translated. You're stuck showing untranslated text or building a custom translation pipeline.
Lingo.dev SDK translates content at runtime. Pass any text, object, or HTML and get back a localized version. Works for real-time chat, dynamic notifications, or any content that arrives after deployment. Available for JavaScript, PHP, Python, and Ruby.
Installation:
```shell
npm install lingo.dev
```
Usage:
```js
import { LingoDotDevEngine } from "lingo.dev/sdk";

const lingoDotDev = new LingoDotDevEngine({
  apiKey: process.env.LINGODOTDEV_API_KEY,
});

// Translate objects (preserves structure)
const translated = await lingoDotDev.localizeObject(
  { greeting: "Hello", farewell: "Goodbye" },
  { sourceLocale: "en", targetLocale: "es" },
);
// { greeting: "Hola", farewell: "Adiós" }

// Translate text
const text = await lingoDotDev.localizeText("Hello!", {
  sourceLocale: "en",
  targetLocale: "fr",
});

// Translate to multiple languages at once
const results = await lingoDotDev.batchLocalizeText("Hello!", {
  sourceLocale: "en",
  targetLocales: ["es", "fr", "de"],
});

// Translate chat (preserves speaker names)
const chat = await lingoDotDev.localizeChat(
  [{ name: "Alice", text: "Hello!" }],
  { sourceLocale: "en", targetLocale: "es" },
);

// Translate HTML (preserves markup)
const html = await lingoDotDev.localizeHtml("<p>Welcome</p>", {
  sourceLocale: "en",
  targetLocale: "de",
});

// Detect language
const locale = await lingoDotDev.recognizeLocale("Bonjour le monde"); // "fr"
```
Available SDKs:
- JavaScript SDK - Web apps, Node.js
- PHP SDK - PHP, Laravel
- Python SDK - Django, Flask
- Ruby SDK - Rails
Read the docs →
Lingo.dev Compiler
Traditional i18n is invasive. You wrap every string in `t()` functions, invent translation keys (`home.hero.title.v2`), maintain parallel JSON files, and watch your components bloat with localization boilerplate. It's so tedious that teams delay internationalization until it becomes a massive refactor.
Lingo.dev Compiler eliminates the ceremony. Write React components with plain English text. The compiler detects translatable strings at build time and generates localized variants automatically. No keys, no JSON files, no wrapper functions - just React code that happens to work in multiple languages.
Installation:

```shell
pnpm install @lingo.dev/compiler
```
Authentication:
Recommended: sign up at lingo.dev and log in:

```shell
npx lingo.dev@latest login
```

Alternative: add your API key to `.env`:

```shell
LINGODOTDEV_API_KEY=your_key_here
```

Or use a direct LLM provider (Groq, OpenAI, Anthropic, Google):

```shell
GROQ_API_KEY=your_key
```
Configuration (Next.js):
```ts
// next.config.ts
import type { NextConfig } from "next";
import { withLingo } from "@lingo.dev/compiler/next";

const nextConfig: NextConfig = {};

export default async function (): Promise<NextConfig> {
  return await withLingo(nextConfig, {
    sourceRoot: "./app",
    sourceLocale: "en",
    targetLocales: ["es", "fr", "de"],
    models: "lingo.dev",
    dev: { usePseudotranslator: true },
  });
}
```
Configuration (Vite):
```ts
// vite.config.ts
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";
import { lingoCompilerPlugin } from "@lingo.dev/compiler/vite";

export default defineConfig({
  plugins: [
    lingoCompilerPlugin({
      sourceRoot: "src",
      sourceLocale: "en",
      targetLocales: ["es", "fr", "de"],
      models: "lingo.dev",
      dev: { usePseudotranslator: true },
    }),
    react(),
  ],
});
```
Provider setup:
```tsx
// app/layout.tsx (Next.js)
import { LingoProvider } from "@lingo.dev/compiler/react";

export default function RootLayout({ children }) {
  return <LingoProvider>{children}</LingoProvider>;
}
```
Language switcher:
```tsx
import { useLocale, setLocale } from "@lingo.dev/compiler/react";

export function LanguageSwitcher() {
  const locale = useLocale();
  return (
    <select value={locale} onChange={(e) => setLocale(e.target.value)}>
      <option value="en">English</option>
      <option value="es">Español</option>
    </select>
  );
}
```
Development: `npm run dev` (uses the pseudotranslator, no API calls)
Production: set `usePseudotranslator: false`, then run `next build`
Commit the `.lingo/` directory to version control.
Key features:
- Zero runtime performance cost
- No translation keys or JSON files
- No `t()` functions or `<T>` wrapper components
- Automatic detection of translatable text in JSX
- TypeScript support
- ICU MessageFormat for plurals
- Manual overrides via `data-lingo-override` attribute
- Built-in translation editor widget
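ICU MessageFormat plural support matters because languages disagree on how many plural forms exist. As a rough illustration of what ICU-style plural selection involves (using the standard `Intl.PluralRules` API, not the compiler's actual implementation):

```typescript
// Pick the right plural form for a count, the way ICU messages like
// "{count, plural, one {# item} other {# items}}" do. The CLDR plural
// category ("one", "few", "other", ...) comes from Intl.PluralRules.
function pluralize(
  count: number,
  forms: Partial<Record<Intl.LDMLPluralRule, string>>,
  locale = "en",
): string {
  const category = new Intl.PluralRules(locale).select(count);
  const template = forms[category] ?? forms.other ?? "";
  return template.replace("#", String(count));
}

console.log(pluralize(1, { one: "# item", other: "# items" })); // "1 item"
console.log(pluralize(5, { one: "# item", other: "# items" })); // "5 items"
```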
Build modes:
- `pseudotranslator`: development mode with placeholder translations (no API costs)
- `real`: generate actual translations using LLMs
- `cache-only`: production mode using pre-generated translations from CI (no API calls)

Supported frameworks:
- Next.js (App Router with React Server Components)
- Vite + React (SPA and SSR)
Additional framework support planned.
Read the docs →
Contributing
Contributions welcome. Please follow these guidelines:
- Issues: Report bugs or request features
- Pull Requests: Submit changes
  - Every PR requires a changeset: `pnpm new` (or `pnpm new:empty` for non-release changes)
  - Ensure tests pass before submitting
- Development: this is a pnpm + turborepo monorepo
  - Install dependencies: `pnpm install`
  - Run tests: `pnpm test`
  - Build: `pnpm build`
Support: Discord community
Star History
If you find Lingo.dev useful, give us a star and help us reach 10,000 stars!
Localized Documentation
Available translations:
English • 中文 • 日本語 • 한국어 • Español • Français • Русский • Українська • Deutsch • Italiano • العربية • עברית • हिन्दी • Português (Brasil) • বাংলা • فارسی • Polski • Türkçe • اردو • भोजपुरी • অসমীয়া • ગુજરાતી • मराठी • ଓଡ଼ିଆ • ਪੰਜਾਬੀ • සිංහල • தமிழ் • తెలుగు
Adding a new language:
- Add the locale code to `i18n.json` using BCP-47 format
- Submit a pull request
BCP-47 locale format: `language[-Script][-REGION]`
- `language`: ISO 639-1/2/3 (lowercase): `en`, `zh`, `bho`
- `Script`: ISO 15924 (title case): `Hans`, `Hant`, `Latn`
- `REGION`: ISO 3166-1 alpha-2 (uppercase): `US`, `CN`, `IN`
- Examples: `en`, `pt-BR`, `zh-Hans`, `sr-Cyrl-RS`
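When adding a locale, the standard `Intl.Locale` API (part of ECMAScript, unrelated to Lingo.dev) is a quick way to sanity-check that a BCP-47 tag parses into the components described above:

```typescript
// Intl.Locale splits a BCP-47 tag into language / script / region,
// which is useful for validating a new locale code before adding it.
const zh = new Intl.Locale("zh-Hans");
console.log(zh.language, zh.script); // "zh" "Hans"

const sr = new Intl.Locale("sr-Cyrl-RS");
console.log(sr.language, sr.script, sr.region); // "sr" "Cyrl" "RS"
```

An invalid tag throws a `RangeError`, so a try/catch around the constructor doubles as a format check.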