# LibreTranslate

[Try it online!](https://libretranslate.com) | [API Docs](https://libretranslate.com/docs) | [Community Forum](https://community.libretranslate.com/)

[![Python versions](https://img.shields.io/pypi/pyversions/libretranslate)](https://pypi.org/project/libretranslate) [![Run tests](https://github.com/LibreTranslate/LibreTranslate/workflows/Run%20tests/badge.svg)](https://github.com/LibreTranslate/LibreTranslate/actions?query=workflow%3A%22Run+tests%22) [![Build and Publish Docker Image](https://github.com/LibreTranslate/LibreTranslate/actions/workflows/publish-docker.yml/badge.svg)](https://github.com/LibreTranslate/LibreTranslate/actions/workflows/publish-docker.yml) [![Publish package](https://github.com/LibreTranslate/LibreTranslate/actions/workflows/publish-package.yml/badge.svg)](https://github.com/LibreTranslate/LibreTranslate/actions/workflows/publish-package.yml) [![Awesome Humane Tech](https://raw.githubusercontent.com/humanetech-community/awesome-humane-tech/main/humane-tech-badge.svg?sanitize=true)](https://github.com/humanetech-community/awesome-humane-tech)

Free and Open Source Machine Translation API, entirely self-hosted. Unlike other APIs, it doesn't rely on proprietary providers such as Google or Azure to perform translations. Instead, its translation engine is powered by the open source [Argos Translate](https://github.com/argosopentech/argos-translate) library.

![image](https://user-images.githubusercontent.com/64697405/139015751-279f31ac-36f1-4950-9ea7-87e76bf65f51.png)

[Try it online!](https://libretranslate.com) | [API Docs](https://libretranslate.com/docs)

## API Examples

### Simple

Request:

```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: "Hello!",
    source: "en",
    target: "es"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```javascript
{
    "translatedText": "¡Hola!"
}
```
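The endpoint is plain JSON over HTTP, so any HTTP client works. For reference, a minimal `curl` equivalent of the request above:

```bash
# Same request as the JavaScript example above, sent with curl.
curl -X POST https://libretranslate.com/translate \
  -H "Content-Type: application/json" \
  -d '{"q": "Hello!", "source": "en", "target": "es"}'
```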
### Auto Detect Language

Request:

```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: "Ciao!",
    source: "auto",
    target: "en"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```javascript
{
    "detectedLanguage": {
        "confidence": 83,
        "language": "it"
    },
    "translatedText": "Bye!"
}
```

### HTML (beta)

Request:

```javascript
const res = await fetch("https://libretranslate.com/translate", {
  method: "POST",
  body: JSON.stringify({
    q: '<p class="green">Hello!</p>',
    source: "en",
    target: "es",
    format: "html"
  }),
  headers: { "Content-Type": "application/json" }
});

console.log(await res.json());
```

Response:

```javascript
{
    "translatedText": "<p class=\"green\">¡Hola!</p>"
}
```
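The `--batch-limit` option listed under Arguments below implies that several texts can be translated in a single request. Assuming `q` also accepts an array of strings (an assumption; verify against the API docs), a sketch:

```bash
# Hedged sketch of a batch request: q is sent as an array of texts.
# Verify the request shape against your instance's API docs before relying on it.
curl -X POST https://libretranslate.com/translate \
  -H "Content-Type: application/json" \
  -d '{"q": ["Hello!", "How are you?"], "source": "en", "target": "es"}'
```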
## Install and Run

You can run your own API server with just a few lines of setup!

Make sure you have Python installed (3.8 or higher is recommended), then simply run:

```bash
pip install libretranslate
libretranslate [args]
```

Then open a web browser to http://localhost:5000

If you're on Windows, we recommend that you [Run with Docker](#run-with-docker) instead; if you would rather run it natively on Windows, you can follow [this guide](https://github.com/nuttolum/LibreOnWindows).

On Ubuntu 20.04 you can also use the install script available at https://github.com/argosopentech/LibreTranslate-init

## Build and Run

If you want to make changes to the code, you can build from source and run the API:

```bash
git clone https://github.com/LibreTranslate/LibreTranslate
cd LibreTranslate
pip install -e .
libretranslate [args]

# Or
python main.py [args]
```

Then open a web browser to http://localhost:5000

### Run with Docker

Simply run:

```bash
docker run -ti --rm -p 5000:5000 libretranslate/libretranslate
```

Then open a web browser to http://localhost:5000

### Build with Docker

```bash
docker build [--build-arg with_models=true] -t libretranslate .
```

If you want to run the Docker image in a completely offline environment, you need to add the `--build-arg with_models=true` parameter. The language models are then downloaded during the build process of the image; otherwise, they are downloaded on the first run of the image/container.

Run the built image:

```bash
docker run -it -p 5000:5000 libretranslate [args]
```

Or build and run using `docker-compose`:

```bash
docker-compose up -d --build
```

> Feel free to change the [`docker-compose.yml`](https://github.com/LibreTranslate/LibreTranslate/blob/main/docker-compose.yml) file to adapt it to your deployment needs, or use an extra `docker-compose.prod.yml` file for your deployment configuration.

> The models are stored inside the container under `/root/.local/share` and `/root/.local/cache`. Feel free to use volumes if you do not want to redownload the models when the container is destroyed. Be aware that this will prevent the models from being updated!

### CUDA

You can use hardware acceleration to speed up translations on a GPU machine with CUDA 11.2 and [nvidia-docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html) installed.

Run this version with:

```bash
docker-compose -f docker-compose.cuda.yml up -d --build
```
## Arguments

| Argument | Description | Default | Env. name |
|----------|-------------|---------|-----------|
| --host | Set host to bind the server to | `127.0.0.1` | LT_HOST |
| --port | Set port to bind the server to | `5000` | LT_PORT |
| --char-limit | Set character limit | `No limit` | LT_CHAR_LIMIT |
| --req-limit | Set maximum number of requests per minute per client | `No limit` | LT_REQ_LIMIT |
| --batch-limit | Set maximum number of texts to translate in a batch request | `No limit` | LT_BATCH_LIMIT |
| --ga-id | Enable Google Analytics on the API client page by providing an ID | `No tracking` | LT_GA_ID |
| --debug | Enable debug environment | `False` | LT_DEBUG |
| --ssl | Whether to enable SSL | `False` | LT_SSL |
| --frontend-language-source | Set frontend default language - source | `en` | LT_FRONTEND_LANGUAGE_SOURCE |
| --frontend-language-target | Set frontend default language - target | `es` | LT_FRONTEND_LANGUAGE_TARGET |
| --frontend-timeout | Set frontend translation timeout | `500` | LT_FRONTEND_TIMEOUT |
| --api-keys | Enable API keys database for per-user rate limits lookup | `Don't use API keys` | LT_API_KEYS |
| --api-keys-db-path | Use a specific path inside the container for the local database. Can be absolute or relative | `api_keys.db` | LT_API_KEYS_DB_PATH |
| --api-keys-remote | Use this remote endpoint to query for valid API keys instead of using the local database | `Use local API key database` | LT_API_KEYS_REMOTE |
| --get-api-key-link | Show a link in the UI where to direct users to get an API key | `Don't show a link` | LT_GET_API_KEY_LINK |
| --require-api-key-origin | Require use of an API key for programmatic access to the API, unless the request origin matches this domain | `No restrictions on domain origin` | LT_REQUIRE_API_KEY_ORIGIN |
| --load-only | Set available languages | `all from argostranslate` | LT_LOAD_ONLY |
| --suggestions | Allow user suggestions | `false` | LT_SUGGESTIONS |
| --disable-files-translation | Disable files translation | `false` | LT_DISABLE_FILES_TRANSLATION |
| --disable-web-ui | Disable web UI | `false` | LT_DISABLE_WEB_UI |

Note that each argument has an equivalent environment variable that can be used instead. The environment variables override the default values but have lower priority than the command arguments, and are particularly useful if used with Docker. The environment variable names are the upper-snake-case of the equivalent command argument's name with an `LT_` prefix.
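For example, here is a hedged `docker run` sketch that sets a couple of the options above through their `LT_*` environment variables and, following the note in the Docker section, keeps the downloaded models in named volumes (the limit values are illustrative and the volume names are arbitrary):

```bash
# Configure the server via env. variable equivalents of --char-limit and --req-limit,
# and persist downloaded models in named volumes so they survive container recreation.
docker run -ti --rm -p 5000:5000 \
  -e LT_CHAR_LIMIT=5000 \
  -e LT_REQ_LIMIT=30 \
  -v lt_local:/root/.local/share \
  -v lt_cache:/root/.local/cache \
  libretranslate/libretranslate
```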
## Run with WSGI and Gunicorn

```bash
pip install gunicorn
gunicorn --bind 0.0.0.0:5000 'wsgi:app'
```

You can pass application arguments directly to Gunicorn via:

```bash
gunicorn --bind 0.0.0.0:5000 'wsgi:app(api_keys=True)'
```

## Run with Kubernetes

See ["LibreTranslate: your own translation service on Kubernetes" by JM Robles](https://jmrobles.medium.com/libretranslate-your-own-translation-service-on-kubernetes-b46c3e1af630).

## Manage API Keys

LibreTranslate supports per-user limit quotas: for example, you can issue API keys to users so that they can enjoy higher request limits per minute (if you also set `--req-limit`). By default all users are rate-limited based on `--req-limit`, but passing an optional `api_key` parameter to the REST endpoints allows a user to enjoy higher request limits.

To use API keys, simply start LibreTranslate with the `--api-keys` option. If you modified the API keys database path with the option `--api-keys-db-path`, you must specify the path with the same argument flag when using the `ltmanage keys` command.

### Add New Keys

To issue a new API key with a limit of 120 requests per minute:

```bash
ltmanage keys add 120
```

If you changed the API keys database path:

```bash
ltmanage keys --api-keys-db-path path/to/db/dbName.db add 120
```

### Remove Keys

```bash
ltmanage keys remove
```
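Once a key has been issued, a client passes it with each request via the `api_key` parameter mentioned above. A minimal `curl` sketch against a local instance (the key value is a placeholder, and sending it as a JSON field alongside the other parameters is an assumption; check the API docs linked at the top for the exact format):

```bash
# Hypothetical key value; use one issued with `ltmanage keys add`.
curl -X POST http://localhost:5000/translate \
  -H "Content-Type: application/json" \
  -d '{"q": "Hello!", "source": "en", "target": "es", "api_key": "YOUR_API_KEY"}'
```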