mirror of https://github.com/vimagick/dockerfiles.git synced 2025-01-02 03:37:40 +02:00

litellm

An OpenAI-compatible proxy server (LLM gateway) that lets you call 100+ LLMs through a unified interface, track spend, and set budgets per virtual key or user.
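The model names used in the examples below (claude-3.5, gpt-4o) come from the proxy's own config file, not from litellm itself. A minimal sketch of such a config (assuming the compose file mounts it from ./data; the upstream model IDs and environment-variable names here are illustrative, not taken from this repo):

```yaml
model_list:
  - model_name: claude-3.5
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620   # illustrative upstream ID
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
general_settings:
  master_key: sk-xxxxxx   # the LITELLM_KEY used below
```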

$ LITELLM_KEY=sk-xxxxxx   # the proxy's master key

$ curl -H "Authorization: Bearer $LITELLM_KEY" http://127.0.0.1:4000/v1/models

$ curl -H "Authorization: Bearer $LITELLM_KEY" http://127.0.0.1:4000/model/info
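To print just the model names, the /v1/models response can be piped through jq (a sketch, assuming jq is installed; /v1/models follows the standard OpenAI list shape, with IDs under .data[].id):

```shell
# Extract only the model IDs from the OpenAI-style list response.
curl -s -H "Authorization: Bearer $LITELLM_KEY" \
     http://127.0.0.1:4000/v1/models | jq -r '.data[].id'
```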

$ curl http://127.0.0.1:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LITELLM_KEY" \
  -d '{
    "model": "claude-3.5",
    "response_format": { "type": "json_object" },
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant designed to output JSON."
      },
      {
        "role": "user",
        "content": "Who won the world series in 2020?"
      }
    ]
  }'
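The completion response follows the standard OpenAI shape, so the assistant's reply can be pulled out with jq (a sketch, assuming jq is installed):

```shell
# Ask the same question and print only the assistant's reply text.
curl -s http://127.0.0.1:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LITELLM_KEY" \
  -d '{"model": "claude-3.5", "messages": [{"role": "user", "content": "Who won the world series in 2020?"}]}' \
  | jq -r '.choices[0].message.content'
```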

To create virtual keys, first register a user, then generate a key scoped to that user:

$ curl --location 'http://127.0.0.1:4000/user/new' \
       --header "Authorization: Bearer $LITELLM_KEY" \
       --header 'Content-Type: application/json' \
       --data-raw '{"user_email": "username@example.com"}'
{
  "expires": "2023-12-22T09:53:13.861000Z",
  "user_id": "my-unique-id",
  "max_budget": 0.0
}
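The user_id from the response above is needed for the next two calls, so it is handy to capture it in a variable (a sketch using jq):

```shell
# Register a user and keep the returned user_id for later calls.
USER_ID=$(curl -s 'http://127.0.0.1:4000/user/new' \
       -H "Authorization: Bearer $LITELLM_KEY" \
       -H 'Content-Type: application/json' \
       -d '{"user_email": "username@example.com"}' | jq -r '.user_id')
```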

$ curl 'http://127.0.0.1:4000/key/generate' \
       --header "Authorization: Bearer $LITELLM_KEY" \
       --header 'Content-Type: application/json' \
       --data-raw '{"models": ["gpt-4o", "claude-3.5"], "user_id": "my-unique-id"}'
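The generated virtual key can be captured the same way (a sketch; this assumes the /key/generate response carries the new key in a `key` field):

```shell
# Generate a virtual key for the new user and store it.
VIRTUAL_KEY=$(curl -s 'http://127.0.0.1:4000/key/generate' \
       -H "Authorization: Bearer $LITELLM_KEY" \
       -H 'Content-Type: application/json' \
       -d '{"models": ["gpt-4o", "claude-3.5"], "user_id": "my-unique-id"}' \
       | jq -r '.key')
```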

$ curl -H "Authorization: Bearer $LITELLM_KEY" 'http://127.0.0.1:4000/user/info?user_id=my-unique-id'
{
  "spend": 0
}