
Releases: BerriAI/litellm

v1.40.0

02 Jun 00:25

What's Changed

Full Changelog: v1.39.6...v1.40.0

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.40.0
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
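Once the container is up, the proxy exposes an OpenAI-compatible API on port 4000. A minimal sketch of calling it — the model name and API key below are placeholders, not values from this release; use whatever you configured on your proxy:

```python
import json
import urllib.request

# Sketch only: the endpoint comes from the docker run above; the model name
# and API key are placeholders - substitute your own proxy configuration.
req = urllib.request.Request(
    "http://localhost:4000/chat/completions",
    data=json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "hello"}],
    }).encode(),
    headers={"Authorization": "Bearer sk-1234",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once the proxy is running.
print(req.full_url)
```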

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 133.63252197830545 | 6.467733658247951 | 0.0 | 1936 | 0 | 94.77090299998281 | 801.180971000008 |
| Aggregated | Passed ✅ | 120.0 | 133.63252197830545 | 6.467733658247951 | 0.0 | 1936 | 0 | 94.77090299998281 | 801.180971000008 |

v1.39.6

01 Jun 04:21

We're launching team member invites (no SSO required) in v1.39.6 🔥 Invite team members to view LLM usage and spend per service: https://docs.litellm.ai/docs/proxy/ui

👍 [Fix] Cache Vertex AI clients - Major Perf improvement for VertexAI models

✨ Feat - Send new users an invite email on creation (set 'send_invite_email' on /user/new)

💻 UI - allow users to sign in with email/password

🔓 [UI] Admin UI invite links for non-SSO setups

✨ PR - [FEAT] Perf improvements - litellm.completion / litellm.acompletion - Cache OpenAI client
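A sketch of the new invite-email flow: build a POST to `/user/new` with `send_invite_email` set. The proxy address and admin key below are placeholder assumptions, not values from this release:

```python
import json
import urllib.request

PROXY_BASE = "http://localhost:4000"  # assumed local proxy address
ADMIN_KEY = "sk-1234"                 # placeholder master key

def new_user_request(email: str) -> urllib.request.Request:
    """Build (but don't send) a POST /user/new that also emails an invite
    by setting 'send_invite_email' in the request body."""
    body = json.dumps({"user_email": email, "send_invite_email": True})
    return urllib.request.Request(
        f"{PROXY_BASE}/user/new",
        data=body.encode(),
        headers={"Authorization": f"Bearer {ADMIN_KEY}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = new_user_request("teammate@example.com")
print(req.full_url)
```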

What's Changed

New Contributors

Full Changelog: v1.39.5...v1.39.6

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.6
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 78 | 90.37559010674164 | 6.5521693586672445 | 0.0 | 1958 | 0 | 65.34477100001368 | 961.3953589999937 |
| Aggregated | Passed ✅ | 78 | 90.37559010674164 | 6.5521693586672445 | 0.0 | 1958 | 0 | 65.34477100001368 | 961.3953589999937 |

v1.39.5-stable

31 May 16:46

Full Changelog: v1.39.5...v1.39.5-stable

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.5-stable
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 82 | 98.4437988109365 | 6.414126443845541 | 0.0 | 1920 | 0 | 65.89902199999642 | 1363.2986580000193 |
| Aggregated | Passed ✅ | 82 | 98.4437988109365 | 6.414126443845541 | 0.0 | 1920 | 0 | 65.89902199999642 | 1363.2986580000193 |

v1.39.5

31 May 04:39

What's Changed

New Contributors


Full Changelog: v1.39.4...v1.39.5

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.5
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 168.39339172958924 | 6.4252258901831345 | 0.0 | 1923 | 0 | 109.15407800001731 | 1833.3729599999913 |
| Aggregated | Passed ✅ | 130.0 | 168.39339172958924 | 6.4252258901831345 | 0.0 | 1923 | 0 | 109.15407800001731 | 1833.3729599999913 |

v1.39.4

30 May 15:48

What's Changed

  • fix - UI submit chat on enter by @ishaan-jaff in #3916
  • Revert "Revert "fix: Log errors in Traceloop Integration (reverts previous revert)"" by @nirga in #3909

Full Changelog: v1.39.3...v1.39.4

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.4
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 120.0 | 135.98662418243552 | 6.404889633803229 | 0.0 | 1913 | 0 | 97.80563699996492 | 1663.1231360000243 |
| Aggregated | Passed ✅ | 120.0 | 135.98662418243552 | 6.404889633803229 | 0.0 | 1913 | 0 | 97.80563699996492 | 1663.1231360000243 |

v1.39.3

30 May 04:26

What's Changed

New Contributors

Full Changelog: v1.39.2...v1.39.3

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.3
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 110.0 | 133.96143579083153 | 6.347194412767075 | 0.0 | 1898 | 0 | 91.88108999995848 | 1459.6432470000025 |
| Aggregated | Passed ✅ | 110.0 | 133.96143579083153 | 6.347194412767075 | 0.0 | 1898 | 0 | 91.88108999995848 | 1459.6432470000025 |

v1.39.2

29 May 06:53

What's Changed


Full Changelog: v1.38.12...v1.39.2

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.39.2
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 72 | 83.46968387564114 | 6.529958043991633 | 0.0 | 1954 | 0 | 61.38368400002037 | 678.4462749999989 |
| Aggregated | Passed ✅ | 72 | 83.46968387564114 | 6.529958043991633 | 0.0 | 1954 | 0 | 61.38368400002037 | 678.4462749999989 |

v1.38.12

28 May 15:54

What's Changed

Full Changelog: v1.38.11...v1.38.12

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.12
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 76 | 91.16258395147193 | 6.473952425752436 | 0.0 | 1937 | 0 | 62.406538999994154 | 1772.6057410000067 |
| Aggregated | Passed ✅ | 76 | 91.16258395147193 | 6.473952425752436 | 0.0 | 1937 | 0 | 62.406538999994154 | 1772.6057410000067 |

v1.38.11

28 May 03:25

💵 LiteLLM v1.38.11: Proxy 100+ LLMs and set budgets for your customers https://docs.litellm.ai/docs/proxy/users#set-rate-limits

✨ NEW /customer/update and /customer/delete endpoints https://docs.litellm.ai/docs/proxy/users#set-rate-limits

📝 [Feat] Email alerting is now available on the free tier: https://docs.litellm.ai/docs/proxy/email

🚀 [Feat] Show supported OpenAI params on the LiteLLM UI model hub

✨ [Feat] Show Created At / Created By on the Models page
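A sketch of the new customer management endpoints: both take a POST with a JSON body. The proxy address, admin key, and body field names below are illustrative assumptions — check the linked docs for the exact schema:

```python
import json
import urllib.request

PROXY_BASE = "http://localhost:4000"  # assumed local proxy address
ADMIN_KEY = "sk-1234"                 # placeholder admin key

def customer_request(action: str, body: dict) -> urllib.request.Request:
    # Builds (does not send) a POST to a /customer/* management endpoint.
    return urllib.request.Request(
        f"{PROXY_BASE}/customer/{action}",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {ADMIN_KEY}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Body fields here are hypothetical examples, not the confirmed schema.
update_req = customer_request("update", {"user_id": "customer-1", "max_budget": 10.0})
delete_req = customer_request("delete", {"user_ids": ["customer-1"]})
print(update_req.full_url, delete_req.full_url)
```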


What's Changed

New Contributors

Full Changelog: v1.38.10...v1.38.11

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.11
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 94 | 113.13091035154665 | 6.485092627447978 | 0.0 | 1940 | 0 | 80.4994959999874 | 735.4111310000064 |
| Aggregated | Passed ✅ | 94 | 113.13091035154665 | 6.485092627447978 | 0.0 | 1940 | 0 | 80.4994959999874 | 735.4111310000064 |

v1.38.10

26 May 22:48

What's Changed

Full Changelog: v1.38.8...v1.38.10

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.38.10
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed ✅ | 130.0 | 152.41971991092666 | 6.452763997233594 | 0.0 | 1931 | 0 | 108.63601500000186 | 1150.9651800000142 |
| Aggregated | Passed ✅ | 130.0 | 152.41971991092666 | 6.452763997233594 | 0.0 | 1931 | 0 | 108.63601500000186 | 1150.9651800000142 |