Releases: BerriAI/litellm

v1.67.0-stable

19 Apr 19:35
03b5399

What's Changed

New Contributors

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.67.0-stable
```
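Once the container is up, the proxy serves an OpenAI-compatible API on port 4000. A minimal request-body sketch (the `gpt-4o` model name and `sk-1234` key are placeholders for whatever is configured on your deployment, not values from this release):

```python
import json

# Hypothetical example of calling the proxy started above.
# Assumptions (not from this release): the proxy listens on
# http://localhost:4000, and "sk-1234" / "gpt-4o" stand in for the
# virtual key and model name configured on your deployment.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello from LiteLLM"}],
}
body = json.dumps(payload)

# Send with any HTTP client against the OpenAI-compatible route, e.g.:
#   curl http://localhost:4000/v1/chat/completions \
#     -H "Authorization: Bearer sk-1234" \
#     -H "Content-Type: application/json" \
#     -d "$BODY"
print(body)
```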

Full Changelog: v1.66.0-stable...v1.67.0-stable

v1.67.0-nightly

19 Apr 23:32

What's Changed

  • [Feat] Expose Responses API on LiteLLM UI Test Key Page by @ishaan-jaff in #10166
  • [Bug Fix] Spend Tracking Bug Fix, don't modify in memory default litellm params by @ishaan-jaff in #10167
  • Bug Fix - Responses API, Loosen restrictions on allowed environments for computer use tool by @ishaan-jaff in #10168
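For the Responses API surfaced above, a request body would follow the OpenAI Responses schema (`model` plus `input`). A minimal sketch; the model name is a placeholder and the schema assumption is ours, not stated in this release:

```python
import json

# Hypothetical payload for the proxy's Responses API route; assumes it
# mirrors the OpenAI Responses schema ("model" + "input").
# "gpt-4o" is a placeholder model name.
responses_payload = {
    "model": "gpt-4o",
    "input": "Summarize the latest release notes.",
}
print(json.dumps(responses_payload))
```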

Full Changelog: v1.67.0-stable...v1.67.0-nightly

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.67.0-nightly
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 262.85 | 6.27 | 0.0 | 1873 | 0 | 202.24 | 5393.99 |
| Aggregated | Passed ✅ | 230.0 | 262.85 | 6.27 | 0.0 | 1873 | 0 | 202.24 | 5393.99 |
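As a sanity check on the table above, request count divided by throughput gives the implied run length:

```python
# Implied load-test duration from the aggregated row above
# (values rounded from the reported table).
request_count = 1873
requests_per_s = 6.27
duration_s = request_count / requests_per_s
print(f"{duration_s:.0f} s (~{duration_s / 60:.1f} min)")
```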

v1.66.3.dev5

19 Apr 03:41
3d5022b

What's Changed

  • [Feat] Unified Responses API - Add Azure Responses API support by @ishaan-jaff in #10116
  • UI: Make columns resizable/hideable in Models table by @msabramo in #10119
  • Remove unnecessary package*.json files by @msabramo in #10075
  • Add Gemini Flash 2.5 Preview Model Price and Context Window by @drmingler in #10125
  • test: update tests to new deployment model by @krrishdholakia in #10142
  • [Feat] Support for all litellm providers on Responses API (works with Codex) - Anthropic, Bedrock API, VertexAI, Ollama by @ishaan-jaff in #10132

New Contributors

Full Changelog: v1.66.2.dev1...v1.66.3.dev5

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.3.dev5
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 241.46 | 6.11 | 0.0 | 1830 | 0 | 197.68 | 1416.58 |
| Aggregated | Passed ✅ | 230.0 | 241.46 | 6.11 | 0.0 | 1830 | 0 | 197.68 | 1416.58 |

v1.66.3.dev1

18 Apr 02:27

What's Changed

  • [Feat] Unified Responses API - Add Azure Responses API support by @ishaan-jaff in #10116
  • UI: Make columns resizable/hideable in Models table by @msabramo in #10119

Full Changelog: v1.66.2.dev1...v1.66.3.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.3.dev1
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 180.0 | 210.74 | 6.40 | 0.0033 | 1916 | 1 | 38.53 | 5506.76 |
| Aggregated | Passed ✅ | 180.0 | 210.74 | 6.40 | 0.0033 | 1916 | 1 | 38.53 | 5506.76 |

v1.66.3-nightly

17 Apr 20:58

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.3-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.3-nightly
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 250.0 | 302.33 | 6.10 | 0.0468 | 1824 | 14 | 218.44 | 5459.56 |
| Aggregated | Failed ❌ | 250.0 | 302.33 | 6.10 | 0.0468 | 1824 | 14 | 218.44 | 5459.56 |

v1.66.2.dev1

17 Apr 20:36

What's Changed

Full Changelog: v1.66.2-nightly...v1.66.2.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.2.dev1
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 200.0 | 242.71 | 6.17 | 0.0 | 1844 | 0 | 181.44 | 6553.66 |
| Aggregated | Passed ✅ | 200.0 | 242.71 | 6.17 | 0.0 | 1844 | 0 | 181.44 | 6553.66 |

v1.66.2-nightly

17 Apr 05:43
47e811d

What's Changed

New Contributors

Full Changelog: v1.66.1-nightly...v1.66.2-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.2-nightly
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 190.0 | 244.46 | 6.14 | 0.0 | 1835 | 0 | 169.77 | 8723.87 |
| Aggregated | Passed ✅ | 190.0 | 244.46 | 6.14 | 0.0 | 1835 | 0 | 169.77 | 8723.87 |

v1.66.1-nightly

15 Apr 06:05

What's Changed

Full Changelog: v1.66.0-nightly...v1.66.1-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.1-nightly
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 220.0 | 243.74 | 6.27 | 0.0 | 1876 | 0 | 197.45 | 3855.60 |
| Aggregated | Passed ✅ | 220.0 | 243.74 | 6.27 | 0.0 | 1876 | 0 | 197.45 | 3855.60 |

v1.66.0-stable

13 Apr 05:51

What's Changed

New Contributors

Full Changelog: v1.65.8-nightly...v1.66.0-stable

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:litellm_stable_release_branch-v1.66.0-stable
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 282.99 | 6.00 | 0.0 | 1793 | 0 | 223.98 | 5176.80 |
| Aggregated | Passed ✅ | 250.0 | 282.99 | 6.00 | 0.0 | 1793 | 0 | 223.98 | 5176.80 |

v1.66.0-nightly

13 Apr 05:19

What's Changed

New Contributors

Full Changelog: v1.65.8-nightly...v1.66.0-nightly

Docker Run LiteLLM Proxy

```shell
docker run \
  -e STORE_MODEL_IN_DB=True \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-v1.66.0-nightly
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 252.49 | 6.28 | 0.0 | 1878 | 0 | 200.86 | 5135.25 |
| Aggregated | Passed ✅ | 230.0 | 252.49 | 6.28 | 0.0 | 1878 | 0 | 200.86 | 5135.25 |