
LiteLLM


Unified API for 100+ LLMs using the OpenAI format - proxy server included


About LiteLLM


Unified API for 100+ LLMs using the OpenAI format - proxy server included.

LiteLLM is open source, completely free to use, works 100% offline, and runs on CPU without a dedicated GPU.

Platform Support

Available for: Windows, macOS, Linux

System Requirements

  • Minimum RAM: 4 GB
  • GPU: Not required — runs on CPU

Links

Official Website · GitHub Repository

Full description coming soon. Check the official website or GitHub for more details.
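The tagline's bundled proxy server is driven by a config file that maps the model names clients request onto concrete provider models. A minimal sketch, assuming the model ids and environment variable here are placeholders:

```yaml
# config.yaml — route an OpenAI-style model name to a concrete provider model.
model_list:
  - model_name: gpt-4o                    # name clients request
    litellm_params:
      model: openai/gpt-4o                # provider/model the proxy calls
      api_key: os.environ/OPENAI_API_KEY  # resolved from the environment
```

Starting the proxy with `litellm --config config.yaml` exposes an OpenAI-compatible endpoint, so existing OpenAI SDK clients only need their base URL pointed at the proxy.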

Frequently Asked Questions

What is LiteLLM?

LiteLLM is a unified API for 100+ LLMs using the OpenAI format, with a proxy server included. It is open source, completely free to use, works 100% offline, and runs on CPU without a dedicated GPU.

Is LiteLLM free?

Yes, LiteLLM is completely free to use. It's also open source.

Does LiteLLM work offline?

Yes, LiteLLM works 100% offline once installed.

What platforms does LiteLLM support?

LiteLLM is available for Windows, macOS, and Linux.

Stats

  • Stars: 22,000
  • Last commit: today
  • Self-hosted: Yes
View Repository

Requirements

  • Platforms: Windows, macOS, Linux
  • Offline capable: Yes
  • Minimum RAM: 4 GB
  • GPU required: No