
LiteLLM
Unified API for 100+ LLMs using the OpenAI format - proxy server included
About LiteLLM

LiteLLM provides a unified API for 100+ LLMs using the OpenAI format, with a proxy server included. It works 100% offline, is open source, is completely free to use, and runs on CPU without a dedicated GPU.
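The "OpenAI format" here means the chat-completion request shape: the same model-plus-messages payload works for every provider, with the provider encoded as a prefix on the model string. A minimal sketch of that convention in plain Python (`build_request` is a hypothetical helper for illustration; the real library exposes `litellm.completion()`, which accepts these same fields directly):

```python
def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-format chat request.

    Hypothetical helper for illustration only; LiteLLM itself takes
    `model` and `messages` as arguments to `completion()`.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The request shape is identical across providers; only the model string changes.
openai_req = build_request("gpt-4o-mini", "Hello")
local_req = build_request("ollama/llama3", "Hello")

# LiteLLM routes on the provider prefix in the model string:
provider = local_req["model"].split("/", 1)[0]
print(provider)  # ollama
```

Swapping providers therefore means changing one string, not rewriting request-handling code.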
Platform Support
Available for: Windows, macOS, Linux
System Requirements
- Minimum RAM: 4 GB
- GPU: Not required — runs on CPU
Links
Official Website · GitHub Repository
Frequently Asked Questions
What is LiteLLM?
LiteLLM is a unified API for 100+ LLMs using the OpenAI format, with a proxy server included. It works 100% offline, is open source, is completely free to use, and runs on CPU without a dedicated GPU.
Is LiteLLM free?
Yes, LiteLLM is completely free to use. It's also open source.
Does LiteLLM work offline?
Yes, LiteLLM works 100% offline once installed.
What platforms does LiteLLM support?
LiteLLM is available for Windows, macOS, and Linux.
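The included proxy server exposes any configured backend behind a single OpenAI-compatible endpoint. A minimal config sketch (model names are placeholders, and field names may differ between versions; check the LiteLLM docs for your release):

```yaml
# config.yaml - one local alias mapped to a backend model
model_list:
  - model_name: my-local-model
    litellm_params:
      model: ollama/llama3
```

Launching the proxy with `litellm --config config.yaml` then lets any OpenAI-format client point at it instead of a hosted API.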
Related Tools
Ollama
Run large language models locally with a simple CLI interface

LM Studio
Discover, download, and run local LLMs with an easy-to-use desktop app

Jan
Open-source ChatGPT alternative that runs 100% offline on your computer

GPT4All
Free-to-use, locally running, privacy-aware chatbot by Nomic AI

Text Generation WebUI
The AUTOMATIC1111 of text generation - maximum control for LLMs

KoboldCpp
Easy-to-use AI text generation software for GGML/GGUF models
