
vLLM

Easy, fast, and cheap LLM serving for everyone

About vLLM

vLLM is an easy, fast, and cheap LLM serving engine for everyone. It works 100% offline, is open source, and is completely free to use.

Platform Support

Available for: Linux

System Requirements

  • Minimum RAM: 16 GB
  • GPU: Required for best performance

Links

Official Website · GitHub Repository

Full description coming soon. Check the official website or GitHub for more details.
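As an illustrative quick start (based on vLLM's public documentation, not on this page; the model name and port below are example values), vLLM is installed with pip and exposes an OpenAI-compatible HTTP server:

```shell
# Install vLLM (requires Linux and a supported GPU)
pip install vllm

# Start an OpenAI-compatible server; the model name is an example
vllm serve Qwen/Qwen2.5-0.5B-Instruct --port 8000

# From another terminal, query the standard /v1/completions endpoint
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Qwen/Qwen2.5-0.5B-Instruct", "prompt": "Hello,", "max_tokens": 16}'
```

Because the server speaks the OpenAI API, existing OpenAI client libraries can be pointed at it by changing the base URL.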

Frequently Asked Questions

What is vLLM?

vLLM is an easy, fast, and cheap LLM serving engine for everyone. It works 100% offline, is open source, and is completely free to use. It is available for Linux and requires a minimum of 16 GB RAM, with a GPU recommended for best performance.

Is vLLM free?

Yes, vLLM is completely free to use. It's also open source.

Does vLLM work offline?

Yes, vLLM works 100% offline once installed.

What platforms does vLLM support?

vLLM is available for Linux.

Stats

Stars: 70,640
Last commit: today
Self-hosted: Yes
View Repository

Requirements

Platforms: Linux
Offline capable: Yes
Minimum RAM: 16 GB
GPU required: Yes