
RLAMA

Open-source RAG CLI for Ollama - chat with your local documents


About RLAMA

RLAMA screenshot


RLAMA works 100% offline, is open source and completely free to use, and runs on a CPU without requiring a dedicated GPU.
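To make the RAG (retrieval-augmented generation) pattern behind a tool like RLAMA concrete, here is a minimal, self-contained sketch: index local document chunks, retrieve the most relevant one for a question, and assemble a prompt for a local model. This is an illustration of the general technique, not RLAMA's actual implementation; the toy bag-of-words "embedding" stands in for real neural embeddings, and the Ollama call is left as a comment so the sketch runs fully offline.

```python
# Minimal RAG sketch (illustrative, not RLAMA's internals):
# 1. "embed" document chunks, 2. retrieve by similarity, 3. build a prompt
# that would be sent to a locally served model (e.g. via Ollama).
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words vector; real tools use neural embedding models."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(chunks: list[str], question: str, k: int = 1) -> list[str]:
    """Return the k chunks most similar to the question."""
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)
    return ranked[:k]

chunks = [
    "RLAMA indexes documents into a local vector store.",
    "Ollama serves large language models on your own machine.",
    "A GPU is not required; inference can run on the CPU.",
]
question = "Do I need a GPU?"
context = retrieve(chunks, question)[0]
prompt = f"Context:\n{context}\n\nQuestion: {question}"
# In a real pipeline, the prompt would go to a locally running model,
# e.g. Ollama's HTTP API at http://localhost:11434 -- no cloud call involved.
print(context)
```

Because both retrieval and generation happen against local data and a locally served model, no network access is needed once the model is downloaded, which is what "100% offline" means in practice.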

Platform Support

Available for: Windows, macOS, Linux

System Requirements

  • Minimum RAM: 8 GB
  • GPU: Not required — runs on CPU

Links

Official Website · GitHub Repository

Full description coming soon. Check the official website or GitHub for more details.

Frequently Asked Questions

What is RLAMA?

RLAMA is an open-source RAG (retrieval-augmented generation) CLI for Ollama that lets you chat with your local documents. It works 100% offline, is completely free to use, and runs on a CPU without a dedicated GPU.

Is RLAMA free?

Yes, RLAMA is completely free to use. It's also open source.

Does RLAMA work offline?

Yes, RLAMA works 100% offline once installed.

What platforms does RLAMA support?

RLAMA is available for Windows, macOS, and Linux.

Stats

  • Self-hosted: Yes

View Repository

Requirements

  • Platforms: Windows, macOS, Linux
  • Offline capable: Yes
  • Minimum RAM: 8 GB
  • GPU required: No