# llamafile

Distribute and run LLMs with a single file.
We want to hear from you! Mozilla.ai recently adopted the llamafile project, and we're planning an approach for codebase modernization. Please share what you find most valuable about llamafile and what would make it more useful for your work. Read more via the blog and add your voice to the discussion here.
<img src="llamafile/llamafile-640x640.png" width="320" height="320" alt="[line drawing of llama animal head in front of slightly open manilla folder filled with files]">
llamafile lets you distribute and run LLMs with a single file. (announcement blog post)
Our goal is to make open LLMs much more accessible to both developers and end users. We're doing that by combining llama.cpp with Cosmopolitan Libc into one framework that collapses all the complexity of LLMs down to a single-file executable (called a "llamafile") that runs locally on most computers, with no installation.

<a href="https://builders.mozilla.org/"><img src="llamafile/mozilla-logo-bw-rgb.png" width="150" alt="Mozilla Builders logo"></a>

llamafile is a <a href="https://builders.mozilla.org/">Mozilla Builders</a> project.
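Because the engine and the model weights ship together in one artifact, a llamafile can also be inspected like an ordinary archive: the weights are embedded as a ZIP payload appended to the executable. As a rough illustration, once you've downloaded one (see the Quick Start below), standard zip tools can usually list the embedded files; `model.llamafile` below is a placeholder name.

```sh
# List the GGUF weights embedded in a llamafile (illustrative sketch).
# unzip may warn about the executable bytes that precede the archive.
unzip -l model.llamafile
```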
## Quick Start
Download and run your first llamafile in minutes:
```sh
# Download an example model (LLaVA 1.5 7B)
curl -LO https://huggingface.co/Mozilla/llava-v1.5-7b-llamafile/resolve/main/llava-v1.5-7b-q4.llamafile

# Make it executable (macOS/Linux/BSD)
chmod +x llava-v1.5-7b-q4.llamafile

# Run it (opens browser automatically)
./llava-v1.5-7b-q4.llamafile
```
Windows users: rename the file to add a `.exe` extension before running it.
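Once a llamafile is running, it also serves a local HTTP API alongside the browser UI. The sketch below assumes the default listen address of http://localhost:8080 and the OpenAI-style chat completions endpoint; the port and model name may differ depending on your setup and flags.

```sh
# Query the built-in server's OpenAI-compatible endpoint
# (assumes the default listen address of localhost:8080).
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "LLaMA_CPP",
    "messages": [
      {"role": "user", "content": "Why is the sky blue?"}
    ]
  }'
```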
## Documentation
See the full documentation in the docs/ folder or online at mozilla-ai.github.io/llamafile, or jump directly to one of the following sections:
- Quickstart
- Supported Systems
- Example llamafiles
- Creating llamafiles
- Source installation
- Technical details
- Security
- Troubleshooting
## Licensing
While the llamafile project is Apache 2.0-licensed, our changes to llama.cpp are licensed under MIT (just like the llama.cpp project itself) so as to remain compatible and upstreamable in the future, should that be desired.
The llamafile logo on this page was generated with the assistance of DALL·E 3.