Side-by-side comparison of three LLM inference runtimes: llama.cpp, LocalAI, and vLLM.
| Metric | llama.cpp | LocalAI | vLLM |
|---|---|---|---|
| Score | C+ (63) | F (37) | F (35) |
| Execution | AOT | Hybrid | JIT |
| Interface | CLI | API | API |
| Cold Start | 100ms | 3000ms | 5000ms |
| Memory | 50MB | 800MB | 2000MB |
| Startup | 10ms | 1000ms | 3000ms |
| Isolation | process | container | process |
| Maturity | production | stable | production |
| Languages | C, C++ | Go, Python | Python |
| License | MIT | MIT | Apache-2.0 |
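The numeric columns above lend themselves to programmatic filtering, e.g. picking a runtime that fits a resource budget. A minimal sketch (the dictionary transcribes the table; the function name and thresholds are illustrative, not part of any of these projects):

```python
# Metrics transcribed from the comparison table above.
RUNTIMES = {
    "llama.cpp": {"cold_start_ms": 100,  "memory_mb": 50,   "startup_ms": 10,   "interface": "CLI"},
    "LocalAI":   {"cold_start_ms": 3000, "memory_mb": 800,  "startup_ms": 1000, "interface": "API"},
    "vLLM":      {"cold_start_ms": 5000, "memory_mb": 2000, "startup_ms": 3000, "interface": "API"},
}

def within_budget(max_memory_mb: int, max_cold_start_ms: int) -> list[str]:
    """Return the runtimes whose memory footprint and cold start fit the given limits."""
    return [name for name, m in RUNTIMES.items()
            if m["memory_mb"] <= max_memory_mb and m["cold_start_ms"] <= max_cold_start_ms]

print(within_budget(1000, 4000))  # vLLM exceeds both limits and is filtered out
```

With a 1000 MB memory cap and a 4-second cold-start cap, llama.cpp and LocalAI qualify while vLLM is excluded.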