Side-by-side comparison of three local inference runtimes.
| Metric | llama.cpp | ONNX Runtime | LocalAI |
|---|---|---|---|
| Score | C+ (63) | C- (50) | F (37) |
| Execution | AOT | Hybrid | Hybrid |
| Interface | CLI | SDK | API |
| Cold Start | 100ms | 500ms | 3000ms |
| Memory | 50MB | 300MB | 800MB |
| Startup | 10ms | 100ms | 1000ms |
| Isolation | process | process | container |
| Maturity | production | production | stable |
| Languages | C, C++ | Python, C++, C#, Java | Go, Python |
| License | MIT | MIT | MIT |
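The table's figures can also be kept machine-readable for quick filtering. A minimal sketch (values copied from the table above; the dict layout and field names are illustrative assumptions, not part of any runtime's API):

```python
# Comparison data transcribed from the table above.
runtimes = {
    "llama.cpp":    {"cold_start_ms": 100,  "startup_ms": 10,   "memory_mb": 50},
    "ONNX Runtime": {"cold_start_ms": 500,  "startup_ms": 100,  "memory_mb": 300},
    "LocalAI":      {"cold_start_ms": 3000, "startup_ms": 1000, "memory_mb": 800},
}

# Pick the runtime with the lowest cold start and the smallest footprint.
fastest = min(runtimes, key=lambda name: runtimes[name]["cold_start_ms"])
lightest = min(runtimes, key=lambda name: runtimes[name]["memory_mb"])
print(fastest, lightest)  # llama.cpp llama.cpp
```

By these numbers llama.cpp leads on both axes, which is consistent with its AOT execution model and CLI interface carrying less runtime machinery than the SDK- and API-based options.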