| Metric | Python (CPython) | llama.cpp | ONNX Runtime | ExLlamaV2 |
|---|---|---|---|---|
| Score | B (71) | C+ (63) | C- (50) | D (47) |
| Type | Language | | | |
| Execution | interpreted | AOT | hybrid | AOT |
| Interface | CLI | CLI | SDK | SDK |
| Cold Start | 50ms | 100ms | 500ms | 1000ms |
| Memory | 15MB | 50MB | 300MB | 300MB |
| Startup | 10ms | 10ms | 100ms | 200ms |
| Isolation | process | process | process | process |
| Maturity | production | production | production | stable |
| Languages | Python | C, C++ | Python, C++, C#, Java | Python, C++, CUDA |
| License | PSF | MIT | MIT | MIT |
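The "Startup" figures above can be reproduced approximately by timing how long each runtime takes to launch a no-op process and exit. Below is a minimal sketch of such a harness; `measure_startup` is a hypothetical helper, not part of any of the compared projects, and the exact numbers will vary by machine and OS caching state.

```python
import subprocess
import sys
import time

def measure_startup(cmd, runs=5):
    """Average wall-clock time (ms) to launch a command and let it exit."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        samples.append((time.perf_counter() - start) * 1000.0)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # Time a no-op CPython process, roughly comparable to the "Startup" row.
    # Swap in another runtime's binary (e.g. a llama.cpp CLI) to compare.
    ms = measure_startup([sys.executable, "-c", "pass"])
    print(f"CPython startup: {ms:.1f} ms")
```

The first run typically includes cold-start costs (loading the binary and libraries from disk), while later runs hit the OS page cache, which is one way to separate the table's "Cold Start" and "Startup" metrics.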