A side-by-side comparison of the selected runtimes.
| Metric | GGUF | Python (CPython) | Vulkan | llama.cpp |
|---|---|---|---|---|
| Score | A− (83) | B (71) | C+ (64) | C+ (63) |
| Type | File format | Language | API | Library |
| Execution | AOT | Interpreted | AOT | AOT |
| Interface | Embedded | CLI | SDK | CLI |
| Cold Start | <1 ms | 50 ms | 100 ms | 100 ms |
| Memory | 0 MB | 15 MB | 200 MB | 50 MB |
| Startup | <1 ms | 10 ms | 30 ms | 10 ms |
| Isolation | Process | Process | Hardware | Process |
| Maturity | Production | Production | Production | Production |
| Languages | Any | Python | C, C++ | C, C++ |
| License | MIT | PSF-2.0 | Apache-2.0 | MIT |
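The Cold Start and Startup rows are wall-clock launch times. As a rough illustration of how such figures can be reproduced, the sketch below times how long a command takes to start and exit. The command measured here (the local CPython interpreter) and the run count are assumptions for illustration, not the benchmark behind the table.

```python
import subprocess
import sys
import time

def measure_startup(cmd, runs=10):
    """Time how long a command takes to start and exit, over several runs."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        # Discard output so printing cost does not distort the timing.
        subprocess.run(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        samples.append(time.perf_counter() - start)
    return min(samples), sum(samples) / len(samples)

if __name__ == "__main__":
    # Example: time CPython startup with an empty program (assumed command).
    best, mean = measure_startup([sys.executable, "-c", "pass"])
    print(f"best: {best * 1000:.1f} ms, mean: {mean * 1000:.1f} ms")
```

Reporting the minimum alongside the mean helps filter out scheduler noise: the minimum approximates a warm-cache "Startup" number, while the first run after boot, with cold caches, is closer to "Cold Start".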