A side-by-side comparison of local LLM runtimes:
| Metric | llama.cpp | ROCm | LLM (Python CLI) | MLX |
|---|---|---|---|---|
| Score | C+ (63) | C (56) | D (49) | D (47) |
| Execution | AOT | AOT | Hybrid | JIT |
| Interface | CLI | SDK | CLI | SDK |
| Cold Start | 100 ms | 200 ms | 500 ms | 500 ms |
| Memory | 50 MB | 600 MB | 100 MB | 200 MB |
| Startup | 10 ms | 100 ms | 100 ms | 100 ms |
| Isolation | process | hardware | process | process |
| Maturity | production | stable | stable | stable |
| Languages | C, C++ | C, C++, Python | Python | Python, C++, Swift |
| License | MIT | MIT | Apache-2.0 | MIT |