🐕 RuntimeDog
Compare Runtimes
Select runtimes to compare side by side; clicking a chip in the list below toggles its selection (a minimal sketch of the toggle logic follows the list).
Available runtimes (2 selected: V8 and ExLlamaV2):
WasmEdge
Wasmtime
GGUF
safetensors
Wasmer
wazero
Bun
Cloudflare Workers
Metal
V8
gVisor
Python (CPython)
Vercel Edge Functions
Firecracker
Fastly Compute
containerd
CUDA Runtime
AWS Lambda
Node.js
Deno
Vulkan
Kata Containers
llama.cpp
Google Cloud Functions
Podman
Azure Functions
ROCm
Docker
llamafile
ONNX Runtime
LLM (Python CLI)
Ollama
Candle
ExLlamaV2
MLX
CTransformers
Open WebUI
Text Generation Inference
KoboldCpp
MLC LLM
LocalAI
vLLM
GPT4All
Jan
LM Studio
Text Generation WebUI
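The chip selector above is plain toggle-set logic: clicking a name adds it to the comparison set, or removes it if it is already selected. A minimal TypeScript sketch of that behaviour; the names (toggleRuntime, RuntimeName) are illustrative assumptions, not RuntimeDog's actual code:

    // Hypothetical sketch of the chip-toggle selection. Clicking a chip adds the
    // runtime to the selection if absent, or removes it if already selected.
    type RuntimeName = string;

    function toggleRuntime(selection: Set<RuntimeName>, name: RuntimeName): Set<RuntimeName> {
      const next = new Set(selection);
      if (next.has(name)) {
        next.delete(name); // already selected: clicking again deselects
      } else {
        next.add(name); // newly selected: include in the comparison
      }
      return next;
    }

    // Example: the two runtimes compared in the table below.
    let selected = new Set<RuntimeName>();
    selected = toggleRuntime(selected, "V8");
    selected = toggleRuntime(selected, "ExLlamaV2");
    console.log(Array.from(selected)); // ["V8", "ExLlamaV2"]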
Metric        V8                                     ExLlamaV2
Score         B (74)                                 D (47)
Type          Language                               -
Execution     jit                                    aot
Interface     embedded                               sdk
Cold Start    5 ms                                   1000 ms
Memory        30 MB                                  300 MB
Startup       2 ms                                   200 ms
Isolation     process                                process
Maturity      production                             stable
Languages     JavaScript, TypeScript, WebAssembly    Python, C++, CUDA
License       BSD-3-Clause                           MIT
Links         Website, GitHub                        Website, GitHub
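For anyone scripting against these comparisons, each column of the table maps naturally onto a per-runtime record. A minimal TypeScript sketch populated with the values above; the type and field names (RuntimeProfile, interfaceKind, and so on) are assumptions for illustration, not RuntimeDog's actual schema:

    // Hypothetical shape for one runtime's column in the comparison table.
    // Union types list only the values that appear in the table above.
    interface RuntimeProfile {
      name: string;
      score: { grade: string; value: number }; // e.g. "B", 74
      type?: string;                           // not listed for every runtime
      execution: "jit" | "aot";
      interfaceKind: "embedded" | "sdk";
      coldStartMs: number;
      memoryMB: number;
      startupMs: number;
      isolation: "process";
      maturity: "production" | "stable";
      languages: string[];
      license: string;
    }

    // Values copied from the table above.
    const v8: RuntimeProfile = {
      name: "V8",
      score: { grade: "B", value: 74 },
      type: "Language",
      execution: "jit",
      interfaceKind: "embedded",
      coldStartMs: 5,
      memoryMB: 30,
      startupMs: 2,
      isolation: "process",
      maturity: "production",
      languages: ["JavaScript", "TypeScript", "WebAssembly"],
      license: "BSD-3-Clause",
    };

    const exllamaV2: RuntimeProfile = {
      name: "ExLlamaV2",
      score: { grade: "D", value: 47 }, // its Type is not listed on the page
      execution: "aot",
      interfaceKind: "sdk",
      coldStartMs: 1000,
      memoryMB: 300,
      startupMs: 200,
      isolation: "process",
      maturity: "stable",
      languages: ["Python", "C++", "CUDA"],
      license: "MIT",
    };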