llamaperf
H100 80GB vs M2 16GB
For running local LLMs · 2 reports across 2 models
            Side A: H100 80GB    Side B: M2 16GB
Vendor      NVIDIA               Apple
VRAM        80GB                 16GB
Memory      Discrete             Unified
Tokens per second by model
Model                  H100 80GB    M2 16GB
Gemma 4 (up to 31B)    —            18.0 (n=1)
Qwen3.6 (up to 35B)    45.0 (n=1)   —
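The throughput table above can be handled programmatically. A minimal sketch, assuming a simple nested-dict layout (the dict structure and function name are illustrative, not llamaperf's actual schema; the numbers come straight from the table):

```python
# Benchmark reports from the comparison table; a missing entry means
# no report exists for that (model, GPU) pair.
reports = {
    "Gemma 4 (up to 31B)": {"M2 16GB": 18.0},    # n=1; no H100 80GB report
    "Qwen3.6 (up to 35B)": {"H100 80GB": 45.0},  # n=1; no M2 16GB report
}

def tokens_per_sec(model: str, gpu: str) -> str:
    """Render one table cell: measured tokens/sec, or a dash when unreported."""
    tps = reports.get(model, {}).get(gpu)
    return f"{tps:.1f}" if tps is not None else "—"

# Print the side-by-side view, one row per model.
for model in reports:
    print(f"{model:24} H100 80GB: {tokens_per_sec(model, 'H100 80GB'):>5}  "
          f"M2 16GB: {tokens_per_sec(model, 'M2 16GB'):>5}")
```

With only one report (n=1) per cell, these figures are single samples rather than averages, so the helper deliberately renders a dash instead of guessing at missing pairs.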
More comparisons
- H100 80GB vs RTX 5090
- M2 16GB vs RTX 5090
- H100 80GB vs RX 7900 XTX
- M2 16GB vs RX 7900 XTX
- H100 80GB vs RTX 3090
- M2 16GB vs RTX 3090