llamaperf

H100 80GB vs RTX 3060 12GB

For running local LLMs · 4 reports across 2 models

                 Side A: H100 80GB    Side B: RTX 3060 12GB
Vendor           NVIDIA               NVIDIA
VRAM             80 GB                12 GB
Memory           Discrete             Discrete

Tokens per second by model

Model                  Tokens/s    Reports
Gemma 4 (up to 31B)    60.0        n=3
Qwen3.6 (up to 35B)    45.0        n=1
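The throughput figures above are tokens per second: the number of tokens a model emits divided by the wall-clock time taken to emit them. A minimal sketch of that measurement, using a hypothetical stand-in generator in place of a real local-LLM backend:

```python
import time

def tokens_per_second(generate, prompt, n_tokens):
    """Time one generation call and return throughput in tokens/s.

    `generate` is any callable producing `n_tokens` tokens for `prompt`;
    here it stands in for a real inference backend (an assumption, not
    the site's actual benchmark harness).
    """
    start = time.perf_counter()
    generate(prompt, n_tokens)
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

# Hypothetical "model" that sleeps 1 ms per token, so throughput is
# bounded above by ~1000 tok/s (sleep overhead pushes it lower).
def fake_generate(prompt, n_tokens):
    for _ in range(n_tokens):
        time.sleep(0.001)

rate = tokens_per_second(fake_generate, "hello", 50)
print(f"{rate:.1f} tok/s")
```

Reported numbers like the 60.0 tok/s above are typically an average of several such runs (hence the n=3 / n=1 report counts).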
