llamaperf

H100 80GB vs RTX Pro 6000 Blackwell

For running local LLMs · 1 report across 1 model

                Side A: H100 80GB    Side B: RTX Pro 6000 Blackwell
Vendor          NVIDIA               NVIDIA
VRAM            80 GB                96 GB
Memory          Discrete             Discrete
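The VRAM difference (80 GB vs 96 GB) matters mainly for which models fit on a single card. A minimal sketch of that sizing check, assuming a rough rule of 1 GB per billion parameters per byte of precision plus ~10% overhead for KV cache and activations (these factors are illustrative assumptions, not llamaperf's methodology):

```python
# Rough check of which quantized model sizes fit in each card's VRAM.
# The bytes-per-parameter values and the ~10% overhead factor are
# assumptions for illustration only.

def fits(params_b: float, bytes_per_param: float, vram_gb: float,
         overhead: float = 1.10) -> bool:
    """True if the weights (plus assumed overhead) fit in VRAM."""
    weights_gb = params_b * bytes_per_param  # 1B params * 1 byte ≈ 1 GB
    return weights_gb * overhead <= vram_gb

# A 35B model at FP16 (2 bytes/param) vs ~4-bit (~0.5 bytes/param):
for vram_gb in (80, 96):
    print(f"{vram_gb} GB: FP16={fits(35, 2.0, vram_gb)}, "
          f"Q4={fits(35, 0.5, vram_gb)}")
```

Under these assumptions a 35B model fits either card even at FP16, while a 70B FP16 model (~140 GB of weights) fits neither; the extra 16 GB on the RTX Pro 6000 mostly buys longer context or larger quantized models.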

Tokens per second by model

Model                  H100 80GB    RTX Pro 6000 Blackwell
Qwen3 (up to 35B)      —            45.0 tok/s (n=1)
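For single-user decoding, tokens per second is roughly memory-bandwidth bound: each generated token streams the resident weights once, so tok/s ≈ bandwidth / weight size. A back-of-envelope sketch of that estimate; the bandwidth figures (~3.35 TB/s HBM3 for the H100 SXM, ~1.79 TB/s GDDR7 for the RTX Pro 6000 Blackwell), the 4-bit quantization, and the 60% efficiency factor are all assumptions, not llamaperf measurements:

```python
# Back-of-envelope decode throughput: tok/s ≈ bandwidth / bytes moved
# per token (≈ resident weight size for batch-1 decode).
# Bandwidth figures are assumed public specs, not llamaperf data.

BANDWIDTH_GBPS = {
    "H100 80GB": 3350,               # HBM3, ~3.35 TB/s (assumed)
    "RTX Pro 6000 Blackwell": 1790,  # GDDR7, ~1.79 TB/s (assumed)
}

def decode_tps(weights_gb: float, bandwidth_gbps: float,
               efficiency: float = 0.6) -> float:
    """Ideal bandwidth-bound tokens/s, scaled by an assumed efficiency."""
    return bandwidth_gbps * efficiency / weights_gb

weights_gb = 35 * 0.5  # 35B params at ~4-bit quantization (assumption)
for gpu, bw_gbps in BANDWIDTH_GBPS.items():
    print(f"{gpu}: ~{decode_tps(weights_gb, bw_gbps):.0f} tok/s")
```

This ceiling-style estimate lands in the tens-to-low-hundreds of tok/s for a quantized 35B model, the same order as the single reported 45.0 tok/s; real numbers depend on runtime, batch size, and context length.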
