llamaperf

A100 80GB vs RTX Pro 6000 Blackwell

For running local LLMs · 0 reports across 0 models

Spec      Side A: A100 80GB    Side B: RTX Pro 6000 Blackwell
Vendor    NVIDIA               NVIDIA
VRAM      80 GB                96 GB
Memory    Discrete             Discrete

No shared model reports between these two yet.
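The VRAM figures above are what decide which local LLMs each card can hold. A minimal sketch of that fit check, assuming typical bytes-per-parameter values for common quantization formats and a 1.2x overhead factor for KV cache, activations, and runtime buffers (none of these numbers come from the comparison data; they are illustrative assumptions):

```python
# Rough estimate of whether a quantized model fits in a card's VRAM.
# Assumed constants (not from the page above): bytes per parameter for
# common formats, plus a flat 1.2x overhead for KV cache and activations.

BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}
OVERHEAD = 1.2

def fits(params_billion: float, quant: str, vram_gb: float) -> bool:
    """Return True if the model is likely to fit in vram_gb of VRAM."""
    weights_gb = params_billion * BYTES_PER_PARAM[quant]
    return weights_gb * OVERHEAD <= vram_gb

for vram_gb, card in [(80, "A100 80GB"), (96, "RTX Pro 6000 Blackwell")]:
    print(card, "fits 70B at q8:", fits(70, "q8", vram_gb))
```

Under these assumptions a 70B model at 8-bit needs about 84 GB, so it would spill out of the A100's 80 GB but fit in the Blackwell card's 96 GB; at 4-bit it fits comfortably on either.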
