llamaperf

A100 40GB vs A100 80GB

For running local LLMs · 0 reports across 0 models

            Side A       Side B
GPU         A100 40GB    A100 80GB
Vendor      nvidia       nvidia
VRAM        40GB         80GB
Memory      Discrete     Discrete

No shared model reports between these two yet.
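The practical difference between the two cards is VRAM headroom for model weights plus KV cache. A minimal back-of-envelope sketch of that sizing arithmetic, assuming a hypothetical 70B-parameter model at 4-bit quantization with illustrative shapes (80 layers, 8 KV heads, head dim 128, 8k context) and a flat 2 GB runtime overhead:

```python
# Rough VRAM estimator for a local LLM: weights + KV cache + overhead.
# All numbers here are approximations, not a definitive sizing tool.

def weights_gb(params_b: float, bits: int) -> float:
    """Approximate weight memory in GB for params_b billion parameters."""
    return params_b * 1e9 * bits / 8 / 1e9

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context: int, bytes_per_elem: int = 2) -> float:
    """Approximate KV-cache memory in GB (K and V, fp16 by default)."""
    return 2 * layers * kv_heads * head_dim * context * bytes_per_elem / 1e9

def fits(vram_gb: float, params_b: float, bits: int, kv_gb: float,
         overhead_gb: float = 2.0) -> bool:
    """Does the model fit, leaving overhead_gb for activations/runtime?"""
    return weights_gb(params_b, bits) + kv_gb + overhead_gb <= vram_gb

# Hypothetical 70B model, 4-bit weights, 8k context.
kv = kv_cache_gb(layers=80, kv_heads=8, head_dim=128, context=8192)
print(f"weights: {weights_gb(70, 4):.1f} GB, kv cache: {kv:.1f} GB")
print("fits in 40GB:", fits(40, 70, 4, kv))
print("fits in 80GB:", fits(80, 70, 4, kv))
```

Under these assumptions the 4-bit 70B model is right at the edge of the 40GB card (~35 GB of weights before cache and overhead), while the 80GB card leaves room for longer contexts or higher-precision quantization.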
