llamaperf

A100 40GB vs RTX 4060 Ti 16GB

For running local LLMs · 4 reports across 1 model

| Spec   | Side A: A100 40GB | Side B: RTX 4060 Ti 16GB |
|--------|-------------------|--------------------------|
| Vendor | NVIDIA            | NVIDIA                   |
| VRAM   | 40 GB             | 16 GB                    |
| Memory | Discrete          | Discrete                 |

Tokens per second by model

| Model               | A100 40GB       | RTX 4060 Ti 16GB |
|---------------------|-----------------|------------------|
| Gemma 4 (up to 31B) | 45.0 tok/s (n=4) | —               |
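The VRAM figures above are the main constraint for running local LLMs: the model weights have to fit in GPU memory. As a rough back-of-the-envelope check (weights only; KV cache and runtime overhead add several more GB), memory in GB is roughly parameters × bits-per-weight ÷ 8. This sketch applies that rule to a hypothetical 31B-parameter model at common quantization levels:

```python
def weights_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory needed for model weights, in GB.

    Weights only -- KV cache, activations, and runtime overhead
    are not included, so real usage will be higher.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# Weight footprint of a 31B model at common quantizations:
for bits, name in [(16, "FP16"), (8, "Q8"), (4, "Q4")]:
    print(f"31B at {name}: ~{weights_gb(31, bits):.1f} GB")
# FP16 (~62 GB) exceeds both cards; Q8 (~31 GB) fits only the A100 40GB;
# Q4 (~15.5 GB) is a tight fit on the RTX 4060 Ti 16GB.
```

This is why a 16 GB card can only host a model of this size at aggressive (4-bit) quantization, while the 40 GB A100 has room for 8-bit weights plus cache.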
