llamaperf

A100 40GB vs RTX 4090

For running local LLMs · 3 reports across 2 models

Side A: A100 40GB
  Vendor: nvidia
  VRAM:   40GB
  Memory: Discrete

Side B: RTX 4090
  Vendor: nvidia
  VRAM:   24GB
  Memory: Discrete
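The VRAM gap (40GB vs 24GB) is the practical difference between the two sides: it bounds which quantized models fit on each card. A rough back-of-envelope check, using illustrative assumptions (8-bit weights at ~1 byte per parameter, a flat 2GB overhead for KV cache and runtime) rather than anything measured on this page:

```python
def fits_in_vram(params_b, vram_gb, bytes_per_param=1.0, overhead_gb=2.0):
    # Sketch only: weight bytes at ~1 byte/param (8-bit quantization,
    # an assumed figure) plus a flat overhead for KV cache and runtime.
    return params_b * bytes_per_param + overhead_gb <= vram_gb

# A hypothetical 31B model at 8-bit: ~31GB weights + 2GB overhead
print(fits_in_vram(31, 40))  # A100 40GB  -> True
print(fits_in_vram(31, 24))  # RTX 4090 24GB -> False
```

At a tighter 4-bit quantization (~0.5 bytes/param) the same 31B model would fit on both cards, which is why quantization choice matters as much as the raw VRAM numbers.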

Tokens per second by model

Model                 Tokens/s   Reports
Gemma 4 (up to 31B)   149.6      n=2
Qwen3.6 (up to 35B)   25.0       n=1
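The n= counts indicate each figure is aggregated from multiple user reports. A minimal sketch of that aggregation step (the individual sample values below are hypothetical, chosen only to illustrate the mean-and-count shape of the table):

```python
from statistics import mean

def aggregate(reports):
    # Collapse raw (model, tokens/s) reports into a per-model
    # mean throughput and report count.
    by_model = {}
    for model, tps in reports:
        by_model.setdefault(model, []).append(tps)
    return {m: {"tok_s": round(mean(v), 1), "n": len(v)}
            for m, v in by_model.items()}

# Hypothetical raw reports
samples = [("Gemma 4", 150.0), ("Gemma 4", 149.2), ("Qwen3", 25.0)]
print(aggregate(samples))
# -> {'Gemma 4': {'tok_s': 149.6, 'n': 2}, 'Qwen3': {'tok_s': 25.0, 'n': 1}}
```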
