llamaperf

M2 Max 96GB vs RTX Pro 6000 Blackwell

For running local LLMs · 6 reports across 2 models

                 Side A          Side B
Card             M2 Max 96GB     RTX Pro 6000 Blackwell
Vendor           Apple           NVIDIA
VRAM             96GB            96GB
Memory           Unified         Discrete

Tokens per second by model

Model               M2 Max 96GB     RTX Pro 6000 Blackwell
Llama 3.3           n=3
Qwen3 (up to 35B)   28.0 (n=2)      n=1
