The encyclopedia of AI compute. Detailed specs, benchmarks, pricing, and use-case guides for every GPU and LLM we track.
GPUs Tracked: 8 · Models Indexed: 8
| GPU | Vendor | VRAM | FP16 TFLOPS | Mem BW | TDP | Avg Price | Best For |
|---|---|---|---|---|---|---|---|
| H100 SXM | NVIDIA | 80GB | 989.4 | 3.35 TB/s | 700W | $2.49/hr | LLM Training, LLM Inference |
| H200 SXM | NVIDIA | 141GB | 989.4 | 4.8 TB/s | 700W | $4.20/hr | LLM Training, Large Models |
| A100 SXM | NVIDIA | 80GB | 312 | 2.0 TB/s | 400W | $1.29/hr | LLM Inference, Fine-tuning |
| L40S | NVIDIA | 48GB | 362.1 | 864 GB/s | 350W | $1.09/hr | Inference, Fine-tuning |
| RTX A6000 | NVIDIA | 48GB | 38.7 | 768 GB/s | 300W | $0.79/hr | Inference, 3D Rendering |
| RTX 4090 | NVIDIA | 24GB | 82.6 | 1.01 TB/s | 450W | $0.59/hr | Small Model Inference, Fine-tuning |
| MI300X | AMD | 192GB | 1,307.4 | 5.3 TB/s | 750W | $3.89/hr | LLM Training, LLM Inference |
| B200 | NVIDIA | 192GB | 2,250 | 8.0 TB/s | 1000W | $6.50/hr | Next-gen Training, Frontier Models |
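One quick way to read the table is throughput per rented dollar. A minimal sketch, using only the FP16 TFLOPS and average hourly prices listed above (prices vary by provider, so treat the ranking as illustrative):

```python
# FP16 TFLOPS per $/hr for each GPU, using the figures from the table above.
# Average prices are illustrative and vary by cloud provider.
gpus = {
    "H100 SXM":  (989.4, 2.49),
    "H200 SXM":  (989.4, 4.20),
    "A100 SXM":  (312.0, 1.29),
    "L40S":      (362.1, 1.09),
    "RTX A6000": (38.7, 0.79),
    "RTX 4090":  (82.6, 0.59),
    "MI300X":    (1307.4, 3.89),
    "B200":      (2250.0, 6.50),
}

def tflops_per_dollar(tflops: float, price_per_hr: float) -> float:
    """FP16 TFLOPS delivered per dollar of hourly rental cost."""
    return tflops / price_per_hr

# Print GPUs from most to least cost-effective on this metric.
for name, (tflops, price) in sorted(
    gpus.items(), key=lambda kv: -tflops_per_dollar(*kv[1])
):
    print(f"{name:10s} {tflops_per_dollar(tflops, price):7.1f} TFLOPS per $/hr")
```

Raw TFLOPS per dollar ignores memory capacity and bandwidth, so it favors compute-dense parts; for memory-bound inference the Mem BW column matters at least as much.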
**H100 SXM 80GB** · NVIDIA · Hopper
NVIDIA's flagship data center GPU. The gold standard for large-scale AI training and inference workloads.
VRAM: 80GB · FP16: 989.4 TF · Avg Price: $2.49/hr
**H200 SXM 141GB** · NVIDIA · Hopper
Enhanced Hopper with 141GB HBM3e memory and 4.8 TB/s bandwidth. Ideal for the largest language models.
VRAM: 141GB · FP16: 989.4 TF · Avg Price: $4.20/hr
**A100 SXM 80GB** · NVIDIA · Ampere
Previous-generation data center GPU. Still widely used and cost-effective for inference and fine-tuning.
VRAM: 80GB · FP16: 312 TF · Avg Price: $1.29/hr
**L40S 48GB** · NVIDIA · Ada Lovelace
Versatile mid-range GPU for inference and multimedia AI workloads. Good balance of performance and cost.
VRAM: 48GB · FP16: 362.1 TF · Avg Price: $1.09/hr
**RTX A6000 48GB** · NVIDIA · Ampere
Professional workstation GPU with large VRAM. Popular for inference and visualization tasks.
VRAM: 48GB · FP16: 38.7 TF · Avg Price: $0.79/hr
**RTX 4090 24GB** · NVIDIA · Ada Lovelace
Top consumer GPU. Excellent price-performance for smaller models and experimentation.
VRAM: 24GB · FP16: 82.6 TF · Avg Price: $0.59/hr
**MI300X 192GB** · AMD · CDNA 3
AMD's flagship AI accelerator with massive 192GB HBM3 memory. Strong competitor to the H100.
VRAM: 192GB · FP16: 1,307.4 TF · Avg Price: $3.89/hr
**B200 192GB** · NVIDIA · Blackwell
NVIDIA's next-generation Blackwell GPU. Roughly 2x the FP16 throughput of the H100, with 192GB HBM3e.
VRAM: 192GB · FP16: 2,250 TF · Avg Price: $6.50/hr
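A common first question when choosing among the cards above is whether a model's weights fit in VRAM. A rough rule of thumb, not an exact formula: FP16 weights take 2 bytes per parameter, and the 1.2x overhead factor below is an assumed fudge for KV cache, activations, and runtime buffers; real requirements vary with batch size and context length:

```python
def min_vram_gb_fp16(n_params_billion: float, overhead: float = 1.2) -> float:
    """Rough VRAM (GB) needed to serve a model with FP16 weights.

    2 bytes per parameter at FP16; `overhead` is an assumed fudge factor
    for KV cache, activations, and runtime buffers -- actual needs vary.
    """
    return n_params_billion * 2 * overhead

# e.g. 7B params -> ~16.8 GB, fits a 24GB RTX 4090;
#      70B params -> ~168 GB, needs a 192GB MI300X/B200 or multiple GPUs.
for n in (7, 13, 70):
    print(f"{n}B params: ~{min_vram_gb_fp16(n):.1f} GB at FP16")
```

Quantization to 8-bit or 4-bit weights cuts the 2-bytes-per-parameter term proportionally, which is why smaller-VRAM cards like the RTX 4090 are listed for small-model inference.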