
Knowledge Base

The encyclopedia of AI compute. Detailed specs, benchmarks, pricing, and use-case guides for every GPU and LLM model.

GPUs Tracked: 8 · Models Indexed: 8

GPU Comparison Table (8 GPUs)

| GPU | Vendor | VRAM | FP16 TFLOPS | Mem BW | TDP | Avg Price | Best For |
|---|---|---|---|---|---|---|---|
| H100 SXM 80GB | NVIDIA | 80GB | 989.4 | 3.35 TB/s | 700W | $2.49/hr | LLM Training, LLM Inference |
| H200 SXM 141GB | NVIDIA | 141GB | 989.4 | 4.8 TB/s | 700W | $4.20/hr | LLM Training, Large Models |
| A100 SXM 80GB | NVIDIA | 80GB | 312 | 2.0 TB/s | 400W | $1.29/hr | LLM Inference, Fine-tuning |
| L40S 48GB | NVIDIA | 48GB | 362.1 | 864 GB/s | 350W | $1.09/hr | Inference, Fine-tuning |
| RTX A6000 48GB | NVIDIA | 48GB | 38.7 | 768 GB/s | 300W | $0.79/hr | Inference, 3D Rendering |
| RTX 4090 24GB | NVIDIA | 24GB | 82.6 | 1.01 TB/s | 450W | $0.59/hr | Small Model Inference, Fine-tuning |
| MI300X 192GB | AMD | 192GB | 1,307.4 | 5.3 TB/s | 750W | $3.89/hr | LLM Training, LLM Inference |
| B200 192GB | NVIDIA | 192GB | 2,250 | 8.0 TB/s | 1000W | $6.50/hr | Next-gen Training, Frontier Models |
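One way to read the table is cost per unit of peak compute. A minimal sketch, using only the average prices and FP16 figures listed above (peak TFLOPS, not sustained throughput, so treat the ranking as a rough guide):

```python
# Illustrative price/performance ranking from the comparison table above.
# Prices are this page's listed on-demand averages; real marketplace
# prices vary by provider, region, and commitment term.
gpus = {
    # name: (peak FP16 TFLOPS, avg price in USD/hr)
    "H100 SXM 80GB": (989.4, 2.49),
    "H200 SXM 141GB": (989.4, 4.20),
    "A100 SXM 80GB": (312.0, 1.29),
    "L40S 48GB": (362.1, 1.09),
    "RTX A6000 48GB": (38.7, 0.79),
    "RTX 4090 24GB": (82.6, 0.59),
    "MI300X 192GB": (1307.4, 3.89),
    "B200 192GB": (2250.0, 6.50),
}

def dollars_per_pflop_hour(tflops: float, price: float) -> float:
    """Cost of one PFLOP-hour of peak FP16 compute."""
    return price / (tflops / 1000.0)

# Rank from cheapest to most expensive peak compute.
for name, (tflops, price) in sorted(
    gpus.items(), key=lambda kv: dollars_per_pflop_hour(*kv[1])
):
    print(f"{name:<16} ${dollars_per_pflop_hour(tflops, price):.2f}/PFLOP-hr")
```

By this metric the H100 is the cheapest peak FP16 compute in the table, with the B200 and MI300X close behind; the workstation cards trade compute efficiency for low absolute hourly cost.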

H100 SXM 80GB (NVIDIA · Hopper, High-End)

NVIDIA's flagship data center GPU. The gold standard for large-scale AI training and inference workloads.

VRAM: 80GB · FP16: 989.4 TF · Avg Price: $2.49/hr
Use cases: LLM Training, LLM Inference, HPC

H200 SXM 141GB (NVIDIA · Hopper, High-End)

Enhanced Hopper with 141GB HBM3e memory and 4.8 TB/s bandwidth. Ideal for the largest language models.

VRAM: 141GB · FP16: 989.4 TF · Avg Price: $4.20/hr
Use cases: LLM Training, Large Models, HPC

A100 SXM 80GB (NVIDIA · Ampere, High-End)

Previous-generation data center GPU. Still widely used and cost-effective for inference and fine-tuning.

VRAM: 80GB · FP16: 312 TF · Avg Price: $1.29/hr
Use cases: LLM Inference, Fine-tuning, Research

L40S 48GB (NVIDIA · Ada Lovelace, Mid-Range)

Versatile mid-range GPU for inference and multimedia AI workloads. Good balance of performance and cost.

VRAM: 48GB · FP16: 362.1 TF · Avg Price: $1.09/hr
Use cases: Inference, Fine-tuning, Video AI

RTX A6000 48GB (NVIDIA · Ampere, Mid-Range)

Professional workstation GPU with large VRAM. Popular for inference and visualization tasks.

VRAM: 48GB · FP16: 38.7 TF · Avg Price: $0.79/hr
Use cases: Inference, 3D Rendering, Research

RTX 4090 24GB (NVIDIA · Ada Lovelace, Consumer)

Top consumer GPU. Excellent price-performance for smaller models and experimentation.

VRAM: 24GB · FP16: 82.6 TF · Avg Price: $0.59/hr
Use cases: Small Model Inference, Fine-tuning, Gaming AI

MI300X 192GB (AMD · CDNA 3, High-End)

AMD's flagship AI accelerator with massive 192GB HBM3 memory. A strong competitor to the H100.

VRAM: 192GB · FP16: 1,307.4 TF · Avg Price: $3.89/hr
Use cases: LLM Training, LLM Inference, HPC

B200 192GB (NVIDIA · Blackwell, High-End)

NVIDIA's next-generation Blackwell GPU. More than 2x the peak FP16 throughput of the H100, paired with 192GB of HBM3e.

VRAM: 192GB · FP16: 2,250 TF · Avg Price: $6.50/hr
Use cases: Next-gen Training, Frontier Models, HPC
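The VRAM figures in the cards above are usually the first filter when picking a GPU for LLM serving. A common rule of thumb (an approximation, not a guarantee): FP16 weights take about 2 bytes per parameter, plus headroom for the KV cache, activations, and framework overhead. A minimal sketch, assuming a flat 20% overhead factor:

```python
# Rough single-GPU VRAM sizing against the cards in this knowledge base.
# The 2 bytes/parameter figure is exact for FP16 weights; the 20%
# overhead for KV cache and activations is an assumed ballpark.
GPUS_GB = {
    "RTX 4090 24GB": 24,
    "L40S 48GB": 48,
    "RTX A6000 48GB": 48,
    "H100 SXM 80GB": 80,
    "A100 SXM 80GB": 80,
    "H200 SXM 141GB": 141,
    "MI300X 192GB": 192,
    "B200 192GB": 192,
}

def fp16_footprint_gb(params_billions: float, overhead: float = 0.2) -> float:
    """Approximate VRAM (GB) needed to serve an FP16 model of the given size."""
    return params_billions * 2 * (1 + overhead)

def single_gpu_fits(params_billions: float) -> list[str]:
    """GPUs from this page whose VRAM exceeds the estimated footprint."""
    need = fp16_footprint_gb(params_billions)
    return [name for name, gb in GPUS_GB.items() if gb >= need]
```

For example, a 70B-parameter model needs roughly 168GB in FP16, so among single cards only the 192GB parts (MI300X, B200) fit; smaller GPUs require quantization or multi-GPU sharding.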