NVIDIA H100

Professional-grade GPU for deep learning, AI inference, and high-performance computing.

Best Starting Price
$0.73/h
From 284 configurations.

Pricing Explorer

Showing the top 5 lowest-priced configurations.

Provider | Spec | Total VRAM | vCPUs | RAM | Billing | Price/h
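
The explorer ranks configurations by hourly price. As a rough illustration of how such a "top 5 lowest-priced" view can be produced, the Python sketch below sorts a list of configurations by price per hour; all provider names, field names, and figures are hypothetical placeholders, not data from the explorer.

```python
# Hypothetical sketch: ranking GPU configurations by hourly price.
# Provider names, specs, and prices below are illustrative only.
from dataclasses import dataclass

@dataclass
class Config:
    provider: str
    spec: str            # e.g. "1x H100 SXM"
    total_vram_gb: int
    vcpus: int
    ram_gb: int
    billing: str         # e.g. "On-demand"
    price_per_hour: float

configs = [
    Config("Provider A", "1x H100 SXM", 80, 26, 200, "On-demand", 0.73),
    Config("Provider B", "1x H100 PCIe", 80, 16, 128, "On-demand", 1.10),
    Config("Provider C", "8x H100 SXM", 640, 192, 1800, "On-demand", 9.60),
]

# Sort ascending by hourly price and keep the five cheapest entries.
top5 = sorted(configs, key=lambda c: c.price_per_hour)[:5]
for c in top5:
    print(f"{c.provider:12} {c.spec:14} ${c.price_per_hour:.2f}/h")
```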

AI Training & Performance

The H100 is a high-performance data center GPU built on NVIDIA's Hopper architecture. Featuring 80GB of ultra-fast HBM3 memory, it is engineered for the most demanding AI model training, large language models (LLMs), and complex scientific computing.
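
To put the 80GB figure in context, the sketch below gives a back-of-the-envelope estimate of parameter-plus-optimizer memory for mixed-precision Adam training at a few common model sizes. The 16 bytes-per-parameter rule of thumb and the model sizes are illustrative assumptions, and activation memory is not counted.

```python
# Back-of-the-envelope training memory estimate (mixed-precision Adam).
# Byte counts per parameter are a common rule of thumb, not exact figures,
# and activation memory is ignored entirely.

def training_memory_gb(num_params: float) -> float:
    bytes_per_param = (
        2     # FP16 weights
        + 2   # FP16 gradients
        + 12  # FP32 optimizer state (master weights + Adam moments)
    )
    return num_params * bytes_per_param / 1e9

for billions in (1, 7, 13, 70):
    gb = training_memory_gb(billions * 1e9)
    fits = "fits" if gb <= 80 else "needs sharding/offload"
    print(f"{billions:>3}B params ~ {gb:,.0f} GB of state ({fits} on one 80GB H100)")
```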

Recommended Scenarios

LLM Training
Advanced AI Research
Deep Learning Fine-tuning

Technical Parameters

Architecture: Hopper
VRAM Capacity: 80GB HBM3
Memory Bandwidth: 3,350 GB/s
CUDA Cores: 16,896
FP16 Performance: 1,979 TFLOPS (with sparsity)
Power (TDP): 700W
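
The throughput and bandwidth figures above imply a machine balance point: a kernel must sustain roughly peak FLOPs divided by memory bandwidth in floating-point operations per byte moved before it becomes compute-bound rather than memory-bound. The sketch below simply restates that arithmetic using the listed numbers, treating 1,979 TFLOPS as the sparse-tensor peak.

```python
# Machine balance from the spec table: peak FP16 tensor throughput divided
# by memory bandwidth gives the FLOPs-per-byte a kernel must sustain to be
# compute-bound on this part (a simple roofline-style estimate).
peak_fp16_flops = 1979e12   # 1,979 TFLOPS (sparse); roughly 989 TFLOPS dense
bandwidth_bytes = 3350e9    # 3,350 GB/s HBM3

balance = peak_fp16_flops / bandwidth_bytes
print(f"Arithmetic intensity needed to be compute-bound: ~{balance:.0f} FLOPs/byte")
```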