H200 GPU

H200 GPU for cloud computing, machine learning, and AI workloads

Starting Price
$3.30/hr
Available on 5 cloud providers

Key Specifications

💾 Memory

141GB VRAM

๐Ÿ—๏ธArchitecture

Hopper

โš™๏ธCompute Units

N/A

🧮 Tensor Cores

N/A

Technical Specifications

Hardware Details

Manufacturer: NVIDIA
Architecture: Hopper
CUDA Cores: N/A
Tensor Cores: N/A
RT Cores: N/A
Compute Units: N/A
Generation: N/A

Memory & Performance

VRAM: 141GB
Memory Interface: N/A
Memory Bandwidth: N/A
FP32 Performance: N/A
FP16 Performance: N/A
INT8 Performance: N/A

Common Use Cases

Training generative AI and large language models (LLMs), plus advanced scientific computing for HPC workloads.

Machine Learning & AI

  • Training large language models and transformers
  • Computer vision and image processing
  • Deep learning model development
  • High-performance inference workloads
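For LLM work, the deciding factor is usually whether the model fits in the H200's 141GB of VRAM. A minimal back-of-the-envelope sketch, using common rule-of-thumb byte counts per parameter (roughly 2 B/param for fp16 inference, roughly 16 B/param for mixed-precision Adam training; these are approximations, not vendor figures, and activation memory is ignored):

```python
# Rough VRAM fit check for a single H200 (141 GB), using rule-of-thumb
# byte counts per parameter. Activations and framework overhead are ignored.

H200_VRAM_GB = 141

def vram_needed_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate VRAM footprint in GB for the model weights/states alone."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def fits_on_h200(params_billion: float, bytes_per_param: float) -> bool:
    """True if the estimated footprint fits in one H200's 141 GB."""
    return vram_needed_gb(params_billion, bytes_per_param) <= H200_VRAM_GB

# A 70B model at fp16 inference: 70 * 2 = 140 GB -> just fits.
print(fits_on_h200(70, 2.0))   # True
# The same model under mixed-precision Adam training (~16 B/param):
# 70 * 16 = 1120 GB -> needs a multi-GPU setup.
print(fits_on_h200(70, 16.0))  # False
```

The same arithmetic explains why the 141GB capacity matters: it is the first NVIDIA data-center GPU where a 70B-parameter model can be served in fp16 on a single card.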

Graphics & Compute

  • 3D rendering and visualization
  • Scientific simulations
  • Data center graphics virtualization
  • High-performance computing (HPC)

Market Context

The H200 sits within NVIDIA's Hopper architecture lineup, alongside the H100.

Cloud Availability

Available across 5 cloud providers, with hourly prices ranging from $3.30/hr to $50.44/hr. Pricing and availability may vary by region and provider.

Market Position

With 141GB of VRAM on the Hopper architecture, this GPU is positioned for professional AI and HPC workloads.

Current Pricing

Provider / Hourly Price

  • CoreWeave: $50.44/hr
  • RunPod: $3.59/hr
  • Amazon AWS: $4.33/hr
  • Vast.ai: $3.35/hr
  • Datacrunch: $3.30/hr
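With a 15x spread between the cheapest and most expensive listed rates, it can be worth comparing providers programmatically. A small sketch using the hourly prices from the table above (rates as listed on this page; the helper names are illustrative, not any provider's API):

```python
# Find the cheapest H200 provider and estimate the cost of a run,
# using the hourly rates listed in the pricing table above.

H200_PRICES = {
    "CoreWeave": 50.44,
    "RunPod": 3.59,
    "Amazon AWS": 4.33,
    "Vast.ai": 3.35,
    "Datacrunch": 3.30,
}

def cheapest_provider(prices: dict[str, float]) -> tuple[str, float]:
    """Return (provider, hourly_rate) with the lowest listed price."""
    return min(prices.items(), key=lambda kv: kv[1])

def run_cost(hours: float, hourly_rate: float) -> float:
    """Total on-demand cost of a run at a given hourly rate."""
    return hours * hourly_rate

provider, rate = cheapest_provider(H200_PRICES)
print(provider, rate)                 # Datacrunch 3.3
print(round(run_cost(100, rate), 2))  # a 100-hour run: 330.0
```

Since prices change and vary by region, treat the dictionary as a snapshot to be refreshed from the providers' own pricing pages.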

Prices are updated regularly. Last updated: 4/23/2025