
NVIDIA GPU Cloud Pricing

NVIDIA dominates the cloud GPU market with architectures spanning from Ampere to Blackwell. Their CUDA ecosystem and Tensor Cores make them the default choice for machine learning training, inference, and high-performance computing. Most cloud providers carry NVIDIA hardware across all performance tiers.

GPUs: 54 · Providers: 37 · From $0.04/hr

NVIDIA GPUs Available in the Cloud

A10

24 GB · Ampere · NVIDIA
$0.20/hr · 5 providers

A100 PCIe

40 GB · Ampere · NVIDIA
$0.25/hr · 10 providers

A100 SXM

80 GB · Ampere · NVIDIA
$0.45/hr · 23 providers

A16

Server · 64 GB · Ampere · NVIDIA
$0.51/hr · 2 providers

A2

Server · 16 GB · Ampere · NVIDIA
$0.06/hr · 3 providers

A30

Server · 24 GB · Ampere · NVIDIA
$0.25/hr · 6 providers

A40

48 GB · Ampere · NVIDIA
$0.32/hr · 7 providers

B100

Server · 192 GB · Blackwell · NVIDIA
No recent pricing

B200

192 GB · Blackwell · NVIDIA
$1.99/hr · 15 providers

GB200

Server · 384 GB · Blackwell · NVIDIA
$1.02/hr · 2 providers

GB300

Server · 576 GB · Blackwell · NVIDIA
$18.00/hr · 1 provider

GH200

96 GB · Hopper · NVIDIA
$1.49/hr · 5 providers

H100 NVL

Server · 94 GB · Hopper · NVIDIA
$0.40/hr · 5 providers

H100 PCIe

Server · 80 GB · Hopper · NVIDIA
$0.89/hr · 4 providers

H100 SXM

Server · 80 GB · Hopper · NVIDIA
$0.80/hr · 30 providers

H200

141 GB · Hopper · NVIDIA
$1.50/hr · 21 providers

HGX B300

Server · 288 GB · Blackwell · NVIDIA
$1.08/hr · 7 providers

L4

Server · 24 GB · Ada Lovelace · NVIDIA
$0.23/hr · 9 providers

L40

40 GB · Ada Lovelace · NVIDIA
$0.34/hr · 10 providers

L40S

48 GB · Ada Lovelace · NVIDIA
$0.40/hr · 21 providers

RTX 3070

8 GB · Ampere · NVIDIA
$0.04/hr · 4 providers

RTX 3070 Ti

8 GB · Ampere · NVIDIA
$0.06/hr · 2 providers

RTX 3080

10 GB · Ampere · NVIDIA
$0.06/hr · 3 providers

RTX 3080 Ti

12 GB · Ampere · NVIDIA
$0.08/hr · 4 providers

RTX 3090

24 GB · Ampere · NVIDIA
$0.09/hr · 5 providers

RTX 3090 Ti

24 GB · Ampere · NVIDIA
$0.10/hr · 3 providers

RTX 4000 Ada

20 GB · Ada Lovelace · NVIDIA
$0.09/hr · 4 providers

RTX 4060

8 GB · Ada Lovelace · NVIDIA
$0.14/hr · 1 provider

RTX 4060 Ti

8 GB · Ada Lovelace · NVIDIA
$0.06/hr · 2 providers

RTX 4070

12 GB · Ada Lovelace · NVIDIA
$0.07/hr · 2 providers

RTX 4070 SUPER

12 GB · Ada Lovelace · NVIDIA
No recent pricing

RTX 4070 Ti

12 GB · Ada Lovelace · NVIDIA
$0.08/hr · 3 providers

RTX 4070 Ti SUPER

16 GB · Ada Lovelace · NVIDIA
$0.09/hr · 1 provider

RTX 4080

16 GB · Ada Lovelace · NVIDIA
$0.11/hr · 4 providers

RTX 4080 SUPER

16 GB · Ada Lovelace · NVIDIA
$0.17/hr · 1 provider

RTX 4090

24 GB · Ada Lovelace · NVIDIA
$0.16/hr · 8 providers

RTX 4500 Ada

24 GB · Ada Lovelace · NVIDIA
$0.18/hr · 1 provider

RTX 5000

32 GB · Ada Lovelace · NVIDIA
$0.25/hr · 1 provider

RTX 5060

12 GB · Blackwell · NVIDIA
$0.11/hr · 1 provider

RTX 5060 Ti

16 GB · Blackwell · NVIDIA
$0.07/hr · 2 providers

RTX 5070

12 GB · Blackwell · NVIDIA
$0.08/hr · 2 providers

RTX 5070 Ti

16 GB · Blackwell · NVIDIA
$0.10/hr · 2 providers

RTX 5080

16 GB · Blackwell · NVIDIA
$0.14/hr · 3 providers

RTX 5090

32 GB · Blackwell · NVIDIA
$0.25/hr · 6 providers

RTX 6000 Ada

48 GB · Ada Lovelace · NVIDIA
$0.34/hr · 10 providers

RTX 6000 Pro

96 GB · Blackwell · NVIDIA
$0.13/hr · 9 providers

RTX A2000

6 GB · Ampere · NVIDIA
$0.06/hr · 2 providers

RTX A4000

16 GB · Ampere · NVIDIA
$0.08/hr · 6 providers

RTX A4500

Server · 20 GB · Ampere · NVIDIA
$0.10/hr · 1 provider

RTX A5000

24 GB · Ampere · NVIDIA
$0.09/hr · 9 providers

RTX A6000

48 GB · Ampere · NVIDIA
$0.17/hr · 14 providers

RTX PRO 6000

Server · 96 GB · Blackwell · NVIDIA
$0.31/hr · 8 providers

Tesla T4

16 GB · Turing · NVIDIA
$0.16/hr · 5 providers

Tesla V100

32 GB · Volta · NVIDIA
$0.12/hr · 9 providers

Sample NVIDIA GPU Pricing

Config | Price / hr | Updated
2× | $0.10/hr | 4/11/2026
1× | $0.11/hr | 4/18/2026
2× | $0.12/hr | 4/18/2026
1× | $0.25/hr | 4/18/2026
1× | $0.79/hr (24 mo) | 4/14/2026
1× | $1.55/hr | 4/18/2026
2× | $1.69/hr | 4/18/2026
8× | $3.44/hr | 4/18/2026
8× | $4.50/hr | 4/18/2026

Source: direct from provider or via marketplace.

Showing 9 of 973 price points. Visit individual GPU pages above for full pricing.

Frequently Asked Questions

Which NVIDIA GPU is best for ML training?

For large-scale training, the H100 and B200 offer the highest throughput with HBM3/HBM3e memory and NVLink interconnects. For smaller workloads and fine-tuning, the A100 80GB and RTX 4090 provide strong performance at lower cost. Check current pricing above to compare.
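Since a faster GPU can cost less overall despite a higher hourly rate, it helps to compare total job cost rather than $/hr. A minimal sketch using the hourly prices listed on this page; the relative throughput figures are illustrative assumptions, not benchmarks, so substitute your own measurements:

```python
# Back-of-envelope cost comparison for a fixed training job.
# Prices are the listed $/hr from this page; throughput ratios are
# ASSUMED for illustration only.

def job_cost(price_per_hr: float, rel_throughput: float,
             baseline_hours: float = 100.0) -> float:
    """Total cost of a job that takes `baseline_hours` on the baseline
    GPU (throughput 1.0), run on a GPU `rel_throughput` times faster."""
    return (baseline_hours / rel_throughput) * price_per_hr

# name: (listed $/hr, assumed throughput relative to A100 SXM)
gpus = {
    "A100 SXM": (0.45, 1.0),
    "H100 SXM": (0.80, 3.0),   # assumption: ~3x A100 on mixed precision
    "B200":     (1.99, 6.0),   # assumption: ~6x A100
    "RTX 4090": (0.16, 0.8),   # assumption: ~0.8x A100
}

for name, (price, speed) in gpus.items():
    print(f"{name}: ${job_cost(price, speed):.2f} total")
```

Under these assumed ratios the H100 SXM finishes the job cheaper than the A100 SXM even at nearly twice the hourly rate, which is why hourly price alone is a poor basis for choosing a training GPU.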

What is the difference between consumer and server NVIDIA GPUs?

Server GPUs (A100, H100, B200) use ECC memory, support NVLink/NVSwitch for multi-GPU scaling, and have higher memory capacities (40–192 GB HBM). Consumer GPUs (RTX 4090, RTX 3090) use GDDR6X with lower VRAM but can still be cost-effective for inference and smaller training jobs.
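One crude way to weigh consumer against server cards is to normalize the listed hourly price by VRAM. A minimal sketch using figures from the table above; note this ignores memory bandwidth (HBM vs GDDR6X), ECC, and NVLink, so treat it as a first-pass filter rather than a ranking:

```python
# Dollars per GB of VRAM per hour, from listings on this page.

listings = [
    # (name, vram_gb, listed $/hr)
    ("H100 SXM", 80, 0.80),
    ("A100 SXM", 80, 0.45),
    ("RTX 4090", 24, 0.16),
    ("RTX 3090", 24, 0.09),
]

def dollars_per_gb_hr(price_hr: float, vram_gb: int) -> float:
    """Hourly price normalized by memory capacity."""
    return price_hr / vram_gb

# Print cheapest memory first.
for name, vram, price in sorted(
        listings, key=lambda r: dollars_per_gb_hr(r[2], r[1])):
    print(f"{name}: ${dollars_per_gb_hr(price, vram):.4f} per GB-hr")
```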

Related Categories