HGX B300 GPU

The NVIDIA HGX B300 features the Blackwell Ultra GPU with 288GB HBM3e memory and 8 TB/s bandwidth, delivering 50% more AI performance than B200 for large-scale training and inference workloads.

Starting Price: $3.50/hr
Available on 4 cloud providers

Key Specifications

💾 Memory: 288GB VRAM
🏗️ Architecture: Blackwell
⚙️ Compute Units: N/A
🧮 Tensor Cores: N/A

Technical Specifications

Hardware Details

Manufacturer: NVIDIA
Architecture: Blackwell
CUDA Cores: N/A
Tensor Cores: N/A
RT Cores: N/A
Compute Units: N/A
Generation: N/A

Memory & Performance

VRAM: 288GB
Memory Interface: N/A
Memory Bandwidth: 8,000 GB/s
FP32 Performance: 100 TFLOPS
FP16 Performance: 5,000 TFLOPS
INT8 Performance: N/A
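
To put the 288GB of VRAM in context, the sketch below estimates whether a model's weights plus KV cache would fit on a single GPU. The parameter count, precision, and sequence-length figures are illustrative assumptions, not published sizing guidance.

```python
# Back-of-envelope memory-fit estimate for LLM inference.
# Spec figure from this page: 288 GB VRAM. All model figures below are assumptions.

VRAM_GB = 288

def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Memory needed for model weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                seq_len: int, batch: int, bytes_per_elem: float) -> float:
    """KV cache size: 2 (K and V) x layers x kv_heads x head_dim x seq_len x batch."""
    return 2 * layers * kv_heads * head_dim * seq_len * batch * bytes_per_elem / 1e9

# Hypothetical 70B-parameter model served at 1 byte/param, 80 layers,
# 8 KV heads of dim 128, 8k context, batch of 16 concurrent sequences.
total = weights_gb(70, 1.0) + kv_cache_gb(80, 8, 128, 8192, 16, 1.0)
fits = "fits" if total < VRAM_GB else "does not fit"
print(f"Estimated footprint: {total:.0f} GB of {VRAM_GB} GB ({fits} before activations/overhead)")
```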

Performance

Computational Performance

FP32: 100 TFLOPS
FP16: 5,000 TFLOPS
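
One way to read the FP16 figure against the 8,000 GB/s memory bandwidth listed above is the roofline ridge point: the arithmetic intensity (FLOPs per byte moved) a kernel needs before it becomes compute-bound rather than bandwidth-bound. The sketch below simply divides the two spec numbers; it is an idealized estimate, not a measured result.

```python
# Roofline ridge point from the spec figures on this page.
peak_fp16_tflops = 5_000      # FP16 performance (TFLOPS)
mem_bandwidth_gbs = 8_000     # memory bandwidth (GB/s)

peak_flops = peak_fp16_tflops * 1e12   # FLOP/s
bandwidth = mem_bandwidth_gbs * 1e9    # bytes/s

ridge_point = peak_flops / bandwidth   # FLOPs per byte
print(f"A kernel needs roughly {ridge_point:.0f} FLOPs per byte moved "
      "to be compute-bound rather than bandwidth-bound.")
```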

Common Use Cases

Large-scale AI training, LLM inference, AI reasoning, HPC

Machine Learning & AI

  • Training large language models and transformers (see the mixed-precision sketch after this list)
  • Computer vision and image processing
  • Deep learning model development
  • High-performance inference workloads
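
As a concrete illustration of the mixed-precision training pattern referenced above, here is a minimal PyTorch sketch of the kind of FP16 tensor-core workload these GPUs target. The model shape, batch size, and hyperparameters are illustrative assumptions, not a recommended configuration.

```python
# Minimal mixed-precision training step (PyTorch sketch).
# Shapes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(4096, 4096), nn.GELU(), nn.Linear(4096, 4096)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

x = torch.randn(64, 4096, device=device)
target = torch.randn(64, 4096, device=device)

for step in range(10):
    optimizer.zero_grad(set_to_none=True)
    # autocast runs matmuls in FP16 on tensor cores, keeping reductions in FP32
    with torch.autocast(device_type=device, dtype=torch.float16, enabled=(device == "cuda")):
        loss = nn.functional.mse_loss(model(x), target)
    scaler.scale(loss).backward()   # loss scaling guards against FP16 gradient underflow
    scaler.step(optimizer)
    scaler.update()
```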

Graphics & Compute

  • 3D rendering and visualization
  • Scientific simulations
  • Data center graphics virtualization
  • High-performance computing (HPC)

Market Context

The HGX B300 sits within NVIDIA's Blackwell architecture lineup, positioned in the ultra-performance tier. It's designed specifically for data center and enterprise use.

Cloud Availability

Available across 4 cloud providers, with on-demand prices ranging from $3.50/hr to $93.60/hr. Pricing and availability may vary by region and provider.

Market Position

Released in 2025, this GPU is positioned for enterprise and data center workloads.

Current Pricing

Provider        Hourly Price
RunPod          $6.19/hr
Amazon AWS      $93.60/hr
White Fiber     $3.50/hr
Verda           $5.74/hr

Prices are updated regularly. Last updated: 2/19/2026
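
As a quick way to use the table above, the sketch below multiplies a listed hourly rate by an assumed job length to get a rough on-demand cost. The 72-hour run is an illustrative assumption, and some providers may quote per multi-GPU instance rather than per GPU.

```python
# Rough on-demand cost estimate from the pricing table above.
# Hourly rates as listed on this page (last updated 2/19/2026); job hours are assumptions.
# Note: some providers may quote per multi-GPU instance rather than per GPU.

hourly_rates = {
    "RunPod": 6.19,
    "Amazon AWS": 93.60,
    "White Fiber": 3.50,
    "Verda": 5.74,
}

def job_cost(provider: str, hours: float, units: int = 1) -> float:
    """Cost of renting `units` billed instances at the listed rate for `hours`."""
    return hourly_rates[provider] * hours * units

# Example: a hypothetical 72-hour fine-tuning run on a single rented unit.
for provider in hourly_rates:
    print(f"{provider:12s} ${job_cost(provider, 72):>9,.2f}")
```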