Compare GPU and LLM inference API pricing between Lambda Labs and Scaleway. Find the best rates for AI training, inference, and ML workloads.
Lambda Labs vs. Scaleway
Average Price Difference: $0.94/hour between comparable GPUs
| GPU Model | Lambda Labs | Scaleway | Price Difference |
|---|---|---|---|
| A10 (24GB VRAM) | — | Not Available | — |
| A100 SXM (80GB VRAM) | 2x GPU | Not Available | — |
| B200 (192GB VRAM) | 8x GPU | Not Available | — |
| GH200 (96GB VRAM) | — | Not Available | — |
| H100 SXM (80GB VRAM) | $2.10/hour, 2x GPU (updated 4/21/2026) ★ Best Price | $3.04/hour, 2x GPU (updated 4/21/2026) | ↓ $0.94 (31.1%) |
| HGX B300 (288GB VRAM) | Not Available | $1.08/hour, 8x GPU (updated 4/6/2026) ★ Best Price | — |
| L4 (24GB VRAM) | Not Available | 8x GPU | — |
| L40S (48GB VRAM) | Not Available | 8x GPU | — |
| RTX 6000 Pro (96GB VRAM) | — | Not Available | — |
| RTX A6000 (48GB VRAM) | 2x GPU | Not Available | — |
| Tesla V100 (32GB VRAM) | 8x GPU | Not Available | — |
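As a quick sanity check on the H100 SXM row, here is a minimal sketch of how the price-difference figure can be derived; the `price_gap` function name is illustrative, and the percentage is taken relative to the higher price as the table's "↓$0.94 (31.1%)" annotation suggests:

```python
def price_gap(price_a: float, price_b: float) -> tuple[float, float]:
    """Absolute and percentage gap between two hourly GPU prices.

    The percentage is computed relative to the higher price, so it
    reads as "how much cheaper is the lower-priced provider".
    """
    high, low = max(price_a, price_b), min(price_a, price_b)
    diff = high - low
    return diff, diff / high * 100

# Displayed H100 SXM prices: Lambda Labs $2.10/hr vs Scaleway $3.04/hr.
diff, pct = price_gap(2.10, 3.04)
print(f"${diff:.2f}/hour ({pct:.1f}%)")  # prints "$0.94/hour (30.9%)"
```

Note that the displayed rounded prices give roughly 30.9%; the 31.1% shown on the page likely reflects the unrounded underlying prices.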
Explore how these providers compare to other popular GPU cloud services
Compare Lambda Labs with another leading provider.