Cheapest cloud GPUs and best local hardware to run GPT‑OSS 120B efficiently.
Cloud offers below are sorted by price per GPU-hour; each listed configuration meets the model's VRAM requirement.
| Provider | GPU | VRAM | $/GPU‑Hr | Availability | Tier | Link |
|---|---|---|---|---|---|---|
| RunPod | MI300X | 192.0 GB | $0.50 | in_stock | mixed | View |
| RunPod | H200 NVL | 143.0 GB | $0.50 | in_stock | mixed | View |
| Crusoe | A100 80GB | 80.0 GB | $1.09 | in_stock | production | View |
| RunPod | A100 40GB | 80.0 GB | $1.19 | in_stock | mixed | View |
| Shadeform | A100 80GB | 80.0 GB | $1.19 | in_stock | mixed | View |
| Lambda | A100 80GB | 80.0 GB | $1.29 | request_required | production | View |
| RunPod | A100 SXM | 80.0 GB | $1.39 | in_stock | mixed | View |
| MassedCompute | A100 80GB | 80.0 GB | $1.39 | in_stock | production | View |
| FluidStack | A100 80GB | 80.0 GB | $1.40 | in_stock | production | View |
| Replicate | A100 80GB | 80.0 GB | $1.40 | in_stock | unknown | View |
| Hyperstack | A100 80GB | 80.0 GB | $1.49 | in_stock | production | View |
| Hetzner | A100 80GB | 80.0 GB | $1.54 | request_required | production | View |
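To compare offers with different per-GPU VRAM, it helps to normalize to cost per hour for the full VRAM requirement. The sketch below assumes the quantized GPT-OSS 120B checkpoint fits in roughly 80 GB (so a single 80 GB card suffices); the threshold is an assumption you should adjust for your runtime and context length.

```python
import math

# Assumption: ~80 GB of VRAM is enough for the quantized 120B checkpoint.
REQUIRED_VRAM_GB = 80

def hourly_cost(gpu_vram_gb: float, price_per_gpu_hr: float) -> float:
    """Cost per hour to reach the VRAM requirement with identical GPUs."""
    gpus_needed = math.ceil(REQUIRED_VRAM_GB / gpu_vram_gb)
    return gpus_needed * price_per_gpu_hr

# Example rows from the table above:
print(hourly_cost(192.0, 0.50))  # RunPod MI300X: one GPU covers it
print(hourly_cost(80.0, 1.09))   # Crusoe A100 80GB: one GPU covers it
```

For single-GPU offers the effective cost equals the listed $/GPU-hr, so the $0.50 MI300X and H200 NVL rows remain the cheapest even after normalization.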
Top local hardware options that meet the minimum VRAM requirement.
| Rank | GPU | VRAM | Price | AI Score | $ Value | Buy | Compare |
|---|---|---|---|---|---|---|---|
| — | NVIDIA H100 PCIe | 80 GB | $30,000 | 100.0 | 3.33 | Amazon / eBay | — |
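The "$ Value" column appears to be the AI Score divided by the price in thousands of dollars, which is one plausible reading consistent with the H100 PCIe row (100.0 / 30 ≈ 3.33); the exact formula is an assumption, not documented on the page.

```python
# Hedged guess at the "$ Value" metric: AI Score per $1,000 of price.
def dollar_value(ai_score: float, price_usd: float) -> float:
    return round(ai_score / (price_usd / 1000), 2)

print(dollar_value(100.0, 30_000))  # matches the 3.33 shown for the H100 PCIe
```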