Best GPU for GPT‑OSS 120B

Cheapest cloud GPUs and best local hardware to run GPT‑OSS 120B efficiently.

Recommended VRAM: 120 GB
Minimum VRAM: 80 GB
Model size: 120B class
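The VRAM figures above follow from model size and weight precision. As a rough sketch (the 20% overhead margin for KV cache and activations is an assumption, not a figure from this page):

```python
# Rough serving-VRAM estimate for a dense 120B-parameter model.
# Assumption: weights dominate memory; KV cache and activation
# overhead approximated as a flat 20% margin.
def vram_gb(params_b: float, bytes_per_param: float, overhead: float = 0.2) -> float:
    """Approximate serving VRAM in GB."""
    weights_gb = params_b * bytes_per_param  # 1B params at 1 byte/param = 1 GB
    return round(weights_gb * (1 + overhead), 1)

print(vram_gb(120, 1.0))  # ~8-bit weights: 144.0 GB (above the 120 GB recommendation)
print(vram_gb(120, 0.5))  # ~4-bit weights: 72.0 GB (near the 80 GB minimum)
```

This is why 80 GB cards are listed as the floor: they fit the model only at aggressive quantization, while 140 GB+ cards leave headroom for context and batching.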

Best Cloud GPU Offers

Cheapest offers that meet the VRAM requirements for this model.

| Provider | GPU | VRAM | $/GPU‑hr | Availability | Tier |
|---|---|---|---|---|---|
| RunPod | MI300X | 192 GB | $0.50 | in stock | mixed |
| RunPod | H200 NVL | 143 GB | $0.50 | in stock | mixed |
| Crusoe | A100 80GB | 80 GB | $1.09 | in stock | production |
| RunPod | A100 40GB | 80 GB | $1.19 | in stock | mixed |
| Shadeform | A100 80GB | 80 GB | $1.19 | in stock | mixed |
| Lambda | A100 80GB | 80 GB | $1.29 | request required | production |
| RunPod | A100 SXM | 80 GB | $1.39 | in stock | mixed |
| MassedCompute | A100 80GB | 80 GB | $1.39 | in stock | production |
| FluidStack | A100 80GB | 80 GB | $1.40 | in stock | production |
| Replicate | A100 80GB | 80 GB | $1.40 | in stock | unknown |
| Hyperstack | A100 80GB | 80 GB | $1.49 | in stock | production |
| Hetzner | A100 80GB | 80 GB | $1.54 | request required | production |

Best GPUs to Buy

Top local hardware options that meet the minimum VRAM requirement.

| Rank | GPU | VRAM | Price | AI Score | $ Value |
|---|---|---|---|---|---|
| 1 | NVIDIA H100 PCIe | 80 GB | $30,000 | 100.0 | 3.33 |
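When weighing a purchase against renting, a useful sanity check is the break-even point: how many GPU-hours of cloud rental the purchase price buys. A minimal sketch (it ignores power, cooling, depreciation, and utilization, so the real break-even comes sooner than this upper bound):

```python
# Break-even: hours of cloud rental equal to a card's purchase price.
# Ignores power, cooling, depreciation, and resale value.
def break_even_hours(purchase_usd: float, cloud_usd_per_hr: float) -> float:
    return purchase_usd / cloud_usd_per_hr

# H100 PCIe list price vs. the cheapest $/GPU-hr offer above.
hours = break_even_hours(30_000, 0.50)
print(hours)                    # 60000.0 GPU-hours
print(round(hours / 8760, 1))   # ~6.8 years of 24/7 rental
```

At the cheapest listed rate, buying only pays off after tens of thousands of GPU-hours, which is why cloud rental dominates for occasional or bursty workloads.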
