Best GPU for GLM‑5 (40B active)

Cheapest cloud GPUs and best local hardware to run GLM‑5 efficiently.

- Recommended VRAM: 80 GB
- Minimum VRAM: 48 GB
- Model size: 744B total parameters / 40B active
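The VRAM figures above follow from the active parameter count. A common rule of thumb (an illustrative sketch, not an official sizing formula: the `bytes_per_param` and overhead values here are assumptions) is bytes per parameter times active parameters, plus headroom for the KV cache and activations:

```python
def vram_estimate_gb(active_params_b: float, bytes_per_param: float,
                     overhead_frac: float = 0.2) -> float:
    """Rough VRAM (GB) to hold the active weights plus ~20% runtime
    overhead (KV cache, activations). Illustrative rule of thumb only."""
    weights_gb = active_params_b * bytes_per_param
    return weights_gb * (1 + overhead_frac)

# 40B active parameters at 8-bit quantization (1 byte/param):
print(round(vram_estimate_gb(40, 1)))  # 48 -> lines up with the 48 GB minimum
# At 16-bit (2 bytes/param) a single 48 GB card no longer fits:
print(round(vram_estimate_gb(40, 2)))  # 96
```

Note this only covers the *active* experts; serving the full 744B-parameter model still requires enough aggregate (CPU/NVMe or multi-GPU) memory to hold all expert weights.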

Best Cloud GPU Offers

Cheapest offers that meet the VRAM requirements for this model.

| Provider | GPU | VRAM | $/GPU-Hr | Availability | Tier |
|---|---|---|---|---|---|
| RunPod | RTX A6000 | 48.0 GB | $0.33 | In stock | Mixed |
| RunPod | A40 | 48.0 GB | $0.35 | In stock | Mixed |
| RunPod | MI300X | 192.0 GB | $0.50 | In stock | Mixed |
| RunPod | H200 NVL | 143.0 GB | $0.50 | In stock | Mixed |
| Shadeform | RTX A6000 | 48.0 GB | $0.55 | In stock | Mixed |
| DataCrunch | RTX 8000 | 48.0 GB | $0.59 | In stock | Production |
| Crusoe | L40S | 48.0 GB | $0.65 | In stock | Production |
| MassedCompute | L40S | 48.0 GB | $0.65 | In stock | Production |
| Shadeform | L40S | 48.0 GB | $0.65 | In stock | Mixed |
| RunPod | L40 | 48.0 GB | $0.69 | In stock | Mixed |
| TensorDock | RTX A6000 | 48.0 GB | $0.69 | In stock | Mixed |
| DataCrunch | A6000 | 48.0 GB | $0.69 | In stock | Production |
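Picking from the table is a simple filter-and-sort: keep offers whose VRAM meets the target, then take the lowest hourly price. A small sketch using a subset of the rows above (the tuple layout is my own, not a provider API):

```python
# (provider, gpu, vram_gb, usd_per_hr) -- rows taken from the table above
offers = [
    ("RunPod", "RTX A6000", 48.0, 0.33),
    ("RunPod", "A40", 48.0, 0.35),
    ("RunPod", "MI300X", 192.0, 0.50),
    ("RunPod", "H200 NVL", 143.0, 0.50),
    ("DataCrunch", "RTX 8000", 48.0, 0.59),
]

def cheapest(offers, min_vram_gb):
    """Cheapest offer with at least min_vram_gb of VRAM."""
    eligible = [o for o in offers if o[2] >= min_vram_gb]
    return min(eligible, key=lambda o: o[3])

print(cheapest(offers, 48))  # minimum tier: RunPod RTX A6000 at $0.33/hr
print(cheapest(offers, 80))  # recommended tier: RunPod MI300X at $0.50/hr
```

At the $0.33/hr minimum-tier price, a month of continuous use runs about $0.33 × 24 × 30 ≈ $238.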

Best GPUs to Buy

Top local hardware options that meet the minimum VRAM requirement.

| Rank | GPU | VRAM | Price | AI Score | $ Value |
|---|---|---|---|---|---|
| 1 | NVIDIA H100 PCIe | 80 GB | $30,000 | 100.0 | 3.33 |
| 2 | RTX 6000 Ada | 48 GB | $6,800 | 59.8 | 8.79 |
| 3 | NVIDIA L40S | 48 GB | $8,000 | 58.4 | 7.3 |
| 4 | RTX A6000 | 48 GB | $1,150 | 47.5 | 41.3 |
| 5 | NVIDIA A40 | 48 GB | $2,500 | 46.5 | 18.6 |
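The "$ Value" column is AI Score per $1,000 of purchase price, which is why the $1,150 RTX A6000 tops it despite the lowest score. A quick check against the table:

```python
def dollar_value(ai_score: float, price_usd: float) -> float:
    """AI Score per $1,000 of purchase price, as in the '$ Value' column."""
    return round(ai_score / price_usd * 1000, 2)

print(dollar_value(100.0, 30000))  # 3.33  (NVIDIA H100 PCIe)
print(dollar_value(47.5, 1150))    # 41.3  (RTX A6000)
```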
