Best GPUs for Running LLMs (70B+ Models)

GPUs with 24GB+ VRAM for running large language models locally.

☁️ Production Cloud GPUs (Enterprise Providers)

Production cloud providers for reliable LLM hosting. Need 80GB? Rent an A100 or H100.

💡 Looking for budget options? Check the main Rent tab and switch to "Experimental" tier for cheaper P2P providers.

| # | Provider | GPU | VRAM | Price/Hr | Region |
|---|----------|-----|------|----------|--------|
| 1 | FluidStack | RTX 3090 | 24 GB | $0.29/hr | Global |
| 2 | LeaderGPU | RTX 3090 | 24 GB | $0.35/hr | EU |
| 3 | FluidStack | RTX 4090 | 24 GB | $0.44/hr | Global |
| 4 | MassedCompute | RTX 4090 | 24 GB | $0.45/hr | US |
| 5 | LeaderGPU | RTX A5000 | 24 GB | $0.45/hr | EU |
| 6 | Nebius | L4 | 24 GB | $0.49/hr | EU/Israel |
| 7 | Lambda | RTX 6000 | 24 GB | $0.50/hr | US/EU |
| 8 | Hetzner | L4 | 24 GB | $0.54/hr | EU |
| 9 | LeaderGPU | RTX 4090 | 24 GB | $0.55/hr | EU |
| 10 | OVHcloud | L4 | 24 GB | $0.57/hr | EU/Global |
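To decide whether a 24 GB card is enough or you need to rent an 80 GB A100/H100, a common rule of thumb is that model weights take roughly (parameters × bits ÷ 8) bytes, plus overhead for the KV cache and runtime. A minimal sketch, assuming a ~20% overhead factor (the function name and factor are illustrative, not from any specific tool):

```python
# Rough VRAM estimate for LLM inference. Rule of thumb only: real usage
# varies with context length (KV cache), batch size, and framework overhead.
def vram_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    """params_b: parameter count in billions; bits: weight precision."""
    weights_gb = params_b * bits / 8  # 1B params at 8-bit is ~1 GB of weights
    return round(weights_gb * overhead, 1)

print(vram_gb(70, 4))   # 4-bit 70B: ~42 GB -> two 24 GB cards or one 48 GB
print(vram_gb(70, 16))  # FP16 70B: ~168 GB -> multiple 80 GB A100/H100s
```

By this estimate, a single 24 GB card handles 70B models only with aggressive quantization and offloading; 4-bit 70B fits comfortably on 2×24 GB or one 48 GB card.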

🛒 Best GPUs to Buy

13 GPUs with 24 GB+ VRAM available for purchase, ranked by $ Value (AI Score per dollar).

| Rank | GPU | VRAM | Price | AI Score | $ Value |
|------|-----|------|-------|----------|---------|
| 1 | Tesla P40 | 24 GB | $140 | 19.4 | 138.57 |
| 2 | RTX A5000 | 24 GB | $480 | 30.7 | 63.96 |
| 3 | GeForce RTX 3090 | 24 GB | $570 | 36.3 | 63.68 |
| 4 | GeForce RTX 3090 Ti | 24 GB | $640 | 37.5 | 58.59 |
| 5 | Radeon RX 7900 XTX | 24 GB | $740 | 31.8 | 42.97 |
| 6 | GeForce RTX 4090 | 24 GB | $1,080 | 45.2 | 41.85 |
| 7 | RTX A6000 | 48 GB | $1,150 | 47.5 | 41.30 |
| 8 | GeForce RTX 5090 | 32 GB | $1,999 | 63.0 | 31.52 |
| 9 | NVIDIA A100 40GB | 40 GB | $2,400 | 58.4 | 24.33 |
| 10 | NVIDIA A40 | 48 GB | $2,500 | 46.5 | 18.60 |
| 11 | RTX 6000 Ada | 48 GB | $6,800 | 59.8 | 8.79 |
| 12 | NVIDIA L40S | 48 GB | $8,000 | 58.4 | 7.30 |
| 13 | NVIDIA H100 PCIe | 80 GB | $30,000 | 100.0 | 3.33 |
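The $ Value column appears to be the AI Score divided by the price, scaled by 1,000 (an inference from the table's numbers, not a documented formula). A minimal sketch that reproduces the listed figures:

```python
# Assumed derivation of the "$ Value" column: AI Score per dollar, x1000.
# The formula is inferred from the table above, not from an official source.
def dollar_value(ai_score: float, price_usd: float) -> float:
    return round(ai_score / price_usd * 1000, 2)

print(dollar_value(19.4, 140))      # Tesla P40 -> 138.57
print(dollar_value(36.3, 570))      # RTX 3090 -> 63.68
print(dollar_value(100.0, 30000))   # H100 PCIe -> 3.33
```

This is why the $140 Tesla P40 tops the value ranking despite the lowest AI Score: the metric rewards performance per dollar, not absolute performance.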