Sep 17, 2024
I have built this config for a friend who trains LLMs and needs a mid-range setup. Let me know if it requires any changes. Budget: $1,700 USD (about 1.5 lakh INR).

Processor (CPU): AMD Ryzen 9 7900X
CPU Cooler: Cooler Master Liquid 240L Core 240mm Liquid Cooler
Motherboard: Gigabyte B650 Aorus Elite AX V2 (Wi-Fi)
Memory (RAM): Corsair Vengeance RGB 64GB (32GB x 2) 6000MHz DDR5
Solid State Drive (M.2 SSD): Crucial P3 Plus 1TB NVMe Gen4
Hard Disk Drive (HDD): Western Digital Blue 1TB 7200 RPM
Graphics Card (GPU): Asus Dual RTX 4060 Ti SSD OC Edition 8GB
Power Supply (PSU): Thermaltake Smart BX1 750W 80 Plus Bronze
Case: Deepcool CG580 ATX Mid Tower
 

Zed Clampet

Community Contributor

The VRAM is definitely a concern for AI: the heavy number-crunching happens on the GPU, and the model weights, gradients, and batches of training data all have to fit in its memory. If you picked the 4060 Ti variant with 16 GB of VRAM instead, it would be much better, and I don't think it costs much more. The 4070 Ti would also be a good choice.
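To put rough numbers on it, here's a back-of-the-envelope sketch (my own assumption: mixed-precision training with Adam, which costs roughly 16 bytes per parameter for weights, gradients, and optimizer states, before activations are counted):

```python
# Rough VRAM needed just to hold a model for training with Adam in mixed precision.
# Assumes ~16 bytes per parameter (fp16 weights + grads, fp32 master weights + moments);
# activations and CUDA overhead come on top of this.
def training_vram_gb(params_billions: float, bytes_per_param: float = 16) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

for size in (0.125, 0.35, 1.3):
    print(f"{size} B params -> ~{training_vram_gb(size):.1f} GB before activations")
```

Even a ~350M-parameter model lands around 5 GB before activations, so an 8 GB card fills up fast; 16 GB buys real headroom, and tricks like LoRA or gradient checkpointing stretch it further.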

The GPU is going to be doing the heavy lifting. You still need a good CPU, but if the trade-off is stepping down a bit on the CPU so you can step up a bit on the GPU, you should do it.

Also, a 4070 Ti with 12 GB of VRAM is better for AI than a 4060 Ti with 16 GB of VRAM. VRAM isn't the only thing that matters in a GPU for AI; memory bandwidth and raw compute (CUDA/Tensor core count) make a big difference too, and the 4070 Ti is well ahead on both.
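If you want to see the specs a training script actually cares about beyond raw VRAM, a quick check in PyTorch (assuming that's the stack being used) reports SM count and compute capability alongside memory:

```python
import torch

# Print the specs the training code actually sees on the installed GPU.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM, "
          f"{props.multi_processor_count} SMs, compute capability {props.major}.{props.minor}")
else:
    print("No CUDA device visible to PyTorch")
```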

Lastly, ignore whatever Reddit has to say on the matter. They give terrible advice about training AI; they just pick whatever would be better for gaming.
 