Hello, I’m currently going back and forth on which new home server to build. Something I’ve stumbled across: the i9 and the new Core Ultra 9 all only support a maximum of 192GB RAM. However, some of the mainboards support 256GB (with 4 RAM slots and dual channel). Why?

I want to have the option of maxing out the RAM later.

I could buy 4x48GB RAM now and be at 192GB. Maybe I would be annoyed later that 64GB of RAM is still “missing” compared to the board’s 256GB limit. But what if I buy 4x64GB RAM? 3x64GB makes no sense, because then dual channel isn’t used. And 4x64GB probably won’t be recognized by the processor?

Or are there LGA1851 or LGA1700 processors capable of handling 256GB of RAM?

  • hendrik@palaver.p3x.de

AI inference is memory-bound, so memory bus width is the main bottleneck. I also do AI on an (old) CPU, but the CPU itself is mostly idle, waiting for memory. I’d say it’ll likely be very slow, like waiting 10 minutes for a longer answer. I believe all the AI people use Apple silicon because of the unified memory and its bus width, or some CPU with multiple memory channels. The CPU speed doesn’t really matter, you could choose a way slower one, because the actual multiplications aren’t what slows it down. But you seem to be doing the opposite: getting a very fast processor with just 2 memory channels.
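As a rough back-of-the-envelope sketch (the bandwidth and model-size numbers below are illustrative assumptions, not measurements): for a dense model, every weight has to be streamed from RAM once per generated token, so memory bandwidth divided by model size gives a best-case tokens-per-second ceiling.

```python
# Rough upper bound on token generation speed, assuming a dense model
# whose weights are all read from RAM for every generated token.
# Numbers below are illustrative assumptions, not measurements.

def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Best-case tokens/s if memory bandwidth is the only limit."""
    return bandwidth_gb_s / model_size_gb

# Dual-channel DDR5-5600: 2 channels x 8 bytes x 5600 MT/s ≈ 89.6 GB/s
dual_channel_ddr5 = 89.6
# Apple M2 Ultra unified memory, for comparison: ~800 GB/s
apple_m2_ultra = 800.0

# A ~70B-parameter model quantized to ~4 bits is roughly 40 GB of weights
model_70b_q4_gb = 40.0

print(tokens_per_second(dual_channel_ddr5, model_70b_q4_gb))  # ~2.2 tok/s
print(tokens_per_second(apple_m2_ultra, model_70b_q4_gb))     # ~20 tok/s
```

At ~2 tokens/s, a longer answer of around a thousand tokens really does take on the order of 10 minutes, regardless of how fast the CPU cores are.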