Hi,

I have a friend who is looking to run a few simulations he has implemented in Python, and he needs around 256 GB of RAM. He estimates it will take a couple of hours, but he is studying economics, so take that with a grain of salt 🤣

For this instance, I recommended GCP, but I felt a bit dirty doing that. So I was wondering: do any of you have a buttload of memory he could borrow? Generally, would you lend your RAM for a short amount of time to a stranger over the internet? (Assuming internet access is limited to a single SSH port and other necessary safeguards are in place.)

  • HelloRoot@lemy.lol

    Why not get a 0.5 or 1 TB NVMe SSD and set it all up as swap?

    It will probably run about 10× slower, but it’s cheap and doable.
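
    On Linux the swap itself is just fallocate + mkswap + swapon on the new drive. And if the sims are NumPy-heavy, the same idea works at the application level: np.memmap backs the big arrays with a file on the NVMe, so only the hot window lives in RAM. A minimal sketch, with a made-up path and shape:

    ```python
    import numpy as np

    # Hypothetical: back a ~256 GB float64 array with a file on the NVMe
    # instead of RAM (32e9 values * 8 bytes = 256 GB).
    data = np.memmap("/mnt/nvme/state.dat", dtype=np.float64,
                     mode="w+", shape=(32_000_000_000,))

    # Touch it in slices so only the active window is resident;
    # the kernel pages everything else to and from the SSD.
    step = 100_000_000
    for i in range(0, data.shape[0], step):
        data[i:i + step] *= 0.99  # placeholder for the real update

    data.flush()  # push dirty pages back to the file
    ```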

    • dgdft@lemmy.world

      This is the way.

      Depending on the nature of the sim, it could probably even be done with ~80 GB or less of existing SSD space using zram w/ zstd.
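
      Whether ~80 GB is realistic depends entirely on how compressible the sim’s state is, so it’s worth measuring on a sample first. A rough sketch using the `zstandard` package (the sample array is a made-up stand-in for real sim data, and level 3 is only my proxy for zram’s fast in-kernel zstd):

      ```python
      import numpy as np
      import zstandard

      # Hypothetical stand-in for a chunk of simulation state: smooth
      # numeric series compress well; white noise barely compresses at all.
      chunk = np.cumsum(np.random.standard_normal(10_000_000))
      raw = chunk.tobytes()

      compressed = zstandard.ZstdCompressor(level=3).compress(raw)
      ratio = len(raw) / len(compressed)

      print(f"compression ratio ~ {ratio:.1f}:1")
      print(f"256 GB of similar data ~ {256 / ratio:.0f} GB compressed")
      ```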

  • cevn@lemmy.world

    Needing that much RAM is usually a red flag that the algorithm isn’t optimized.
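
    A lot of the time the fix is just streaming instead of materializing. A hypothetical example (simulate_one is a made-up stand-in for one expensive draw):

    ```python
    import random

    def simulate_one(seed: int) -> float:
        """Stand-in for one expensive simulation draw (hypothetical)."""
        return random.Random(seed).gauss(0.0, 1.0)

    # Materializing every draw keeps them all alive at once: O(n) memory.
    # results = [simulate_one(i) for i in range(10**9)]
    # answer = sum(results) / len(results)

    # Streaming the same aggregate through a generator is O(1) memory:
    def running_mean(samples) -> float:
        total, count = 0.0, 0
        for x in samples:
            total += x
            count += 1
        return total / count

    answer = running_mean(simulate_one(i) for i in range(10**6))
    print(answer)
    ```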

    • Scrubbles@poptalk.scrubbles.tech

      Researchers make some of the worst coders, unfortunately.

      Scientists, pair up with an engineer to implement your code. You’ll thank yourself later.

    • DaPorkchop_@lemmy.ml

      True, but there are also some legitimate applications for hundreds of gigabytes of RAM. I’ve been working on a thing for processing historical OpenStreetMap data, and it is several orders of magnitude faster to fill the database by loading the ~300 GiB of point data into memory, sorting it there, and then partitioning and compressing it into pre-sorted table files which RocksDB can ingest directly without additional processing. I had to get 24×16 GiB of RAM to do that, though.
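
      Without that much RAM, the fallback would have been a classic external merge sort: sort chunks that fit in memory, spill each sorted run to disk, then stream a k-way merge. A toy sketch of the idea (not the actual pipeline):

      ```python
      import heapq
      import random
      import tempfile

      def _spill(sorted_chunk):
          """Write one sorted run to a temp file, return its path."""
          f = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
          f.writelines(f"{v}\n" for v in sorted_chunk)
          f.close()
          return f.name

      def _read_run(path):
          with open(path) as f:
              for line in f:
                  yield float(line)

      def external_sort(values, chunk_size=1_000_000):
          """Yield values in sorted order using ~chunk_size memory."""
          runs, chunk = [], []
          for v in values:
              chunk.append(v)
              if len(chunk) >= chunk_size:
                  runs.append(_spill(sorted(chunk)))
                  chunk = []
          if chunk:
              runs.append(_spill(sorted(chunk)))
          # heapq.merge streams one element per run at a time
          yield from heapq.merge(*(_read_run(p) for p in runs))

      out = list(external_sort((random.random() for _ in range(10_000)),
                               chunk_size=1_000))
      assert out == sorted(out)
      ```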

  • Max@lemmy.world

    That’s kind of an insane amount of RAM for most simulations. Is this a machine learning thing? Is his Python code just super unoptimized? Is it possible he’s creating a bunch of big objects and never freeing the references when he’s done with them, so they’re never garbage collected?
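
    If it’s the lingering-references problem, the stdlib can point straight at it. A quick sketch with tracemalloc (leaky_sim is a made-up example of the bug):

    ```python
    import tracemalloc

    def leaky_sim(history):
        state = [float(i) for i in range(1_000_000)]
        history.append(state)  # reference survives, so it's never collected
        return sum(state)

    tracemalloc.start()
    history = []
    for _ in range(5):
        leaky_sim(history)

    # The top entries point at the lines that are holding the memory.
    for stat in tracemalloc.take_snapshot().statistics("lineno")[:3]:
        print(stat)
    ```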

  • irmadlad@lemmy.world

    The computer I’m typing on has 96 GB of RAM. Most of my equipment is ancient in PC terms. I built this one about 14 years ago and fully stocked it with the cutting-edge tech of the day; my intent was to build an LTS PC, as it were. LOL Back then SLI was the thing, but I’ve since upgraded the GPU. I still have some old stuff in the parts bin, though it’s ancient as well.

  • rumba@lemmy.zip

    AWS has the r4.8xlarge (244 GiB of RAM, 32 vCPUs) for $2.13 an hour if they can handle Linux, or $2.81 an hour for Windows. At that rate, a couple of hours of runtime comes to about $5.