For years I’ve had a dream of building a rack-mounted PC capable of splitting its resources to host multiple GPU-intensive VMs:

  • a few gaming VMs
  • a VM for work that can run DaVinci Resolve and Blender renders
  • an LLM server
  • a Stable Diffusion server
  • a media server

Just to name a few possibilities…

Every time I’ve looked into it, it seemed like the technology just wasn’t there yet. I remember Linus Tech Tips taking a shot at it a few years ago, but in the end they suggested the technology (for non-commercial entities) just wasn’t in a comfortable spot yet.

So how far off are we? Obviously AI-focused companies seem to make it work, but what possibilities exist for us self-hosters who might also want to run multiple displays in addition to web-GUI LLM servers? And without forking out crazy money for GPU virtualization software licenses?

  • TCB13@lemmy.world · 6 months ago

    The technology has “been there” for a while; it’s trivial to set up what you’re asking for. The issue is that games have anti-cheat engines that will get triggered by the virtualization and ban you.
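
    As for how it gets triggered: an ordinary KVM guest doesn’t hide the fact that it’s virtualized at all, so detection is the easy part for the anti-cheat. A minimal sketch of what the guest side exposes (assuming a stock KVM/QEMU guest with systemd; run these inside the VM):

    # Inside a stock KVM guest, the hypervisor is plainly advertised:
    systemd-detect-virt                 # prints "kvm" here, "none" on bare metal
    grep -c hypervisor /proc/cpuinfo    # the CPUID "hypervisor" bit shows up in the CPU flags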

    • socphoenix@midwest.social · 6 months ago

      Which games do that? Running a GPU passthrough setup on Windows for Destiny and Halo, at least, gave me zero issues for years.

      • You999@sh.itjust.works · 6 months ago

        Anything using Vanguard (such as Valorant and League of Legends), BattlEye (such as PUBG, Destiny 2, and Rainbow Six Siege), or Easy Anti-Cheat (such as Fortnite) blocks virtual machines. Vanguard is especially bad because, as of its last update, it will not allow you to run the game with Intel VT-x/AMD-V enabled even if you are running bare metal.

        • umbrella@lemmy.ml · 6 months ago

          This just makes me wanna install goody-two-shoes Windows on bare metal and cheat using a $5 Arduino.

  • Codilingus@sh.itjust.works · 6 months ago

    Unraid does an excellent job at this. I helped a friend set up a rack-mounted server; it runs Home Assistant, some other containers, and a VM for him to work or play games in, with an AMD GPU passed through.

  • Decronym@lemmy.decronym.xyz (bot) · 6 months ago

    Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I’ve seen in this thread:

    Fewer Letters   More Letters
    NAS             Network-Attached Storage
    NAT             Network Address Translation
    PCIe            Peripheral Component Interconnect Express
    PSU             Power Supply Unit
    VPN             Virtual Private Network

    5 acronyms in this thread; the most compressed thread commented on today has 5 acronyms.

    [Thread #807 for this sub, first seen 15th Jun 2024, 14:35]

  • Trincapinones@lemmy.dbzer0.com · 6 months ago

    I’ve recently tried to do that using Sunshine and different Linux gaming distros, and it was awful; the VM would work great for a few minutes and then suddenly crash, and I’d have to hard-stop it.

    All the people I’ve seen talking about it on the internet are using Windows VMs, so I guess I’m doing something wrong, or the only way to do it is through a Windows VM, which I won’t even try.

    • vividspecter@lemm.ee · 6 months ago

      > I’ve recently tried to do that using Sunshine and different Linux gaming distros, and it was awful; the VM would work great for a few minutes and then suddenly crash, and I’d have to hard-stop it.

      Are you running this with something like libvirtd/QEMU? If so, VFIO configurations can get pretty complex. Random crashes sound like MSI interrupt issues (or you’ve allocated too much RAM to the guest). Or it could be GPU reset issues that would also occur on the (Linux) host; a newer kernel and Mesa version in the guest may help.

      Kernel command-line setting on the host to work around MSR-related crashes:

      kvm.ignore_msrs=1
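
      In case it helps, a minimal sketch of making that persistent, assuming a Debian-style host with GRUB (adjust paths and commands for your distro):

      # set it as a KVM module option (takes effect after reloading kvm or rebooting):
      echo "options kvm ignore_msrs=1" | sudo tee /etc/modprobe.d/kvm.conf
      # or append kvm.ignore_msrs=1 to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub,
      # then run: sudo update-grub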

      If you’re running on a Windows host or with something like VirtualBox (assuming GPU passthrough is even supported there), YMMV, but I wouldn’t expect good results.

      • Trincapinones@lemmy.dbzer0.com · 6 months ago

        I’m using Proxmox with an NVIDIA 1050 that I was passing through to another VM for Jellyfin transcoding in Docker (I don’t need it anymore); because of that, I thought the drivers were set up correctly.

        The guest was Bazzite with 2 cores and 2 GB of RAM. I wasn’t even gaming, just logging into Steam and updating the system, and I had sudden crashes with Bazzite only using 1 GB according to the Summary…
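
        For reference, a rough sketch of the host-side plumbing that kind of Proxmox passthrough usually involves (the PCI address 01:00 and VM ID 100 are placeholders, not necessarily what I used):

        # on the Proxmox host: confirm the card (and its audio function) is bound to vfio-pci
        lspci -nnk -s 01:00
        # attach the device to the guest; pcie=1 needs the q35 machine type
        qm set 100 --hostpci0 0000:01:00,pcie=1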

        • vividspecter@lemm.ee · 6 months ago

          Ah, Nvidia. Bazzite uses Wayland, I believe, since it uses the same gamescope session as SteamOS (unless something has changed recently). While it may be possible to get it working, I’d expect a much better time with an AMD card.

          A traditional distribution may be a better bet with Nvidia for now.

          • Trincapinones@lemmy.dbzer0.com · 6 months ago

            Thanks. I’ve also tried changing the autologin to X11, with no success. I’ll try Nobara, but I really liked the console-like features.

  • Sethayy@sh.itjust.works · 6 months ago

    I currently have a setup exactly like this, with a Threadripper 2950X, an RX 6600, and a 2070 Super.

    Let me know if you have any questions about the specifics, but it’s 100% possible.

    The best part of this setup is being able to connect to both via Sunshine on many displays at once.

    • brownmustardminion@lemmy.ml (OP) · 6 months ago

      I’m curious about a more in-depth breakdown of your setup, if you don’t mind. What is latency like, and how are you handling switching?

      • Sethayy@sh.itjust.works · 6 months ago

        I have a rack server in the garage with a gaming PC in it, two PSUs, and the two GPUs mentioned, all running on Debian (which I plan to swap to NixOS soon).

        The AMD GPU is passed through to a Windows VM with 8 gigs or so of RAM, usually for VR development in the garage, but sometimes it’s streamed as well.

        The second GPU, the Nvidia one, goes to my Linux machine on Ubuntu (just for ease of patched Nvidia drivers), with a couple of virtual monitors set up via an xconfig like this; it’s my daily driver with 16 gigs of RAM.
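
        Roughly, that kind of headless/virtual-monitor setup for Sunshine on Nvidia looks something like the snippet below. It’s a hypothetical illustration (these are real Nvidia driver options, but it’s not the exact xconfig referenced above):

        # /etc/X11/xorg.conf.d/10-nvidia-headless.conf (hypothetical example path)
        Section "Device"
            Identifier "nvidia"
            Driver     "nvidia"
            Option     "AllowEmptyInitialConfiguration" "True"
            Option     "ConnectedMonitor" "DFP-0"
        EndSection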

        Both use virtio drivers for disk, network, and anything else I’m forgetting, with PCIe passthrough via KVM/QEMU on the host.
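
        For anyone who hasn’t done it, a stripped-down sketch of what that amounts to: a QEMU/KVM launch with VFIO passthrough plus virtio disk and network. The PCI addresses, sizes, and disk path are placeholders (find the real addresses with lspci -nn), and in practice this is usually driven through libvirt rather than typed by hand:

        qemu-system-x86_64 \
          -enable-kvm -machine q35 -cpu host -smp 8 -m 16G \
          -device vfio-pci,host=0000:0a:00.0,multifunction=on \
          -device vfio-pci,host=0000:0a:00.1 \
          -drive file=guest.qcow2,if=virtio \
          -netdev user,id=net0 -device virtio-net-pci,netdev=net0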

        I’d say the latency hangs around 5 ms when streaming both at once, and it never comes close to saturating the gigabit connection, but I’m sure some optimisations could be done somewhere along the line.

        Clients run on anything from an Xbox Series X to a random PC, and hopefully soon an Orange Pi (though I’m worried about latency).

        When I have a workload requiring both GPUs, I just keep two Moonlight windows open and use the keybinds to unfocus the mouse, then Alt+Tab to swap between them.

        I don’t have any complaints, although one time, when my thermal setup was worse, I left two copies of Subnautica running for my wife and me to play Nitrox together, and the Linux machine did start to drop FPS once we picked it up after the games had been running AFK for an hour or two.

        Edit to add: I’m mostly using this for gaming right now, but it’s handled everything (within reason) that I’ve tossed at it. I’m also planning on setting this up across a couple of other PCs sometime soon, but as of right now the VMs feel as if they’re entirely distinct PCs from an external perspective.

  • just_another_person@lemmy.world · 6 months ago

    You’re not really describing your use-case here. Are you just trying to run a server that does all your rendering for you so you can play games elsewhere? Yes, that’s totally possible.

    If you’re trying to describe a business…no, it’s not possible, scalable, or profitable.

    I’m curious as to what your intentions are here though.

    • brownmustardminion@lemmy.ml (OP) · 6 months ago

      I have a workstation I use for video editing/VFX as well as gaming. Because of my work, I’m fortunate to have the latest high-end GPUs and a 160" projector screen. I also have a few TVs in various rooms around the house.

      Traditionally, if I want to watch something or play a video game, I have to go to the room with the Jellyfin/Plex/Roku box to watch it, and I’m limited to the work/gaming rig to play games. I can’t run renders and game at the same time. Buying an entire new PC just so I can do both is a massive waste of money. And if I want to do a test screening of a video I’m working on to see how it displays on various devices, I have to transfer the file around to those devices. This is limiting and inefficient to me.

      I want to be able to go to any screen in my house (my living room TV, the large projector in my studio room, my tablet, or even my phone) and switch between:

      • my workstation display running on a Windows 10 VM
      • my Linux VM with a YouTube or Jellyfin player that I use as a daily driver
      • a Fedora or Windows VM dedicated to gaming, or maybe SteamOS
      • maybe a friend comes over for a LAN party and we can both game without having to set up a second rig
      • hosting an LLM or Stable Diffusion server without having to buy a new GPU with enough VRAM to run SDXL

      • just_another_person@lemmy.world · 6 months ago

        What you’re describing is mostly a networking issue. I’m also pretty suspicious of your setup and wishes. You definitely don’t work for a large VFX studio, and you’re not using this as described for CAD work. I’m going to guess this entire setup is for your anime and incest rendering farm.

        This is a ridiculous question for anyone with this amount of hardware in their home already that’s using it on a daily basis to actually work. You would also not be “running renders” if this was hardware provided by a company you work for.

        Whatever is being asked here is for a shady ass person. Don’t help them.

        • KairuByte@lemmy.dbzer0.com · 6 months ago

          … what?

          Them: “I want a centralized place to handle all my graphics stuff, so I can access graphically intensive things from any device.”

          You: “Must be incest renders because you already have hardware and say you use it for work.”

          So according to you, contractors don’t exist, iPhones can play PC games, and anyone wanting to split PC resources between multiple use cases is shady.

          What’s ridiculous is that you seem to think extreme paranoia is a normal thing in everyday life.