How are you sharing your GPU with multiple VMs?
I’m looking into ways to get vGPU to work on VMware with the NVIDIA Tesla series, but as far as retail cards go, you’ll be hamstrung by the lack (or rarity) of SR-IOV support.
For now I just use some low-end Quadro GPUs passed through to VMs running Docker, which carves them up on a per-container basis.
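For a concrete picture of the per-container carving, here's a minimal sketch using the Python Docker SDK (docker-py) inside one of those VMs. It assumes the NVIDIA Container Toolkit is installed on the VM, and the image name is a placeholder:

```python
import docker  # pip install docker; assumes NVIDIA Container Toolkit on the VM

client = docker.from_env()

# Pin each container to a specific GPU index so workloads don't contend.
# "my-cuda-app:latest" is a hypothetical image name.
container = client.containers.run(
    "my-cuda-app:latest",
    detach=True,
    device_requests=[
        docker.types.DeviceRequest(
            device_ids=["0"],        # GPU 0 only; use ["1"] for the next container
            capabilities=[["gpu"]],  # equivalent to `docker run --gpus device=0`
        )
    ],
)
print(container.id)
```

Same idea as passing `--gpus device=0` on the CLI; each container only ever sees the GPU it was assigned.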
Microsoft has GPU-P (GPU partitioning) as you found, which is available in Hyper-V on Windows 11 (maybe 10) and Windows Server 2025, and I believe it works on retail cards.
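If you want to script GPU-P assignment, a sketch like this works from the Hyper-V host. The cmdlets (Get-VMHostPartitionableGpu, Add-VMGpuPartitionAdapter, Get-VMGpuPartitionAdapter) are the standard Hyper-V ones, but the VM name is a placeholder and you should verify GPU-P support on your Windows build:

```python
import subprocess

def ps(command: str) -> str:
    """Run a PowerShell command on the Hyper-V host and return its output."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# List GPUs the host can partition (empty output means GPU-P isn't available).
print(ps("Get-VMHostPartitionableGpu | Format-List Name"))

# Attach a GPU partition to a hypothetical VM named 'docker-vm'.
# The VM must be powered off when the adapter is added.
ps("Add-VMGpuPartitionAdapter -VMName 'docker-vm'")
print(ps("Get-VMGpuPartitionAdapter -VMName 'docker-vm'"))
```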
For Proxmox, there’s the vgpu_unlock script, which works with some consumer NVIDIA GPUs. I’ve heard of ways to get this working on XCP-ng as well.
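Once vgpu_unlock (or any NVIDIA vGPU setup) is in place on a Proxmox host, the mediated-device profiles it unlocks show up in sysfs via the standard Linux VFIO-mdev interface. This sketch just enumerates them so you can confirm the unlock worked:

```python
from pathlib import Path

# Each PCI device that supports mediated devices (vGPU) exposes its
# available profile types under mdev_supported_types in sysfs.
for gpu in Path("/sys/bus/pci/devices").glob("*/mdev_supported_types"):
    print(f"GPU at {gpu.parent.name}:")
    for mdev_type in sorted(gpu.iterdir()):
        name = (mdev_type / "name").read_text().strip()
        avail = (mdev_type / "available_instances").read_text().strip()
        print(f"  {mdev_type.name}: {name} ({avail} instances available)")
```

If that prints nothing, the driver isn't exposing vGPU profiles and the unlock didn't take.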
It’s not for everyone, but I use Cisco Aironet APs with a virtual wireless LAN controller. Ubiquiti is popular among the community; they’re cost-effective and work well in a home/small-business environment. Aruba Instant On APs are decent as well in my experience, but they’re cloud-managed, and this is self-hosted after all :)
I’ve extensively used Cisco, Meraki, Fortinet, Cambium, Aruba, Ubiquiti, and Juniper in a professional setting. Avoid Fortinet and Cambium APs if you can; in my experience they can be pretty unstable.
Generally speaking, if you’re going to have multiple APs, you’ll want something that’s centrally managed, so the APs are aware of each other and can manage clients effectively.