I’m on Arch (btw) and I have an Intel i5-14600K CPU with an iGPU (UHD Graphics 770, GPU 1) and a dGPU from Nvidia, an RTX 3060 (GPU 0). I have one monitor connected to the 3060 via DisplayPort 1.4.

I can see both GPUs in GNOME Mission Center, but the iGPU always shows a clock speed of 0 and 0% utilization. So anything that runs on a GPU runs on the 3060.

I want to separate what runs on the iGPU from what runs on the 3060:

dGPU (RTX 3060):

  1. Video editing
  2. Video transcoding
  3. AI stuff (ollama)
  4. Machine learning
  5. Blender
  6. Steam games

iGPU (intel):

  1. Firefox (especially YouTube video decoding; it has hardware acceleration for that)
  2. Chrome
  3. LibreOffice
  4. GNOME
  5. etc.

I wonder whether this, or at least parts of it, is possible. I need the whole 12 GB of VRAM on the 3060 for ollama, and the iGPU is just sitting there doing nothing. Is there a way to distribute the work? Do I need two screens for that or something?

It might also be that I’m misunderstanding how the whole thing works or overestimating Linux’s capabilities.

  • Gayhitler@lemmy.ml · 5 days ago

    You have some good answers and some bad answers here.

    It’s not the fault of the people answering: what you’re asking for has been implemented in a piecemeal, scattershot way over the last decade, so everyone has some bizarre solution they came up with that they’re happy with.

    Allow me to share mine: use a KVM switch.

    The switch lets you plug two computers into one keyboard, video, and mouse. But you’re gonna just use the video part. Plug it into both your motherboard’s and GPU’s video ports and push the button to switch back and forth between the GPU for gaming and the motherboard for everything else.

    Why only gaming? Because everything else you reference can make use of a GPU that’s not being used for video. I guess some game engines support rendering frames and then sending them to another output device, but that’s not something to rely on.
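    For example, here’s a rough sanity check, assuming a PyTorch build with CUDA support is installed (that part is my assumption, not something from your setup): a compute job finds the 3060 through the CUDA driver no matter which GPU the monitor hangs off.

    ```python
    # Rough sketch: CUDA compute doesn't care which GPU drives the display.
    # Assumes PyTorch was installed with CUDA support (my assumption, not from the thread).
    import torch

    if torch.cuda.is_available():
        dev = torch.device("cuda:0")
        print("Compute device:", torch.cuda.get_device_name(dev))  # e.g. the RTX 3060
        x = torch.randn(4096, 4096, device=dev)  # the work lands on the dGPU
        y = x @ x
        print("Result lives on:", y.device)
    else:
        print("No CUDA device visible; check the nvidia driver / CUDA runtime.")
    ```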

    So when you’re using Blender, you see the model on your monitor plugged into the motherboard, but the heavy lifting is done by the GPU. When you transcode a video, the same thing happens.
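    Same idea for the transcode, as a rough sketch assuming an ffmpeg build with NVENC enabled (the file names are just placeholders): the decode and encode run on the 3060 while the desktop keeps getting drawn by whatever GPU is doing video.

    ```python
    # Rough sketch: push a transcode onto the RTX 3060's NVDEC/NVENC blocks.
    # Assumes ffmpeg was built with CUDA/NVENC support; "in.mkv" and "out.mp4"
    # are placeholder file names, not anything from the thread.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-hwaccel", "cuda",      # decode on the NVIDIA GPU
        "-i", "in.mkv",
        "-c:v", "h264_nvenc",    # encode with NVENC on the 3060
        "-preset", "p5",
        "out.mp4",
    ], check=True)
    ```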

    I came to this solution after trying to do what you’re asking for in X11 and having a bunch of headaches about it every time an update came down.

    Pushing a little button on the desktop was easier than messing around with software to make a Rube Goldberg contraption to do the same thing. Mine had two LEDs on either side to indicate which “computer” I was using at the time. I ended up wrapping electrical tape around the rim to cover them both up and cutting out the word “turbo” from the tape over the green LED that indicated I was looking at the GPU.