
I'm using Blender 3.5 with Cycles.

My specs are:
Processor: 11th Gen Intel® Core™ i5-11400 @ 2.60 GHz
Graphics card: NVIDIA GeForce RTX 2060
RAM: 32 GB


I've disabled my CPU in the Preferences. The CPU usage shows no more than 10%.

Sometimes when trying to render I get an error that the GPU is out of memory, even though I can see that the GPU usage rate doesn't go above 20%.

In Render Viewport mode, my stats show the GPU remaining untouched and its usage rate not moving, while the CPU usage rate jumps to 80%. It's obvious that despite all the settings Blender is NOT using my GPU but instead my CPU. It's worth mentioning that my CPU overheats up to 90 °C both in the render viewport and when rendering.

How can I render with GPU instead of CPU?

Monika G.
  • Hi, do you have a question we can answer? Blender doesn't overheat your computer; if it overheats, it's due to hardware problems: either poor ventilation, or poorly installed or inadequate cooling. None of these are on topic here, nor problems we could help you with without access to your hardware, or at least knowing what is installed. – Duarte Farrajota Ramos Jun 20 '23 at 12:14
  • Please read carefully. It's not a hardware issue. I don't want to repeat myself again. I'm clearly stating that Blender is using my CPU instead of GPU despite all the settings. This is a software problem. @DuarteFarrajotaRamos – Monika G. Jun 20 '23 at 12:19
  • I have greatly condensed your post to highlight the problem. Please check that the content still represents your problem. Otherwise just roll back the edit. =) – Leander Jun 20 '23 at 12:43
  • Make sure that you have selected a Cycles Render Device. – AlpineWorldCup Jun 20 '23 at 12:54
  • https://blender.stackexchange.com/a/179174/110840 – Allen Simpson Jun 20 '23 at 13:08
  • I'm using CUDA and I've disabled CPU there. OptiX does not seem to work for me. @AlpineWorldCup – Monika G. Jun 20 '23 at 14:12
  • @AllenSimpson It's not the same issue, because I've disabled the CPU from all settings, and it's still using it instead of the GPU. Updating the tile size seems useless since it's not using the GPU at all, I have no issue with the render time, my problem is that it's overheating my CPU, instead of using the GPU. – Monika G. Jun 20 '23 at 14:15
  • The CPU remains the brain of your machine and will still be used while the GPU renders, even for simple tasks like handling data. It can get significantly used when your render exceeds your GPU's memory. Combine that with how Windows' task manager isn't accurately showing your GPU usage by default and it's really easy to think your CPU does the render instead of the GPU, but unless there's a massive bug, it isn't the case. – L0Lock Jun 20 '23 at 14:29
  • If you checked all the above, and you are sure it's not just your scene overflowing your GPU or your task manager set up incorrectly, then the only thing you can do is go to the menu Help > Report a bug and provide a sample scene in your bug report. There's nothing else we can do here for that matter. – L0Lock Jun 20 '23 at 14:31
  • If on the contrary it might be your scene overflowing, or another issue you have control over, we can gladly help you with that! But you will need to make those checks first, give us the result, and we'll see where we can explore from there. – L0Lock Jun 20 '23 at 14:33
  • How do you know the GPU is not used? Why do you trust whatever resource monitoring software you are using? GPU usage might not be reported correctly. You could test by render time; there are benchmarks for many GPU models online. I would recommend finding out whether you actually have a problem. Also, if you render many very fast frames, the GPU might not get the chance to work if most of the work is loading the scene and writing files. – Martynas Žiemys Jun 20 '23 at 14:36
  • Oh yeah, nice to mention that. Short renders into PNGs can have this weird result of 90% of your time spent on waiting for files to write instead of rendering. Could also just be, again, a heavy scene with data to load/unload at each frame. Enabling Persistent Data when possible can save lots of troubles there. – L0Lock Jun 20 '23 at 14:39
  • Re: overheating CPU - https://blender.stackexchange.com/a/221732/110840 – Allen Simpson Jun 20 '23 at 22:31
  • Thank you all for the help. I will definitely report a bug since the scene is really not that heavy, given I've limited the display and render size of textures too. And I will definitely try OptiX, hopefully it will reduce the problem. – Monika G. Jun 22 '23 at 06:18

2 Answers


You are (most likely) already using the GPU for rendering. If you install the latest drivers, set up the System tab in the Preferences correctly by choosing OptiX (preferably) or CUDA for NVIDIA RTX GPUs, and then select GPU Compute as the Device in the Render Properties tab of the Properties editor, that is all you need to do.
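
If you want to double-check (or set) this from inside Blender rather than through the UI, here is a minimal sketch you can paste into the Scripting workspace's Python console. It assumes an NVIDIA card and only uses the standard bpy preferences API; swap "OPTIX" for "CUDA" if OptiX is not available on your setup.

    import bpy

    # The Cycles add-on preferences hold the compute backend and the device list
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "OPTIX"   # or "CUDA"
    prefs.get_devices()                   # refresh the detected device list

    for device in prefs.devices:
        # enable every non-CPU device, leave the CPU unchecked
        device.use = (device.type != "CPU")
        print(device.name, device.type, "enabled" if device.use else "disabled")

    # tell the current scene to render on the GPU
    bpy.context.scene.cycles.device = "GPU"

If the printout lists your RTX 2060 as enabled and the scene device is GPU, the renderer is configured correctly regardless of what the usage graphs claim.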

You should not concern yourself with the statistics Windows Task Manager or Resource Monitor report - they are incorrect by default. Other hardware monitoring software may also report incorrect GPU usage. That does not matter. What should matter is the time your renders take.

Sometimes when trying to render I get an error that the GPU is out of memory, even though I can see that the GPU usage rate doesn't go above 20%.

Again - you cannot see the actual usage; it is shown incorrectly. But even if it were shown correctly (which it isn't), this would make sense: if the scene does not fit into memory, the rendering computations will not start, because the data cannot be loaded into VRAM, so there is nothing the GPU can be used for in that scenario except trying to load the scene. Also, usage has nothing to do with memory, so even if usage were low, the GPU could still run out of memory. But this does not matter, because once again - you cannot see the GPU usage.

Here is my GPU definitely rendering a complex (in terms of lighting) archviz scene:

(screenshot: GPU usage while rendering)

And here it is with Blender closed:

(screenshot: GPU usage with Blender closed)

...CPU usage rate jumps to 80%...

That's right. The CPU is used no matter what during rendering: it is needed for preparing and loading the scene and for managing the render process. That is to be expected.

...my CPU is overheating up to 90 degrees when in Render viewport and when rendering.

It's not a hardware issue.

This temperature is an issue for the hardware whether you like it or not. It is possible it might eventually lead to hardware problems, although from my experience hardware is not that easy to damage; it will more likely lead to unexpected, unexplained crashes in the middle of the night on a weekend, on an office computer left to render visualisations for a meeting with a client at 10 am on Monday... because of course it will. Computers are sometimes not built with rendering in mind, and it is not uncommon for CPU coolers to be inadequate for rendering, especially if the PC is not meant to be a "workstation" computer. You might want to address that. If you are lucky, simply adjusting the cooling settings in the UEFI might be enough to fix it.

Martynas Žiemys
  • Thanks, Martynas, that makes sense. I wasn't using Task Manager to track the heat and usage, but a different software.

    I understand there isn't much to do, I'm just generally confused since, as I said, I haven't had such issues on older Blender versions, even when using a laptop. I've made far more complex scenes on a laptop than the one that's causing me problems right now. I will still report a bug, but I will definitely try to work on the cooling system and see what I can do about it.

    – Monika G. Jun 22 '23 at 06:13

The other answer is true in most cases (and it seems to be in this case too), but sometimes people don't have it configured properly, so I will go through the steps of how to set it all up. Keep in mind that I am using Linux, so if anything looks odd, that is why.

  1. Go to Edit > Preferences.
  2. Go to System.
  3. Make sure that CUDA is checked (for NVIDIA devices) and choose your GPU.

When properly configured, it should look like this: (screenshot of the Preferences System tab)

Then make sure that you have the device set to GPU Compute in the Render tab.

Note that this setting will sometimes reset after closing Blender.

(screenshot: the Render properties with the device set to GPU Compute)

Now start rendering and check your GPU usage. On Windows, open the Task Manager; on Linux, open a terminal and run nvidia-smi.

Note that, according to "GPU not rendering in Cycles":

Windows Task Manager doesn't show CUDA utilization by default (which is what Cycles uses). To have that show up, click on the "3D" above the graph and choose "Cuda" or "Compute_0" in the drop down that pops up. It will then show correct utilization.
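
If you would rather check from inside Blender than trust an external monitor, here is a small sketch (again just the standard bpy API, nothing specific to this scene) that prints what Cycles is actually configured to use; run it in the Python console before rendering:

    import bpy

    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.get_devices()  # refresh the device list

    print("Compute backend:", prefs.compute_device_type)        # CUDA, OPTIX, NONE...
    print("Scene device:", bpy.context.scene.cycles.device)     # CPU or GPU

    for device in prefs.devices:
        state = "enabled" if device.use else "disabled"
        print(f"{device.name} ({device.type}): {state}")

If the backend is NONE or the scene device is CPU, that would explain CPU-only rendering even when the Preferences look right.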

mr.hooman