A graphics processing unit (GPU) is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the rendering of images intended for output to a display.
Questions tagged [gpu]
876 questions
6 votes, 2 answers
Loading render kernels - what exactly does that mean?
When you render with CUDA (OptiX) for the first time, it displays the message "Loading render kernels." But what does that mean? Does it use the internet to retrieve some "kernels" from Nvidia servers? I have already installed the drivers and CUDA…
Crantisz
4 votes, 1 answer
eGPU + Blender + Cycles + Mac
Hi, I wanted to know if anyone is successfully using an external GPU with Blender on a Mac. If so, would anyone care to share what setup they have and any indication of benefits and pitfalls?
Thanks
Carl
4 votes, 0 answers
GPU render with the open-source AMD driver on Ubuntu 16.04
Is there any way I can do GPU renders now that fglrx is no longer available for Ubuntu? I've already installed mesa-opencl-icd, and clinfo appears to recognize my GPU: Platform Version OpenCL 1.1 MESA 11.2.0 and Device Name: AMD HAWAII (DRM…
Victoralm
4 votes, 1 answer
Why do volumetrics render noisier when using GPU than when using CPU in Cycles?
Alrighty. I've searched all over the place and even put in a couple of requests to every deity I could think of, but nothing on my issue. So, I'm messing around with making volumetric light cones for spotlights in Cycles for a project. I've…
JBJB2495
4 votes, 5 answers
Blender using wrong GPU for 3D View (this is not about rendering)
I'm using two graphics cards in my system. I recently replaced an old AMD GPU with a GT 710. Now Blender is using the GT 710 instead of the much more powerful GTX 760. Blender is now painfully slow and already stutters at 100k verts.
Anybody…
AzulShiva
3 votes, 1 answer
How do I get Cycles to use the GPU on a Linux server?
Inspired by this question: Enabling GPU rendering for Cycles?
I have access to a Linux server equipped with a GeForce GTX TITAN X, yet Cycles still renders everything with the CPU. How can I get it to exploit the GPU?
Since I am talking about a…
Sibbs Gambling
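On a headless server the usual answer is to enable the compute device from a script rather than the UI. The following is a minimal sketch, assuming a Blender 2.80+ build with CUDA support; the file names are hypothetical, and on older 2.7x builds the preferences live under `bpy.context.user_preferences` instead:

```python
# Run inside Blender, e.g.:
#   blender -b scene.blend --python enable_gpu.py -f 1
# (scene.blend and enable_gpu.py are placeholder names.)
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"  # "OPTIX" or "HIP" on newer builds
prefs.get_devices()                 # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type == "CUDA")  # tick the GPUs, leave the CPU unticked

# Tell the scene itself to render on the GPU
bpy.context.scene.cycles.device = "GPU"
```

Because this only sets Cycles preferences, it is effectively a configuration fragment and only runs inside Blender's bundled Python.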
3 votes, 1 answer
When using multiple GPUs, is there a way to set Blender to use the memory from the card with the largest amount of RAM?
I have a GeForce GTX 980 Ti and a GeForce GTX 760 installed in my system. I just tried rendering a scene that I know will render fine with the GTX 980 Ti, but when I render with both cards I get "CUDA error: Out of memory in cuArrayCreate(&handle,…
Bryson Jack
3 votes, 1 answer
Enable experimental features in Blender 2.72
I'm using Blender 2.72 and I want to enable the experimental feature set, which supports OpenCL for my AMD/ATI graphics card (I know the risks of doing this). Googling it, I found a Blender Wiki page for 2.6x which said I have to change the…
Creator13
3 votes, 2 answers
Blender 2.8 can't find my Intel GPU
I was following a tutorial on YouTube when I realized that in render mode GPU is clickable but does not work. I tried doing what I saw online but couldn't find my GPU in the system preferences. The information about my PC is: Intel(R) Core(TM)…
UnSpecific Coding
2 votes, 1 answer
2 GPUs with different architectures (Maxwell and Pascal) rendering together
Is it possible to use 2 GPUs with different architectures for the final render? Specifically a Maxwell Titan X and a Pascal Titan Xp.
Felipe
2 votes, 0 answers
Can't select AMD GPU running Ubuntu
I'm currently running Xubuntu 16.04 and I would like to use my 2 GPUs for rendering in Blender. They are currently unused and my CPU is doing all the work. But what have I got my GPUs for anyway?
So, I tried the tutorials here on Stack Exchange…
Oxy Synth
2 votes, 1 answer
Towel hairs taking up too much GPU memory
I have an Nvidia GeForce GTX 750 GPU with two gigabytes of memory. I followed Andrew Price's tutorial on how to make realistic towels in Blender, and I was only able to do 700,000 hairs (before my GPU ran out of memory), instead of the 1,000,000…
Anson Savage
2 votes, 0 answers
Rendering frames headless in parallel on an A10 GPU
I am trying to render 4 frames in parallel using a Blender ".blend" file. I am using the Blender 3.4 CLI. There is only one camera in the scene. When I try to render multiple frames using the animation command, they render one after the other.
Animation…
Nikhil Saini
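Blender's `-a` (animation) flag renders frames sequentially by design, so parallelism has to come from launching one background process per frame with `-f`. A rough sketch of that pattern, assuming `blender` is on `PATH` and using a hypothetical `scene.blend`:

```python
import subprocess

def render_commands(blend_file, frames):
    """One headless-Blender invocation per frame; -f renders a single frame."""
    return [["blender", "-b", blend_file, "-f", str(f)] for f in frames]

def render_parallel(blend_file, frames):
    """Launch every frame's render at once, then wait for all to finish."""
    procs = [subprocess.Popen(cmd) for cmd in render_commands(blend_file, frames)]
    return [p.wait() for p in procs]
```

Calling `render_parallel("scene.blend", [1, 2, 3, 4])` would start four Blender instances at once. Note the trade-off: each instance loads the full scene into VRAM, so on a single A10 four concurrent renders can exhaust GPU memory faster than rendering frames back to back.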
2 votes, 0 answers
How do I get Blender to render using the GPU on Google Colab?
I've been trying to use Google Colab to render animations but I can't get it to render using the graphics card. I change the hardware accelerator to GPU, but it doesn't use the GPU to render; it only uses the RAM. I think this is the case because it…
Shady Ayman
2 votes, 1 answer
Can you use 2 non-SLI GPUs in Blender?
Can you use 2 non-SLI GPUs with 2 different architectures in Blender, for example a GeForce GTX 560 and a Quadro FX 4000?
SupaKoopaTroopa64