Using multiple GPUs for rendering in Iray?

Now I have a question.

How do you make Iray use multiple GPUs to render? And how do you connect those multiple GPUs?

I have heard that even if there are multiple GPUs, Iray will still use only one GPU while rendering. Strange things to learn each day :(

Thanks!

 


Comments

  • fastbike1fastbike1 Posts: 4,078
    edited June 2017

    What you have heard is correct ONLY for some specific conditions.

    This advice is for Studio and Iray rendering only. It may or may not apply to other render engines.

    Just put the GPUs in the machine. They both have to be Nvidia GPUs that support CUDA (i.e. fairly recent). The GPUs do not have to be identical but there are some caveats I will explain in a bit.

    You need to be running the 64-bit version of Studio, which means a 64-bit operating system. You want version 4.9.3.166 of Studio (currently the latest production release) and a recent NVIDIA driver (372.90 or later will be safe; that driver is stable with Studio).

    In Studio, under Render Settings Tab > Editor > Advanced, make sure both GPUs are selected. Studio should do this automatically, but may not. See screenshot.

    Now for the GPUs:

    Iray will use all GPUs IF the scene will fit in the VRAM of each GPU. VRAM is NOT shared. So if one GPU is 2GB and the other is 4GB, a 2GB scene will use both cards, a 4GB scene will use just the 4GB card, and a scene > 4GB will not use any cards and will be rendered by the CPU (SLOW). Estimating the amount of VRAM needed before rendering is difficult. There are methods for using less VRAM in a scene, but that needs a separate discussion (several threads are available).
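    That rule can be sketched as a tiny selection function. This is just an illustration of the logic described above, not actual Iray code, and `select_render_devices` is a hypothetical name:

```python
def select_render_devices(scene_vram_gb, gpu_vram_gb):
    """Pick which GPUs can render a scene, per the rule above:
    the whole scene must fit in EACH card's VRAM (VRAM is not pooled)."""
    usable = [vram for vram in gpu_vram_gb if scene_vram_gb <= vram]
    if usable:
        return "GPU", usable   # every card that fits participates
    return "CPU", []           # scene too big for all cards: slow CPU fallback

# A 2GB scene uses both a 2GB and a 4GB card; a 4GB scene only the 4GB card.
print(select_render_devices(2, [2, 4]))  # -> ('GPU', [2, 4])
print(select_render_devices(5, [2, 4]))  # -> ('CPU', [])
```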

    Iray will share CUDA cores across all GPUs that it can use (this is the advantage of multiple GPUs). Without getting into too many caveats and details, more CUDA cores mean faster renders. Not better, just faster.

    If the clock speed of the GPUs is different, Iray will slow the fastest card(s) to match the slowest.

    Ok?

    Capture.JPG
    514 x 646 - 36K
    Post edited by fastbike1 on
  • alexhcowleyalexhcowley Posts: 2,392
    edited June 2017

    You don't need special connectors to link the GPUs.  Iray doesn't like SLI.  I've rendered using a 1080 and my old 970.  Just plug them into the motherboard and DAZ Studio should recognise them.

    A couple of things to bear in mind: make sure your PSU has enough wattage to handle multiple GPUs (I use an 850 watt unit for my two), and it also helps if you've got a large, well-ventilated case.

    Cheers,

    Alex.

    Post edited by alexhcowley on
  • Oso3DOso3D Posts: 15,053

    I have two GTX 970s. They work fine with no fuss.

     

  • PA_ThePhilosopherPA_ThePhilosopher Posts: 1,039
    edited June 2017

    How do you make Iray use multiple GPUs to render? And how do you connect those multiple GPUs?

     

     

    It is very simple:

    1. Install GPUs
    2. Tick on GPUs in render settings
    3. Done.

    As in the old commercials, "it's so simple a caveman can do it."

    -P

    Post edited by PA_ThePhilosopher on
  • BobvanBobvan Posts: 2,653
    edited June 2017

    I don't know why the dude at NCIX is against multiple cards, citing conflicts. Sounds like I may get a Pascal 10xx series as a replacement for the 980. It would be nice to add it to the 1080 Ti.

    Post edited by Bobvan on
  • PA_ThePhilosopherPA_ThePhilosopher Posts: 1,039
    edited June 2017
    Bobvan said:

    I don't know why the dude at NCIX is against multiple cards, citing conflicts. Sounds like I may get a Pascal 10xx series as a replacement for the 980. It would be nice to add it to the 1080 Ti.

    Now that I own two 1080 Ti's, I am even more of a believer in them (which says a lot, coming from a quad 780 Ti setup). Having just two of them scores a 402 on OctaneBench, which is equivalent to four 980's. That's insane.

    I've also noticed that scenes load much faster in the viewport, which is a nice bonus. And the wait time in the viewport from basic shader to Iray is much faster. That 1900 MHz clock just kills it (not to mention the 3,584 CUDA cores).

    -P

    Post edited by PA_ThePhilosopher on
  • JamesJABJamesJAB Posts: 1,762
    Bobvan said:

    I don't know why the dude at NCIX is against multiple cards, citing conflicts. Sounds like I may get a Pascal 10xx series as a replacement for the 980. It would be nice to add it to the 1080 Ti.

    It sounds like the guy you are dealing with at NCIX only has experience from a video gamer perspective.  It does not sound like he knows about the amateur and semi-pro 3D rendering crowd, or GPU-based render engines like Iray. Tell him that the dual-GPU setup you want is for Iray rendering.  If he tries discouraging you, ask if he even knows what Iray is.  He might not.

    ***Back on topic***
    Plug 2 or more Iray-capable video cards into your computer.  Do NOT install any SLI bridges.  Install the GPU drivers.  Open Daz Studio and ensure that all of your GPUs are shown and checked in the advanced render settings. Load a scene, and hit render.

  • BobvanBobvan Posts: 2,653
    edited June 2017
    JamesJAB said:
    Bobvan said:

    I don't know why the dude at NCIX is against multiple cards, citing conflicts. Sounds like I may get a Pascal 10xx series as a replacement for the 980. It would be nice to add it to the 1080 Ti.

    It sounds like the guy you are dealing with at NCIX only has experience from a video gamer perspective.  It does not sound like he knows about the amateur and semi-pro 3D rendering crowd, or GPU-based render engines like Iray. Tell him that the dual-GPU setup you want is for Iray rendering.  If he tries discouraging you, ask if he even knows what Iray is.  He might not.

    ***Back on topic***
    Plug 2 or more Iray-capable video cards into your computer.  Do NOT install any SLI bridges.  Install the GPU drivers.  Open Daz Studio and ensure that all of your GPUs are shown and checked in the advanced render settings. Load a scene, and hit render.

    So if I don't have them installed with SLI, how do I ask to have them installed? As you mentioned, I may wind up with a better card as a replacement. My claim was accepted, and it mentions that if they don't have the card, the replacement may be equivalent or better, so if it's a Pascal card I will most likely want to have it added. BTW, my new 1080 Ti is not EVGA, it's a Gigabyte.

    Post edited by Bobvan on
  • Richard HaseltineRichard Haseltine Posts: 104,134

    SLI is just a connector - you could ask for it to be left off, or you could let them add it and just disable it for DS in the driver (I believe).

  • Romulus71Romulus71 Posts: 144

    You can have the cards installed with an SLI bridge. When you want to use DS just open up the Nvidia control panel and disable SLI

  • BobvanBobvan Posts: 2,653
    Romulus71 said:

    You can have the cards installed with an SLI bridge. When you want to use DS just open up the Nvidia control panel and disable SLI

    Ok, thanks. I'm curious about GeForce Experience, the control panel, and drivers. Does one install a different control panel for each card, or do Experience and the control panel see both cards?

  • MattymanxMattymanx Posts: 6,974

    The GeForce control panel handles ALL Nvidia GPUs installed in the computer.  Also, as stated, SLI is just a small bridge between the cards, and it can be enabled or disabled via the control panel.  I do that all the time for gaming and then shut it off when I'm done so I can use Iray.

  • BobvanBobvan Posts: 2,653
    edited June 2017

    So all drivers etc. are under Experience and the control panel, got it. If the NCIX guy does not feel competent I will find someone who is. I guess I also have to make sure the power supply can handle it. It all depends on what EVGA sends back. As per Nyghtfall, if they send back a 980 Ti it can be useful, since they still sell them on their site.

    Post edited by Bobvan on
  • SPadhi89SPadhi89 Posts: 170
    fastbike1 said:

    What you have heard is correct ONLY for some specific conditions.

    This advice is for Studio and Iray rendering only. It may or may not apply to other render engines.

    Just put the GPUs in the machine. They both have to be Nvidia GPUs that support CUDA (i.e. fairly recent). The GPUs do not have to be identical but there are some caveats I will explain in a bit.

    You need to be running the 64-bit version of Studio, which means a 64-bit operating system. You want version 4.9.3.166 of Studio (currently the latest production release) and a recent NVIDIA driver (372.90 or later will be safe; that driver is stable with Studio).

    In Studio, under Render Settings Tab > Editor > Advanced, make sure both GPUs are selected. Studio should do this automatically, but may not. See screenshot.

    Now for the GPUs:

    Iray will use all GPUs IF the scene will fit in the VRAM of each GPU. VRAM is NOT shared. So if one GPU is 2GB and the other is 4GB, a 2GB scene will use both cards, a 4GB scene will use just the 4GB card, and a scene > 4GB will not use any cards and will be rendered by the CPU (SLOW). Estimating the amount of VRAM needed before rendering is difficult. There are methods for using less VRAM in a scene, but that needs a separate discussion (several threads are available).

    Iray will share CUDA cores across all GPUs that it can use (this is the advantage of multiple GPUs). Without getting into too many caveats and details, more CUDA cores mean faster renders. Not better, just faster.

    If the clock speed of the GPUs is different, Iray will slow the fastest card(s) to match the slowest.

    Ok?

    Understood and thanks!

  • namffuaknamffuak Posts: 4,264
    fastbike1 said:
    If the clock speed of the GPUs is different, Iray will slow the fastest card(s) to match the slowest.

    I'm not sure where this bit of misinformation comes from, but it is wrong - at the very least, not true for all environments. I have a 980 Ti and a 1080. The 1080 runs at a GPU clock of 1949 MHz and a memory clock of 4513 MHz; the 980 Ti runs at a GPU clock of 1240 MHz and a memory clock of 3304 MHz. And these two cards will run at those speeds for the duration of the render; the 1080 does NOT slow down to match the 980 Ti.

  • JamesJABJamesJAB Posts: 1,762

    That misinformation is based on SLI.
    You do not need identical brand and speed cards to run SLI, just the same model.
    When running in SLI, because the cards share the frame rendering for real-time display, they both need to run at the same speed, so SLI runs both cards at the speed of the slower of the two.

    THIS IS NOT HOW IRAY WORKS.

     

    Iray just uses all compatible Nvidia cards installed in the system at the rated speed for each card.

  • Silver DolphinSilver Dolphin Posts: 1,628
    edited June 2017

    Almost everything has been covered for multiple GPUs: a large power supply, no SLI connectors, video RAM, and GPU speed. The most critical thing with multiple cards is COOLING. Yes, you need to download MSI Afterburner, take the cards off the auto fan profile (which only works for gaming), turn the fans to 100%, and keep an eye on temps while you are rendering, or you will burn up your expensive video card. Additionally, in the summer, if you live in a warmer state or don't have air conditioning, you need to take off the PC side panel and point a 110-volt fan at your PC to get cool air to your cards. Iray is a video card killer! If you are not careful it will cook your video card. Most video card fan profiles are made with gaming in mind, but Iray stresses the cooling on your video card like nothing else. This is something that can't be stressed enough.

    On a side note: doing Iray on laptops is just going to kill your laptop really fast, because laptops do not have adequate cooling for video cards the way a PC does. Gaming is OK on these laptops, but doing Iray on Nvidia-equipped laptops is going to cook your expensive laptop.
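    Keeping an eye on temps can also be scripted. As a rough sketch (the `hot_gpus` helper is hypothetical, though the `nvidia-smi` query flags shown in the docstring are real), you could parse nvidia-smi's CSV output and flag any card running hot:

```python
import csv
import io

def hot_gpus(nvidia_smi_csv, limit_c=80):
    """Parse the output of
    `nvidia-smi --query-gpu=index,name,temperature.gpu --format=csv,noheader`
    and return (index, name, temp) for every GPU at or above limit_c Celsius."""
    rows = csv.reader(io.StringIO(nvidia_smi_csv), skipinitialspace=True)
    return [(int(i), name, int(t)) for i, name, t in rows if int(t) >= limit_c]

# Sample output for a two-card box like the ones discussed in this thread:
sample = "0, GeForce GTX 1080 Ti, 83\n1, GeForce GTX 980, 74\n"
print(hot_gpus(sample))  # -> [(0, 'GeForce GTX 1080 Ti', 83)]
```

    In practice you would feed it the live output of `subprocess.run(["nvidia-smi", ...])` in a loop while a render is running.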

    Post edited by Silver Dolphin on
  • BobvanBobvan Posts: 2,653

    Agreed, hence why I don't render with my ROG laptop, just build characters and set up scenes. I only render with the new beast. Speaking of which, I installed Afterburner and crank up the fans when rendering, which in turn can cool things down 10 degrees and sometimes more. Will this not wear out the card fans?

  • JamesJABJamesJAB Posts: 1,762
    Bobvan said:

    Agreed, hence why I don't render with my ROG laptop, just build characters and set up scenes. I only render with the new beast. Speaking of which, I installed Afterburner and crank up the fans when rendering, which in turn can cool things down 10 degrees and sometimes more. Will this not wear out the card fans?

    It will put more wear and tear on the fans.  They should still last at least until the card is obsolete.  (If for some reason the fans wear out before then, keep in mind that replacing a fan is going to be extremely cheap compared to replacing the whole card.)

  • BobvanBobvan Posts: 2,653
    edited June 2017

    Fair enough. I'm lucky that the 980 EVGA was still covered. Even if they send a 980 Ti, which they sell for 199.00, I can still look at adding it to my main system.

    Post edited by Bobvan on
  • JamesJABJamesJAB Posts: 1,762
    edited June 2017
    Bobvan said:

    Fair enough. I'm lucky that the 980 EVGA was still covered. Even if they send a 980 Ti, which they sell for 199.00, I can still look at adding it to my main system.

    If they inform you that they are going to send you an "equivalent" card, make sure to do your research into prices.  The current market value of your GTX 980 is irrelevant because it's still under warranty.  Go into negotiation mode, citing that the GTX 980 was a $550 card when purchased and is still covered under the original warranty.  From a pure equivalence standpoint they should upgrade you to a GTX 1080; from there you can let them haggle you down to a GTX 1070.  (A card that they currently sell for $199 should not be a warranty replacement for a card you paid $550 for.)

     

    Ignore most of what was written above; I just went and read EVGA's warranty.
     

    • Products sent in for RMA will be repaired and returned or replaced with a thoroughly tested recertified product of equal or greater performance.
    Post edited by JamesJAB on
  • fixmypcmikefixmypcmike Posts: 19,619
    JamesJAB said:
    Bobvan said:

    Fair enough. I'm lucky that the 980 EVGA was still covered. Even if they send a 980 Ti, which they sell for 199.00, I can still look at adding it to my main system.

    If they inform you that they are going to send you an "equivalent" card, make sure to do your research into prices.  The current market value of your GTX 980 is irrelevant because it's still under warranty.  Go into negotiation mode, citing that the GTX 980 was a $550 card when purchased and is still covered under the original warranty.  From a pure equivalence standpoint they should upgrade you to a GTX 1080; from there you can let them haggle you down to a GTX 1070.  (A card that they currently sell for $199 should not be a warranty replacement for a card you paid $550 for.)

     

    Ignore most of what was written above; I just went and read EVGA's warranty.
     

    • Products sent in for RMA will be repaired and returned or replaced with a thoroughly tested recertified product of equal or greater performance.

    Although how they define "equal or greater performance" may not match how you would, for example more CUDA cores but less RAM.

  • JamesJABJamesJAB Posts: 1,762
    JamesJAB said:
    Bobvan said:

    Fair enough. I'm lucky that the 980 EVGA was still covered. Even if they send a 980 Ti, which they sell for 199.00, I can still look at adding it to my main system.

    If they inform you that they are going to send you an "equivalent" card, make sure to do your research into prices.  The current market value of your GTX 980 is irrelevant because it's still under warranty.  Go into negotiation mode, citing that the GTX 980 was a $550 card when purchased and is still covered under the original warranty.  From a pure equivalence standpoint they should upgrade you to a GTX 1080; from there you can let them haggle you down to a GTX 1070.  (A card that they currently sell for $199 should not be a warranty replacement for a card you paid $550 for.)

     

    Ignore most of what was written above; I just went and read EVGA's warranty.
     

    • Products sent in for RMA will be repaired and returned or replaced with a thoroughly tested recertified product of equal or greater performance.

    Although how they define "equal or greater performance" may not match how you would, for example more CUDA cores but less RAM.

    The GTX 980 is a 4GB card, so anything they could realistically see as an equivalent card will have at least 6GB of VRAM:
    GTX 980ti 6GB
    GTX 1060 6GB (if they try giving you a 3GB version, fight them on that, because it is slower all around in benchmarks)
    GTX 1070 8GB
    GTX 1080 8GB GDDR5X

  • BobvanBobvan Posts: 2,653

    A 980 Ti would be acceptable, given that Nyghtfall benchmarked a 980 Ti AND a 1080 Ti together at 1 minute vs my 1:59 with just the 1080 Ti, and he probably has a slower processor. Which does confirm I went overboard with my processor, since with Iray it's more about the GPU and its VRAM.

  • JamesJABJamesJAB Posts: 1,762
    Bobvan said:

    A 980 Ti would be acceptable, given that Nyghtfall benchmarked a 980 Ti AND a 1080 Ti together at 1 minute vs my 1:59 with just the 1080 Ti, and he probably has a slower processor. Which does confirm I went overboard with my processor, since with Iray it's more about the GPU and its VRAM.

    You did not go overboard... Just think of it as future-proofing.
    Look at the desktop in my signature.  Dual Xeon X5570 CPUs.  It's a pair of quad-core Xeons based on the 1st generation Core i7 architecture.  It is still a fast machine by today's standards (though I'm looking at upgrading to a pair of six-core Xeon X5670 or X5675 CPUs for somewhere around $100 - 130).

    You have just made sure that your CPU will still be considered fast in 3 or 4 years.

  • BobvanBobvan Posts: 2,653
    edited June 2017

    Fair enough. At the end of the day I have vastly improved times. I benchmarked my ROG 980M laptop for sh*ts and giggles: over 5 minutes. As far as the 980 card goes, good thing it came with a 3-yr warranty.

    Post edited by Bobvan on
  • JamesJABJamesJAB Posts: 1,762

    Here's what my "little" notebook does

    Core i7-3840QM @ 2.80 GHZ (4 cores 8 threads) 16GB RAM
    Nvidia Quadro K5000M 4GB RAM (core config equivalent to GTX 680M)

    12 minutes 38 seconds - CPU/GPU (OptiX on) (CPU holds @ 3.31 GHz boost throughout the entire render)
    15 minutes 20 seconds - GPU only (OptiX on) (GPU holds at a cool 71°C under full load)

     

    I miss the old days when XFX still made Nvidia cards.  They had a "Double Lifetime Warranty" on their cards: a card was covered for as long as the first and second owners owned it (provided that you registered the card on their site).  I bought their passively cooled GeForce 7950 GT back in 2006 and used it for a good four years before giving it to a friend in need, and they used it for at least another four years.  (Eight years out of a high-end gaming card with no fan.)

  • BobvanBobvan Posts: 2,653

    Specs of laptop with GTX 980M, which is weaker than a full 980 from what I was told

    75d516afcc8c2653e343e47906873b.jpg
    543 x 271 - 28K
  • JamesJABJamesJAB Posts: 1,762

    It is more in line with the desktop GTX 970.  Well, slightly less powerful, but better.

    The 970 has an amount of RAM that does not line up with its core config, so the last 512MB of RAM have a terrible 32-bit bus.  (Its RAM is configured as such: 3.5GB @ 224-bit and 0.5GB @ 32-bit.)

    Your GTX 980M has a 256-bit interface, and OEMs can choose 4GB or 8GB configurations.
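    A quick bit of arithmetic shows how the 970's split partitions still add up to the advertised totals (just re-stating the numbers quoted above):

```python
# GTX 970 memory partitions as described above: (size in GB, bus width in bits)
partitions = [(3.5, 224), (0.5, 32)]

total_gb = sum(size for size, _ in partitions)
total_bus = sum(bus for _, bus in partitions)

print(total_gb, total_bus)  # -> 4.0 256
```

    So the card is still "4GB @ 256-bit" on the box, but only 3.5GB of it sits on the wide bus.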

  • Nyghtfall3DNyghtfall3D Posts: 799
    edited June 2017
    Bobvan said:

    A 980 Ti would be acceptable, given that Nyghtfall benchmarked a 980 Ti AND a 1080 Ti together at 1 minute vs my 1:59 with just the 1080 Ti, and he probably has a slower processor. Which does confirm I went overboard with my processor, since with Iray it's more about the GPU and its VRAM.

    Yup, much slower:  3.5 GHz Core i7-4770K.

    On the bright side, your CPU will absolutely rock LuxRender, should you decide to give Reality another whirl.  ;)

    Post edited by Nyghtfall3D on