GTX 1080 Ti and GT 730 Driver incompatibility

Szark Posts: 10,634
edited November 2018 in The Commons

I wonder if anyone has come across this. The GT 730's drivers won't go past 391.35, while the GTX 1080 Ti is up to 411.94 and all good. But Windows has decided that there is an issue with the GT 730 and stopped it working, citing Code 43. When I try to update the drivers it resets the 1080 Ti's drivers to 391.35.

After some back and forth I have come to the conclusion that the new and the old don't like each other and I can't have both.

So if that is the case, then buying a smaller GTX should fix the problem, allowing me to have the 1080 Ti as a dedicated rendering GPU?

Which brings me to the next part: how to set the default start-up card for the monitor when I do get a new small card. I am pretty good when it comes to Googling stuff, but it isn't happening for me today. :)


BTW I have an Asus B250 Plus MB and there is nothing in the BIOS to set the default card, or at least nothing that I could see or find in the manual.


Post edited by Szark on

Comments

  • Szark Posts: 10,634

    I found the answer to the first part: both cards should be the same generation, so I will have to get another GTX, a small one.


    Now that begs the question of how to set a default monitor graphics card on boot-up.

  • outrider42 Posts: 3,679
    Cards don't have to be the same generation, just supported. Here is what happened: some 730s are based on Fermi, and Nvidia ended Fermi support earlier this year, so they will not get any more updates. Other 730s are Kepler and are not affected. That is a problem with some of the lower-end GPUs out there; you don't really know what Nvidia may have put in them, because there are so many variations. The same goes for the 1030 (I think some are Maxwell based instead of Pascal). And there is a particular 1030 you need to avoid at all costs: one that uses DDR memory instead of GDDR. The change is not marked very well on the package, but it greatly reduces performance, in some cases to HALF. So if you are looking for a 1030 to replace the 730, be careful!

    Keep in mind that no matter what, Windows will reserve some memory on every GPU installed, even one with no display plugged into it. So your net gain may be small.
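The DDR-versus-GDDR warning above comes down to memory bandwidth. A minimal back-of-the-envelope sketch, assuming the commonly published GT 1030 figures (64-bit bus on both variants, roughly 6 GT/s memory on the GDDR5 card and roughly 2.1 GT/s on the DDR4 one):

```python
# Rough peak memory-bandwidth comparison of the two GT 1030 variants.
# Assumed figures (not from this thread): both use a 64-bit memory bus;
# the GDDR5 variant runs ~6 GT/s, the DDR4 variant ~2.1 GT/s.

BUS_WIDTH_BITS = 64

def bandwidth_gbs(transfer_rate_gts: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes moved per transfer."""
    return transfer_rate_gts * bus_bits / 8

gddr5 = bandwidth_gbs(6.0)  # GDDR5 variant
ddr4 = bandwidth_gbs(2.1)   # DDR4 variant
print(f"GDDR5: {gddr5:.0f} GB/s, DDR4: {ddr4:.1f} GB/s ({ddr4 / gddr5:.0%} of GDDR5)")
```

On those assumed clocks the DDR4 card has only about a third of the GDDR5 card's bandwidth, which is why real-world performance can drop so sharply.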

  • namffuak Posts: 4,264
    Szark said:

    I found the answer to the first part: both cards should be the same generation, so I will have to get another GTX, a small one.


    Now that begs the question of how to set a default monitor graphics card on boot-up.

    As I understand it, for Windows the default monitor will be on the first video card found, but I'm not sure if that really applies with a UEFI BIOS. I've got both my monitors running on the card in slot 1, with the default connected by VGA and the second attached by HDMI.

    Anyone running a monitor on the second or third card?

  • Szark Posts: 10,634

    Yeah, that sounds right to me, but hey, I don't know much about the hardware side of things. I know a little but not a lot. :)


    I just have one Monitor.

  • outrider42 Posts: 3,679
    edited November 2018
    Just leave the 1080 Ti unplugged if you are not using it for a display. Windows 10 uses plug and play, meaning video connections are always ready to be plugged in or removed. This is also why Windows reserves a little VRAM on each card installed.

    Otherwise, display options are in the Windows Control Panel; you can choose your display there. But in general, as was said, the first card found is made the display by default, and that is the first card slot. Small card in slot 1, 1080 Ti in slot 2.
    Post edited by outrider42 on
  • Szark Posts: 10,634
    edited November 2018

    OK, but my PCIe 3.0 x16 (x16 mode) slot is slot one. That is where I will get full speed out of the 1080 Ti.

    The PCIe 3.0 x16 (x4 mode) slot is slot 3, and I have to have my old GT 730 in a PCI slot way down at the bottom of the MB.

    I did find that the PCI slot I have my old card in could share resources with the main PCIe 3.0 x16 slot, so I will try moving it right to the bottom PCI slot and see what happens.


    So yeah, if I can get Windows to leave the old card alone I might get both working. At the moment it is no biggie, but it would be nice to see. :)


    Thx outrider

    Post edited by Szark on
  • outrider42 Posts: 3,679
    edited November 2018

    PCIe speed has no effect on rendering at all. Puget Systems tested this by running cards at x8 and x16 and saw no difference, and I don't believe x4 makes a difference either. So don't worry about that. It makes a difference in games and perhaps some other software, but Iray plays by different rules.

    You could even plug a 1080 into a PCIe 2.0 slot on a very old motherboard and still get the same rendering speed as you would on PCIe 3.0.

    I don't know exactly why, but I believe it is because of how Iray loads the entire scene onto the GPU. Once the scene is loaded, there is little real data transfer going on while the GPU runs its calculations. Even in multiple-GPU systems the PCIe speed made little to no difference. In multi-GPU systems the trend indicated that more CPU cores made rendering faster by a small amount; for example, a high-core-count Xeon machine could render faster than an i7 with the same set of GPUs. Again, this was only when multiple GPUs are used in rendering, which you are not doing.

    Post edited by outrider42 on
  • Putting a GPU into an x4 slot would mean it takes a little longer to load the scene onto the card, but we're talking very little extra time. PCIe 3.0 can transfer just under a gigabyte per second per lane, so an x4 slot moves roughly 3.9 GB/s and an x16 slot roughly 15.8 GB/s. Transferring an entire 11 GB scene over an x4 connection would take roughly 3 seconds, plus whatever housekeeping is involved, versus under a second on an x16 connection. I guess if you were running a really massive batch render that difference could add up, but...
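The arithmetic above can be sketched as a quick calculation. These are theoretical peak figures for PCIe 3.0 (8 GT/s per lane with 128b/130b encoding, about 0.985 GB/s per lane); real transfers add protocol and housekeeping overhead:

```python
# Back-of-the-envelope PCIe scene-load time estimate.
# Assumption: PCIe 3.0 peak of ~0.985 GB/s per lane (8 GT/s, 128b/130b encoding).
PCIE3_GBPS_PER_LANE = 0.985

def transfer_seconds(scene_gb: float, lanes: int) -> float:
    """Seconds to move scene_gb gigabytes over a PCIe 3.0 link with the given lane count."""
    return scene_gb / (PCIE3_GBPS_PER_LANE * lanes)

scene_gb = 11.0  # a scene filling a 1080 Ti's 11 GB of VRAM
for lanes in (4, 8, 16):
    print(f"x{lanes}: {transfer_seconds(scene_gb, lanes):.1f} s")
```

Even in the worst case (a full 11 GB scene over x4) the load penalty is a couple of seconds per render, which is why it only matters for large batch jobs.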

  • Szark Posts: 10,634

    Things are moving off track here, away from my original question, so thanks for the help so far. I will put the old graphics card in the bottom slot and see if it works just for the monitor and nothing else. :)
