How can I stress test an installed second video card?

I got my GTX 1080 and would like to stress test it, since I can't use it with DAZ Studio yet (I only have the General Release). The problem is that it is installed as the second video card. I tried FurMark and Kombustor, and I didn't see any way to select individual cards or even get them to stress test the second card.

Comments

  • sriesch Posts: 4,243

    I'm unfamiliar with the tools you mentioned, but could you just temporarily remove the other graphics card so this is the only one, forcing it to be used in the test?

  • In GPU-Z, on the Graphics Card tab, you can click the question mark (next to Bus Interface) to run a mini stress test, but perhaps you are looking for a more thorough test than that.
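
    If the graphical tools insist on running on the primary card, another option is a tiny CUDA program that pins its work to whichever device index you give it. This is only a rough sketch I haven't run on your setup, and it assumes you have the CUDA toolkit installed so you can build it with nvcc:

        // burn.cu - keep one specific GPU busy with FMA work (untested sketch)
        // build: nvcc -O2 burn.cu -o burn
        // run:   burn 1        <- device index as shown by GPU-Z / nvidia-smi
        #include <cstdio>
        #include <cstdlib>
        #include <cuda_runtime.h>

        __global__ void burn(float *out, int iters)
        {
            float v = threadIdx.x * 0.001f + 1.0f;
            for (int i = 0; i < iters; ++i)          // long dependent FMA chain keeps the SMs hot
                v = v * 1.0000001f + 0.0000001f;
            out[blockIdx.x * blockDim.x + threadIdx.x] = v;  // store so the loop isn't optimized away
        }

        int main(int argc, char **argv)
        {
            int dev = (argc > 1) ? atoi(argv[1]) : 0;    // which card to stress
            if (cudaSetDevice(dev) != cudaSuccess) {
                fprintf(stderr, "cudaSetDevice(%d) failed\n", dev);
                return 1;
            }
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, dev);
            printf("Stressing device %d: %s (Ctrl+C to stop)\n", dev, prop.name);

            const int blocks = 4096, threads = 256;
            float *out = NULL;
            cudaMalloc(&out, blocks * threads * sizeof(float));

            for (;;) {                                   // run until you kill it
                burn<<<blocks, threads>>>(out, 1 << 18);
                cudaDeviceSynchronize();                 // short launches, so the display driver stays responsive
            }
        }

    Keep GPU-Z or HWMonitor open on that card's sensors while it runs, and stop it with Ctrl+C once you've seen enough.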

     

  • nDelphi Posts: 1,877

    The last thing I want to do is remove cards; let's just say that I had to rearrange things to get that GTX 1080 into the case.

    I tried GPU-Z, but no matter which card I have selected, the test runs on the primary card, my GTX 960. It's the same behavior I'm seeing with the other stress test tools. The card is visible to the OS, DAZ Studio, etc., and even the stress test applications can see it and report on it. I am baffled.

  • nicstt Posts: 11,715
    edited November 2016

    Why do you want to?

    Stress-testing is done by rendering, which is presumably what you're going to use the card for. That way you get realistic data from actual usage.

    GPU-Z gives good info while you're rendering.

    But if you want temperatures, I'd use something like HWMonitor: http://www.cpuid.com/downloads/hwmonitor/hwmonitor_1.30.exe (or log them yourself; see the sketch at the end of this post).

    There are benefits to stress testing, but there are also risks; I tend to avoid it for my own use.
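
    If you'd rather have a log than a window to watch, a small NVML program can print the temperature and load of every installed card while you render. Again only an untested sketch; it assumes the CUDA toolkit (which includes the NVML header) is installed, and on Windows you link against nvml.lib rather than -lnvidia-ml:

        // gputemps.cu - print temperature and load of every GPU every few seconds (untested sketch)
        // build (Linux): nvcc gputemps.cu -lnvidia-ml -o gputemps
        #include <stdio.h>
        #include <nvml.h>
        #ifdef _WIN32
        #  include <windows.h>
        #  define SLEEP_SEC(s) Sleep((s) * 1000)
        #else
        #  include <unistd.h>
        #  define SLEEP_SEC(s) sleep(s)
        #endif

        int main(void)
        {
            if (nvmlInit() != NVML_SUCCESS) { fprintf(stderr, "NVML init failed\n"); return 1; }

            unsigned int count = 0;
            nvmlDeviceGetCount(&count);

            for (;;) {                                   // Ctrl+C to stop
                for (unsigned int i = 0; i < count; ++i) {
                    nvmlDevice_t dev;
                    char name[96] = "?";
                    unsigned int temp = 0;
                    nvmlUtilization_t util = {0, 0};

                    nvmlDeviceGetHandleByIndex(i, &dev);
                    nvmlDeviceGetName(dev, name, (unsigned int)sizeof(name));
                    nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
                    nvmlDeviceGetUtilizationRates(dev, &util);

                    printf("GPU %u %-20s %3u C  core %3u%%  mem %3u%%\n",
                           i, name, temp, util.gpu, util.memory);
                }
                printf("\n");
                SLEEP_SEC(5);                            // log interval in seconds
            }
        }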

  • nDelphi Posts: 1,877

    Because I am not using it at the moment and wanted to make sure there are no issues with heat. I am not going to install the beta of DAZ Studio, so it will be some time before the card gets used.

  • nicstt Posts: 11,715

    Fair enough. :)

    But you're going to have to install something, and testing with HWMonitor and the DAZ Studio beta gives you real-world data. It will be what you're actually using the card for, so it's relevant to you rather than a collection of generic tests, and you can uninstall the beta when you're done.

    Otherwise, you already have some ideas.

  • linvanchene Posts: 1,382
    edited November 2016

    nDelphi said: "Because I am not using it at the moment and wanted to make sure there are no issues with heat."

    Maybe you can get some idea of how your system performs by running OctaneBench while keeping an eye on the temperature in GPU-Z?

    It may not run very long, but at least you get a temperature estimate and a system check with all available GPUs.

    - - -

    OctaneBench is the official tool from Otoy to test how all your GPUs perform.

    Officially, the OctaneBench application has not yet been updated to test Pascal cards.

    Unofficially, some savvy forum users moved some files around, and now you can also test Pascal cards...

    The average performance of a "standard" GTX 980 equals a score of 100.

    If your 1080 scores around 130, it means it's around 30% faster than a GTX 980.

    - - -

    Quick guide:

    - First, read how you would normally install and run the OctaneBench test here:

    https://render.otoy.com/octanebench/

    - Then perform the same steps with the unofficial version:

    https://render.otoy.com/forum/viewtopic.php?f=9&t=56108

    - Download the .rar file attached to the first post by unica in the linked thread.

    - Extract the .rar file.

    - Double-click the script "_run_benchmark.bat" in Windows Explorer. There is no need to start the included OctaneRender test version...

    - Select the installed GPU you want to test.

    -> A series of test scenes will now render.

    After the test scenes have finished rendering, you get your score.

    - - -

    Alternatively:

    Wait a few days, weeks or months until the official update for OctaneBench is released.

    - - -

     

    [Attached screenshots: OctaneBench pascal test v1001.jpg (1360 x 768), OctaneBench score 360.jpg (1360 x 768)]
  • SixDs Posts: 2,384

    Can you not simply disable your primary card temporarily in Control Panel so that the system defaults to the new card? That is easier and quicker than physically removing anything, and when the testing is done you can re-enable the primary card.

  • nDelphi Posts: 1,877

    Thanks all. I will look into these suggestions.

  • bad4u Posts: 684
    edited November 2016

    No need to actually disable the 960; just set the 1080 as the primary display temporarily and plug your monitor into it. As long as you don't hit the 8GB limit (minus what the Windows desktop uses) when rendering, you can even leave it as primary, and the 960 will automatically be added for rendering as long as the scene also fits into its 4GB limit. Vice versa, the 960 will probably hit its 4GB limit far sooner if it is also running the Windows desktop, and then its additional CUDA cores won't be used any more. If your scenes often come close to the 8GB limit, it would be better to keep the 1080 free from running the Windows desktop, but I guess you reach the 4GB limit more often than the 8GB one (once Pascal support is added to the release version; until then, using the 1080 for the desktop frees some extra memory on the 960 for rendering).
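
    If you want to see how much memory the Windows desktop actually takes off whichever card is driving it, a few lines of CUDA will print free vs. total VRAM per device. A rough, untested sketch, assuming the CUDA toolkit is installed (the query itself creates a small context on each card, which also costs a little memory):

        // vram.cu - show free vs. total memory on every CUDA device (untested sketch)
        // build: nvcc vram.cu -o vram
        #include <cstdio>
        #include <cuda_runtime.h>

        int main(void)
        {
            int count = 0;
            cudaGetDeviceCount(&count);

            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                size_t freeB = 0, totalB = 0;

                cudaGetDeviceProperties(&prop, i);
                cudaSetDevice(i);                  // the queries below apply to the selected device
                cudaMemGetInfo(&freeB, &totalB);   // free/total bytes after driver + desktop overhead

                printf("GPU %d %-20s  %6.0f MB free of %6.0f MB\n",
                       i, prop.name, freeB / 1048576.0, totalB / 1048576.0);
            }
            return 0;
        }

    Run it once with the monitor on the 960 and once with it on the 1080, and you can see how much headroom the desktop costs each card.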
