GTX 1660 Super enough for now?

Wicked One Posts: 220
edited April 2020 in The Commons

Hello all! I'm looking to buy a new PC soon to upgrade from an older PC that runs a GTX 760. My budget fits a prebuilt with an RTX 2060 Super, however after hearing about the new 30xx cards said to be coming out later this year, I'm a bit unsure whether I should just go with something that has a GTX 1660 Super/Ti for the time being, save the price difference, and then wait for the new cards to release. If I went that route, I could either use the 1660 Super as a secondary card to browse on while the other renders, or toss it in my current PC and use it for gaming when needed (which isn't often these days). However, I know that Iray support likely won't exist for those cards for some time after they release. In reality, I'd just like to get something that will last me as long as possible as far as Iray support goes. For reference, Iray will soon no longer be supported on my current GPU, which is 7 years old at the time of writing, so if I can get just 3-5 years out of whatever I upgrade to, I'd be happy, be it RTX or not. Any input would be greatly appreciated.

Post edited by Wicked One on

Comments

  • Richard Haseltine Posts: 104,204

    I have a 1660 Super in this machine, though as a display card rather than for Iray (I have a 2080 Ti for that) - it seems to be about a quarter of the 2080 Ti's speed. 6GB isn't an enormous amount of memory, but I was able to use it with some test scenes (and there's always texture resizing). The latest DS Public Build (Beta) has an updated Iray which has some memory management options, though I can't say how much difference they will make.

    Do bear in mind that a) the current lock-downs have apparently thrown release dates off and b) with the last two new generations of cards (10x0 and 20x0) there has been a wait for Iray to support them at all. There's no knowing what will happen this time, especially if a delayed release gives the developers more time to incorporate support in Iray, but it would not surprise me if we did not see Iray working on 30x0 systems before sometime next year.

  • marble Posts: 7,500

    I doubt that development comes to a grinding halt during the lockdown. My son is a software developer and is working full tilt from home. He has access to corporate development systems and is hardly hampered at all. Indeed, this CV19 crisis might, at last, usher in that work-from-home corporate mentality that the company I worked for before I retired tried to promote for years. After all, DAZ is still releasing beta versions and working on the next one, presumably.

  • Richard Haseltine Posts: 104,204
    marble said:

    I doubt that development comes to a grinding halt during the lockdown. My son is a software developer and is working full tilt from home. He has access to corporate development systems and is hardly hampered at all. Indeed, this CV19 crisis might, at last, usher in that work-from-home corporate mentality that the company I worked for before I retired tried to promote for years. After all, DAZ is still releasing beta versions and working on the next one, presumably.

    True, but that assumes that the Iray team works on support before the cards start to ship - it's possible that they don't as their testers would not have access to the hardware, or that they don't want to risk a premature leak of information. Whatever the reason, it has taken time for Iray to support (i.e. work on) new hardware after the last two new GPU series so it is sensible to expect the same this time (and perhaps to then be pleasantly surprised).

  • marble Posts: 7,500
    edited April 2020
    marble said:

    I doubt that development comes to a grinding halt during the lockdown. My son is a software developer and is working full tilt from home. He has access to corporate development systems and is hardly hampered at all. Indeed, this CV19 crisis might, at last, usher in that work-from-home corporate mentality that the company I worked for before I retired tried to promote for years. After all, DAZ is still releasing beta versions and working on the next one, presumably.

    True, but that assumes that the Iray team works on support before the cards start to ship - it's possible that they don't as their testers would not have access to the hardware, or that they don't want to risk a premature leak of information. Whatever the reason, it has taken time for Iray to support (i.e. work on) new hardware after the last two new GPU series so it is sensible to expect the same this time (and perhaps to then be pleasantly surprised).

    Also true but, again, the delay might just give them the opportunity to get it right by launch date instead of having the public play beta testers.

    It's also giving me more time to save for that 30xx GPU while I continue to wish for a decent amount of VRAM on the 3x70.

    Post edited by marble on
  • outrider42 Posts: 3,679

    Actually, the 2000 series cards did work with Daz at launch, at least the beta did. People were celebrating in the forums. Rob benched his new 2080. It was only the hardware ray tracing cores that did not work. Now support for that did take a while, but the cards still worked out of the box and were quite fast with pure CUDA. Then when Iray RTX launched the 2000 series got quite a boost with RT cores fully supported.

    One of the key features of the new Iray RTX was that the new OptiX 6.0 does not need to be recompiled for every new hardware release. That means it *should* work for future hardware launches. There is always a chance that a new hardware feature may not work, like the RT cores, and they may not have optimized drivers for Iray, but I believe that the cards will work otherwise. I think we are finally past this waiting game for Iray support every new generation.

    I think right now is really tough, because we just do not know what is going to happen. All rumors pointed to a mid to late 2020 launch. There have been no new rumors about Nvidia for a while, at least nothing that says there will be a delay. AMD has stated that they are still on track for all their hardware launches, though they have not given additional info like exactly what those launches are. Nvidia uses a number of similar sources for their chips, so this could point to them being on track as well. The real question is IF Nvidia still wants to release in 2020; even if they can, they may not want to. They may be afraid that people will be reluctant to buy expensive hardware in 2020. However, if AMD launches, then Nvidia will not sit around and wait if they have a chance to counter, especially if AMD has this monster "Nvidia Killer" that rumors are freaking out over, which is claimed to be faster than the 2080 Ti. If that is true, then Nvidia would be absolutely pushing to top that ASAP; they would never want AMD to hold the performance crown for very long, if at all.

    The 1660 Super is certainly a big upgrade from a 760. You are used to dealing with that old card, so this would probably make you quite happy even if it is somewhat VRAM limited...you are used to only 2GB. So 6GB would feel great. Besides, the vast majority of high end cards only have 8GB, just 2 more than that anyway. If I bought a 1660 Super, I would not just have it drive the monitor. I would have it rendering alongside whatever else I had, as long as I knew the scene would fit that 6GB VRAM. If I am making a large scene, then I would uncheck it. If the scene is under 6GB, then hell yeah, use it! CUDA stacks pretty well. You would be very surprised at how much faster 2 cards can be. A 1660 Super plus a potential 3060 would be pretty respectable - just a pure guess, but that would certainly nudge render speeds up a class or even two, as if you had a 3070 or 3080...with 6GB. I can remember that two 1070s together were about the same speed as a 1080 Ti.

    I can understand the desire to upgrade. BTW, the benchmark thread is in my signature, so you can compare cards there. You can also test yours on it, and get a very good idea of just how much faster a 1660 Super or 2060 Super would be versus your 760.

  • marble Posts: 7,500


     I would have it rendering along side whatever else I had, as long as I knew it would fit that 6GB VRAM. If I am making a large scene, then I would uncheck it. If the scene is under 6GB, then hell yeah, use it! CUDA stacks pretty well. 

     Trouble is, I'm not aware of any reliable way to determine the VRAM requirements of a scene. If it renders then there's enough and if it dumps to CPU there isn't. For someone who already owns a certain GPU there are utilities like GPU-Z but if I am a prospective GPU buyer and I have a typical scene, I'm not sure how to determine whether 6GB (or 11GB) would be enough.
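    For anyone who does already have an NVIDIA card installed, one rough approach is to poll nvidia-smi while a render runs and record the peak VRAM use; that gives a ballpark for what a "typical scene" demands, which you can then compare against a prospective card's 6GB or 11GB. A minimal Python sketch (the helper names here are made up for illustration; it assumes nvidia-smi is on the PATH, which it is on any machine with the NVIDIA driver installed, and it only approximates the scene's footprint since the driver and display also hold some VRAM):

    ```python
    import subprocess
    import time

    def parse_used_mib(csv_line):
        """Parse one line of nvidia-smi's '--format=csv,noheader' output,
        e.g. '4321 MiB' -> 4321."""
        return int(csv_line.strip().split()[0])

    def peak_vram(seconds=60, interval=1.0):
        """Sample GPU 0's used VRAM for `seconds` seconds, return the peak in MiB.
        Start this just before kicking off a render in DAZ Studio."""
        peak = 0
        for _ in range(int(seconds / interval)):
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=memory.used",
                 "--format=csv,noheader", "-i", "0"],
                capture_output=True, text=True, check=True).stdout
            peak = max(peak, parse_used_mib(out))
            time.sleep(interval)
        return peak
    ```

    It's still only a lower bound on what a bigger card would let you load (a scene that dumps to CPU never shows its full size), but for scenes that do fit, the peak number tells you how much headroom you actually had.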

  • fred9803 Posts: 1,564

    I used a GTX 960 for a long time before I got an RTX 2080. No problems, except it took much longer to render in those days, and my scenes needed to fit the VRAM.

    You're asking a question that nobody can really answer - do I upgrade now or wait... the question every gamer and renderer asks themselves. There's no perfect time to do it, so it's up to you. Note that a 30xx card will cost you at least 6 times as much as your GTX 760. If you've got the money for that, then personally I'd wait a bit until the new ones come out.

  • Wicked One Posts: 220

    Thanks for all the replies! I was actually pricing out components to see what I could squeeze into my budget, and lo and behold it would appear that RTX 2070s are actually going for less than RTX 2060 Supers where I am(!). However, buying a prebuilt with one definitely shoots past my budget, so I'll likely stick with the original plan of buying a 1660 Super/Ti prebuilt, and then swap in/add on an RTX 2070. If the previous generation is anything to go by, I'll feel more comfortable buying a xx70 variant than a xx60 with new cards so (relatively) close to release. Thanks again!

  • Wicked One Posts: 220
    edited May 2020
    snip

    I actually went and upgraded my current PC. While my CPU is on the older side, I slapped a new cooler on it as well. Now all I need to do is upgrade my RAM and I should be good to go. Well, I kind of already am, since I haven't had any issues thus far with only 8GB of RAM, but since it's not too costly these days, why not? With that said, would 16GB suffice? Keep in mind, I already optimize my scenes out of habit from operating on much lower specs.
    Edit: Forgot to mention, I went with an RTX 2060. It's only the 6GB variant, but jeez does it fly through renders like butter!

    Post edited by Wicked One on
  • outrider42 Posts: 3,679

    Nice to hear you are happy. That's what matters. The 2060 is a solid upgrade since it has some RT cores.

    I think 16GB RAM would be totally fine. I don't think you would be able to push that when the GPU has just 6GB. And if you haven't had any issues with 8GB of RAM, then you may not even need that.

  • kenshaw011267 Posts: 3,805
    snip

    I actually went and upgraded my current PC. While my CPU is on the older side, I slapped a new cooler on it as well. Now all I need to do is upgrade my RAM and I should be good to go. Well, I kind of already am, since I haven't had any issues thus far with only 8GB of RAM, but since it's not too costly these days, why not? With that said, would 16GB suffice? Keep in mind, I already optimize my scenes out of habit from operating on much lower specs.
    Edit: Forgot to mention, I went with an RTX 2060. It's only the 6GB variant, but jeez does it fly through renders like butter!

    16GB of RAM is a good baseline for any modern computer. If the CPU dates from the same era as a 760, it is likely DDR3, which means you'll have to buy more on the used market.

  • Wicked One Posts: 220
    snip

    I actually went and upgraded my current PC. While my CPU is on the older side, I slapped a new cooler on it as well. Now all I need to do is upgrade my RAM and I should be good to go. Well, I kind of already am, since I haven't had any issues thus far with only 8GB of RAM, but since it's not too costly these days, why not? With that said, would 16GB suffice? Keep in mind, I already optimize my scenes out of habit from operating on much lower specs.
    Edit: Forgot to mention, I went with an RTX 2060. It's only the 6GB variant, but jeez does it fly through renders like butter!

    16GB of RAM is a good baseline for any modern computer. If the CPU dates from the same era as a 760, it is likely DDR3, which means you'll have to buy more on the used market.

    Not necessarily; there are still a few shops around here that have plenty collecting dust. Mind you, this is all so I can save for later this year, when all the new fancy stuff comes out. lol
