Upgrading from a GTX 1070 to an RTX 4000 series card, thoughts?

So I currently have a GTX 1070 and have been wanting to upgrade for a long time now. With the cost of the RTX 3000 series over the last 2 years, I just held off. Now they are dropping in price, but with the 4000 series on the horizon, I've decided to hold out a little longer. My question is, how much of a render time reduction do you think I would get from a 4090 or 4080, going to one of those from a 1070? Right now for a good render, I'm looking at about an hour each, which kind of sucks. Thoughts?

Comments

  • rrward Posts: 556
    edited July 2022

    Over the years I've upgraded from a 980ti to a 1080ti, to three 1080ti's, to two 2080ti's, to a single RTX A5000 (the Quadro version of the 3080). And boy howdy did render times go down. I average five to ten minutes for a render. Part of that is the pure horsepower of the card, and part of that is that the RTX cards allow use of the denoiser, which greatly reduces the iteration count needed to create a good image. I can't give you exact time differences, but I can safely say that you won't be catching up on Seinfeld reruns between renders any longer.

  • Matt_Castle Posts: 2,729

    We don't know much about the specifications of the 40 series cards yet, so this will definitely be a guess.

    However, what we can say from the benchmarks we do have for the current 3080/3090 level cards is that they're probably *broadly* 6x faster than your 1070, and Iray performance has very roughly doubled with each generation, so it's plausible you'd see about a 10-15x increase in render speed, depending on exact specifications (plus probably either 16 or 24 GB of VRAM). But take that with a large pinch of salt, and expect to have to upgrade your power supply at the same time, as rumours are they'll have a large power draw.
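
    To put that 10-15x guess in terms of the roughly one-hour renders mentioned in the opening post, here's a minimal back-of-the-envelope sketch in Python; the 6x and 2x factors are just the rough estimates above, not benchmarks:

    ```python
    # Back-of-the-envelope render-time estimate using the guessed factors above.
    # None of these numbers are measurements; they are the hedged estimates from this post.

    current_render_minutes = 60   # roughly an hour per good render on the GTX 1070
    ampere_vs_1070 = 6.0          # 3080/3090-class cards guessed at "broadly 6x" a 1070 in Iray
    lovelace_vs_ampere = 2.0      # assume the rough historical doubling holds for the 40 series

    estimated_speedup = ampere_vs_1070 * lovelace_vs_ampere   # ~12x, inside the 10-15x ballpark
    estimated_minutes = current_render_minutes / estimated_speedup

    print(f"Estimated speedup over the 1070: ~{estimated_speedup:.0f}x")
    print(f"Estimated render time: ~{estimated_minutes:.0f} minutes instead of {current_render_minutes}")
    ```

    If the 10-15x range holds up, that hour-long render lands somewhere around 4 to 6 minutes.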

  • Gordig Posts: 10,332

    Matt_Castle said:

    We don't know much about the specifications of the 40 series cards yet, so this will definitely be a guess.

    Very much this. There's a lot of speculation about what the next line of RTX cards will offer, but practically nothing in the way of facts. The Ampere cards are a known quantity, and they're getting cheaper by the day, so if you upgrade to one of those, you can be relatively confident of what you're getting. I'm not saying you shouldn't wait for the Lovelace cards to be released, but we won't even be able to compare potential performance increases until we have some concrete details about them. 

  • outrider42 Posts: 3,679
    edited July 2022

    The Iray benchmark thread is in my signature. We have all the cards from the 3000 series tested. You can download the scene yourself and run it, which will give you a fairly direct comparison. The performance gap will not always be the same; it can depend on how you build your scenes, but the benchmark still serves as a general measure of performance and is very helpful. I think this will really open your eyes, because the new cards are so much faster than a 1070, and of course the 4000 series will be even faster.

    It is true there is a lot of speculation over what the 4000 series will be like. But this stuff is not being made up out of thin air. There are whole networks of leakers that report details during the development of any new GPU product. While some things will certainly be wrong, one thing the leaks hold in common is that the 4000 series will be a big leap. The 3000 series was also a big leap for Iray users (less so for gamers). The important thing to remember is what leaks actually are: Nvidia tests a variety of GPU configurations that it never releases. As we get closer to release, the leaks get better as the products are dialed in for final production. This is why some leaks do not pan out; it doesn't mean the information was bad, it could just mean that particular item was changed before release. Even now, just months before release, some key specs can change, like clock speeds and power draw. And of course price: higher power draw means a bigger cooler and board design, which means a higher price.

    Plus we also have history. Not only the history of what Nvidia has released, but also the leaks before many launches that accurately predicted the specs. The rumored specs of the 3000 cards were almost 100% correct months in advance of their announcement. So were the specs of the 2000 and 1000 series. It seems it is really hard to keep secrets in this industry!

    Most leakers suggest the top 4000 series card will be about double the 3090's performance, but ray tracing might be even faster than that, possibly 2.4 times the performance of the 3090. And ray tracing performance is especially key for Iray; that is why each generation has seen such big gains.

    The 2080ti was only about 35% faster than the 1080ti for gaming, but for Iray it was almost TWICE as fast. 

    The 3090 is about 45% faster than the 2080ti at 4K gaming, but again it hit about TWICE the performance at Iray.

    So for several generations in a row, Nvidia has delivered basically twice the Iray performance of the previous top GPU (there's a rough compounding sketch at the end of this post). I expect to see this size of gap again, maybe even more. They are talking about the 4070 equaling the 3090. Nvidia has done this before, as the 1070 you currently have is equal to the Titan Maxwell that came before it. But it is important to note that they are talking about gaming performance, while Iray performance has historically done better. So the 4070 will probably beat the 3090 easily when it comes to Iray. That will be very nice indeed.

    The downside is that they are also talking about delays. Because there is now a massive GPU surplus, Nvidia is trying to sell its current inventory before releasing the next generation. They might release the top cards, like the 4090 and 4080, later this year. It is possible they delay the 4070 a bit longer, and the 4060 might see an even longer delay. So waiting for the mid-range cards might mean waiting quite a bit longer, unless you want the higher end models.

    The good news is that prices of the 3000 series should keep falling because of the stock levels. This is only the beginning; I believe we will see a pretty big price crash in the next month or two, and when the 4000 series finally does get announced, prices will free fall. That could make it tempting. Since you have a 1070, I can tell you that any 3000 series card is an upgrade. Even a 3050 would be a decent boost; that is how far things have come. But 3050s are still inflated at this time. However, as I said, prices should be coming down fast.

    I think the 3090 in particular will be at a shockingly low price when the 4090 gets announced. Considering it has 24GB, and the 4070 will only have maybe 10GB, IMO that makes the 3090 far more attractive. I say this because I believe the 3090 will eventually drop to almost the price the 4070 may be selling for. To be fair, the 4080 is supposedly going to pack 16GB, which is a decent bump. So a 4080 could be a great option, while being a good bit faster than the 3090.
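
    For a feel of how those roughly 2x-per-generation Iray jumps compound, here's the rough sketch mentioned above, using only the ratios quoted in this post; the 3090-to-4090 figure is the rumored one, not a measurement:

    ```python
    # Compound the per-generation Iray ratios quoted in this post.
    # The first two are the "about TWICE as fast" observations; the last is a rumor.

    iray_gen_ratio = {
        "1080 Ti -> 2080 Ti": 2.0,
        "2080 Ti -> 3090": 2.0,
        "3090 -> 4090 (rumored)": 2.0,
    }

    cumulative = 1.0
    for step, ratio in iray_gen_ratio.items():
        cumulative *= ratio
        print(f"{step}: x{ratio:.1f}  (cumulative vs 1080 Ti: x{cumulative:.1f})")
    ```

    That would put a 4090 plausibly around 8x a 1080 Ti for Iray, and further still past a 1070, which is why the earlier 10-15x guesses are not unreasonable.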

  • kwerkx Posts: 105

    outrider42 said:

    I think the 3090 in particular will be at a shockingly low price when the 4090 gets announced. Considering it has 24GB, and the 4070 will only have maybe 10GB, IMO that makes the 3090 far more attractive. I say this because I believe the 3090 will eventually drop to almost the price the 4070 may be selling for. To be fair, the 4080 is supposedly going to pack 16GB, which is a decent bump. So a 4080 could be a great option, while being a good bit faster than the 3090.

    This.

    My perspective focuses on VRAM. If the scene doesn't fit in the card's VRAM, it renders on the CPU instead. I understand and respect the benchmarks for these cards, but GPU performance is only applicable if the scene fits in the GPU's VRAM. So, I would focus on cards with more VRAM even if it sacrifices performance.

    FWIW: Going from a 1050ti (..I think) laptop GPU to a desktop w/ a 3080, my renders varied between a couple of minutes and my 2 hour max setting (which was very disappointing). I learned (eventually) that I was maxing out my VRAM... mostly because I'm dense (for example, my definition of 4K was way off), but also because I convinced myself that my scenes were nothing for such a fancy GPU.

    I agree with @outrider42's quote above. For a render machine whose first obstacle is having enough VRAM to allow a GPU render, that 3090 might be the sweet spot between capability and price.
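
    On the 4K texture point above: a rough way to see why "light" scenes can still fill VRAM is to add up uncompressed texture sizes (width x height x channels x bytes per channel). This is only a ballpark; Iray's actual footprint depends on Daz Studio's texture compression settings:

    ```python
    # Rough uncompressed texture footprint. Real VRAM use depends on Iray's
    # texture compression thresholds, so treat these as ballpark upper bounds.

    def texture_megabytes(width, height, channels=4, bytes_per_channel=1):
        return width * height * channels * bytes_per_channel / (1024 ** 2)

    for size in (1024, 2048, 4096, 8192):
        print(f"{size} x {size}: ~{texture_megabytes(size, size):.0f} MB uncompressed")
    ```

    A single character can easily carry a dozen or more 4K maps across diffuse, normal, roughness and the rest, so those "nothing for such a fancy GPU" scenes add up faster than you'd expect.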

  • nicstt Posts: 11,715
    edited July 2022

    I think it is reasonable to assume at least a 20% improvement for the 4000 cards; I hope for a lot more, and can't see it being much less.

    20% seems OK, if the price isn't crazy, AND if the power consumption isn't crazy; I've been using a 1200W PSU for years and am not keen on upgrading it; it works fine!

    I paid £1700 for my 3090 and am happy enough with it; works great in Blender too.

  • outrider42 Posts: 3,679

    kwerkx said:

    I think the 3090 in particular will be at a shockingly low price when the 4090 gets announced. Considering it has 24GB, and the 4070 will only have maybe 10GB, IMO that makes the 3090 far more attractive. I say this because I believe the 3090 will eventually drop to almost the price the 4070 may be selling for. To be fair, the 4080 is supposedly going to pack 16GB, which is a decent bump. So a 4080 could be a great option, while being a good bit faster than the 3090.

    This.

    My perspective focuses on VRAM. If the scene doesn't fit in the card's VRAM, it renders on the CPU instead. I understand and respect the benchmarks for these cards, but GPU performance is only applicable if the scene fits in the GPU's VRAM. So, I would focus on cards with more VRAM even if it sacrifices performance.

    FWIW: Going from a 1050ti (..I think) laptop GPU to a desktop w/ a 3080, my renders varied between a couple of minutes and my 2 hour max setting (which was very disappointing). I learned (eventually) that I was maxing out my VRAM... mostly because I'm dense (for example, my definition of 4K was way off), but also because I convinced myself that my scenes were nothing for such a fancy GPU.

    I agree with @outrider42's quote above. For a render machine whose first obstacle is having enough VRAM to allow a GPU render, that 3090 might be the sweet spot between capability and price.

    VRAM is indeed a key thing to look at, but there is still more to it than that. You can run out of RAM before filling up your VRAM. So it is also possible to own a 3090 with 24GB but not be able to actually use all of that memory. Even 64GB of RAM might not be enough to hold a scene that uses 24GB of VRAM (there's a rough sanity check at the end of this post). At which point, the 16GB cards make more sense, because there is a better balance between VRAM and RAM.

    But of course, RAM can usually be upgraded fairly easily, while VRAM cannot.

    So there are multiple things to consider here, and ultimately every user has to make these choices for themselves. For some people, maybe 8 or 10GB is all they really need because they are rendering light scenes. Plus you can often optimise a scene to make it fit if need be. And while that might suck, if that means saving a few hundred dollars on hardware, that might make sense. If you make money doing this, then it also makes sense to spend some money for the beefier product.

    This will certainly be an interesting thing to see. We have never had a 24GB consumer card before, and we are looking at a generation where an x70 class card could match or even beat that card in performance. So how consumers will value the 3090 going forward is kind of unknown. It is possible that people who make content and want that VRAM could help prop up the value of the 3090, but if gamers are indeed the primary market, they will probably not care so much and the value could drop to where the 4070 prices are.
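
    As a quick sanity check on that RAM/VRAM balance for your own build, here's a small sketch; the 3x host-side multiplier is only an assumption drawn from the 64GB-versus-24GB observation above, not a measured figure:

    ```python
    # Rough check: is there enough system RAM to prepare a scene that fills the GPU's VRAM?
    # The 3x multiplier is an assumption based on the observation that a scene using
    # 24GB of VRAM can exhaust 64GB of system RAM while it is being built and loaded.

    def ram_looks_sufficient(system_ram_gb, vram_gb, host_multiplier=3.0):
        needed = vram_gb * host_multiplier
        return system_ram_gb >= needed, needed

    for ram, vram in [(32, 10), (64, 16), (64, 24), (128, 24)]:
        ok, needed = ram_looks_sufficient(ram, vram)
        verdict = "probably fine" if ok else "may be tight"
        print(f"{ram}GB RAM with a {vram}GB card: want ~{needed:.0f}GB of RAM -> {verdict}")
    ```

    By that rough rule, a 24GB card wants well over 64GB of RAM behind it, while a 16GB card sits comfortably with 64GB.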

  • There is an interesting article about some leaked 4090 performance at Tom's Hardware.

    The leak is more focused on gaming performance, and outrider42 has already made the case that, historically, gaming gains lag behind rendering gains... so if the gaming gains are big, the rendering/Iray gains should be significant.


    In a nutshell...

    Nvidia's upcoming GeForce RTX 4090 cracked the 15,000 point barrier in 3DMark Time Spy Extreme by a long shot, with a record-breaking graphics score of 19,000 points in the famous benchmark. Furthermore, if Kopite's news is accurate, it puts the RTX 4090's performance barrier well ahead of anything available today, including RTX 3090 Ti GPUs chilled on liquid nitrogen.

    For some perspective, the current reigning champion of the 3DMark Time Spy Extreme benchmark is user "biso biso," with an LN2-cooled EVGA RTX 3090 Ti Kingpin Edition graphics card punching out a world-record graphics result of 14,611 points.

    And a bit further in the article

    Current rumors speculate that the top die for the 40 series, AD102, will pack 71% more CUDA cores and SMs than Ampere's top die GA102. In addition, it will reportedly feature similar or higher clock speeds compared to RTX 30 series, thanks to a more efficient TSMC 5nm process. We don't expect the rest of Nvidia's 40 series dies to pack the same core count increase, but they all are expected to pack a lot more cores if 71% is the ceiling for the 40 series.

     

    So... if the price can be "reasonable", this looks to be an outstanding GPU for Rendering.... 
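
    Just to put a number on the gap in those quoted scores (pure arithmetic on the figures above, nothing more):

    ```python
    # Ratio of the leaked RTX 4090 Time Spy Extreme graphics score to the
    # LN2-cooled RTX 3090 Ti world record quoted in the article excerpt.

    leaked_4090_score = 19000
    ln2_3090ti_record = 14611

    uplift = leaked_4090_score / ln2_3090ti_record - 1
    print(f"Leaked 4090 score is ~{uplift:.0%} above the LN2-cooled 3090 Ti record")
    ```

    And that roughly 30% is measured against an exotically overclocked card; against a stock 3090 the gap would be larger, before any Iray-specific ray tracing gains come into it.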

  • kyoto kid Posts: 41,385

    ..yes, but will it run Crysis at 8K?

  • nicstt Posts: 11,715

    kyoto kid said:

    ..yes, but will it run Crysis at 8K?

    Will it run Crysis at 1080p ? :D

  • marble Posts: 7,500

    outrider42 said:

    ...

    I think the 3090 in particular will be at a shockingly low price when the 4090 gets announced. Considering it has 24GB, and the 4070 will only have maybe 10GB, IMO that makes the 3090 far more attractive. I say this because I believe the 3090 will eventually drop to almost the price the 4070 may be selling for. To be fair, the 4080 is supposedly going to pack 16GB, which is a decent bump. So a 4080 could be a great option, while being a good bit faster than the 3090.

     

    If the 3090 were to be significantly cheaper I would think about adding a second one, but I fear I would have to upgrade other things to accommodate it. I have an 850W PSU at the moment and I very much doubt that would suffice. I read that the 4000 series will draw a lot of power too, so a bigger PSU might be required either way. Not sure my midi tower would fit two 3090s either.
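
    For a rough idea of why an 850W unit would be tight with two 3090s, here's a quick tally; the 350W per card is the stock 3090 board power, while the CPU and rest-of-system figures are assumptions, so adjust for your own parts:

    ```python
    # Rough PSU sizing for a dual-3090 build. The GPU figure is the stock 350W
    # board power; the other numbers are assumptions, not measurements.

    gpu_watts = 350 * 2          # two RTX 3090s at stock power limits
    cpu_watts = 150              # assumed high-end desktop CPU under load
    rest_of_system_watts = 100   # assumed drives, fans, RAM, motherboard, peripherals
    headroom_factor = 1.2        # common rule of thumb: keep ~20% spare capacity

    total = gpu_watts + cpu_watts + rest_of_system_watts
    print(f"Estimated sustained load: ~{total}W, suggested PSU: ~{total * headroom_factor:.0f}W")
    ```

    By that math you'd be looking at a 1000W-plus unit for two 3090s, and the rumored 40 series power draw only pushes that further.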

  • jmtbank Posts: 176
    edited July 2022

    cluod1 said:

    So I currently have a GTX 1070 and have been wanting to upgrade for a long time now.... My question is, how much of a render time reduction do you think I would get from a 4090 or 4080, going to one of those from a 1070? Thoughts?

     

    My 1080ti was nearly twice the speed of my old 1070. My briefly owned 3060 12gb was about the same again. I'm currently on a 3090, and its speed varies depending on whether I'm running it at 250W or the full 350W. But it's maybe 2.5 times the 3060.

     

    The core count in leaks/guesses for the upcoming 4080 puts it fractionally higher than a 3090. They reckon it may run at nearly 50% higher clock speeds. Possible rendering IPC improvements should really make it at least 50% faster than a 3090 at the same power usage.

     

    So, all added together, a 4080 vs a 1070? Erm, 2 x 2 x 2.5 x 1.5 = a bit less than 15 times faster?

     

    I think I'd rather have the 24gb from a 2nd hand 3090 than 50% faster rendering on a 4080.

  • I agree with you, jmtbank, on the choice of more VRAM over faster rendering.... that is why I was looking for a 3060 (at 12gb) to hold me over until I could reasonably get a 40x0 card.

    Rumors have it that the 4080 MAY have 16gb of VRAM... with the floor on the 40x0 models being 10gb.

    We shall see. My guess is that by mid-to-late August we should have an excellent picture of what the Lovelace lineup will be capable of, and at what cost.
