OT: New Nvidia Cards to come i…
...and if you are someone in our little pond who creates large, involved, high-quality scenes, the lack of a card with a reasonable amount of memory (16 GB and up) is highly problematic. As I mentioned, for the cost, why couldn't they add even 1 GB to the 2080 Ti, if not 5, and give the standard 2080 10 GB? I saw the Titan V as pretty much a dead end, particularly in light of the forthcoming (now present) developments, as it didn't get an increase in memory and doesn't have NVLink capability to warrant a $1,800 price increase over its already overpriced predecessor, the Titan Xp (the then-$700 1080 Ti performed just as well as the Xp in several categories, if not better).
If the new hardware acceleration is not implemented for Iray, then yes, it will not only raise a few hackles but also make people further question the price of the consumer-grade RTX series.
It could also be a matter of concern for Daz3D, which has pretty much hitched its future to Iray. As Outrider mentioned, what if Iray stops being supported? What of all that content, in the store and in development, which uses Iray-only materials?
True, there is Octane, but like LuxRender, it means artists will have to convert all materials for every scene they create (I don't see Daz releasing Octane material kits anytime soon). With the implementation of Vulkan, Octane will also be supported on both Nvidia and AMD GPUs. I don't see that happening with Iray, which is strictly CUDA-based, as Nvidia would be shooting themselves in the foot if they did so.
Move to ProRender? Well, that would stir things up a bit here, as so many have already purchased or upgraded their Nvidia GPUs because of Iray. As I don't do games, I never had the need for a powerful, high-memory GPU until Iray was introduced and then effectively became Daz's primary render engine, with 3DL taking a back seat.
Maybe finally updating 3DL? Wowie's aweShading Kit for 3DL hit the store two days ago, and 3DL development has some interesting things in the works.
Hard to say where this will all go; however, I do know one thing: an RTX card is definitely not in my future (unless I get a windfall and can afford a couple of RTX 5000s with NVLink).
Agreed, including the stuff about Daz and PA products that I was thinking about but did not mention. As it is right now, I think the unattractive pricing on RTX cards is meant to sell off the old GTX 10-series stock that didn't sell during the mining craze. For us non-gamers, well, for those of us whose computers aren't primarily for gaming, it is even worse due to the lack of a memory increase. I do see Iray having a good use, just not for complex large scenes. 3DL has that little bit going for it: it is kind of easy to add memory DIMMs to a motherboard, not so easy to add memory to a graphics card, lol.
I was not totally sold on Octane when I looked at it years ago, because it required post-processing to fix the speckles, sprites, whatever those things are; at least Iray is a tad better with that. I honestly don't even know where to begin with what will happen in the future of 3D rendering engines; it's a tad beyond what I have time to follow now as it is, lol. There is so much potential for cool stuff, and some scary possibilities as well, for PAs and artists alike.
As for games, the RTX cards are already raising voices. You are essentially paying a tier up for an equivalent card that will deliver lower FPS with RT on. It's Hairworks 2.0 in a nutshell, and most gamers want consistently higher FPS, not pretty shadows and reflections at a lower FPS, lol. I think Andrew and Steve at GN put it best: the people that are most excited about RT are the people Nvidia is not trying to sell the cards to, lol. Let's just hope that whatever Daz3D does, it works with our old clothes, lol.
...yeah, waiting until I get my benefit cheque next month to pick up Wowie's aweShading Kit. I watched much of the development over on the 3Delight Lab thread and can't wait to get it.
With 24 GB of memory in my primary render system and 32 GB in my secondary system (I need to figure out how to attempt network rendering on 3DL between the two), that would give me a total of 56 GB of memory (well, 54 after Windows) to throw at 3DL as well as Carrara render jobs. If I could figure out how to bring the CPUs into play, I'd have 20 total CPU threads for 3DL (Carrara already allows it).
My experiments with IBL Master were very fruitful, and I felt I came very close to a sense of realism (I had to fake the bounce lighting, though, but the test scene still rendered in far less time than with UE).
...well, Octane 4 will have a much more affordable subscription track that will get quicker updates. That has me interested.
...and it's not just GPUs; system memory and other components will be affected by the tariffs as well.
If I were Daz, I would start looking toward some kind of Octane integration into Studio after the Octane 4 launch, unless of course they have some kind of inside knowledge regarding Iray's future that we are not aware of!
OK, I have something I do not understand. I'm trying to stress-test the memory on my new dual 2080 Tis with NVLink.
So I loaded the largest scene I could think of, Airport Island with all the props, and Airport Island City Centre with all the props.
https://www.daz3d.com/airport-island-airport
https://www.daz3d.com/airport-island-city-center
I first loaded everything, worked fine in texture mode.
Then I decided to switch to Iray mode.
This is what happened:
According to GPU Tweak II, in Texture mode I used 56% of the memory.
According to GPU Tweak II, in Iray mode I used 151% of memory.
What does this mean?
Is GPU Tweak an unreliable reader of memory, bugged?
Or did the cards decide to stack the memory?
I'm tempted to say no, given that the cards have 11 GB each and 6 GB is quite shy of that.
It may be reporting 6GB on each card as a total of 151% of one card's worth of memory, an instrumentation error.
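A quick back-of-the-envelope check of that theory in Python. This assumes GPU Tweak II is summing usage across both cards but reporting it against a single card's 11 GB; that behaviour is a guess, not something confirmed by the tool's documentation:

```python
# Rough sanity check of the "instrumentation error" theory (assumed behaviour
# of GPU Tweak II, not confirmed): if the utility sums usage across both cards
# but divides by one card's capacity, a reading over 100% does not imply that
# the memory is stacking.

CARD_VRAM_GB = 11.0          # per RTX 2080 Ti
TEXTURE_MODE_PCT = 56        # reading in Texture mode
IRAY_MODE_PCT = 151          # reading in Iray preview mode

for label, pct in [("Texture mode", TEXTURE_MODE_PCT), ("Iray mode", IRAY_MODE_PCT)]:
    total_gb = CARD_VRAM_GB * pct / 100   # usage if the % is measured against one card
    per_card_gb = total_gb / 2            # split over two cards (no pooling)
    print(f"{label}: {total_gb:.1f} GB reported total, ~{per_card_gb:.1f} GB per card")

# Iray mode: ~16.6 GB "total" works out to only ~8.3 GB per card, which fits on
# an 11 GB card without any memory stacking at all.
```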
Right! I'm doing another test with an even larger scene, using Windows Task Manager instead.
And you seem to be right.
As I said, I used Task Manager instead and decided to render the scene, so I attached the log for this as well.
Well, I can't seem to upload the log, but it says it rendered with the CPU.
Which isn't surprising, as I was viewing in Iray when I started to render.
Here's the log for the above render:
2018-09-30 10:11:40.069 Iray VERBOSE - module:category(IRAY:RENDER): 1.5 IRAY rend progr: CUDA device 1 (GeForce RTX 2080 Ti): Processing scene...
2018-09-30 10:11:40.083 Iray VERBOSE - module:category(IRAY:RENDER): 1.8 IRAY rend progr: CUDA device 0 (GeForce RTX 2080 Ti): Processing scene...
2018-09-30 10:11:40.424 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.4 IRAY rend error: Unable to allocate 1218423948 bytes from 622376550 bytes of available device memory
2018-09-30 10:11:40.432 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.9 IRAY rend error: Unable to allocate 1218423948 bytes from 680643788 bytes of available device memory
2018-09-30 10:11:40.440 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.5 IRAY rend error: CUDA device 1 (GeForce RTX 2080 Ti): Scene setup failed
2018-09-30 10:11:40.442 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.5 IRAY rend error: CUDA device 1 (GeForce RTX 2080 Ti): Device failed while rendering
2018-09-30 10:11:40.450 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.8 IRAY rend error: CUDA device 0 (GeForce RTX 2080 Ti): Scene setup failed
2018-09-30 10:11:40.456 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.8 IRAY rend error: CUDA device 0 (GeForce RTX 2080 Ti): Device failed while rendering
2018-09-30 10:11:40.457 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray WARNING - module:category(IRAY:RENDER): 1.8 IRAY rend warn : All available GPUs failed.
2018-09-30 10:11:40.458 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray WARNING - module:category(IRAY:RENDER): 1.8 IRAY rend warn : No devices activated. Enabling CPU fallback.
2018-09-30 10:11:40.464 WARNING: ..\..\..\..\..\src\pluginsource\DzIrayRender\dzneuraymgr.cpp(304): Iray ERROR - module:category(IRAY:RENDER): 1.8 IRAY rend error: All workers failed: aborting render
2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU: using 8 cores for rendering
2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : Rendering with 1 device(s):
2018-09-30 10:11:40.464 Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : CPU
2018-09-30 10:11:40.469 Iray INFO - module:category(IRAY:RENDER): 1.0 IRAY rend info : Rendering...
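Reading the allocation errors in that log: the failed request and the reported free memory are tiny compared with 11 GB, which matches the CPU-fallback behaviour. A small sketch using the byte counts copied from the log above:

```python
# Interpret the allocation failures from the log above. The byte counts are
# taken straight from the "Unable to allocate ... from ..." lines; the rest is
# just unit conversion.

GIB = 1024 ** 3

requested = 1_218_423_948   # bytes Iray tried to allocate
free_dev1 = 622_376_550     # bytes reported free on CUDA device 1
free_dev0 = 680_643_788     # bytes reported free on CUDA device 0

print(f"Requested : {requested / GIB:.2f} GiB")
print(f"Device 1  : {free_dev1 / GIB:.2f} GiB free")
print(f"Device 0  : {free_dev0 / GIB:.2f} GiB free")

# Both cards had well under 1 GiB free at that point, so the ~1.13 GiB request
# failed on each, every GPU worker aborted, and Iray fell back to the CPU.
```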
Based on previous discussions, I think there's a suspicion that the log is reporting the uncompressed texture sizes - but most of the images will be larger than the default compression thresholds and so will take up less RAM on the card than they do as raw data.
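To put rough numbers on that: a raw RGBA texture takes width × height × 4 bytes, while anything above DAZ Studio's compression thresholds gets compressed before upload, so raw figures overstate the real footprint. A rough illustration only; the 512/1024 thresholds are the defaults as I recall them, and the compression ratio is purely a guess, not a measured Iray figure:

```python
# Rough illustration only: raw texture size vs. what might actually land on
# the card once texture compression kicks in. The 512/1024 thresholds are the
# DAZ Studio defaults as I recall them, and the ~4:1 ratio is a guess.

def raw_size_mb(width, height, channels=4, bytes_per_channel=1):
    return width * height * channels * bytes_per_channel / (1024 ** 2)

HIGH_THRESHOLD = 1024   # px; textures above this get the stronger compression
GUESSED_RATIO = 4       # assumed compression ratio, for illustration only

for size in (512, 2048, 4096):
    raw = raw_size_mb(size, size)
    on_card = raw / GUESSED_RATIO if size > HIGH_THRESHOLD else raw
    print(f"{size}x{size}: raw {raw:6.1f} MB, on-card (guess) {on_card:6.1f} MB")

# A 4096x4096 RGBA map is 64 MB raw; a scene full of such maps is why the
# uncompressed totals look so much larger than the VRAM actually consumed.
```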
So THAT's why whenever I search for "iray" on google what I mostly get is posts in the DAZ forums....
I never thought of that, but it looks like you're right. The NVIDIA website for Iray actually lists DAZ Studio as one of the half-dozen or so users. Maybe Iray is becoming the Carrara/Bryce/Hexagon of the NVIDIA lineup
Anyway, at the end of the day, I think most of this stuff is a bit irrelevant in practical terms for most of us. We get whatever software works, buy the best GPU we can afford (realizing nothing's perfect), and realize we can't change any of it (including VRAM stacking, W10 hogging VRAM, RTX/Turing utilization, NVLink functionality, and so on). And I assume most of us aren't going to run out and buy a couple of RTX-2080ti's anytime soon, so I think most of this is moot.
On the other hand, anyone know of some cool Optix-based renderers that can directly import Studio scenes?
Oh, BTW, I read on the wiki that Optix isn't actually a renderer.
This stuff is giving me a headache.
Thanks, that's big (bad) info you've given us. I saw the OptiX Prime DLL in the Iray directory and was wondering if that was different from OptiX.
You'd still have to wait for the new Octane to confirm the compatibility.
For 3DL network rendering you need one licence per computer ($600 for an eight-core licence).
...the free version of 3DL supports 8 cores. Wouldn't you be able to network that one?
As to Octane4, Vulkan compatibility will most likely be in the 2019 release (they have already been testing it).
I'm confused. Is Vulkan supposed to replace OpenGL or OpenCL? I've seen articles about both.
I never tried, but you could at least use one free node. However, you'll need to do that by hand with command lines. I think it's possible to script something to integrate that inside DS, but I don't know of anybody doing that.
But still, you'd only have one render node. I'm not sure it wouldn't be easier to simply have DS on the other machine to render.
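For the curious, the "by hand with command lines" idea would look roughly like this: export the scene from DS as a RIB and feed it to the standalone renderdl on the other machine. This is only a sketch of the workflow under those assumptions; the RIB path, the shared-folder idea and any extra flags are hypothetical, not a tested pipeline:

```python
# Sketch only: rendering a DS-exported RIB with the standalone 3Delight
# renderer on another machine. The path and the idea of dropping the RIB on a
# network share are assumptions; renderdl options beyond the file name are
# deliberately left out.

import subprocess
from pathlib import Path

rib_file = Path(r"\\render-box\shared\scene_frame_0001.rib")  # hypothetical path

if not rib_file.exists():
    raise SystemExit(f"RIB not found: {rib_file}")

# renderdl is 3Delight's command-line renderer; it reads the RIB and writes
# whatever outputs the RIB's Display lines specify.
subprocess.run(["renderdl", str(rib_file)], check=True)
print("Render finished:", rib_file.name)
```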
For Octane, I was speaking of compatibility with AMD hardware. Even if it is compatible in theory, there could be some hardware implementation specifics. You should check the Octane forum for that.
Both in fact. One API to rule them all
Oh. That should be nice. If all major vendors get behind it that could definitely work out for the consumer. As far as the 2000 series goes, I'm going to pass for now. It's not worth it for me to upgrade.
Indeed.
Closer integration (and it is close now) would have me using a hammer not a fist to knock that door down. :)
Actually, I'm not pleased. Dislocating my shoulder trying to pat myself on the back is something I'd rather have avoided.
There is lots to like about Iray, tbh; support from Nvidia doesn't seem to be one of them.
In response to the image: Bravo.
I currently see these cards as being great sometime in the future, providing the future pans out as Nvidia expect; they should ask Intel how that is working out for them... Over-confidence can be fatal.
Ah, I'm not clear on what agreement Daz has for what is in Daz Studio, though the single-CPU thing is a good point I had forgotten about. That chat was a few years back in the 3DL threads, and I can't remember if the free version supports networked rendering or not. Kettu and others may know in the 3DL thread. I'm sure the launch of 8-core/16-thread Ryzen CPUs might, in theory, have the 3DL team reconsidering the 8-core limit, if that isn't already an old limit thanks to Threadripper and i9 CPUs. On the "GN R7 build", Daz Studio does indeed use all 16 threads of the R7 with the built-in version of 3DL (I don't 'think' that is the free version, though, and I don't think it is the full paid version either; I honestly do not remember the details of what was different between the free version and what Daz3D has).
Total coolness. If I'm not mistaken, I think there are two different plugins for another engine at the Daz Store, Reality and another one (that Destiny's Garden and Inane Glory have done some stuff for). It may already be an option when Octane 4 is out of beta, if not soon after. Oh, that was Lux; I don't know about Octane 4. I'm sorry, my memory is a tad fuzzy on this. Oops, Reality and Luxus are for LuxRender, not Octane, sorry. There are some old threads about an Octane plugin, but they are old (2013-ish).
P.S. Thanks nicstt
The version of 3DL that DAZ Studio uses will use unlimited cores on a single PC (actually more than the paid full version). It's a full 3DL version, but never the current version, it's just been dumbed down for DAZ users. The functions are readily available through scripting as Kettu has proven. It can't be networked and I don't believe the free version can either. The free version is all command line, too. There was a problem a few years back reported that the rib files that DAZ Studio makes did not work as well in the free version, but I don't know if that was a user knowledge issue. I seem to remember that getting an animation done with it was problematic, but again a user issue possibly. 3DL in DAZ Studio can do animation and with motion blur! There used to be another thread here that explained the free version.
Hey, I just noticed DAZ has a prominent position on the 3Delight Cloud page! https://3delightcloud.com/
...hmm that last item. Could that be a possible forthcoming service?
Here is something that will give you a heart attack: the computer parts site below is in New Zealand, and look what they are charging for the new cards. All I can say is one could just about buy a complete new computer for these prices.
https://www.playtech.co.nz/computer-hardware/graphic-cards/filter/cat/520-nvidia-geforce.html?p=3
And here is what we will be paying in Australia; not as bad, but still over-the-top expensive.
https://www.scorptec.com.au/product/graphics-cards/nvidia
https://www.pccasegear.com/category/193_1966/graphics-cards/geforce-rtx-2080-ti
So yeah I think I will be waiting for a long time before I get one of these new cards.. lol
...crikey, in NZ, even at the exchange rate, a 2080 Ti Strix is still expensive at US$1,945. Wonder if part of that is due to a VAT.
In Australia the same card is US$1,593. I do know you folks have a VAT.
I'd hang onto that 1070 Ti you have for a while.
Ask someone to buy it state-side and ship it to you.
I'm sure you've got US friends as long as some of you have been posting....
I'll wait on real-world reviews, but 30% speed increase is nothing to sneeze at. For those that do TONS of renders, that's a blessing and probably worth the cost for the time saved, overall.
Yeah, in Australia we have the GST (Goods and Services Tax), which adds another 10% to the price.. :( In New Zealand it is 15%, so yes, it can get expensive; the main issue they have is a low exchange rate, NZ$1 is about US$0.64. Here in Australia AU$1 is about US$0.70 at the moment, so I can see the take-up of the cards here taking a while.. lol
And well for myself with my new system that is in my sig, I will be holding off for quite a while.. lol
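For anyone curious how those USD figures fall out, a quick conversion sketch. The local list prices below are rough ballpark examples, not quotes from the linked stores, and the exchange rates are the ones mentioned above:

```python
# Quick conversion sketch: local sticker price (GST/VAT already included on
# the shelf) back to USD. The local prices are rough examples only.

def to_usd(local_price, usd_per_local):
    return local_price * usd_per_local

def tax_component(local_price, rate):
    # GST/VAT is included in the sticker price, so the tax part is
    # price * rate / (1 + rate).
    return local_price * rate / (1 + rate)

nz_price, nz_rate, nzd_to_usd = 3039.0, 0.15, 0.64   # example 2080 Ti Strix, NZ
au_price, au_rate, aud_to_usd = 2275.0, 0.10, 0.70   # example 2080 Ti Strix, AU

for label, price, rate, fx in [("NZ", nz_price, nz_rate, nzd_to_usd),
                               ("AU", au_price, au_rate, aud_to_usd)]:
    print(f"{label}: about {to_usd(price, fx):.0f} USD, of which roughly "
          f"{to_usd(tax_component(price, rate), fx):.0f} USD is GST")
```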
There is one advantage to NVLink that has not been mentioned, and it's a good one. NVLink allows each GPU to share its video memory through the NVLink bridge, so when you have two cards the video memory is shared; two RTX 2080 Tis would give you 22 GB of memory for rendering. Right now each GPU has to keep its own copy of the rendering assets, so each 1080 Ti only has access to 11 GB.
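A back-of-the-envelope illustration of the difference described there. Whether pooling actually happens depends on Iray and driver support for NVLink memory peering, so treat the pooled column as the best-case claim, not a given:

```python
# Back-of-the-envelope only: mirrored vs. pooled VRAM for a two-card setup.
# The "pooled" case is the best-case NVLink claim, not guaranteed behaviour
# in Iray.

CARD_VRAM_GB = 11.0   # RTX 2080 Ti
NUM_CARDS = 2

def fits(scene_gb, pooled):
    capacity = CARD_VRAM_GB * NUM_CARDS if pooled else CARD_VRAM_GB
    return "fits" if scene_gb <= capacity else "falls back to CPU"

for scene_gb in (8, 14, 20, 24):
    print(f"{scene_gb:>2} GB scene | mirrored (copy per card): {fits(scene_gb, False):<17}"
          f" | pooled over NVLink: {fits(scene_gb, True)}")
```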
Great marketing pitch.
Do any of you use the monthly subscription of Octane for Daz Studio?
How well does it perform?
I see they have updated the version for Unity to Octane 4.00 RC 5, so I may give it a try there first.
I started a monthly sub for the Octane plugin for Daz. The plugin doesn't use the latest v4 of Octane yet, and I'm not sure when that will be available.
I played around with the plugin for 20-ish hours over the last week and a half. I have used the 4.11.0.196 beta with the plugin. There are many little quirks, like a lot. I have managed to navigate through most things and can say I truly like the Octane results; however, it takes quite some time to learn and get used to things. If you have never used a node editor for materials, then you will have to learn to with the plugin. Most materials work well in the conversion process. Lighting is interesting and tricky until you learn the differences and how to utilize it in your scene. Also, I have tested the different anatomical elements available for Daz models, and only the Daz ones convert well, with the others showing lines and odd behaviour with the material on them. I am sure there is a way to fix them, but I have not focused on it.
Crashes. There will be crashes sometimes. I find that if I make a lot of changes and keep rebuilding the scene, Daz will eventually crash. That is to say, I will make some material changes or displacement map changes, etc., and then in the separate Octane render window pane I will have to "Rebuild Scene" if it doesn't update correctly (which happens often), and then Daz will crash if I have done that a few times. Not sure if memory consumption is just getting too high or what.
There is just way too much that I have learned in the last week and a half to put concisely in this response, so my simplified recap is: be ready to tinker... a lot. Great results, though, when you get things figured out, and you will see a lot of detail that just doesn't show up in Iray renders, at least from Daz Iray renders.
Thanks a lot for such a nice review and insights about Octane, shaneseymourstudio.
I think I will wait for v4 of Octane, then. The base version of Octane for Unity is free, so I will try it first to see if I like it over Iray renders.