Graphics card MSI GeForce 1660, OK for IRay?

I've built a better machine for myself (i5-8600 CPU, 32GB RAM), a nice zippy machine. But I'm still looking for an affordable graphics card that can seriously help with DAZ Studio Iray rendering.
Not willing to pay $500+ for a graphics card.
But for $209, this is tempting: https://www.newegg.com/msi-geforce-gtx-1660-gtx-1660-ventus-xs-6g-oc/p/N82E16814137400?item=N82E16814137400&nm_mc=bac-criteo&cm_mmc=bac-criteo-_-video+card+nvidia-_-msi-_-14137400# It apparently has 6GB of RAM, but I've heard that Win10 steals some of the graphics RAM? My images don't get very complicated: usually no more than a mix of three M4, Gn, G2, G3, or G8 characters in a scene with very simple clothes (loincloths, swim trunks, etc.), and I usually use stage backgrounds (Millennium Environment, Cyclorama, etc.). I don't use complicated light setups or sophisticated multi-layer surface textures.
I'm currently using the integrated graphics of the i5-8600 CPU itself but want better. How do I know whether this MSI GeForce card can help significantly with Iray rendering?
Comments
The GTX 1660 will work (I've previously just about got away with a 1050 Ti), but if you can stretch your budget and power supply further, I'd strongly recommend looking at the RTX 2060 Super, which does come in at under $500 (low to mid $400 range is normal). It comes with 8GB of GDDR6 VRAM (the problem with GPU rendering in Iray is that if the scene doesn't fit into the VRAM, the graphics card provides no benefit whatsoever), as well as ray-tracing cores, making it faster than any GTX card.
It's not exactly a budget card, but in terms of price to performance it is definitely one of the best cards for Iray.
EDIT: To put some specific numbers on it, the Benchmarking thread has the RTX 2060 Super at almost twice the speed of the GTX 1660, and almost directly on a par with last generation's top cards (the GTX 1080 Ti and Titan Xp), which were well over twice the price of the 2060 Super. That's how much difference the RT cores make for Iray.
My brother has some fairly wise advice on this front - don't look at the cost of the graphics card, look at the cost of the entire system. The RTX 2060 Super might look like twice the cost of the GTX 1660, but given you've already spent a fair sum on everything else, it's not increasing the price of your overall system that much.
You're looking at it like a part, LeatherGryphon. Matt's brother is on to what's important. I would go further and say the card does most of the heavy lifting in Iray, so it's more important and worth every penny.
Thank you, good information. One of my issues with a new graphics card is that I've built this new system into a stripped-out old HP desktop case with an mATX motherboard (ASUS PRIME B360M-A). There's a new 650-watt Corsair RMx power supply in there too, along with the original hard drive cage, so there's not a lot of room. I've measured, and there is room for a two-fan graphics card, but not a three-fan one. I'm using air cooling with a Cooler Master i71C CPU cooler and one case exhaust fan. I don't have room for any more fans without getting creative with drills and saws, but currently heat is not a problem. Memory is 32GB of DDR4-2666 RAM. I don't overclock, and temperatures stay well under control even when rendering. Also, ~$200 is my limit. I've spent enough on this, my first build, and I still want to add a 1TB NVMe SSD to that system too.
Looking at the benchmark thread, I think my second-hand GTX 1060 6GB rendered the benchmark model in around 4 minutes 50 seconds. Using this site https://benchmarks.ul.com/compare/best-gpus the GTX 1660 has a score of 14084 vs the 12966 of the GTX 1060. This indicates it would be a bit quicker, though an exact prediction of the render time would be foolish to give (a very rough scaling sketch follows below). Anyway, it would be a moderately rapid card with DS. By spending silly money you can get the benchmark render to under 1 minute.
At the other end of the scale, without a working graphics card in my PC (the original card was too out of date for DS to use), it rendered the benchmark model in slightly over 50 minutes.
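If you want to play with those numbers yourself, here's a very rough Python sketch that just ratios the benchmark scores against my 1060 time. The scores and times are the ones quoted above; the assumption that render time scales inversely with the benchmark score is crude, so treat the output as a ballpark only.

```python
# Crude ballpark only: assumes Iray render time scales inversely with the
# benchmarks.ul.com score, which is a rough assumption, not a measurement.

gtx1060_secs = 4 * 60 + 50   # ~4 min 50 s on my second-hand GTX 1060 6GB
gtx1060_score = 12966        # benchmarks.ul.com "best GPUs" scores
gtx1660_score = 14084

est_1660_secs = gtx1060_secs * gtx1060_score / gtx1660_score
print(f"Rough GTX 1660 guess: {est_1660_secs / 60:.1f} minutes")

# For contrast, CPU-only on my machine took slightly over 50 minutes.
cpu_only_secs = 50 * 60
print(f"GPU speed-up vs CPU-only: roughly {cpu_only_secs / est_1660_secs:.0f}x")
```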
As for the number of figures, etc.: my 6GB card will work with three figures, clothes, and a basic backdrop. HOWEVER, some hair models are enormous. OOT's Caprice Hair for G8F plus one figure is enough to force CPU-only rendering because the VRAM isn't enough. Apparently, a graphics RAM rule of thumb is 2GB per G8 figure, and this seems to work fairly well in my limited experience.
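If it helps, here's a minimal Python sketch of that rule of thumb. The 2GB-per-G8-figure number is just the rough guide from my limited experience above, and the overhead allowance for Windows and the DS viewport is a guessed placeholder, not a measured value.

```python
# Back-of-the-envelope VRAM check based on the "~2GB per G8 figure" rule
# of thumb above. Both constants are rough assumptions, not measurements.

GB_PER_G8_FIGURE = 2.0     # rule-of-thumb estimate, not exact
ASSUMED_OVERHEAD_GB = 1.0  # guessed allowance for Windows + DS viewport

def scene_fits(card_vram_gb, num_figures, extra_gb=0.0):
    """Return True if the estimated scene should fit in the card's VRAM.

    extra_gb is for big items like huge hair models or large HDRIs.
    If the scene doesn't fit, Iray falls back to CPU-only rendering.
    """
    estimate = num_figures * GB_PER_G8_FIGURE + extra_gb + ASSUMED_OVERHEAD_GB
    return estimate <= card_vram_gb

print(scene_fits(6.0, 3))  # three figures on a 6GB card: 3*2 + 1 = 7GB -> False
print(scene_fits(8.0, 3))  # the same scene on an 8GB card -> True
```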
Hope this helps,
Richard
Very helpful, thanks Richard.
I think I'm ready to commit to the card and an extra DisplayPort-to-HDMI cable to run to my other monitor. Sigh, money! Where does it go? (rhetorical question)
So, you might want to look into getting a used GTX 1070 8GB card instead of a new 1660 6GB.
There is currently a glut of these cards used on eBay, including the two-fan "mini" version, for around $200.
You will probably be a lot happier in the long run with a GTX 1070 because of the 8GB of VRAM. The extra 2GB of VRAM does not sound like much, but consider how much is gobbled up by running your viewport and other Windows tasks. Also, rendering three G8 characters with hair, clothing, and a background, you will probably end up back in CPU-rendering land with a 6GB card unless you spend a bunch of extra time "optimizing" your scene before rendering.
This is coming from someone who lived with a 6GB GTX 1060 in my Daz Studio desktop for the G3 era and the beginning of the G8 era.
If you can't stretch to the extra cash for the 2060 Super, then yes, I'd agree with JamesJAB - a used 1070 would be my choice over the 1660.
Because the GTX 1660 isn't an RTX card, it's missing out on a lot of the technological boosts you'd normally get by jumping up a generation, so actual benchmark performance isn't really much of an improvement over the 10xx series. Being entirely honest, the entire 16xx series is really kind of mediocre unless you have a particularly niche scenario. (I do have a 1650 in one of my computers, but that's because it has a shallow case that specifically needs a half-height card, and there is currently no better option when it comes to a low-profile Nvidia card.)
In this case, the extra 2 GB of VRAM on the 1070 over the 1660 would be a significant benefit to Iray.
I have a 1660S and a 3700X with 32GB RAM, and my PC is always at 100% load on both CPU and GPU just after clicking the Iray switch. This thing is seriously garbage.