Xeon vs i7
Daz 3D Forums > General > The Commons>Xeon vs i7
Hi. I have an i7 4790, 16 GB of RAM, two 980 Tis, and one 970. I read somewhere that even if I bought 4x Titans I would need a high-end processor to handle all those cards... Is this correct? How can I tell if my processor can handle my cards? Which processor do I need?
Thanks!
Comments
A good way to check is to see the effect of adding additional cards to your render in GPU mode only. If the CPU load from servicing the GPUs is too much, your system will lag and/or your render speed won't scale as expected.
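A rough sketch of that check: poll GPU utilization while a GPU-only render runs and flag cards that are being starved. The `nvidia-smi` query flags in the docstring are real, but the sample output string and the 90% threshold are my own assumptions for illustration.

```python
def parse_gpu_util(csv_text):
    """Parse the output of:
    nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits
    (one integer percentage per GPU, one per line)."""
    return [int(line.strip()) for line in csv_text.strip().splitlines()]

def cpu_bottlenecked(gpu_utils, threshold=90):
    """If some GPUs sit well below full utilization while rendering,
    the CPU may not be keeping up with feeding them (threshold is a guess)."""
    return any(u < threshold for u in gpu_utils)

# Hypothetical sample from a 3-GPU box: the third card is being starved.
sample = "99\n97\n42\n"
utils = parse_gpu_util(sample)
print(utils)                    # [99, 97, 42]
print(cpu_bottlenecked(utils))  # True
```

Run the query a few times mid-render; if utilization stays high on every card as you add GPUs, the CPU is probably keeping up.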
Keep in mind that additional cards don't scale linearly either...
In the current beta, 4.9.3.71, scaling seems to be better. See the thread here:
http://www.daz3d.com/forums/discussion/95436/daz-studio-pro-beta-version-4-9-3-71-release-candidate-updated#latest
especially the latest Mec4D posts....
You need more than 4 cores/8 threads to run 4 Titans.
Do you need a Xeon? Not specifically, but I'd look for 6 or 8 cores / 12 or 16 threads. Personally, I'd want a minimum of 8 cores/16 threads. That's the plan for my next upgrade.
I would consider a dual Xeon system, though. When considering one, think about what you use your computer for: not just preparing scenes, but whether you'll want to keep using it while it's rendering. The more spare cores then, the better; of course, a second workstation might be better. Only you know your requirements.
You can run multiple CPUs with Xeons; you can only run a single CPU with an i7.
Both i7s and Xeons come in dual-core and quad-core configurations.
My dual Xeons render faster than my i7, but my i7 seems more responsive in Studio; it may be an optimization thing in the software itself.
It is a hardware optimization, actually. i7s are presumed to be used in desktop systems, so the bus connections to the display I/O hardware are given higher priority than other operations. Xeons are the opposite: there is no optimization for display at all, since the presumption is that the systems will be used as servers or workstations with "intelligent" video cards (Quadro or FireGL) that make the necessary interrupts themselves instead of relying on the system to do the work.
People don't seem to realize that there is a LOT more to the Quadro than just a higher price attached to the same GPU family. One of these is a much more developed bus control architecture than the GeForce cards have.
Kendall
Thanks for all the answers!
Using only the GPUs I got CPU usage peaks of 60% and RAM at 88%, but when I surf the web it gets very laggy: CPU usage rises to 90%, RAM drops to 67%, and rendering slows down.
Which processor would you recommend so I can browse the web smoothly while four 980 Tis are rendering?
Thanks again
Super cheese, I looked up the specs on your CPU and you have a great processor for Iray or whatever; no need to upgrade. Just get more memory: 32 GB of DDR3 is cheap right now. If your motherboard doesn't have enough slots, get one that does, with more PCIe slots for more video cards. I would suggest eBay or Craigslist locally; that way, if something goes wrong, you know the person who sold you the hardware and can contact them for a refund.
...even DDR4 memory is pretty affordable. I'm looking at a 128 GB quad channel kit for my next build (I also work in Carrara and Bryce) which will cost about as much as 24 GB tri channel DDR3 memory did several years ago when I built my current system.
That same 24 GB DDR3 tri-channel kit is going for $114 today (Newegg), so an additional 16 GB should be well under $100.
Xeons can get pretty expensive and generally have lower clock speeds than i7s. I was originally looking at dual 8-core Xeon E5-2630 v3s; each was $669 and clocked slower than my old 2.8 GHz i7 930. Instead I'm looking at a single 3.5 GHz 6-core i7-5930K. There is a 3.0 GHz 8-core i7, but it is almost twice the cost of the 6-core 5930K ($1,015). I am planning on running only two Titan-Ps once pricing and availability have stabilized.
Were money not a big issue, I would go with the dual 8-core Xeon setup and dual 16 GB Quadro P5000s.
if money wasn't an issue I'd be after two 22 core Xeons: sadly it is. :(
I bought a Xeon E5 2670 8core from ebay for $78. Yes, new they are very expensive. I almost pulled the trigger on 3 Titan X's but my 3x 780's with 6gb each get the job done so why pay so much for a little more speed?
...yeah, for me it's scene file size; that's why I am looking at a high-memory GPU. The less chance of the render dropping to CPU mode, the better; still, having a lot of memory overhead just in case is a good thing. For my needs, two GPUs are pretty optimal, as the Pascal Titan X has over 3,500 CUDA cores.
quadro tritta kaley
Tasty, but lacking in CUDA cores.
FOUR 980 Tis? That's a pretty tough question. See if you can track down Mec4D's rig specs... she's posted it a couple of times around here, since she runs 4 (or 3?) Titan Xs on her machine.
edit: Also, this answer is dependent on what videocard you're using for your viewport...
I am actually on the cusp of upgrading my 4790 system to Xeon. This is the plan:
ASUS Pro WS C621-64L SAGE/10G, Xeon Gold 6208U, Samsung M393A4K40CB2-CVF 32 GB (1x32 GB) 2933 MHz PC4-23400 CL21 ECC x6 for 192 GB, and a Noctua NH-U14S DX-3647 cooler. Those parts should all be compatible, and they should fit into my Cooler Master HAF X.
Yeah, it'll cost me. I have three 1080 Tis to put into it. I'm going to leave the 2080 Tis where they are for now. When 3090 prices drop a bit, I'll swap out the 1080s for at least two 3090s.
Why?
I'd be going for Threadripper if I was going for lots of cores. Just curious about your reasoning.
I have a five-year-old 1950X, 16 cores/32 threads.
I already built a Threadripper, as you can see from my signature. It runs fine, but it isn't exactly blowing my doors off. I have a comic business, and I want to move away from consumer-grade stuff. You know, for the Threadripper build I absolutely had to use a watercooler, a Floe Riing; I was cautioned against trying to use any air cooling at all. What that tells me is that the Ryzen chip is really working its ass off and making tons of heat. As for the Xeon processors, they make much less heat, and that tells me they aren't working as hard. The setup I am planning is specific for graphics, among other things. It should be rock solid for doing this stuff, and I can't foresee having to build another one any time soon. I like the Threadripper, but for my future investment in the 3090s I wanted to put them neither in the Threadripper nor in an aging i7-4790 based system. I want them to be surrounded by state-of-the-art components, with nothing whatsoever to slow them down.
You're buying reconditioned? You're getting that POS 6208 for way, way, way less than $1k, right? And please tell me you've changed your mind about the case.
The 6208 is a 150W part, not one of the sub-100W parts that went into PCs when the HAF X was built.
To be blunt, you're going to need serious airflow to cool it with air. These parts are meant to be in cases with lots and lots of airflow provided by high-RPM fans. The HAF X doesn't come close unless you mod it. To make matters worse, the heatsinks for those LGA 3647 sockets aren't all that good (mostly just big metal blocks, with a fan if they even have one).
If you put that server part in that old case, you're in for a world of pain, particularly with all those GPUs.
? Huh?
What are you building? A render box or a desktop for setting up scenes or what because you're all over the place.
If you're building a render box, then definitely get a low-end TR and shove as many GPUs into it as you can. 3090s make no sense unless you actually need the VRAM. If what you need is render speed, then more GPUs is better, and that means CPU threads (Iray requires one CPU thread per GPU). The amount of power the CPU draws is virtually irrelevant as long as it is dissipated and the chip runs at an acceptable temperature. And you can certainly air-cool TR CPUs; the air coolers are very large and cost what the AIOs cost, but they are available. Used Xeons only really make sense if you are saving enough to get more GPUs.
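The one-CPU-thread-per-GPU rule above can be turned into a quick sanity check. The `reserve` count below is my own assumption (headroom so the desktop stays usable), not an Iray requirement.

```python
def enough_threads_for_iray(num_gpus, cpu_threads, reserve=2):
    """Iray reportedly needs ~1 CPU thread per GPU just to keep it fed;
    `reserve` is extra headroom for the OS and browser (my guess)."""
    return cpu_threads >= num_gpus + reserve

print(enough_threads_for_iray(4, 8))   # True: an i7-4790's 8 threads just cover 4 GPUs
print(enough_threads_for_iray(4, 4))   # False: nothing left over for anything else
```

By this estimate, a 4-core/8-thread chip is the bare minimum for a four-GPU box, which matches the advice earlier in the thread to aim for 6 or 8 cores.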
The price I have for the 6208 is $1,023. If you know where to get it for less money, I'm all ears. As for the case, it is a good case with good airflow; do you actually have any experience with one? I've been using mine for going on four years. I don't know what the wattage has to do with anything; the PSU is a 1200 W Platinum. I would have thought that if cooling this series of CPUs were an issue, there would be more options available out there to do it. From my research, the best one out there is the Noctua NH-U14S DX-3647, a premium CPU cooler for Intel Xeon LGA3647. I did also find the Dynatron L13 water cooler for Intel Socket 3647, and there is room in the case to put it. Do you recommend that?
Heh. I am building for the future. Once it's done and houses at least one 3090, that will likely be the main rendering rig. I should point out that since I do need to produce 15-20 frames/day, not uncommonly I render with both of them. And I've been waiting for a GPU with the kind of capacity the 3090 has for a long time. Many of my comics easily exceed the VRAM of the 2080 ti's. As for the cost, I won't say that money is no object, but business has been good enough such that if I can dump some capital here at the year's end, it helps come tax day. It has been suggested that instead of the build I should just start getting 3090's instead. They are just too expensive at the moment. In maybe 6 months the prices will come down to something more reasonable, and the card will be the same. As for the build project, that will likely be as expensive to make in 6 months as it is now.
I have a Noctua NH-D15 for my i7-6700K and it does an excellent job. Noctua does CPU cooling very well.
I also have a HAF-X with 3 200mm fans and one 220mm fan and it has no issue keeping the hardware cool.
I have two Zotac 980 Ti AMP Extremes in my case. I would highly suggest, if it's at all possible (depending on what you want to do with your new computer), that you do NOT place the GPUs side by side on the motherboard, but instead leave an empty slot between them. The reason is that if they are right next to each other, the primary card will be sucking hot air from the secondary card and not getting cooled correctly. I had this issue with my two cards at first: the primary card was hitting 83C while rendering and gaming, while the secondary card would hit about 65C. Ever since I moved the secondary card down a slot, the primary card only goes up to 70C at full speed.
I was going to get a second fan for up top. Yeah, I know all about the issues with multi-card heat. In my Threadripper, two of the cards are almost touching, but they are both hybrids and the temps are fine. The third card is air-cooled but far enough away from the other two that the temps are fine. Now, in the i7 rig, I have three 1080 Tis, and the middle one runs so hot I can't use it. The two on either end both run under 70 if I don't use the middle one. The Asus board is very large, and I'm sure I won't have any issues running two cards. If I want to put a third one in, it may have to be a hybrid.
No, it was a good case back when cases had lots of HDD cages and optical drives. Since those conditions are no longer true, and CPUs and GPUs put out much more heat, those cases are obsolete.
Wattage is a direct measure of how much heat a part emits; that's what it means when discussing a part's power consumption. This Xeon is a 150W part, while your 2950 is 180W. The NH-U14 would likely work, just as it would likely work for the TR (there is a TR version), but it won't be enough in that case. I'm skeptical of that 120mm AIO. Not that it won't work, but how fast would that fan have to spin, and how loud would it be? For a 120mm AIO to cool a 205W CPU, which it is advertised to support, the fan would have to be moving a lot of air, which means it would have to be pretty loud.
The cooling of Xeons in server chassis is usually done passively (Xeon Gold is a server part; Gold means it can support up to four CPUs in one installation). The CPUs sit under big heatsinks, like this:
DELL WCM0C Heatsink Poweredge R730 R730XD Precision Rack 7910 - ServerSupply.com
And the chassis is full of high RPM fans that move a lot of air but sound like jet engines.
I definitely would not be buying one of those at the new price. If you want that sort of HW, get refurbished HW. There are lots of cheap server racks on eBay.
For that kind of money, and honestly since you don't really know HW, I suggest you have someone build the system for you. A competent small shop, or even something like a Micro Center, would never have steered you to a Xeon Gold, which is just way too much CPU for what you need at way too much money.
If you're really exceeding the 2080 Tis frequently, then 3090s might be the way to go, although did you try getting an NVLink for the 2080 Tis? That would be a solution for the HW you have now.
If the goal is to get 3090s, though, the reality is that you will either need a server rack, and somewhere isolated to put it, due to cooling issues, or you'll want to reconsider how many GPUs you want in the box. Even two 3090s produce so much heat that it will be hard for even an open-air test bench to cool them long term, plus they're all at least three slots wide, except for the watercooled ones.
If you do want dual 3090s, then you don't want the server rack; just get a decent desktop system and have it custom watercooled.
It may take a bit of thought for adding more fans, but I'm confident the case will work just fine. The airflow is exceptional, and there is plenty of room for the Asus board, which is bigger than what I have now. I know very well what is available on eBay. I don't want to fill the case with refurbished parts. It's a business purchase, and I want all the parts I get to be under warranty.
I am used to the negativity I have come to expect from this site. Yes, I know I could do it more cheaply. Fact of the matter is that I don't really need to do anything to the system. As it is now, it works just fine. But I want to get 3090's when they're a bit cheaper, and since I would probably be getting two of them (to start, at least), I don't want to put them into a dated rig, and unlike many on this site I actually have real world experience with a TR, and I have to think that isn't the best it could be. Plus, I either spend the money now on computer parts, or get taxed on it in April. I choose the former.
I've built some good rigs for someone who, in your opinion, "doesn't know hardware." I am fully aware that the system I am proposing is much more than is needed. Like I said, my current builds work, so why do anything at all? My understanding of NVLink, unless it has changed, is that by linking the two cards together you are cutting the number of CUDA cores in half. I want more CUDA cores, not fewer, because that is where the rendering speed comes from. Even if it adds the VRAM together, the drop-off in rendering speed is unacceptable. You can correct me if I'm wrong. I have a thick skin for someone who doesn't know hardware.
I have a few years of experience running multiple cards. I won't have any problem whatsoever running two 3090s. The configurations I have successfully run are:
980 Ti x2, 780 Ti x1
980 Ti x2, 1080 Ti x1
980 Ti x1, 1080 Ti x2
1080 Ti x3
2080 Ti x2, 1080 Ti x1
2080 Ti x3
I don't see why running multiple 3090s would be any different, and I fail to see why I would need a different case for that reason.
I'm not being negative; I'm being honest. You're talking about paying the new price for a three-year-old Xeon when no new Xeons are selling. You're then putting a Xeon Gold in a single-socket board. That makes no sense. There's a 16c/32t Xeon Silver out there (I don't have all those part numbers memorized, but it's a Xeon 4000-something) which I'd hope is cheaper, and it is certainly a better choice for a workstation CPU than a Gold part. IIRC the Silver has a lower TDP as well, so it won't be as hard to cool, but I stopped paying attention to Xeons when Rome came out.
But it's your money.
You are being a bit negative. I am also being honest. The Silver prices are not very different from the Gold. You're an AMD guy, and that's fine by me. I am underwhelmed by AMD. So I want my next rig to be workstation grade, not consumer grade, and I want it to be Intel. Given that many times my rigs are rendering all day and all night long, I need durability and reliability. And yes, it is my money. But thus far you have not given me a compelling argument against my proposed build.
Why? A 2080 Ti is 250W (boosting up to 400W transient). For the 3090, Nvidia says 350W, but that is known to be too low, so probably 375 or 380W (boost is 500+); even guys like LTT, who routinely grossly overbuild systems, have reported 3090s tripping their PSUs' protection circuits.
So a dual-3090 system not only needs to be prepared for that sort of draw on the PSU, and that sort of power delivery to the cards, but also for the cooling those cards will need. Even undervolted so they don't boost at all, you're still looking at ~700W of draw for the two cards, plus the rest of the system, likely at least another 100W; so unless you want to run a PSU redlined, you're looking at a 1000W PSU even with undervolted cards. Then you have all that heat to get out of the case, and that is challenging considering that any way you do it, those cards will be right up against each other. It's not impossible, but it will be a strain on pretty much any system.
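The PSU arithmetic above can be sketched like this. The 100W figure for the rest of the system comes from the text; the 25% headroom factor is my own assumption for absorbing transient boost spikes.

```python
def psu_estimate(gpu_watts, num_gpus, rest_of_system=100, headroom=0.25):
    """Sum sustained draw, then pad it so boost transients don't
    trip the PSU's protection circuits (headroom is an assumption)."""
    draw = gpu_watts * num_gpus + rest_of_system
    return draw, draw * (1 + headroom)

draw, psu = psu_estimate(350, 2)  # two 3090s at Nvidia's rated 350 W each
print(draw)        # 800
print(round(psu))  # 1000 -> a 1000 W PSU, even before real-world boost behavior
```

Note that if the cards actually pull 375-380 W sustained, as suggested above, the same math pushes the recommendation well past 1000 W.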
Yes, you misunderstand NVLink. There is some overhead from the cards talking to each other, but it does not cut render speed in half. I mostly deal with CUDA applications in non-rendering tasks, and in those sorts of things the overhead is considered negligible; benchmarks show it to be almost within run-to-run variance (2 to 3% max).