
Nvidia Quadro K4200 vs. K5200: Where can I expect noticeable performance difference?
Stenrik
Hi,

I'm building a PC geared towards very high-poly, multi-object, all-around complex scenes in Max (2014). I'm talking hundreds or thousands of objects in a scene. On my old machine I get a lot of bounding boxes in my viewports, viewport lag, and a distracting kind of flickering in "realistic" mode.

For the new build, I can't decide between the Quadro K5200 and K4200. Obviously the 5200 is better, but how much better? Better enough to justify paying $800 more? I know this is a somewhat subjective question, but I can afford it or I wouldn't be asking for help deciding between the two.

-----------------------------------------------------------------------------

Things that are important to me (most important first):

1. Smooth viewport performance for complex scenes
2. Fast real-time lighting in viewports, like having "realistic" mode on without the flickering or wildly inaccurate shadows. (Nitrous)
3. Faster render times. (I use mental ray but am thinking of switching to V-Ray soon, if that makes any difference. I never render animations, and my still images tend to be about 5000 pixels on the longest side.) Note: I tend to render a lot of reflective objects and glass, meaning I want great raytracing performance.

-----------------------------------------------------------------------------

I know that the Quadro line is what I want, since it seems tailored to play nice with Max using iray: http://www.nvidia.com/object/autodesk-3ds-max.html
(Also, I'm a heavy Photoshop user and the line works well with that too.)

But the question is, how much better is the K5200 for things like viewport performance and real time updating? Has anyone had scenes that the 5200 could handle well that the 4200 couldn't, or general experiences that would indicate this?

http://www.develop3d.com/reviews/Nvidia_Quadro_Maxwell_Kepler_CAD_Creo_Solidworks_CAE_iray_review
(This is not benchmarked in Max, but I'm assuming a lot of crossover.) One thing that worries me is the very last comment in that above link: "I have quadro k2200 and it's not better than my old firepro v4800 which was way much cheaper, also Nvidia doesn't list it on the certified drivers and new driver updates for 3dsmax. Don't know why, but development seems to have got worse when it comes to drivers."

So I did a search of driver support on the Autodesk site and found this: http://usa.autodesk.com/adsk/servlet/syscert?siteID=123112&id=18844534&results=1&stype=graphic&product_group=6&release=2014&os=8192&manuf=1&opt=1
So apparently the K5200/K4200 aren't even supported in Max 2014 then. Crap. These cards have been out since Q2 2014, so is it safe to assume support will never come? I notice that they are supported in Max 2015, and I could upgrade as long as my plugins all work with it.


I'm a bit of a hardware noob, so I'll take whatever guidance I can get. Assume I don't know what I'm talking about.

~Sten



(Oh, and in case it's relevant, the processor for this new machine will be a quad-core Intel i7-4790K at 4.0 GHz. (I've heard that octa-cores are actually worse for viewport performance.) I'm also getting 32 GB of RAM.)
2/4/2015 7:35:01 PM (last edit: 2/4/2015 10:22:40 PM)
9krausec
I'm watching this thread. Quadro vs GTX has been a point of confusion/learning for me. Feel free to read up on my build thread here-

http://maxforums.org/threads/computer_build_advice/0001.aspx

^Some good info there if you want to do some reading.

3 GTX 970s were recommended for my build by our great Stabby here on MF. I was looking into Quadro cards, but decided to go GTX instead due to the price point.

Apparently Quadro cards are practically the same as GTX cards from a pure hardware standpoint, but Nvidia throttles 3D production performance in the GTX drivers (this is how I understand it, anyway).




2/5/2015 2:59:11 AM (last edit: 2/5/2015 2:59:11 AM)
donvella
I use a Quadro K4000 3GB for work

My previous 1280 MB GTX 560 rips all over this card. I hate using Quadros! That's also the answer to your three questions. Rendering won't benefit much from it unless you're using ActiveShade, and even that still has issues with certain objects.

I am currently requesting a 980GTX replacement...

btw, regarding the Max 2015 support: they are all much the same, and the K4000 is supported. I've used Quadros at different companies over many years, as well as GTX cards; the only things Quadros are good for are AutoCAD (they draw wireframes fast, that's about all) and setting them on fire and replacing with GTX cards. It's a marketing sham.



2/5/2015 5:15:46 AM (last edit: 2/5/2015 5:24:32 AM)
9krausec
Totally not arguing that GTX cards are the way to go (I'm also going to be running 980s now), but what the hell does this mean then?

[embedded benchmark image not preserved]
Tests like that confuse me about this topic.




2/5/2015 2:16:59 PM (last edit: 2/5/2015 2:16:59 PM)
Nik Clark
The argument has always been that the Quadro cards are designed to be run at full capacity for days and days without break, whereas gaming cards were not designed for such punishment.

However, gaming cards have excellent cooling now, and are quite robust.


2/5/2015 2:26:13 PM (last edit: 2/5/2015 2:26:13 PM)
Stenrik
@9krausec: I actually hadn't been considering the GTX at all.

Regarding what you said about Nvidia throttling 3D/CAD performance on the GTX cards... is there an easy way around it? Whether it's hardware or software holding performance back, it's the end result I'm concerned with. This makes me still lean towards the Quadro.

Regarding processors in your thread: I already have a motherboard and case, which only allow for one processor and one graphics card, unfortunately. But if you can make a great argument for multiple, I'm open to it.

I've already secured a quad-core 4.0 GHz i7 processor (http://www.newegg.com/Product/Product.aspx?Item=N82E16819117369), but I can eBay it if that's not advised.

To respond to your second post, I'm with you on being confused by tests like this! Hopefully someone can help clarify for us.

-------------------------------------------------------------------------

@donvella: I actually would be using ActiveShade. Is it a significant benefit?

Also, a big priority of mine is manipulating complex multi-object scenes in the viewport without lag or bounding boxes. When you say Quadros are good for AutoCAD because they draw wireframes fast, that implies they would also be very good at this task.

Regarding driver support, I was talking about support for the K4200, not the K4000. I did see the K4000 on the Max 2014 list, but I have an opportunity to get a 4200 at a good deal right now, so I'm not looking at the 4000.

-------------------------------------------------------------------------

@Nik Clark: I'd expect to be taxing this card for 8 hours per day with some overnight renders here and there, but there would definitely be breaks within the 8 hours as well as around them (I'm human, hah). If I do go with a GTX, should I be looking into some kind of additional cooling then?

-------------------------------------------------------------------------

@Everyone: One thing that hasn't been touched on is how much difference the 4 GB vs. 8 GB of GPU memory makes. I had assumed the 8 GB on the K5200 would be useful for very large high-poly scenes, and noticeably better. Can anyone address that aspect?

Edit: I've been reading around and it looks as though viewports don't use all THAT much GPU memory, so I may have answered my own question on that one. :)
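
For a rough sense of scale, here's a back-of-envelope estimate as a Python sketch (the per-vertex layout is an assumption for illustration; real viewport memory use adds textures, framebuffers, and driver overhead on top):

    # Back-of-envelope GPU memory estimate for raw viewport geometry.
    # Assumed layout (illustrative): 12-byte position + 12-byte normal +
    # 8-byte UV = 32 bytes per vertex, plus 12 bytes of 32-bit index
    # data per triangle. Textures and buffers are not counted.
    def geometry_vram_gb(vertices, triangles):
        return (vertices * 32 + triangles * 12) / 1024**3

    # A very heavy scene: 50 million vertices, 100 million triangles
    print(f"{geometry_vram_gb(50_000_000, 100_000_000):.2f} GB")  # ~2.61 GB

Even a scene that heavy keeps its raw geometry well under 4 GB, which is consistent with the idea that viewports are rarely VRAM-bound.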

Thanks,

~Sten
2/5/2015 7:15:44 PM (last edit: 2/5/2015 7:53:36 PM)
9krausec
I've read a few articles about ways around the throttling, but they were sketchy at best. Something about actually modifying the hardware to get the computer to recognize a specific GTX card as the K5-something Quadro.

So no.. no easy 100% proven way that I've found.

The only reason you'd want multiple CPUs would be better render times (during rendering, all cores of both processors get utilized). While running software interactively, though, usually only one CPU is being used. For this reason I'm going with a single CPU in my build.

If I were building a personal computer for home use, I'd reconsider dual Xeons, because I'd be relying on my machine for rendering as well as general work (no render farm at home).
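
That trade-off is essentially Amdahl's law. A minimal Python sketch (the parallel fractions are illustrative guesses, not measurements):

    # Amdahl's law: speedup on n cores when a fraction p of the work
    # is parallelizable: 1 / ((1 - p) + p / n).
    def speedup(p, n):
        return 1 / ((1 - p) + p / n)

    print(f"{speedup(0.98, 16):.1f}x")  # rendering (~98% parallel) on 16 cores: ~12.3x
    print(f"{speedup(0.20, 16):.1f}x")  # interactive modeling (~20% parallel): ~1.2x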




2/5/2015 8:00:10 PM (last edit: 2/5/2015 8:00:10 PM)
Mr_Stabby
>>For the new build, I can't decide between the Quadro K5200 or K4200. Obviously the 5200 is better, but how much better?

check here: http://en.wikipedia.org/wiki/Nvidia_Quadro

Video card performance within the same chip family scales roughly linearly with the product of core clock × core count. The K5200 is at 650 MHz × 2304 cores = 1,497,600; the K4200 (780 MHz × 1344 cores) comes to 1,048,320. Divide one by the other and you get 1,497,600 / 1,048,320 ≈ 143%, i.e. about 43% faster on paper.
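
The same arithmetic as a minimal Python sketch (core counts and base clocks are the published spec-sheet figures; the linear-scaling rule is a heuristic, not a benchmark):

    # Heuristic: within one chip family, throughput ~ cores * clock.
    cards = {
        "K5200": {"cores": 2304, "clock_mhz": 650},
        "K4200": {"cores": 1344, "clock_mhz": 780},
    }

    def score(c):
        return c["cores"] * c["clock_mhz"]

    ratio = score(cards["K5200"]) / score(cards["K4200"])
    print(f"{ratio:.2%}")  # 1497600 / 1048320 -> ~142.86%, i.e. ~43% faster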

>> Better enough to justify paying 800 dollars more?

That's rather subjective. I wouldn't pay $800 total for both of them combined, since the K5200 (a downclocked GTX 780) is worth about 3.5 TF, which is roughly the equivalent of a $350 GTX 970.
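
For reference, theoretical FP32 throughput is usually figured as 2 FLOPs per CUDA core per clock; a quick sketch (base/reference clocks assumed, so boost clocks would land higher):

    # Theoretical single-precision throughput: 2 FLOPs/core/clock.
    def fp32_tflops(cores, clock_mhz):
        return 2 * cores * clock_mhz * 1e6 / 1e12

    print(f"{fp32_tflops(2304, 650):.2f} TF")   # Quadro K5200 at base clock: ~3.00 TF
    print(f"{fp32_tflops(1664, 1050):.2f} TF")  # GTX 970 at reference clock: ~3.49 TF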

>>So no.. no easy 100% proven way that I've found.

Here's some nice fellas who figured out the K6000: http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/msg207550/#msg207550

Edit: oops, that's for the K5000; the GTX 780 Ti -> K6000 mod was somewhere in that thread too, but I forgot where and it's a long-ass thread.

2/6/2015 9:40:19 PM (last edit: 2/6/2015 9:43:31 PM)
Stenrik
@9krausec: Yep, that's what I thought; makes sense. I have a laptop for doing "other" work during renders, so I'll probably stick with a single CPU.



@Mr_Stabby: Thanks for the breakdown. I wasn't sure how "43% faster on paper" translated to real-world use. Would it mean viewport FPS would be 43% higher? Probably more to that story.

I keep hearing that it's about the drivers, not the numbers, so I'd assumed the Quadro was the way to go. But now I've been seriously considering just switching to the GTX 980 instead. Energy efficiency and error checking don't matter as much to me, but I'm still in the dark about exactly how different the gaming-targeted cards are. Some people on forums swear they are very, very different in approach, insisting that Quadros beat performance-equivalent GTXs by a long shot for things like complex viewport scenes (http://forums.cgsociety.org/showpost.php?p=7274895&postcount=5). OK, so let's assume they're right for a sec:
At what point does a "faster hardware" card like the GTX 980 compensate for what it lacks in driver capability?

I did this head-to-head: http://www.game-debate.com/gpu/index.php?gid=2382&gid2=1700&compare=quadro-k4200-vs-geforce-gtx-980-4gb

and here: http://www.videocardbenchmark.net/compare.php?cmp%5B0%5D=2953&cmp%5B1%5D=2944

HOWEVER, then I saw this breakdown article: http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-5000_6.html#sect1
Granted, this is five years old, and a lot can change in that time. "SPECViewperf also makes it clear that using gaming cards for professional applications is hardly appropriate. Although the GeForce GTX 470 has the same architecture as the Quadro 5000 and even works at higher frequencies, it has problems processing complex models via OpenGL. The speed of the gaming card is much lower than that of the specialized solutions with optimized drivers."
This is an actual test in Max 9: http://www.xbitlabs.com/articles/graphics/display/nvidia-quadro-5000_7.html
Interestingly, the Quadros are indeed far superior in wireframe mode. I often do use wireframe mode in camera view to prevent lag, but it looks as though the gap with the GTX is narrower in other modes. Again, I'm not sure how relevant this comparison is today.

It's hard to believe that driver throttling could make up for how much better the GTX is numbers-wise, though.



Also, it's cool to know the hack exists! I'm not hardware-savvy and my time is limited, though, so I'm a little wary of attempting something like that myself.

2/7/2015 12:23:29 AM (last edit: 2/7/2015 1:45:12 AM)