Is the GTX Titan X useless due to no double precision?
debodeebs
Looks like a monster, but Nvidia has removed the double precision, so is it worth buying for real-time rendering (previewing)?
read 611 times
4/27/2015 7:28:49 PM (last edit: 4/27/2015 7:28:49 PM)
LionDebt
Is my keyboard useless just ecause the B key is a bit roken and I need to hit it very hard to make it work?

read 579 times
4/28/2015 1:29:20 AM (last edit: 4/28/2015 1:29:20 AM)
Bolteon
Yes, to both.

-Marko Mandaric

read 574 times
4/28/2015 1:35:54 AM (last edit: 4/28/2015 1:35:54 AM)
digs
Useless might be the wrong word... I would have grabbed one the other week if ye ol' computer shop had one in stock. Went with a 980, much cheaper.
read 570 times
4/28/2015 1:43:23 AM (last edit: 4/28/2015 1:44:56 AM)
Mr_Stabby
Plot twist: both iray and V-Ray RT use single-precision math. Also, I don't know a single commercial application (aside from benchmarks) that uses double precision on GPUs.

read 556 times
4/28/2015 5:17:32 AM (last edit: 4/28/2015 5:17:32 AM)
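Mr_Stabby's point is easy to see in numbers: single precision resolves roughly 7 decimal digits against double precision's 15-16, which is plenty for pixel output. A minimal sketch (pure Python, simulating GPU float32 with the standard library — illustrative, not any renderer's actual code):

```python
import struct

def to_float32(x: float) -> float:
    """Round-trip a Python float (binary64) through IEEE-754 binary32,
    mimicking GPU single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Single precision keeps ~7 decimal digits, double ~15-16:
print(f"{to_float32(0.1):.17f}")  # 0.10000000149011612
print(f"{0.1:.17f}")              # 0.10000000000000001

# A 1e-8 nudge vanishes in single precision but survives in double:
print(to_float32(1.0 + 1e-8) == 1.0)  # True
print((1.0 + 1e-8) == 1.0)            # False
```

Errors around the 7th digit are far below what 8-bit-per-channel output can show, which is why GPU renderers get away with single precision.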
Bolteon
Didn't we have a long-winded discussion a while back about DP vs SP?

I'm obviously talking about double penetration vs single.

-Marko Mandaric

read 552 times
4/28/2015 6:15:13 AM (last edit: 4/28/2015 6:15:20 AM)
Mr_Stabby
I don't remember... I'm too fed up with gangbangs, though; these days I prefer one-on-one romance. In my basement.

read 548 times
4/28/2015 7:46:06 AM (last edit: 4/28/2015 7:46:06 AM)
debodeebs
@LionDebt, with that kind of response we would all be buying shit parts, lol. :) Tbh, I'm fed up with all the hype about GPU rendering, as 80% of it is misleading imho. Mention GPU rendering and you get Corona/Octane etc.; ask about V-Ray RT and you get told it's shit for GPU rendering; then ask what's a good GPU for rendering and you get told to get a Titan, it's the dog's bollocks — but now the Titan X is out you get told it's shit and you're best getting a 970/980. I'm thinking of just going for 4-way 970 4GBs, as I don't think my scenes will use more than 4GB tbh, and to this day I hear more good things about 970s than any other card.
read 520 times
4/28/2015 5:58:52 PM (last edit: 4/28/2015 5:58:52 PM)
debodeebs
Thanks for clearing that up, Stabby, so now I don't have to worry about SP or DP anymore. :) Still think I might stick with four 970s rather than one or two Titan Xs.
read 517 times
4/28/2015 6:03:48 PM (last edit: 4/28/2015 6:03:48 PM)
Bolteon
GPU rendering won't be worth it until it can start accessing common RAM and not just what's packed on the GPU.

-Marko Mandaric

read 494 times
4/29/2015 3:41:27 AM (last edit: 4/29/2015 3:41:41 AM)
Nik Clark
It depends what you are doing, Marko. I can fit most of the scenes I make into 4GB. Product renders aren't very demanding. When I upgrade to an 8GB card I can't see myself having many problems with GPU rendering, and my 970 is much faster at rendering than my i7 machine.

read 475 times
4/29/2015 9:33:38 AM (last edit: 4/29/2015 9:36:23 AM)
Garp
> GPU rendering won't be worth it until it can start accessing common RAM and not just what's packed on the GPU.

You'd still want the data being operated on to be cached in the GPU's memory, not to reside in main memory, which is very far away.
It takes time for electrical charges to travel from one place to the other. Who controls the transfer between a memory stick and the GPU (i.e. whether main memory has become part of the GPU's address space or not) doesn't change the laws of physics.

read 467 times
4/29/2015 10:25:53 AM (last edit: 4/29/2015 10:38:53 AM)
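Garp's locality point can be put in rough numbers. A back-of-envelope sketch, using approximate 2015-era figures (PCIe 3.0 x16 moves about 16 GB/s in practice; the Titan X's GDDR5 is rated around 336 GB/s; the scene size is a made-up assumption):

```python
# Approximate bandwidth figures (assumptions, not measurements):
pcie3_x16_gbs = 16.0      # PCIe 3.0 x16, ~GB/s in practice
titan_x_vram_gbs = 336.0  # GTX Titan X (Maxwell) GDDR5, rated ~336 GB/s

scene_gb = 4.0  # hypothetical working set

t_pcie = scene_gb / pcie3_x16_gbs     # time to fetch over the bus
t_vram = scene_gb / titan_x_vram_gbs  # time to stream from on-card memory

print(f"{t_pcie:.3f} s over PCIe vs {t_vram:.3f} s from VRAM "
      f"({t_pcie / t_vram:.0f}x slower)")
```

So even if main memory became part of the GPU's address space, touching it per-sample would cost roughly 20x more than keeping the working set resident in VRAM — which is Garp's point about caching.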
Bolteon
Yes, obviously it doesn't... but it's still too limited.

When I say it won't be worth much, I mean in a global sense.

Of course it works for product viz... but that's a very small part of VFX.

Ideally, we get a card that carries 32GB of RAM, or GPUs with expandable RAM. But neither of those things will happen anytime soon (if ever), and until one does, it's a step backwards in capability.

Sure, it's faster; but there's no point in owning a Ferrari if it only fits half your body.

-Marko Mandaric

read 447 times
4/29/2015 10:57:10 PM (last edit: 4/29/2015 11:02:40 PM)
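To put numbers on the capacity worry: a quick footprint sketch for a heavy VFX scene (all sizes here are assumptions for illustration, not measurements from any real production scene):

```python
# Hypothetical heavy-VFX scene (assumed sizes, for illustration only):
tris = 50_000_000          # triangle count
bytes_per_tri = 3 * 3 * 4  # 3 verts x (x, y, z) x float32, ignoring sharing
texture_gb = 20.0          # uncompressed 4K/8K texture sets add up fast

geometry_gb = tris * bytes_per_tri / 1e9
total_gb = geometry_gb + texture_gb

print(f"geometry: {geometry_gb:.1f} GB, total: {total_gb:.1f} GB")
# Raw geometry alone fits, but with textures the scene blows well past
# even the Titan X's 12 GB of VRAM.
```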
Garp
Plus, algorithms on GPUs can be a bitch. :\
The difference in architecture is not just the number of cores. CPUs and GPUs are very different beasts.

For example, 3 years ago a guy named Nanjappa published his PhD thesis on an algorithm for 3D Delaunay triangulation using the GPU (which is remarkably well written; I'm working through it right now). Compared to the sequential implementation in the CGAL library, it achieves a performance boost of 5x or 6x. More recently, it's been improved to around 10x. AFAIK, it's the fastest implementation to this day.
On the one hand, a 10x improvement is massive, not a mere 15% or so. On the other, going from one thread of execution on a single core to a myriad of threads on hundreds of cores and getting 'only' a 10x speedup feels like a very poor return on investment.

GPGPU is hard. I see many requests on forums along the lines of 'please, make feature X (deformation, dynamics, whatever) run on the GPU', as if you could just plug your current implementation into the device and it would work.

read 427 times
4/30/2015 8:16:29 AM (last edit: 4/30/2015 8:20:03 AM)
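Garp's "poor return on investment" can be framed with Amdahl's law. Treating the GPU naively as n identical processors (a big simplification — GPU cores are not CPU cores — and the core count here is a hypothetical, not a figure from the thesis):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), p = parallelizable fraction.
# Assumed numbers: the ~10x speedup quoted above, n = 2048 cores (hypothetical).
n = 2048
speedup = 10.0

# Solve for p given the observed speedup:
p = (1 - 1 / speedup) / (1 - 1 / n)
print(f"parallel fraction p ~= {p:.3f}")  # ~0.900
```

Read the other way around: even an algorithm that is ~90% parallelizable tops out near 10x no matter how many cores you throw at it, which is why "just run it on the GPU" requests rarely pan out.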
Bolteon
Yup, it's a painful mess...

-Marko Mandaric

read 414 times
4/30/2015 12:02:35 PM (last edit: 4/30/2015 12:02:35 PM)