3DCoat Forums
ajz3d

Kepler cards and wireframe mode


(I didn't want to post it in the 3D Coat BETA thread where the discussion took place, so I created a new topic)
 

Exactly. BTW, Alvordr, I still don't recommend AMD cards, because I've had experiences in other CG apps where the AMD card I had simply refused to work right. I tried updating the drivers multiple times and looked for solutions online, but could find no answers. So I went out and bought a comparable Nvidia card, and sure enough, everything worked great; I've had no problems whatsoever with CG apps since then. I am just ticked at Nvidia for the GTX 600-770 series of cards (Kepler). For some reason, they took a big step back in CUDA performance AND they reduced the memory bus from 384-bit to 256-bit. That purposefully creates a bottleneck, and that's why the Kepler line of cards should be avoided if you use 3D Coat much.

When you use LiveClay and other dynamic-subdivision tools, you want to see the wireframe as you work. As ajz3d just mentioned, if you have over 1-2 million polys in the Voxel room, having wireframe toggled on brings those cards to their knees. I suspect it is the 256-bit memory bus. The previous line of Nvidia cards (Fermi) had no such problem. Why? Because they had/have a 384-bit memory bus. What's really odd is that the generation before that was 512-bit. So, instead of staying put or increasing the size of the memory bus, Nvidia has been steadily cutting back.
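(To put some numbers behind the bus-width point above: theoretical peak memory bandwidth scales linearly with bus width at a fixed memory clock. The 6.0 GT/s effective clock below is an assumed example figure for illustration, not something from this thread.)

```python
# Sketch: peak memory bandwidth = bytes per transfer * transfers per second.
# The 6.0 GT/s effective memory clock is an assumed example value.

def bandwidth_gb_s(bus_bits: int, clock_gt_s: float = 6.0) -> float:
    """Theoretical peak memory bandwidth in GB/s for a given bus width."""
    return (bus_bits / 8) * clock_gt_s

for bits in (512, 384, 256):
    print(f"{bits}-bit bus: {bandwidth_gb_s(bits):.0f} GB/s")
# 512-bit bus: 384 GB/s
# 384-bit bus: 288 GB/s
# 256-bit bus: 192 GB/s
```

So all else being equal, dropping from 384-bit to 256-bit cuts a third of the available bandwidth, which is consistent with wireframe-heavy viewports choking sooner.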

Guys, I think something was improved in one of the latest Kepler drivers, because since I installed version 340.52 a few days ago, I've noticed a major boost in viewport speed when working in wireframe mode inside 3D Coat. I ran some tests with a 30-million-triangle mesh and I get 15 fps when rotating the camera, which is a night-and-day difference from what I had before (1-2 fps or even less). On a 4.3-million mesh I get 103 fps.
And again, all this on a Kepler card (GTX 660 Ti) and in wireframe mode!


Hmm, interesting, thanks for the heads-up! I didn't know about that.

 

Which OS/Version are you running? Also, can you run the same tests with 3DCoat OpenGL and see how things stack up?

I have a GTX 770 with the 340.52 drivers, so I'll run a few tests. :)

Edited by Nossgrr


I'm running Windows 7 x64 Pro.

I get the same frame rate in OpenGL versions of 3D Coat, so they're affected too. B)

Edited by ajz3d


So Nvidia might have finally stopped gimping their consumer cards on the driver end? Great news if that's the case.

Hopefully.


The lowest my frame rate will go using an overclocked GTX 670 (256-bit / 2048 MB GDDR5) with driver version 331.65 and 3D Coat 4.1.12 DX Cuda in Windows 7 SP1 64-Bit with a 30 million triangle mesh is 17 fps. I downloaded the 340.52 driver yesterday, but haven't installed it yet. Trying to figure out why I can't move, rename, or delete the installer lol.


The lowest my frame rate will go using an overclocked GTX 670 (256-bit / 2048 MB GDDR5) with driver version 331.65 and 3D Coat 4.1.12 DX Cuda in Windows 7 SP1 64-Bit with a 30 million triangle mesh is 17 fps. I downloaded the 340.52 driver yesterday, but haven't installed it yet. Trying to figure out why I can't move, rename, or delete the installer lol.

I had one just like it, and the frame rate would drop through the floor when working on an object of more than 5 million polys with wireframe turned on. With it turned off, everything was fine. I don't know why wireframe would kill the frame rate, but it did. It's why I asked Andrew about it. He put me in contact with his Nvidia rep, who admitted it was a "known" issue and said they would try to address it in an upcoming driver. I guess they finally did, a year late. It made the card unusable in my daily workflow, as you NEED wireframe turned on when working with LiveClay or when you are trying to assess the density level of your object.

 

I sold the new card I bought on eBay (took about a $75-100 loss in the process) and bought a GTX 580 3GB. I've been really happy with it, especially since it's just below a $1k Titan in terms of CUDA performance. I'm probably going to wait and see what Nvidia has next before I upgrade.

