3DCoat Forums

Kepler cards and wireframe mode


ajz3d


  • Contributor

(I didn't want to post it in the 3D Coat BETA thread where the discussion took place, so I created a new topic)
 

Quote:

Exactly. BTW, Alvordr, I still don't recommend AMD cards, because I've had experiences in other CG apps where the AMD card I had simply refused to work right. I tried updating the drivers multiple times and looked for solutions online, but I could find no answers. So I went out and bought a comparable NVidia card, and sure enough, everything worked great. I've had no problems whatsoever with CG apps since then. I am just ticked at Nvidia for the GTX 600-770 series of cards (Kepler). For some reason, they took a big step back in CUDA performance AND they reduced the memory bus from 384-bit to 256-bit. That purposefully creates a bottleneck, and that's why the Kepler line of cards should be avoided if you use 3D Coat much.

When you use LiveClay and other dynamic-subdivision tools, you want to see the wireframe as you work. As Ajz3d just mentioned, if you have over 1-2 million polys in the Voxel room, having wireframe toggled on brings those cards to their knees. I suspect it is the 256-bit memory bus. The previous line of NVidia cards (Fermi) had no such problem. Why? Because they had/have a 384-bit memory bus. What's really odd is that the generation before that was 512-bit. So instead of holding steady or increasing the size of the memory bus, NVidia has been steadily cutting back.
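To put rough numbers on that bus-width point: peak theoretical memory bandwidth is the effective memory clock times the bus width. Here's a quick Python sketch; the clock figures are approximate published reference specs I'm assuming, not anything measured in this thread:

# Rough peak-bandwidth arithmetic for the cards discussed above.
# Bandwidth (GB/s) = effective memory clock (MT/s) * bus width (bits) / 8 / 1000.
# Clock figures are approximate reference specs (assumed, not measured here).

def bandwidth_gbs(effective_clock_mts, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_clock_mts * bus_width_bits / 8 / 1000

cards = {
    "GTX 285 (pre-Fermi, 512-bit)": (2484, 512),
    "GTX 580 (Fermi, 384-bit)": (4008, 384),
    "GTX 680 (Kepler, 256-bit)": (6008, 256),
}

for name, (clock_mts, width_bits) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(clock_mts, width_bits):.0f} GB/s")

Interestingly, on paper the faster GDDR5 on Kepler roughly offsets the narrower bus (both the GTX 580 and GTX 680 land around 192 GB/s), so if the wireframe slowdown is bus-related, it's more likely about access patterns than raw peak bandwidth.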

Guys, I think something was improved in one of the latest Kepler drivers, because since I installed version 340.52 a few days ago, I've noticed a major boost in viewport speed when working in wireframe mode inside 3D Coat. I ran some tests with a 30-million-triangle mesh and I get 15 fps when rotating the camera, which is a night-and-day difference from what I had before (1-2 fps or even less). On a 4.3-million-triangle mesh I get 103 fps.
And again, all this on a Kepler (GTX 660 Ti), in wireframe mode!
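For what it's worth, those two data points imply an almost constant triangle throughput, which is what you'd expect if the wireframe pass just scales linearly with triangle count. A quick sanity check on the numbers above (same Python, nothing assumed beyond the quoted fps figures):

# Implied triangle throughput from the two wireframe-mode tests above.
tests = [
    (30_000_000, 15),   # 30M-triangle mesh at 15 fps
    (4_300_000, 103),   # 4.3M-triangle mesh at 103 fps
]

for tris, fps in tests:
    print(f"{tris / 1e6:.1f}M tris @ {fps} fps -> ~{tris * fps / 1e6:.0f}M tris/s")

# Both land around 440-450M triangles per second, so with this driver the
# frame rate looks roughly proportional to 1 / triangle_count.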


  • Advanced Member

Hmm, interesting. Thanks for the heads-up! I didn't know about that.

 

Which OS/version are you running? Also, can you run the same tests with the OpenGL build of 3D Coat and see how things stack up?

I have a GTX 770 with the 340.52 drivers; I'll run a few tests. :)

Edited by Nossgrr

  • 2 weeks later...
  • Advanced Member

With an overclocked GTX 670 (256-bit / 2048 MB GDDR5), driver version 331.65, and 3D Coat 4.1.12 DX CUDA on Windows 7 SP1 64-bit, the lowest my frame rate will go with a 30-million-triangle mesh is 17 fps. I downloaded the 340.52 driver yesterday, but haven't installed it yet. Trying to figure out why I can't move, rename, or delete the installer, lol.


  • Reputable Contributor

Quote:

With an overclocked GTX 670 (256-bit / 2048 MB GDDR5), driver version 331.65, and 3D Coat 4.1.12 DX CUDA on Windows 7 SP1 64-bit, the lowest my frame rate will go with a 30-million-triangle mesh is 17 fps. I downloaded the 340.52 driver yesterday, but haven't installed it yet. Trying to figure out why I can't move, rename, or delete the installer, lol.

I had one just like it... and the frame rate would drop through the floor when working on an object of more than 5 million polys with wireframe turned on. With it turned off, everything was fine. I don't know why wireframe would kill the frame rate, but it did. That's why I asked Andrew about it. He put me in contact with his NVidia rep, who admitted it was a "known" issue and said they would try to address it in an upcoming driver. I guess they finally did... a year late. It made the card unusable in my daily workflow, as you NEED wireframe turned on when working with LiveClay or when you are trying to assess the density level of your object.

 

I sold the new card I'd bought on eBay (took about a $75-100 loss in the process) and bought a GTX 580 3GB. I've been really happy with it, especially since it's just below a $1k Titan in terms of CUDA performance. I'm probably going to wait and see what Nvidia has next before I upgrade.

