3DCoat Forums
alvordr

Dual 650Ti Sale

Recommended Posts

Folks,

I saw this sale this morning and thought it might be useful for someone:

http://www.newegg.com/Product/ComboBundleDetails.aspx?ItemList=Combo.1323525&nm_mc=EMC-IGNEFL052413&cm_mmc=EMC-IGNEFL052413-_-EMC-052413-Index-_-Combo-_-Combo1323525-LM6A

The sale flyer said free shipping on this, and it's $359.99 after the mail-in rebate.

I hope this helps.

Stay away....far, far away....from Nvidia 600 series (Kepler) cards. Great for games, but you will experience major suckage levels in 3D Coat. With wireframe turned on, it is brought to its knees. My old GTX 275 runs circles around the 670 I recently bought. I'm going to sell the 670 and find a 580 on eBay instead. I think it's because Nvidia intentionally crippled the GTX line (starting after the Fermi cards) by cutting the memory bus and ROPs (render output units) dramatically. All they have effectively done is increase the VRAM amounts and the clock speeds. This seems like a clear attempt to steer unhappy customers to their expensive Quadro line. All it's really going to do is tick me off that they would even try this, and I'll end up going back to AMD cards.

If you have a GTX 400-500 series, you're good. No need to waste your money like I did.

Wow. Good to know. I know that my experiences with my 550Ti card don't match what others have had with the same card, so I figured all that discussion mostly had to do with whatever computer config people had. I built mine and have had no issues with it, other than the fact that I can't update my drivers (both the OEM and Nvidia drivers cause issues, so I've been using the same 1+ year-old drivers). The only time that was a problem was when I was beta testing a new renderer that required new drivers. It wasn't worth the issues that arose, so I downgraded my drivers again.

To be fair, the FPS is much higher when wireframe is not turned on, but I asked Andrew if there was something he could do on his end (in 3D Coat), thinking maybe it had something to do with the way 3D Coat handles wireframe in DX or GL. He just passed it off to Nvidia (some rep he knew), and that guy passed it off as drivers, maybe, saying it wasn't happening on his Quadro 5000 (no kidding, Sherlock...I had already mentioned that the problem didn't exist with the 470 I had). I'm pretty ticked off about it and am considering putting something up on YouTube to warn others in the CG community not to waste their money taking a step back.

The problem with that is I don't know if it's an issue with 3D Coat only. I'm thinking it might be, because instead of a true wireframe, 3D Coat draws a shaded wireframe, which is not good for performance. Why is this important? Because you are going to want decent wireframe performance when using LiveClay: you'll have it on so you can see the tessellation or reduction as you sculpt. With that currently crippled on the Kepler cards, it's practically worthless.

Hmm... My 660 gives me no issues on my Lenovo y580 laptop.

Smooth as Butter. :)

With wireframe turned on, navigating about a scene with 7 million or more polys? I have a video recorded showing an 8-million-poly object that was smooth as butter with my 470 but is choppy and laggy with the 670 (4GB).

As a test, start a new scene and choose the Base Human primitive (next to the Mannequin in the splash screen > voxel sculpt option). Navigate about....not terribly bad. Now click the Res+ icon in the VoxTree layer panel, then try to navigate the scene with wireframe toggled on (W key).

Edited by AbnRanger

I will try a few tests with the wireframe turned on later and report back.

I hope I didn't waste my money on a turkey.

I think you got a splendid laptop for the $$$; it's just that Nvidia crippled the Kepler cards in terms of compute ability, and they have reduced the memory bus size with each successive generation....going backwards instead of forwards. My old GTX 275 has a 448-bit memory bus. Guess what the 470 has? 320-bit. How about the 670? 256-bit!!! WTF, over? Why in the world is Nvidia taking big steps backwards? They are putting a V8 in a ride with a transmission made for a 4-banger.

Check out this comparison:

http://gpuboss.com/gpus/GeForce-GTX-670-vs-GeForce-GTX-580
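To put a rough number on what the bus width means: theoretical peak memory bandwidth is just the bus width (in bytes) times the effective memory transfer rate. A quick sketch (the 4000 MT/s clock below is illustrative, not the actual spec of any of these cards):

```python
def memory_bandwidth_gbps(bus_width_bits, effective_clock_mtps):
    """Theoretical peak memory bandwidth in GB/s.

    bus_width_bits: memory interface width in bits (e.g. 256, 320, 448)
    effective_clock_mtps: effective memory transfer rate in MT/s
    """
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mtps / 1000

# At the same effective memory clock, the wider bus wins outright:
for bits in (256, 320, 448):
    print(f"{bits}-bit bus @ 4000 MT/s -> "
          f"{memory_bandwidth_gbps(bits, 4000):.0f} GB/s")
```

To be fair, a faster memory clock can partly make up for a narrower bus (which is how the 600 series gets by with 256-bit GDDR5), but clock for clock, the narrower bus simply moves less data.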

Edited by AbnRanger

The cards in that sale link won't fit in a laptop. I would listen to AbnRanger on this one, as I have no personal experience with the 600 series.

One of the reps for Nvidia (I reported the issue to Andrew with video recordings of both cards, and Andrew forwarded my email to his rep) told me that they believe it is a driver issue identical to a known one, and that they are working to get it fixed in an upcoming driver update. Regarding the CUDA issue (where Kepler took a step back in performance from the Fermi cards even though it has more CUDA cores), he acknowledged that it hasn't worked as well as they had hoped, and said that Kepler cards need to be tuned for the newly introduced technologies (Dynamic Parallelism, Hyper-Q, etc.) by vendors like Andrew. So part of it is on them, and part of it is software companies not recompiling/updating their CUDA code to take advantage of the tools in their hands.
