3DCoat Forums

Anyone with an Nvidia GTX 670/680?


AbnRanger

  • Reputable Contributor

Has anyone upgraded to the newer GTX 670/680, and if so, have you noticed any tangible boost in performance, either with CUDA in the Voxel Sculpting room or with Texture Painting in the Paint Room? I've gotten burned the last two times I did a full upgrade. I was hoping hardware upgrades would help overcome some of the bottlenecks in performance, but I've been a little disappointed. Granted, going from 8GB to 16GB made a difference in the size/complexity of the scene I can handle, but it doesn't help with the bottleneck 3DC has when working with large brush radii.

The card I currently have is a GTX 470 (1.25 GB VRAM), and I remember Andrew mentioning a long time ago that larger VRAM amounts affect the texture size you can work with. So I was thinking of upgrading to the GTX 670 4GB card, but I don't want to just throw money at the problem, only to find out that it's an application architecture issue.


  • Reputable Contributor

I upgraded to a 2 GB video card because of limited system RAM, but 3D-Coat still seems to eat system RAM like crazy. I thought, OK, Andrew said if you want to load larger texture sizes, have more video RAM. It really made no difference, as it seems to consume as much system RAM as before, even when first importing or merging the model.

I believe it has something to do with the undo data. It is storing undo information in system RAM.

Plus, I notice that if I delete layers after merging, my available system RAM shrinks very quickly. I have asked a few times for a setting to clear the undo buffer.

This doesn't quite answer your question, but it might be part of it...


  • Advanced Member

I have a GTX 560, and when I was freelancing at a company they had a computer with a GTX 680 with 4GB; it did not feel any faster at all. :(

I also tried Octane Render, but it does not seem to work; somehow the card is not supported yet. Not sure if this is the same with 3DCoat. Updating the CUDA code might have something to do with it, I suppose. I don't know.


I was reading this:

The thing to remember is that for Nvidia cards, the first digit of a card model is simply the series number, i.e. how recently the card was made. It doesn't reflect anything about a card's performance, except that newer series sometimes bring small improvements, such as support for later shader model versions and DirectX releases.

By the way, this also applies to older Nvidia cards. A GeForce 8800GT is better than a 9600GT, and an 8600GT is better than a 9400GT.


  • Advanced Member


If you paid for the Octane beta, there is a release candidate, version 3, that works with the Kepler GPUs. It works with my GTX 660. Octane is cool; it imports 3D-Coat OBJs with most of the textures in place. I just wish it had displacement mapping.

AbnRanger, I put together a new system with an Intel Core i7 processor, lots of RAM, and a GTX 660 (3GB GDDR5). Everything runs quicker, including 3D-Coat. I also installed Windows 8; what a piece of poop. Basically Windows 7 with miscellaneous annoyances.

Kind of wish I had sprung for the GTX 660 Ti; more CUDA cores.


  • Reputable Contributor


Thanks for the tip on that. I may wait and see what's coming down the pike for the Intel CPUs, and hopefully Andrew and Raul can get around to addressing the performance issues mentioned. I'm VERY concerned that Intel will just sit on its laurels and take its sweet time between major advancements in its chips. AMD has publicly admitted they are no longer engaged in the high-end CPU arms race, and are instead going to leverage their ATI acquisition (from years ago) to attack the budget-level (Trinity APU) market. I currently have an AMD Phenom X6, and I think that was the last chip where they were fighting with Intel for the high-end market.

  • Advanced Member

I was basically in the same place as you are. I had a GTX 465 (or 460) and upgraded to a 580. I didn't notice the huge speed bump I was hoping for, though I'd say it can definitely handle bigger scenes; I just didn't do an apples-to-apples test before and after. If you want, we can make a scene we all share and do some sort of benchmark on it. I'm not sure exactly how we'd do that, but it's something. Maybe something simple, like "I can work with this file smoothly": for example, smooth one of the humans until you can no longer use a radius 0.5 Grow brush on it smoothly.

Full human, one object, smoothed 4x: I can still tumble around the scene easily, but the Grow brush is now slow, about 4-5 seconds to go from chest to hip and back at radius 0.5.


  • Reputable Contributor

On the GPU side, it compares OpenGL performance. I only use DX, plus my concerns are more tool-specific, as in brush performance in the Paint Room and CUDA in the Voxel Sculpting room. I'm probably going to hold off upgrading until Andrew and Raul do something to ensure that 3D-Coat is utilizing the cards to their fullest. Right now, it appears that it doesn't. As long as 3D-Coat is not recompiled for the newer CUDA versions, it won't.
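
To make concrete what "recompiled for the newer CUDA versions" means, here is a minimal hypothetical sketch (not 3D-Coat's build or source): device code is generated per GPU architecture at compile time, so a binary built only for the original 1.x targets can never take a newer architecture's code path, whatever card it runs on.

// Hypothetical example only; not 3D-Coat code. A fat binary carrying device
// code for several GPU generations might be built like this:
//   nvcc -O3 -gencode arch=compute_20,code=sm_20 \
//            -gencode arch=compute_30,code=sm_30 kernels.cu -c -o kernels.o
// __CUDA_ARCH__ is fixed per compile target, so the branch below is decided
// when the code is built, not when it runs.
__global__ void exampleKernel(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

#if __CUDA_ARCH__ >= 300
    data[i] *= 2.0f;              // hypothetical path for Kepler and newer
#else
    data[i] = data[i] + data[i];  // equivalent fallback for older targets
#endif
}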


  • Reputable Contributor

Yeah, I think you're right, although surfing for prices and deals when they happen might not be a bad idea either.

When finalRender R4 GPU is finally released, I will probably spring for an upgrade then. I think right now it would probably be a waste, for the reasons previously mentioned.

  • 2 weeks later...
  • Advanced Member

I'm considering the Quadro K5000. It's not cheap, but given that Mac Pros don't have many options (this is the first in years) and that I also need that card's 4K video support, I'm hoping it will scream for 3D and improve performance with 3D-Coat. I've generally been using my Nvidia 8800 GT, but I was thinking about trying 3DC on my other Windows box, which has an Nvidia 660 Ti. Too bad I can't put one of those in my Mac Pro (I don't know how to hack the BIOS into EFI, if that's even possible; hence waiting for the K5000). I don't mean to hijack, but does anyone here have the K5000 or know how it will work with 3DC?


  • Advanced Member

I bought a GTX 680 (4GB version) to replace my GTX 470.

In real-world usage I see NO difference at all between the two in any 3D application, nor in any GPU-accelerated app I own (Fusion, After Effects, Photoshop, etc.).

Maybe it's great if you do GPU-based rendering, but for anything else I don't see the massive speed increase I was expecting.

-Paul


  • Reputable Contributor


That makes sense, I guess. Why? Because when I open GPU-Z to monitor GPU usage during voxel sculpting, it doesn't even go over 30% usage, so adding three times the number of cores wouldn't matter if the application isn't fully utilizing what's already available. This is why I've been pleading with Andrew to re-compile 3D-Coat's implementation of CUDA (and hopefully expand its usage throughout the application, i.e. the Paint Room). It's still coded for CUDA 1, and we're on CUDA 5. There have been some significant enhancements made to CUDA since Andrew first introduced it into 3DC.
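
For reference, here is a minimal sketch of logging the same utilization figures GPU-Z shows, using NVIDIA's NVML library (illustrative only, not 3D-Coat code; the file name and sampling loop are arbitrary, and sleep() assumes a POSIX system):

// Minimal GPU utilization logger (illustrative sketch only).
// Hypothetical build line: nvcc gpu_watch.cu -lnvidia-ml -o gpu_watch
#include <cstdio>
#include <unistd.h>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);   // first GPU in the system

    for (int i = 0; i < 30; ++i) {         // sample once per second for 30 s
        nvmlUtilization_t util;
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS)
            printf("GPU %u%%, memory controller %u%%\n", util.gpu, util.memory);
        sleep(1);
    }

    nvmlShutdown();
    return 0;
}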

One of those enhancements is specific to the Kepler generation of cards: it offers dynamic parallelism, among other unique technologies. So threads coded for CUDA 1 will not utilize those technologies on Kepler (GTX 600 series) cards. This is why I was afraid to throw down $400+ for a new card hoping it would make a major improvement. Glad I didn't waste it.
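
For illustration, a rough sketch of what dynamic parallelism looks like (purely hypothetical, not anything from 3D-Coat): on cards of compute capability 3.5 or later, a kernel can launch other kernels directly on the device, with no round trip to the CPU.

// Illustrative CUDA dynamic parallelism sketch; requires compute capability 3.5+
// and relocatable device code. Hypothetical build line:
//   nvcc -arch=sm_35 -rdc=true dynpar_demo.cu -o dynpar_demo -lcudadevrt
#include <cstdio>
#include <cuda_runtime.h>

__global__ void childKernel(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= 2.0f;                   // trivial per-element work
}

__global__ void parentKernel(float *data, int n)
{
    // A single parent thread sizes and launches the child grid from the GPU.
    if (blockIdx.x == 0 && threadIdx.x == 0)
        childKernel<<<(n + 255) / 256, 256>>>(data, n);
}

int main()
{
    const int n = 1 << 20;
    float *d_data = nullptr;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    parentKernel<<<1, 1>>>(d_data, n);
    cudaDeviceSynchronize();

    printf("status: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(d_data);
    return 0;
}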


  • 3 weeks later...
  • Contributor

Damn. I was -this- close to buying Gigabyte's GTX 670 4GB and wasting $500. I stumbled upon this thread at the very last moment.

So this card won't give any boost at all compared to a 460 Ti? No smoother sculpting and texturing in 3D-Coat? Man, what a disappointment.

(...) I've generally been using my Nvidia 8800 GT - but was thinking about trying 3DC out on my other Windows box which has an NVidia 660 Ti (...)

Photonvfx, have you tried this yet? Was there a huge difference between the 8800GT and the 660 Ti? I'm interested because I currently still use a 9800GTX 512MB, which I believe is a very similar card to the 8800GT. The thing is that sometimes it's a pain working with larger texture layers in 3D-Coat with so little VRAM (significant waiting time when hiding/unhiding layers). Also, the 9800GTX starts to choke when there are about 12+ million tris on screen, and at 17+ million the program becomes almost completely unresponsive.


  • 1 year later...
  • New Member

Hello everyone, I was planning on upgrading from a GTX 670 to a 970, and I just wanted to make sure that the card would be compatible with my motherboard.

Also, the GTX 670 I have is an Asus model that uses two 6-pin cables to connect to my power supply.

Does the 970 use two 6-pin connectors as well?

I cannot find the specifications on the Newegg website, and I don't want to have to modify my power supply cables.


  • Advanced Member

The Asus Strix 970 uses one 6-pin power connector; it varies by manufacturer. The MSI one uses two, I think. I believe AnandTech had a round-up review which checks this.

The best way is to check the manufacturer's website, I guess.

