3DCoat Forums

Advise on System Configuration


Recommended Posts

  • Advanced Member

Hi,

I'm about to replace my old PC (AMD Athlon II X4 640 + Nvidia GTX 460).

I'm currently comparing two PCs:

1) i5-4440 CPU + Nvidia GTX 770

2) i7-4770 CPU + Nvidia GTX 760

What would your choice be between these combinations?

I'm mainly using 3DCoat and MODO!

I'm mostly interested in getting less laggy brush behaviour at high polycounts.


  • Reputable Contributor

I personally had a bad experience with a GTX 670 (of which the 770 is just a slightly enhanced version) in 3D Coat. So much so that I sold it within a few weeks of buying it and got a GTX 580 3GB instead. The problem was that when you toggle wireframe on, as you typically would when using LiveClay, the navigation performance was just HORRIBLE. I contacted Andrew about it and he said there was nothing he could do on his end, but he had his Nvidia rep get in contact with me. I showed him the video I recorded, showing the difference between it and the GTX 470 I was replacing. He admitted that it was related to a known problem, and that they also didn't expect to see such a dramatic dropoff in CUDA performance compared to the Fermi cards.

 

To this day, the GTX 580 is still just shy of the Titan in terms of GPU computing with CUDA, according to a recent comparison benchmark on Tomshardware.com. I have two different GPU renderers in 3ds Max that use CUDA, and the 580 rocks. And it's a great performer in 3D Coat all the way around, even when I use a really heavy scene with lots of individual components. The main thing that hampers the GTX 600-770 line is that it's the first generation of the Kepler architecture, and Nvidia hamstrung it in terms of GPU computing. They also reduced the memory bus from 384-bit on the Fermi cards to 256-bit. That is why it choked on the wireframe in 3D Coat while the Fermi cards had no problem. They went back up to 384-bit with the Titan and 780.
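For context (this arithmetic isn't from the posts themselves): peak memory bandwidth is just bus width times effective transfer rate, so the 384-bit vs 256-bit point can be sketched in a few lines. The clock figures below are approximate published specs, used only for illustration:

```python
def memory_bandwidth_gbs(bus_width_bits, effective_clock_mts):
    """Peak bandwidth in GB/s: bytes per transfer x million transfers per second."""
    return bus_width_bits / 8 * effective_clock_mts * 1e6 / 1e9

fermi = memory_bandwidth_gbs(384, 4004)   # GTX 580-class (Fermi): ~192 GB/s
kepler = memory_bandwidth_gbs(256, 6008)  # GTX 680-class (early Kepler): ~192 GB/s

print(round(fermi, 1), round(kepler, 1))
```

Note that early Kepler partly compensated for the narrower bus with faster GDDR5, so the raw bandwidth numbers land close together; the GPU-compute cuts described above are a separate issue.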

 

So, my advice would be to go with a GTX 580 3GB (don't settle for the 1.5GB version if you don't have to; the 3GB will make a difference when painting large textures and such, and RAM size is critical if you plan to use a GPU renderer at any point) on eBay, Amazon or somewhere like that. Shop around, because some people are still charging way too much, but you should be able to get one for $200 USD or less.

 

With that extra bit of cash you save, try to get an i7-4930, the 6-core/12-thread model. It's a beast, and not all that much more than the 4770. 3D Coat is multi-threaded when sculpting and painting, using Intel's TBB library, so you'll notice a HUGE boost there right away. Modo, being CPU-only, will also thank you for it.
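As a back-of-the-envelope illustration of why the extra cores matter, Amdahl's law puts a number on it. The 90% parallel fraction below is an assumed figure for illustration, not a measured one for 3D Coat:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal overall speedup when only `parallel_fraction` of the work scales with cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.90  # assumed parallel fraction of a sculpt/paint workload
quad = amdahl_speedup(p, 4)   # 4-core chip (i7-4770-class): ~3.1x over one core
six = amdahl_speedup(p, 6)    # 6-core chip (i7-4930-class): ~4.0x over one core
print(round(quad, 2), round(six, 2))
```

The serial fraction caps the gain, which is why the jump from 4 to 6 cores helps but is not a straight 1.5x.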


  • Advanced Member

I have the same issue with my system. I currently have two GTX 460s; to replace them I would have to go up to a 780 to get a boost in performance. In general, one Fermi core equals roughly three Kepler cores. There are other things that Kepler brings that are nice, like more texture images and better cooling. You might want to wait for the Maxwell cards instead (the GTX 750 is one), but they are not fully supported by CUDA yet; at least, there are still issues with them. The Octane devs have said that they tried to support the Maxwell chip but couldn't get Octane to work with it, and it looks to be a driver issue.

 

To comment on your question, I guess it depends on what you want to do (or how you want to do it). I use Octane, so my focus is on GPUs; if you are using a CPU renderer, then you would want to spend your money on the CPU.

 

Jason


  • Advanced Member

Thank you very much, Jason & Don, for sharing your thoughts on the combinations.

I have to admit that I wasn't aware of the GPU issue.

Is the GTX 580 also better than the GTX 780 for 3DCoat?

Regarding the CPU, I'm not sure if the i7-4930 will fit in my budget.

But I will keep an eye on it.


  • Reputable Contributor

Thank you very much, Jason & Don, for sharing your thoughts on the combinations.

I have to admit that I wasn't aware of the GPU issue.

Is the GTX 580 also better than the GTX 780 for 3DCoat?

Regarding the CPU, I'm not sure if the i7-4930 will fit in my budget.

But I will keep an eye on it.

The 580 is about neck and neck with the 780 in terms of CUDA performance, but the 780 will probably navigate a heavy scene somewhat better than the 580. Nvidia just took a big step back with the early Kepler cards (GTX 600-770). But then again, it comes down to what you use it for. The 780 will cost a few hundred more than the 580, and for that difference you could afford to step up to the i7-4930 instead.


  • Reputable Contributor

Yep. When Nvidia crippled the CUDA cores, they were so kind as to not really advertise that fact...

There were a large number of Blender users who upgraded their video cards thinking they would get a big CUDA core jump for rendering in Cycles.

Many of them were steaming as their 580s beat out the new cards...

 

Check the CUDA compute capability and the FP64 performance before buying if you do a lot of GPU rendering or use CUDA-enabled programs.

 

Personal opinion here: get lots of system RAM as well; it is the cheaper component... I plan on getting 32 GB when I upgrade. Some say it is overkill, but I'd rather have it than not. 3DC uses a lot of RAM to hold undo information as well as your high-polygon or voxel model.


I just purchased a new machine as well.

 

If you're planning for very good performance with 3D software and longevity of your hardware, I would go with a configuration close to this:

 

 

Intel 4770 (or 4770K)

Nvidia GTX 660 or 760 with at least 2GB GDDR5

At least 16GB of DDR3 (DDR4 is coming out soon, if you can wait)

 

 

Regarding the video card, I wouldn't get a 500 series. While it may perform quite close to a newer one, Nvidia has started phasing out services for older products, even if they are still decent hardware (see ShadowPlay running only on the 600 series or higher, for example). I personally went with the 760 with 4GB of RAM, but it was a very close choice between that and the GTX 660. The only reason I went with the 760 over the 660 is because I was testing a piece of software that is GPU-accelerated, and the 760 has a few hundred more CUDA cores. Otherwise I may have gone with the 660.

 

Frankly, I think part of the issue you're having with performance is CPU-related. I have the same issue (currently using an AMD 6-core). Intel edges ahead in performance right now, and in all of my tests between my AMD workstation and my newer Intel laptop, the laptop crushes the workstation at every turn (for reference, the laptop has an Intel Core i7-4700MQ, 12GB of DDR3 and a GTX 765M with 2GB GDDR5; the workstation has an AMD Phenom II X6, 8GB of DDR3 and two GTX 460s, each with 1GB of GDDR5).


  • Reputable Contributor

My point about the GTX 580 is that, for the money, it's a better option than a GTX 600-770, unless you plan to use it to play games. In CG, the 580 beats them black and blue. I know this from firsthand experience.

 

Nvidia GTX 470 vs GTX 670: http://www.screencast.com/t/BrVnQC8UOQXd


  • Reputable Contributor

I tried to preview the last post to see if the embedded code would work here, but it just committed the post... and without being able to edit the post, I'll just have to keep making multiple posts to protect someone's tender sensibilities...

 

So, yeah, you can see the video I sent Andrew and then the Nvidia rep. The early Kepler line (GTX 600-770) is garbage compared to the Fermi cards in 3D Coat, and that isn't even mentioning the CUDA performance. When they dropped the memory bus from 384-bit to 256-bit, that was a quantifiable way to "cripple" the card. They made up for it in other ways for game playing... but for CG work, you are buying a hamstrung horse. Stay away from it. I spent $425 on the GTX 670 4GB card, and after discovering all this mess, I ended up having to sell it on eBay about 3-4 weeks later, for roughly a $50+ loss. I then had to turn around and buy a GTX 580 3GB on eBay, but it only cost about $225.

 

It works better in 3D Coat in almost every measurable way. And it thumps the Kepler cards in GPU-centric renderers (Blender Cycles, Octane, FurryBall, Thea, Arion, Moskito, etc.). So, again... with the money you would save by going with a GTX 580, you could invest in a better CPU (i.e., the i7-4930). You can often find good deals on it on eBay, rather than through a major retailer.

 

http://www.screencast.com/t/BrVnQC8UOQXd


  • Advanced Member

Stuff like this is really frustrating... I was asking a similar question on the C4D Cafe forums, and from the responses, a guy's 470 beat the 770...

Shouldn't the FTC, or whatever that consumer-protection agency is called, be all over this? I know the number of people affected is quite low, but it's still frustrating.

I nearly bought a 770; now I'm looking for a 580 instead.


  • Reputable Contributor

These days, video cards are almost the most expensive item to buy for our computers. We have to adjust our mindset accordingly. I myself would not buy anything under a Titan when upgrading to a new computer.

 

The compute performance and lack of double precision, as PolyHertz pointed out, leave you wanting...

 

From AnandTech... this sums it up fairly well, comparing the 780 to the Titan:

"GTX 780 can offer 90% of GTX Titan’s gaming performance, but it can only offer a fraction of GTX Titan’s FP64 compute performance, topping out at 1/24th FP32 performance rather than 1/3rd like Titan. Titan essentially remains as NVIDIA’s entry-level compute product, leaving GTX 780 to be a high-end gaming product."

 

So even the Titan's baby brother is running around with a bad leg for our type of applications.
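To put the quoted ratios in raw numbers: double-precision peak is just the single-precision peak scaled by the FP64:FP32 ratio. The FP32 peak figures below are approximate published specs, used only for illustration:

```python
def fp64_tflops(fp32_peak_tflops, fp64_ratio):
    """Peak double-precision throughput from the FP32 peak and the FP64:FP32 ratio."""
    return fp32_peak_tflops * fp64_ratio

titan = fp64_tflops(4.5, 1 / 3)    # GTX Titan: ~1.5 TFLOPS FP64
gtx780 = fp64_tflops(4.0, 1 / 24)  # GTX 780: ~0.17 TFLOPS FP64
print(round(titan, 2), round(gtx780, 2))
```

So for FP64-heavy work the 780 delivers roughly a ninth of the Titan's throughput, despite near-identical gaming performance.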


  • Reputable Contributor

Stuff like this is really frustrating... I was asking a similar question on the C4D Cafe forums, and from the responses, a guy's 470 beat the 770...

Shouldn't the FTC, or whatever that consumer-protection agency is called, be all over this? I know the number of people affected is quite low, but it's still frustrating.

I nearly bought a 770; now I'm looking for a 580 instead.

I certainly let my frustrations about all of this be known to that Nvidia rep. I have to believe a LOT of studios have been bitten in the bum the same way and let Nvidia know about it. It seems like a blatant attempt to force consumers in the CG market to buy the workstation cards. I don't like being forced to buy anything, and Tomshardware.com made it clear that even the workstation cards came up lame against the Fermi cards in terms of GPU computing (i.e., GPU rendering). This is why I am waiting to see what Nvidia does next.


It's definitely a sticky situation with video cards, and it depends on how you intend to use them. It's more than annoying that Nvidia isn't being tackled by the FTC to protect consumers, as you pointed out, Aleksey.

 

It's a bit nuts that they have been parting out functionality from previous cards into different branches of cards in the later builds. I almost went with an ATI/AMD video card for my recent workstation build because of it. But there was something specific I wanted from the functionality of the newer video cards (ShadowPlay for tutorials; it works very well).

 

 

We'll see how this plays out in the future. Maybe they'll stop parting out functionality with future card lines.


  • Advanced Member

Thank you all for the hints and tips.

While a Titan is nice, it's out of reach for me, since it would be 2/3 of my budget.

Lol, I just saw a used rig on eBay with triple-SLI Titan Blacks! o-O

How about SLI mode for 3DC? Would there be an increase in performance if I just get another GTX 460, to bridge the time until some better GPUs are available?

(Ah, Javis, I just saw that you have two in your old workstation; what if they were combined with an Intel CPU?)

CPU-wise, I think I will go for the i7-4930K and 16GB of RAM (and upgrade to 32GB later).

It fits the budget, because my boss agreed to pay for my overtime in hardware, "pre-taxed". :D


Hi Peter!

 

I'm not entirely sure, but based on what Andrew said about 3DC using an engine similar to a game's, and the fact that it can go full screen, it should be able to take advantage of SLI.

 

I think if I had both of my 460 cards in the machine, it might perform quite well. I intend to try it out before I open the box of the 760, lol! If it does well, I can forego ShadowPlay entirely and use a software-based recorder. I'll let you know how it goes. :) In the meantime, you could try your 460 in a newer machine to see how it goes too. It would be very good to not need a $290+ video card!

 

 

I think the hardware you chose is good! The "K" Haswells are really nice. Anyway, looking forward to hearing about your experience with the 460 in your new machine, if you try it out.


  • Advanced Member

...my newer Intel laptop, the laptop crushes the workstation at every turn (for reference, the laptop has an Intel Core i7-4700MQ, 12GB of DDR3 and a GTX 765M with 2GB GDDR5; the workstation has an AMD Phenom II X6, 8GB of DDR3 and two GTX 460s, each with 1GB of GDDR5).

 

 

What's your laptop? I'm looking at the ASUS ROG with the same CPU as yours, but it's a bit pricey, I think.


  • Advanced Member

From AnandTech... this sums it up fairly well, comparing the 780 to the Titan:

"GTX 780 can offer 90% of GTX Titan’s gaming performance, but it can only offer a fraction of GTX Titan’s FP64 compute performance, topping out at 1/24th FP32 performance rather than 1/3rd like Titan. Titan essentially remains as NVIDIA’s entry-level compute product, leaving GTX 780 to be a high-end gaming product."

 

So even the Titan's baby brother is running around with a bad leg for our type of applications.

 

This really depends on what applications you want to run on the card. For instance, Octane runs using single-precision numbers, so you can run it on a GTX 780 or 780 Ti at Titan speeds but for a lot less money. If you are running an application that needs double-precision (FP64) performance, then yep, you want to get a Titan or a Quadro card. It's interesting that Octane runs much faster on the gaming cards than it does on the Quadros.

 

Jason


 

...my newer Intel laptop, the laptop crushes the workstation at every turn (for reference, the laptop has an Intel Core i7-4700MQ, 12GB of DDR3 and a GTX 765M with 2GB GDDR5; the workstation has an AMD Phenom II X6, 8GB of DDR3 and two GTX 460s, each with 1GB of GDDR5).

 

 

What's your laptop? I'm looking at the ASUS ROG with the same CPU as yours, but it's a bit pricey, I think.

 

 

 

Hey geo!

 

It's an MSI GE60. Here's the Amazon listing:

http://www.amazon.com/MSI-GE60-2OE-003US-9S7-16GC11-003-15-6-Inch/dp/B00CU9GKTO?ref_=pe_527950_34207380


  • Advanced Member

I think for my next build I'm just gonna stick two GTX 580s into it, for around $500 off eBay.

 

And then when something better comes out, if it comes out, I'll get that.

 

But seeing as two 580s will give stronger performance than a Titan, at least in Thea (minus scene-size capability due to RAM), it seems like 580s are a better choice for me.

http://thearender.com/cms/images/edition13/TheaPrestoGPUCPUBenchmark.pdf

 

As far as laptops go, this thing really grabs my attention, especially if GPU rendering becomes a real thing. I just wish it had a stylus and could flip into tablet mode.

http://www.theverge.com/2014/1/8/5286760/gigabyte-gets-hilariously-serious-about-a-impressively-thin-gaming


  • Reputable Contributor

Thank you all for the hints and tips.

While a Titan is nice, it's out of reach for me, since it would be 2/3 of my budget.

Lol, I just saw a used rig on eBay with triple-SLI Titan Blacks! o-O

How about SLI mode for 3DC? Would there be an increase in performance if I just get another GTX 460, to bridge the time until some better GPUs are available?

(Ah, Javis, I just saw that you have two in your old workstation; what if they were combined with an Intel CPU?)

CPU-wise, I think I will go for the i7-4930K and 16GB of RAM (and upgrade to 32GB later).

It fits the budget, because my boss agreed to pay for my overtime in hardware, "pre-taxed". :D

SLI doesn't help in most CG apps. When using a second card for GPU rendering, you don't even have to have a monitor hooked to it. The renderer sees it, and you can choose your second card to render while working interactively (first card for display purposes only). But when you are ready to do final-frame rendering, you can enable both cards to contribute.


  • Advanced Member

SLI doesn't help in most CG apps. When using a second card for GPU rendering, you don't even have to have a monitor hooked to it. The renderer sees it, and you can choose your second card to render while working interactively (first card for display purposes only). But when you are ready to do final-frame rendering, you can enable both cards to contribute.

Thanks,

Here is my rig so far:

http://pcpartpicker.com/p/3GTZQ

I will use my current case, plus my current HDD as an additional drive, and my GTX 460 until I can grab a decent card.

Not decided yet what to do about the GPU, though.


  • Reputable Contributor

Thanks,

Here is my rig so far:

http://pcpartpicker.com/p/3GTZQ

I will use my current case, plus my current HDD as an additional drive, and my GTX 460 until I can grab a decent card.

Not decided yet what to do about the GPU, though.

I would wait until the GTX 880 hits the market; it has already been revealed. That system looks pretty stout, but if you plan to add a second card for GPU rendering, you will want to bump that PSU up to 1000W+, because each card pulls quite a bit of juice and you would be pushing it at 750W.
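A rough sketch of the PSU arithmetic behind that advice. The TDP figures are approximate published specs, and the 30% headroom is a common rule of thumb, not a hard requirement:

```python
CPU_TDP_W = 130   # i7-4930K-class CPU (approximate published TDP)
GPU_TDP_W = 244   # GTX 580-class card (approximate published TDP)
OTHER_W = 100     # motherboard, RAM, drives, fans: rough allowance
HEADROOM = 1.30   # ~30% margin for load spikes and PSU efficiency

def recommended_psu_watts(num_gpus):
    """Suggested minimum PSU rating for the given number of GPUs."""
    load = CPU_TDP_W + num_gpus * GPU_TDP_W + OTHER_W
    return load * HEADROOM

print(round(recommended_psu_watts(1)))  # one card: ~616 W, so 750W is fine
print(round(recommended_psu_watts(2)))  # two cards: ~933 W, hence a 1000W+ PSU
```

With these assumptions, a single card fits comfortably under a 750W unit, while two cards push the recommendation toward 1000W.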


  • Advanced Member

Thanks,

Yeah, I noticed that: there are some OC cards (GTX 580) with nearly 400-watt consumption. That's really insane.

So, if I grab one of those, I think I will stay with a single GPU.

 

Those Maxwell GPUs seem far more efficient in terms of power and noise.


  • Reputable Contributor

They are, but I am seeing conflicting reports as to its memory bus size. Some reports have it at 256-bit. If that is true, then they are making the very same mistake they did with the early Keplers, which they corrected by going back to where the Fermi cards were (384-bit). I cannot understand their rationale for this move. It's like cranking up the horsepower on a Corvette, but then replacing its transmission with one from a Malibu or Cruze.

 

If they go back down on memory bus size, I am not about to waste my money on one. I personally think it's a cost-cutting move, as those lines may be using gold (for best conductivity). I'm not 100% sure about that, but that is my hunch. They are able to enhance gaming performance through other measures, but an increased bus size seems like something they are trying to avoid. The performance could be significantly better if they increased it to 512-bit (where the GTX 200 line was).

