Contributor Greg Posted March 4, 2013
Hi, I've upgraded my computer, but it came with a pretty average AMD video card. I'm not going to spend a bunch of money, but I found an EVGA GeForce GTX 650 card for either $160 (1 GB memory) or $180 (2 GB memory). It has 768 CUDA cores, which I guess is good, and GDDR5 memory. And I wouldn't have to replace my 480 W power supply to use it. Also within my price range of under $200. Can any of you who are 'in the hardware know' give me your thoughts on it? Thanks in advance!
Greg
Contributor Tony Nemo Posted March 4, 2013
Go with 2 GB of memory.
Advanced Member L'Ancien Regime Posted March 4, 2013
I'd go for the 2 GB GDDR5 card if I were you.
Contributor Greg (Author) Posted March 4, 2013
Thanks, guys. I'm also looking at the GeForce GTX 660. It's about $20 over my desired price range after rebate. Where does it end?
Reputable Contributor AbnRanger Posted March 4, 2013
> Thanks, guys. I'm also looking at the GeForce GTX 660. It's about $20 over my desired price range after rebate. Where does it end?
If you can, spring for the 660 Ti w/ 3 GB. With the CPU (Intel vs. AMD) arms race officially over (AMD publicly threw in the towel and stated they are focusing on the budget and mobile device markets and are no longer competing with Intel for the high-end market), it makes more sense to put your chips in the GPU segment. 3D Coat can handle larger maps and brush radii with more GPU memory, not to mention zip around a lot faster in the viewport with more GPU muscle.
Contributor Greg (Author) Posted March 4, 2013
I may consider that one too, Don, thanks. A bit more than I'd wanted to spend, but I could easily talk myself into it, lol. Gonna think real hard about the three choices and will decide one day.
Greg
EDIT: I ordered the GTX 650 Ti with 2 GB RAM. Decided to stay in budget for a change. Thanks to all who responded.
Advanced Member michalis Posted March 5, 2013
2 GB of RAM is not much. Actually, this is why all serious renderers are CPU-based and not GPU-based. 6 GB of VRAM costs too much, but it is something. A CPU-based engine with 24 or 32 GB still looks better. Just saying.
Reputable Contributor AbnRanger Posted March 5, 2013
> 2 GB of RAM is not much. Actually, this is why all serious renderers are CPU-based and not GPU-based. ...
He just upgraded to LightWave 11.5, didn't you, Greg? VPR is still CPU-based, so he wouldn't necessarily see any big benefit from a huge amount of VRAM. 2 GB will serve him fairly well in 3D Coat, so I can see why he chose that model.
Reputable Contributor digman Posted March 5, 2013
I hope to get two NVIDIA Titans when I upgrade... 6 GB of RAM and 2,688 CUDA cores on each card, and Blender will use both cards for Cycles GPU rendering. I know Blender Cycles will only see the 6 GB on each card and use it that way, not as one big lump total, but that is OK. How it divides up the scene memory for each card, I have no idea. I can also render with just one card and free up the other, so I can continue to work in Blender or other applications. I waited a long time between upgrades, so I want these two babies...
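(As digman notes, a multi-GPU Cycles setup duplicates the scene on each card, so usable VRAM is bounded by the smallest card, not the sum. A minimal plain-Python sketch of that constraint, with hypothetical card sizes:)

```python
# With duplicated-scene multi-GPU rendering (as in Cycles), each card
# holds a full copy of the scene, so memory is NOT pooled across cards.

def usable_vram_gb(cards_gb):
    """Largest scene (in GB) the setup can render: limited by the
    smallest card, since every card needs a full scene copy."""
    return min(cards_gb)

def total_vram_gb(cards_gb):
    """Naive sum of VRAM -- what multi-GPU does NOT give you."""
    return sum(cards_gb)

# Two hypothetical 6 GB Titans:
titans = [6, 6]
print(usable_vram_gb(titans))  # 6  -- not 12
print(total_vram_gb(titans))   # 12 GB of silicon, but only 6 GB per scene
```

So adding a second identical card roughly halves render time but does not raise the scene-size ceiling at all.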
Contributor Tony Nemo Posted March 5, 2013
How big a PSU do you need to power them?
Contributor BeatKitano Posted March 5, 2013
I've had a few bad experiences with gamer-card SLI, Digman. I hope you're luckier.
Reputable Contributor digman Posted March 5, 2013
A very, very big one... Each card uses 250 watts, and a 600-watt power supply is the minimum for one card plus running the computer. I would need a good-quality 1600-watt power supply, priced at about 400 dollars. I need the extra to be safe and for running the CPUs as well.
http://www.geforce.c.../specifications
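(digman's numbers can be sanity-checked with simple arithmetic. A rough plain-Python sketch: the 250 W per-card figure is from his post, while the base-system draw and headroom factor are assumptions, since real PSU sizing also depends on CPU TDP and supply efficiency:)

```python
# Rough PSU sizing: sum the big component draws, then add headroom so
# the supply runs well below its rated maximum.

def recommended_psu_watts(gpu_watts, gpu_count,
                          base_system_watts=300, headroom=1.5):
    """Estimated PSU rating: total draw times a safety factor.
    base_system_watts (CPU, drives, fans) and headroom are assumptions."""
    total_draw = gpu_watts * gpu_count + base_system_watts
    return total_draw * headroom

# One 250 W card plus the rest of the system:
print(recommended_psu_watts(250, 1))  # 825.0 -> so 600 W really is the bare minimum
# Two 250 W Titans:
print(recommended_psu_watts(250, 2))  # 1200.0 -> digman's 1600 W pick leaves extra margin
```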
Reputable Contributor digman Posted March 5, 2013
> I've had a few bad experiences with gamer-card SLI, Digman. I hope you're luckier.
For rendering with Cycles using CUDA, you do not need the cards to be in SLI mode; in fact, it is better if they are not.
Contributor Tony Nemo Posted March 5, 2013
Octane says to exit SLI mode.
Contributor BeatKitano Posted March 5, 2013
> For rendering with Cycles using CUDA, you do not need the cards to be in SLI mode; in fact, it is better if they are not.
OK, I thought you were talking about an SLI setup, my bad.
Contributor Greg (Author) Posted March 5, 2013
> He just upgraded to LightWave 11.5, didn't you, Greg? ...
Yes, just upgraded and enjoying it. That's also part of why I didn't want to spend way too much on the video card, lol. It should be here on Thursday! Things seemed to run OK on the card that came with the computer, but just OK, and Sculptris wouldn't run on it at all. To be honest, most things still work great on my old computer... I just felt like treating myself. I do that every couple of years.
Advanced Member popwfx Posted March 6, 2013
Apparently, there are some very good external-box power supplies you can get from Newegg if you want to run one of the cards off the internal supply and one off the external. A small cable could be run out the back if you don't have enough space/fans inside your case for two of them, or one monster card... I would have had to do this with a Mac Pro 3,1 in order to get it to run an NVIDIA 580, but I stuck with upgrading that to the 570, and will get the Titans if I build a new PC, or consider the Quadro K5000 if it ever comes out for the Mac like NVIDIA promised last September...
Contributor Greg (Author) Posted March 9, 2013
Just an update... so far, so good with the GTX 650 (2 GB). It seems pretty snappy compared to my old card. Anyhow, I think it was a good purchase.
Reputable Contributor AbnRanger Posted March 9, 2013
> Just an update... so far, so good with the GTX 650 (2 GB). ...
What about texture painting? Have you tried painting on 4K-8K maps? How does it work with larger brush radii?
Contributor Greg (Author) Posted March 9, 2013
No, haven't tried that yet, but I'll let you know. Been swamped with work all week (that's a good thing!). Will update when I try it, though.
Carlosan Posted March 9, 2013
I got the same one; it's a good card. I can paint 4K; haven't tried 8K. With a larger brush radius, no difference in performance. Fast, far superior to my old 9800GT.
Contributor Greg (Author) Posted March 10, 2013
Hey Don, I did a quick test with an 8192 map. Not sure if this is what you meant or not. Anyhow, I accidentally exported the video at 15 fps and pretty poor quality, but it still shows the speed I worked at. I don't think Camtasia slowed my machine at all while recording.
http://vimeo.com/61472931
Anyhow, there was a bit of lag with a very large brush, but I can't imagine I'd ever use a brush that size in the first place. Smaller brushes had negligible lag and very acceptable performance. So, happy with the results.
Greg
Reputable Contributor AbnRanger Posted March 10, 2013
I understand some lag on an 8K map, but large brush sizes have been 3D Coat's Achilles' heel, even on a 4K map. The user shouldn't have to be restricted to a small brush just to use 4K maps or bigger. I showed Andrew a demo of Mudbox using 4K maps, and it just chews through the painting, even with crazy large brush sizes. I was trying to show him that GPU acceleration is the way to go, instead of CPU. This was before AMD publicly threw in the towel in the CPU arms race. So Intel can just drag their feet now and release new CPUs 2-3 times as slowly as they otherwise would, and with less of a performance increase. Some of it could be architecture, but seeing that Mudbox captured the performance crown in both sculpting and texture painting after switching to GPU acceleration, and Mari too is GPU-accelerated... that should be clear enough evidence that CPU isn't the way forward. And as long as it is CPU-bound, I can't see studios using 3D Coat for film or cinematic work, where larger textures are a must and performance can't be that weak using them. It's fingernails-on-a-chalkboard slow. The reason I asked about using your new card at those sizes is that I wanted to see if the new NVIDIA cards made any difference, as the manual states a certain amount of video memory is needed for certain map sizes. I think the bottleneck is probably the CPU and the code architecture.
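(The manual's VRAM guidance that AbnRanger mentions comes down to simple arithmetic: an uncompressed 8-bit RGBA map needs width x height x 4 bytes per layer, before mipmaps, padding, or undo buffers. A quick plain-Python sketch; the 4-layer paint job at the end is a hypothetical example:)

```python
# Uncompressed GPU memory footprint of one square texture layer.

def map_mebibytes(size, channels=4, bytes_per_channel=1):
    """MiB for one size x size layer. Assumes 8-bit RGBA; mipmaps,
    padding, and undo buffers all add more on top of this."""
    return size * size * channels * bytes_per_channel / (1024 ** 2)

print(map_mebibytes(4096))      # 64.0  MiB per 4K layer
print(map_mebibytes(8192))      # 256.0 MiB per 8K layer
# A hypothetical 4-layer 8K paint job already wants ~1 GiB of VRAM:
print(4 * map_mebibytes(8192))  # 1024.0
```

This is why a 2 GB card copes with 4K maps but starts to feel tight at 8K once a few layers, brush buffers, and the rest of the scene are resident.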