Reputable Contributor AbnRanger · Posted June 13, 2013

...it ain't a good match. Let's just put it that way. If you use an AMD graphics card, you're screwed (CUDA only; no OpenCL). If you use an AMD CPU, you're screwed too. Andrew used Intel's Threading Building Blocks (TBB) library as the core of the multi-threading in 3D Coat. That's good news...if you own an Intel CPU. If, however, you have an AMD CPU (regardless of how well it performs in games and even in rendering apps), keep some KY jelly around.

http://www.osnews.com/story/22683/Intel_Forced_to_Remove_quot_Cripple_AMD_quot_Function_from_Compiler_

"However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string," Fog details. "If the vendor string says 'GenuineIntel' then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version." It turns out that while this is known behaviour, few users of the Intel compiler actually seem to know about it. Intel does not advertise the compiler as being Intel-specific, so the company has no excuse for deliberately crippling performance on non-Intel machines. "Many software developers think that the compiler is compatible with AMD processors, and in fact it is, but unbeknownst to the programmer it puts in a biased CPU dispatcher that chooses an inferior code path whenever it is running on a non-Intel processor," Fog writes. "If programmers knew this fact they would probably use another compiler. Who wants to sell a piece of software that doesn't work well on AMD processors?" In fact, Fog points out that even benchmarking programs are affected by this, to the point where benchmark results can differ greatly depending on how a processor identifies itself.
Ars found out that by changing the CPUID of a VIA Nano processor to AuthenticAMD you could increase performance in PCMark 2005's memory subsystem test by 10%; changing it to GenuineIntel yields a 47.4% performance improvement! There's more on that here [print version; the regular one won't load for me]. In other words, this is a very serious problem.

I bring this up because, as I mentioned elsewhere, I had to use a backup PC recently, which had an i7 950 as the CPU. I tested how well it performed using large brushes on a 4K map, thinking the extra two threads might help. It's almost like night and day. Using an AMD Phenom X6 running at 4.1 GHz with 1866 memory, 3D Coat bogs down with large texture maps and large brushes. Even after upgrading the video card (thinking that would help, having a much bigger RAM buffer), it made no difference.

So, to summarize: this Intel library business means you can blow a wad of cash upgrading components, but it won't really matter. 3D Coat runs slower on AMD components, and that's the short version of it. That means you have to buy into Intel and NVIDIA if you want to get the most out of 3D Coat.
Contributor BeatKitano · Posted June 13, 2013

Yep
Member moogaloonie · Posted June 14, 2013

So, how do I change my CPUID? Thanks for giving me something to think about. I've never had an Intel desktop, but this is a good reason to consider Intel for my next machine. It's also the kind of shady anti-competitive behavior I have always expected of Intel, so there's that as well. So even with NVIDIA/CUDA, AMD systems are sluggish with 3D-Coat? That really surprised me.
Reputable Contributor AbnRanger (Author) · Posted June 14, 2013

Quoting moogaloonie: "So even with NVIDIA/CUDA, AMD systems are sluggish with 3D-Coat? That really surprised me."

Not to the point that it's unusable or that you can't get decent performance, but there is a very noticeable difference. I was stunned to see how much better an i7 950 (not even the top model in that line) was able to paint with a large brush. The AMD machine was my main one...until I found this out. I was even thinking about upgrading the CPU to an FX 8350, but I'm afraid that won't matter either, for this very reason. And as much as I'd like to go with an AMD card every once in a while, very few software vendors are writing for OpenCL. It's not as mature as CUDA, and I think there are more tools in the CUDA toolkit to make it easier for developers to write GPU code.
Advanced Member alvordr · Posted June 15, 2013

Actually, I have an AMD CPU and 3DC has no lag or significant performance issues for me. However, I stick with NVIDIA cards. I noticed that you have the same CPU, but your clock speed was higher. Did you overclock it or buy it overclocked? I never do this, as I've found over the years that overclocking isn't worth it. It tends to push the temperatures way up and can cause problems.

Edited June 15, 2013 by alvordr
Carlosan · Posted June 16, 2013

Anyone see this news? "AMD unveils first eight-core 5GHz processor": AMD introduces today its FX-9000 Series, a family of eight-core processors said to be the first in the world with a clock speed of up to 5 GHz.
Reputable Contributor AbnRanger (Author) · Posted June 16, 2013

Quoting alvordr: "Did you overclock it or buy it overclocked? I never do this, as I've found over the years that overclocking isn't worth it."

This CPU (Black Edition) has a lot of headroom for overclocking. It can run at 4 GHz comfortably, and I use a good CPU cooler (Zalman CNPS 12X). That isn't the issue. It's an issue of large brush sizes with large texture maps (4K+). The performance bottleneck threshold with the AMD CPU is much lower than with the i7. Even Andrew said he's sure Intel pulls some dirty tricks in this regard.
Advanced Member alvordr · Posted June 16, 2013

AbnRanger, yeah, I can't speak to it entirely, except that they used Intel CPUs at our school, and in my long experience with IT systems I haven't been impressed with Intel chips. That said, you've got more first-hand knowledge here when it comes to comparing both with 3DC. I have an Intel-based laptop, but its specs are lower. I just haven't really experienced any major lag issues with my desktop.
New Member cpberi · Posted July 20, 2013

I might be an oddball here, but I was wondering about the Mac version. I have an iMac with an NVIDIA graphics card (GTX 680MX, 2 GB) that I use for work. I assumed the Mac version of 3DC doesn't use CUDA, or does it?
Advanced Member alvordr · Posted July 20, 2013

I wish I could help here. I've got a MacBook Pro; its specs are likely lower than your iMac's. Not sure on CUDA...can't recall.
Andrew Shpagin · Posted July 20, 2013

The Mac version does not use CUDA, but it is very fast anyway.
Advanced Member michalis · Posted July 20, 2013

The OS X 3DC builds don't support CUDA. Is this what you meant, Andrew? "Mac does not support CUDA" means something else; OS X itself does support CUDA (OS X 10.7 through 10.8.4). Checked and confirmed under Blender Cycles GPU (CUDA) rendering, with excellent results as expected. BTW, the OS X 3DC builds run wonderfully on my machine (dual Xeon, 16 threads, ATI HD 5870, 24 GB RAM). The 5870 is still a decent GPU for OpenGL; NVIDIA cards, not so much.
New Member cpberi · Posted July 21, 2013

Thank you guys =) Anyway, I don't have many complaints about the performance of 3DC on Mac compared to PC. I was just wondering if an even faster version was lurking somewhere.