Search the Community
Showing results for tags 'gpu'.
Hi, this is my first post on this forum, so if I'm doing anything wrong, please let me know and I'll do my best to fix it. First of all, I'd like to thank you for being such a great community: although I haven't taken part in your discussions, I've read many of your posts and they have helped me a lot.

Although I am still a 3D Coat newbie, I have some previous foundation in drawing and concept design, so it didn't take me long to realize the outstanding power of this software, and I've decided to migrate the majority of my work to it. Unfortunately, my PC doesn't handle 3D Coat very well, so I've decided to build a new one, since this one is more than 7 years old. Since I know almost nothing about computer hardware, I've been searching the net, including this forum, for advice on my future build. But the majority of reviews are aimed at gamers, and I won't play games on this PC; it will be a semi-professional workstation for 3D modeling (3D Coat, Max, SketchUp, ZBrush), Unreal Engine, KeyShot, and Photoshop.

Since I don't have enough money to buy the best equipment, I would like to ask for your advice: what is the best value for the money today for CPU, GPU, and motherboard, and how much RAM do I need to handle concept design and visualization with ease? My first choice was the GTX 970, because I'd read somewhere about CUDA cores, but a few days ago on this same forum I read that they don't play that important a role at all, so now I'm totally confused and don't know whether I should also consider the R9 390. For the CPU I was thinking about the i7-4790K, but then I heard about Ryzen a few months ago, and now I'm even more confused about what CPU to choose than about the GPU. So please help me and give me some advice. Thank you in advance.
Hi all, I bought 3D Coat some months ago, but only now am I starting to use it for real. I need some help improving the performance of 3DC, because I checked (using the GPU-Z utility) and the GPU is not being used at all.

My system is the following:
Asus N7 17"
3D Coat 4.5.19, DX 64-bit, CUDA
Intel i7-3630QM @ 2.40 GHz
16 GB RAM
Windows 10 64-bit
NVIDIA GeForce GT 740M

The software gets stuck trying to make a Vox Layer from a section of a model of about 750,000 triangles. Is this normal? Thanks
AMD's Polaris-based Radeon Pro WX GPUs can create VR content for under $1,000

AMD has scrapped the FirePro brand for its workstation GPUs as it tries to catch up with Nvidia in the professional graphics market. AMD is set to release three new professional GPUs as it looks to begin sunsetting its FirePro lineup. The company said Monday night at the graphics trade show SIGGRAPH that its new Radeon Pro WX 4100, Radeon Pro WX 5100, and Radeon Pro WX 7100 will be based on its latest Polaris architecture and are aimed at professional users.

Specs and pricing of the new cards weren't immediately available, but the images don't indicate that AMD is going for the jugular in terms of performance, unlike arch-rival Nvidia. AMD officials didn't talk performance, but they did say the Radeon Pro WX 7100 will hit Steam's VR performance requirements.

Perhaps more important, said AMD's head of Industry Alliances David Watters, is the way AMD has organized its graphics unit going forward. Watters said competitor Nvidia must contend with its consumer GeForce products competing with its professional Quadro and Tesla lines.
Hello. I am using both Mac and PC. I used 3D Coat 3 on the Mac for a long time, but I've noticed that the 4.5 trial version runs much more slowly on my Mac than on my PC, mainly in the material preview window. My PC has a CUDA-enabled graphics card; my Mac has only the Mac version of the ATI Radeon 5770. It might be that CUDA GPU acceleration really makes the difference, because the PC is way faster. My question is: can 3D Coat 4.5 take advantage of hardware acceleration on the Mac? For example, if I were to upgrade to a new Mac Pro with ATI FirePro cards, would 3D Coat be able to take advantage of the GPU power?