3DCoat Forums

Please, Help!!!! 3D-Coat - best pc build -> (AMD Threadripper?)


Rygaard


  • Contributor

Hello!!!!

I hope this post serves many others who would like to do the same thing.

Please, if anyone really knows about PC builds, I would very much appreciate your guidance.

I want to build a new PC after many years, but unfortunately I don't understand much about hardware. I would like to set up a machine mainly for 3D-Coat (of course), plus ZBrush, Blender, Mari, Substance Designer, Marvelous Designer, Krita, Photoshop, Unreal Engine, Marmoset Toolbag and other applications, and also to be able to play games.

I've been following the forum for information, but I couldn't find exactly what I'm looking for, and I was worried about what Andrew Shpagin said: "3D-COAT is essentially multithreaded, so cores will speed up. Intel is better because Intel TBB library used for multithreading, so on Intel it works faster anyway."

Maybe this is the first problem, because the configuration I put together is built around an AMD Threadripper... I've done a lot of research and most people only say positive things, that the Threadripper is better than Intel. I've always had Intel, but... :)

I have a lot of doubts regarding the processor, memory and video card:

Please, could you analyze each item in detail and see if it fits my goal of working with the applications above, as well as games?
Below is the configuration of the PC.
================================================
1.Motherboard:
ASRock X399 TAICHI sTR4 SATA 6Gb/s USB 3.1/3.0 ATX AMD
=====================================================
2.Processor:
AMD Ryzen Threadripper 1950X (16-core/32-thread)
=====================================
3.CPU Cooler:
Ecomaster Technology Enermax ELC-LTTR360-TBP LIQTECH TR4 Threadripper 360MM AIO Liquid CPU Cooler
====================================
4.
I don't know which speed is better: 3000MHz, 3200MHz or above? And which quad-channel kit should I choose?
What do you think???
Memory: G.SKILL 64GB (4 x 16GB) Ripjaws V Series DDR4 PC4-25600 3200MHz For Intel Z170 Platform Desktop Memory Model F4-3200C16Q-64GVK
Or which one would you recommend?
================================================
5.
Video Card: Radeon Pro Duo 32GB (amd 100-506048 - AMD RADEON PRO DUO 32GB GDDR5 POLARIS PCIE 3XDP 1XHDMI)

One person talked about the following video cards:
2x (two) Gigabyte - GeForce GTX 1080 8GB Turbo OC...
--- OR:
2x (two) Zotac ZT-P10800H-10P...,
--- OR:
1x (one) PowerColor AMD Radeon RX VEGA 64 8GB HBM2 HDMI...
--- OR:
1x (one) Radeon Pro Duo 32GB (amd 100-506048).

It seems to me that the Radeon Pro Duo 32GB (AMD 100-506048) would be the best video card, but I would very much appreciate your opinion: which video card would be best for the applications above and for games?

Would it be possible to use the Radeon Pro Duo 32GB (AMD 100-506048) in conjunction with a GeForce GTX 1080 or the Zotac ZT-P10800H?

And how would gaming work with both cards in the system? Would I have the option to choose which card to use for the game? (I know the Radeon Pro Duo 32GB is not a gaming card.)
What do you think?
=====================================
6.Storage:
Samsung 960 EVO Series - 500GB NVMe - M.2 Internal SSD (MZ-V6E500BW)  
=================================
7.Power Supply:
What do you think?
PC Power & Cooling FireStorm Gaming Series 1050 Watt (1050W) 80+ Gold Fully-Modular Active PFC Performance Grade ATX PC Power Supply 5 Year Warranty FPS1050-A4M00  
=====================================

Thank you very much for your attention and I hope everyone benefits from all the opinions here.


  • Reputable Contributor

AMD CPUs used to be at a disadvantage years ago, because Intel would cripple AMD CPUs in apps (like 3D Coat) that used any of Intel's developer-level software for multithreading, such as their Threading Building Blocks (TBB) library. What they would do is check the CPU ID and optimize the code path for Intel, making it run more slowly on AMD chips. They have since lost a major legal case over that, and it's my understanding that they had to fix it, as well as pay punitive damages.

That's how Intel rolls. They've been sued globally scores of times. They play dirty pool, not to mention price-gouge. So I won't be buying another CPU of theirs anytime soon. Watch this and you'll see more details as to why:

 

I upgraded from an i7 970 to an AMD Ryzen 7 1700X, with 32GB of RAM (3200MHz), an ASRock Killer SLI X370 motherboard and a Corsair H115i AIO CPU cooler. Really happy with the performance. No noticeable lag in 3D Coat at all, and it rocks when rendering.
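If you want to see for yourself why core count matters for a heavily multithreaded workload, here's a toy Python sketch (obviously not 3D-Coat's actual code, just an illustration): the same CPU-bound job is run once on a single worker and once across all logical cores, so you can compare the timings on your own machine.

# Toy illustration only: a CPU-bound job split across processes, to show why
# core count matters for a multithreaded app. This is NOT 3D-Coat code.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    # Deliberately dumb busy-work so the CPU, not I/O, is the bottleneck.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [2_000_000] * 64                # 64 equal chunks of work
    for workers in (1, os.cpu_count()):      # serial run vs. all logical cores
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(burn, chunks))
        print(f"{workers:>2} worker(s): {time.perf_counter() - start:.2f}s")

On a 16-core/32-thread chip like the 1950X, the parallel run should finish many times faster than the single-worker run; that scaling is the whole point of paying for cores.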


  • Contributor

Hi AbnRanger, thanks for your comments!

I am very happy to hear you are satisfied with your current AMD configuration. I was very apprehensive about putting together an AMD Threadripper system, afraid that when running all the programs I mentioned above, and 3D-Coat in particular, the program would be completely hampered by the multi-threading business... such as that Threading Building Blocks library.

About the video card: watch the YouTube video (AMD Pro Duo) that compares the card in Blender's GPU benchmarks and also shows the AMD Pro Duo used alongside another video card.
Why wouldn't you recommend an AMD Pro Duo combined with another video card?
I don't know what to do then... I thought I could put two cards in the system and have the professional card working together with a more gaming-oriented card for 3D work, further improving the machine's performance. And when I wanted to play, I could choose the gaming card for that.

If I could build a system with two video cards, which other card could I pair with the AMD Pro Duo?


About the memory: I intend to go to 128GB one day. What memory do you recommend? 3000MHz, 3200MHz or above? Which brand?

And... what about the configuration I listed, is it good? Anything to change?

https://www.youtube.com/watch?v=To_N2xT6Xa8

thank you

 

 


  • New Member

Hey!

I just recently joined the AMD 1950X club myself!  Micro Center has a KILLER deal on it right now, $700 USD.  I went ahead and got a Gigabyte X399 Aorus Gaming 7 motherboard with it too, at a discount.  Though it's all still sitting in the boxes (still waiting on other parts).

I've been trying to do as much research as I can as well.  While Intel's new chips seem to outshine AMD in single-threaded performance, Threadripper's multithreaded performance is incredible for the most part.  Again, being able to get mine as cheaply as I did definitely helped with the decision. I snagged the same Enermax AIO as well and look forward to getting it soon.

As for RAM, that seems to be the biggest conundrum so far.  From my understanding, RAM with a CAS of 14 (C14) is ideal for getting the most out of a Ryzen chip.  As for speed, I'm not sure how big a difference 3000 vs 3200 will make; I'm sure I'd never notice it.  Also, it seems RAM built on the Samsung B-die architecture is best for Ryzen.  The TR4 platform is quad-channel, which supposedly means it runs most efficiently with 4 or 8 DIMMs.  With RAM being so terribly expensive now, I only ordered 2x8GB sticks for the moment.  It won't be optimal, but at least it'll be enough to get it up and running.  Hopefully prices come down and I can start maxing out those slots!

I'm going to use my old GTX 770 for the time being.  A new GPU isn't in the "cards" (pun intended!) right now.  So I really can't comment on GPUs, aside from: the more VRAM the better.  I do know that Nvidia cards use CUDA cores, which, depending on your application of choice, can be very important.

Well I hope this helps!  Definitely not an expert so feel free to fact-check all this.  Best wishes on your build!  I hope to have mine up this weekend.

- Brad

 

Here's a thread I've been browsing to try to make sense of it all.  

 

 


  • Contributor

Hi Brad,

Thank you very much for your comment!
Really, there's no way to resist joining the AMD 1950X club. I believe we will not regret it! :)
I've been doing a lot of research on the internet about memory that is optimized for the Threadripper 1950X; people talk about quad-channel kits and show models that are very expensive. I did not know that a CAS of 14 (C14) was the ideal. On speed, I'm lost :( because I don't know whether it affects anything. I also did not know about the Samsung B-die architecture... Thanks for the tips!
If you don't mind, which memory did you decide to use?

The video card I was interested in was the AMD Radeon Pro Duo 32GB, but I am undecided because I would not be able to play some games; my current video card is an Nvidia Quadro 600 and I can't game on it. Of course I will prioritize my work, but being able to play a little should be allowed! :)

So I thought about getting the AMD Radeon Pro Duo 32GB and, in the future, adding another video card running at the same time. I don't know if my reasoning is right, but the two video cards could improve performance in the 3D programs, and when I wanted to play a game I could choose the gaming-oriented card for that. What do you think?


But it looks like this would not be recommended, as AbnRanger commented.

Thanks for your help and I hope you are very lucky with your new machine too!


  • Advanced Member

I just bought a computer with a Ryzen 7 1700 a week or two ago with a GTX 1080, and 3D-Coat runs butter smooth.  I'm running 16GB RAM at 3000MHz and have had no problems.  With the system specs you are going for, you aren't going to have any issues.

For the video cards, if you are thinking about going that high end, I'd suggest looking at the GTX 1080 Ti, which has significant performance advantages over the 1080: it has 11GB of memory, and in all the benchmarks I watched it was around 20% faster than the 1080, which was otherwise the fastest card in general (the Vega 64 LC matched the 1080 or was a little faster in some production tests, particularly Blender, but it didn't come close to a Ti).  I wasn't looking closely at the pro cards because they were out of my price range, but my understanding from what I saw was that the 1080 Ti was the more worthwhile buy, though I didn't see any hard data like benchmarks to confirm whether that was true.

Edited by Falconius

  • Contributor

Hi Falconius, thanks for your comment!
It's great to know that 3D-Coat runs very well on AMD hardware in your PC build. I'm glad to hear that!
Every time I tried to set up a PC before, I was not successful, because the person who guided me did not know much about the 3D programs we are used to, and recommended a build that, once assembled, did not have the performance I was expecting, which left me very frustrated.
In recent days I've done a lot of research and put together the configuration I posted here.

About the memory, I was afraid to choose something that was not appropriate for the system, and besides, there are so many brands and specifications that I get confused about what would be right.

I'll try to find out more about the GTX 1080 Ti video card; from your comment it sounds excellent. Thanks for the guidance!

Looking at reviews on YouTube and Google, I took note of the AMD Radeon Pro Duo 32GB (which is a pro video card). In the YouTube link I posted above, the guy demonstrated this card with Blender. It scored well, and what caught my attention was his comment that with the Radeon Pro Duo 32GB he could use another program at the same time while rendering without any problems, unlike with the two 1080 cards, which locked up the system while rendering and made you wait for the render to finish before you could use the PC. He also paired the Radeon Pro Duo 32GB with an Nvidia 1080 and got better overall system performance that way.

As I said earlier, what do you think about two cards working at the same time in the system? My intention was to have a pro card like the AMD Radeon Pro Duo 32GB (for professional 3D work, etc.) and in the future add a card like the 1080 Ti (as you suggested) to be able to play games and further increase performance in 3D work. I don't know whether I would gain anything from using both cards at the same time for 3D work; when I wanted to play, I would choose the 1080 Ti, for example.


  • Reputable Contributor
14 hours ago, Rygaard said:

It's good that Blender Cycles can utilize both an AMD and NVidia card, simultaneously, but you probably won't find that compatibility in other GPU rendering engines. In fact, most GPU engines still require CUDA enabled cards (NVidia only, as that is their exclusive technology). That's why I have a GTX 1080 & 1070. It allows me to render with any GPU render engine, and it's really fast. However, you have Cycles and AMD's ProRender GPU engines that can use AMD cards right now. VRayRT does have an OpenCL mode in 3ds Max and Maya, so, it can render with AMD cards as well.
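For what it's worth, here is roughly how that Cycles device selection looks from Blender's Python console. This is just a sketch for a recent Blender build; the exact property names have moved around between Blender versions, so treat it as illustrative rather than gospel:

# Sketch: picking Cycles' compute backend and enabling GPU devices via Blender's
# Python API. Assumes a recent Blender build; older 2.7x releases exposed this
# under bpy.context.user_preferences instead, so adjust for your version.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # NVidia cards; AMD cards would use the OpenCL backend
prefs.get_devices()                  # refresh the list of detected devices

for dev in prefs.devices:
    dev.use = True                   # tick every detected device (GPUs, optionally the CPU)
    print(dev.name, dev.type, dev.use)

bpy.context.scene.cycles.device = "GPU"   # tell the current scene to render on the GPU(s)

Whether a mixed AMD + NVidia setup behaves depends on the backend you pick, so check it against the Blender version you actually install.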

With all of that said, the AMD Pro Duo is effectively two workstation GPUs on one card, hence the 32GB (16GB per GPU). Since it is a workstation card, it should handle dense geometry in the 3D viewport much better than any GTX/gaming card. But since the AMD Pro Duo is seen by 3D programs like Blender, 3ds Max, Maya, etc. as two separate GPUs, there is no real advantage to the second GPU or its dedicated 16GB of VRAM. That means, in those applications, you are only using one of those GPUs and 16GB of VRAM for everything except GPU rendering.

The new Radeon Vega Frontier Edition is on sale at Newegg for $699.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814105073

It's basically a blend between the gaming performance of a gaming card (on par with the GTX 1080) and all the benefits of a workstation card, with 16GB of VRAM. So, if you want a relatively inexpensive workstation card that is very good at gaming, too, that might be the best option.

I went with two Nvidia GTX 10xx's because they hit on compatibility (CUDA & OpenCL), pricing, performance, and energy efficiency. Granted, each card has only 8GB of VRAM, but that is more than adequate for what I may use it for. Plus, some GPU render engines have "out of core" support, meaning they will still render if you run out of VRAM; they access system RAM in that case. Two that I know of are Redshift and, now, AMD's ProRender. In 3ds Max, and maybe Blender too, ProRender can render using both the GPU and CPU simultaneously.

So, it really comes down to whether you REALLY need more than 8GB of VRAM on your primary display card. If you do, then either of the AMD workstation cards would be the best cost/benefit option. The 1080Ti has 11GB of VRAM, so that might be another option to consider. 


  • Reputable Contributor

As for memory, there has been quite a change in RAM compatibility since the first few months after Ryzen 7 was released. Early on, the CAS 14 / Samsung B-die modules were the ones to get (they cost more because they were considered the premium RAM modules). I got 2 x 16GB modules of the G.Skill Trident Z, the CAS 16 / 3200MHz rated stock. Not bad for 32GB, and I have room to add two more if needed. It's pretty rare for me to run out of RAM at 32GB, but I will probably add two more modules for a total of 64GB soon, just to be on the safe side.

https://www.newegg.com/Product/Product.aspx?Item=N82E16820232415&cm_re=gskill_trident_z-_-20-232-415-_-Product

At any rate, when I first put my system together, my RAM would only run as high as 2400MHz... even though it was rated at 3200MHz. That's where the whole Samsung B-die stock issue came into play; up to that point, those WERE the only modules able to reach 3200MHz. However, in August AMD released updated AGESA code for motherboard manufacturers to greatly expand memory compatibility. A few weeks later, most of those manufacturers released new BIOS updates to include this optimization. That worked: on my ASRock board, I was able to get the full 3200MHz with no issues. I just chose the XMP 2.0 profile for 3200MHz and it works great.

I'm still using a regular SSD (1TB) as my main drive, but an M.2 NVMe drive would definitely be faster. I haven't invested in one yet because the system already boots up really fast. I have my Ryzen 7 1700X running at 3.8GHz and it runs like a champ. 3ds Max loads really fast now (10-15 seconds), whereas before it might take up to a full minute to fully load. No kidding.


  • Advanced Member

If you can get a large enough NVMe drive (500GB?), definitely get it (keep in mind that regular SATA SSDs can also use the M.2 slot and aren't NVMe, so don't get fooled there). I'm using a regular SSD and Windows 10 boots up in seconds after the BIOS screen closes, so don't worry about opting for a plain SSD for now.  For the RAM I did have to go into the BIOS and select the option with the memory profile setting (D.O.C.P, I think, right on the main BIOS screen), otherwise it was running at the default 2133MHz or whatever it is.  I also watched the video, and if you are considering an SLI setup I'd say just go for the better card, especially as it'd still be cheaper than two Ti's and has significantly more VRAM (SLI will still only use the VRAM from one card, so even with two you only get the 11GB).  If you are rendering a lot I'd definitely go with the Pro Duo; it is a fantastic piece of hardware, keeping in mind what AbnRanger said about CUDA requirements for other programs (I use Blender and 3D Coat primarily, so for me it would obviously work well).

As for gaming at the same time as rendering... I'd be either too busy switching in and out, or too worried about system stability to risk the render failing.  But if you test it out do let us know :).


  • Contributor

AbnRanger,
I have no words to thank you for the great information you shared!

After seeing the review video of the AMD Radeon Vega Frontier Edition, I confess I have doubts about getting this card; I took the opportunity to watch other reviewers, and they showed problems with the card (loud noise, heat, and poor behaviour with some games and programs). Likewise, I am unsure about purchasing the AMD Radeon Pro Duo after what you said about it.
Sorry to be so indecisive, but I feel like I'm walking in the dark. Please be patient with me!

My biggest goal is to be able to work with millions and millions of polygons for sculpting and texturing in 3D-Coat and ZBrush.
I also want to render, model and sometimes animate, mostly in Blender (and 3ds Max or Maya if I need them), and to use Unreal 4, Marmoset Toolbag (or similar programs), Marvelous Designer, Substance Designer and other production tools. In short, to use all these programs very fluidly.
Being able to play games would be very good, but I do not need to break fps records to have a great playing experience.
It is proving a difficult choice which video card (or two cards) to purchase.

If I can't use all the power of a card like the AMD Radeon Pro Duo 32GB to my benefit in these programs, and later add a GTX 1080 Ti (for example) to further improve the PC's performance (for both 3D and games), then what can I do?
Maybe two GTX 1080 Ti cards would be enough to handle dense geometry in the viewport the way an AMD pro card would? What do you think?

Regarding memory, I am calmer now that you've explained about the MHz (thanks for the memory link). I'm thinking of getting 64GB, but if I can, I'll try for 128GB. :)

No problem! I fully believe you about your computer opening 3ds Max in 10-15 seconds. That's a dream, isn't it? :)


  • Contributor

Falconius,
I will definitely try to get a 500GB NVMe drive, and I won't let myself get fooled. Thanks for the tip! :)

Regarding memory, I saw that people had to go into the BIOS to configure it correctly!

I have to look seriously into this CUDA question. I do not want to mess up my decision; I'm not sure what to do at the moment.

Sorry if I did not explain it well, but I think you misunderstood me: I meant I would use the two video cards to increase performance when using two programs in parallel. I would not try to play a game while rendering... that would cause problems, right? :)
I was thinking of getting two GTX 1080 Ti cards... I'd like everyone's opinion on whether this would be the best choice for the goals I described above.
Thank you so much for your help!


  • Reputable Contributor
2 hours ago, Rygaard said:


I like that AMD Pro Duo, but personally I would go with a single GTX 1080 for now. I was able to get one recently on eBay for about $470 USD. See how well it performs for you, then hold off buying another card until NVidia comes out with their next-generation card, which should be in a few months; I say this because the 1080s came out almost 18 months ago. When they do, it will probably be a big bump in performance and VRAM capacity. Use the new card as your primary graphics card and the 1080 as your second card, or sell it and use that money to help pay for a second new card.

You can sell your 1080 very easily on eBay or Amazon because they are pretty popular with the mining crowd...the folks who are gobbling up all the midrange cards for bitcoin mining and driving up the prices in the process. The reason I don't recommend using an AMD and an Nvidia card in the same system is that it's generally asking for trouble. Graphics cards can be buggy enough at times without compounding the issue with two totally different graphics drivers.

I had serious issues not long ago just using a GTX 1070 with a GTX 580 as a second card (for rendering purposes). It was because they were completely different architectures and not very compatible with one another. You are supposed to be able to mix cards like that because NVidia drivers are supposed to be "unified," but in practice it was a headache. That's why I went ahead and bought a GTX 1080 instead. Whenever you look at GPU rendering benchmarks, you will notice many of the dual-card setups use two of the same card. It helps remove the possibility of such driver conflicts.


  • Contributor

AbnRanger,
Thank you for clarifying things further! You're really helping me!
You're right that in a while, they'll release a better video card.

I'm torn between the GTX 1080 and the GTX 1080 Ti.
I'm almost deciding in favor of the GTX 1080 Ti because of the 11GB of VRAM you mentioned earlier.

I'll follow your advice and get just one of the two video cards.

If you don't mind, could you tell me which brand and model would be better, for both the GTX 1080 and the GTX 1080 Ti? When I researched, I found several models and brands.


  • Contributor

Hi Everyone,
I do not know if I'm making a good choice of video card, but I've decided to get a GTX 1080 Ti.
I've been searching and there are several models and brands of this card. It's driving me crazy!

After watching several reviewers, I've narrowed it down to two models:

1) ASUS GeForce GTX 1080 Ti 11GB GDDR5X ROG Strix (GTX1080TI-O11G-GAMING)
    -> The card's temperature under heavy load is between 60-70°C.
    -> It's a bit noisy at that point.

---- OR ------

2) ZOTAC GeForce GTX 1080 Ti AMP Edition
     -> They say it's the best of all, but under heavy load it reaches 80°C. That temperature seems very high and it worries me.

Which would be the better of these two cards?
And if there's a better model (without being more expensive), please point me to it, OK? (Please send me the link to the card's webpage.)

Thank you


  • Advanced Member

I've heard lots of good things about the ROG Strix cards in the past.  Don't know about the ZOTAC.  In either case, it is extremely unlikely that you'll be pushing those cards to their limits in anything except rendering, or when exploding a hundred barrels all at once in Crysis 3.  In the Sculpt room, with a fairly complex model with many, many pieces, my card (GTX 1080 G1 Gaming 8G) only runs at an average of 50% capacity for 3D compute, and maybe one gig of VRAM.  When I render it bumps up to 100%, but I still can't hear the fans.


  • Reputable Contributor
3 hours ago, Rygaard said:


Don't let the differences make you crazy.  :D They have only subtle differences, so as long as you choose one that has aftermarket cooling (usually 2-3 large fans on it), you're good to go. They all share the very same GPU and Memory capacity as the others. The only real difference is some are slimmer, so it's easier to fit multiple cards in your PC, while some are really thick to get the maximum amount of cooling for slightly higher factory overclocks. They will all perform basically the same.

 


  • New Member
On 11/22/2017 at 11:40 PM, Rygaard said:
 


Hey!

I bought the ram I did before I really started to dig into which ram was actually best suited for the x399.  I'm more of a hobbyist so I'm trying to find the balance between power/affordable/fun.  I'm setting myself up for one of those wonky RGB clown fart builds!  Here's what I purchased.

Corsair Vengeance RGB 16GB (2x8GB) DDR4 3200 (PC4-25600) C16 -Intel 100/200 Series PC memory CMR16GX4M2C3200C16

As stated above, I am counting on BIOS updates to smooth out the compatibility issues with different DIMM modules.  Hopefully I'll have all the parts later today and can start working on the build tonight.  I'll definitely report back with my findings on how this RAM performs.  To be completely honest, one of the main reasons I went with this RAM is the RGB ability and its compatibility with Corsair Link/Commander Pro.

After doing more research (pending my results this evening), I'm leaning more toward G.Skill Trident Z.  They have it listed as their Threadripper RAM.  It's proven hard to find the GTZRX variant, whereas the GTZR is more common.  I don't know if this is all a gimmick, and the 'X' is just an indicator that it's a 4-pack (quad-channel) kit, or whether it actually provides better performance.  If RGB's not your thing, you can definitely find the same RAM in a standard design, and (slightly!) cheaper.  Again, after some BIOS updating I might be in luck with the sticks I already purchased.  I missed the boat and all the C14/3200 GTZR RAM is sold out... booo!

----------

I really don't think you'll regret going 1080ti(!), though I really can't touch on which brand is better.  If the fun-dage was there I'd be going that route as well!

After some debating, I think I will upgrade my GTX 770 3GB to a GTX 1060 6GB.  Not a terribly expensive upgrade, and it should be almost twice as powerful.  Hopefully it'll tide me over until this GPU pricing debacle settles into a bad memory!

Best of luck!

- Brad

 

 


  • Reputable Contributor

...I should add that it's important to avoid the thick 2.5 slot coolers (they are generally designed for single card setups), whereas the 2 slot cards are slimmer for maximum spacing when multiple cards are being used in a system. This would be one of the best 1080ti options in terms of slim design/price/performance:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487337

 


  • Contributor
4 hours ago, Falconius said:


So far, from what I've seen in all the reviews, the ASUS GTX 1080 Ti is excellent, as are the others, but there is the issue of the temperature rising a lot when the card is under heavy load.
I've noticed that many people overclock their video cards.
To be honest, I do not want to overclock the card (I don't understand much about this) because I want it to have a long life.
Out of curiosity, in the Sculpt room how many millions of polygons can you reach without compromising performance? And how is performance in the Paint room?


  • Contributor
1 hour ago, AbnRanger said:


I'll try not to go crazy, my friend! :)
When I saw the variety of GTX 1080 Ti cards, I really did go crazy, not knowing which one would be best. :) Doing that research, you too would go crazy reading the comments, hoping someone will shed some light and help you make a decision... it's not like that! Everyone speaks wonderfully of every card. Then you ask yourself: What now? Which one? :)
You're absolutely right! The cards differ in only a few things, but share the same technology! What seems to matter is the thickness of the card, the maximum temperature it reaches, and its noise. To tell you the truth, I do not care whether the card is all lit up and flashing or whether it looks beautiful (nothing against those who like that!); what matters most to me is performance. :)
I'll watch the videos you sent me... Thank you!

 


  • Contributor
1 hour ago, BAAJR said:


Great! I think what you are doing with the memory will look great!
Wonderful news, so you mean your new toy gets assembled today? Very good! Share your experience with us! Congratulations!
To be honest with you, my friend, I don't care much about the aesthetic side, the lighting of the parts inside the case. I find it very beautiful, but what I really care about in the end is the machine's performance, speed and so on. And, as you said, it's a little cheaper! :)
Funny enough, I started out thinking about getting an AMD Radeon Pro Duo and now I'm researching which GTX 1080 Ti I'll get. I believe I will not regret it! :)
Your video card choice should be worth it (twice the power is nothing to sneeze at! It's a dream!) :)
I wish you lots of luck, and thank you for commenting.

 

 


  • Reputable Contributor
1 hour ago, Rygaard said:


The ASUS Strix is one of those 2.5-slot cards, and I always avoid those models regardless. On most motherboards those 2.5-slot cards will bump right up against the other card in your system, if you can fit more than one anyway. This is why EVGA's entire 1080 Ti line is all slim 2-slot models: they know many users will want more than one card. In my system, with a Gigabyte G1 Gaming 1080 and a matching 1070 model, there is at least 1-2 inches of space between the cards. If they were ASUS Strix cards, they would be smacking up against each other, which doesn't allow one of them to breathe properly. Tall cards are a no-go for multi-card setups.

 

 


  • Contributor

AbnRanger,
I can't thank you enough for the help, including your patience, guidance and time!

I watched the videos and, with your guidance, once again you are making me see things much more clearly. :)
Until then I was convinced I'd get the ASUS GTX 1080 Ti... oops...

Now I've taken another path, toward the EVGA GeForce GTX 1080 Ti FTW3 GAMING, 11GB GDDR5X, iCX Technology
  or
the EVGA GeForce GTX 1080 Ti SC2 GAMING, 11G-P4-6593-KR, 11GB GDDR5X, iCX Technology. :)

Now, my friend, in your opinion, which of these two EVGAs would be better? From what I've seen it would be the EVGA GeForce GTX 1080 Ti FTW3. Or am I wrong again?


  • Reputable Contributor
4 hours ago, Rygaard said:


They are very similar, overall. Just a $30-$40 difference in price. As you can see, the SC2 has 2 fans, whereas the FTW card has 3, so, the FTW has slightly better cooling...for a slightly higher price. I don't think you will notice any tangible difference though. In many benchmarks, the framerates are the same. NVidia has built-in overclocking, called GPU BOOST, which will increase the clockspeed automatically based on temperature thresholds. Each manufacturer will have their own software modification utility, where you can choose between different performance/efficiency modes.

On my Gigabyte G1 Gaming card, it has OC/Gaming/Eco modes. You can go in and further customize/tweak the settings, but I just choose one of the modes based on the task I am performing. If I were rendering an animation, I might choose the Gaming mode, where it's OC'ed a fair amount, but not pushing the boundaries. If I'm just surfing the internet or rendering with a CPU only render (like VRay or Arnold), I would choose the ECO mode. If you are sculpting and painting, you could choose the OC mode and in the Fan setting, choose the TURBO (higher fan speed) mode.

BTW, the CPU and system RAM will be the main performance components in 3D Coat, although the GPU does have some bearing, especially if you are using larger texture map sizes. It also matters when you have a heavy scene. With a GTX 1080 or 1080 Ti, you should be able to rotate about the scene fluidly with over 100 million visible polygons. RAM capacity will dictate how many polygons you can comfortably have in your scene; the graphics card will dictate how quickly/smoothly you can move about in it.
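To put a very rough number on the RAM side, here's a back-of-envelope sketch. The bytes-per-vertex figure is purely an assumption on my part (position, normal, connectivity and some overhead), not anything measured in 3D Coat, so treat the output as an order-of-magnitude guide only:

# Back-of-envelope only: BYTES_PER_VERTEX is an assumed figure, not a measured
# 3D-Coat value. The point is just how RAM caps your comfortable polygon count.
BYTES_PER_VERTEX = 80        # assumption: position + normal + connectivity + overhead
GB = 1024 ** 3

for ram_gb in (32, 64, 128):
    usable = ram_gb * GB * 0.6                  # assume ~60% of RAM is free for the scene
    vertices = usable / BYTES_PER_VERTEX
    print(f"{ram_gb:>3} GB RAM -> very roughly {vertices / 1e6:,.0f} million vertices")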

One of the developers was working on a GPU-based brush engine, at least for painting. That could improve things even more and make the graphics card even more important in 3D Coat. I should also mention that AO and light baking currently use the GPU as well. There is Renderman integration in 3D Coat 4.8, and Pixar has already announced an upcoming GPU/CPU hybrid engine. So, if you plan to use that in 3D Coat, the GPU will certainly have an increasingly greater role.

This video helps sort out the difference between the EVGA 1080 Ti FTW and SC2 models. In a nutshell, it reflects what I already stated: it's only a difference of 2 versus 3 fans. If you don't mind spending the extra $30, then the FTW is probably the best choice.


  • Contributor

AbnRanger,

Again, fantastic explanation! Many doubts clarified!

I'm impressed by what these cards can do, even more so by how easily they let you set things up for the way you will use them for a given task. I honestly did not know that.
In that regard I'd like to ask a question. I don't understand much, but I'll ask anyway. I've heard people say that overclocking has a positive side as well as a negative one; the downside concerns the lifespan of the video card, and that worries me.
Having said all that, what I'd like to know is this: by switching modes in the card's utility according to your task (it seems like the software overclocks automatically for you, correct me if my reasoning is wrong), doesn't that directly affect the life of the video card, or is what I just said completely wrong?

Does the GTX 1080 Ti let you choose between OC / Gaming / Eco modes like that? Because that is wonderful! I am very excited to learn all this! I feel like a little child at Christmas! :)

My God, 100 million polygons and still moving smoothly on screen? The day I get past 100 million polygons or more, I think the polygons will brainwash me and I'll end up becoming a polygon myself. :) Joking aside, this is a dream.

Well, I'm taking 3D-Coat very seriously, and for fine details like pores I always need a very high polygon count (subdividing the mesh by all the methods we already know, including LiveClay) to reach detail similar to ZBrush.

The improvement of the paint brushes in the Paint room is great news!

Now, thanks to you, choosing the video card has become much easier! As you said, the difference lies in the number of fans! If I have enough money my choice will be the FTW, because I can't forget that I still have to buy all the other parts to complete the PC.

I know it doesn't fit our subject, but after you've explained to me how 3D-Coat works, I would like to make a "little comment".
For me, 3D-Coat is missing three essential things in the Sculpt room:
1. A layer system for sculpting
2. A version of "morph targets", and
3. The possibility of transferring detail from one mesh to another.
Who knows, one day 3D-Coat may give us these very important features!
Do you agree? :)
Thank you very much!


  • Advanced Member
9 hours ago, Rygaard said:


The OC/Gaming/Eco switching is a Gigabyte utility that you install with the card, but any of the big board manufacturers will have something similar.  You might want to keep in mind that video card technology advances pretty fast; a year from now, who knows what things will look like, and the mid-to-high-end range could well be as powerful as, if not more powerful than, the Ti.  That said, I don't think you'll regret buying a Ti.  For me, the biggest consideration between the various cards was price: I bought the one I did because it was significantly cheaper than all the other 1080's (the ROG Strix version, for instance, was more than 100 bucks more expensive) and because its reviews were good.  I don't think you'll find the performance differences between the various Ti's all that striking.  Go for a good price, good reviews and a company you trust.

Edited by Falconius
