3DCoat Forums

Ryzen 7 1700x vs Ryzen 7 2700


kenmo

  • Advanced Member

I presently have an i7-4770 with 32 GB of DDR3 memory and an Nvidia GTX 1060 video card with 6 GB of VRAM. I am debating upgrading to an AM4 motherboard (I prefer Gigabyte mobos, as all 4 of my computers at home use Gigabyte), 32 GB of DDR4, and either a Ryzen 7 1700x or a Ryzen 7 2700. Does the performance gain between the Ryzen 7 1700x and the 2700 justify the price difference? The Ryzen 7 1700x is $299 (Canadian dollars - Newegg) and the 2700 is $409 / 2700x $449 - again, Canadian dollars at Newegg.

Cheers & many thanks.

Edited by kenmo

  • Advanced Member

I'm running a 1700 (the differences between the x version and the non-x didn't seem big enough to justify the price), and from what I've seen and read the 2700 is a very good chip. As I understand it, however, AMD should be releasing their next batch in the first or second quarter of this year. In that case you may want to wait and see, because the leaked specs seem very promising, and in any case it should drop the price of the 2700 as well. Pretty sure there are a few people here who run the 2700 and can chime in with their experiences.


  • Advanced Member

I would go with the more recent one because of certain possibly important subtleties in the UEFI. (I'm not pretending to be an expert here or anything; I'm just studying up on all these complexities myself right now in preparation for buying a 2950x and an X399 motherboard. I'm even tempted to buy a 2990WX, but there's a lot of sophisticated stuff with that high-end CPU that baffles even the experts, so I'm getting somewhat intimidated at the prospect of buying it and trying to make it work properly.)

 

For example, the later version of that CPU will probably have better XMP support, so that with a simple switch in the UEFI you can get the full rated speed of your RAM.

Extreme Memory Profile (XMP) is a specification developed by Intel that serves as an extension to the standard JEDEC SPD specifications. XMP is intended to make overclocking easier and more accessible to new users through profiles and predefined overclocking configurations that are known to be stable.

 

 

Hardware Brad said:
 
For example, if you purchase 3200MHz RAM and install it into your system, the motherboard will automatically default to running it at 2133MHz. If you turn XMP on, it will overclock the RAM to run at its advertised speed. I believe it may also overclock the CPU as well, but I'm not 100% sure about that; the RAM for sure.

It's only for memory.
It will overclock the CPU only if you choose a CPU speed under the XMP settings.

 

 

XMP is proprietary to Intel, but you should check your motherboard closely to see whether it is enabled for AMD too. It probably is if it's second gen, like the X399 boards.

You don't want to buy 3200MHz RAM and never be able to use all that extra speed you paid for.
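To put numbers on what that one UEFI toggle is worth, here's a minimal back-of-the-envelope sketch in Python (theoretical peak figures, dual channel assumed; real-world bandwidth is lower):

# Back-of-the-envelope DDR4 bandwidth: what XMP actually buys you.
# Peak bandwidth (bytes/s) = transfers per second * 8 bytes per transfer * channels.
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    """Theoretical peak bandwidth in GB/s for DDR4 at a given MT/s rating."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

jedec = ddr4_bandwidth_gbs(2133)  # what the board defaults to without XMP
xmp = ddr4_bandwidth_gbs(3200)    # the rated speed you paid for
print(f"DDR4-2133 dual channel: {jedec:.1f} GB/s")
print(f"DDR4-3200 dual channel: {xmp:.1f} GB/s")
print(f"left on the table without XMP: {xmp / jedec - 1:.0%}")

Dual-channel DDR4-3200 works out to about 51.2 GB/s against 34.1 GB/s at the JEDEC default, roughly 50% more peak bandwidth.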

Edited by L'Ancien Regime

  • Advanced Member

I purchased the 2700x with a Gigabyte Aorus motherboard last Black Friday. Couldn't be happier! Stable, fast, responsive.

One thing to note: if you're going to install Windows 10, make sure you don't make the same mistake I originally did and use an old Win 10 ISO. You're going to be cursing. lol.

Use (download) the Windows Media Creation tool to create a fresh/current Windows 10 ISO and use that for your install. It will make for a smooth install. Speaking of which, combine that system with a fast M.2 SSD and it was the fastest Windows install ever. I had Win10 installed in 5-6 mins, then added some Gigabyte custom drivers, etc., but the base Windows install was really quick. Anyways, feel free to ask questions.


  • Advanced Member

Thanks Nossgrr. 

 

I've been really studying up for my next build and it's a revelation every day. I was prepared to spend big bucks, even for the 2990WX, but it seems that thousands of extra dollars don't necessarily buy you a proportional increase in performance.

 

I just came across this sobering fact:

 

[Screenshot: ThreadRipper benchmark chart]

[Screenshot: 2990WX Indigo render regression graph]

 

That's crazy. That 2990WX would be worth it if it gave you double the speed on renders (imagine 64 render buckets all going at 4GHz!), but because of the way AMD took its 64-thread EPYC design and crippled memory access for half of the four chiplet modules, it just doesn't perform the way you'd think all those extra threads and render buckets should.

https://bitsum.com/portfolio/coreprio/

It appears to be mainly oriented towards scientific and computational researchers.
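The gist of a workaround like Coreprio is thread placement: keep the busy threads on the dies that actually have memory attached. A minimal sketch of that idea in Python (Linux-only, via os.sched_setaffinity; the core range is a hypothetical placeholder, not the 2990WX's real topology map):

# Sketch: pin render workers to cores whose die has local DRAM
# (on the 2990WX only 2 of the 4 dies do).
import os
from multiprocessing import Pool

MEMORY_ATTACHED_CORES = set(range(0, 16))  # hypothetical: cores with local DRAM

def render_bucket(bucket_id: int) -> str:
    # Restrict this worker to memory-attached cores before doing real work;
    # intersect with what the OS allows so the sketch runs on small machines.
    allowed = MEMORY_ATTACHED_CORES & os.sched_getaffinity(0)
    if allowed:
        os.sched_setaffinity(0, allowed)  # 0 = this worker process
    # ... trace rays for this bucket here ...
    return f"bucket {bucket_id} done on a memory-attached core"

if __name__ == "__main__":
    with Pool(processes=8) as pool:
        for line in pool.map(render_bucket, range(64)):
            print(line)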

 

The 1950X is by far the superior buy.

 

 

Edited by L'Ancien Regime

  • Reputable Contributor
12 hours ago, L'Ancien Regime said:

Thanks Nossgrr. [...] I was prepared to spend big bucks, even for the 2990WX, but it seems that thousands of extra dollars don't necessarily buy you a proportional increase in performance. [...] The 1950X is by far the superior buy.

That's true. The 2990WX is a great option if you use CPU-based renderers like Arnold, Corona, V-Ray and RenderMan. In that video, he clearly states that the noted regression in performance is Windows' fault, not AMD's. Hopefully, that public embarrassment of Windows will get Microsoft hot on a fix for it, so people aren't having to rely on some 3rd-party workaround.

Nevertheless, the 2950X is a good option and has a bit more OC headroom than the original ThreadRipper 1950X. But on Black Friday I saw a deal on Newegg for a 1950X I couldn't pass up. They had a new one for $450 USD. I wasn't really looking to upgrade to a ThreadRipper, but that was just too good a deal. I sold my old render node/backup PC to pay for it, so it made sense. The big problem with ThreadRipper CPUs is finding proper cooling.

I had a Corsair H115i 280mm AIO that served me well with the Ryzen CPUs, but it couldn't handle ThreadRipper very well. I had to keep it at stock speeds, which would boost up to 3.7GHz on its own, but when running Cinebench or the V-Ray benchmark it would thermally throttle back down to 3.4GHz or lower on some threads. The Corsair Link software, which lets you increase the speed of the pump and the fans, had the fans and temps all over the place. It drove me nuts for days, trying to diagnose the problem.
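If you'd rather catch that kind of throttling than guess at it, here's a rough Python sketch using the psutil library (the 3400 MHz threshold matches the numbers above; temperature readout is Linux-only, hence the guard):

# Rough throttle watch: sample clocks while a benchmark runs and flag
# dips below the expected boost.
import time
import psutil

THROTTLE_MHZ = 3400   # where the H115i was letting the 1950X sag

for _ in range(30):   # watch for ~30 seconds
    freq = psutil.cpu_freq()
    cur = freq.current if freq else 0.0
    temps = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
    pkg = next((t.current for ts in temps.values() for t in ts), None)
    state = "THROTTLING" if 0 < cur <= THROTTLE_MHZ else "ok"
    print(f"{cur:.0f} MHz (temp: {pkg}) -> {state}")
    time.sleep(1)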

I did a ton of research trying to find the right solution, and I ran into a few issues. If I stuck with water coolers (the AIO kind, not the custom-loop kind, which are too expensive for me), there was no clear winner. My case is not set up for a 360mm radiator, and the Enermax Liqtech II coolers seemed like the best option, but I noticed a lot of complaints about issues with them and some bad reviews on YouTube. It's so easy for small corrosion particles and sediment to cause issues, and there are overall more points of failure than with an air cooler. Plus, dedicated ThreadRipper air coolers, like the kind from Noctua, actually outperformed many AIO water coolers like my Corsair H115i. Most of that is because the dedicated ThreadRipper coolers have a larger cold plate that fits the massive TR CPU die. Many AIOs like mine don't fully cover the die; just about two-thirds of it.

I was close to getting a Noctua DH 14-TR, but I hate the colors of the fans (like many people do) and didn't want to spend an additional $50+ buying the black fans they sell separately. They showed some black-fan versions at CES last year, but I couldn't find anything available on the market yet. The one I really wanted was the be quiet! Dark Rock Pro TR4 (the Dark Rock Pro 4 specifically designed for ThreadRipper), but they weren't available anywhere in the US yet. Not on Newegg or Amazon. However, someone mentioned a link to AquaTuning (in Europe), and they said they got it shipped to the US within a week (not bad at all, as I've had some Newegg orders take that long). It was only $87, and the shipping was only $10.

After installing it, I have to say I'm blown away by how well it works. I'm running a constant 4.0GHz OC (1.34v) on the 1950X and it's quiet as a kitten. No software needed to run it or adjust fan curves, etc. I would give this cooler the Platinum Award for ThreadRipper CPUs if I were one of those tech gurus on YouTube. It beats all other contenders, badly. In fact, if you are buying a Ryzen 7 CPU like the 2700X, the regular Dark Rock Pro 4 is easily available at all retailers (only the ThreadRipper model hadn't yet made it to US retailers), and it would be an awesome option, especially if you want to see whether you can get a reasonable overclock (the 2700X should OC easily into the 4.1-4.2GHz range).
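For anyone who does end up tweaking curves in software (Corsair Link and friends), the mechanics are just interpolation between (temperature, duty) points. A toy sketch, with invented curve points:

# What fan-curve software does under the hood: linear interpolation
# between (deg C, fan duty %) points. These points are made up.
CURVE = [(30, 20), (50, 40), (70, 80), (85, 100)]

def fan_duty(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pegged at max above the last point

for t in (25, 45, 60, 75, 90):
    print(f"{t} C -> {fan_duty(t):.0f}% duty")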

 


  • Advanced Member

But what about the Wraith Ripper?

 

http://www.coolermaster.com/cooling/cpu-air-cooler/wraith-ripper/

 

 

And thanks for all that info on the heat sinks.

 

 

And you're also right that the 2990WX's problems are solely due to Microsoft. Here are the Linux results on those renders:

https://www.phoronix.com/scan.php?page=article&item=amd-2920x-2970wx&num=9

 

[Screenshots: Phoronix Linux render benchmarks]

Edited by L'Ancien Regime

  • Reputable Contributor
1 minute ago, L'Ancien Regime said:

But what about the Wraith Ripper?

I'm sure it's fine, but it only has one fan (in the center) and no apparent capacity to add more. Plus, it's considerably more expensive. The only thing it has that the Dark Rock Pro TR4 and the Noctua cooler don't is some RGB lighting, but I personally don't care about that. I'm not interested in converting my PC into a year-round Christmas tree. :D

 


  • Reputable Contributor

That's pretty darn impressive, but I cannot understand why, at 7nm, they cannot leap ahead of Nvidia's best. Matching the 2080 is OK, and great at that price point with 16GB of HBM memory...but it's odd that their best hasn't matched Nvidia's best in the past 4-5 years. They used to leapfrog Nvidia on a regular basis. I will say, though, that rendering with ProRender or Cycles, it probably beats the 2080 or 2080 Ti. AMD's Vega compute performance was supposedly better than the comparable Nvidia cards'.


  • Advanced Member

Yep, and for me, like you, only the renders count. Games disappoint me. The AI, game design, and aesthetics have for the most part failed, unless you love shooting and blowing stuff up. With 16 GB of VRAM that thing is actually very good value for the money.

 

Edited by L'Ancien Regime

  • Advanced Member

At home I have four home-built computers. One is a low-end AMD AM3 box running Linux (Zorin OS), used as a Samba file server. The other three run an i7-4770, an i5-2600K, and an AMD FX-8320. At first they all ran Windows 7 Home Premium, and the FX-8320 kept up easily with the i5-2600K. However, after upgrading to Windows 10, the FX-8320 was fine initially, but after a few Win10 updates it started to stall and freeze for a couple of seconds, then be OK. I tried a few BIOS setting tweaks recommended on sites like Tom's Hardware and others. Nothing seemed to resolve the issue. I formatted the hard drive and re-installed Windows 10. The problem persisted. So before Christmas I formatted the hard drive and rolled the computer back to Windows 7. I noticed a big performance gain running Windows 7 on the AMD FX-8320 over Windows 10.

Perhaps the ThreadRipper has a similar issue with Windows 10.

 


  • Reputable Contributor
10 hours ago, Sorda said:

That is why I use Intel: so I don't have to bother with whether the first cooling system I find will cope with my processor.

What? Any cooler that works with Intel CPUs will work with all AMD CPUs except ThreadRipper, which is a niche CPU. If you want to keep getting price-gouged, then yes, Intel is the right CPU. The AMD 3000 CPUs will outpace Intel's top consumer CPU with a lot less power/heat. The AMD Ryzen 2700X already beats Intel's 8700K for less $$$ in rendering and other multi-threaded tasks.

 


  • Reputable Contributor
4 hours ago, kenmo said:

At home I have four home-built computers. [...] Perhaps the ThreadRipper has a similar issue with Windows 10.

 

The 1950X ThreadRipper runs smooth as butter on Windows 10. From what I understand of that video about the Windows ThreadRipper issue, it applies to the 32-core chip because Windows doesn't really know how to handle more than 16 cores...yet. I would think AMD would be pressing Microsoft pretty hard to get that fixed.


  • Advanced Member

I thought CPU rendering was already back in the Stone Age, and that advanced users render on video cards.

Wait, does this dude in the video seriously compare processors with different numbers of cores? Naturally, even for a Stone Age renderer, more cores beat a higher frequency. That doesn't tell us anything by itself, though.

 

I personally use an i7-6800K and don't have any difficulties in conjunction with my GTX 1080 Ti video card. The only thing I don't like (as with a large number of processors) is the insufficient number of PCIe lanes. I really don't know if that matters when rendering on several video cards. I think it works like this: the scene is loaded into the video card's memory, processed there, and then the final result is unloaded from the card, if you can call it that. In contrast to the dynamic streaming of data in video games, for example. Correct me if I'm wrong.
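That mental model is basically right for offline renderers, and some rough numbers show why lane count rarely matters there. A quick Python sketch (theoretical PCIe 3.0 peaks; real-world figures are lower):

# Why PCIe lane count rarely matters for offline GPU rendering: the
# scene is uploaded once, then the card grinds on it for minutes.
# PCIe 3.0 moves ~0.985 GB/s per lane after 128b/130b encoding overhead.
def upload_seconds(scene_gb: float, lanes: int) -> float:
    return scene_gb / (lanes * 0.985)

scene_gb = 8.0  # a fairly heavy scene, textures included
for lanes in (16, 8, 4):
    print(f"PCIe 3.0 x{lanes}: {upload_seconds(scene_gb, lanes):4.1f} s to upload {scene_gb:.0f} GB")

Even at x4, the upload is a couple of seconds against minutes or hours of render time, which is why lane count hurts interactive use far more than final frames.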

Edited by Dmitry Bedrik

  • Advanced Member

Aren't the algorithms on a CPU more complex, thanks to the hard-coded math engines in them, compared to the ones used in a GPU, no matter how fast the GPU is?

 

Well, based simply on noise elimination, BOXX says GPU rendering is over 6 times faster than CPU rendering.

https://blog.boxx.com/2014/10/02/gpu-rendering-vs-cpu-rendering-a-method-to-compare-render-times-with-empirical-benchmarks/

But that's only one criterion.
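The methodology behind that kind of number is worth spelling out: path-tracing noise falls as 1/sqrt(samples), so "equal quality" means equal sample counts, and the speedup is just the ratio of samples per second. A toy Python sketch (the rates are made-up stand-ins, chosen to give a ~6x gap like BOXX's):

# Time-to-equal-noise comparison in a nutshell.
PIXELS = 1920 * 1080

def seconds_to_noise(target_noise: float, paths_per_sec: float) -> float:
    per_pixel = 1.0 / target_noise ** 2   # noise ~ 1/sqrt(N) per pixel
    return per_pixel * PIXELS / paths_per_sec

CPU_RATE = 2.0e6    # hypothetical paths/s on a CPU
GPU_RATE = 12.5e6   # hypothetical paths/s on a GPU
target = 0.01       # render until noise hits 1%

cpu_t = seconds_to_noise(target, CPU_RATE)
gpu_t = seconds_to_noise(target, GPU_RATE)
print(f"CPU: {cpu_t:.0f} s, GPU: {gpu_t:.0f} s, speedup: {cpu_t / gpu_t:.2f}x")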

 

 

 

https://www.fxguide.com/featured/look-at-renderman-22-and-beyond/

 

Well, it would seem that they've attained parity, at least in specific render software.

 

https://renderman.pixar.com/news/renderman-xpu-development-update

So if that's the case, what is the correct way to proceed here in the purchase of a new rig?

What is the optimal price/performance configuration for a rig, especially if you've been frustrated by the bottleneck of testing new shaders and textures and you want to make that workflow more agile and responsive?

 

 


  • Reputable Contributor
7 hours ago, L'Ancien Regime said:

Aren't the algorithms on a CPU more complex, thanks to the hard-coded math engines in them, compared to the ones used in a GPU, no matter how fast the GPU is? [...] So if that's the case, what is the correct way to proceed here in the purchase of a new rig?

 

 

It really depends on what render engines you prefer or plan to use. There are still some good CPU-based renderers, but GPU rendering is really the way to go. There just isn't any comparison, even with ThreadRipper CPUs. That's why engines like Cycles in Blender, which now has a hybrid GPU + CPU mode where it uses both at the same time, are where the future is. It's what I liked about Thea Render a few years back; it was one of the first to leverage both the CPU and GPU.

In V-Ray 3.6 for Modo, there is a GPU + CPU mode in RT. The AMD card does limit your options as to which engines support OpenCL, but there are some good options, because ProRender is no afterthought. It's actually pretty impressive.
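To illustrate how a hybrid mode keeps both devices busy, here's a toy Python sketch: one shared tile queue, and each device grabs the next tile as soon as it's free, so the faster device naturally takes more of the frame (the per-tile timings are simulated stand-ins, not real render work):

# Toy model of hybrid GPU + CPU tile rendering.
import queue
import threading
import time

tiles = queue.Queue()
for t in range(32):
    tiles.put(t)

rendered = []  # (device, tile) pairs; list.append is thread-safe in CPython

def device(name: str, seconds_per_tile: float) -> None:
    while True:
        try:
            tile = tiles.get_nowait()
        except queue.Empty:
            return
        time.sleep(seconds_per_tile)  # stand-in for actually tracing the tile
        rendered.append((name, tile))

workers = [
    threading.Thread(target=device, args=("GPU", 0.01)),
    threading.Thread(target=device, args=("CPU", 0.05)),
]
for w in workers:
    w.start()
for w in workers:
    w.join()

for name in ("GPU", "CPU"):
    count = sum(1 for n, _ in rendered if n == name)
    print(f"{name} rendered {count} of {len(rendered)} tiles")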

 


  • Advanced Member

I wonder where that new Radeon VII would fall in that graph, with its 16 GB of HBM2 VRAM?

 

[Screenshot: GPU render benchmark chart]

 

 

And how do you find that AMD Radeon Pro Render for SSS and caustics? Does it measure up to something like Maxwell Render?

 

I'm reading that for things like particle-cloud renders the CPU is still superior, due to its math engines like Embree... and Embree also runs on Ryzen CPUs as well.

https://software.intel.com/en-us/rendering-framework

https://software.intel.com/en-us/articles/embree-highly-optimized-visibility-algorithms-for-monte-carlo-ray-tracing

 

"All recent AMD CPU support Embree, including Rizen. Performance in Vray(for Max) are in between an 8core and a 10core I7. At least on very old release of Vray it was better to avoid mixing AMD and Intel when caching IM because some parts of the computation was random and behave slightly differently on the two hardware platform. Don't know if this is a problem with current release.. anyway you can always save GI maps on a single system and distribute the render for final frame, this should always work"

http://forum.vrayforc4d.com/index.php?threads/19169/

 

CPUs = complex, elegant solutions.

GPUs = brute-force, simplistic solutions.

 

Or is that a concept that's now 5 years out of date?

 

Edited by L'Ancien Regime

  • Reputable Contributor
5 hours ago, L'Ancien Regime said:

I wonder where that new Radeon VII would fall in that graph, with its 16 GB of HBM2 VRAM? [...] And how do you find that AMD Radeon Pro Render for SSS and caustics? Does it measure up to something like Maxwell Render?

I haven't tried the Blender ProRender plugin for a while, because I've been waiting for 2.8 to become official so I know most of the bugs have been ironed out. I've tested the one for Modo, but it's still in an early beta stage and Foundry is really quiet about its development. I reported an issue with their shadow-catching material (not working) on the ProRender beta thread months ago, and they never bothered to answer. It doesn't have SSS working yet, but it is otherwise very nice. It works with Modo's default materials/lights, etc., and it feels like you are working with Modo's native renderer, just much faster.


  • Advanced Member

Here's another sobering fact: thanks to that Level1 guy's videos I'd decided on this motherboard at $560 Cdn. That was its price 3 days ago on Amazon.

I guess a lot of other people saw his video too, because I checked it last night and this was the new price:

[Screenshot: marked-up Amazon.ca listing]

hahaha... Amazon.com instead of Amazon.ca... that's $516.00:

[Screenshot: Amazon.com listing at $516]

Edited by L'Ancien Regime

  • Reputable Contributor
6 hours ago, L'Ancien Regime said:

Here's another sobering fact: thanks to that Level1 guy's videos I'd decided on this motherboard at $560 Cdn. [...]

This is the one I got, open box, so it was about $279:

https://www.newegg.com/Product/Product.aspx?Item=N82E16813145105&Description=gigabyte x399 aorus gaming 7&cm_re=gigabyte_x399_aorus_gaming_7-_-13-145-105-_-Product


  • 3 weeks later...

https://techgage.com/article/amd-ryzen-threadripper-2970wx-2920x-performance/ 

AMD's Ryzen Processor Lineup

Model       Cores      Clock (Turbo)   L2+L3 Cache   Memory   TDP     Price
--- Threadripper WX-series ---
2990WX      32 (64T)   3.0 GHz (4.2)   16+64 MB      Quad     250W    $1799
2970WX      24 (48T)   3.0 GHz (4.2)   12+64 MB      Quad     250W    $1299
--- Threadripper X-series ---
2950X       16 (32T)   3.5 GHz (4.4)   8+32 MB       Quad     180W    $899
2920X       12 (24T)   3.5 GHz (4.3)   6+32 MB       Quad     180W    $649
--- Ryzen 7 ---
R7 2700X    8 (16T)    3.7 GHz (4.3)   4+16 MB       Dual     105W    $329
R7 2700     8 (16T)    3.2 GHz (4.1)   4+16 MB       Dual     95W     $299
--- Ryzen 5 ---
R5 2600X    6 (12T)    3.6 GHz (4.2)   3+16 MB       Dual     95W     $219
R5 2600     6 (12T)    3.4 GHz (3.9)   3+16 MB       Dual     65W     $189
R5 1600X    6 (12T)    3.6 GHz (4.0)   3+16 MB       Dual     95W     $219
R5 1600     6 (12T)    3.2 GHz (3.6)   3+16 MB       Dual     65W     $189
R5 1500X    4 (8T)     3.5 GHz (3.7)   2+16 MB       Dual     65W     $174
R5 1400     4 (8T)     3.2 GHz (3.4)   2+8 MB        Dual     65W     $169
--- Ryzen 3 ---
R3 1300X    4 (4T)     3.5 GHz (3.7)   2+8 MB        Dual     65W     $129
R3 1200     4 (4T)     3.1 GHz (3.4)   2+8 MB        Dual     65W     $109
--- Ryzen w/ Radeon Vega Graphics ---
R5 2400G    4 (8T)     3.6 GHz (3.9)   0.5+4 MB      Dual     65W     $169
R3 2200G    4 (4T)     3.5 GHz (3.7)   0.5+4 MB      Dual     65W     $99
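Since this thread keeps circling back to price/performance, here's a quick Python sketch turning the table above into a naive dollars-per-thread ranking (thread count is a crude proxy for multi-threaded render throughput, so take it with a grain of salt):

# Naive dollars-per-thread from the table above (USD list prices).
lineup = {
    "2990WX": (64, 1799), "2970WX": (48, 1299),
    "2950X": (32, 899), "2920X": (24, 649),
    "R7 2700X": (16, 329), "R7 2700": (16, 299),
}
for name, (threads, usd) in sorted(lineup.items(), key=lambda kv: kv[1][1] / kv[1][0]):
    print(f"{name:9} ${usd / threads:6.2f} per thread")

On that crude metric the R7 2700 and 2700X come out cheapest per thread, which lines up with the thread's consensus that the top ThreadRippers don't buy a proportional gain.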

