L'Ancien Regime (Advanced Member), January 19, 2019:

Just now, Tony Nemo said: "As a non-gamer, I like that AMD is CUDA-enabled, as I use a GPU renderer (Octane). I thought I was a captive Nvidia customer."

Have you got some links on that?
AbnRanger (Reputable Contributor), January 19, 2019:

1 hour ago, L'Ancien Regime said: "Have you got some links on that?"

I think this is what he's talking about. I don't know if it's already working on AMD cards or if it's still being developed.
https://www.extremetech.com/computing/224599-gpu-computing-breakthrough-cloud-rendering-company-claims-to-run-cuda-on-non-nvidia-gpus
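The approach the article describes is source-level translation of CUDA code for non-Nvidia hardware. As a rough illustration only (this is not Otoy's actual tooling), here is a toy sketch of the kind of identifier mapping that AMD's hipify-style tools perform on CUDA runtime calls:

```python
import re

# A small subset of the real CUDA-to-HIP renames performed by AMD's
# hipify tools. This toy pass only handles flat runtime-API identifiers;
# real tools also rewrite kernel launch syntax, headers, and more.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify(source: str) -> str:
    """Translate known CUDA runtime identifiers to their HIP equivalents."""
    # Longest names first, so cudaMemcpyHostToDevice wins over cudaMemcpy.
    pattern = re.compile("|".join(sorted(CUDA_TO_HIP, key=len, reverse=True)))
    return pattern.sub(lambda m: CUDA_TO_HIP[m.group(0)], source)

print(hipify("cudaMalloc(&ptr, n); cudaMemcpy(ptr, h, n, cudaMemcpyHostToDevice);"))
```

The point is that much of the CUDA runtime API has near one-to-one equivalents on AMD's side, which is what makes this translation strategy feasible at all.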
L'Ancien Regime (Advanced Member), January 19, 2019:

This guy is saying the Radeon VII is the equivalent of the RTX 2080 Ti. And this is more recent:
https://www.phoronix.com/scan.php?page=news_item&px=LLVM-CUDA-To-AMD-HIP
Tony Nemo (Contributor), January 19, 2019:

I got the idea from the above: "hence the CUDA cores and Stream processors on AMD cards." Competition is a great leveler.
AbnRanger (Reputable Contributor), January 20, 2019:

13 hours ago, L'Ancien Regime said: "This guy is saying the Radeon VII is the equivalent of the RTX 2080 Ti..."

I watched that video earlier today, and he said it's supposed to compete at the high end with the 2080 and 1080 Ti (both have the same level of performance). However, I did hear him mention that it scored about 62% faster than the 2080 in Luxmark, which uses OpenCL rendering for benchmarking purposes. That's the statistic we care about most, not necessarily how it compares to the 2080 or 2080 Ti in gaming benchmarks. Those give us a good idea of what the viewport performance might be like in a 3D app, but not of GPU rendering. That Luxmark figure means it probably spanks even the 2080 Ti in OpenCL rendering, which is pretty awesome news when you consider it outperforms a $1,200 card in the same task.

When you think about it, Nvidia hoodwinked consumers into buying refreshed 1080 Tis by rebranding them 2080s, to make it look like the xx80-series card took a step up in performance, but it didn't really. They just slapped different names on them to give that impression. They labeled a 1080 a 2070, and the 2080 Ti is effectively the Pascal Titan. The only real difference is the added RTX cores... that nobody is using. They think tech reviewers and hardware enthusiasts like us are too stupid to notice this scheme. That's taking a page from Intel's playbook, and it's one reason why I'm somewhat anxious to give AMD a try, when I can afford to.
https://luxcorerender.org/
http://www.luxmark.info/
Advanced Member L'Ancien Regime Posted January 20, 2019 Advanced Member Report Share Posted January 20, 2019 (edited) 7 hours ago, AbnRanger said: I watched that video earlier today and he said it's supposed to compete at the high end with the 2080 and 1080Ti (both have the same level of performance). However, I did hear him mention that it scored about 62% faster than the 2080 in Luxmark which uses OpenCL rendering for benchmarking purposes. That's the statistic we care about most. Not necessarily how it compares to the 2080 or 2080Ti in gaming benchmarks. That gives us a good idea what the viewport performance might be like in a 3D app, but not for GPU rendering. That Luxmark figure means it probably spanks even the 2080Ti in OpenCL rendering. That's pretty awesome news if you consider that it outperforms a $1200 2080Ti in the same task. When you think about it, Nvidia hoodwinked consumers into buying refreshed 1080Ti's buy rebranding them 2080's, to make it look like the xx80 series card took a step up in performance, but it didn't really. They just slapped different names on them to give that impression. They labeled a 1080 a 2070. The 2080ti is effectively the Titan Pascal. The only real difference is they added the RTX cores...that nobody is using. They think Tech Reviewers and hardware enthusiasts like us, are too stupid to notice this scheme. That's taking a page from Intel's playbook, and it's one reason why I'm somewhat anxious to give AMD a try, when I can afford to. https://luxcorerender.org/ http://www.luxmark.info/ I've been narrowing down my choices and I'm getting pushed on time to make a decision and it's coming down to the Ryzen 2950 and right now I'm heavily leaning towards this Radeon VII which as I said is basically a card that would otherwise cost us $10k or more if it wasn't whittled down from being a Radeon Instinct to being an artist's card for $699. That's a good deal. 
Even if the top Navi card is dirt cheap later this summer, would it be worth adding it alongside this Radeon VII? Would there be any increase in GPU rendering speed going that route? Also, I watched a video last night where the presenter goes into how RAM speed doesn't necessarily reward you: 3200 MHz RAM over 2600 MHz RAM makes little difference in real performance. This stuff isn't simple.

Edited January 20, 2019 by L'Ancien Regime
AbnRanger (Reputable Contributor), January 20, 2019:

29 minutes ago, L'Ancien Regime said: "I've been narrowing down my choices, and I'm getting pushed for time to make a decision..."

Hopefully AMD will let the tech reviewers get their hands on the Radeon VII in the next week or so. I think 3200 MHz is a good speed to get right now. The AMD TR 1950X I'm running doesn't seem to like going over 2933 MHz; I have 3200 MHz RAM but cannot set it that high yet. Lower timings are probably better than maximum MHz anyway.

As for graphics cards, I really like the Radeon VII, but in terms of bang for your buck, you can easily score two used 1080s on eBay (if you are a bit patient and watch some of the auctions for a few days) for the price of a 2070. I would wait and see what the benchmarks are for GPU rendering on the Radeon VII; if it's better than the 2080 Ti, it would be worth it. Don't forget, Houdini uses OpenCL for simulations, and this card could be a beast in that regard.
L'Ancien Regime (Advanced Member), January 29, 2019:

AMD Radeon VII 7nm GPU FireStrike and TimeSpy Benchmarks Leaked:
https://wccftech.com/amd-radeon-vii-7nm-gpu-firestrike-and-timespy-benchmarks-leaked/
AbnRanger (Reputable Contributor), January 30, 2019:

3 hours ago, L'Ancien Regime said: "AMD Radeon VII 7nm GPU FireStrike and TimeSpy Benchmarks Leaked"

That's pretty much where AMD was projecting it to land: in direct competition with the 2080 in gaming benchmarks. But I'm expecting the compute side of the card to actually match or beat the $1,200 2080 Ti... which is basically a renamed Titan, so Nvidia's 20xx line doesn't look like such a disappointment. Nvidia's Turing cards are basically rebranded Pascal plus the RTX cores... which nobody uses yet. Vega 7 is AMD showing up and finally competing with Nvidia's top consumer card (the 2080). Again, the Ti models usually come out a full year after the xx80 and xx70 models, so Nvidia is just trying to fool people by shifting the naming convention up one model. For CG work, I'm most interested in seeing the LuxMark or other GPU rendering benchmarks.
Advanced Member L'Ancien Regime Posted January 30, 2019 Advanced Member Report Share Posted January 30, 2019 (edited) I'm strongly leaning towards the Radeon VII when it's released with reviews of rendering power. Plus I'm getting 3 x 32" monitors and FreeSync monitors will be cheaper. This guy is blaming the Micron VRAM degrading. Edited January 30, 2019 by L'Ancien Regime Quote Link to comment Share on other sites More sharing options...
AbnRanger (Reputable Contributor), January 30, 2019:

Oh, my! Just to try to get an idea of how well the Vega 7 might perform, I did a little research and ran my GTX 1080 Ti in Luxmark, to see how it stacked up against the Vega 64 (someone showed a test of it on YouTube). In Luxmark, the Vega 64 stomps my 1080 Ti: it scored over 25,500, while my 1080 Ti managed 21,500! This review came up with similar findings. The V-Ray benchmark had the two cards pretty much even. But look at the OpenGL (viewport performance) marks; it handily beats the 1080 Ti there, too! Wow.
https://techgage.com/article/a-look-at-amds-radeon-rx-vega-64-workstation-compute-performance/2/

So I'm guessing that the Vega 7 will at least match the 2080 Ti, if not beat it, in GPU rendering.
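For scale, the two Luxmark scores quoted above (higher is faster) work out to roughly a 19% lead for the Vega 64:

```python
# Relative speedup from the Luxmark scores quoted in the post above
# (scores are as reported there, rounded; higher score = faster).
vega64_score = 25_500
gtx1080ti_score = 21_500

speedup_pct = (vega64_score - gtx1080ti_score) / gtx1080ti_score * 100
print(f"Vega 64 is ~{speedup_pct:.0f}% faster than the GTX 1080 Ti in Luxmark")
```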
AbnRanger (Reputable Contributor), January 30, 2019:

The Vega 64 is TWICE as fast as the GTX 1080 in Luxmark benchmarks!
https://www.phoronix.com/scan.php?page=article&item=12-opencl-98&num=4

And it's 10 seconds faster than the 1080 Ti at rendering the BMW scene in Blenchmark:
http://blenchmark.com/gpu-benchmarks
Tony Nemo (Contributor), January 30, 2019:

But what about CUDA for my Octane renderer?
AbnRanger (Reputable Contributor), January 30, 2019:

3 hours ago, Tony Nemo said: "But what about CUDA for my Octane renderer?"

I didn't see any benchmarks for Octane. I recall Otoy stating they were creating a CUDA translator for Octane, instead of re-writing everything for OpenCL. I'm not sure if it's already working or if it's still in development.
AbnRanger (Reputable Contributor), January 30, 2019:

https://venturebeat.com/2016/03/09/otoy-breakthrough-lets-game-developers-run-the-best-graphics-software-across-platforms/
Advanced Member L'Ancien Regime Posted January 31, 2019 Advanced Member Report Share Posted January 31, 2019 (edited) There's another big question here; if the Radeon VII is the best AMD is offering over the next year to year and a half in the GPU department, what will be it's Crossfire performance? Because $699 * 2 = $1398.00. The RTX 2080Ti is $1119 to $2000 depending on which version you get. the FE from Nvidia direct is $1119.00 It'll be interesting to see if the Radeon VII in Crossfire can blow the doors off the RTX 2080Ti. Two Radeon VII's might even rival the new Titan RTX at $2500 with 24 GB of VRAM. This isn't just academic; we're on the cusp of real time rendering for studio quality 4K unbiased renders. It could be well worth investing in two of these over the next two or three years. Edited January 31, 2019 by L'Ancien Regime Quote Link to comment Share on other sites More sharing options...
AbnRanger (Reputable Contributor), January 31, 2019:

36 minutes ago, L'Ancien Regime said: "...what will its Crossfire performance be?"

SLI doesn't work in CG apps. None of them. GPU render engines already recognize multiple cards and put them to work without SLI. Two cards will not get you better viewport performance, either; they only help when you use them for GPU rendering. Heck, you could buy two Vega 64 cards right now and they would outperform an RTX 2080 Ti in GPU rendering performance.
L'Ancien Regime (Advanced Member), January 31, 2019:

9 minutes ago, AbnRanger said: "SLI doesn't work in CG apps... GPU render engines already recognize multiple cards and put them to work without SLI."

Yep, so it's going to use all your resources, both CPU and GPU, then.
Tony Nemo (Contributor), January 31, 2019:

21 hours ago, AbnRanger said: "https://venturebeat.com/2016/03/09/otoy-breakthrough-lets-game-developers-run-the-best-graphics-software-across-platforms/"

Okay, so Octane (currently ver. 4) runs just fine on AMD chips.
AbnRanger (Reputable Contributor), February 1, 2019:

21 hours ago, L'Ancien Regime said: "Yep, so it's going to use all your resources, both CPU and GPU, then."

Yes. Any GPU renderer will see multiple cards installed and use all of them, without any need for an SLI bridge and such. If the card is plugged into a PCI Express slot and has the proper cables connected from the power supply, the render engine will see it and use it. It doesn't have to be hooked up to a monitor, either. Whether a GPU renderer uses the CPU as well depends on whether that feature is supported. Right now Octane doesn't support it, and I don't think Redshift does, either; only a few engines do, and Cycles and ProRender are two of those. V-Ray RT GPU (3.6+) also utilizes GPU + CPU to some degree.
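The scheme described above, where a render engine simply enumerates every compute device and hands each a share of the frame, can be sketched in miniature. This is a hypothetical illustration, not any real engine's scheduler; the device names are made up:

```python
from itertools import cycle

# Minimal sketch of why GPU renderers need no SLI bridge: the engine
# enumerates devices itself and round-robins the frame's render tiles
# across them. Device names here are hypothetical placeholders.
def distribute_tiles(tiles, devices):
    """Assign render tiles across all detected devices, round-robin."""
    assignment = {d: [] for d in devices}
    for tile, device in zip(tiles, cycle(devices)):
        assignment[device].append(tile)
    return assignment

tiles = [(x, y) for y in range(4) for x in range(4)]  # a 4x4 grid of frame tiles
devices = ["GPU0", "GPU1", "CPU"]  # the CPU joins in if the engine supports it
work = distribute_tiles(tiles, devices)
for device, batch in work.items():
    print(device, "renders", len(batch), "tiles")
```

Real engines weigh assignments by device speed rather than splitting evenly, but the core idea is the same: each device renders its own tiles independently, so no SLI/Crossfire link between the cards is involved.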
AbnRanger (Reputable Contributor), February 1, 2019:

5 hours ago, Tony Nemo said: "Okay, so Octane (currently ver. 4) runs just fine on AMD chips."

Are you saying or asking? 'Cause I'm not 100% sure, since I don't have an AMD card and can only go by their claims online. I haven't seen anything totally conclusive, just that they said they were working on a CUDA translation so AMD cards could use Octane. I would think they would want to tap into the Mac market, where some 3D apps have versions working.
Tony Nemo (Contributor), February 1, 2019:

18 hours ago, AbnRanger said: "Are you saying or asking? 'Cause I'm not 100% sure..."

I'm going by Otoy's announcement that Octane has run on AMD chips since Octane ver. 3.1. It seems Octane has a CUDA enabler built in.
L'Ancien Regime (Advanced Member), February 2, 2019:

AMD Radeon VII 7nm GPU Makes Its Way to Reviewers' Test Benches:
https://wccftech.com/amd-radeon-vii-7nm-gpu-makes-its-way-to-reviewers-test-benches/
AbnRanger (Reputable Contributor), February 5, 2019:

The AMD rep said that for Blender (I presume he is talking about Cycles and/or ProRender), the Radeon VII brings roughly a 27% performance improvement. That would easily put it into 2080 Ti territory or better, as the Vega 64 already outperforms the 1080 Ti/2080.
L'Ancien Regime (Advanced Member), February 5, 2019:

7 hours ago, AbnRanger said: "The AMD rep said that for Blender... the performance increase in Radeon VII is roughly 27%..."

I'm thinking that the original run of 5,000 will be sold out within the first day or two.
AbnRanger (Reputable Contributor), February 5, 2019:

That's why you pre-order. I don't think they will stop at 5,000 if they sell out quickly; that may just be their low-ball estimate. Who knows.
haikalle (Applink Developer), February 6, 2019:

Interesting to see tomorrow's reviews. Rumors are saying it's fast, but also that it's loud... I'm a little bit scared.