Carlosan Posted January 12, 2019

https://www.engadget.com/2019/01/09/amd-ceo-cto-radeon-vii-ray-tracing/

After lagging behind with Vega desktop GPUs for a few years, AMD announced a major upgrade today: the Radeon VII, the first 7nm GPU for gamers. It's a powerful card capable of serious 4K performance. Its new architecture means it won't use up too much power, leaving plenty of room for overclockers to take it even further. But there's no real-time ray tracing, a technology that NVIDIA has been pushing since last year, when it unveiled its RTX desktop GPUs. So where does this leave AMD?
Advanced Member L'Ancien Regime Posted January 13, 2019

I sure wish all these GPU guys, be they Nvidia or AMD, would at least throw the content creators a few bones with their publicity, especially with these high-end cards. I really don't care about playing The Division or Final Fantasy. If we're going to be expected to fork out this kind of money for a piece of technology, they could at least print a few paragraphs on how it runs with Arnold or Renderman or Keyshot.
Reputable Contributor AbnRanger Posted January 13, 2019

2 hours ago, L'Ancien Regime said:
I sure wish all these GPU guys be they Nvidia or AMD would at least throw the content creators a few bones with their publicity, especially with these high end cards. I really don't care about playing The Division or Final Fantasy. If we're going to be expected to fork out this kind of money for a piece of technology they could at least print out a few paragraphs on how it runs with Arnold or Renderman or Keyshot.

I agree, although I have noticed Lisa Su mention content creation for these cards. OpenCL and Blender (as well as Premiere Pro and, I think, AE) were mentioned in the presentation of the card. What I think would help her justify the card and its price is to stress that it's not only aimed at gamers: more and more content creators are using these consumer cards, and that's who the 16GB was aimed at. This Vega 7, IMO, is basically Vega Frontier Edition 2.0, which also had 16GB of HBM memory. My guess is that the mainstream versions will be coming by summer, with about 8GB of RAM, tweaked a bit to outdo the 2080 outright, and probably priced around $550-599.
Reputable Contributor AbnRanger Posted January 13, 2019

...but you are right. When these cards are sent out to independent tech reviewers, AMD should ask the reviewers to compare this card directly to the 2080 and the 2080 Ti using ProRender. THAT will help them sell a bunch of these cards. My guess (based on the presentation mentioning OpenCL improvements of 62% over Vega 64) is that it could very well stomp even the 2080 Ti in compute performance. The 2080 is basically the 1080 Ti + RTX, so a 62% improvement over what was already stellar compute performance should blow the 2080 Ti away in OpenCL-based GPU rendering. If it does, I might get one.
Advanced Member L'Ancien Regime Posted January 14, 2019 (edited)

https://www.techpowerup.com/gpu-specs/radeon-vii.c3358

The RTX 2070 is $549 USD; the Radeon VII is $699 USD. TechPowerUp rates the 2070 at 97% relative performance to the Radeon VII's 100%...
Reputable Contributor AbnRanger Posted January 14, 2019

We won't know for certain where the Radeon VII stacks up against the Nvidia cards until right before they are released, when the tech reviewers typically get their hands on them just prior to launch. However, AMD was clearly claiming to outdo the 2080, not the 2070, and the 2080 is about $750-800, so it's inaccurate to try and compare it to the 2070. That chart doesn't tell us anything, as they've had no means of testing the card. Furthermore, I bet the compute performance of the card blows away even the 2080 Ti, and that is what we content creators are most interested in, not how many FPS we can get on a 1080p monitor in Battlefield V.
Advanced Member L'Ancien Regime Posted January 14, 2019

3 hours ago, AbnRanger said:
Furthermore, I bet the compute performance of the card blows away even the 2080Ti

Now THAT would be very interesting...especially with 16GB of HBM2 and a terabyte per second of memory bandwidth.
Advanced Member L'Ancien Regime Posted January 15, 2019 (edited)

https://wccftech.com/amd-radeon-vega-vii-5000-units-64-rops-no-fp64-compute/

AMD Radeon Vega VII Rumored To Have Less Than 5000 Units Made – Confirmed To Feature 64 ROPs, Botched FP64 Compute Compared To Instinct MI50

(The render output unit, often abbreviated "ROP" and sometimes called the raster operations pipeline, is a hardware component in modern graphics processing units (GPUs) and one of the final steps in the rendering process of modern graphics cards. The pixel pipelines take pixel and texel information (each pixel is a dimensionless point) and process it, via specific matrix and vector operations, into a final pixel or depth value; this process is called rasterization. The ROPs thus control antialiasing, where more than one sample is merged into one pixel. The ROPs perform the transactions between the relevant buffers in the local memory, which includes writing or reading values as well as blending them together, and the dedicated hardware used for antialiasing methods like MSAA is contained in the ROPs. All rendered data has to travel through a ROP in order to be written to the framebuffer, from where it can be transmitted to the display.)

First up, we have a rumor from TweakTown which states that the AMD Radeon Vega VII graphics card will have fewer than 5000 units made during its production cycle, and that each card is going to be sold at a loss, considering these are just repurposed Instinct MI50 parts that could have been sold at much higher prices to the HPC sector.
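The blend and MSAA-resolve work in the ROP description above is easy to sketch. Here is a toy model in Python (not real GPU code; the actual hardware does this as fixed-function read-modify-write on the framebuffer):

```python
def blend(src, dst, src_alpha):
    """Classic source-over alpha blend: the kind of read-modify-write a
    ROP performs when a translucent fragment is written to the framebuffer."""
    return tuple(s * src_alpha + d * (1 - src_alpha) for s, d in zip(src, dst))

def msaa_resolve(samples):
    """MSAA resolve step: merge several sub-pixel samples into one final
    pixel color by averaging them."""
    return tuple(sum(channel) / len(samples) for channel in zip(*samples))

# A half-transparent red fragment written over a blue framebuffer pixel:
pixel = blend((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.5)   # (0.5, 0.0, 0.5)

# 4x MSAA: four RGB samples of one pixel collapse into a single color:
final = msaa_resolve([(1, 1, 1), (0, 0, 0), (1, 0, 0), (0, 1, 0)])
```

With 64 ROPs, the card can perform 64 of these framebuffer write operations per clock, which is why the ROP count matters for high-resolution output.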
https://www.amd.com/en/products/professional-graphics/instinct-mi50
https://wccftech.com/amd-radeon-instinct-mi60-first-7nm-vega-20-gpu-official/
https://arrayfire.com/explaining-fp64-performance-on-gpus/

Also, since the Vega VII is basically an Instinct MI50 with Radeon RX drivers, it was thought that the card would retain its heavy FP64 compute, making it a formidable compute option at its price point, but that isn't the case anymore. Confirming through AMD's Director of Product Marketing, Sasa Marinkovic, TechGage reports that the Radeon VII does not have double precision enabled: it has 1:32 FP64 compute like the RX Vega 64 cards, at just 0.862 TFLOPs, while the Instinct MI50 features 6.7 TFLOPs of FP64 compute. But you're still getting the incredible 1 terabyte per second of memory bandwidth with the Radeon VII that the Radeon Instinct MI50 and MI60 provide.
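The quoted figures follow from a simple ratio. A quick sanity check in Python (the FP32 rates are approximate boost-clock figures, and note that 13.8 / 16 ≈ 0.862, so the quoted Radeon VII number actually works out to a 1:16 rather than a 1:32 ratio):

```python
def fp64_tflops(fp32_tflops, divisor):
    """Theoretical FP64 throughput for a GPU whose FP64 units run at a
    1:N ratio of its FP32 rate."""
    return fp32_tflops / divisor

# Instinct MI50: ~13.4 TFLOPs FP32 at a full 1:2 FP64 ratio
mi50 = fp64_tflops(13.4, 2)     # 6.7 TFLOPs, matching the quoted figure

# Radeon VII: ~13.8 TFLOPs FP32; the quoted 0.862 TFLOPs implies ~1:16
vii = fp64_tflops(13.8, 16)     # ~0.8625 TFLOPs
```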
Reputable Contributor AbnRanger Posted January 15, 2019

I'm still going to wait until I see some of the tech reviews rather than rely on speculation. It seems some people just want to dump trash on AMD for no good reason. I wouldn't be surprised if this WTFtech site has a cozy relationship with Intel and Nvidia; nobody else is sticking their neck out to trash this card this hard before anyone has had a chance to get their hands on it. For what it's worth, I don't think GPU render engines even use the double-precision units. Scientific research and the like would be more affected, and that crowd is the target audience for the Radeon Instinct cards in the first place.
Advanced Member L'Ancien Regime Posted January 16, 2019 (edited)

I didn't get the feeling that WCCFTech was throwing trash on it; that story is identical throughout the press. In fact, it's the first indication I've gotten that this is actually a much higher-end card that has been toned down for sale at a lower price. I thought the material I posted, far from being a dumping of trash, was an impressive advertisement for the Radeon VII that made me far more likely to entertain buying it. Basically, it's an expensive scientific and database card that has been cut down into a very affordable, super powerful artist's card.
Reputable Contributor AbnRanger Posted January 16, 2019

I've seen a LOT of dumping on AMD for this card and I just don't get it. One yahoo (you can tell he isn't one of those legit tech review guys) is bashing them over what they are charging relative to the 2080. Another tech review source, UFD Tech or something, was also loudly bashing them, and for what? If the card matches the 2080 for $100 less, that's not nothing. Just because the extra RAM doesn't appeal to him as a reason to buy it over the Nvidia card, that doesn't make it a failure. I suspect this is about people EXPECTING AMD to leapfrog Nvidia once they got to the 7nm process. And before 2019 is out, they just might. Again, this is basically a Frontier Edition (like Nvidia's Founders Edition) card. The more mainstream models will probably arrive within a few months, likely including an 8GB version more in line with the 2070 for about $450, a little less than the $500 the 2070 sells for.
Reputable Contributor AbnRanger Posted January 16, 2019

...in the last post, I meant to say that the one person was bashing AMD for not offering better-than-2080 performance for half the price. That's just stupid. If they did, that would be great, but nobody is "entitled" to that. All we can reasonably expect is that they are competitive, both on features and pricing.
Advanced Member L'Ancien Regime Posted January 16, 2019

It turns out that it's not about ray tracing...it's about DLSS.
Reputable Contributor AbnRanger Posted January 16, 2019

The funny thing about DLSS is that AMD's Vega cards actually start to pull away from the Nvidia cards at higher resolutions, so AMD users don't need AI gimmicks to look like a higher res; AMD cards can play well at native high resolutions without them. This is the first time in over a decade that I'm considering an AMD card. That's partly to do with Nvidia's price-gouging scheme when the 20xx line was launched; the other part is that AMD is being pretty competitive with this card, and the 16GB of HBM2 has my attention. I cannot wait to see how it works with ProRender or Cycles.
Advanced Member L'Ancien Regime Posted January 16, 2019 (edited)

4 hours ago, AbnRanger said:
Funny thing about DLSS, is AMD's Vega cards actually start to pull away from the Nvidia cards at higher resolutions, so AMD users don't need AI gimmicks to look like a higher res...

Like I said, the Radeon VII is a geared-down Radeon Instinct MI50 (there's also an MI60 that's even more powerful). Those are scientific, engineering, and datacenter cards for research purposes. The Radeon VII strips away the stuff artists don't need and just gives them that terabyte per second of memory bandwidth and those stream processors (3840), which is incredible. I suppose if they'd really wanted to go crazy, they could have made a Radeon VIIb from the Radeon Instinct MI60, with 4096 stream processors and 32GB of HBM2 VRAM, but doubling the VRAM like that would have gotten really expensive.

So what does an MI50 or an MI60 cost? We don't know yet, and won't know till the end of March 2019. But we can look at earlier editions of the Radeon Instinct to broadly surmise what that will be. You're going to get all the power you need as an artist, from what is essentially an Instinct MI50, for $699.00 instead of $10,568.99. I'm waiting on this one myself.
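Those stream-processor counts translate directly into theoretical throughput. A rough back-of-the-envelope calculation in Python (the clocks are approximate boost figures, and the 2-FLOPs-per-clock factor assumes one fused multiply-add per shader per cycle):

```python
def fp32_tflops(stream_processors, clock_ghz):
    """Peak FP32 TFLOPs: each stream processor retires one fused
    multiply-add (2 FLOPs) per clock cycle."""
    return stream_processors * 2 * clock_ghz / 1000.0

radeon_vii = fp32_tflops(3840, 1.8)   # ~13.8 TFLOPs with 3840 SPs
mi60       = fp32_tflops(4096, 1.8)   # ~14.7 TFLOPs with the full 4096 SPs
```

So the cut from 4096 to 3840 stream processors costs only about 6% of peak compute, which is why the Radeon VII still lands so close to its Instinct siblings.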
Advanced Member L'Ancien Regime Posted January 16, 2019 (edited)

5 hours ago, AbnRanger said:
Funny thing about DLSS, is AMD's Vega cards actually start to pull away from the Nvidia cards at higher resolutions, so AMD users don't need AI gimmicks to look like a higher res...

Plus, everything you're saying about Nvidia using gimmicks to get higher resolutions is almost identical to what an interior architect I knew personally, who did high-end medical installations, used to say about Mental Ray. Basically all they did was stack *****, just blurring pixels and then blurring the blurred pixels to get higher res. And who owns Mental Ray?? There are some good reasons why it's been discontinued.
Advanced Member L'Ancien Regime Posted January 16, 2019 (edited)

At last year's Game Developers Conference 2018, Microsoft announced "Windows ML", a framework for developing machine-learning-based applications on the Windows 10 platform, and "DirectML", which makes it available from DirectX 12. We are currently experimenting with the preview SDK of DirectML, and Radeon VII shows excellent results so far. By the way, Radeon VII scored about 1.62 times the GeForce RTX 2080 in "Luxmark", which utilizes an OpenCL-based GPGPU ray-tracing renderer. Based on these facts, I think NVIDIA's DLSS-like thing can be done with a GPGPU-style approach on our GPU.

(A general-purpose GPU (GPGPU) is a graphics processing unit that performs non-specialized calculations that would typically be conducted by the CPU. Ordinarily, the GPU is dedicated to graphics rendering.)
https://whatis.techtarget.com/definition/GPGPU-general-purpose-graphics-processing-unit

DirectML is currently due to be available in spring 2019. We actually reached out to Microsoft a while ago and received the following statement regarding its capabilities:

DirectML provides a DirectX 12-style API that was designed to integrate well into rendering engines. By providing both performance and control, DirectML will enable real-time inferencing for game studios that want to implement machine learning techniques and integrate them into their games. These scenarios can include anything from graphics-related scenarios, like super-resolution, style transfer, and denoising, to real-time decision making, leading to smarter NPCs and better animation. Game studios may also use this for internal tooling to help with things like content and art generation. Ultimately, we want to put the power into creators' hands to deliver the cutting-edge experiences gamers want across all of the hardware that gamers have.
https://wccftech.com/amd-radeon-vii-excellent-result-directml/
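The appeal of DLSS-style super-resolution, whether done with tensor cores or with a GPGPU compute path as suggested above, is simple arithmetic: shade fewer pixels, then upscale. A quick illustration in Python:

```python
def shading_cost_ratio(native, internal):
    """Fraction of native-resolution shading work remaining when the game
    renders at a lower internal resolution and upscales to native."""
    nw, nh = native
    iw, ih = internal
    return (iw * ih) / (nw * nh)

# Rendering internally at 1440p and upscaling to 4K shades only ~44%
# of the pixels a native 4K render would.
ratio = shading_cost_ratio((3840, 2160), (2560, 1440))
```

Any upscaler, AI-based or not, banks that saved shading work; the argument in this thread is only about how convincingly the upscaled result approximates native resolution.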
Advanced Member Nossgrr Posted January 17, 2019

I went AMD for my CPU, but my GPU is another story... For the same price I'll always lean towards Nvidia, even with the mess we just had with the 20xx series. AMD has to make it worth switching over. Besides, the markup on those cards has been high lately, so they have a lot of room to lower their prices and attract new customers. Radeon VII vs RTX 2070 for the same price? Not even flinching; RTX every time.
Reputable Contributor AbnRanger Posted January 17, 2019

7 hours ago, Nossgrr said:
I went AMD for my CPU but my GPU is another story... For the same price I'll always lean towards NVidia.. Even with the last mess we just had with the 20xx series. AMD has to make it worth switching over.. Radeon VII vs RTX 2070 for the same price? Not even flinching, RTX every time..

It's NOT a 2070! Anybody making that comparison is either an Nvidia fanboy or doesn't know what they're talking about. Why? Because not a single person has gotten their hands on one to test except AMD, and AMD's CEO Lisa Su just gave a big presentation COMPARING IT DIRECTLY TO A 2080! I haven't bought an AMD card in over a decade, but the way Nvidia has been acting lately, I'm ready to give AMD a chance as long as they make it compelling enough. This card appears to do so, at least for GPU rendering. ProRender is being aggressively developed by AMD and is already the default GPU renderer in C4D, and soon to be in Modo, too. I'm going to try to convince Andrew to have someone port ProRender to 3DCoat; it's legit in my tests. Furthermore, Octane, another GPU renderer that I sometimes use, is supposedly working on a CUDA translator so that AMD cards can be used with Octane. This card has the RAM and compute power of a $4,000 Nvidia Quadro workstation card, and that's not nothing. It's why I'm really getting sick of hearing all the AMD bashing, when at the very worst this is a 2080 equivalent being sold for $100 less. What's to fuss about that?
Reputable Contributor AbnRanger Posted January 17, 2019

...to add another thought to this discussion: AMD already has a card that competes directly with the RTX 2070. It's called the Vega 64, and it came out about 18 months ago. The 2070 is basically the same performance as the 1080, and the Vega 64 goes head to head with the 1080. It doesn't have RTX, but RTX isn't being used anywhere yet, and in the one game that does use it, it is extremely hard to tell any difference. That makes it an expensive gimmick. What is NOT a gimmick is the Radeon VII having double the VRAM.
Advanced Member Nossgrr Posted January 17, 2019

That's the thing: no one knows at this point. I'm not buying into the hype until I see real benchmarks and a price point. Like I was saying, if AMD merely has equivalents to Nvidia's offerings at the same price point, that's not enough for me, not to mention I'd be leaving RTX behind for a few years. We'll see when it comes out.
Reputable Contributor AbnRanger Posted January 17, 2019

12 minutes ago, Nossgrr said:
That's the thing.. No one knows at this point.. I'm not buying into the hype until i see real benchmarks and price point.. Like I was saying, if AMD has equivalents to NVidia's offering and is at the same price point, it's not enough for me.. Not to mention I'd be leaving RTX behind for a few years.. We'll see when it comes out.

I'm not buying any hype, but I'm not buying the BS from the AMD haters, either. There ain't no way a 7nm Vega VII is no better than the Vega 64, which already compares head to head with the 2070. So, to hear ANYBODY say it's a $700 card that competes with the 2070 is a load of bull chips.
Reputable Contributor AbnRanger Posted January 17, 2019

...Lisa Su is not going to stick her neck out and compare the Vega VII to the 2080, only to have it match the 2070 when it launches; she knows she'll get raked across the Internet coals if she does that. It will compare directly with the 2080, and with the 2080 Ti on the compute side of things. Seeing that it has 5GB more VRAM than the $1,200 2080 Ti, I'm happy to see it.
Advanced Member L'Ancien Regime Posted January 17, 2019 (edited)

It's going to be interesting to see if that Radeon VII uses some kind of Crossfire.
Reputable Contributor AbnRanger Posted January 17, 2019

1 minute ago, L'Ancien Regime said:
It's going to be interesting to see if that Radeon VII uses some kind of Crossfire.

Crossfire won't help in CG apps; GPU render engines don't need it. They recognize any installed card and will use it or them.
Reputable Contributor AbnRanger Posted January 19, 2019

I suspect that they will ramp up production if the cards receive positive reviews and interest. If they have a relative hit on their hands, there seems to be no reason why they wouldn't. Gamers always think these cards are made just for them. They are not, hence the CUDA cores on Nvidia cards and the Stream Processors on AMD cards. Those are not gaming-focused technologies, but what do facts have to do with anything, right?
Advanced Member L'Ancien Regime Posted January 19, 2019

https://www.fudzilla.com/news/graphics/46014-vega-7nm-is-not-a-gpu
https://www.fudzilla.com/news/graphics/46038-amd-navi-is-not-a-high-end-card

Start at around the 11 min mark.
Contributor Tony Nemo Posted January 19, 2019

As a non-gamer, I like that AMD cards may become usable with CUDA-based renderers, since I use a GPU renderer (Octane). I thought I was a captive Nvidia customer.