3D Coat Forums
Carlosan

NVIDIA RTX Technology: Making Real-Time Ray Tracing A Reality For Games


https://www.geforce.com/whats-new/articles/nvidia-rtx-real-time-game-ray-tracing

NVIDIA RTX: Real-Time Ray Tracing For Games

Ray tracing simulates light, shadowing and other effects to near-perfection, which is why every film and big-budget TV show dedicates massive computational resources to each VFX shot, ensuring graphics blend seamlessly with reality. For over a decade NVIDIA has been at the forefront of this field, developing and accelerating rendering techniques, and assisting film studios with visual effects on the biggest blockbusters.

Now, we’re ready to deliver a generational leap in image quality in games, with NVIDIA RTX Technology. Powered by film-quality algorithms and new GameWorks SDK modules, RTX will enable developers to ray trace ambient occlusion effects, area shadows and glossy reflections in their games and engines. In other words, developers can create realistic, film-quality, physically-accurate scenes with lighting, shadows and reflections that capture the scene around them and account for every variable, immersing you in rich, detailed worlds that feature a level of fidelity you could previously only dream of.

- Ray Traced Ambient Occlusion will ensure near-perfect contact shadowing scene-wide, enable ambient occlusion shadowing on occluded objects, and enable ambient occlusion shadowing behind the player camera, affecting the appearance of what can be seen.
- Ray tracing fully accounts for the material of all surfaces, enabling crisp or soft reflections, and even variable reflections should surfaces be made of multiple materials, scuffed, or otherwise damaged.
- See the highest-quality videogame shadowing with NVIDIA RTX and Volta-architecture GPUs.
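As a rough illustration of what ray-traced ambient occlusion actually computes, here is a minimal CPU sketch: cast random rays over the hemisphere above a surface point and count how many get blocked by nearby geometry (spheres here, for simplicity). This is only a toy under those assumptions; the GameWorks modules do this on the GPU with far more sophisticated sampling and denoising.

```python
import math
import random

def ray_hits_sphere(origin, direction, center, radius, max_dist):
    # Standard ray-sphere intersection with a unit direction:
    # solve t^2 + 2*b*t + c = 0 where b = oc.d, c = |oc|^2 - r^2
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return False
    t = -b - math.sqrt(disc)
    return 0.0 < t < max_dist

def ambient_occlusion(point, normal, occluders, samples=64, max_dist=1.0):
    """Estimate AO at a surface point by casting random hemisphere rays.
    `occluders` is a list of (center, radius) spheres; a hit means the
    ray is blocked and contributes occlusion."""
    hits = 0
    for _ in range(samples):
        # Uniform random direction, flipped into the hemisphere around the normal
        d = [random.gauss(0, 1) for _ in range(3)]
        n = math.sqrt(sum(x * x for x in d))
        d = [x / n for x in d]
        if sum(a * b for a, b in zip(d, normal)) < 0:
            d = [-x for x in d]
        if any(ray_hits_sphere(point, d, c, r, max_dist) for c, r in occluders):
            hits += 1
    return 1.0 - hits / samples  # 1.0 = fully open, 0.0 = fully occluded
```

With no occluders the result is exactly 1.0 (no shadowing); a sphere hovering above the point darkens it in proportion to the solid angle it covers.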


[Embedded video, 1:16:43]


It seems to be all about Tensor Cores, as opposed to CUDA cores...

 

https://www.nvidia.com/en-us/data-center/tensorcore/

A BREAKTHROUGH IN TRAINING AND INFERENCE
Designed specifically for deep learning, Tensor Cores deliver groundbreaking performance—up to 12X higher peak teraflops (TFLOPS) for training and 6X higher peak TFLOPS for inference. This key capability enables Volta to deliver 3X performance speedups in training and inference over the previous generation. 

Each of Tesla V100's 640 Tensor Cores operates on a 4x4 matrix, and their associated data paths are custom-designed to dramatically increase floating-point compute throughput with high energy efficiency.
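The 4x4 matrix operation each Tensor Core performs is a fused multiply-add, D = A × B + C. A plain-Python sketch of that single step (on the real hardware A and B are FP16 inputs with FP32 accumulation, and 640 of these run per clock; ordinary Python floats stand in here):

```python
def tensor_core_op(A, B, C):
    """One Tensor Core step: D = A @ B + C for 4x4 matrices.
    Plain Python floats stand in for the mixed FP16/FP32 precision."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) + C[i][j]
             for j in range(4)]
            for i in range(4)]
```

A full GEMM on the GPU is tiled into thousands of these 4x4 fragments, which is where the quoted TFLOPS numbers come from.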

58-page white paper PDF on Tensor Cores:

http://images.nvidia.com/content/volta-architecture/pdf/volta-architecture-whitepaper.pdf

 

 

 


You know this is going to take us back to the old argument about whether pro cards like the Quadros and Radeons are worth the extra coin.

They definitely tend to receive the best chips, cut from the center of the big silicon dies, since the lithography gets less precise toward the edges of the wafers...

But look at this now:

 

NVIDIA Allegedly Launching Monstrous 4352 CUDA core RTX 2080 Ti
According to at least three separate sources, NVIDIA is said to be looking to surprise everyone with the launch of an absolutely monstrous RTX 2080 Ti graphics card, and not just an RTX 2080 as was previously thought.

Additionally, according to TPU, the new gaming flagship features a very slightly cut down version of the big daddy Turing GPU that we saw in all its glory earlier this week at NVIDIA’s keynote. A version that’s in fact very similar to the GPU that the company leverages in its $10,000 Quadro RTX 8000. We’re going to call this chip GT102 for the time being, although TPU alleges that it may actually be called RT102.

So, what exactly are we looking at? Well, the RTX 2080 Ti is said to feature 4352 CUDA cores, 576 Tensor cores, 272 TMUs and 88 ROPs, paired with a 352-bit memory interface and 11GB of 14 Gbps GDDR6 memory for a whopping 616 GB/s of bandwidth. Please be reminded that these specifications are very much rumored and in no way, shape, or form confirmed at this moment.

https://wccftech.com/rumor-nvidia-launching-surprise-rtx-2080-ti-with-4352-cuda-cores-11gb-gddr6-vram/
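The 616 GB/s figure in those rumored specs follows directly from the bus width and memory speed: bits per transfer times transfers per second, divided by 8 to convert to bytes. A quick sanity check:

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bits per transfer times
    transfers per second, converted to bytes."""
    return bus_width_bits * data_rate_gbps / 8

# Rumored RTX 2080 Ti: 352-bit bus, 14 Gbps GDDR6
print(memory_bandwidth_gbs(352, 14))  # 616.0

# For comparison, GTX 1080 Ti: 352-bit bus, 11 Gbps GDDR5X
print(memory_bandwidth_gbs(352, 11))  # 484.0
```

So the rumored card keeps the 1080 Ti's bus width and gets its bandwidth jump purely from faster GDDR6.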

 

From what I've heard from engineers and people in CAD/CAM, the pro cards make sense for engineering apps that demand floating-point precision/correction, but for artists like us a card like this is every bit as good for a fraction of the price.

And the price for all this is $699. I could afford two of them..

Edited by L'Ancien Regime


https://wccftech.com/nvidia-geforce-rtx-2080-ti-and-rtx-2080-specs-leak/

 

NVIDIA GeForce RTX 2080 Ti 11 GB and RTX 2080 8 GB Graphics Cards Core Specifications Confirmed – 2080 Ti With TU102 GPU Rocks 4352 CUDA Cores, 2080 With TU104 Rocks 2944 CUDA Cores

NVIDIA Turing GPU Based GeForce RTX 2080 Ti Comes With 11 GB GDDR6 Memory and 4352 Cores, GeForce RTX 2080 Comes With 8 GB GDDR6 Memory and 2944 Cores

 

[Image: Turing specs table]

[Image: MSI GeForce RTX 2080 Ti DUKE]

[Image: NVIDIA RTX Turing GPU]

 

Edited by L'Ancien Regime


I contacted NVidia support yesterday; no announcement yet on whether the new Turing GeForce GPUs support NVLink or NVSwitch. And from what I've read in the PDFs available from NVidia's site, the NVSwitch is only documented for use with a specific Intel multi-Xeon board (if you're interested, ask me and I'll look it up) with the three Quadro Turing cards. There doesn't seem to be any documentation showing what an actual implementation of the NVSwitch looks like aside from the diagram I've posted below.

[Image: NVSwitch diagram]

[Image: GV100 NVSwitch]

[Image: NVSwitch chip]

Edited by L'Ancien Regime


One Turing is pretty much equivalent to four Voltas... so they actually jumped a year in GPU development, just leapfrogged the Volta and cast it aside, which is pretty amazing. Nvidia is a company in a big hurry.

 

And yes, there is NVLink and some kind of variant of the NVSwitch for the GeForce Turing cards:

 

 

[Image: NVLink spec table]

[Image: NVSwitch spec table]

 

 

This stuff is just getting bizarre...

[Image: NVIDIA RTX diagram]

[Image: NVIDIA DGX-2]

 

Edited by L'Ancien Regime


Well, I'm pretty upset with them price-gouging in a BIG way. I mean, the 1080Ti came out at $799, and here they are charging almost twice that, at $1299. They might as well have slapped a Titan label on it, instead. But then, they would have to offer more VRAM than the measly 11GB (measly for that kind of cash). That's Strike One, IMO. 

I'm going to wait until I see clear benchmarks and full reviews by the Tech Industry. Nvidia is known for hyping up a soggy bowl of Corn Flakes. "This bowl has 30 milliliters of calcium, Vitamin D and Golden Corn goodness. No crunching required" :D

I'm hoping AMD leapfrogs them, when they come out with their next generation. They've beaten the Green Team many times before. Now would be a good time for an encore performance. People can criticize them all they want, but the Vega line did match, if not beat, what NVidia had on the market. The Vega 64 and 56 beat the GTX 1080 and 1070 in many benchmarks. People were expecting them to outdo the 1080Ti, which is basically a Titan.


I was also watching the show yesterday and my jaw dropped when I saw the prices. I'm on the fence about buying the card, but I have to agree that this new tech really gives a lot to game developers. I don't know why they didn't show this video at the show:

 

This is a good example of how RTX makes games look better. Lighting plays an important part in setting the mood of a scene.


2 hours ago, AbnRanger said:

Well, I'm pretty upset with them price-gouging in a BIG way. I mean, the 1080Ti came out at $799, and here they are charging almost twice that, at $1299. They might as well have slapped a Titan label on it, instead. But then, they would have to offer more VRAM than the measly 11GB (measly for that kind of cash). That's Strike One, IMO. 

I'm going to wait until I see clear benchmarks and full reviews by the Tech Industry. Nvidia is known for hyping up a soggy bowl of Corn Flakes. "This bowl has 30 milliliters of calcium, Vitamin D and Golden Corn goodness. No crunching required" :D

I'm hoping AMD leapfrogs them, when they come out with their next generation. They've beaten the Green Team many times before. Now would be a good time for an encore performance. People can criticize them all they want, but the Vega line did match, if not beat, what NVidia had on the market. The Vega 64 and 56 beat the GTX 1080 and 1070 in many benchmarks. People were expecting them to outdo the 1080Ti, which is basically a Titan.

 

I don't feel this is price gouging at all (and it's $1119 USD, not $1219). I should be the one crying; I'll be paying in Canadian pesos. Stop and think how Intel would have done this: they would have held back Turing, put out Volta, squeezed all the profit out of Volta they could, and then a year from now they would have come out with Turing.

If Turing actually does what they say it does, then this is a huge technological jump and an amazing bargain at these prices. RTX with Turing is not just another incremental advancement; this is five years of progress jammed into one. Maybe AMD can catch them, but so far they're not even close. The AMD/ATI vs NVidia race is in no way comparable to the AMD vs Intel race. What I do think is somewhat of a ripoff has been the Quadro prices, historically. As I've said before here, I think, or rather suspect, the whole pro card/gamer card thing is a bit of snake-oil salesmanship, particularly if you're not into high-end engineering.

It's going to be interesting to see how Intel fits into this competition.

Edited by L'Ancien Regime


Until they show some raw benchmarks showing that rendering in Octane, Redshift, V-Ray GPU, or Arnold GPU is 3-6x faster than a GTX 1080 Ti, it's all marketing hype and little more. That inflated price tag is part early-adopter fees and part greed: seeing the inflated prices in the market that crypto miners drove up, they want to be the ones profiting rather than the price-gouging retailers.

Going from $800 to $1200 in a single generation is price-gouging no matter how you slice it. All new-generation cards are supposed to outperform their predecessors; that's how they keep selling more cards. I think they went with Turing rather than Volta because they know AMD is right on their heels and is also offering real-time ray-tracing tech.

 


Well, AMD does have HBM2, the cryptocurrency boom is over to the degree that it's lowering stock prices, and there are Black Friday price reductions to look forward to. Someone should have told Intel that new-generation chips should outperform their predecessors; then they wouldn't be in the fix they're in now.

I'm closely watching the situation. It'll be interesting.

Edited by L'Ancien Regime

On 8/21/2018 at 10:44 AM, AbnRanger said:

Until they show some raw benchmarks that says rendering in Octane, Redshift, VRay GPU, Arnold GPU is 3-6x faster than a GTX 1080Ti, then it's all marketing hype and little more. That inflated price tag is part early adopters fees and part greed (seeing the inflated prices in the market due to data miners driving the prices up...they want to be the ones profiting, rather than the price-gouging retailers).

Going from $800 to $1200 in one single generation is price-gouging no matter how you slice it...all new generation cards are supposed to outperform their predecessors...that's how they keep selling more cards. I think they went with Turing rather than Volta, because they know AMD is right on their heels and is also offering real-time raytracing tech.

 

It appears it's going to be quite a while before AMD can match the current Turings; we're looking at the second half of 2020.

 

 

 

https://www.extremetech.com/gaming/272764-new-amd-gpu-rumors-suggest-polaris-refresh-in-q4-2018

 

Next up, Navi. There’s a rather confused suggestion that Navi will be both a mainstream and high-end part arriving sometime in the 2019 timeframe, and that it will debut in the budget segment first before eventually launching as a high-end, HBM2 equipped part sometime “much later.” The suggested time frame is:

Q4 2018: Polaris 30 (performance up 15 percent).
H1 2019: Navi 10 (budget part, and timing on this introduction is unclear, with additional reference to a Q1 release)
H2 2020: A new, high-end Navi part, as a “true” successor to Vega

 

 

On 8/21/2018 at 6:15 AM, AbnRanger said:

Well, I'm pretty upset with them price-gouging in a BIG way. I mean, the 1080Ti came out at $799, and here they are charging almost twice that, at $1299. They might as well have slapped a Titan label on it, instead. But then, they would have to offer more VRAM than the measly 11GB (measly for that kind of cash). That's Strike One, IMO. 

I'm going to wait until I see clear benchmarks and full reviews by the Tech Industry. Nvidia is known for hyping up a soggy bowl of Corn Flakes. "This bowl has 30 milliliters of calcium, Vitamin D and Golden Corn goodness. No crunching required" :D

I'm hoping AMD leapfrogs them, when they come out with their next generation. They've beaten the Green Team many times before. Now would be a good time for an encore performance. People can criticize them all they want, but the Vega line did match, if not beat, what NVidia had on the market. The Vega 64 and 56 beat the GTX 1080 and 1070 in many benchmarks. People were expecting them to outdo the 1080Ti, which is basically a Titan.

Forbes seems to be quoting your opinions now, haha.

 

https://www.forbes.com/sites/jasonevangelho/2018/08/21/nvidia-rtx-20-graphics-cards-why-you-should-jump-off-the-hype-train/#41dcba773f8e

 

So get the 16-core/32-thread AMD Threadripper, get the 1080 Ti for now, and wait for the next-gen 7nm RTX Nvidia GeForce.


https://www.nvidia.com/en-us/geforce/20-series/ 

Introducing NVIDIA GeForce RTX 20 Series of Graphics Cards 

NVIDIA® GeForce RTX delivers the ultimate PC gaming experience. Powered by the new NVIDIA Turing GPU architecture and the revolutionary RTX platform, RTX graphics cards bring together real-time ray tracing, artificial intelligence, and programmable shading. This is a whole new way to experience games.

[Image: GeForce RTX 20 Series cards]


Nvidia unveil RTX 2070, RTX 2080 and RTX 2080 Ti at Gamescom

Yes, Nvidia is finally moving on from its GTX naming convention after a decade, while also jumping from 10-series to 20-series cards. RTX refers to real-time ray tracing, a computationally expensive process of accurately modelling how light bounces between reflective surfaces in a scene. Ray tracing in real time has been a holy grail for graphics card makers for years, and it looks like Nvidia has become the first to truly crack this photo-realistic lighting effect.

Nvidia is promising ray tracing performance which is six times faster than previous generation graphics hardware, thanks to dedicated RT cores. Nvidia replayed the Star Wars short they released at GDC in March, and CEO Jensen Huang stated that the four Volta V100 data centre GPUs used to originally render the short are outperformed by a single consumer-grade RTX 2080 Ti - impressive stuff.

9 hours ago, L'Ancien Regime said:

Forbes seems to be quoting your opinions now ahah

 

https://www.forbes.com/sites/jasonevangelho/2018/08/21/nvidia-rtx-20-graphics-cards-why-you-should-jump-off-the-hype-train/#41dcba773f8e

 

so get the 16 core 32 thread AMD Threadripper and for now get the 1080 Ti for now and wait for the next gen 7nm RTX Nvidia Geforce in 2018.

Ha. Ha. LOL. Yep. They know me. :D Truth of the matter, though, is that system builders tend to think alike. We know when we are being BS'ed and when we are being price-gouged. Nvidia is a little sore at a lot of the Tech Vloggers/Reviewers who typically don't bow down to them in worship. Same thing with Intel. That's why both of them are catching h3ll right now. They've done both of those things and the Tech Reviewers are calling them out on it.

Actually, I was a little bit off on my previous cost figure for the 1080Ti. It came out at $699, but retailers started price-gouging due to the data miners and the price tag went way above that. NVidia took note and basically said "why should the retailers get all the price-gouging pie? We should be getting the lion's share of it!" And that, my friends, is how we arrived at $1200+ for the successor of the 1080Ti. Nearly double the price.

I've made some stupid choices in my life, but this won't be one of them. 

6 hours ago, AbnRanger said:

Ha. Ha. LOL. Yep. They know me. :D Truth of the matter, though, is that system builders tend to think alike. We know when we are being BS'ed and when we are being price-gouged. Nvidia is a little sore at a lot of the Tech Vloggers/Reviewers who typically don't bow down to them in worship. Same thing with Intel. That's why both of them are catching h3ll right now. They've done both of those things and the Tech Reviewers are calling them out on it.

Actually, I was a little bit off on my previous cost figure for the 1080Ti. It came out at $699, but retailers started price-gouging due to the data miners and the price tag went way above that. NVidia took note and basically said "why should the retailers get all the price-gouging pie? We should be getting the lion's share of it!" And that, my friends, is how we arrived at $1200+ for the successor of the 1080Ti. Nearly double the price.

I've made some stupid choices in my life, but this won't be one of them. 

 

One thing I do notice is that for Boxx and other workstation suppliers, as well as people using photogrammetry programs, 64GB is no longer really enough. 128GB is the new standard, so take the money you save by not buying the top-of-the-line new GPUs or a 32-core CPU and spend it on 128GB of DDR4 RAM.

Edited by L'Ancien Regime


Optane memory is the way, but I don't know if 128GB capacity is available today.

4 hours ago, Carlosan said:

Optane memory is the way, but i dont know if 128GB capacity is available today.

Intel Optane Memory Tested, Makes Hard Drives Perform Like SSDs. ... In short, it's a new memory tier, a faster storage repository for most often used data and meta data, that resides between system memory (RAM) and the main storage subsystem.

 

https://www.forbes.com/sites/davealtavilla/2017/04/26/intel-optane-memory-tested-makes-hard-drives-perform-like-ssds/#3946b7fb6090

 

So you're still going to need RAM even if you have Optane memory.

 

Intel Optane Memory for PCs looks like the average M.2 gumstick and in fact plugs into an M.2 slot on Intel 200 series chipset motherboards (7th gen Kaby Lake or newer). However, it’s designed to cache slower storage volumes like hard drives, offering orders of magnitude faster response times and essentially enabling spinning media to perform more like a high performance SSD in many applications, from workstation and content creation workloads, to gaming, web browsing and even productivity apps. It does this by storing most frequently used data, meta data and access patterns on either a 16GB or 32GB Optane Memory stick, allowing the system to make far fewer trips to a much slower hard drive for data access.

 

For now it's just for caching and speeding up hard drives/solid-state drives with an Intel 7th-generation processor. We still need RAM; DRAM latency is still lower than Optane's.
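The tiering Optane implements, keeping the most frequently used blocks on fast media and falling back to the slow drive on a miss, is the classic cache pattern. A toy LRU sketch of that idea (hypothetical names; nothing here is Optane-specific):

```python
from collections import OrderedDict

class BlockCache:
    """Toy LRU block cache: a small fast tier in front of a slow backing store."""
    def __init__(self, backing_store, capacity):
        self.backing = backing_store      # dict-like slow storage
        self.capacity = capacity
        self.fast = OrderedDict()         # block id -> data, in LRU order
        self.hits = self.misses = 0

    def read(self, block_id):
        if block_id in self.fast:
            self.fast.move_to_end(block_id)   # mark most recently used
            self.hits += 1
            return self.fast[block_id]
        self.misses += 1
        data = self.backing[block_id]         # slow path: go to the drive
        self.fast[block_id] = data
        if len(self.fast) > self.capacity:
            self.fast.popitem(last=False)     # evict least recently used
        return data
```

Once the working set fits in the fast tier, nearly every read is a hit, which is why a 16GB or 32GB stick can make a spinning drive feel like an SSD for everyday workloads.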

Soooo...just save up your empties and maybe you'll be able to afford..

[Image: high-capacity DDR4 RAM kit]


https://wccftech.com/nvidia-dlss-explained-nvidia-ngx/

 

Standing for Neural Graphics Acceleration, it's a new deep-learning-based technology stack that is part of the RTX platform. Here's a brief description from NVIDIA:

 

NGX utilizes deep neural networks (DNNs) and a set of Neural Services to perform AI-based functions that accelerate and enhance graphics, rendering, and other client-side applications. NGX employs the Turing Tensor Cores for deep learning-based operations and accelerates delivery of NVIDIA deep learning research directly to the end-user. Note that NGX does not work on GPU architectures before Turing.

 

 


https://wccftech.com/exclusive-nvidia-rtx-series-msrp-pc-cost/

 

Potentially some bad news...

 

This story will go into one of the biggest problems stopping AIB (Add-in-Boards) partners from achieving MSRP pricing and what this has to do with President Trump’s trade tariffs.

…there is a 10% tariff impacting $200B of goods that is scheduled to take effect on 10/1/2018. Every Monday there is an update on whether there is any progress made by US and China in the negotiations. If the tariff does take effect on 10/1, then […] would try to move assembly and testing over to Taiwan in order to avoid the tariff but most likely there would have to push back shipment lead times or else raise prices while they get it all sorted out – Red feathered bird.

7 hours ago, L'Ancien Regime said:

https://wccftech.com/exclusive-nvidia-rtx-series-msrp-pc-cost/

 

Potentially some bad news...

 

This story will go into one of the biggest problems stopping AIB (Add-in-Boards) partners from achieving MSRP pricing and what this has to do with President Trump’s trade tariffs.

…there is a 10% tariff impacting $200B of goods that is scheduled to take effect on 10/1/2018. Every Monday there is an update on whether there is any progress made by US and China in the negotiations. If the tariff does take effect on 10/1, then […] would try to move assembly and testing over to Taiwan in order to avoid the tariff but most likely there would have to push back shipment lead times or else raise prices while they get it all sorted out – Red feathered bird.

So, Nvidia can jack up the price of a 2080 Ti (over the 1080 Ti) by 2x and it's not that big of a deal, but a 10% tariff on a portion of Chinese parts is? I'd really like to see their math explaining this equation, especially since Taiwan is where most of the parts come from, and Taiwan is not China, the last time I checked.


