3DCoat Forums

NVIDIA RTX Technology: Making Real-Time Ray Tracing A Reality For Games




  • Advanced Member

All of this is mostly relevant to servers, data centers, and cloud computing, where there are many tenants per host. The fear is that one tenant could potentially eavesdrop on the others using those CPU exploits. For personal use on a workstation at home, I wouldn't sweat it. Yes, a local exploit could be used locally, but then what would they listen to? If they're in your system, they already have access to everything anyway. A good antivirus will do just fine for home use.

  • Like 1

  • 2 weeks later...

https://www.hardocp.com/article/2018/11/21/rtx_2080_ti_fe_escapes_testing_by_dying_after_8_hours/

RTX 2080 Ti FE Escapes Testing by Dying After 8 Hours

We have spent over $4000 on NVIDIA RTX cards, on both 2080 Founders Edition and 2080 Ti Founders Edition video cards for testing here at HardOCP. As it stands now, two of our three 2080 Ti FE cards have died on us. Another one just bit the dust, and with very little fanfare this time.

  • Like 1

  • Advanced Member

I was all set to buy a new short-term rig this weekend to serve in the interim, but now that's all up in the air. The 1080 Tis are good, but they're still commanding a high price despite the glut caused by the falloff in cryptocurrency mining. Those 2070s were annoying anyway in that they didn't allow for SLI or NVLink. What a scam.

And there are virtually no good Black Friday deals on components at all; the best I found was on some low-end Asus mobos that were undesirable at any price.

 

 

I guess I'll just have to wait some more and just surf the web on my $300 Dell Intuos craptop. 

 

:(

 

I suppose I could make do with this for now, but I just sense something really good is coming soon:

 

https://www.amazon.com/AMD-Radeon-Vega-Frontier-Edition/dp/B072XKTT7C/ref=sr_1_3?s=electronics&ie=UTF8&qid=1543014738&sr=1-3&keywords=AMD+Radeon+RX+Vega+64

https://www.amazon.ca/AMD-Ryzen-Processor-Wraith-Cooler/dp/B07B428M7F

https://www.anandtech.com/show/13020/msi-new-threadripper-2-motherboard-on-steroids-x399-creation

 

And so on...

 

Edited by L'Ancien Regime

  • Advanced Member

L'Ancien Regime, I bought that exact same processor (Ryzen 2700X) earlier today from the link you listed at Amazon.ca for $344; I guess the price went back up later today.

Bought this on your recommendation, ty! It's not a Threadripper, but it'll do pretty well, I think.

 

 

Edited by Nossgrr
  • Like 1

  • Advanced Member
1 hour ago, Nossgrr said:

L'Ancien Regime, I bought that exact same processor (Ryzen 2700X) earlier today from the link you listed at Amazon.ca for $344; I guess the price went back up later today.

Bought this on your recommendation, ty! It's not a Threadripper, but it'll do pretty well, I think.

 

 

Nope, that's one of the Threadrippers. Congrats. What mobo will you use with it? My EE friend recommended that MSI X399 Creation motherboard I listed. And the price works out the same: you got yours for $344 USD, which is $424.99 CDN.

Edited by L'Ancien Regime

  • Advanced Member

I just bought an MSI Armor RTX 2070 ($550), and so far it's been working great. Otoy just released version 4 of OctaneBench, so I have been running the card through the bench. Right now it scores just below a GTX 1080 Ti, by about 4 to 6 points (213). Not bad with no RT cores being used; according to Otoy's testing, the cards can get twice the speed when the RT cores are used, and that's not optimized yet. The 2070 is twice as fast as my old GTX 980 and has double the VRAM, so I'm happy for now. In fact it's not much below a 2080, about 15 points. :D

https://render.otoy.com/octanebench/

 

  • Like 1

  • Advanced Member

Here's a Cyber Monday deal:

 

https://www.amazon.de/AMD-506048-Radeon-GDDR5-Grafikkarte-Grafikkarten-GDDR5-Speicher/dp/B072HTFZM4/ref=pd_sbs_147_8?_encoding=UTF8&pd_rd_i=B072HTFZM4&pd_rd_r=74fd97f1-efd1-11e8-8f7c-734f98c58a09&pd_rd_w=XapA4&pd_rd_wg=qoyfR&pf_rd_i=desktop-dp-sims&pf_rd_m=A3JWKAKR8XB7XF&pf_rd_p=51bcaa00-4765-4e8f-a690-5db3c9ed1b31&pf_rd_r=0KF4GHNEC5CTHKK8QQ6H&pf_rd_s=desktop-dp-sims&pf_rd_t=40701&psc=1&refRID=0KF4GHNEC5CTHKK8QQ6H

 

[Attached screenshot: card overheating]

Really, really wanted to like this card, but...

The PROs...

Cosmetically, it's a handsome-looking card in a stately, blockish, rather plain blue and black metal casing. I prefer its looks to all the nonsensical glowing and flashing LED frenzy provided by so many other graphics cards; I don't need the name of a graphics card lit up while it sits inside a computer case where I'm not going to be looking at it.

Feature-wise, the combination of two GPUs, each with its own 16 GB of video memory for a total of 32 GB, just screams "value" at you, especially given the $900 to $1000 price.

The CONs...

This card all too easily becomes insanely hot! Just give it enough to do, especially something involving both GPUs, and within a mere minute or two the all-metal outer casing becomes so hot that you actually cannot touch it. In fact, it got so hot that I considered putting it by itself in one of those external GPU boxes linked by Thunderbolt 3, because I was afraid of it cooking chips underneath it on the motherboard and cooking other cards in neighboring slots.

I decided I wanted to at least quantify the issue by getting a temperature reading, but for some reason AMD's "Radeon Pro" driver software does not seem to include any way to report the card's temperature. I poked all around the menus looking for something that would tell me about the condition of the card, but there was nothing. WHY IS THAT?

 

 

I really, really wanted to like this card, but I have to say DO NOT BUY, at least not until something is done about this problem!

 

 

 

 

I thought we were supposed to be enjoying the benefits of progress!

 

Tons of memory and decent clocks for a professional card.

However, the secondary GPU on mine refused to run anywhere near the clock of the primary, maxing out at 400 MHz for some reason. Not sure why, and I received zero response from AMD on the matter.

 

 

 

And the prices on the 1080 Tis are out of this world. I guess everyone has figured out that the 2000 series is a bust or something, so sellers can still command insane prices for the 1080 Tis.

 

https://www.amazon.ca/s/ref=nb_sb_noss_1?url=search-alias%3Daps&field-keywords=1080ti

 

And no hope on the horizon from Navi at 7nm?

 

 

Edited by L'Ancien Regime
Link to comment
Share on other sites

  • Advanced Member

 

Everything about buying a new computer is dirt cheap right now... EXCEPT the GPU, and that's turning out to be insane. 1080 Tis are like $1600 Canadian new. Insane. I hope they solve this problem fast so I can get a decent 2070 at a reasonable price. Or maybe just go buy a Titan V (Volta) for $5400 Canadian?


  • Reputable Contributor
1 hour ago, L'Ancien Regime said:

 

Everything about buying a new computer is dirt cheap right now... EXCEPT the GPU, and that's turning out to be insane. 1080 Tis are like $1600 Canadian new. Insane. I hope they solve this problem fast so I can get a decent 2070 at a reasonable price. Or maybe just go buy a Titan V (Volta) for $5400 Canadian?

I'm totally happy with the 1080 for now, and I usually don't upgrade to a new card unless there is a doubling of performance. It seems a waste of $$$ to upgrade for just 10-25% improvements. Having said that, I stumbled on a Newegg Black Friday deal on a first-gen Threadripper 1950X for $450 USD. I had to jump on that. I was going to wait until next year to see what transpires after AMD hits the 7nm mark on the CPU, but this was too good to pass up. I just hope 3DCoat is able to leverage the extra cores.

I remember Andrew stating that in a recent build memory usage in the Paint room is much better, so I did a little test, and it seems 3DCoat isn't really putting more than one or two cores to work in the most recent build. I'll have to do some more testing to see if it's an anomaly or what.


  • Advanced Member

I assume all modeling tools use only one or two threads at most, like most games. Multiple cores are for rendering and baking new textures. You're going to have a blast with Substance Designer, and whatever render engine you use, with all those threads and render buckets.


  • Reputable Contributor
26 minutes ago, L'Ancien Regime said:

I assume all modeling tools only use one or two threads at most, like most games.  Multi cores are for rendering, and cooking up new textures. You're going to have a blast with Substance Designer and whatever render engine you use with all those threads and render buckets.

PhoenixFD uses only the CPU for simulations, as does 3ds Max's Bifrost fluids. Of course, Arnold is also currently CPU-only (though a GPU engine is still in development), so more cores can make rendering with that engine much more tolerable and feasible. As for 3DCoat, it's heavily multi-threaded too, but I don't know how much benefit that many cores will bring. There is probably a threshold of diminishing returns involved... and I'm still a little concerned that, with 3DCoat's multi-threading code using Intel's TBB (Threading Building Blocks), there could still be a bit of Intel "crippling" of AMD CPUs. IIRC, AMD took Intel to court to stop that, but who knows to what extent Intel ever complied?

There are still many calculations that are single-threaded, by necessity I think. I'm using a Ryzen 2700X now, and it's a stout performer. I use Vue from time to time, and its main renderer is CPU-based, so it's hard to get away from the need for more CPU performance.
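That "threshold of diminishing returns" can be made concrete with Amdahl's law: if only part of a workload runs in parallel, each extra core helps less than the last. A minimal sketch (the 95% parallel fraction below is just an illustrative assumption, not a measured figure for 3DCoat or any renderer):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when only `parallel_fraction` of the work scales across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even at 95% parallel code, doubling cores buys less and less:
for n in (4, 8, 16, 32):
    print(n, "cores ->", round(amdahl_speedup(0.95, n), 1), "x")
```

At 32 cores the ideal speedup is only about 12.5x, which is why the single-threaded remainder, not the core count, ends up being the bottleneck.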


  • Advanced Member
13 hours ago, AbnRanger said:

PhoenixFD uses only the CPU for simulations, as does 3ds Max's Bifrost fluids. Of course, Arnold is also currently CPU-only (though a GPU engine is still in development), so more cores can make rendering with that engine much more tolerable and feasible. As for 3DCoat, it's heavily multi-threaded too, but I don't know how much benefit that many cores will bring. There is probably a threshold of diminishing returns involved... and I'm still a little concerned that, with 3DCoat's multi-threading code using Intel's TBB (Threading Building Blocks), there could still be a bit of Intel "crippling" of AMD CPUs. IIRC, AMD took Intel to court to stop that, but who knows to what extent Intel ever complied?

There are still many calculations that are single-threaded, by necessity I think. I'm using a Ryzen 2700X now, and it's a stout performer. I use Vue from time to time, and its main renderer is CPU-based, so it's hard to get away from the need for more CPU performance.

Speaking of Arnold GPU ...

 

 

 


  • 4 weeks later...
  • Advanced Member

So it's looking like there won't be any Navi until late 2019 to save us from Nvidia. The AMD release at CES in January seems to be oriented towards a 7nm Vega 2.

 

The GPU is the last of the problems I need to solve in buying my new rig and it's a thorny one. What is the optimal price to performance config right now?

 

If you can put aside the 1-3% of RTX 2070s, 2080s, and 2080 Tis that are turning into bricks and just look at the price/performance, I've discovered something interesting.

 

Running a V-Ray render test, it appears that even though there's no NVLink for 2070s, running two 2070s gets you far better price/performance than a single RTX 2080 Ti (2080 Tis are running around $1800-$1900 CDN).

So $1300 CDN for two $650 RTX 2070s gets you a 33-second V-Ray benchmark render. $1300 is the price of a single GTX 1080 Ti in Canada.

And $1900 CDN for one RTX 2080 Ti gets you a 52-second V-Ray benchmark render.

 

Furthermore, if you match a single 2070 against a single 2080, or dual 2070s against dual 2080s, the 2070 WINS!!

And this despite the 2080 being $300-$400 more in Canadian money. How bizarre is that?

 

If anybody has better info on this than me please tell me how I'm wrong.
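The comparison above can be sanity-checked with a quick script using the numbers quoted in this post (CDN prices and V-Ray benchmark times). Multiplying price by render time gives a crude "dollar-seconds" score where lower means more rendering per dollar:

```python
# Numbers quoted above: price in CDN, V-Ray benchmark render time in seconds.
configs = {
    "2x RTX 2070":    {"price": 1300, "seconds": 33},
    "1x RTX 2080 Ti": {"price": 1900, "seconds": 52},
}

# Lower price * time means more rendering throughput per dollar spent.
for name, c in configs.items():
    score = c["price"] * c["seconds"]
    print(f"{name}: {score} dollar-seconds")
```

By this metric the dual-2070 setup (42,900) delivers roughly 2.3x the value of the single 2080 Ti (98,800), consistent with the claim above.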

Check it out;

https://www.pugetsystems.com/labs/articles/V-Ray-NVIDIA-GeForce-RTX-2070-2080-2080-Ti-GPU-Rendering-Performance-1242/

[Attached screenshots: V-Ray benchmark results, vray1-vray5]

Edited by L'Ancien Regime
  • Like 1

  • Reputable Contributor

That's really strange, because the 2080 has more CUDA cores and is clocked a bit higher. I think it points to drivers not yet being optimized to take advantage of the extra cores or something; it doesn't make any sense otherwise. But yeah, 2x 2070s > 1x 2080 Ti in terms of GPU rendering horsepower. Cheaper, too. The 2070 is $499 USD, whereas the 2080 Ti is $1299. I will say this: a 2070 is just barely an improvement over the GTX 1080, which can be had used on eBay for roughly $300-350. So if you can find a used one that's less than a year old (ask for the original receipt for warranty purposes), you'd have two more years of warranty left.


  • Advanced Member
8 minutes ago, AbnRanger said:

That's really strange, because the 2080 has more CUDA cores and is clocked a bit higher. I think it points to drivers not yet being optimized to take advantage of the extra cores or something; it doesn't make any sense otherwise. But yeah, 2x 2070s > 1x 2080 Ti in terms of GPU rendering horsepower. Cheaper, too. The 2070 is $499 USD, whereas the 2080 Ti is $1299. I will say this: a 2070 is just barely an improvement over the GTX 1080, which can be had used on eBay for roughly $300-350. So if you can find a used one that's less than a year old (ask for the original receipt for warranty purposes), you'd have two more years of warranty left.

That's interesting. I've had friends tell me that in person, but I'm shying away from eBay for two reasons: the card could have been run hard, and my friends and I are all sick of the bidding trickery that goes on there.

 

This vid just went up; it's the latest and most reliable word on CES and NAVI announcement. The NAVI info starts around 15:30 mark.

 

 


  • Reputable Contributor
1 hour ago, L'Ancien Regime said:

That's interesting. I've had friends tell me that in person, but I'm shying away from eBay for two reasons: the card could have been run hard, and my friends and I are all sick of the bidding trickery that goes on there.

 

This vid just went up; it's the latest and most reliable word on CES and NAVI announcement. The NAVI info starts around 15:30 mark.

 

 

You can normally spot a miner selling their cards. Some of them are honest enough to say it, something like "Bought these in _____ this year and only used them for a few months." But for those who try to hide it, you can look at their sales feedback and see if they sold multiple cards like it. That tells you they were probably miners, running them all the time.

However, there are plenty who haven't, and you can double-check what they are currently listing as well. It would be really hard for a miner to hide from a savvy eBay user. Generally, miners preferred the 1070 because of its lower power draw than the 1080 or 1080 Ti, so most 1080s and 1080 Tis are going to be from normal users. I rarely buy new. What I would recommend is getting 2x 1070s using the techniques I mentioned, to avoid buying from a miner. Right now, immediately after Christmas, the buying market is slow, so you can easily get a GTX 1070 for about $200 USD if you just put the ones you are interested in on your watchlist. Have a little patience and give yourself two days to find the right deal. If you can catch auctions ending late at night or early in the morning, especially a weekend morning, you will probably get the card you're looking for at a price you're OK with.

 


  • Reputable Contributor

...2x 1070s would be the best GPU rendering combo in terms of bang for your buck. I recently got a 1080 Ti on eBay, but I sort of regret not just going with 2x 1070s, as they would blow a single 1080 Ti out of the water and still cost less.

  • Like 1

  • Advanced Member

 

He says the RTX 2070 is comparable to the GTX 1080, but look at the Canadian prices and comparative performance: the 2070 is almost 12% faster and 36% cheaper. And note, once again the 2080 underperforms; it beats the 2070 by only ONE SECOND, for all that extra money you have to pay.

 

And look at the price/performance of the 1070 Ti: it's 6% more expensive and 16% slower than the 2070.
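For anyone who wants to redo these comparisons from the raw numbers in the screenshots, the percentages work out from two small helpers. The sample figures in the usage lines below are hypothetical placeholders just to show the arithmetic, not the actual benchmark values:

```python
def pct_faster(new_seconds: float, old_seconds: float) -> float:
    """Percent reduction in render time: positive means `new` is faster."""
    return (old_seconds - new_seconds) / old_seconds * 100.0

def pct_cheaper(new_price: float, old_price: float) -> float:
    """Percent reduction in price: positive means `new` is cheaper."""
    return (old_price - new_price) / old_price * 100.0

# Hypothetical placeholder numbers, just to demonstrate the calculation:
print(round(pct_faster(44, 50), 1))   # 12.0 -> "12% faster"
print(round(pct_cheaper(640, 1000)))  # 36   -> "36% cheaper"
```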

 

 

 

[Attached screenshots: GTX 1080, RTX 2070, and GTX 1070 Ti price/performance comparisons]

 

 

Edited by L'Ancien Regime
