Carlosan Posted March 12, 2016
NVIDIA is on the verge of launching its first 16nm FinFET-based products, featuring the highest performance and power efficiency to date. Pascal, the name of its upcoming GPU architecture, is built from the ground up to deliver not only the best gaming performance but also the best compute performance. NVIDIA will present its Pascal GPUs at GTC 2016, but we may already have a hint at when the consumer variant will launch: Benchlife reports that NVIDIA is prepping the GeForce GTX 1080 for a May release.
Aleksey (Advanced Member) Posted March 12, 2016
Can they at least come up with a new heatsink design? They all look the same... gaaawwd.
PolyHertz Posted March 12, 2016 (edited)
Looks like the article is just using an old stock image, since there are no pics of the new generation of cards. It's too bad if the cards really do end up using GDDR5 again instead of HBM2, considering how much the next generation of GPUs has been hyped up.
AbnRanger (Reputable Contributor) Posted March 12, 2016
In reply to PolyHertz: They were supposed to have gone to HBM with the next generation; it was on NVIDIA's roadmap before AMD came out with theirs. I was expecting a huge leap forward because AMD's Fury X cards really upped the ante.
L'Ancien Regime (Advanced Member) Posted March 16, 2016
In reply to AbnRanger:
http://wccftech.com/amd-radeon-pro-duo-announced-worlds-fastest-graphics-card-16-teraflops-compute/
https://youtu.be/tuCuRiGUV6U
Nossgrr (Advanced Member) Posted March 18, 2016
I was disappointed to hear they're going GDDR5 as well. But to be honest, I had a feeling the promised 10x performance was not going to materialize; I mean, why give us X times faster this year when they can milk this for years to come, right? I'd like to see AMD go all in with HBM2 and leave NVIDIA's GDDR5 cards in the dust. I've been on team NVIDIA for over a decade now, but I'd switch teams in an instant if AMD's offerings were X times faster and not just 20-25%.
PolyHertz Posted March 18, 2016
Seems they're using a variation called GDDR5X that's apparently faster than regular GDDR5. Still not as fast as HBM, but at least it's better than what they've been using until now.
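As a rough way to see why the thread cares about GDDR5 vs GDDR5X vs HBM: theoretical peak memory bandwidth is just bus width times effective data rate. A minimal sketch, using the publicly listed 2016-era specs for a few of the cards mentioned in this thread (figures are illustrative, not an exhaustive comparison):

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Illustrative configurations from public spec sheets:
cards = {
    "GTX 980  (GDDR5,  256-bit @ 7 Gbps)":  bandwidth_gbs(256, 7),    # 224 GB/s
    "GTX 1080 (GDDR5X, 256-bit @ 10 Gbps)": bandwidth_gbs(256, 10),   # 320 GB/s
    "Fury X   (HBM,   4096-bit @ 1 Gbps)":  bandwidth_gbs(4096, 1),   # 512 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

So GDDR5X on the same 256-bit bus buys a real bandwidth bump over GDDR5, while HBM gets there instead via an extremely wide bus at a low per-pin rate.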
Nossgrr (Advanced Member) Posted March 18, 2016
In reply to PolyHertz: From what I've read online, GDDR5X won't be ready in time for NVIDIA's or AMD's summer lineup, which leaves us with GDDR5. So far, the only card I've seen from NVIDIA with HBM2 is the Titan 1000 series; all the rest are GDDR5. I'd hoped the GTX 1070 and up would use HBM2. I guess we'll know more the first week of April during the GPU conference; AMD and NVIDIA are supposed to make announcements.
PolyHertz Posted May 7, 2016
So it looks like they'll be using 8GB of GDDR5X for the 1080 and 8GB of regular GDDR5 for the 1070. Benchmarks have also started coming out: http://videocardz.com/59558/nvidia-geforce-gtx-1080-3dmark-benchmarks
PolyHertz Posted May 7, 2016
Some more info:
- The 1080 costs $599, releases on May 27, and is touted as being faster than two regular 980 cards in SLI.
- The 1070 costs $379, releases on June 10, and is touted as being faster than the Titan X.
Nossgrr (Advanced Member) Posted May 7, 2016 (edited)
I just watched the whole live stream tonight; impressive numbers and prices.
http://www.engadget.com/2016/05/06/nvidias-new-gtx-1080-gpu-is-even-faster-than-the-titan-x/
$599 for the GTX 1080, $379 for the GTX 1070, both available starting May 27. OK, now I need to see AMD's hand... PolyHertz beat me to it.
PolyHertz Posted May 17, 2016
Review embargo is up for the 1080 (nothing for the 1070 yet):
http://anandtech.com/show/10326/the-nvidia-geforce-gtx-1080-preview
http://arstechnica.co.uk/gadgets/2016/05/nvidia-gtx-1080-review/
http://www.eurogamer.net/articles/digitalfoundry-2016-nvidia-geforce-gtx-1080-review
http://www.engadget.com/2016/05/17/nvidia-geforce-gtx-1080-review/
http://www.hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review
AgentSam (Member) Posted May 17, 2016
The number of display outputs on the GTX 1080 is disappointingly low-end. Here's a quote from TechPowerUp: "Display outputs include three DisplayPort 1.4 connectors, one HDMI 2.0b, and one dual-link DVI." How am I supposed to hook a 3x2 multi-display setup to that without an external muxer or an additional second card? In any case, I'm somewhat disappointed now.
Cheers, AgentSam
AbnRanger (Reputable Contributor) Posted May 17, 2016
In reply to AgentSam: In terms of features everyday CG artists can use, that really wouldn't factor into their decision whether to buy or not; just the raw horsepower would.
Nossgrr (Advanced Member) Posted May 18, 2016
Quoting AgentSam: "How am I supposed to hook a 3x2 multidisplay setup to that without an external muxer or an additional second card."
That's a very specialized setup. I think that for artists (and gamers) the GTX 1080 has pretty much everything you'd want.
Carlosan (Author) Posted May 25, 2016
Meet the Pascal family: NVIDIA reportedly working on a GP102-based GTX Titan and GTX 1080 Ti. Exploring NVIDIA's Pascal architecture.
PolyHertz Posted May 29, 2016
Review embargo has now ended for the 1070: http://videocardz.com/60574/nvidia-geforce-gtx-1070-reviews
PolyHertz Posted July 7, 2016
The GeForce GTX 1060 has been revealed. It costs $300 (Founders Edition), with an eventual drop to $250 (MSRP). It comes with 6GB of VRAM and does not support SLI. The card and official reviews will become available on July 19th. Leaked benchmarks, though, put its performance somewhere between the GeForce 970 and 980.
Nossgrr (Advanced Member) Posted July 8, 2016
The NVIDIA offerings are very expensive here: $900 for a GTX 1080, $650 for a GTX 1070. If I do a bit of extrapolation, the GTX 1060 will be around $400+ here. On AMD's side, the RX 480 with 8GB of RAM is $280. I'm currently leaning towards AMD, but I'll let this simmer for a few weeks and see what the real performance numbers are for the GTX 1060. Hopefully we'll have a better idea of its retail price by then.
Rebelismo (Member) Posted July 8, 2016
16 hours ago, Nossgrr said the local prices were $900 for a GTX 1080 and $650 for a GTX 1070.
$650 for a 1070 sounds extremely harsh. Where are you located, and why are the prices that much higher?
Nossgrr (Advanced Member) Posted July 8, 2016
In reply to Rebelismo: It is. I'm in Canada, and our dollar's exchange rate with the US is taking a pounding lately.
Hey, I noticed that 3DCoat 4.7 was released today. If there's a board member who already has an RX 480, maybe they can chime in on performance? Same goes for 1070/1080 owners.
PolyHertz Posted July 22, 2016
The new TITAN card was just announced and is being released on August 2nd for $1200. Oddly enough, they're simply reusing the "Titan X" name, so be careful when buying: make sure you're getting the 2016 Pascal-based model instead of the 2015 Maxwell-based one.
https://www.pcper.com/news/Graphics-Cards/NVIDIA-Announces-GP102-based-TITAN-X-3584-CUDA-cores
Carlosan (Author) Posted July 24, 2016
Rebelismo (Member) Posted July 25, 2016
I usually advocate for beefy cards, but at this price tag I expected more than a 30% performance increase over the 1080 and 16GB of RAM. Is the 1080 Ti going to happen at all now? There doesn't seem to be a lot of wiggle room left for a Ti model. On the other hand, this new Titan X makes a good case for just getting two 1070s or 1080s.
Mencre (Member) Posted July 28, 2016
I think an overclocked 1070 would be better and cheaper.
Rebelismo (Member) Posted August 3, 2016
In reply to Mencre: I wouldn't personally overclock cards for rendering, as I've seen overclocked cards burn out and die. I can't wait until the CUDA 8.0 toolkit is released and we see some true rendering benchmarks. I believe you might be right about the 1070 being the best value for the money, though.