3DCoat Forums

L'Ancien Regime

Advanced Member
  • Posts: 2,188
Everything posted by L'Ancien Regime

  1. I sure wish all these GPU guys, be they Nvidia or AMD, would at least throw content creators a few bones with their publicity, especially with these high-end cards. I really don't care about playing The Division or Final Fantasy. If we're expected to fork out this kind of money for a piece of technology, they could at least print a few paragraphs on how it runs with Arnold or Renderman or Keyshot.
  2. I wonder where that new Radeon VII would fall on that graph, with its 16 GB of HBM2 VRAM? And how do you find AMD's Radeon ProRender for SSS and caustics? Does it measure up to something like Maxwell Render? I'm reading that for things like particle cloud renders the CPU is still superior due to its math engines like Embree, and Embree also runs on Ryzen CPUs as well.
    https://software.intel.com/en-us/rendering-framework
    https://software.intel.com/en-us/articles/embree-highly-optimized-visibility-algorithms-for-monte-carlo-ray-tracing
    "All recent AMD CPUs support Embree, including Ryzen. Performance in V-Ray (for Max) is in between an 8-core and a 10-core i7. At least on a very old release of V-Ray it was better to avoid mixing AMD and Intel when caching the irradiance map, because some parts of the computation were random and behaved slightly differently on the two hardware platforms. Don't know if this is a problem with the current release. Anyway, you can always save GI maps on a single system and distribute the render for the final frames; this should always work." http://forum.vrayforc4d.com/index.php?threads/19169/
    CPUs = complex, elegant solutions. GPUs = brute-force, simplistic solutions. Or is that a concept that's now five years out of date?
  3. Aren't the algorithms on a CPU more complex, thanks to its hard-coded math engines, compared to those on a GPU, no matter how fast the GPU is? Based simply on noise elimination, BOXX says GPU rendering is over six times faster than CPU rendering:
    https://blog.boxx.com/2014/10/02/gpu-rendering-vs-cpu-rendering-a-method-to-compare-render-times-with-empirical-benchmarks/
    But that's only one criterion.
    https://www.fxguide.com/featured/look-at-renderman-22-and-beyond/
    It would seem that they've attained parity, at least in specific render software.
    https://renderman.pixar.com/news/renderman-xpu-development-update
    So if that's the case, what is the correct way to proceed in the purchase of a new rig? What is the optimal price-performance configuration, especially if you've been frustrated by the shader/texture test-render bottleneck and want to make that workflow more agile and responsive?
  4. So this is interesting... Nvidia is coming out with a GTX 1180, a graphics card that doesn't have the RT or other specialized cores in it, at a cheaper non-RTX price.
    "New leaks reveal that Nvidia's GTX 1180 will support higher refresh rates through one VR-friendly connection, making it possible to render 4K@120 Hz content for each eye, and this would also apply to the TV-sized G-Sync 4K monitors. Price-wise, the GTX 1180 will not match the original launch MSRP of the GTX 1080, as there will be two versions and the most affordable one is supposed to cost US$999." (Bogdan Solca, 2018/06/15)
    "Nvidia's CEO Jensen Huang claimed at Computex this year that the next gen gaming GPUs will be released 'a long time from now', but trusted sources have already indicated that the new GTX 11xx series should be announced in late July / early August, with mobile GPU versions expected to land some time in Q4 2018. Huang most likely did not want to spoil a larger marketing scheme and had to cut it short for people who were expecting teasers. Now, according to Tom's Hardware's anonymous sources, the upcoming GPUs should integrate a brand new VR-friendly connector that allows for much higher refresh rates over a single cable. This should translate to 120 Hz per eye at 4K resolution, probably delivered through a new HDMI 2.1 output. Previous reports claimed that the next gen GPUs from Nvidia would be priced quite similarly to the launch MSRPs of the GTX 1080 series, but the latest info from TweakTown suggests that the GTX 1180 will come in two variants: a US$999 model and a US$1,499 model with more VRAM. Nvidia will probably start selling its Founders Editions in early August, while third-party integrators could start shipping their custom versions in September. The updated Quadro professional lineup is also expected to make an appearance at Siggraph in August."
    I'd rather get an RTX 2070 at under $700 than get this GTX 1180 at $999 or $1,499.
  5. Yep, and for me, like you, only the renders count. Games disappoint me. The AI, game design, and aesthetics have for the most part failed, that is, unless you love shooting and blowing stuff up. With 16 GB of VRAM that thing is actually very good value for the money.
  6. But what about the Wraith Ripper? http://www.coolermaster.com/cooling/cpu-air-cooler/wraith-ripper/ And thanks for all that info on the heatsinks. You're also right that the 2990WX's problems are solely due to Microsoft; here are the Linux results on those renders. https://www.phoronix.com/scan.php?page=article&item=amd-2920x-2970wx&num=9
  7. http://www.entagma.com/building-your-own-houdini-workstation/#comment-20906 This guy is always brilliant.
  8. Thanks Nossgrr. I've been really studying up for my next build and it's a revelation every day. I was prepared to go big bucks, even for the 2990WX, but it seems that thousands of extra dollars don't necessarily buy you a proportional increase in performance. I just came across this sobering fact; that's crazy. The 2990WX would be worth it if it gave you double the speed on renders (imagine 64 render buckets all going at 4 GHz!), but because of the way AMD took its 64-thread EPYC CPU and crippled memory access on half of the four chiplet modules, it just doesn't perform the way you'd think all those extra threads and render buckets should. https://bitsum.com/portfolio/coreprio/ It appears to be mainly oriented towards scientific and computational researchers. The 1950X is by far the superior buy.
  9. I posted this a couple of days ago in the RTX Nvidia thread near the top of the page; check out the 2070 performance. There's no NVLink or SLI for the 2070, yet two of them outperform the 2080 Ti. Two 2070s cost $1,300 Cdn; one 2080 Ti costs $1,900 Cdn. So without any other data I'd say the answer to your question would be "YES", at least when it comes to rendering. For gaming I don't know.
  10. I would go with the more recent one because of certain possibly important subtleties in the UEFI. (I'm not pretending to be an expert here or anything; I'm just studying up on all these complexities myself right now in preparation for buying a 2950X and an X399 motherboard. I'm even tempted to buy a 2990WX, but there's a lot of sophisticated stuff in that high-end CPU that baffles even the experts, so I'm getting sort of intimidated at the prospect of buying it and trying to make it work properly.) For example, the later version will probably have better XMP support, so that with a simple switch in the UEFI you get the full rated speed of your RAM. Extreme Memory Profile (XMP) is a specification developed by Intel as an extension to the standard JEDEC SPD specifications. It's intended to make overclocking easier and more accessible to new users through predefined profiles that are known to be stable. It applies only to memory; it will overclock the CPU only if you choose a CPU speed under the XMP settings. XMP is proprietary to Intel, so check closely with your motherboard to see if it's supported for AMD too; it probably is if it's a second-gen board like the X399s. You don't want to buy 3200 MHz RAM and never be able to use all that extra speed you paid for.
  11. So the plot gets even more interesting: Nvidia is in a lot of trouble as its business model becomes less and less viable, despite the propaganda.
    1. Far from being a viable choice for big data centers, Nvidia is being pushed out by superior custom solutions, particularly by Google.
    2. Nvidia has failed in the automotive sector, and Tesla has replaced its AI cards with its own custom solutions.
    3. The 2000 series has been a massive failure with consumers, and it seems that a new (3000?) series will be forthcoming in 2019, since the 2000 series simply doesn't perform well enough for what it aspires to do.
    4. As APUs get more and more prevalent, the discrete GPU card will become less and less viable, the same way sound cards did. CPUs and motherboard chipsets will simply push the GPU form factor aside.
    I love Coreteks videos. He's got the best analysis.
  12. I'm afraid of eBay, but I'm giving this serious consideration. My EE friend in Eastern Europe is advising the same thing. Thx
  13. He says the RTX 2070 is comparable to the GTX 1080, but look at the Canadian prices and comparative performance. The 2070 is almost 12% faster and 36% cheaper. And note, once again the 2080 underperforms; it beats the 2070 by only ONE SECOND, for all that extra money you have to pay. And look at the price-performance of the 1070 Ti: it's 6% more expensive and 16% slower than the 2070.
  14. Echoechonoisenoise... this is interesting... https://echoechonoisenoise.wordpress.com/ “after internet, abuse could become a default approach to technology. to technological means of representation in particular. ubiquitous amateur extruder machines may be regarded as dull interfaces between virtual junk and plastic debris. on the other hand they offer cheap and easy access to direct material manipulation and can therefore be turned into monstrous yet valid bypass between desire of saturation and limited bandwidth of digital modeling, delivering multiscale depth beyond the flat screen.” or simply: it’s an established method of any avant-garde to appreciate media for what they can become.
  15. I discovered it in this excellent video by Gleb Alexandrov at the 17 min mark; http://structuresynth.sourceforge.net/
  16. That's interesting. I've had friends tell me that in person, but I'm shying away from eBay for two reasons: the card could have been run hard, and my friends and I are all sick of the bidding trickery that goes down there. This vid just went up; it's the latest and most reliable word on CES and the NAVI announcement. The NAVI info starts around the 15:30 mark.
  17. So it's looking like there won't be any NAVI till late 2019 to save us from Nvidia; the AMD release at CES in January seems to be oriented towards a 7nm Vega 2. The GPU is the last of the problems I need to solve in buying my new rig, and it's a thorny one. What is the optimal price-to-performance config right now?
    If you can put aside the 1-3% of RTX 2070s, 2080s, and 2080 Tis that are turning into bricks and just look at price-performance, I just discovered something interesting. Running a V-Ray render test, it appears that even though there's no NVLink for the 2070, running two 2070s gets you far better price-performance than a solo RTX 2080 Ti (the 2080 Tis are running around $1,800-$1,900 Cdn). So $1,300 Cdn for two $650 RTX 2070s gets you a 33-second V-Ray benchmark render; $1,300 is also the price of a single GTX 1080 Ti in Canada. And $1,900 Cdn for one RTX 2080 Ti gets you a 52-second V-Ray benchmark render.
    Furthermore, if you match a single 2070 against a single 2080, or dual 2070s against dual 2080s, the 2070 WINS!! And this despite the 2080 being $300-$400 more in Canadian money. How bizarre is that? If anybody has better info on this than me, please tell me how I'm wrong. Check it out:
    https://www.pugetsystems.com/labs/articles/V-Ray-NVIDIA-GeForce-RTX-2070-2080-2080-Ti-GPU-Rendering-Performance-1242/
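    To make that comparison concrete, here's a quick back-of-the-envelope script. It's just a sketch using the CAD prices and V-Ray benchmark times I quoted above; prices move around, so swap in your own numbers:
    ```python
    # Rough price/performance check for the numbers quoted above:
    # two RTX 2070s at $1,300 CAD rendering the V-Ray benchmark in 33 s,
    # versus one RTX 2080 Ti at $1,900 CAD rendering it in 52 s.
    configs = [
        ("2x RTX 2070",    1300, 33),  # (name, price in CAD, benchmark time in s)
        ("1x RTX 2080 Ti", 1900, 52),
    ]

    _, base_price, base_time = configs[-1]  # use the 2080 Ti as the baseline

    for name, price, secs in configs:
        speedup = base_time / secs     # how much faster than the baseline
        rel_cost = price / base_price  # how much more/less it costs
        print(f"{name}: {secs} s at ${price} CAD -> "
              f"{speedup:.2f}x the speed for {rel_cost:.2f}x the cost "
              f"({speedup / rel_cost:.2f}x the price/performance)")
    ```
    By that arithmetic the dual-2070 setup is roughly 1.58x as fast for about 0.68x the money, i.e. around 2.3x the price/performance of the single 2080 Ti, which matches the Puget Systems result linked above.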
  18. The photograph — taken by the European Space Agency's Mars Express orbiter and released by the agency on Thursday — shows the Korolev Crater, a dish-shaped basin on the broad plain that surrounds the Martian north pole.
    > The impact crater is almost 51 miles wide and more than a mile deep. It holds roughly 530 cubic miles of perpetually frozen water ice, which is almost five times the volume of Lake Erie.
    > The photo was stitched together from five images captured by a high-resolution camera aboard the uncrewed orbiter, which has been circling the Red Planet for the past 15 years. Each of the five "strips" used to create the composite image was taken during a separate orbit.
    > "This particular crater is very close to the polar ice cap, and the inside of the crater is at a lower elevation and more shadowed, so it creates a cold trap where the ice is stable," Kirsten Siebach, a planetary geologist at Rice University in Houston, told NBC News MACH in an email.
    > Siebach said it was unusual to see ice-filled craters on Mars. But if the Martian landscape is notoriously dusty and barren, the pockmarked planet holds quite a bit of water. Just about all of it is frozen, although this year instruments aboard Mars Express revealed the existence of a large underground reservoir of liquid water near the planet's south pole.
    > "There used to be liquid water in rivers and lakes on Mars, but it largely either froze as the atmosphere dissipated or was lost to space about 3 billion years ago," Siebach said. "Ice still exists on Mars near the poles, and the Martian atmosphere has a tiny amount of water vapor."
    https://www.nbcnews.com/mach/science/ice-filled-crater-mars-looks-huge-alien-skating-rink-ncna950681
  19. Good question. Years ago here I was pretty obsessed with CUDA, but after much questioning around here the verdict was that while it was implemented, it wasn't being updated, and that Andrew didn't feel it was really that worthwhile for the performance it gave. Now I may be wrong on this, and the real experts here may come and give me a good slap for spreading misinformation, but that's my own recollection (from a conversation years ago). If NAVI comes out soon enough and really is the equivalent of an RTX 2070 for half the price, I am so there. Final rendering is best done by CPUs, period.