3DCoat Forums

L'Ancien Regime

Everything posted by L'Ancien Regime

  1. I would go with the more recent one because of certain possibly important subtleties in the UEFI. (I'm not pretending to be an expert here; I'm just studying up on all these complexities myself right now in preparation for buying a 2950X and an X399 motherboard. I'm even tempted to buy a 2990WX, but there's a lot of sophisticated stuff with that high-end CPU that baffles even the experts, so I'm getting intimidated at the prospect of buying it and trying to make it work properly.) For example, the later version will probably have better XMP support, so that with a simple switch in the UEFI you can get the full rated speed of your RAM. Extreme Memory Profile (XMP) is a specification developed by Intel as an extension to the standard JEDEC SPD specifications. It's intended to make overclocking easier and more accessible to new users through predefined profiles that are known to be stable. It's only for memory; it will overclock the CPU only if you choose a CPU speed under the XMP settings. XMP is proprietary to Intel, so check closely whether your motherboard supports it for AMD too. It probably does if it's second gen like the X399 boards. You don't want to buy 3200MHz RAM and never be able to use all that extra speed you paid for.
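To put a rough number on what's at stake, here's a minimal sketch of the theoretical peak bandwidth difference between running RAM at a JEDEC fallback speed versus its rated XMP speed. The 2133 MT/s fallback and dual-channel setup are assumptions for illustration, not specs of any particular board.

```python
# Rough sketch: theoretical peak DDR4 bandwidth with and without an XMP
# profile enabled. Speeds are illustrative: many boards fall back to a
# JEDEC default (often 2133 MT/s) until XMP is switched on in the UEFI.
BYTES_PER_TRANSFER = 8      # DDR4 bus is 64 bits = 8 bytes per channel
CHANNELS = 2                # assume a typical dual-channel desktop setup

def peak_bandwidth_gb_s(mega_transfers_per_s: int) -> float:
    """Theoretical peak bandwidth in GB/s for a given transfer rate."""
    return mega_transfers_per_s * BYTES_PER_TRANSFER * CHANNELS / 1000

jedec_default = peak_bandwidth_gb_s(2133)   # XMP off
xmp_profile = peak_bandwidth_gb_s(3200)     # XMP on, at the rated speed

print(f"XMP off: {jedec_default:.1f} GB/s, XMP on: {xmp_profile:.1f} GB/s")
print(f"Paid-for bandwidth left unused: {xmp_profile - jedec_default:.1f} GB/s")
```

This is only the theoretical peak; real-world gains depend on timings and workload, but it shows why leaving XMP disabled wastes money.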
  2. So the plot gets even more interesting: Nvidia is in a lot of trouble as its business model becomes less and less viable, despite the propaganda. 1. Far from being a viable choice for big data centers, Nvidia is being pushed out by superior custom solutions, particularly by Google. 2. Nvidia has failed in the automotive sector, and Tesla has replaced its AI cards with a custom solution of its own. 3. The 2000 series has been a massive failure with consumers, and it seems a new (3000?) series will be forthcoming in 2019, since the 2000 series simply doesn't perform well enough for what it aspires to do. 4. As APUs get more and more prevalent, the discrete GPU will become less and less viable, the same way sound cards did; CPUs and motherboard chipsets will simply push the GPU form factor aside. I love Coreteks videos. He's got the best analysis.
  3. I'm afraid of eBay, but I'm giving this serious consideration. My EE friend in Eastern Europe is advising the same thing. Thx
  4. He says the RTX is comparable to the GTX 1080, but look at the Canadian prices and comparative performance. The 2070 is almost 12% faster and 36% cheaper. And note, once again the 2080 underperforms; it only beats the 2070 by ONE SECOND, for all that extra money you have to pay. And look at the price/performance on the 1070 Ti: it's 6% more expensive and 16% slower than the 2070.
  5. Echoechonoisenoise... this is interesting: https://echoechonoisenoise.wordpress.com/ “after internet, abuse could become a default approach to technology. to technological means of representation in particular. ubiquitous amateur extruder machines may be regarded as dull interfaces between virtual junk and plastic debris. on the other hand they offer cheap and easy access to direct material manipulation and can therefore be turned into monstrous yet valid bypass between desire of saturation and limited bandwidth of digital modeling, delivering multiscale depth beyond the flat screen.” or simply: it’s an established method of any avant-garde to appreciate media for what they can become.
  6. I discovered it in this excellent video by Gleb Alexandrov at the 17 min mark; http://structuresynth.sourceforge.net/
  7. That's interesting. I've had friends tell me that in person, but I'm shying away from eBay for two reasons: the card could have been run hard, and me and all my friends are sick of the bidding trickery that goes down there. This vid just went up; it's the latest and most reliable word on CES and the Navi announcement. The Navi info starts around the 15:30 mark.
  8. So it's looking like there won't be any Navi until late 2019 to save us from Nvidia. The AMD release at CES in January seems to be oriented towards a 7nm Vega 2. The GPU is the last of the problems I need to solve in buying my new rig, and it's a thorny one. What is the optimal price-to-performance config right now? If you can put aside the 1-3% of RTX 2070s, 2080s, and 2080 Tis that are turning into bricks and just look at price/performance, I just discovered something interesting. Running a V-Ray render test, it appears that even though there's no NVLink for 2070s, running two 2070s gets you far better price/performance than a solo RTX 2080 Ti (the 2080 Tis are running around $1800-$1900 Cdn). So $1300 Cdn for two $650 RTX 2070s gets you a 33-second V-Ray benchmark render; $1300 is also the price for a single GTX 1080 Ti in Canada. And $1900 Cdn for one RTX 2080 Ti gets you a 52-second V-Ray benchmark render. Furthermore, if you match a single 2070 against a single 2080, or dual 2070s against dual 2080s, the 2070 WINS!! And this despite the 2080 being $300-$400 more in Canadian money. How bizarre is that? If anybody has better info on this than me, please tell me how I'm wrong. Check it out: https://www.pugetsystems.com/labs/articles/V-Ray-NVIDIA-GeForce-RTX-2070-2080-2080-Ti-GPU-Rendering-Performance-1242/
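The price/performance math above can be sketched in a few lines, using only the figures quoted in the post (two RTX 2070s at $1300 CAD total finishing the V-Ray benchmark in 33 s, versus one RTX 2080 Ti at $1900 CAD taking 52 s). The "dollars per unit of speed" metric is just one reasonable way to score it.

```python
# Sketch of the price/performance comparison, using the figures from the
# post above. Lower "CAD per benchmark-render/s" means better value.
configs = {
    "2x RTX 2070":    {"price_cad": 1300, "seconds": 33},
    "1x RTX 2080 Ti": {"price_cad": 1900, "seconds": 52},
}

for name, c in configs.items():
    speed = 1 / c["seconds"]                 # benchmark renders per second
    cad_per_speed = c["price_cad"] / speed   # price of one unit of speed
    print(f"{name}: ${cad_per_speed:,.0f} CAD per benchmark-render/s")
```

By this metric the dual-2070 setup comes out at less than half the cost per unit of render speed of the single 2080 Ti, which is the "how bizarre is that?" result in the post.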
  9. The photograph — taken by the European Space Agency's Mars Express orbiter and released by the agency on Thursday — shows the Korolev Crater, a dish-shaped basin on the broad plain that surrounds the Martian north pole.
> The impact crater is almost 51 miles wide and more than a mile deep. It holds roughly 530 cubic miles of perpetually frozen water ice, which is almost five times the volume of Lake Erie.
> The photo was stitched together from five images captured by a high-resolution camera aboard the uncrewed orbiter, which has been circling the Red Planet for the past 15 years. Each of the five "strips" used to create the composite image was taken during a separate orbit.
> "This particular crater is very close to the polar ice cap, and the inside of the crater is at a lower elevation and more shadowed, so it creates a cold trap where the ice is stable," Kirsten Siebach, a planetary geologist at Rice University in Houston, told NBC News MACH in an email.
> Siebach said it was unusual to see ice-filled craters on Mars. But if the Martian landscape is notoriously dusty and barren, the pockmarked planet holds quite a bit of water. Just about all of it is frozen, although this year instruments aboard Mars Express revealed the existence of a large underground reservoir of liquid water near the planet's south pole.
> "There used to be liquid water in rivers and lakes on Mars, but it largely either froze as the atmosphere dissipated or was lost to space about 3 billion years ago," Siebach said. "Ice still exists on Mars near the poles, and the Martian atmosphere has a tiny amount of water vapor."
https://www.nbcnews.com/mach/science/ice-filled-crater-mars-looks-huge-alien-skating-rink-ncna950681
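As a quick sanity check on the article's "almost five times Lake Erie" comparison: Lake Erie's volume of roughly 116 cubic miles is an outside figure, not from the article, so treat it as an assumption.

```python
# Quick check of the article's comparison. The 530 cubic miles figure is
# from the article; Lake Erie's ~116 cubic miles is an outside estimate.
korolev_ice_cu_mi = 530
lake_erie_cu_mi = 116

ratio = korolev_ice_cu_mi / lake_erie_cu_mi
print(f"Korolev holds about {ratio:.1f}x the volume of Lake Erie")
```

That works out to roughly 4.6x, which matches the article's "almost five times".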
  10. Good question. Years ago here I was pretty obsessed with CUDA, but after much questioning around here the verdict was that while it was implemented, it wasn't being updated, and that Andrew didn't feel it was really worthwhile for the performance it gave. Now I may be wrong on this, and the real experts here may come and give me a good slap for spreading misinformation, but that's my own recollection from a conversation years ago. If Navi comes out soon enough and really is the equivalent of an RTX 2070 for half the price, I am so there. Final rendering is best done by CPUs, period.
  11. Thanks. Good to hear that your new rig is up and running. I'm without a viable modeling/texturing computer right now, and I'm just waiting on the release of those AMD Radeon 3080s... It totally sucks not being able to work in 3D Coat etc., as you can well imagine.
  12. The thing that's blowing my mind isn't just that you'll be able to get 16 cores/32 threads at 5.1GHz, but that the prices are so damn cheap. Intel would gouge your wallet brutally if they could offer those specs on a CPU without competition. With the rapid fall in SSD prices and cheap, high-quality X399 motherboards, the only fly in the ointment is this annoying GPU problem holding things up. Buy a cheap, hot, noisy Radeon with crappy drivers, or pay an arm and a leg for a new Nvidia GPU that's going to turn into a brick on you the day after the warranty elapses?
By the way, I've a related question for all you Radeon owners out there. As someone who really doesn't care that much about gaming (I do game a bit, but I find it sort of spiritually empty), I'm more concerned about how these cards work out for interactive vs. final production renders. I've heard AMD cards are actually better than Nvidia cards in this department. Is this the way to go? I mean, 16 GB of HBM VRAM... https://www.amazon.com/AMD-Radeon-Frontier-Liquid-Retail/dp/B072XLR2K7
"Fantastic card! Using it for 3D modeling and rendering in Maya, Max and other CAD software. Performance is astounding! Big step up from the Quadro 4000. Handles hundreds of thousands of objects with tens of millions of polys without a sweat in Maya. Great features for Pro users like 10 bit color, 4 monitor ports, 16GB VRAM, Pro app settings via the pro drivers etc..."
https://blog.maxwellrender.com/news/maxwell-4-gpu-frequently-asked-questions/ This is a pretty good article on all the considerations for GPU vs CPU rendering. Is it even worth worrying about a provisional render on a GPU if you've got a 32- or 64-thread CPU that can handle ray tracing, SSS, and caustics more comprehensively and faster?
  13. I assume most modeling tools only use one or two threads at most, like most games. Multiple cores are for rendering and for cooking up new textures. You're going to have a blast with Substance Designer and whatever render engine you use, with all those threads and render buckets.
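The render-bucket idea can be sketched in a few lines: the image splits into independent tiles, and each tile goes to a separate worker process. The shade() function below is a toy placeholder for real per-pixel work, not any actual renderer's API; the point is only that buckets have no dependencies on each other, which is why rendering scales with core count while interactive modeling mostly stays on one thread.

```python
# Minimal sketch of "render buckets" spread across CPU cores. shade() is
# a stand-in for real per-pixel work (ray tracing, texture sampling, ...).
from multiprocessing import Pool

WIDTH, HEIGHT, BUCKET = 256, 256, 64

def shade(x, y):
    # Toy placeholder: deterministic fake "pixel value" in 0..255.
    return (x * 31 + y * 17) % 256

def render_bucket(origin):
    # Each bucket is an independent tile, so workers never block each other.
    ox, oy = origin
    return [(ox + dx, oy + dy, shade(ox + dx, oy + dy))
            for dy in range(BUCKET) for dx in range(BUCKET)]

if __name__ == "__main__":
    buckets = [(x, y) for y in range(0, HEIGHT, BUCKET)
                      for x in range(0, WIDTH, BUCKET)]
    with Pool() as pool:                 # one worker per core by default
        tiles = pool.map(render_bucket, buckets)
    print(f"rendered {len(tiles)} buckets of {BUCKET}x{BUCKET} pixels")
```

With a 16- or 32-thread CPU, Pool() simply hands more buckets out at once; the per-bucket code never changes.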
  14. Everything about buying a new computer is dirt cheap right now... EXCEPT the GPU, and that's turning out to be insane. 1080 Tis are like $1600 Canadian new. I hope they solve this problem fast so I can get a decent 2070 at a reasonable price. Or maybe just go buy a Titan V (Volta) for $5400 Canadian???
  15. http://www.cgchannel.com/2018/11/alexey-vanzhula-makes-flux-1-0-available-free/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+cgchannel%2FnHpU+(CG+Channel+-+Entertainment+Production+Art)
Link to free download: https://gumroad.com/alexeyvanzhula
Tools developer Alexey Vanzhula has made Flux 1.0, his real-time Boolean and retopology tool, formerly sold for $100, available to download for free. The toolset was developed for Houdini, but is provided as a Houdini Digital Asset (HDA), which should make it possible to use in other DCC software via Houdini Engine.
Like Modo’s MeshFusion in Houdini
On its release last year, Vanzhula described Flux as being “like MeshFusion in Modo”, in reference to Modo’s own real-time Boolean modelling plugin, since incorporated into the core application. The software is designed to enable users to create complex Boolean assemblies non-destructively, seeing the results of any changes made to the model hierarchy updated in real time.
Flux has since been superseded by the Soft Booleans system in Direct Modeling, Vanzhula’s newest product, which he describes in his forum post announcing the release as creating better topology. However, unlike Direct Modeling, Flux features a retopology toolset: its retopology widget enables any Houdini modelling tool to be used with the software’s TopoBuild SOP. There are also ‘simple tools’ for remeshing volumes in VDB format and baking geometry.
Availability and system requirements
Flux is available free for Houdini 16.0 and 16.5 only. It should also work in other DCC applications, including Maya and Cinema 4D, via Houdini Engine. If you like Flux and want to support development of future tools, you can support Vanzhula on Patreon.
  16. Here's a Cyber Monday deal: https://www.amazon.de/AMD-506048-Radeon-GDDR5-Grafikkarte-Grafikkarten-GDDR5-Speicher/dp/B072HTFZM4/ref=pd_sbs_147_8?_encoding=UTF8&pd_rd_i=B072HTFZM4&pd_rd_r=74fd97f1-efd1-11e8-8f7c-734f98c58a09&pd_rd_w=XapA4&pd_rd_wg=qoyfR&pf_rd_i=desktop-dp-sims&pf_rd_m=A3JWKAKR8XB7XF&pf_rd_p=51bcaa00-4765-4e8f-a690-5db3c9ed1b31&pf_rd_r=0KF4GHNEC5CTHKK8QQ6H&pf_rd_s=desktop-dp-sims&pf_rd_t=40701&psc=1&refRID=0KF4GHNEC5CTHKK8QQ6H
One review from that page: "Really, really wanted to like this card, but... The PROs: Cosmetically, it's a handsome-looking card in a stately, blockish, and rather plain-looking blue and black metal casing. I prefer its looks to all of the nonsensical glowing and flashing LED frenzy provided by so many other graphics cards; I don't need the name of a graphics card lit up while it sits inside a computer case where I'm not going to be looking at it. Feature-wise, the combination of two GPUs, each with its own 16GB of video memory for a total of 32GB, just screams 'value' at you, especially given the $900 to $1000 price.
The CONs: This card all too easily becomes insanely hot!!! Just give it enough to do, especially something involving both GPUs, and within a mere minute or two the all-metal outer casing will become so hot that you actually cannot touch it. In fact, I found this thing became so hot that I considered whether I should put it by itself in one of those external GPU boxes linked by Thunderbolt 3, because I was afraid of it cooking chips underneath it on the motherboard and cooking other cards in neighboring slots. I decided I wanted to at least quantify this issue by getting a temperature reading from it, but for some reason AMD's 'Radeon Pro' driver software does not seem to include any way to report the card's temperature. I poked all around the menus looking for something that would tell me about the condition of the card, but there was nothing. WHY IS THAT? I really, really wanted to like this card, but I have to say DO NOT BUY, at least not until something is done about this problem! I thought we were supposed to be enjoying the benefits of progress!"
And another: "Tons of memory and decent clocks for a professional card. However, the secondary GPU on mine refused to run anywhere close to the clock of the primary, maxing out at 400MHz for some reason. Not sure why, and I received zero response from AMD on the matter."
And the prices on the 1080 Tis are out of this world. I guess everyone has figured out that the 2000 series is a bust or something, so they can still command insane prices on the 1080 Tis. https://www.amazon.ca/s/ref=nb_sb_noss_1?url=search-alias%3Daps&field-keywords=1080ti And no hope on the horizon from 7nm Navi?