3DCoat Forums

L'Ancien Regime


Everything posted by L'Ancien Regime

  1. I was just rereading that CG Channel article. This is sort of amazing...to see the logic of Houdini's system end up being applied like this to plug-in toolkit development, so that tools become universally applicable across radically different apps. It’s provided as a Houdini Digital Asset, which means that it should be possible to use it in other DCC applications, including 3ds Max, Cinema 4D and Maya, Unity, via Houdini Engine.
    Great-looking hair. I'm impressed.
  2. http://www.cgchannel.com/2018/11/master-gpu-rendering-for-production-with-redshift/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+cgchannel%2FnHpU+(CG+Channel+-+Entertainment+Production+Art) Since its release in 2014, GPU-based production renderer Redshift has become one of the visual effects industry’s staple tools. Next week, visitors to Gnomon’s Hollywood campus will be able to discover why. In GPU Rendering: An Introduction to Redshift for Production, a free two-hour masterclass, former CoSA VFX CG supervisor David Stipinis will explore how Redshift can be used on personal and professional projects. The session, which can be viewed free on Twitch via the player embedded above, will introduce the renderer’s key features and explore specific workflows for increasing productivity and creativity. The event will close with an audience Q&A, enabling viewers to ask follow-up questions. Free to attend, but register online in advance. GPU Rendering: An Introduction to Redshift for Production takes place at Gnomon’s Hollywood campus from 7:30-9:30pm on Thursday 15 November 2018. Entry is free, but you’ll need to register online first. If you can’t make it to LA in person, the session will also be broadcast live on Gnomon’s Twitch channel. You don’t need to register to watch the stream, and can ask questions via Twitter using the hashtag #gnomon.
  3. I was thinking of keeping my next workstation computer offline anyway and doing all my internet communications with other non SMT rigs I have. If I can't buy a new rig to sculpt, 3d texture and render with on Cyber Monday I may go insane.
  4. I had this conversation with my EE friend in E. Europe last night on Skype, after I sent him this screenshot from the latest Linus video:
ME: PortSmash affects AMD too... it affects all parallel processors
EE in EU: AWWW S*****!!!
EE in EU: i guess we wont be buying anytime soon!
ME: Why??
ME: *******
ME: Its that bad eh?
EE in EU: ayup
ME: How is it being spread?
EE in EU: good question.
https://www.zdnet.com/article/intel-cpus-impacted-by-new-portsmash-side-channel-vulnerability/ Researchers say they notified Intel's security team last month, on October 1, but the company did not provide a patch until yesterday, the date on which researchers went public with their findings. An Intel spokesperson was not available for comment regarding the state of the PortSmash patching process before this article's publication. AMD CPUS LIKELY IMPACTED "We leave as future work exploring the capabilities of PortSmash on other architectures featuring SMT, especially on AMD Ryzen systems," the research team said in a version of their paper shared with ZDNet, but Brumley told us via email that he strongly suspects that AMD CPUs are also impacted. The work behind discovering PortSmash is also the first result of "SCARE: Side-Channel Aware Engineering," a five-year security research project funded by the European Research Council. "Security and SMT are mutually exclusive concepts," he added. "I hope our work encourages users to disable SMT in the BIOS or choose to spend their money on architectures not featuring SMT."
  5. No live stream I guess.. https://www.anandtech.com/show/13547/amd-next-horizon-live-blog-starts-9am-pt-5pm-utc Even the motherboard chipsets are going to be 7nm this year.
  6. My 2 cents: Samsung 970 EVO 2TB (2048 GB) M.2 SSD with sequential read/write speeds up to 3500/2900 MB/s - $590.00. Intel CPUs have security vulnerabilities galore. AMD doesn't. https://wccftech.com/side-channel-portsmash-hits-intel-cpus/ What is the best AMD CPU bang for the buck? https://www.guru3d.com/articles-pages/amd-ryzen-threadripper-2950x-review,32.html If the software supports 32 threads, there's nothing stopping this processor ripping threads and spitting out serious numbers. The base clock is nice and, if all threads are used, you'll see something like 3.8~3.9 GHz on all cores. With a few threads, that's a staggering 4.4 GHz, under the condition that you properly cool the processor. You can tweak all cores to 4.1~4.2 GHz of course, but remember that will increase power consumption severely and doesn't bring in heaps of extra performance. Only with many threads will you see a gain; with fewer threads, that XFR2-enabled 4.4 GHz on four bins brings the better advantage. https://www.amazon.com/AMD-Ryzen-Threadripper-Processor-YD295XA8AFWOF/dp/B07GFN6CVF $899.00. Now imagine doing a 4K unbiased render with 32 threads and lots of materials and objects in your scene. Imagine cooking up new textures in Substance Designer when you've got all those render buckets... I'll be going with a 1080Ti if there's no hope for the foreseeable future on AMD GPUs. https://www.amazon.com/GeForce-Advanced-Gaming-Graphics-ROG-STRIX-GTX1070TI-A8G-GAMING/dp/B076RCTXGX/ref=pd_day0_hl_147_1?_encoding=UTF8&pd_rd_i=B076RCTXGX&pd_rd_r=4e2ec403-e179-11e8-aa9e-d7a355075e98&pd_rd_w=wflHt&pd_rd_wg=VEZ7P&pf_rd_i=desktop-dp-sims&pf_rd_m=ATVPDKIKX0DER&pf_rd_p=ad07871c-e646-4161-82c7-5ed0d4c85b07&pf_rd_r=2ATSS96392JHBV4R2QXX&pf_rd_s=desktop-dp-sims&pf_rd_t=40701&psc=1&refRID=2ATSS96392JHBV4R2QXX I can see getting two of these in SLI.
  7. https://wccftech.com/amd-zen-2-ryzen-epyc-cpus-higher-than-expected-ipc-clocks/ AMD Zen 2 Frequency and IPC Higher Than Expected On Early Engineering Samples, Possible Preview Tomorrow at Next Horizon Event Tomorrow, AMD is going to talk a lot about their 7nm products which includes both CPUs and GPUs. While GPU talk would be mostly centered around Vega 20 with a possible look at AMD’s NAVI GPU which is aimed for launch sometime in 2019, the CPU talk would go around the AMD ZEN 2 CPU architecture. The event will start at 12 AM (Eastern Time) on 6th November, 2018.
  8. AMD Investor Relations will host a "Next Horizon" event on November 6th, and although there is no confirmation on what products will be announced there, the title alone makes us think about AMD's Zen 2 architecture. The company has just explained that on that day their executives will "discuss the innovation of AMD products and technology, specifically designed with the datacenter on industry-leading 7 nm process technology". AMD announced Ryzen and quite a lot of details about the Zen's processors on their last "Horizon" event, so it seems plausible that the incoming event will be perfect to talk about its next-gen architecture. That focus on the 7 nm process technology will probably make AMD talk about their new Vega graphics, but it seems end users will have to wait, as datacenters come first. https://community.amd.com/community/amd-corporate/blog/2018/08/27/expanding-our-high-performance-leadership-with-focused-7nm-development AMD’s next major milestone is the introduction of our upcoming 7nm product portfolio, including the initial products with our second generation “Zen 2” CPU core and our new “Navi” GPU architecture. We have already taped out multiple 7nm products at TSMC, including our first 7nm GPU planned to launch later this year and our first 7nm server CPU that we plan to launch in 2019. Our work with TSMC on their 7nm node has gone very well and we have seen excellent results from early silicon. To streamline our development and align our investments closely with each of our foundry partner’s investments, today we are announcing we intend to focus the breadth of our 7nm product portfolio on TSMC’s industry-leading 7nm process.
  9. I'm hoping AMD is going to take over Linux and save it from the PC COC cabal too.
  10. Come Cyber Monday this month I'll be buying the 1080ti for my new rig...AMD GPUs will come out in January. Will AMD steal a march on Nvidia and come out with a 7nm GPU that blows the door off Nvidia the way they did with Intel CPUs? I can dream, can't I? AMD’s 7nm Vega is a Monster – 1.25x Turing’s Compute at Half The Size Whilst the company hasn’t disclosed detailed specifications relating to the new GPU we could reasonably expect around one terabyte/s of memory bandwidth, higher clock speeds and significantly better power efficiency thanks to TSMC’s leading-edge 7nm process technology, which has reportedly enabled the company to extract an unbelievable 20.9 TFLOPS of graphics compute out of 7nm Vega, according to one source. If true, it would make it the world’s first 20 TFLOPS GPU. That figure is simply unheard of especially out of a chip of this size, which Anandtech estimates to be roughly 336mm² large. To put this into perspective, if accurate, the figure would put it 25% ahead of NVIDIA’s most powerful GPU yet, the 754mm² monster Turing GPU. All the while Vega 20 allegedly achieves this with half the silicon real-estate. https://wccftech.com/amd-confirms-new-7nm-radeon-graphics-cards-launching-in-2018/ And it's probably going to be a much lower price than NVidia too...
  11. Wow the story has legs... https://wccftech.com/geforce-rtx-2080tis-are-dying-and-there-are-different-rtx-2070-chips/ NVIDIA and the GeForce RTX 2080ti and the rest of the RTX family can’t seem to stay out of the headlines these days. But, all press is good press right? Maybe not always in this case. The RTX 2080ti is apparently having some issues staying alive, and there are multiple variants of the RTX 2070 floating about that offer up very different performance levels under the same branding name. RTX 2080ti Failures Reports sprang up just before Halloween on the GeForce forums indicating a rather high number of cards dying for a new launch. Defects and DOAs aren’t exactly uncommon and do happen at launch, and with a GPU with such a massive die it’s quite possible for some to slip through QA and make it out into the wild, only to be found by the unfortunate person it was shipped to. But this time, based on reporting so far, the numbers appear to be quite a bit higher than normal. Visiting the GeForce forums will reveal quite a few threads regarding dead or dying RTX 2080ti’s, or tales of RMAs gone wrong. The most common occurrence seems to be artifacts appearing on screen, followed by a crash to black from which the card never recovers. Artifacts like those shown in the threads are typically a result of unstable memory overclocks, or just bad memory, which has caused quite a bit of finger pointing towards the GDDR6 memory used on the RTX 2080ti. But that’s kind of a hard sell to some, since the issues appear to be pretty isolated to the RTX 2080ti Founders Edition and not the other AIB variants and cards that also carry GDDR6. I personally have a friend whose card died on him just this morning after being hit with massive artifacts even on the desktop.
I’ve actually spent the past several days trying to confirm whether ours was dying or not. Thankfully our test card is okay, since the problem turned out to be nothing more than a bad M.2 drive; odd that it would function fine with other cards, but the test bench is rock solid again. GeForce RTX 2080ti’s Are Dying And There Are Different RTX 2070 Chips.
  12. https://www.techradar.com/news/nvidias-flagship-rtx-2080-ti-graphics-cards-are-failing-more-than-they-should The most powerful consumer GPUs to hit the market are experiencing some teething problems at the moment, with a number of early users reporting their Nvidia GeForce RTX 2080 Ti graphics cards are failing in a variety of ways. Among the issues being reported are day-one failures, according to Digital Trends, with instabilities and artifacts being found immediately after installing the card, while others are seeing their GPUs die after a few days of nominal use. Across both the Nvidia forums and relevant Reddit threads, an alarmingly high number of users have told the tale of their cards dying – more often than not, it's the Founders Edition of the GPU and, in some cases, even their replacement cards from the manufacturer have gone the same way.
  13. AVG? It's not 2003 anymore you know. Try getting ESET NOD32.
  14. Now some of us are artists and some of us are programmers. But then some are artists and programmers. That is the ideal. I've tried to learn to program many times. I've tried to learn C#, C (omg it makes me vomit), I've tried to learn Python at least 5 times, and I've tried to learn C++ I don't know how many times. Each time I felt that I was being taught commands but not the structure of programming itself. One video tutorial series, by a Stanford prof teaching C++, said midway through a long tutorial that she assumed you already had a firm grasp of Java. Hmmm. I kept this in mind and forgot about it. Then I was over at Entagma, and those maniacs are working in VEX and VOPs, the native language of Houdini: Vector Expressions and VEX Operators. Once again I was following tutorials without grasping the fundamental structure of what was going on. Frankly I was lost and wasting my time. But then they mentioned that you should get your introduction to programming with Java again, in particular through a course taught by a zany CS prof at NYU's Tisch School of the Arts. He uses a Java-based environment with its own special libraries built for 2D and 3D graphics. It's called Processing. There are two books, and of course other books for Processing as well. It's perfect for 2D and 3D graphic artists like us; all the examples and all the problems are very cool ones involving fundamental graphic ideas, like generating noise in 2D using elements as simple as a bunch of rectangles filled with a random black-and-white level between 0 and 255. In a word, you won't be learning how to program your computer to say HELLO WORLD. It's hard work, but I've just finished Ch 8 and I'm having fun, and what is more important, it's making sense where none of the others did before.
https://processing.org/tutorials/ https://www.youtube.com/channel/UCvjgXvBlbQiydffZU7m1_aw Of course you might ask, "why bother learning to program at all when there are all these programs written for us?" Well, for one thing there are a lot of data-management tasks that come into play on complex jobs like creating and using hair - for example, managing databases for all the data that these intensive tasks require. And there's a lot of procedural stuff that you might want to do on a custom level, like writing your own fractals from the vast and ever-growing number of fractal algorithms found in books of pure mathematics; this can be very fruitful, leading to some genuinely original-looking work. And I suspect if you can program as well as create art, you're more likely to get hired. And so on...
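The rectangle-noise exercise described above (a grid of rectangles, each filled with a random gray level from 0 to 255) can be sketched in a few lines. The book's examples use Processing's Java dialect; this is just an illustrative Python sketch of the same idea, with the function names my own, writing the result as a plain-text PGM image you can open in most viewers.

```python
import random

def rectangle_noise(cols, rows, seed=None):
    """Return a rows x cols grid of random grayscale levels (0-255),
    one value per 'rectangle' -- the 2D noise exercise described above."""
    rng = random.Random(seed)
    return [[rng.randint(0, 255) for _ in range(cols)] for _ in range(rows)]

def to_pgm(grid):
    """Serialize the grid as a plain-text PGM (P2) image for quick viewing."""
    rows, cols = len(grid), len(grid[0])
    lines = ["P2", f"{cols} {rows}", "255"]
    lines += [" ".join(str(v) for v in row) for row in grid]
    return "\n".join(lines)

grid = rectangle_noise(8, 8, seed=42)
print(to_pgm(grid))
```

In Processing itself the same loop would call rect() with a random fill() per cell; the payoff of the exercise is seeing how a trivial random grid becomes the building block for smoother noise functions later in the book.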
  15. https://community.foundry.com/discuss/topic/133055/mesh-fusion-and-moi3d
  16. http://www.cgchannel.com/2018/10/michael-gibson-unveils-moi-4/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+cgchannel%2FnHpU+(CG+Channel+-+Entertainment+Production+Art) Developer Michael Gibson has unveiled MoI 4 (Moment of Inspiration 4), the first update to his intuitive, lightweight NURBS and Boolean modelling software in three years. The release, which is currently in closed beta for users of MoI 3, makes the software a 64-bit application, improving performance on complex assets. The update also makes MoI a native Mac application, rather than relying on porting utility Wineskin, as in previous builds. And that’s essentially all the information so far: to judge from the announcement thread on the MoI forums, development work is still at a fairly early stage. Updated 22 October 2018: Michael Gibson has released a new beta build of MoI 4 adding the option to convert Sub-D surfaces to NURBS. In most cases, the conversion should generate a NURBS surface with G2 continuity or greater: there’s an interesting comparison with tools like 3ds Max and Modo in this thread on the MoI forum. The conversion is currently performed automatically when importing OBJ files, but Gibson says that he aims to make it possible to apply it to any Sub-D surface.
  17. That's the problem with 3dsMax; great plug ins like Hair Farm, but who wants to get into Max itself at this point?
  18. https://www.madewithmischief.com/ https://community.wacom.com/inspiration/blog/2015/january/50-trillion-to-one-the-magic-behind-mischief The unique shape representation made possible by ADFs manages to be neither pixels nor vectors. It takes the best of both worlds to yield a drawing program that’s fast, high quality, and infinitely scalable with no loss of resolution. “You can take any part of this infinite canvas and expand it to any size and any resolution – you can take your doodle and expand it to the size of a billboard,” Frisken said. “There’s a 50 trillion-to-one zoom factor, which is like sitting on the moon and looking at a single flower on Earth and then drawing on a petal of that flower.” Made with Mischief is a digital pen-based drawing system designed for both Apple and Windows computers, combining two of the most commonly used design and drawing processes. Users can design using colored pixels, like Photoshop, while it also has Illustrator-like tools that let users draw what Frisken called “mathematical curves.” The outcome is the most high-tech computer drawing process I’ve seen, allowing users the fluidity of drawing by hand, and the infinite possibilities of computer design, without the limitations of either Illustrator or Photoshop. Mischief is powered by a revolutionary patented shape representation, known as Adaptively Sampled Distance Fields (ADFs), co-invented by Frisken. ADFs have several advantages for creative applications: they provide high-quality stroke rendering; they are amenable to hardware-based rendering so drawing is extremely responsive; they are very compact, resulting in small file sizes; they can be scaled without introducing pixelation artifacts; and they can accurately represent much richer and more complex shapes than traditional vector-based stroke representations.
For Frisken, the acquisition of Made With Mischief by The Foundry enables her to retain her core vision of providing high-quality software tools for a wide range of artists and to preserve an accessible price point, while bringing future versions of the platform to an even broader audience. ADF https://www.fxguide.com/featured/whats-the-foundry-buying-the-tech-of-adf/ ADF actually came from research into medical imaging for applications such as anatomy education, surgical simulation and computer-assisted surgery. "1998 was the first paper that represented shape with distance," Dr Frisken explains. The team needed a way to represent three-dimensional medical examples, such as the way a knee worked, as part of a coordinated, multi-institutional, multi-disciplinary project to simulate arthroscopic knee surgery. Frisken served as project leader, algorithm designer, software system designer and implementor on the project. "We couldn't represent a knee as just a mathematical equation with some distance, you needed to sample the space within which that knee sat, and the distance in that space. If you just sample at a very high rate you get a huge volume of data, and that makes it slow to process and render, so we started looking at ways to reduce the number of samples you needed and we came upon using an adaptive sampling of the space." The 'adaptively' part of ADF comes directly from that initial knee problem, and it has since been implemented in a host of ways, from simple octrees to more complex and powerful data structures. The first research was done by Frisken and Ron Perry, who continues today to work on ADFs at MERL, but who also consults to The Foundry. He was key to establishing Mischief and coding the original product. Perry is very much a key author and developer of the technology, but he also has other important research interests at MERL, where he is a senior researcher. ADF vs Voxels Back to 3D and the roots of the technology.
You might be wondering about the differences between traditional voxel representations and ADF distance fields in 3D. It is a complex area of 3D maths, because a voxel representation can be thought of as a partitioning of space into cubes, and one can store anything in each cube (including, theoretically, distance). But if one accepts that most voxel implementations are not storing distance, then the comparison is a little easier. Traditional voxel representations use regularly sampled volumes and store either binary values, such as inside or outside, or density values (a density of 1 inside the shape, a density of zero outside the shape, and an average density for voxels that contain the surface). Voxels have many uses - sometimes the value is assumed to be the average of what is in the cube/box/voxel, sometimes it is assumed the value is at one corner of a 3D grid. Significantly, if one uses a discrete sample approach, then you really only get binary samples for shapes that have hard surfaces (e.g. 1 inside the shape, 0 outside), and you have to interpolate the density inside the cube when you need to locate the surface of the shape for rendering or other processing. There are several problems with this approach. First, traditional voxel models were regularly sampled, so you needed a lot of samples to represent shapes that have complex and compound detail, even if that detail is only on the surface or limited to a very small part of the shape. If you are more than one voxel away from the surface, the density is just zero, so you have no way of knowing where the surface is. And, while one can reconstruct the surface position inside a voxel, you can't apply higher-order filtering (which is applied over larger regions of space) to get a smoother surface reconstruction. Thus, voxel models tend to have a fair amount of aliasing, or require a lot of samples... i.e. data.
Distance fields are defined throughout space and they vary continuously across smooth surfaces (unlike voxel density, which jumps from zero to one as soon as you cross into a shape). Thus, they can be sampled more sparsely and they can be reconstructed with less aliasing. Importantly, they can also be reconstructed relatively far away from the surface, as the reconstructed distance field gives you useful information such as what direction you should look to find the nearest surface and how far away it is (both useful for ray tracing or for estimating forces for interpenetrating objects). This is why Frisken initially thought of using ADFs for collision detection over a decade ago. Also key for any good renderer: when a sample is on the surface, the direction of the distance field is the same as the surface normal. In the sculpting R&D test application above, Tomas Pettersson used a vector distance field for his sculpting system. This is an extension of ADFs: instead of just the signed distance, you sample the vector distance. The vector distance tells you both the direction to the closest surface and the distance from the surface. These are known as vector distance fields, and for the price of 3x as many values per sample, they allow The Foundry's team to reconstruct surface normals or direction vectors more accurately, and they allow artists to represent non-manifold surfaces such as points, lines, and infinitely thin sheets. http://www.merl.com/publications/docs/TR2000-15.pdf
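The two properties the article leans on - the field value is the distance to the nearest surface, and its gradient points toward/away from that surface (equal to the surface normal when sampled on the surface) - are easy to see with the simplest possible signed distance field. This is a minimal Python sketch for a sphere, not The Foundry's ADF implementation (ADFs add the adaptive octree sampling on top of exactly this kind of function); the function names are mine.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 0.0), radius=1.0):
    """Signed distance from point p to a sphere:
    negative inside, zero on the surface, positive outside."""
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx*dx + dy*dy + dz*dz) - radius

def sdf_gradient(sdf, p, h=1e-5):
    """Central-difference gradient of an SDF. On the surface this is the
    unit surface normal; elsewhere it is the direction to walk away from
    (or toward) the nearest surface -- the property ray marchers exploit."""
    g = []
    for i in range(3):
        hi = list(p); hi[i] += h
        lo = list(p); lo[i] -= h
        g.append((sdf(hi) - sdf(lo)) / (2 * h))
    return tuple(g)

# A sample 2 units from the centre of a unit sphere is 1 unit outside...
d = sphere_sdf((2.0, 0.0, 0.0))                 # -> 1.0
# ...and the gradient there points along +x, straight away from the surface.
n = sdf_gradient(sphere_sdf, (2.0, 0.0, 0.0))   # -> approx (1.0, 0.0, 0.0)
```

A regularly sampled binary voxel grid of the same sphere would give you nothing but zeros at that sample point; the distance field still tells you the surface is exactly one unit away and in which direction, which is the sparse-sampling advantage the article describes.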