3DCoat Forums

Everything posted by Zeddicus

  1. I came across this comparison video which illustrates quite well the different methods for creating sharp edges: Insofar as poly counts are concerned, there really is a big difference between using smoothing groups and adding extra edges all over the place. And that's just one simple mesh, not at all typical of a real scene (like in a game using the Unreal Engine).
  2. Yeah, but it's still worth discussing from time to time. I like your idea, but wouldn't it only work if you UV unwrapped each mesh based on its original smoothing group configuration in Max? That would be both challenging and time consuming, probably just as time consuming as re-doing them all from scratch once you're back in whichever program supports them. Speaking of which, the number of such programs is growing. 3ds Max and the Unreal game engine have been mentioned. Another one is Modo. Oddly, Mudbox doesn't support them in spite of being an Autodesk app. Most apps, like Maya and Blender for example, do support something akin to smoothing groups; they're pretty basic in functionality, generally just setting edges to hard/creased versus soft/uncreased, and thus aren't quite as advanced as what Modo and Max offer. Hopefully this will change over time.
I find smoothing groups particularly useful when using the Turbosmooth modifier in Max (for rendering purposes), which has an option to separate by smoothing group. This is superior to adding a bunch of edge loops, which can crank the poly count way up. It actually makes sense for a game engine like Unreal to support them when you think about it, as it can help give the impression that a low poly model is high res when it's actually not. Normal maps are another such trick, something that was also unsupported by most apps at one time and also had to be fought for (ubiquitous PTEX support currently being another). These simple rendering tricks are less computationally intensive than adding a lot of unnecessary edges, which add up quickly depending on the mesh, especially when you have a lot of such assets in a scene. I won't be surprised at all when we eventually see more game engines adopt smoothing groups for this very reason. The side benefit of keeping each individual mesh as simple as possible is also useful because, once you start adding lots of edge loops to a mesh, it becomes unwieldy, especially when last minute edits are required.
3D Coat does support smoothing groups, though it's been a challenge to get it to do so properly (see link below). 3D Coat even exports them properly to the OBJ format, and they are seen by 3ds Max just fine upon import. Unfortunately 3D Coat still refuses to respect and preserve them when importing an OBJ file which already has them. As the original topic starter pointed out, if you export an OBJ with smoothing groups from 3ds Max, then turn around and re-import it straight back into Max, all the smoothing groups are still there, intact and unchanged. For some reason 3D Coat wipes them out, either replacing them with its own automated setup (if the smoothing option is enabled during import) or putting all faces into the same smoothing group (when the smoothing option is disabled during import). A quick way to check this round trip yourself is sketched after the list of posts below. If only 3D Coat would respect the smoothing groups already there instead of changing them, we could export our UV'ed meshes from 3D Coat back to programs like 3ds Max without having to go through the painful, hugely time wasting rigmarole of setting them all back up again from scratch. I'd give my left you know what for that... seriously, I would. http://3d-coat.com/mantis/view.php?id=493
PS: Importing an OBJ with smoothing groups for PTEX painting is even worse. I'm not at all sure what 3DC is doing to the normals of the mesh, but once back in 3ds Max, any smoothing groups you try to re-setup won't have any effect.
The only way to fix it is to set the OBJ importer in 3ds Max to faceted, or apply two Normal modifiers, each set to flip normals. After collapsing the stack, the mesh will be faceted and smoothing groups work once again.
  3. The lowest my frame rate will go using an overclocked GTX 670 (256-bit / 2048 MB GDDR5) with driver version 331.65 and 3D Coat 4.1.12 DX Cuda in Windows 7 SP1 64-Bit with a 30 million triangle mesh is 17 fps. I downloaded the 340.52 driver yesterday, but haven't installed it yet. Trying to figure out why I can't move, rename, or delete the installer lol.
  4. Just popping in to say THANK YOU for the new radial symmetry options! This is a major feature, at least IMHO, that had been missing from 3DC for far too long. I absolutely LOVE it and really appreciate the fact it's finally been added. I agree that "Show symmetry plane" should have been moved to the new dialog as well. Perhaps it might also be more convenient if pressing the hotkey a second time closed it? Right now it always pops up in the center of the viewport and 3DC won't remember its last location when moved. An alternative might be to have it always open under the mouse/pen cursor, which when combined with a hotkey toggle would make it quick and nimble. I guess I should probably also report that the "Hide visual guides" option in the render room doesn't fully work with the new symmetry option (i.e. the yellow indicator isn't properly hidden when using realtime rendering). That, and the fact undo doesn't work while the symmetry dialog is open, which is when you really need it to work (i.e. while testing various value changes). Many thanks once again, Andrew!
  5. I should have worded my original post a little better. The way I was looking at it was replacing greyscale bump maps with vector displacement maps instead. There is really no reason for it not to be just as universal by now. The dinosaur video AbnRanger posted was a good example. The ear video, which I've seen, is a different scenario to my mind, and in that case I can agree with you (I merge often). I guess my thinking is that if you go to the trouble of building in the ability to generate them for use in rendering, how big a step is it to also use them as a brush like we see in the dinosaur video? Or are boolean operations better in terms of easier coding and more efficient use of a PC's resources? I'm not a programmer, so maybe someone who is can comment on this. (For what the greyscale-versus-vector difference boils down to, see the small sketch after the list of posts below.)
  6. Agreed. It's surprising how, after all of this time, Mudbox is still the only sculpting app which can do this. I can sort of understand why Pixologic never implemented it given the way they went with their proprietary Insert Multi Mesh feature, but not 3D Coat which still relies on greyscale maps. I'd love to see vector displacement "stamps" become the norm instead. That way the library you build (and share!) could be used in any sculpting app exactly as shown in the video you posted.
  7. Ooooh, those autopo upgrades look exciting. Can't wait to play around with it and compare with ZRemesher (which I find to be unstable). Are we seeing the beginning of a game of one-upmanship between the two perhaps? I do hope so hehe. Ptex displacement works quite well using 3ds Max with Vray fwiw. It's my preferred method because seams are never an issue. I think one of the samples on the Ptex site shows this off too. It's a shame the industry has been so slow to adopt great things like Ptex and OpenSubdiv, while the few that have adopted them tend to have trouble with their particular implementations.
  8. BeatKitano pretty much nailed it dead on with his last post. I use 3D Coat primarily for retopo, UV setup, and bitmaps because those are what it excels at IMHO. I appreciate what 3D Coat is trying to do insofar as sculpting is concerned, particularly its ability to freely add, subtract, cut and whatnot, which comes closer to mimicking real clay than ZBrush does, especially when ease of use is considered. It just doesn't handle these and other sculpting tasks cleanly, which is where ZBrush, and even Mudbox, trump it. If Pixologic not only refines ZBrush but also adds all the stuff currently missing when they release 5.0, it will truly become unstoppable, with the final hurdles being its interface (which I'm fine with) and higher price. Not going to even comment on rendering lol.
  9. Were you upgrading from v3 EDU to v4 PRO? When I entered my v3 EDU serial, it took me to a page that said I would have to pay $84, but didn't say which version I'd be getting, EDU or PRO. I posted a question in the v4 upgrade thread about this a few days ago but haven't gotten an answer yet unfortunately.
  10. I want to upgrade from v3 EDU to v4 PRO but can't figure out how to do that nor how much it costs.
  11. He could delete the loop and draw a new one using the stroke feature too. This wouldn't work in every scenario, but for the example image that was posted it should work perfectly. And I agree, retopology is definitely one of 3D Coat's greatest strengths.
  12. He could be talking about ptex. When you import a mesh for ptex painting, 3D Coat destroys all of its UVs, replacing them with tiled ones, which is something ptex doesn't technically require. I assume this is because 3D Coat isn't a true ptex painter (like Mari), but more of a hybrid one where traditional UVs and bitmaps are used for painting and then converted to ptex only on export.
  13. I followed your instructions, AbnRanger, and the experience was exactly as you predicted. It's a very jarring, sudden change in performance completely dependent on brush size. It's speedy and just fine, then totally drops off with just a tiny change to the brush size. Shrink it back down just a tiny fraction and it goes right back to being speedy. Knowing Nvidia (and how corporate types think in general), their gaming cards are probably designed to work great only in games, while their pro cards are designed to work well only in CG apps. That way you're forced to buy both, or so they hope. Greed makes people do strangely illogical things. It would be interesting to see how AMD's 7970 would perform with 3DC if Andrew were to add OpenCL support. The article posted by L'Ancien Regime is an interesting read. Thanks for sharing it with us! I'll probably replace my GTX 670 with whatever blows away the AMD 7970. I try not to upgrade too often because even though it can be fun, it's often also time consuming and I do so hate the inevitable troubleshooting that tends to go with it lol.
As for memory and XMP, I had to turn it off because the timings it set would prevent my PC from getting past the BIOS screen, and sometimes not even that far. What I did was write down the settings it wanted to use, then entered identical settings into the BIOS myself using its manual override mode. Then it would boot perfectly fine and even ended up being super stable that way. Don't know why one way would work and the other wouldn't when the settings were identical, but there you have it. Fwiw they were Mushkin Enhanced Blackline Frostbyte DDR3-1600 rated for 9-9-9-24 timings at 1.5v. They easily ran at higher clock speeds so long as the timings were loosened, but after a lot of benchmarking I found that a slower frequency with tighter timings was actually a fair bit faster than a higher frequency with loose timings (the simple arithmetic behind why that happens is sketched after the list of posts below). Naturally YMMV.
  14. It sounds as if they could have actually released the hardware a while ago. If it truly was delayed for the reasons given, then it surely could have been sent out right now, given the other concerns are all things that could have been dealt with after the fact (e.g. taken from their forum: "building our developer program, Airspace, OS interaction, etc"). I highly doubt the pre-order folks would mind finally having the finished/polished hardware right now while downloading more stable software down the road. In a way it's a bit like the boy who cried wolf, the kind of thing that can make one start to doubt their sincerity, which is dangerous considering this particular product is a want and not a need for most people. Hopefully they'll get it right come June. It's a lot smaller than I thought it was, after looking at some images of it, and could actually be built into a Wacom tablet I think. In fact I could see myself attaching it to the top of my tablet with stick-on velcro strips. What I see in my mind's eye is holding the tablet pen in my right hand and sculpting/texturing with it at the same time as I use my left hand to manipulate the mesh (pan, zoom, rotate), kind of like holding one of those plastic models in one hand while painting it with the other. Can't really see myself using it to actually model though.
  15. "I have no idea what LuxMark scores are supposed to mean/prove?" I'm wondering the same thing. Synthetic benchmarks don't mean much IMHO, but I'm sure this one does to owners of AMD cards whom, after reading my post, are no doubt foaming from the mouth at this very moment lol. I used Nvidia cards in the beginning, switched to ATI for a while as they were called at the time, then went back to Nvidia due in part to driver support. In over a decade of use, I've only ever run into two problems with an Nvidia product. The first was a driver issue about three years ago that sounds very similar to bisenberger's problem, and the second was when I tried to set up SLI back when it was still relatively new tech. That one may have actually involved firmware rather than drivers and the manufacturer bent over backwards to make it right. As for my experience with CLI versus CUDA, using the former in Vray RT results in approximately the same performance as having it set to CPU mode, while the latter is much faster than either. On the other hand I don't see any difference between CUDA and SIMP when comparing these two 3D Coat versions. Take from that what you will, but personally I think AbnRanger is right about the situation. Not sure what the problem is with his 670 though as mine (made by Gigabyte) works great. And before anyone freaks out, I don't have anything against AMD and I would happily use a card designed by them provided some basic reassurances were met. After all, it's not just all about performance, is it now.
  16. There is a button in the tool palette which says Ptex local resolution when you hover over it. It's an image of a square with four smaller squares in the upper left corner of it. Clicking on it will bring up a bunch of options, at which point you can select which polygons you want to affect by left clicking on them (hold control key and left click to deselect).
  17. Cut him some slack, he's just one man trying to do all the work by himself and we're lucky we've gotten as much as we have since 2010. He really needs to hire a few more people to help out, but my impression from spending time here is that he really doesn't want to do that. Probably paranoid they'll steal his code or something lol.
  18. Hold down the shift key to smooth instead of using the effects tool, which doesn't work for me right now either.
  19. I completely agree. In the case of DPR, it's the small blocks of white on a mostly black / dark gray background that cause the buzzing, probably stemming from how refresh rates and liquid crystal technology in general interact. My thought was this could be the case with 3D Coat if the color scheme had been radically customized to something the monitor didn't like.
  20. The "Import images as mesh" in the file menu is a good one too.
  21. The hissing sound could be the monitor too. Mine always makes an annoying buzzing sound whenever I surf the Digital Photography Review website.
  22. I'm not a programmer, but if I had to guess I'd wager in addition to choosing and knowing a good programming language for it (like C for example?), a solid understanding of algebra, geometry (which has quite a few branches), and trigonometry is needed just for starters. One place to start might be to examine open source code written by others.
  23. Now that this thread has been resurrected by AbnRanger, I can post this since nobody seems to have mentioned it yet: http://corona-renderer.com/ It's a really nice renderer and will probably go commercial a few years down the road, but for now it's free to any Max users interested in testing it and lots of great examples can be found on their forum after signing up.
  24. A way to save render settings to their own file and then easily reload each of them, say via a drop-down menu labeled "Presets", is a perfect example of what I had in mind: simple things which are quick and easy to implement (hopefully). (A rough sketch of how simple such a preset file could be is included after the list of posts below.) The renderer in 3DC will no doubt never be capable of photorealistic output, nor as complex as those which are. However I do feel it is missing some very basic... um, basics. I find its output can be quite similar to Marmoset with a bit of effort and luck; video game output is often what it looks and "feels" like to me. That's not necessarily a bad thing though, and it should continue to embrace that instead of trying to copy others, such as ZBrush. Why reinvent the wheel? I too agree there is a lot of other stuff that needs to take precedence right now and for the foreseeable future, bugs in particular, but that doesn't mean the renderer should be ignored completely. For better or worse it's in there, isn't going away, and is actually useful to some artists as we've seen. Perhaps it's time to consider hiring someone to help out for a little while? Edit: Awesome post, b33nine!
  25. Is it safe to say the render room is the area that needs the most work right now, yet is also the one getting the least attention? I guess the real question is how important it is to users. I tend to avoid it, but I'd love to change that. Octane is a great example to follow, either to use as an extension of 3D Coat or to copy in general. Even just getting some good IBL would be a pretty big improvement at this point I think. Whatever the case, I'm glad to see I'm not the only one who's been thinking about this room.
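
A quick way to check the OBJ round trip described in post 2: the OBJ format stores smoothing groups as plain "s" statements, so comparing which groups are declared before and after a trip through an application is enough to see whether they changed. Below is a minimal Python sketch of that idea; the file names are placeholders, not anything 3D Coat or Max produces by default.

```python
# Minimal sketch: compare the smoothing groups declared in two OBJ files.
# "before.obj" / "after.obj" are placeholder names; point them at your own
# export from 3ds Max and the re-export from the app being tested.

def smoothing_groups(path):
    """Return the set of smoothing-group IDs declared via 's' statements."""
    groups = set()
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and parts[0] == "s":
                groups.add(parts[1])  # e.g. "1", "2", or "off"
    return groups

before = smoothing_groups("before.obj")
after = smoothing_groups("after.obj")

print("groups before round trip:", sorted(before))
print("groups after round trip: ", sorted(after))
print("preserved" if before == after else "changed during round trip")
```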
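On the greyscale versus vector displacement distinction in posts 5 and 6: a greyscale map can only push a vertex straight out along its normal, while a vector displacement map stores a full XYZ offset per sample, which is why a single stamp can reproduce overhangs such as the folds of an ear. A tiny illustrative sketch, not any app's actual displacement code:

```python
import numpy as np

vertex = np.array([0.0, 0.0, 0.0])
normal = np.array([0.0, 0.0, 1.0])

# Greyscale/bump displacement: a single height value pushed along the normal.
height = 0.5
scalar_displaced = vertex + normal * height  # can only move straight out

# Vector displacement: the map stores a full offset per sample, so the
# displaced surface can lean sideways or fold back over itself (overhangs).
offset = np.array([0.3, -0.2, 0.5])
vector_displaced = vertex + offset

print(scalar_displaced, vector_displaced)
```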
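As for the memory timings in post 13, the reason tighter timings at a lower clock can beat loose timings at a higher clock is straightforward arithmetic: first-word latency is roughly the CAS latency divided by the memory clock, and the memory clock is half the DDR transfer rate. A rough sketch; the DDR3-1866 CL11 figures are chosen only as an illustrative comparison, not the actual kit tested:

```python
# Rough first-word latency: CL cycles at the memory clock (half the MT/s rate).
def latency_ns(cas_latency, transfer_rate_mts):
    memory_clock_mhz = transfer_rate_mts / 2         # DDR: two transfers per clock
    return cas_latency / memory_clock_mhz * 1000.0   # cycles / MHz -> nanoseconds

print(latency_ns(9, 1600))   # DDR3-1600 CL9  -> 11.25 ns
print(latency_ns(11, 1866))  # DDR3-1866 CL11 -> ~11.79 ns, slower despite the higher clock
```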
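And for the render-settings presets requested in post 24, the feature really is conceptually simple: serialize the settings to a small file and load them back by name, which a drop-down could then list. A minimal sketch of the idea; the folder and parameter names are made up for illustration and are not 3D Coat's actual settings:

```python
import json
import os

PRESET_DIR = "render_presets"  # hypothetical folder for saved presets

def save_preset(name, settings):
    """Write a named preset to its own JSON file."""
    os.makedirs(PRESET_DIR, exist_ok=True)
    with open(os.path.join(PRESET_DIR, name + ".json"), "w") as f:
        json.dump(settings, f, indent=2)

def load_preset(name):
    """Read a named preset back as a dictionary."""
    with open(os.path.join(PRESET_DIR, name + ".json")) as f:
        return json.load(f)

# Hypothetical settings, just to show the shape of the data:
save_preset("soft_studio", {"rays": 64, "light_angle": 35.0, "dof": False})
print(load_preset("soft_studio"))
```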