3DCoat Forums

Gian-Reto
Advanced Member
  • Posts: 215
  • Joined
  • Last visited


Gian-Reto's Achievements

Novice (3/11) · 12 Reputation

Community Answers (1)

  1. Thanks for the quick response. The preset is the right one, Unreal Engine; I never changed it anyway. Before and after images attached: the first image is the mesh as edited in Blender and imported as OBJ, the second is after selecting "Shuffle/Pack" in the UV Tool room and applying. Both are shown with height information only. I exported the mesh and imported it into Blender and Unreal Engine: 3D Coat displays the height information accurately, but the normals appear garbled in every other tool I imported the mesh into.
  2. I have a rather simple lamp mesh that I exported to OBJ from 3D Coat and then modified in Blender to create a thin lampshade (duplicated the lampshade tris and flipped the normals). Then I painted everything; it looks good in the game engine. I later found that the overlapping UVs for the lampshade create a rather big issue: I can no longer paint the self-illumination of the inner part of the shade as an illumination mask, and instead have to use a static helper light and bake it into the shadowmap. Apart from the inefficiency of the process (the shader uses a self-illumination mask anyway), this largely defeats the point of using a Stationary main light in UE4: I can scale the brightness of the main light, but not the helper "self illum" light, as that one is static. Using another stationary helper light is a nonstarter, since UE4 allows a maximum of 4 overlapping stationary lights, and this helper light would eat into that budget for the small effect it has...

Long story short: I am looking for a way to re-shuffle my UV map so the overlapping parts no longer overlap, letting me use a self-illumination mask to light the inner part of the lampshade without affecting the outer shell, while keeping the ability to scale the brightness through the shader. The 3D Coat UV Tool function "Shuffle/Pack" seems able to do the job: one click, the overlapping UVs get separated, and the UV map is neatly re-arranged. After that I hit "Apply UV Set" (strangely enough, I have to hit apply in the Paint room again). All the color layers I imported get their islands moved to the correct location, and they look good. The gloss map has some black pixels around the seams, but nothing that couldn't be fixed later. The normal map, however, is completely broken. It is hard to describe: the islands seem to have moved correctly (the edges and grooves map to the correct locations on the mesh), but the normal direction of the pixels appears to be messed up.

This is of course quite annoying, as there is no way to fix that by hand. Is this a known issue? Has anyone else seen it before? Is there a setting I missed somewhere? My global setting is set to the Unreal Engine preset; is there a separate setting just for the UV Tools room? Is there another way to separate the overlapping UVs and have 3D Coat rearrange the islands in the textures that is less destructive to the normal map? Thanks in advance for any help.
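For what it's worth, the symptom described (islands landing in the right place but shading coming out wrong) matches what happens when tangent-space normal-map islands are rotated during repacking without the encoded vectors being rotated with them. A minimal per-pixel sketch of the correction a packer would have to apply — plain Python, purely illustrative, not 3D Coat's actual pipeline:

```python
import math

def rotate_normal_pixel(rgb, angle_deg):
    """Rotate the XY part of a tangent-space normal when its UV island
    is rotated by angle_deg. rgb is an (r, g, b) tuple in 0..1."""
    # Decode from 0..1 color values to a -1..1 vector
    x, y, z = (2 * c - 1 for c in rgb)
    a = math.radians(angle_deg)
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    # Re-encode to 0..1 color; Z (blue) is unaffected by in-plane rotation
    return tuple((v + 1) / 2 for v in (xr, yr, z))

# A normal tilted along +X; after the island is rotated 90 degrees,
# the same tilt must point along +Y, or lighting breaks.
tilted = (0.75, 0.5, 0.93)   # decodes to x=0.5, y=0.0
fixed = rotate_normal_pixel(tilted, 90)
```

If a packer rotates or mirrors islands and skips this step (mirroring additionally needs a channel flipped), the baked lighting breaks in exactly the way described: correct island placement, garbled shading.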
  3. This is actually something I have noticed before: 3D Coat, at least up to version 4.7.32, fills voxel sculpts with colors if you flood fill a per-pixel paint layer. Sometimes even airbrushed paint ends up there. I never paid much attention to it; I didn't realize it was blowing up the file size this much, probably because I didn't use flood filling all that often before. The sculpt room objects were not filled with any vertex color information before baking and starting to paint. It also usually converts my voxel objects to surface mode, probably linked to transferring the colors to vertex colors. I guess this is just the normal behaviour when you do not switch off voxel objects in the Paint room, as @Carlosan suggested. And while I am not sure why I cannot prevent the tool from transferring the color data, or from switching my voxel objects to surface mode, without hiding them, it's not the end of the world.
  4. Got a reply from Andrew. He was able to recover parts of the file and, more importantly, to pinpoint the issue: 1) part of the file was corrupted, probably due to not having enough disk space at the time; 2) the large size (which in turn made number 1 a problem) was caused by me flood filling paint layers, not knowing that the voxel sculpt in the background would also be filled via vertex colors. And as I had quite a large voxel sculpt in the Voxel Room, with many vertices to fill, that caused a lot of additional data to be written. So the takeaway for 3D Coat users is: 1) never flood fill whole layers, or 2) keep the baked meshes you paint in a separate file from your voxel sculpt, to avoid filling the sculpt's vertices with vertex colors.
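To put rough numbers on that: a back-of-envelope sketch of why flood-filled vertex colors on a big sculpt add up so quickly. The per-vertex byte count below is an assumption for illustration only, not 3D Coat's actual storage format:

```python
# Back-of-envelope: data added when flood-filling paint layers also
# writes vertex colors onto the voxel sculpt's vertices.
# 16 bytes = RGBA as 4 floats per vertex per layer (assumed, illustrative).

def extra_bytes(vertex_count, layers, bytes_per_color=16):
    return vertex_count * layers * bytes_per_color

# A hypothetical 30M-vertex sculpt with 8 flood-filled layers:
gb = extra_bytes(30_000_000, 8) / 1024**3
print(f"~{gb:.1f} GB of extra vertex-color data")
```

Even with much smaller per-vertex payloads, a multi-million-vertex sculpt times several filled layers lands in the gigabyte range, which fits the observed jump from ~1 GB to ~5 GB.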
  5. So I am back with results: I installed 3D Coat version 4.8.10, ran the DX build, and tried to open the same file. Same result. I then installed the GTX 1070 from the gaming rig in my work rig and tried opening the file again in 4.8.10: same result, still an "out of memory" exception, even though GPU-Z reports only around 1.8 GB of VRAM in use at the time of failure. Given we are now talking about 8 GB of VRAM, I guess we can rule out VRAM being too low. I will send Andrew an e-mail with a link to this thread, the dump, and a download link to the file.

I just noticed the file size: almost 5 GB. I am guessing something is not right here, given the voxel sculpt was only around the 1 GB mark last time I checked. Something must have gone wrong. What I did afterwards was just create a retopo mesh, bake, delete the baked mesh and layers, tweak the retopo mesh, rinse and repeat, maybe 10 or 15 times, until I was satisfied. Then I started painting. Around that time the "out of disk space" errors started hitting, and the undo history was going wonky. Does baking and then deleting the objects and layers leave something behind in the file? Could that have caused the file to grow to 5x the size of the voxel sculpt alone? Surely the retopo mesh, UV sets, and paint layers alone cannot be 4 GB in size, right?
  6. I have trouble opening a file. For some days now I have been getting "out of memory" exceptions while loading it.

With the DX version, at the time of the crash:
  • Memory used: 13.5 GB of 24 GB total available to the system
  • VRAM used: 1.7 GB of 3.5 GB effective on the video card
  • Virtual memory: increased to 300 GB total (100 GB on D: / 200 GB on H:)
  • Disk space free: 32 GB on C:, 49 GB on D:, 1.7 TB on H:

Trying to open the file in the GL version resulted in a similar crash; the only difference is that VRAM usage is at 2.4 GB at the time of the crash.

Hardware used:
  • CPU: i7 970 (6C/12T)
  • GPU: GTX 970 (3.5 GB VRAM)
  • Memory: 24 GB DDR3
  • Disks: 250 GB C: / 500 GB D: / 3 TB H:

VRAM measurements were taken with GPU-Z. It doesn't sound like 3D Coat is really out of memory at the time of the crash. I did have trouble with frequent "out of memory" exceptions while the file was open, before it became unusable. At that time disk space was low, until I cleaned out some old junk to make room. Back then I thought the problem was that the 3D Coat undo file was failing to write the history steps, given I was limited to a single undo step.

What I haven't tried yet is to download 4.8 and open the file there, because the page was down in the morning. Besides that, I have a GTX 1070 in my gaming rig that I can swap in to see if it's a VRAM issue (that card has more than double the effective VRAM), but as that is quite some work, I'd like to verify first that a VRAM size limit is a plausible cause. Given the VRAM readings at the time of the crash, is it possible that GPU-Z was not reporting the VRAM usage correctly, or that the reading was taken before 3D Coat tried to allocate a huge chunk of VRAM? Also, given that there is still unused memory at the time of the crash and I should have plenty of virtual memory, can I be completely sure it's not a limitation of the amount of RAM in the system?

If needed, I can provide a crash report. I didn't want to attach it to the thread for privacy reasons, given that it shows quite a bit about the files on my local drives. Should I send an official crash report once I have exhausted all my options (newest 3D Coat version tried, bigger VRAM tried), or reach out to Andrew directly?
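Since the undo file seemed to fail exactly when disk space ran low, a quick check of free space before big saves can rule that factor out early. A stdlib-only sketch (the 20 GB margin is just a guessed figure, not an official requirement):

```python
import shutil

def check_free_space(path, needed_gb):
    """Warn if a drive has less free space than an upcoming save might need."""
    total, used, free = shutil.disk_usage(path)
    free_gb = free / 1024**3
    if free_gb < needed_gb:
        print(f"Warning: only {free_gb:.1f} GB free on {path}, "
              f"want ~{needed_gb} GB headroom for the save + undo history")
    return free_gb

# e.g. a ~5 GB scene can plausibly need several times its size in
# scratch space for undo steps and temporary save data
check_free_space(".", 20)
```

Running this against the drive that holds both the scene file and the undo/temp directory before long sessions would have flagged the low-disk condition before it corrupted the file.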
  7. I second that. Fingerwheel support would be very welcome here. A generalized system that could read all the different additional channels, and maybe generic analog axes that could be mapped from other devices like foot pedals, would be great. You can never have too many analog axes to work with (unless you run out of fingers and feet to control them, of course ;))
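The generalized mapping idea could be as simple as normalizing any analog input to 0..1 and routing it to a user-chosen parameter range. A toy sketch; all device and parameter names here are hypothetical, not an existing 3D Coat or driver API:

```python
# Toy sketch of a generic analog-axis mapper: any input channel
# (pen fingerwheel, foot pedal, dial) drives any parameter range.

class AxisMapper:
    def __init__(self):
        self.bindings = {}   # axis name -> (setter, lo, hi)

    def bind(self, axis, setter, lo, hi):
        """Route a named axis to a parameter setter over [lo, hi]."""
        self.bindings[axis] = (setter, lo, hi)

    def on_input(self, axis, value01):
        """value01 is the raw axis reading normalized to 0..1."""
        if axis in self.bindings:
            setter, lo, hi = self.bindings[axis]
            setter(lo + value01 * (hi - lo))

brush = {"size": 10.0, "opacity": 1.0}
m = AxisMapper()
m.bind("pen.fingerwheel", lambda v: brush.__setitem__("size", v), 1, 100)
m.bind("pedal.left", lambda v: brush.__setitem__("opacity", v), 0, 1)
m.on_input("pen.fingerwheel", 0.5)   # mid-wheel -> size 50.5
```

The appeal of this shape is that adding a new device only means registering new axis names; the parameter side never has to know what hardware is attached.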
  8. Okay, gave the "retopo via decimate" approach a test run today, and I have to say I am officially impressed. For a moderately complex model with many sharp angles (a crystal), the algorithm did a fabulous job once I got the amount of decimation right. Cleanup was needed, but once the voxel object was smoothed there were no weird artefacts, just some unneeded bevels. I think this is pretty close to what I originally envisioned. Thanks for all the hints and tricks mentioned. I am grateful for all the help you have given me!
  9. Couldn't get that test done this weekend; I will try to look into it this week. I am pretty curious how well this method works, but I may not have time to try it until Friday. I'll update the topic as soon as I have. @digman: Is there a difference between how Blender and how Maya do the decimation? From my tests with 3D Coat voxel objects exported as meshes, it looks like the messy parts of the decimated mesh come from the exported ZBrush mesh, which most probably has some weird artefacts somewhere. Blender seems to work just fine with an exported 3D Coat voxel object, without really any cleanup needed. As I no longer have access to Maya, could you try the same thing with Blender to see if there is a difference in the output? Again, only if you have Blender installed and the time to run the decimate modifier. It would be cool to see how the two tools stack up in this particular use case.
  10. Gave exporting the voxel mesh a try... wow! Works like a charm. Why did I never see this option before? Is it new? I'm not 100% sure how useful the resulting decimated mesh will be for baking, but it seems to work rather well with the Blender decimate option, all things considered. It's certainly quicker to work from such a mesh than to first create a retopo mesh from scratch, in some cases at least. Of course, this is only really useful for static assets that are somewhat damaged, or of natural origin, where exactness of the mesh is of no real importance. An improved autopo algorithm for game asset creation would probably still have some value, as Blender decimate, while doing a fine job, sometimes produces questionable geometry and is not as configurable as I would wish. And of course, exporting and importing such a large mesh always takes quite a bit of time. BTW, thanks for the DarkBlender hint; I really have to check this theme out. Maybe Blender will start looking like the professional tool it is with this? Still have to look into some of the other options... but thanks a lot for all the useful information, guys!
  11. Which is a shame, really. Given that I had quite good results with the decimate modifier in Blender, is there any way to directly create a mesh from a voxel object and export that? I mean, 3D Coat has to create a mesh anyway to send to the GPU, sooooo... is there a way to export the mesh of millions of polygons that is used to display the voxel sculpt? That would allow exporting the sculpt and letting other 3D tools with different decimation capabilities run over it. Like Blender's decimate, which ain't perfect, but does the job.
  12. In fact I did not. It seems I missed some options in the Autopo window the last time I tried. I will try it out and report back how it works for me now. Thanks for the hint.
  13. But can you tell the algorithm to favor tris? I found that I often had to go in and add edges because the algorithm ignored part of my guides and placed a quad on an edge of my voxel model. Also, I still think the algorithm shouldn't rely wholly on density paint to choose the density of the mesh at certain points... that's just wasteful for game assets. But thanks for the hint about running autopo only on parts of the model. I'll have to try it one of these days when working with a simpler voxel object.
  14. Long-time 3D Coat user here; I have been using 3D Coat mainly to create game assets for my own hobby projects for some years now. I'm impressed by what the tool can do, and I need Blender less and less thanks to some of the features I have found in 3D Coat over the years. One thing never really worked for me, however, and that is the autopo feature. Now correct me if I am wrong, but I see the current implementation of autopo as optimized for people wanting to create 3D models for animation rather than games; the heavy emphasis on quads and an even distribution of the polygons seems to suggest this. For game assets, at least the static ones, that is the wrong approach in my experience. Getting the best silhouette match with the least amount of tris, and more often than not using tris instead of quads to make sure every single tri counts towards the silhouette, seems to be the right way forward.

Having purchased ZBrush lately, because some of its brushes are just better for general sculpting of rocky objects, I found that I could directly export the high-poly sculpt from ZBrush as a mesh. Besides being easy to import as a high-poly voxel object for retopo in 3D Coat, that also allowed me to import it into Blender and run the decimate modifier over it. To my amazement, the modifier actually worked, and gave me a 90-95% usable low-poly mesh I could then load into 3D Coat for retopo. Now, Blender decimate is not perfect. There are sometimes misplaced edges, which result in ultra-thin tris that give 3D Coat headaches during the bake and need to be cleaned up afterwards. Sometimes the algorithm just seems to stumble and fall on its face, resulting in areas with tons of tiny tris for no apparent reason (though that could be a problem with the ZBrush mesh). So I did have to clean up the meshes.

But even so, I got a mesh that very closely followed the silhouette of the high-poly mesh, and I was able to bake the low-poly mesh after about an hour or so of cleanup. Contrast that with the time I would spend placing guides on the high-poly mesh for autopo, only to STILL get a mesh of evenly spaced quads placed without taking the silhouette of the voxel sculpt into consideration apart from my guides, and I bet I would spend far longer going down that route. So what I would like to discuss here is whether there is a need for an improved autopo function that lets you select different algorithms to generate the autopo mesh. The idea is clearly not to replace the current algorithm, which I guess is fine for many users — what I am suggesting is very much tailored to the needs of game assets — but to give the user the option to select which meshing algorithm should be used.

The algorithm I envision would basically try to create a mesh matching the silhouette of the high-poly mesh or voxel object as closely as possible while using the fewest tris within a given tri budget. If there is a flat surface, ideally use only 2 tris, or however many are needed to get a good shape in case the flat surface resembles an octagon more than a quad, for example. If there is an edge in the high-poly mesh or voxel object, place the edge of an autopo tri on it. Favour the more acute edges and steeper angles first, only placing edges on smaller detail edges once the bigger details have been accounted for. Keep the tri budget input, keep the guidelines (you might want to help guide the topology for something like a face), keep the density modulator (in some use cases that might be helpful)... but the main work should be done automatically by the algorithm, so that when you run it on your voxel object you get a pretty usable mesh for a game asset right out of the box, without fiddling around with guidelines or density.

As long as you don't care about the topology, you should get a mesh that best captures the details of the high-poly object within the tri budget you set, without doing anything other than setting that budget and hitting the autopo button. Maybe a selector switch for tri or quad preference would be a good idea... after all, some people prefer quads, which still seem to be preferred for skinned meshes. Maybe a way to paint areas where one or the other should be preferred, the way it is currently done with density? What do you guys think? Am I just not getting the way the current autopo tool works, or is there a real need for an autopo revamp?
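For what it's worth, the behavior asked for here — flat areas collapsing to almost nothing while silhouette edges survive — is roughly what quadric-error decimation, the family Blender's Decimate modifier belongs to, already optimizes. A toy sketch of the cost such an algorithm assigns to removing a vertex (plain Python, illustrative only, not Blender's or 3D Coat's implementation):

```python
import math

def plane(p0, p1, p2):
    """Unit normal and offset of the plane through three points."""
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    nx, ny, nz = uy*vz - uz*vy, uz*vx - ux*vz, ux*vy - uy*vx
    l = math.sqrt(nx*nx + ny*ny + nz*nz)
    nx, ny, nz = nx/l, ny/l, nz/l
    d = -(nx*p0[0] + ny*p0[1] + nz*p0[2])
    return nx, ny, nz, d

def collapse_error(vertex, faces):
    """Sum of squared distances from `vertex` to the planes of the faces
    that would absorb it -- the core idea of quadric-style decimation."""
    err = 0.0
    for f in faces:
        nx, ny, nz, d = plane(*f)
        dist = nx*vertex[0] + ny*vertex[1] + nz*vertex[2] + d
        err += dist * dist
    return err

# A vertex in the middle of a flat floor: removing it costs ~nothing,
# so the decimator happily collapses flat regions down to a few tris.
flat = collapse_error((1, 1, 0), [((0, 0, 0), (2, 0, 0), (0, 2, 0))])
# The same vertex lifted off the plane (a silhouette bump): high cost,
# so the decimator keeps it and the silhouette survives.
bump = collapse_error((1, 1, 0.5), [((0, 0, 0), (2, 0, 0), (0, 2, 0))])
```

A full decimator repeatedly collapses the cheapest edge under a budget; the point of the sketch is only that a plane-distance cost naturally produces the "2 tris for a flat surface, edges placed on creases" behavior the post describes.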
  15. Still cannot post in the subforum... do I have to wait a day or something?