3DCoat Forums

kwhali
Member · Posts: 65

kwhali's Achievements: Neophyte (2/11) · Reputation: 5

  1. I know of SOS; these bugs aren't an emergency. I'm not sure how to get rights to post in the bug report and feature request sections. I've been a user for a while and made quite a few posts, so does it require reputation, or becoming a paid customer? I just play with the demo/trial from the website; is that considered beta?
  2. When I open the Geometry menu and choose Close Holes, the popup reports 50+ holes. When I ask it to close them, some tiny holes that were hard to spot (such as single triangles) become large, yet the hole fill appears to have failed. Could I somehow highlight the holes instead, or cycle through them with Previous/Next buttons that align the camera to each hole? Since this is a surface, the mesh has some areas that might be considered a hole but are not, e.g. a window or door cavity with no thickness; perhaps this causes the failure. I could manually repair the actual holes if Coat could show me where they are. (For what I mean by hole detection, see the boundary-loop sketch after this list.)
  3. Yeah, texture streaming is a badly needed performance feature for this kind of work. I hope to see it in the 4.9 release if we're lucky.
  4. Please move this to the bug forum, as I'm unable to create posts there for some reason. I used the Sculpt Room Primitive tool and selected a cube from the Models tab (where the alphas and such for brushes live), with Click to Place, Scale to Brush Radius, and Use Stroke Direction enabled. I place the primitive onto an existing surface (e.g. another cube); it looks correct, and I can keep placing it.
Bug 1: if I change my mind and choose a different model (or just click the same cube model again), the lattice deforms. To correct this you must click Reset Axis (or Reset Primitive) and then select the model to load again; the lattice is corrected, which avoids a distorted mesh, although the orientation is lost (I'm not sure why the orientation was changing slightly in the first place). This bug is related to Use Stroke Direction.
Bug 2: you may notice the reloaded mesh/model has grown in size. Keep clicking the model icon to reload it and it grows progressively, up to a certain point (not sure why there is a limit). This occurs when the scale has been altered, and it only seems to affect scales smaller than the original, not larger; so altering the scale via the Transform/Lattice toggle or Scale to Brush Radius triggers the error.
Some further observations: using the Transform/Lattice toggle to scale/rotate the model, repositioning it with Click to Place, and then reloading the model doesn't appear to distort the lattice, but a slight rotation offset of the lattice from the model can be noted; this "fixes itself" (by applying the incorrect lattice transformation to the model) when transforming or repositioning the lattice/model (via Click to Place or the Transform/Lattice toggle gizmo). Testing just now, I noticed no difference from Use Stroke Direction alone; the lattice distortion from a rotation was most evident when the scale was made smaller, as with Scale to Brush Radius enabled. With no rotation, just Click to Place with Scale to Brush Radius and a small brush radius, then reloading the model: the lattice has enlarged but the model stays at its original size; use Click to Place without scaling to the brush, or use the gizmo to move/transform the shape, and it then applies the lattice to itself. If you load another model instead of transforming, it also applies the lattice (increases scale), then grows the lattice again with the model being replaced by a larger one. I tried Use Stroke Direction on different sides of the cube I was attaching the primitive to, and the lattice behaves differently (with no orientation via Use Stroke Direction, just Click to Place and Scale to Brush Radius, the lattice grows equally along all lengths); on the top face I noticed scaling only in the Y axis, and it also adjusted the orientation on that axis.
  5. I cannot post to the bugs forum yet; please move this there. When pressing Shift or Shift+Ctrl to switch to one of the alternate brushes like Smooth, Reduce or Anti-Bump, presumably the alternate brush is not meant to take on the behaviour/effect of the active primary brush? (Brush size/alpha etc., yes, but not the properties that make a brush unique, like pinch or flatten.) With Copy Clay, using either alternate brush first applies Copy Clay's effect (sampling or stamping mesh data to the volume) and then applies the alternate brush effect, such as smoothing or adding detail. Most tools don't seem to behave this way, so presumably it's a bug. I noticed that with the Freeze brush, holding Shift to toggle to a brush like Reduce (which is clearly not smoothing) will smooth out any mask selection; the actual intent of the alternate brush doesn't happen (it's neat that you can smooth a freeze like this, but it caused some confusion about why the Shift/Shift+Ctrl alternate brushes weren't working). Surface Hide behaves similarly, except it doesn't toggle hide/unhide in the alternate brush modes, which is what I'd expect; hence I think Copy Clay is behaving incorrectly.
  6. This still appears to be an issue almost 6 years later. I'm not noticing any effect from different alpha/depth/falloff or focal shift with Copy Clay across various combinations of tool options. Beyond depth values pushing the applied clay data to the mesh at a higher elevation, only Blend with Laplacian and adjusting the strength seems to have any real effect, and it doesn't necessarily allow proper blending with a falloff/smooth/softened edge if the cloned data has varied elevation.
  7. I'm unable to post in the bug forum; please move this there. I used some Live Clay to sculpt a detailed area, then used the Copy Clay tool to sample it and apply it elsewhere on the mesh. I noticed that once I sampled this data, 3D Coat became quite laggy/stuttery on an Intel Skylake i5-6500 (quad core, 3.2 GHz, 4 cores/4 threads). While it was a bit difficult due to the very low FPS, once I navigated the mouse to the tool options and cleared the data, CPU usage for Coat dropped from 70% to 10-15%. I also tried hiding the layer, with no effect. It turns out that Blend being enabled caused the high CPU load; with it disabled, CPU usage is still elevated (averaging about 20-25%). Clearing the data is equivalent to changing tools, which brings CPU usage back to normal (until you return to the tool). When changing tools I sometimes noticed the sampled mesh data appear in grey, like when using the Import or Primitives tools. I imagine that even while it is not visible when using the tool, perhaps 3DC is still interactively moving this mesh around, so the denser the sampled mesh data, the heavier the CPU load? Is there any reason to do this? If there is no interactive preview, then perhaps the tool should not burn CPU non-stop and should just apply when the user clicks. I imagine CPU usage is higher with Blend enabled because it's interactively blending in advance, before applying on click? The brush only seems to apply on a single click, not as multiple instances along a brush stroke, so this looks like a bug to me.
  8. If you have a hole that is not a usual hole shape (e.g. an L or a U; a line-shaped hole, if you will), these seem to have problems closing, especially so if you have a ring-shaped hole with an outer and inner surface (within the same volume) that you want to merge/connect into a single mesh; understandably this is something tools like Fill Holes and Poly Remove (with remesh after) seem to struggle with. In addition, if I have a surface with the edge chipped off, like ___n___ where the n is missing data, I have noticed Poly Remove with remesh enabled can also struggle. How do you go about closing/extending these? I have tried booleans in the past with varied success; the Muscle brush has worked best, requiring some extra cleanup afterwards, but it struggles in some situations, often giving a failed boolean operation. (For a guess at why odd-shaped holes fail, see the hole-fill sketch after this list.)
  9. Sometimes boolean operations fail; the error is long and usually about intersection. Adjusting and trying again (especially with, say, the Muscle brush) can be a frustrating process... Any tips on how to resolve this? Is it just bad geometry, or too little or too much triangle density, or too big a density difference between the two volumes? (See the boolean pre-check sketch after this list for the usual preconditions.)
  10. I don't have permission to post in the other forums like bugs/feature requests; please move this to the appropriate forum. In the Sculpt Room, I used the top menu Geometry -> Smooth All. I later found out that to smooth just the tangents I need to set the smooth degree to 0. While experimenting with this feature, I noticed that there is no Cancel: pressing Escape closes the window but still applies the effect, so you have to undo the operation. There is no way to prevent it being applied once the popup appears.
  11. I was not aware of focal shift; it is not displayed by default (I showed it in the top panel by enabling it in preferences). Falloff was at 0%, yes. I have tried tweaking the values of both, and neither affects the height range of the flatten effect; only increasing the brush size/radius does. Just to clarify, I am using Surface Tools -> Flatten, with "On Plane" selected, "Type of Surface" set to "Plane Defined by RMB", and RMB Action set to "Pick Point & Direction". Should I record and share a video to show the issue? I do like that this brush has a cut-off point for the height disparity it will apply the flatten effect to, but to use a smaller brush size (since the polygonal lasso is not permitted) it needs to be possible to adjust the cut-off point, and preferably keep it the same across brush sizes instead of scaling it with brush size.
  12. I have found that switching to voxel mode risks losing details/form, which is a big no-no. Perhaps I don't know how to do this properly. I understand that a voxel is like a pixel but a cube, and the resolution quality of the whole mesh is determined by how small the voxel is; so to retain very fine details that exist in only some parts of the volume, you must use a very dense voxel volume? (See the voxel arithmetic sketch after this list for why that gets expensive.) The environments I work on lately have been buildings, like a house or a spiritual/cultural building with intricate details. We don't have the best scan equipment, and going back to recapture poor data areas is not always an option. The imported data is not watertight, often with holes/damage from missing or misinterpreted detail (photogrammetry). It is a surface volume; I did try importing to a voxel volume with a higher-density mesh. That did not seem to hit errors like importing a surface volume >500MB does, but the method just imported each vertex as a voxel sphere and merged nearby ones. It very much looked like a grid and could not produce a good-quality mesh resembling the surface data's triangles: either too sparse, with surfaces not connecting, or too dense and blobby in areas, with lots of holes in between due to the grid-like reconstruction.
I did not do too much with the AUTOPO settings. I think I set capture detail fairly high, around 90-95%, draft quality, no stroke guidelines or density masking; I can't recall if I allowed decimation from 10 million triangles (I don't think I did), and the target triangle count was raised to 1 million (high, but for this stage in the pipeline it is our low poly; it will be split for optimization and LODs later). I wasn't sure if it was going to finish: on Windows I'd see 25% progress in the title bar and it would be completely unresponsive, not even drawing the window contents; at home I was able to run it while I was at work, come home, and browse the forums while it kept going, still saying 0% on Linux. But it did finish (mostly a single-core/thread operation from what I noticed), and the quality impressed me. I expected much worse: it identified good topology direction and kept the shape pretty well too (although some flat areas wasted many quads while smaller detailed areas with smooth curved shapes lacked them and were a bit blocky at angles). It did fail in a few areas, causing holes (relatively small, usually only a few triangles beyond 2 areas). If it didn't take so long I could probably try the stroke guides and density mask features it suggests to refine the results, but I imagine that means recalculating from scratch, so I did not.
I did not expect good results, or even for the AUTOPO to succeed; this was a test with data that was only half-way repaired, and many intersections, holes and problem areas have been fixed since. We presently just use decimation and sometimes MeshMixer's remesh/reduce tools (but that tool is very slow and frustrating to use on so many triangles); its max deviation setting has been really interesting (see the max-deviation sketch after this list). I have noticed that 3D Coat decimates flat surfaces well, but if the mesh is slightly non-flat it still tries to retain those forms; so noisy areas like carpet or woven decorations, where most detail can be brought out in real time by the engine (normal maps or displacement/parallax) instead of triangles, need to be flattened/smoothed out first.
Oddly, the decimate tools don't seem to allow applying to a selection the way Poly Remove and similar tools do (the polygonal lasso is disabled, and switching to the tool removes any mask selection made prior), so it requires a bit more care around edges to avoid affecting nearby surfaces that should not lose detail. When this is done in dense areas that are unimportant because they're quite flat, 3D Coat's decimate works a charm, and much faster than MeshMixer too. I should request a max deviation (error metric) variant for 3D Coat, I guess. Really interesting that the voxel volume can transfer vertex colour back to the sculpt volume during conversion; I only recently started using vertex colour with imported meshes, which has been helpful (even though its quality is low from all the decimation of the original data needed to get it into Coat). If I find time to experiment more with AUTOPO on these meshes, I'll be sure to reach out to you (or post to the forum in general so others can benefit).
  13. It is possible, but I think it would mention that instead of crashing with an out-of-memory error; it would also be mentioned somewhere if that were the case. It also turns out that one of the meshes I used recently took over 8 hours to AUTOPO at draft quality (the fastest), on my personal machine (Intel Skylake quad core 3.5 GHz, 32GB RAM, GTX 1070), not the Threadripper machine at work. Support/lead dev said it should only ever take 5-10 minutes at most. I will try again soon on the fully repaired model, and at work, to see if there is any difference. I think it had trouble because the mesh is not watertight; I also ran AUTOPO with a 1 million triangle target, which is much higher than the usual default. If it's still slow after that, I will try to get authorization to send the model for testing the feature; maybe it's a bug that can be fixed. The AUTOPO result was really good quality even for draft, and I was impressed, but of course at 8 hours it's not easy to iterate/tweak. The developers are responsive in resolving issues like this. I did a trial on an older version months ago (but the team was too busy with client work to show interest). Back then, all I really tried was importing some data; I think it could handle more than 500MB then, but it took a very long 4-8+ hours to import (though after saving as .3b, 3D Coat's native format, it loaded instantly). That was also the case for some smaller <500MB files I think (PLY extension, binary format). IIRC the 4.8 release addressed this performance issue, and I could start to really use the trial version properly. So with some luck, the next release will resolve these import issues and artists will be happier being able to import more detail. As a note, this is data from 3D scan software, so it is all over the place (not spatially sorted clusters, and the quality of the mesh data/topology can be poor); other industries with similarly sized datasets may not be affected.
  14. I believe I've already raised it on the forums in the past and notified Andrew. The only competitors I'm aware of are Substance Painter (which can't paint across UDIMs/UV sets) and MARI (not particularly nice to use or train staff on, plus expensive, but the only one that does texture streaming at this scale well and can paint across tiles/sets). I will push for it again in 2018, after the holiday season perhaps. It would definitely be a game changer for my industry; there's not much competition out there, and 3D Coat is great, friendly and affordable software to use (a monthly subscription / rent-to-own license would be nice too, of course). All our artists love it over ZBrush now.
  15. I am not a customer yet, having only used the trial version at home and at work to evaluate. I will be getting a license though, and hopefully then paid support to fix this. Support has already been pretty good on these forums and over a few support e-mails (it's also the holiday season for many, so I don't expect much action until 2018). Just to clarify, the limit was only 500MB of 3D data (and binary, so it contains more triangles). I have a thread for the bug here: I have sent a bug report to support. On the thread you can see a picture of importing the file the usual way; the broken version is from the multi-import feature (which just lets you import multiple mesh files at once instead of each one manually). Ideally, though, I would like to import a larger dataset file: the original data is 30GB for that model, but we have data up to 120GB currently (it has to be reduced down to 500MB as a whole, or split into pieces/volumes). It's not as bad as it sounds: the original dataset is for baking to a low poly that I make with 3D Coat. Often there is bad data that I repair for the low poly, but since I cannot edit such a large dataset for the high poly, I have to make the repairs to the textures once baked.
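A minimal sketch of what I mean by hole detection in post 2, assuming a simple indexed triangle mesh with manifold boundaries (the function and names are illustrative, not 3D Coat's internals): an edge used by only one triangle lies on an open boundary, and chaining boundary edges yields one loop per hole. Note that a door or window cavity with no thickness produces boundary loops too, which is one plausible reason a tool can mistake it for a hole.

```python
# Illustrative only: find open boundary loops ("holes") in an indexed
# triangle mesh. Assumes manifold boundaries (each boundary vertex has
# exactly one outgoing boundary edge).
from collections import defaultdict

def boundary_loops(triangles):
    # Count how many triangles share each undirected edge.
    edge_count = defaultdict(int)
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edge_count[tuple(sorted((u, v)))] += 1

    # Edges used by exactly one triangle lie on a boundary; keep their
    # winding direction so the loops chain up head-to-tail.
    nxt = {}
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            if edge_count[tuple(sorted((u, v)))] == 1:
                nxt[u] = v

    # Walk each chain until it closes: one loop per hole.
    loops, seen = [], set()
    for start in list(nxt):
        if start in seen:
            continue
        loop, v = [], start
        while v not in seen:
            seen.add(v)
            loop.append(v)
            v = nxt[v]
        loops.append(loop)
    return loops

# A quad split into two triangles has a single boundary loop:
print(boundary_loops([(0, 1, 2), (0, 2, 3)]))  # [[0, 1, 2, 3]]
```

A hole browser like the one requested would just frame the camera on each loop's bounding box in turn.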
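For post 8, a sketch of the naive way holes get closed (a centroid fan), which also hints at why L/U-shaped slots and ring-shaped gaps fail: the fan assumes a roughly convex, roughly planar loop. This is illustrative, not 3D Coat's actual fill algorithm, and uses numpy.

```python
# Illustrative only: close a hole by fanning triangles from the loop's
# centroid. Works for roughly convex, roughly planar boundary loops;
# L/U-shaped slots or ring-shaped gaps violate those assumptions and
# need a constrained triangulation (or manual bridging) instead.
import numpy as np

def fill_hole_fan(vertices, loop):
    """vertices: (N, 3) array; loop: ordered boundary vertex indices."""
    centroid = vertices[loop].mean(axis=0)
    vertices = np.vstack([vertices, centroid])
    c = len(vertices) - 1
    # One new triangle per boundary edge, all sharing the centroid.
    tris = [(loop[i], loop[(i + 1) % len(loop)], c)
            for i in range(len(loop))]
    return vertices, tris
```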
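For post 9, boolean solvers generally assume both inputs are closed, self-intersection-free volumes with consistent face winding, so long "intersection" errors usually point at one of those preconditions being broken (a huge triangle-density mismatch between the volumes makes robust intersection harder too). A sketch of checking the preconditions first, using the open-source trimesh library rather than 3D Coat's engine:

```python
# Illustrative only: sanity-check meshes before a boolean, via trimesh.
import trimesh

def checked_union(a, b):
    for name, mesh in (("a", a), ("b", b)):
        if not mesh.is_watertight:
            raise ValueError(f"mesh {name} has open boundaries (holes)")
        if not mesh.is_winding_consistent:
            raise ValueError(f"mesh {name} has inconsistent face winding")
    # trimesh delegates to an external boolean engine (e.g. Blender or
    # OpenSCAD), which must be installed for this call to succeed.
    return a.union(b)

box = trimesh.creation.box(extents=(2, 2, 2))
ball = trimesh.creation.icosphere(radius=1.2)
print(checked_union(box, ball).is_watertight)  # True if the union worked
```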
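For post 12, a back-of-envelope voxel arithmetic sketch of why a single uniform voxel size struggles to hold fine detail on a building-scale scan. The numbers are illustrative, and 3D Coat's voxel storage is sparser than a dense grid, but the scaling argument stands.

```python
# Illustrative arithmetic only: voxel count needed to keep ~1 mm detail
# across a ~20 m building at one uniform resolution.
bbox_m = 20.0            # longest side of the scan's bounding box
detail_m = 0.001         # finest carving detail we want to keep
voxel_m = detail_m / 2   # need roughly 2 voxels per smallest feature

cells_per_axis = bbox_m / voxel_m   # 40,000 cells per axis
dense_cells = cells_per_axis ** 3   # 6.4e13 cells in a dense grid
print(f"{cells_per_axis:,.0f}^3 = {dense_cells:.1e} cells")
# Even at one byte per cell that is tens of terabytes dense, which is
# why the voxel size ends up chosen for the whole volume and the finest
# localized detail is what gets sacrificed.
```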
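Also for post 12, the "max deviation" error metric I liked in MeshMixer boils down to: after simplifying, measure how far the new surface drifted from the original and reject results past a tolerance. A max-deviation sketch using trimesh's closest-point query (illustrative; decimate with whatever tool you like, then measure):

```python
# Illustrative only: worst-case deviation of a simplified mesh from the
# original surface, the core of a "max deviation" decimation setting.
# Both arguments are trimesh.Trimesh instances.
import trimesh

def max_deviation(original, simplified):
    # Distance from every simplified vertex to its nearest point on the
    # original surface; the max is the worst drift decimation caused.
    _, dist, _ = trimesh.proximity.closest_point(original,
                                                 simplified.vertices)
    return dist.max()

def within_tolerance(original, simplified, tol):
    dev = max_deviation(original, simplified)
    return dev <= tol, dev
```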