
Please, Help!!!! 3D-Coat - best pc build -> (AMD Threadripper?)


Rygaard

  • Reputable Contributor

I know Andrew had one of his developers working on a GPU-based brush engine, and that Andrew liked the test build he provided. Not sure where that stands now, but it may help to put that back on the front burner, if enough people chime in and ask Andrew about it (support@3dcoat.com). That and Sculpt layers in the Sculpt workspace.


  • Contributor
2 hours ago, AbnRanger said:

I know Andrew had one of his developers working on a GPU-based brush engine, and that Andrew liked the test build he provided. Not sure where that stands now, but it may help to put that back on the front burner, if enough people chime in and ask Andrew about it (support@3dcoat.com). That and Sculpt layers in the Sculpt workspace.

Thanks for letting me know!
I will contact support to ask about the possibility of a layer system and improvements to the brushes and tools in the Sculpt Room. In my opinion, all of this should be a priority in 3D-Coat.
 


  • Contributor

Please, everyone: if you can, also contact support and request these features from Andrew; we will all win. The more people ask, the more Andrew will be able to turn his attention to those requests!

Thank you.


  • Contributor
3 hours ago, Carlosan said:

I registered in 3D-Coat's Mantis bug tracker and added several +1s asking for a layer system in the Sculpt Room.

Yesterday I sent an email to support asking not only for a layer system in the Sculpt Room, but also for tool enhancements and tooling for the Sculpt Room, such as:
- A GPU-based brush engine
- A morph brush system similar to ZBrush's
- A tool that allows transferring details from one mesh to another
- Improved mesh detailing in the Sculpt Room (currently a very dense mesh is required to achieve good detail without it looking low resolution)
- Various other tools aimed at sculpting

I ask the whole 3D-Coat community to join in and request these implementations and improvements in the Sculpt Room. Only with all of us asking will the 3D-Coat developers know to steer 3D-Coat toward the features that are extremely important to us artists.


  • Reputable Contributor
23 hours ago, Rygaard said:

In the Paint Room, when sculpting or detailing with displacement, the painting process gets slow once I increase the subdivision of the mesh (Adjust Tesselation) and use 8K textures.
I know Mari was designed to handle 8K textures and up, but I wish 3D-Coat had better performance with 8K textures; it would be fantastic.

At the moment, it seems the best way to deal with this problem is to work with 4K maps for each character fragment, as you said (head, torso, etc.).

I hope that with 128GB of memory and a GTX 1080 Ti, painting and sculpting in 3D-Coat will perform excellently.

Excellent video tutorial, just like all the others! :)

If it's not asking too much, could I suggest a video tutorial?
Could you demonstrate how to achieve fine, realistic detailing on a character in 3D-Coat? I cannot detail my characters in 3D-Coat the same way as in ZBrush; the best realistic detail I can achieve is in ZBrush. :(
 

You would sculpt details in 3D-Coat much the same way you would in ZBrush. That is to say, you subdivide the mesh enough to sculpt high-frequency detail and use the various brushes available in the app to perform the task. There are lots of additional preset brushes in the PRESET panel. It's a good idea to start with the head bust available from the Splash screen > Voxel/Surface sculpting, then experiment with and tweak the presets to suit your preference (save/update them by right-clicking a preset and choosing UPDATE PRESETS).

As you finish your rough and intermediate stages, you may elect to use LiveClay to dynamically subdivide the face region and sculpt fine details. You could also leave the fine details until you have your low-poly mesh in the Paint Room. There you can sculpt those details using normal/displacement maps, and you actually have SCULPT LAYER functionality, including masking and even localized changes to the depth, as shown in the video a few posts back.

The same techniques you would use in ZBrush apply to 3D-Coat. The presets include various Polish, Pinch, Trim, and Clay brushes that are very similar to ZBrush's.
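For a rough sense of scale, here is a back-of-envelope sketch (my own assumed numbers, uncompressed RGBA layers; not anything from 3D-Coat's internals) of why subdivision levels and the 8K maps quoted above get heavy so fast:

```python
# Back-of-envelope numbers only; assumed values, not 3D-Coat internals.

def faces_after_subdivision(base_faces: int, levels: int) -> int:
    """Each subdivision level splits every face into four."""
    return base_faces * 4 ** levels

def texture_layer_mb(resolution: int, bytes_per_pixel: int = 4) -> float:
    """One uncompressed RGBA texture layer, in megabytes."""
    return resolution * resolution * bytes_per_pixel / 1024 ** 2

# A 20,000-face base mesh passes 20 million faces by subdivision level 5.
for level in range(6):
    print(f"level {level}: {faces_after_subdivision(20_000, level):,} faces")

# One 8K RGBA layer is 256 MB; a dozen paint layers approach 3 GB.
print(f"8K layer: {texture_layer_mb(8192):.0f} MB")
```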


  • Contributor
1 hour ago, AbnRanger said:

You would sculpt details in 3D-Coat much the same way you would in ZBrush. That is to say, you subdivide the mesh enough to sculpt high-frequency detail and use the various brushes available in the app to perform the task. There are lots of additional preset brushes in the PRESET panel. It's a good idea to start with the head bust available from the Splash screen > Voxel/Surface sculpting, then experiment with and tweak the presets to suit your preference (save/update them by right-clicking a preset and choosing UPDATE PRESETS).

As you finish your rough and intermediate stages, you may elect to use LiveClay to dynamically subdivide the face region and sculpt fine details. You could also leave the fine details until you have your low-poly mesh in the Paint Room. There you can sculpt those details using normal/displacement maps, and you actually have SCULPT LAYER functionality, including masking and even localized changes to the depth, as shown in the video a few posts back.

The same techniques you would use in ZBrush apply to 3D-Coat. The presets include various Polish, Pinch, Trim, and Clay brushes that are very similar to ZBrush's.

Thank you very much for the explanation of detailing.

I'll go deeper into the Presets; I've even created some.

My goal in detailing the mesh is 3D printing.

The first part, the character's primary and secondary forms, is easy to do with all the fantastic 3D-Coat tools.

The part that is a little complicated is the fine details.
When I try to use LiveClay, every brushstroke I make on the surface generates a large number of polygons, even when I control the detailing slider in the LiveClay tool. The result is an extremely dense mesh before I have even finished the character's final detail, and the computer starts to slow down.

Using the Paint Room for the fine details would be very good if my goal were animation, but I'm aiming at 3D printing.

The only thing I still need to understand in 3D-Coat is how to control the mesh while detailing, since in ZBrush the procedure is simply to divide the mesh to the limit the computer supports and use alphas with the desired detail.
In ZBrush the details do not look "low resolution" even at low subdivisions, whereas in 3D-Coat they still do unless you divide the mesh heavily.

I know what I'm trying to explain about fine detailing in 3D-Coat sounds a bit confusing, but it's very noticeable when you detail in ZBrush and then try to do the same thing in 3D-Coat. You end up feeling and seeing the difference.


  • Member
On 12/16/2017 at 2:48 AM, Rygaard said:

I thought that with 128GB of RAM, 3D-Coat could reach millions more polygons.

The maximum you work with is 10-15 million? Or does your work not need to go beyond that polygon count?

With the PC build that I will acquire, I hope to be able to work fluently in 3D-Coat. :)

A layer system might be easier for ZBrush, I suppose, since IIRC you're not working in proper 3D while sculpting; it's more of a 2.5D system, and the 3D mesh is created when you export. Maybe that is a reason we don't have it in the Sculpt Room. Just note the different volume types, surface and voxel, and that you can have a tree of objects (nested volumes within a volume) that you can sculpt across (still different from a layer system, I know). The Paint Room of course has a layer system that works the way you'd want, but I imagine that's because it isn't altering 3D data the way the Sculpt Room does. The Displacement brush has worked fine for me personally when I have painted depth in the Paint Room, but I haven't done anything there on a very dense model.

The maximum is due to an import bug. I import a PLY or FBX file that is a decimated version of several billion triangles; unfortunately, 3D-Coat errors and crashes on any file over 500MB. It claims to be out of memory, but I have plenty of RAM available plus disk paging/virtual memory (which is what Google results for "3D Coat out of memory" suggest). I have reported it as a bug but am not sure when/if it will be looked into and resolved. I experienced this issue on multiple machines and input files. There is also an "import multiple meshes" option under the File menu, but this has a bug too: while it lets me import many smaller chunks of the original data, it does not import the data correctly; it looks like a mess of triangles, and it also loses vertex colour.

Obviously it would be better to work at higher resolution, since we could retain the more important details at the decimation stage. In the current versions some areas are not very good because they lack the triangles to properly form the shape.
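To put that 500MB ceiling in perspective, here is a rough estimate (assuming a plain binary PLY layout with per-vertex colour; the actual importer's format may differ) of how many triangles fit in such a file:

```python
# Rough estimate under an assumed binary PLY layout, not 3D-Coat's importer.

POS_BYTES = 3 * 4          # x, y, z as 32-bit floats
COLOR_BYTES = 3            # r, g, b as uint8 vertex colour
FACE_BYTES = 1 + 3 * 4     # vertex-count byte + three 32-bit indices

def triangles_in_file(file_mb: float) -> int:
    # For a typical closed mesh, triangles ~ 2 x vertices, so each
    # triangle costs half a vertex record plus one face record.
    per_tri = (POS_BYTES + COLOR_BYTES) / 2 + FACE_BYTES
    return int(file_mb * 1024 ** 2 / per_tri)

print(f"{triangles_in_file(500):,} triangles")   # on the order of 25 million
```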

 

1 hour ago, Rygaard said:

The part that is a little complicated is the fine details.

When I try to use LiveClay, every brushstroke I make on the surface generates a large number of polygons, even when I control the detailing slider in the LiveClay tool. The result is an extremely dense mesh before I have even finished the character's final detail, and the computer starts to slow down.

I use a very small detail value, like 0.025, in the environments I work on. Often I am using Clean Clay -> Remesh rather than adding new detail via sculpting, though, so perhaps it works differently. You'll probably find the size of the brush and the surface it covers affect the amount of density, as with the Clean Clay brush. The Clean Clay brush also has a Decimate option: if you have smoothing set to 1 (full right on the slider), it will try to retain the overall shape while reducing the polygons. You can control how aggressive the reduction is with the detail value and brush size.
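As a sketch of the general idea (my guess at how this kind of adaptive tessellation behaves, not LiveClay's actual code), the detail value and brush radius together set a target edge length, which is why big brushes at small detail values explode the polycount:

```python
# Hypothetical edge-length criterion for adaptive tessellation;
# names and thresholds are assumptions, not LiveClay internals.

def target_edge_length(detail: float, brush_radius: float) -> float:
    """Smaller detail value and smaller brush => finer target edges."""
    return detail * brush_radius

def refine_edge(edge_length: float, detail: float, brush_radius: float) -> str:
    target = target_edge_length(detail, brush_radius)
    if edge_length > 2.0 * target:
        return "subdivide"        # edge too coarse under the brush
    if edge_length < 0.5 * target:
        return "collapse"         # edge finer than needed; decimate
    return "keep"

# With detail = 0.025 and a radius-10 brush, edges longer than ~0.5
# units get split, so a broad stroke touches (and densifies) a lot of mesh.
print(refine_edge(1.0, 0.025, 10.0))   # -> "subdivide"
```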

It might be possible to transfer the Paint Room mesh with displacement to the Sculpt Room? (Try looking in the top menus.) In the Sculpt Room, go to the top "Geometry" menu; near the bottom/middle there is something about a proxy, with an option to use reduce or decimate and by how many steps. You can then toggle the proxy and you'll have a lower-res version to work on for broader shapes. The tools available are reduced: you're not permitted to add or remove polygonal detail, only manipulate/move the current surface. When you toggle proxy mode off, it returns to the original dense version and applies the broader strokes. It is not always accurate to the low-res proxy version, but usually it is.


  • Contributor
51 minutes ago, kwhali said:

A layer system might be easier for ZBrush, I suppose, since IIRC you're not working in proper 3D while sculpting; it's more of a 2.5D system, and the 3D mesh is created when you export. Maybe that is a reason we don't have it in the Sculpt Room. Just note the different volume types, surface and voxel, and that you can have a tree of objects (nested volumes within a volume) that you can sculpt across (still different from a layer system, I know). The Paint Room of course has a layer system that works the way you'd want, but I imagine that's because it isn't altering 3D data the way the Sculpt Room does. The Displacement brush has worked fine for me personally when I have painted depth in the Paint Room, but I haven't done anything there on a very dense model.

The maximum is due to an import bug. I import a PLY or FBX file that is a decimated version of several billion triangles; unfortunately, 3D-Coat errors and crashes on any file over 500MB. It claims to be out of memory, but I have plenty of RAM available plus disk paging/virtual memory (which is what Google results for "3D Coat out of memory" suggest). I have reported it as a bug but am not sure when/if it will be looked into and resolved. I experienced this issue on multiple machines and input files. There is also an "import multiple meshes" option under the File menu, but this has a bug too: while it lets me import many smaller chunks of the original data, it does not import the data correctly; it looks like a mess of triangles, and it also loses vertex colour.

Obviously it would be better to work at higher resolution, since we could retain the more important details at the decimation stage. In the current versions some areas are not very good because they lack the triangles to properly form the shape.

The layer system in the Sculpt Room is not the same as the one we know in ZBrush; in 3D-Coat it is a volume organizer, for both voxel and surface volumes. As you said, the true layer system in 3D-Coat is in the Paint Room, and I also believe that the way you sculpt displacement in the Paint Room is completely different from sculpting the mesh in the Sculpt Room. Perhaps the advantage of sculpting displacement in the Paint Room is that you are using a mesh with fewer polygons, much lighter than a mesh of millions of polygons in the Sculpt Room.

What did support tell you about this import error with files over 500GB? There must be a way to fix this kind of problem...
As you said, memory is not the problem!

True, you need to work with an extremely dense mesh to access the finest details and get the most accurate shape. Since 3D-Coat is throwing the error on the file, what will you do? I would suggest splitting the mesh into two pieces, but it looks like that is also causing you problems. Unfortunately, I can't help you.

52 minutes ago, kwhali said:

I use a very small detail value, like 0.025, in the environments I work on. Often I am using Clean Clay -> Remesh rather than adding new detail via sculpting, though, so perhaps it works differently. You'll probably find the size of the brush and the surface it covers affect the amount of density, as with the Clean Clay brush. The Clean Clay brush also has a Decimate option: if you have smoothing set to 1 (full right on the slider), it will try to retain the overall shape while reducing the polygons. You can control how aggressive the reduction is with the detail value and brush size.

It might be possible to transfer the Paint Room mesh with displacement to the Sculpt Room? (Try looking in the top menus.) In the Sculpt Room, go to the top "Geometry" menu; near the bottom/middle there is something about a proxy, with an option to use reduce or decimate and by how many steps. You can then toggle the proxy and you'll have a lower-res version to work on for broader shapes. The tools available are reduced: you're not permitted to add or remove polygonal detail, only manipulate/move the current surface. When you toggle proxy mode off, it returns to the original dense version and applies the broader strokes. It is not always accurate to the low-res proxy version, but usually it is.

 

For sculpting, I'm not currently using LiveClay. I created a preset adapting the Buildup brush, activating Remove Stretching and Normal Sampling at 10%, which has worked pretty well for me.

I need to learn how LiveClay works properly. Thank you for the tips.

Yes, you can transfer a displacement mesh from the Paint Room to the Sculpt Room. I believe there are two ways to do this: the first is through the Bake menu, and the other is through the File > Export menu (choosing from three mesh density options) and then importing into the Sculpt Room.

I already tried using the proxy as if it were the subdivision levels we know from ZBrush: I sculpted at each level of mesh reduction/decimation, but unfortunately it made a mess of the mesh. I think to make this kind of method work, you should use the reduced/decimated proxy only for changes with the Move tool and broad sculpting, then return to the original mesh.


  • Member
2 hours ago, Rygaard said:

What did support tell you about this import error with files over 500GB? There must be a way to fix this kind of problem...
As you said, memory is not the problem!

I am not a customer yet; I've only used the trial version at home and at work to evaluate. I will be getting a license though, and hopefully then paid support to get it fixed. :) Support has already been pretty good on these forums and over a few emails (it's also holiday season for many, so I don't expect much action until 2018). Just to clarify, the limit was only 500MB of 3D data (and binary, so it contains more triangles).

I have a thread for the bug here:

I have sent a bug report to support. On the thread you can see a picture of importing the file the usual way; the broken version is the multi-import feature (which just lets you import multiple mesh files at once instead of each one manually). Ideally, though, I would like to import a larger dataset file; the original data is 30GB for that model, but we currently have data up to 120GB (it has to be reduced down to 500MB as a whole, or split into pieces/volumes). It's not as bad as it sounds: the original dataset is for baking to a low poly that I make with 3D-Coat. Often there is bad data that I repair for the low poly, but since I cannot edit such a large dataset for the high poly, I have to repair the textures once baked.


  • Contributor
40 minutes ago, kwhali said:

I am not a customer yet; I've only used the trial version at home and at work to evaluate. I will be getting a license though, and hopefully then paid support to get it fixed. :) Support has already been pretty good on these forums and over a few emails (it's also holiday season for many, so I don't expect much action until 2018). Just to clarify, the limit was only 500MB of 3D data (and binary, so it contains more triangles).

Really, the 3D-Coat community is wonderful! :)

I messed up saying 500GB; I actually meant 500MB!

Could this problem you are experiencing be a limitation of the trial version of 3D-Coat?


  • Member
46 minutes ago, Rygaard said:

Could this problem you are experiencing be a limitation of the trial version of 3D-Coat?

It is possible, but I think it would mention that instead of crashing with an out-of-memory error. It would also be mentioned somewhere if that were the case. :)

It also turns out that one of the meshes I used recently took over 8 hours to AUTOPO at draft quality (the fastest), on my personal machine (Intel Skylake quad-core 3.5GHz, 32GB RAM, GTX 1070), not the Threadripper machine at work. The support/lead dev said it should only ever take 5-10 minutes at most. I will try again soon on the fully repaired model, and at work, to see if there is any difference. I think it had trouble because the mesh is not watertight; I also ran AUTOPO with a 1-million-triangle target, which is much higher than the usual default. If it's still slow, I will try to get authorization to send the model for testing the feature; maybe it's a bug that can be fixed. The AUTOPO result was really good quality even for draft; I was impressed, but of course at 8 hours it's not easy to iterate/tweak.

The developers are responsive about resolving issues like this. I trialled an older version months ago (but the team was too busy with client work to show interest). Back then, all I really tried was importing some data. I think it could handle more than 500MB then, but imports took very long, 4-8+ hours (though after saving as the native .3b 3D-Coat format it loaded instantly); that was also the case for some smaller <500MB files, I think (PLY extension, binary format). IIRC the 4.8 release addressed this performance issue, and I could start to really use the trial version properly. So with some luck, the next release will resolve these import issues and artists will be happier being able to import more detail.

As a note, this is data from 3D-scanning software, and it is all over the place (not spatially sorted into clusters, and the quality of the mesh data/topology can be poor); other industries may not be affected by similarly sized datasets.


  • Reputable Contributor

@Kwhali

Autopo requires a clean, error-free mesh to function correctly. This is true of other auto-retopology programs too; Mudbox will crash if the mesh is not good. It does its own validation and will tell you the problems.

A mesh with errors will take hours, crash 3DC, and, if it does not crash, in a lot of cases give a poor retopo mesh in the end.

I have not seen your Autopo settings; these have a big influence too.

I repair scan models at times, though apparently not on the scale of yours, but I still know some tips.

For example, if you are using surface mode, you can switch to voxel mode to fix errors. When you return, 3DC will re-bake the vertex color data to the surface-mode model. I'm not talking about baking to the Paint Room here. Voxels are the best place to fix errors in scanned data.

In all the above I just mentioned a few things; if you'd like to get together on a Skype session to ask me questions about autopo and scan data, plus some autopo settings, just PM me. I do this as a volunteer...


  • Member
1 minute ago, digman said:

@Kwhali

Autopo requires a clean, error-free mesh to function correctly. This is true of other auto-retopology programs too; Mudbox will crash if the mesh is not good. It does its own validation and will tell you the problems.

A mesh with errors will take hours, crash 3DC, and, if it does not crash, in a lot of cases give a poor retopo mesh in the end.

I have not seen your Autopo settings; these have a big influence too.

I repair scan models at times, though apparently not on the scale of yours, but I still know some tips.

For example, if you are using surface mode, you can switch to voxel mode to fix errors. When you return, 3DC will re-bake the vertex color data to the surface-mode model. I'm not talking about baking to the Paint Room here. Voxels are the best place to fix errors in scanned data.

In all the above I just mentioned a few things; if you'd like to get together on a Skype session to ask me questions about autopo and scan data, plus some autopo settings, just PM me. I do this as a volunteer...

I have found that switching to voxel mode risks losing details/form, which is a big no-no. Perhaps I don't know how to do it properly. I understand that a voxel is like a pixel but a cube, and that the resolution quality of the whole mesh depends on how small the voxels are; so if you want to retain very fine details that exist in some parts of the volume, you must use a very dense voxel volume? The environments I work on lately have been buildings, like a house or a spiritual/cultural building with intricate details. We do not have the best scan equipment, and going back to recapture poor data areas is not always an option.
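As a toy illustration of why that gets expensive (assuming a dense grid at one byte per voxel; 3D-Coat's volumes are sparse and smarter than this):

```python
# Assumed dense cubic grid at one byte per voxel; real sparse volumes
# cost far less, but the scaling behaviour is the point here.

def dense_grid_gb(resolution: int, bytes_per_voxel: int = 1) -> float:
    """Memory for a dense cubic grid at the given per-axis resolution."""
    return resolution ** 3 * bytes_per_voxel / 1024 ** 3

for res in (512, 1024, 2048, 4096):
    print(f"{res}^3 grid: {dense_grid_gb(res):.1f} GB")

# 512^3 is ~0.1 GB, but 4096^3 is ~64 GB: halving the voxel size costs 8x,
# which is why a whole building at carving-level detail is so expensive.
```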

The imported data is not watertight, often with holes/damage from missing or misinterpreted detail (photogrammetry). It is a surface volume. I did try importing to a voxel volume with a higher-density mesh; this did not seem to hit the errors of importing a surface volume over 500MB, but the method just imported each vertex as a voxel sphere and merged nearby ones. It looked very much like a grid and could not produce a good-quality mesh resembling the surface data's triangles: either too sparse, with surfaces not connecting, or too dense and blobby, with lots of holes in between due to the grid-like reconstruction.

I did not do much with the AUTOPO settings. I think I set capture detail fairly high, around 90-95%, at draft quality, with no stroke guidelines or density masking. I can't recall if I allowed decimation from 10 million triangles; I don't think I did. The target triangle count was also raised to 1 million (high, but at this stage in the pipeline it is our low poly; it will be split for optimization and LODs later). I wasn't sure it was going to finish: on Windows I'd see 25% progress in the title bar and it would be completely unresponsive, not even showing the window contents; at home I was able to run it while I was at work, and when I came home it was still going for a bit while I browsed the forums, still saying 0% on Linux. But it did finish (a single-core/thread operation most of the time, from what I noticed), and the quality impressed me; I expected much worse. It identified good topology direction and kept the shape pretty well too (although some flat areas wasted many quads, while smaller detailed areas with smooth curved shapes lacked them and were a bit blocky at angles). It did fail in a few areas, causing holes (relatively small, usually only a few triangles beyond two areas). If it didn't take so long, I could probably try the stroke guides and density mask features it suggests to refine results, but I imagine that means recalculating from scratch, so I did not do that.

I did not expect good results, or even for the autopo to succeed. This was a test with data that was only halfway repaired; many intersections, holes, and problem areas have been fixed since. At present we just use decimation, and sometimes MeshMixer's remesh/reduce tools (though that tool is very slow and frustrating to use on so many triangles); it has a max-deviation setting, which has been really interesting. I have noticed that 3D-Coat decimates flat surfaces well, but if the mesh is slightly non-flat, it still tries to retain those forms. So noisy areas like carpet or woven decorations, where most of the detail can be brought out in real time by the engine (normal maps or displacement/parallax) instead of triangles, need to be flattened/smoothed out. Oddly, the decimate tools don't seem to apply to a selection the way poly remove and similar do (the polygonal lasso is disabled, and switching tools clears any mask selection made prior), so it requires a bit more care around edges to avoid affecting nearby surfaces that should not lose detail. When this is done in dense areas that are unimportant because they are quite flat, 3D-Coat's decimation works a charm, much faster than MeshMixer too. I should request a max-deviation (error-metric) variant for 3D-Coat, I guess. :)
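The max-deviation idea itself is simple to sketch (hypothetical helper names; not MeshMixer's or 3D-Coat's actual code): an edge collapse is allowed only while the collapsed vertex stays within a distance budget of the surrounding surface:

```python
# Sketch of a max-deviation collapse test; names and structure are my
# assumptions about the idea, not any tool's real implementation.

import numpy as np

def point_plane_deviation(point, plane_point, plane_normal) -> float:
    """Distance from a candidate collapsed vertex to a neighbouring face plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return abs(np.dot(point - plane_point, n))

def can_collapse(new_pos, neighbour_faces, max_deviation: float) -> bool:
    # Permit the collapse only if the new vertex deviates from every
    # surrounding face plane by less than the user's error budget.
    return all(
        point_plane_deviation(new_pos, p, n) <= max_deviation
        for p, n in neighbour_faces
    )

# Flat regions pass with a tiny budget and decimate aggressively; noisy
# carpet-like detail fails the test and keeps its triangles unless the
# budget is raised.
faces = [(np.zeros(3), np.array([0.0, 0.0, 1.0]))]
print(can_collapse(np.array([0.5, 0.5, 0.01]), faces, max_deviation=0.05))  # True
```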

Really interesting that the voxel volume can transfer vertex colour back to the sculpt volume during conversion. I only recently started to use vertex colour with imported meshes, which has been helpful (even though its quality is low from all the decimation needed to get the original data into 3D-Coat). If I find time to experiment some more with autopo on these meshes, I'll be sure to reach out to you :) (or post on the forum in general so others can benefit).


  • New Member
On 1/1/2018 at 7:22 AM, Carlosan said:

Gigabyte x399 Aorus Gaming 7

or 

ASROCK Taichi X399

Any recommendation?

I have the Gigabyte X399 Aorus Gaming 7 and am enjoying it! I haven't put it through any major rigors or tests yet, but so far it's holding up pretty well! I was able to get it for about $300 USD during the Black Friday shenanigans this year, which definitely helped as a deciding factor.



