3DCoat Forums

Everything posted by DragonFist

  1. Thanks, been here for a while though mostly posted bugs/troubles I ran into. Love 3DC. The attention to detail and customer needs from Andrew and the team is top notch!
  2. If you own a 3Dconnexion SpacePilot (or any of their other devices, or just want to help for that matter), head over to 3Dconnexion.com :: View topic - Adding SpacePilot support to 3DxWare 10 and vote in the poll to get it supported by the latest drivers. For some reason, 3Dconnexion thought it was a great idea to block support for this device even though it was still being sold in 2010 and is supposed to be on "Continued Support" until 2013 (which means, according to their own site: "Continued Support: The software for this product will be maintained until this deadline."). These drivers add full driver support for the first time, letting you use the device in any program as a mouse, joystick, or keyboard. Not supporting this device in them is a big deal. Customers have been asking for this functionality for years (the earliest posts I have read date to 2007), and one can still buy the device from vendors. 3Dconnexion stopped selling it in March of last year, but that's no excuse to leave customers who bought it before or after out in the cold now that they are finally providing full drivers that don't require software makers to hardcode support for the device. Besides, the problem isn't that the drivers can't support the device (it uses the same tech as the current model, just without a color LCD and with fewer buttons). The drivers do provide the device data, and programs with hardcoded support can still use them. The part of the software that maps the data to joystick/mouse/keyboard bindings has simply been coded to ignore the device, presumably to encourage owners to upgrade to the SpacePilot Pro (a $500 device; hell, the original SpacePilot was $400. That's a lot of money to pay and not get the fully functional drivers!). So head on over there and let them know they should support their products. Thanks
  3. Just an idea, since I've found myself in need of a similar feature recently: a means of baking or projecting the UV set of one mesh onto that of another during retopology, so that the UV sets match, like the GATOR tool in XSI. (I know the surfaces can be baked to another UV set, but relatively matching UVs would make some things easier at times.) Like I said, I found myself looking for a tool for this (found XSI's) and thought it would be nice to have in 3D Coat, as it would make it more of a one-stop shop for these kinds of operations.
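The GATOR-style transfer described above can be approximated very simply: for each vertex of the new (retopologized) mesh, find the nearest point on the old mesh and copy its UV. Below is a minimal nearest-vertex sketch in Python; the function name and brute-force lookup are my own illustration of the idea, not how GATOR or 3D Coat actually implement it (real tools project onto the source surface rather than snapping to its vertices):

```python
def transfer_uvs(src_verts, src_uvs, dst_verts):
    """Give each destination vertex the UV of its nearest source vertex.

    src_verts: list of (x, y, z) tuples on the original mesh
    src_uvs:   list of (u, v) tuples, one per source vertex
    dst_verts: list of (x, y, z) tuples on the retopologized mesh
    """
    def d2(a, b):
        # Squared Euclidean distance; no sqrt needed for comparison.
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    result = []
    for v in dst_verts:
        nearest = min(range(len(src_verts)), key=lambda i: d2(v, src_verts[i]))
        result.append(src_uvs[nearest])
    return result

# Tiny example: a retopo vertex near (0,0,0) inherits that vertex's UV.
src_v = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
src_uv = [(0.0, 0.0), (1.0, 0.0)]
dst_v = [(0.05, 0.0, 0.0), (0.95, 0.1, 0.0)]
print(transfer_uvs(src_v, src_uv, dst_v))  # [(0.0, 0.0), (1.0, 0.0)]
```

For dense meshes a spatial index (k-d tree) would replace the brute-force loop, but the matching idea is the same.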
  4. Thanks. It is all good. Glad you found the cause and fixed it. I finished up that model in the old version and will use the new from now on.
  5. As it turns out, if I re-import the model, it works fine. The .3b file that I was using from the original import from an older version still crashes. However, reimporting it handles the problem.
  6. Okay, opened it up quick and tried it. Still crashes when the brush goes over the inner geometry. (I should note, I didn't cap the holes with 3DC; I did it manually in LW.) I have no issue sending you the model if that would help (and if I knew where to send it). Other than removing the layers I wasn't using at the moment and capping off the holes, it is just the basic model that MakeHuman spits out. Best, Shawn
  7. I sent a couple of crash reports, but wanted to voice it here. I've been importing a MakeHuman object that I cap the holes on. Up until version 35.03, it worked fine. Any version with the new brushes crashes as soon as the brush passes over a section with inner geometry (such as the face, with the inside of the mouth) in PPP. I can paint fine as long as I don't pass the brush over, say, the lips or ears, where there is such geometry. It was fine in versions prior to the 35.04 set. Thanks for a great product. Given the development pace so far, I have no doubt we'll see a solution soon. Best, Shawn
  8. A feature request. You know how, if you have a color selected, materials modulate that color. Well, I'd like to be able to have that effect applied to a layer's color instead of a single color. That way one could apply subtle changes of shade to a layer based on a mask.
  9. It can be quite useful as it is. But sometimes I rotate my view and the scale of the mask changes. This is annoying, as I'm not zooming, and I have to rescale to get back to where I was. Also, sometimes in the retopo room, when you rotate the camera using the widget at the top, curves get drawn on the mesh. That may not be related, but it's a similar issue.
  10. As for tool zooming, I think it can be useful as it is but it can also be a pain. Would like to see it be toggled by a checkbox or something.
  11. I really don't want to go down this road further. The fact is that this statement simply isn't true. There are so many other possible factors that there is no way that statement can be made as fact. I can think of several things to check right off the top of my head: corrupt drivers, any of the issues mentioned in earlier posts that are now visible because the drivers are engaged, etc. Secondly, if the problem were solely driver-related, then the majority of computers out there would be experiencing the same problem with the same drivers. I don't assume a cause until I've found what fixed it, because until then I don't know what caused it and any theories I have are just that. And that's the issue. Anyhow, use what works for you. Both ATI and Nvidia have worked for me.
  12. I don't argue your point. My problem is with blanket comments on the usability of one product over another based on "their drivers suck", when the given information is likely not caused by said drivers (for example, the Catalyst control panel not opening indicates problems well beyond a driver issue). I don't dispute that Nvidia has gained a strong step forward in the GPU rendering arena; I would not buy ATI if that were the application area I intended to use. For games, I would buy ATI at the moment. For 2D and 3D general applications, I've found no legitimate basis to prefer one over the other. I've had BSODs with either and, for a while, Nvidia's drivers resulted in fans shutting off (never had this issue with ATI). Most BSOD issues I've experienced, when I traced them down, actually had nothing to do with the video card but with either the system bus or the hard drive controller during access to the page file, and were handled by upgraded (or downgraded) drivers for the motherboard. That's been my experience. Obviously, mileage may vary. I don't have the same system as others and can't base my experience on theirs. But it goes both ways. In the end, buy what you want. I'm no ATI fanboy; I've owned cards by both companies and have generally been happy with all purchases (one exception was the dual-GPU cards ATI produced in the late 90s, which worked with almost nothing and crashed on nearly any game or app unless I disabled one GPU. Wasn't happy with that one at all). But it bugs me a bit when I see these blanket statements made regarding one product over another based on one bad experience that is likely a bad configuration, an insufficient power supply, a bad RAM stick, etc. When I installed the current card in my current system, I had crashes whenever the card was put to use for 3D. Changing drivers did nothing; returning to the old card did. Finally, I discovered that my 5-year-old power supply was the problem and couldn't keep up with the new card. A new power supply handled it and I've had no issues since. And every time I've fixed someone else's issues, it has been similar: the card didn't fit the case and was on an angle, the third RAM stick was bad, old drivers from a card by the same company were never removed before installing the new one, etc. I've just never seen frequent crashes be the result of drivers from either company, with some noted exceptions, like the fan issue mentioned earlier (which was caught and handled by Nvidia, with a recommendation to roll back until they found the cause). Not saying it can't happen, but I haven't seen it, yet I see it claimed as the reason often.
  13. I disagree with the anti-ATI sentiments. I'm currently using a 3800 and have owned primarily ATI cards in the past, with two Nvidias in there. I had just as much trouble with the Nvidias as I have had with the ATIs, which hasn't been much. Basically, if there was a problem with my system, it didn't matter which card I put in; both crashed under the same circumstances. Pretty much 100% of the time, problems were the result of the software in question or, usually on a newly built rig, of the configuration of the rig (usually unstable memory). I can't think of a single time that I had a show-stopping issue that turned out to be an "ATI driver" issue in the end. I've had tons of issues with Nvidia's RAID drivers on their motherboards (I've had to pull a specific storage driver and mix it with current MB drivers to avoid crashes under high disk usage) and will likely never buy a motherboard product from them again. But I've had little issue with their video cards. Anyhow, I've said my piece. I think both companies have made great hardware, and I would put either into my machines anytime (though I tend to pair Nvidia with Intel setups and ATI with AMD setups).
  14. Ah, cool. Looks like my lack of knowledge then. I'll check that out later and see. But looks like I missed the triangle aspect of the interface.
  15. First off, I want to say that I am loving version 3. I'm trying out the free trial and plan to buy the upgrade when I get my check for the new month. A couple of things I'd love to see: Recovery from hitting the memory limit - Currently, if some action hits an out-of-memory error, 3D Coat crashes and I have to restart. I normally save before any major action so it isn't a big deal (and I seem to have an imagination larger than my system's RAM, lol), but it would be nice to have it recover and roll back to the beginning of the action so I can change some settings and try again. VoxStamp object sub-folder support - Here, it is possible that I am just missing something, but I'll blab away anyhow. When I added Tinker's models, I couldn't find them right away. It took me a bit to figure out that they were in a sub-folder of the merge folder. They don't show up in the merge list unless I move them to the top and restart 3D Coat. It would be really nice if the folders showed in the list and clicking them moved you down to that set. It would make it possible to organize your merge stamps and not have a thousand random ones to search through. Again, if I'm missing some setting that makes this happen already, forgive my ignorance (it certainly seems possible, since the add-on pack installs them in such a way). Anyhow, I love this product. It is an excellent addition to my 3D tools, and as I learn more and more, the ease of creating highly detailed objects is apparent. Shawn
  16. I think the problem is that one cannot expect a film-quality hair solution from a model detailing/mapping/sculpting program. Yes, ZBrush does fibers, but they are a polygonal/mesh-based thing. It isn't a full, render-based fine-hair and styling tool; that can only be supplied by the final rendering program. That said, a lot can be done with the polygon/mesh-based stuff and it can look quite good. It just isn't going to give King Kong fur or hair-commercial simulations of hair. It isn't realistic to expect that, because it is a different animal that depends on the renderer.
  17. I have used ZBrush but am not versed in it, so I am not sure if this is the same thing or similar. While I am also not well versed in Mudbox, I know that it allows you to move any mesh data or layers that you are not currently working on to a disk cache. This lets you work on very high-poly meshes without holding all the data in physical memory, so you can keep working. It isn't as clean as ZBrush's 2.5D trickery, but it gets the job done and you can get very, very detailed with your mapping. This is the kind of thing I'm looking at. It seems to me that one could just cut up an object and do it piece by piece now, but I think the above would be more flexible.
  18. I do understand that it would not be easy. I do think, though, that it would be the kind of feature that would put 3DC as a viable alternative to Zbrush, as the handling of high res meshes is where it excels. Though, certainly, much can be done now and with planning, same results can be achieved I think. Thanks for looking at it. Best, Shawn
  19. Granted, I have not programmed this and don't know for sure, but it does seem that on import, 3DC is trying to load the entire object into memory. If I have an object with 4 UV sets, set the textures to 4096 x 4096, and set the millions of polygons to the 18 that would be needed for that resolution, the mesh will give each material a fourth of that mesh resolution. This results in a model that is not usable, and the entire mesh is in memory. I am talking about being able to have each individual material at the proper resolution, mesh-wise, for the texture size, and individually move each into and out of memory as needed to work on the object. Otherwise, you never even get to the point of hiding things. Yes, I could divide the object into smaller pieces and work that way. I am just hoping for a more out-of-the-box solution. Also, please don't take this as criticism. This is a great tool at a great price. And with Andrew's obvious skill, I can see it really taking on ZBrush down the road.
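To put rough numbers on the point above: a 4096 x 4096 map has 4096 * 4096 = 16,777,216 texels, so matching mesh density to texture density needs on the order of the 18 million polygons mentioned, per UV set. A quick back-of-the-envelope helper; the one-polygon-per-texel ratio is my own assumption for illustration, not anything 3D Coat documents:

```python
def polys_for_texture(width, height, uv_sets=1, polys_per_texel=1.0):
    """Rough polygon budget so mesh detail keeps up with texture detail.

    Assumes about one polygon per texel of unique UV space (an
    illustrative ratio, not an official 3D Coat figure).
    """
    return int(width * height * uv_sets * polys_per_texel)

# One 4096x4096 map wants ~16.8M polys; four such UV sets ~67M,
# which is why paging materials in and out of memory matters.
print(polys_for_texture(4096, 4096))             # 16777216
print(polys_for_texture(4096, 4096, uv_sets=4))  # 67108864
```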
  20. This is an awesome step in the right direction. A few suggestions. 1. Create a cache file on disk for the mesh data that is held in memory. Allow the user to set where it is and its max size. 2. By default, hold the high (mid) poly data here and show only the carcass in memory with a texture preview. 3. Have some sort of toggle on a list of the sections of a model that, when set, moves that portion into memory so it can be worked in full res without the rest being in memory. 4. Allow further moving of data to disk via masking or "making invisible" selections of polygons. 5. Have a way to assign the "millions of polygons" automatically based on the texture size, so there is enough to handle the texture. I'm kind of ambitious and want to do a highly detailed, high-poly model. If the majority of the polygon data can be moved out of memory so I can get very detailed on one section at a time without worrying about hitting my 2GB RAM limit, it would be great. Thanks again for the amazing work and unbelievable development pace! Shawn
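The paging scheme suggested in points 1-4 can be sketched as a tiny disk-backed cache: every section lives on disk, and only the section being worked on is resident in RAM. A minimal illustration in Python; the class, file layout, and pickle format are all invented for the sketch, not how 3D Coat stores anything:

```python
import os
import pickle
import tempfile

class MeshSectionCache:
    """Keep one mesh section in RAM; spill the rest to a disk cache."""

    def __init__(self, cache_dir=None):
        # User-settable cache location, as suggestion 1 asks for.
        self.cache_dir = cache_dir or tempfile.mkdtemp(prefix="meshcache_")
        self.active_name = None
        self.active_data = None

    def _path(self, name):
        return os.path.join(self.cache_dir, name + ".pkl")

    def store(self, name, data):
        # Write a section straight to disk; it costs no RAM until activated.
        with open(self._path(name), "wb") as f:
            pickle.dump(data, f)

    def activate(self, name):
        # Page out whatever is resident, then page in the requested section,
        # so at most one section's full-res data is ever in memory.
        if self.active_name is not None:
            self.store(self.active_name, self.active_data)
        with open(self._path(name), "rb") as f:
            self.active_data = pickle.load(f)
        self.active_name = name
        return self.active_data

cache = MeshSectionCache()
cache.store("head", {"verts": [(0.0, 0.0, 0.0)] * 3})
cache.store("torso", {"verts": [(1.0, 1.0, 1.0)] * 5})
print(len(cache.activate("head")["verts"]))   # 3
print(len(cache.activate("torso")["verts"]))  # 5
```

A real implementation would enforce the max-size setting and keep a low-poly "carcass" preview resident, but the store/activate round trip is the core of the idea.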
  21. I'd like to think that we're not "old timers" quite yet.
  22. I'm on it. I'm trying to discuss the current scene regarding MH, as the beta is a beta and under NDA. That said, I am hopeful that the current scene will not be the same when that's done. One thing that I know is that the Collada export from MH has a number of XML errors that cause it to not even be usable by the Whoopla converter. It loads in Maya 8 but not Maya 8.5. So those issues need resolving, irrespective of the LW 9.5 beta. Again, I really don't want to come across as overly negative on this, as I do like MH and recognize that it itself is not a finished product. I mostly think that they should rework the base mesh to use only quads and make sure the OBJ and Collada exports are fully standards-compliant (talking mostly about the Collada export here). It would then be a completely awesome tool that works with all packages out of the box, not just Blender.
  23. I use Merge Trigons. Two problems there: the triangles in the MakeHuman mesh are singular and scattered, so Merge Trigons doesn't handle most of them, and it doesn't handle the messed-up normals. Granted, the messed-up normals are probably the result of the OBJ importer in LW. Anyhow, don't get me wrong, MH is still a great program. I just wish it was a bit easier to import into LW, as the reason to use such a program is so you don't have to individually model tons of humans, and that benefit is somewhat lessened by the necessary cleanup process.