3DCoat Forums

Scott (Member, 60 posts)

Posts posted by Scott

  1. Interesting!

    I wonder why it persists...

    I would imagine that, as Andrew suggests, version 2.2 is needed, not the latest 2.3-supported drivers and toolkit from Nvidia.

    Ideally, Andrew could easily include the CUDART version himself with the distribution (Nvidia changed things so end users should not need the toolkit anymore); this would make issues like this easier to avoid.
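
    For anyone checking a mismatch by hand, here is a minimal sketch (assuming the CUDA runtime API, which provides cudaDriverGetVersion and cudaRuntimeGetVersion) that reports which versions an application actually sees:

        // version_check.cu -- report the CUDA driver and runtime versions
        // visible to the application, to spot driver/toolkit mismatches.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int driverVersion = 0, runtimeVersion = 0;
            cudaDriverGetVersion(&driverVersion);   // highest CUDA the display driver supports
            cudaRuntimeGetVersion(&runtimeVersion); // CUDART the app was built against
            // Versions are encoded as major*1000 + minor*10, e.g. 2020 = CUDA 2.2.
            printf("Driver supports CUDA %d.%d\n",
                   driverVersion / 1000, (driverVersion % 100) / 10);
            printf("App built against CUDA %d.%d\n",
                   runtimeVersion / 1000, (runtimeVersion % 100) / 10);
            if (driverVersion < runtimeVersion)
                printf("Driver is older than the runtime: update the display driver.\n");
            return 0;
        }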

  2. Too bad he can't have at least three other people: one showing retopology/UV tools, one showing the painting tools, and one doing voxel sculpts. Add someone to walk around and answer questions while the others demonstrate and Andrew talks tech, and you've got the beginnings of a great booth IMHO.

    Nvidia should have him hanging around or talking about using CUDA IMHO. He's one of the few 3D app developers taking it full on AFAIK.

    Yes, I agree the Nvidia CUDA angle is something both Nvidia and Andrew should play up, as is cross-platform D3D and OGL.

    It would be really nice to see an early Linux version running, just to let people know that it is a real option for 3DC. It's something Pixologic and ZBrush won't be able to show. And having a real sculpting application with real 2D layer paint capability is sure to impress the larger studios, which are often Linux based.

    But I also agree that if he can't have some fast, impressive sculptors on the floor, he should at least get some impressive video demos running: retopology, voxels, and particularly things like Curves and Snake, which are unique to voxels and 3DC. Also try to print up some CDs with manuals, videos and 3DCoat trials to hand out, to ensure a popular Siggraph meet.

    And it might be time for 3DC loyalists to try and turn out some really nice sculpt/renders for Andrew to slideshow. ;)

  3. I have already installed CUDA 2.2 and the toolkit (the 64-bit version); do I have to install the 32-bit version as well?

    The 64-bit CUDA version works for me; the 32-bit one doesn't.

    Yes, if you wish to use both, you will need both toolkits installed...

    Hopefully Andrew starts using CUDA 2.2 and includes CUDART.dll in future versions of 3DC, mainly so I don't have to answer this question 100 more times... :)

  4. Also, Mudbox v1.07 works fine; the new 2009 version doesn't.

    You are incorrect. I have used Mudbox 2009 daily since it was released, on an 8800GTS, and have never had any issues apart from some AO shader problems, which were fixed with SP updates. It also works well on the 9650GT in my notebook, also a non-Quadro. As mentioned, Autodesk generally only officially supports Quadros for a range of their 3D and CAD software; if you want technical support with an issue, you generally only get it if you are running certified drivers, as mentioned previously.

    Back on topic, but with a similar issue... Andrew, a small suggestion: now that CUDA 2.2 has been released (today), perhaps you could update to CUDA 2.2 and include the runtime .dlls with the 3DC install. This would be less confusing and would let users need only 3DC and an Nvidia driver to use CUDA; currently the toolkit is needed as well. CUDA 2.2 may also offer slight performance increases over 2.1. Not a big deal, but it may be useful when 3DC 3.0 is officially released.

  5. Hi Scott, maybe you'd better do a search before starting a new topic. :rolleyes:

    Hi. Well, it wasn't in this forum, and I don't plan on searching every forum topic before I post on a subject. It seems Andrew didn't see or answer the previous thread, so I guess it cannot hurt to have another? I'm not sure if Andrew has intimate knowledge of his OGL code, so perhaps this thread will be lucky enough to get a reply from someone with some knowledge of the subject. :blink:

    Personally, 3DC has slowed down a bit over the 3.0 cycle, and having 5x faster OGL would be a priority. Speed is always going to be an issue for 3DC; Mudbox and ZBrush handle laptops etc. much better, so the bindless OGL extensions are quite interesting to me.

  6. I'm wondering if Andrew has had a chance to evaluate Nvidia's "bindless graphics" OGL extensions.

    Nvidia are claiming that the two new extensions offer significant speed and performance increases in the 185-series drivers and above.

    http://developer.nvidia.com/object/bindless_graphics.html

    So I'm wondering if the 3DC OGL versions on PC, Mac and Linux could gain some speed by adopting this? D3D doesn't have an equivalent yet, but I'm sure it will soon enough. The extensions essentially provide GPU address pointers, which is quite a powerful change to OGL, and many developers seem to believe it's a very good idea.

    It would be nice to think that our current videocards could push 3DC 5x faster than they currently do.

    More discussion here: http://www.opengl.org/discussion_boards/ub...729&fpart=1
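
    For context, here is a minimal sketch of what the two extensions (GL_NV_shader_buffer_load and GL_NV_vertex_buffer_unified_memory) look like in use. It assumes a GL context where the extension entry points are already loaded, and the one-triangle buffer is just illustrative; it is not taken from 3DC's code:

        // Bindless vertex pulling: instead of binding a VBO for every draw,
        // make the buffer resident once and hand the GPU its raw address.
        static const GLfloat vertexData[] = {  // one triangle, positions only
            0.f, 0.f, 0.f,
            1.f, 0.f, 0.f,
            0.f, 1.f, 0.f,
        };
        const GLsizeiptr sizeBytes = sizeof(vertexData);
        const GLsizei stride = 3 * sizeof(GLfloat);

        GLuint vbo;
        GLuint64EXT gpuAddr = 0;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeBytes, vertexData, GL_STATIC_DRAW);

        // Lock the buffer into GPU-accessible memory and query its 64-bit address.
        glMakeBufferResidentNV(GL_ARRAY_BUFFER, GL_READ_ONLY);
        glGetBufferParameterui64vNV(GL_ARRAY_BUFFER, GL_BUFFER_GPU_ADDRESS_NV, &gpuAddr);

        // Per frame: no buffer binds, just point attribute 0 at the address.
        glEnableClientState(GL_VERTEX_ATTRIB_ARRAY_UNIFIED_NV);
        glEnableVertexAttribArray(0);
        glVertexAttribFormatNV(0, 3, GL_FLOAT, GL_FALSE, stride);
        glBufferAddressRangeNV(GL_VERTEX_ATTRIB_ARRAY_ADDRESS_NV, 0, gpuAddr, sizeBytes);
        glDrawArrays(GL_TRIANGLES, 0, 3);

    The claimed speedup comes from skipping the driver's per-bind validation work, which is presumably the overhead the 5x figure refers to.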

  7. The CUDA beta driver at the link on the first page is a much older display adapter driver, not only CUDA.

    So is each version of 3DCoat going to be locked to a particular driver set? Currently, Nvidia is up to 182.50, but the driver for this alpha is 181.20, from several months ago.

    I installed all the files as instructed, but then had a complete system lockup trying to update my nvidia drivers back to 182.08.

    In other words, will there need to be a new version of 3DCoat every time Nvidia puts out new drivers in order to be able to use the new drivers?

    I really don't understand how it all works - the relationship between 3DCoat, the Nvidia display drivers, and the CUDA toolkit.

    Thanks to anyone who can answer. :)

    CUDA is NOT going anywhere; Nvidia are well on their way to releasing 2.2, with many other releases beyond that.

    They have invested far too much time and money in this technology to leave or abandon it. So forget what a few LW users don't know... :)

    Andrew should now be able to change the way CUDA is distributed. I believe the latest license allows him to distribute CUDART.DLL with 3DC itself (as of recently), so he should be able to compile against, say, the new 2.2 (beta) and ship the CUDART.DLL binary with 3DC, rather than requiring us to download the matching toolkit. The EULA now grants the right to distribute Nvidia's libraries: http://developer.download.nvidia.com/compu...EULA_081215.pdf

    I'm sure this would be good for the final release, as it would cause less confusion for new and potential customers. Andrew would need to build 3DC against 2.2, and users would then already have CUDART rather than having to download the toolkit in the future. All modern Nvidia drivers support CUDA, but the toolkit is currently still needed; it wouldn't be if Andrew updates 3DC.

    So the toolkit should only be necessary for developers in the future; customers will just need an application built with CUDA (3DC) and the Nvidia drivers for it all to work well.

  8. Pointers are pointers. It is the number of pointers in memory. If this value grows a lot, it means there is a memory leak. Agreed, it is not very useful to the user...

    I would love to see a "Subdivision Level" display given in the same place...

    So if I have increased the resolution twice, this would read "SubD Level = 2". It just lets me know where I am, or how many times I have increased the res.
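
    For the sake of the numbers, a hedged aside (assuming surface subdivision, where each level splits every quad into four; voxel resolution steps scale differently): the count such a display summarizes is simply

        quads(level) = base_quads * 4^level

    so "SubD Level = 2" on a 10,000-quad base mesh means 160,000 quads.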

  9. I was just curious if anyone has had luck with this. I textured this in 3D Coat and was wondering if anyone has gone directly from 3DCoat to Maxwell, or if you have to translate it through an app like Blender or Cinema 4D.

    3DC exports various file formats: .LWO, .OBJ, etc...

    Maxwell comes with plugins for 3D host applications, or can be used as a standalone package. So you could take an .OBJ from 3DC, use it in Maxwell Studio to do lighting, camera etc., and then get a full Maxwell render. Blender doesn't connect to Maxwell; SketchUp is likely the cheapest software with a plugin for it.

    I would download and try the demo, as it will give you an idea of how the pipeline workflow would work for you. I use it via LW and Houdini, but the standalone Studio is good enough to get a decent render. (It being an unbiased renderer, beware the LONG render times.)

  10. I noted several essential issues that should be fixed: seams across imported normal maps, better cavity detection, VS->DP integration, tearing in VS, shader issues, and several more. They require more substantial experimenting, so they will be done, but for now I should start on the interface and then return to overall refinement of all the tools.

    My plan for the interface:

    - add the possibility of docks

    - make all existing windows dockable

    - rearrange the main menu

    - rearrange all tools as described by Shadow

    I will make frequent updates, but the first stage requires much work anyway, so new updates will start in a week or so.

    Sounds like a good plan. After the interface implementation is done, bug fixes and optimizations could be considered the beta stage, and that seems quite achievable for your expected May release.

    I would have thought of using the Qt library, given that it's free or commercial (LGPL), would allow virtually no code changes between PC/Mac/Linux builds and OGL/DX, and also helps with language translations. http://www.qtsoftware.com/products/ Standard features like docking are built in throughout the Qt SDK, and I see other companies like Newtek and Daz heading that way too.
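
    To illustrate how little code docking takes in Qt, here is a minimal sketch (hypothetical window and panel names, not 3DC's actual UI code):

        // Minimal Qt docking sketch: QMainWindow provides dockable,
        // floatable, user-rearrangeable panels out of the box.
        #include <QApplication>
        #include <QMainWindow>
        #include <QDockWidget>
        #include <QListWidget>
        #include <QWidget>

        int main(int argc, char **argv) {
            QApplication app(argc, argv);
            QMainWindow window;
            window.setCentralWidget(new QWidget); // the 3D viewport would live here

            QDockWidget *tools = new QDockWidget("Tools", &window);
            tools->setWidget(new QListWidget);    // stand-in for a tool palette
            window.addDockWidget(Qt::LeftDockWidgetArea, tools);

            window.show();
            return app.exec();
        }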

    Either way, I look forward to next week.

  11. For best results, try to avoid topology that stresses polygons in awkward ways, such as in your example. You are free to do as you wish; I just do not see it as a shading bug in 3DCoat. Just trying to help you out.

    Avoiding topology isn't always an option. If LW can handle the object without smoothing errors, most other applications should be able to handle the same objects with the same methods. So I agree with Phil that 3DC should be able to handle the object at least as well as LW does.

    Edit: I see Andrew has fixed the problem. Kudos...

  12. Hi, I have some questions for people who have tested 3D Coat with CUDA support.

    1: Is it really faster than the normal version?

    2: I have a GeForce 8600 GT that sits at about 60°C when I work with it. Can I use the CUDA version without any danger to my GPU?

    Thanks a lot.

    The variables are many, so it's hard to give an exact speedup. Your videocard has 32 SPs, which won't give you a large speedup, but then any speedup is welcome, really. Keep an eye on the heat, but I doubt CUDA would change the temperature much. I would still give it a go; it tends to help when large brush sizes are an issue.

    Andrew is meant to receive official speedup results from Nvidia sometime soon, to try to quantify the numbers.
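
    If you want to check your own card, here is a small sketch using the CUDA runtime's device query. The 8-shader-processors-per-multiprocessor figure is specific to compute 1.x parts like the 8600 GT and is hardcoded here as an assumption:

        // device_query.cu -- report multiprocessor and shader processor counts.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaGetDeviceCount(&count);
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                // Compute 1.x GPUs carry 8 scalar processors per multiprocessor,
                // so an 8600 GT reports 4 MPs = 32 SPs.
                printf("%s: %d MPs x 8 = %d SPs (compute %d.%d)\n",
                       prop.name, prop.multiProcessorCount,
                       prop.multiProcessorCount * 8, prop.major, prop.minor);
            }
            return 0;
        }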

  13. Hi,

    ...all I would need is a Photoshop replacement. (don't say gimp)

    +1 for 3d coat for linux (64bit? - SuSE).

    +1 for an OpenSUSE x64 distro.

    As for Photoshop, CS2 works via Wine; not sure about the others. But thankfully Andrew's painting tools are likely enough for most things anyway.

  14. This thread should be rolled into the graphics card thread IMHO.

    Sure changes the price for 3DCoat. For me... it changes the price from $200.00 to $300.00. I bought a new NVIDIA card about 1 year ago, an NVIDIA GeForce 8500 GT. And now I find it is no longer any good for this app???

    Why doesn't it work? It seems others use Nvidia 8xxx-series cards without needing to complain.

    Why is it suddenly no good? Neither CUDA nor Nvidia is a requirement for 3DCoat; CUDA is an optional enhancement you can choose to take advantage of if you have the means to. Also try to remember that your videocard is unlikely to be used only by 3DC; it has other uses and other programs that utilize it.

    Videocards and computers will never be fast enough. If you are using 3DC professionally, then the cost is built into the income you make by using the tool; if you are a hobbyist, then perhaps use the hardware you have until you are ready to turn professional and can afford the Nvidia upgrade.

    Well... then I must consider it part of the price of getting 3DCoat. You can break it down mentally however you see fit, but this is the way I see it.

    I think due to the buggy alpha, missing progress bars (see VOX import), and other considerations... you might want to think about extending the alpha another few months to get this thing right. Otherwise... I might have to toss down another couple hundred and get ZBrush instead.

    Well, as it's an alpha you haven't paid any money, which makes your comments about cost laughable. And then you threaten the author with leaving for another application. Classy! Feel free to go and use ZBrush, and remember the cost difference right up front... it might help make the Nvidia choices seem rather more logical. Please feel free to check out ZBrush! :)

    I love 3DCoat... and I do see it moving along well. But having to buy a new graphics card is a huge consideration for me... even if it's just another $100.00 card.

    Nobody is forcing you to buy a new videocard. If the applications you wish to run don't run well on your hardware, I'd say it's your computer that is the problem, not the software you choose to run on it. 3D sculpting software is computationally expensive, so the cat-and-mouse game of performance will always be a hurdle to overcome. It seems you made an unwise choice investing in a substandard card for the thing you want to do.

  15. I don't care for this idea. I like to see the tools before I click on them, not think to click on one and then see it. Plus I don't care for the extra keyboard press every time I want to change tools, which can happen quite often while sculpting or doing retopo. RMB wouldn't work because some tools need RMB.

    I would at least like a full-screen option. Being able to hit Tab, like in Photoshop or Mudbox, and get a completely clean screen is often all I need when using a Wacom; hitting Tab again is all it takes for the menus and buttons to reappear.

    I do like Shadow's look; it is at least clean and pro looking. I'm not 100% on board with all the ideas, but it's cleaner... Maybe Andrew should start there and let users weigh in on individual decisions from then on. This discussion shows why customization is important, because we will never please everybody. It's all about getting the basics right and letting users skin whatever they want. Silo got this bit right.

    For myself the interface is about workflow, i.e. how fluidly I can keep sculpting, painting and working without having to search through menus, click 50 times to achieve a single function, or stop working to think about where the next tool or function may be. Pretty is nice and all, but workflow is far more important! (Do you hear me, Microsoft?) CS3, for example, was a huge step backwards: the menus used to list all their items, and now I have a little down arrow to press just to see what else is on the menu... It's thinking like this that should be avoided at all costs... :P

  16. Oh, I was not able to resist temptation again... And this is the result...

    cloth.png

    It was so easy and natural to implement cloth processing in the volumetric approach... I was not able to avoid it.

    Cool! Now for your next trick... how about realtime paint physics? Drop a paint bucket on the object and let CFD drip paint over it under gravity, while still allowing rotation etc. of the object being splashed. I want drips of ink running down my 3D objects... :)

  17. So, have you installed the right CUDA drivers/toolkit?

    I will upload a 32-bit CUDA version soon.

    Doh! No, I still had an older CUDA toolkit installed. I uninstalled the previous version, installed the latest 2.1 beta toolkit release for Vista x64, and all is working now.

    So thanks for the efforts... it looks like there are a few issues, as mentioned by others (precision?), but the performance with large-radius brushes is quite excellent, although it seems to smooth slightly oddly. Cannot wait to see more; thanks for posting the early tests! :)

    Does 3DC need full double-precision FP? I know CUDA supports it, but only on the 200-series Nvidias, which I guess means you cannot use it currently? (See the check sketched at the end of this post.)

    Keep up the good work....
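
    On the double-precision question, support can be checked at runtime; a short sketch (compute capability 1.3, i.e. the GTX 200 series, being Nvidia's documented threshold for hardware doubles):

        // double_check.cu -- doubles require compute capability >= 1.3 (GT200).
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, 0);
            bool hasDoubles = prop.major > 1 || (prop.major == 1 && prop.minor >= 3);
            printf("Double precision: %s\n", hasDoubles ? "supported" : "not supported");
            return 0;
        }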

  18. I unfortunately cannot seem to run either the 64-bit OGL or the 64-bit DX version...

    The 32-bit version works, but of course I wanted to test CUDA... :(

    It doesn't seem to be a CUDA-related error, and all the other CUDA examples I have work fine.

    "3D-CoatGL64.exe - Application Error

    The application failed to initialize properly (0xc000007b). Click OK to terminate the application."

    I will also try it out on my laptop ASAP, but I have to load the CUDA toolkit for the 9650GT on the laptop first.

    Any ideas, Andrew...? Vista x64, 512MB 8800GTS... (Fault Module Name = ntdll.dll)

  19. Started to play with CUDA. I can't postpone this topic; it is important for speed and for marketing. At the least I must know what I can expect from this technology, whether it is as cute as I expect or not.

    I hope to upload the first results soon.

    Lol at the "cute" comment; you are an animal... :) Cannot wait to see what kind of results you get.

    Does CUDA work on ATI cards?

    Not at the moment, but the next version of CUDA should be able to emulate on the CPU. That means Andrew could build a single CUDA .exe that runs in CUDA mode on Nvidia hardware but still works if an ATI or Intel card is found. At the moment the .exe has to be separate for ATI and Nvidia users.

    ATI/Apple/AMD and the Khronos Group will likely adopt similar ideas for their own open standard, OpenCL.

    PS: Andrew, Linux has been available in 64-bit for quite some time, and it also has support for OGL and CUDA.
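
    Whatever happens with emulation, the usual pattern for shipping a single binary today is runtime detection with a CPU fallback; a sketch with hypothetical function names, not Andrew's actual code:

        #include <cuda_runtime.h>

        // Hypothetical stand-ins for 3DC's real brush code paths.
        void sculptOnGpu() { /* CUDA path */ }
        void sculptOnCpu() { /* plain C++ path */ }

        bool cudaAvailable() {
            int count = 0;
            // Fails cleanly (non-success status) on machines without an Nvidia driver.
            if (cudaGetDeviceCount(&count) != cudaSuccess) return false;
            return count > 0;
        }

        void sculpt() {
            if (cudaAvailable()) sculptOnGpu();
            else                 sculptOnCpu();  // ATI/Intel users take this path
        }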

  20. Cool, Andrew. It took you about five days to make the 64-bit version; how many days will CUDA support take?

    I will take a guess and say quite a while... :)

    I have been involved in CUDA development for about 12 months, and I imagine Andrew's code will require quite a rewrite.

    CUDA is still constrained by many technical limitations; getting a good speedup via CUDA, even when rewriting code, is not always easy. Sometimes the benefit of a speedup is lost to the stupidity of the limitations.

    I imagine Andrew will want to test it thoroughly and work around these issues, which could be a long process.

    But I hope Andrew's genius will handle it with the same professionalism and speed that he shows elsewhere. :)

    But in comparison to the 64-bit port, there are a lot more code changes to be made for the CUDA-enhanced versions.
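
    As a toy example of the kind of limitation involved: the kernel below is trivial, but if the data crosses the PCIe bus both ways on every stroke, the copies can eat the entire speedup. A sketch only, not how 3DC actually partitions its work:

        #include <cuda_runtime.h>

        // Trivial kernel: add a displacement to every vertex height.
        __global__ void addOffset(float *h, float d, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) h[i] += d;
        }

        void applyOffset(float *hostData, float d, int n) {
            float *dev = 0;
            size_t bytes = n * sizeof(float);
            cudaMalloc(&dev, bytes);
            // These two copies often dominate; the arithmetic itself is nearly free.
            cudaMemcpy(dev, hostData, bytes, cudaMemcpyHostToDevice);
            addOffset<<<(n + 255) / 256, 256>>>(dev, d, n);
            cudaMemcpy(hostData, dev, bytes, cudaMemcpyDeviceToHost);
            cudaFree(dev);
        }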

  21. "transpose" is great! You should probably rename it though to avoid complications with Pixologix. How about "Sculptform"?

    Really looking forward to 64-bit version!

    Not sure that's a problem; many 3D programs use "transpose" because it's a mathematical term. It would seem a little silly to avoid standardized terminology, since standard terms only simplify things. Anyway, no complaints either way, but just to let you know it's a legitimate 3D term from linear algebra.

    http://en.wikipedia.org/wiki/Transpose
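
    For reference, the definition the name comes from: the transpose of a matrix A is the matrix A^T whose entries satisfy (A^T)_ij = A_ji, i.e. the rows and columns swapped.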
