3DCoat Forums

Instability of 3D Coat with large files



  • Advanced Member

Hi

I'm now getting multiple crashes per day with a voxel sculpt of mine, plus other odd errors besides the crashes. Increasing virtual memory makes the problem go away, but I still sometimes get the warning "Your work has become unstable. Please save your work now", followed by a crash of 3D Coat.

The file is pretty large in polygon count on screen (around 65-70 million), in voxel count, and in number of layers (over 50). To reduce the voxel count on screen, I shrank 7 of the largest and least-used layers with the "Simplify" option, which writes the original to disk. Around 55-58 million polygons are still on screen.

 

 

My System:

 

i7 970 (6 cores, 3.2 GHz)

24 GB RAM

240 GB SSD system disk

500 GB data disk

 

Windows 7 64-bit

 

EDIT: Oops, forgot to post my 3D Coat version. I'll get the exact version when I get home over lunch; for now I can say it's the most current 4.1 build, CUDA OpenGL.

 

1) My virtual memory has already been raised to 250 GB. I could give it maybe another 50 GB, but then I am out of disk space to allocate (my data disk is only 500 GB; it's a VelociRaptor, thus small). Should I have more virtual memory space? Is it advisable to add another disk to the system just for virtual memory (giving me 500-1000 GB of virtual memory)? Is it important to have a fast disk for virtual memory, or could I get a cheap standard 1000 GB disk for this task? (See the sketch after these questions for one way to measure this.)

 

2) Then I have the problem of "disappearing content" on some layers. The weird thing is: if I save the now empty-looking layer to a 3b file and open that, I see the original content again! So the content is still there, just invisible. Any suggestions what is causing this? What can I do besides reimporting these layers and realigning them to their original places?

 

3) Any general suggestions on how to make 3D Coat more stable with such large files? Probably the best thing would be to split parts of the model into different files, but I really like to see everything together to be able to tell whether it looks good or not. Why is 3D Coat eating up virtual memory like mad? Is it the size of my layers? Should I try to split large layers into multiple smaller ones? Should I reduce the number of layers by combining several into one?
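
Regarding 1): what actually matters is whether the system-wide commit charge (physical RAM plus the page file space promised to programs) ever hits its limit while sculpting. Here is a minimal Python 3 sketch for watching it, using the Win32 GlobalMemoryStatusEx call via ctypes (Windows only; the 5-second polling interval is an arbitrary choice):

import ctypes
import time

# MEMORYSTATUSEX layout as documented for the Win32 GlobalMemoryStatusEx call.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_uint32),
        ("dwMemoryLoad", ctypes.c_uint32),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),  # commit limit: RAM + page file
        ("ullAvailPageFile", ctypes.c_uint64),  # commit still available
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

def commit_charge_gb():
    # Returns (used, limit) of the system-wide commit charge, in GB.
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    used = status.ullTotalPageFile - status.ullAvailPageFile
    return used / 2**30, status.ullTotalPageFile / 2**30

while True:
    used, limit = commit_charge_gb()
    print(f"commit charge: {used:.1f} / {limit:.1f} GB")
    time.sleep(5)  # arbitrary polling interval; leave running while sculpting

If the used number never gets near the limit before a crash, a bigger page file probably won't help, and disk speed matters little, since the page file is mostly only hit once physical RAM runs short.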

 

 

Any help and suggestions are appreciated

 

 

Gian-Reto

 

 

EDIT: Just read Andrew's comments on memory. So 3D Coat can use 4 GB of virtual memory max? That does not match my experience: I had a file I couldn't open anymore with 120 GB of virtual memory, and increasing it to 150 GB solved the issue. Of course I have no way of telling what the virtual memory of my machine is used for, but as it's normally only 3D Coat running, I don't see what else could eat up so much.

 

Is this remark of Andrew's from 2008 still correct?

 

On the other hand, I see that 3D Coat is not really using my available physical memory: only about 6-8 GB out of 24 GB on the large file I have trouble with. Why is that? Is there a way to tell 3D Coat to go ahead and use 20 GB, for example? Would that help at all?

 

Could anyone give a breakdown of what 3D Coat can use, what it does use in general, and why? Also, is there a good tool to debug virtual memory usage on Windows (one that gives a real-time breakdown of what the virtual memory is used for)?
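
One candidate tool, sketched below, is the third-party psutil package (pip install psutil): it can log the working set (physical RAM actually used) versus committed memory for a single process in real time. The process-name fragment "coat" is a guess; adjust it to the real 3D Coat executable name.

import time
import psutil  # third-party: pip install psutil

def find_process(name_fragment):
    # Return the first running process whose name contains name_fragment.
    for proc in psutil.process_iter(["name"]):
        if name_fragment.lower() in (proc.info["name"] or "").lower():
            return proc
    return None

proc = find_process("coat")  # guess; adjust to the real 3D Coat exe name
if proc is None:
    raise SystemExit("3D Coat process not found")

while True:  # stop with Ctrl+C; raises NoSuchProcess if 3D Coat exits
    info = proc.memory_info()
    # On Windows: rss = working set (physical RAM), vms = committed memory
    print(f"working set: {info.rss / 2**30:.2f} GB   commit: {info.vms / 2**30:.2f} GB")
    time.sleep(5)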

 

 

 

EDIT 2: I just saw that the performance metrics of the Task Manager should show me at least totals. I'll check it over lunch and post the numbers I see with 3D Coat empty, with a small voxel sculpt, and with the problematic file.

Edited by Gian-Reto

  • Advanced Member

Will you please add your graphics board?

 

Sorry, of course! Forgot another thing.

 

Nvidia GTX 580, factory overclocked.

 

 

If you need more exact specs I will provide them.

 

 

Now that I think about it:

 

Could the memory problems come from too little VRAM being available? Nvidia cards have suffered from quite anemic memory compared to AMD for some time now, and in the Fermi generation the VRAM was ridiculously low... 1.5 GB on the GTX 580, AFAIK.

 

Apart from possibly losing some CUDA speedup because of the cut FP performance, would one of the new GTX 780 6 GB VRAM cards give 3D Coat more headroom to work with?

 

 

EDIT: Got the exact version:

 

3D Coat 4.1.04A CUDA GL

 

 

Here is the crash error I get when the file crashes on startup (it doesn't happen every time). This is with the CUDA GL version:

[attached screenshot: crash error dialog]

 

Edited by Gian-Reto

Usually it is Andrew who replies to support requests here.

 

In the meantime, will you please test the same scene using the DX (non-CUDA) version?

 

And if you like, the latest experimental DX version?

http://3d-coat.com/forum/index.php?showtopic=10395&page=1

 

Yes, you can have two or more versions of the app installed at the same time; just remember not to overwrite your original version's project files, to avoid compatibility issues.

 

ty


  • Advanced Member

Okay, I first tried to start the CUDA GL version of 3D Coat with this file, provoking some crashes, and then was able to open the file in the non-CUDA DX version, both of the 4.1.04A build.

I observed the detailed memory metrics in the Task Manager (Resource Monitor).

 

CUDA GL:

 

Crashes gave the error you can see in my post above.

 

The interesting thing is: although the allocated virtual memory was around 6 GB at the time of the crash (4 GB of that being physical memory), there is a weird metric, "freeable memory" (sadly my Windows is set to German, so I cannot find the exact translation). It reached 23.5 GB just before crashing, so I GUESS this might be where the problem lies. However, the Windows help page support.microsoft.com/kb/2299554/de (in German; I cannot find the English equivalent) tells me to ignore the freeable memory metric, so the real question is:

 

Why are only 4 GB of physical memory assigned? Why only 2 GB of virtual memory?

 

Any suggestions?

 

 

non-CUDA DX:

 

Aaaaaaand it finally started up. Not only that, the problem with the invisible objects is gone. That one is a known 3D Coat bug to me, usually caused by 3D Coat not having enough memory (at least until now, more virtual memory always solved it).

 

I didn't have time to really work with the file yet to see if it is more stable.

 

When I check the memory metrics in the Resource Monitor, I see that the "freeable memory" hits about 23.2 GB after the file has loaded.

Around 5.2 GB of physical memory is used, with about 100 MB of additional virtual memory allocated.

 

 

1) Why does the non-CUDA DX version start up fine while the CUDA GL version crashes? Anything else I should try here?

2) Why am I getting all these memory-related problems when I am not even using a quarter of my physical memory, not to speak of the 200 GB of virtual memory assigned to the system? Are there any memory limits in 3D Coat that come into play here? (See the sketch below.)
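
As a hedged side note on question 2: Andrew's old "4 GB max" remark would presumably only apply to a 32-bit build (2 GB of user address space normally, 4 GB when the exe is flagged LARGE_ADDRESS_AWARE), while a 64-bit build has no such cap. Here is a small Python sketch that reads the PE header of an exe to tell which case applies; the install path at the bottom is only a placeholder:

import struct

IMAGE_FILE_MACHINE_I386 = 0x014C
IMAGE_FILE_MACHINE_AMD64 = 0x8664
IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def describe_exe(path):
    # Assumes the PE headers live in the first 4 KB, true for typical exes.
    with open(path, "rb") as f:
        data = f.read(4096)
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]  # e_lfanew
    assert data[pe_offset:pe_offset + 4] == b"PE\x00\x00", "not a PE file"
    machine, = struct.unpack_from("<H", data, pe_offset + 4)
    characteristics, = struct.unpack_from("<H", data, pe_offset + 22)
    if machine == IMAGE_FILE_MACHINE_AMD64:
        return "64-bit build: no 4 GB address-space cap"
    if machine == IMAGE_FILE_MACHINE_I386:
        laa = bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
        return f"32-bit build, capped at {4 if laa else 2} GB of address space"
    return f"unknown machine type 0x{machine:04X}"

# Placeholder path; point this at your actual 3D Coat executable.
print(describe_exe(r"C:\Program Files\3D-Coat-V4\3D-Coat.exe"))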

 

 

I will test the experimental version tomorrow morning if that helps with the investigation. I will also try the non-CUDA GL and the CUDA DX versions, to see whether it's CUDA or GL that is responsible for the problems.

 

 

Thanks for all the help till now, Carlosan!

Edited by Gian-Reto

The DX version handles memory better than the GL version.

 

Thinking about memory remapping in your PC: is it related to this?

http://support.microsoft.com/kb/978610

 

1) Click Start and type msconfig in the "Search programs and files" box.

2) In the System Configuration window, click Advanced options on the Boot tab.

3) Click to clear the Maximum memory check box, and then click OK.

4) Restart the computer.

 

[attached screenshot: msconfig Boot tab with "Maximum memory" unchecked]

 

 

- BIOS settings: is the memory remapping feature enabled?
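
For anyone who wants to verify this from a script, here is a sketch (standard Python ctypes, Windows Vista SP1 or later) that compares the RAM physically installed against the RAM the OS can actually address; a large gap would hint at exactly this maximum-memory / remapping problem:

import ctypes

kernel32 = ctypes.windll.kernel32

# RAM physically installed, reported in kilobytes (API exists since Vista SP1).
installed_kb = ctypes.c_uint64()
kernel32.GetPhysicallyInstalledSystemMemory(ctypes.byref(installed_kb))

# RAM the OS can actually address, via GlobalMemoryStatusEx.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_uint32),
        ("dwMemoryLoad", ctypes.c_uint32),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),
        ("ullAvailPageFile", ctypes.c_uint64),
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

print(f"installed: {installed_kb.value / 2**20:.1f} GB")   # KB -> GB
print(f"usable:    {status.ullTotalPhys / 2**30:.1f} GB")  # bytes -> GB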


  • Advanced Member


 

Thanks for this advice, Carlosan. I will check it out tomorrow morning.

 

 

What I can already say is that my Windows 7 installation is showing the full 24 GB of memory. So the problem in the link should not affect my installation, right?

 

 

If you say the DX version handles memory better than the GL version... what is the difference? I mean, will it use 10% less memory, or half the amount?


DX versions only work on Windows and are optimized for consumer-level graphics cards (GeForce, Radeon), while the OGL version is faster on professional Quadro cards. So unless you have a Quadro, or are a Mac/Linux person, the DX version is likely better suited to your hardware.


  • Advanced Member


Good to know... this is something that might be worth updating the installer with.

 

Currently it just tells new users something like:

 

"DX version is faster on slower cards... GL version is faster on faster cards" (or similar). If the GL version is only really suited to Quadros, that note should be added beneath this information in the installer window (at least for the Windows installer).

Edited by Gian-Reto

  • Advanced Member

So, I checked the settings:

 

- Maximum memory was already deselected in msconfig.

- My mainboard gives no option in the BIOS to activate memory remapping, but it seems all newer mainboards activate it automatically... and as the board already sees 24 GB (actually 24500 MB... wow, got 500 MB for free ;) ), it should not be a problem according to multiple internet sources.

 

3DC CUDA DX works just as fine as the non-CUDA version. So far (10 minutes working with it) it has been stable for me.

 

Sadly, for me and my video card the GL version was less laggy... when it worked. Of course, IF the DX version really proves to be more stable, I will take stability over FPS.

 

I just had an idea and installed GPU-Z to have a look at my VRAM usage. And lo and behold, 1.35 GB of my 1.5 GB of VRAM is used when the DX version is running. And when I try to start the GL version, the usage is at over 1.5 GB at the time of the crash.

Is it safe to assume that my problem is actually not with RAM, but with VRAM?
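
If you want the same numbers GPU-Z shows, but loggable over time, here is a sketch using the third-party pynvml bindings for NVIDIA's NVML (pip install nvidia-ml-py); it assumes the driver ships NVML and that the GTX 580 is device 0:

import time
import pynvml  # third-party: pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the GTX 580 is device 0

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free in bytes
        print(f"VRAM used: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()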

 

If this really is the case, maybe the error handling should give that information to the user. Now I know that my 24 GB of RAM are most probably still enough for 3D Coat, but that I should get an Nvidia card with double the VRAM for it.

 

Which is usually a stupid idea until you try some extreme AA mods, but it seems that with 3D Coat such a card has its perfect niche.

 

 

I'll keep using the DX version for now and wait for the 6 GB GTX 780 to come down in price (the GTX 880 should be coming soon :) ).

Edited by Gian-Reto
