3DCoat Forums

Showing results for tags 'Nvidia'.

Found 6 results

  1. Every time I start my licensed copy and make a selection from the menu, my screen fills with stray lines and windows. My configuration: i7-6700, 32 GB RAM, GTX 1080 (latest driver installed), Windows 10 64-bit. Please advise so I can get back to work.
  2. http://www.ign.com/articles/2017/03/01/nvidia-officially-reveals-its-geforce-gtx-1080-ti By Seth G. Macy Rumors of NVIDIA's newest flagship graphics card turned out to be true, with NVIDIA unveiling its GTX 1080 Ti. The $699 graphics card promises 35% faster performance than the GTX 1080, and "is even faster in games" than the Titan X, according to NVIDIA. Spec-wise, the card is pretty beastly, with 11 GB GDDR5X memory, 3584 CUDA cores and a 1582 MHz boost clock. For comparison, the $1200 Titan X has 12 GB GDDR5X, the same 3584 CUDA cores, and a boost clock of 1531 MHz. Both cards gobble up 250 watts of power. As far as real-world testing is concerned, the promotional page for the GPU compares its performance to the company's own GTX 980, showing performance upgrades better than double, and even triple in some cases. However, it's a little misleading. While the GTX 980 is a great GPU, it's nowhere near the level of the 1080 Ti, and was already beaten by NVIDIA's own 10 series cards, so it's more of a comparison for dramatic effect.
  3. Hi all, I bought 3DCoat some months ago, but only now am I starting to use it for real. I need some help improving 3DC's performance, because I checked and the GPU is not being used at all (verified with the GPU-Z monitoring tool). My system is the following: ASUS N7 17", 3DCoat 4.5.19 (DX 64-bit CUDA build, not the Intel graphics), Intel i7-3630QM @ 2.40 GHz, 16 GB RAM, Windows 10 64-bit, NVIDIA GeForce GT 740M. The software gets stuck trying to make a Vox Layer from a section of a model of about 750,000 triangles. Is that normal? Thanks
  4. This morning I ran some tests in the paint room and noticed something about my setup. I have an NVIDIA Quadro K2000 on Ubuntu 14.04 using the proprietary drivers, but it seems the card is not available to 3DCoat. I then rendered a PBR texture and the result was terrible. I know it should not look like this, because the same texture rendered fine for a while in Windows 10 with a trial copy of 3D-Coat. Can anybody help me? Thanks.
  5. Hello! I'm sorry, this is a question that probably comes up a lot here, but Google and the forum search don't cover everything, or what they turn up is very outdated. I'm really not very techy at all, though I'm trying, and my PC doctor is no help either: he keeps pointing me at AMD cards (specs at the bottom). I would love to make fully worked-out, game-ready characters and dive deeper into 3D, preferably in 3DC, because I just like the workflow, and I was thrilled to find an affordable all-around sculpting program. I love 3DC, but I'm having technical difficulties that surfaced even faster in 3DC because of how it works, voxels and all. More often than not, 3DC shuts down with messages like 'not enough virtual memory' or simply 'not enough memory', or it crashes and has me send the error reports to the devs; I gather it's all related. Not to mention the holes it makes in the mesh. As for virtual memory, I have been doing research: my PC has full access to 16364 MB, and I didn't have to change anything in preferences, so that should be the full 16 GB of RAM being used, or at least I suspect so. Do I perhaps have to change something in 3DC itself? My graphics card is a whole different story: '1024MB ATI AMD Radeon HD 5800 Series (MSI)'. I suspect this is the culprit, and I'd very much like a CUDA card to give 3DC more power. After browsing here I found some cards, but I'd like more experienced people to take a look at what the best solution is; I'd rather not find out I've made a bad buy. I've found the EVGA line recommended here, ranging from the budget model to the slightly more expensive ones. However, in the store I've found the ASUS line, and the prices are nearly identical, maybe even a bit cheaper. Is there a significant performance difference between these two lines? I'm pretty sure I've also seen a 4 GB ASUS GeForce that didn't reach 300 euro, or is the EVGA still the better one for 3D work?
And, regardless of series, would the cheaper ones have enough horsepower to avoid crashes and minimize lag? Or do I have no choice but to go for the more expensive options if I want to work headache-free? Of course, the specs:
  6. Hi, I'm going to build a new PC for CAD, and I was wondering whether any owners of the Nvidia Quadro K2200 could tell me if it's a good card for 3DC? Thanks in advance, Seb
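For a problem like post 3's ("the GPU is not working at all"), a useful first check outside 3DCoat is whether the NVIDIA driver itself reports the card; `nvidia-smi` is NVIDIA's standard command-line tool for that. A minimal sketch (the function name is mine, not part of any 3DCoat tooling; note that on Optimus laptops like the GT 740M system, the discrete GPU may also need to be assigned to 3DCoat in the NVIDIA Control Panel):

```python
import shutil
import subprocess

def cuda_gpu_visible() -> bool:
    """True if the NVIDIA driver reports at least one GPU via `nvidia-smi -L`."""
    exe = shutil.which("nvidia-smi")
    if exe is None:
        # Driver tools not installed (or not on PATH): the CUDA build of
        # 3DCoat will not see the card either.
        return False
    try:
        out = subprocess.run([exe, "-L"], capture_output=True,
                             text=True, timeout=10)
    except OSError:
        return False
    # `nvidia-smi -L` prints one "GPU N: <name>" line per visible device.
    return out.returncode == 0 and "GPU" in out.stdout

print(cuda_gpu_visible())
```

If this returns False while a tool like GPU-Z still sees the hardware, the issue is almost certainly the driver installation rather than 3DCoat.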
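On the "not enough virtual memory" errors in post 5: those usually mean the OS could not grant the process more address space (physical RAM plus page file) for the voxel resolution in use, not that 3DCoat's preferences are misconfigured. A rough, hedged sketch for confirming how much physical RAM the OS actually reports (the `MEMORYSTATUSEX` layout comes from the Win32 `GlobalMemoryStatusEx` API; the POSIX branch is a fallback for other systems):

```python
import ctypes
import os
import sys

def total_ram_bytes() -> int:
    """Best-effort total physical RAM in bytes."""
    if sys.platform == "win32":
        # Field layout per the Win32 MEMORYSTATUSEX structure.
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [
                ("dwLength", ctypes.c_uint32),
                ("dwMemoryLoad", ctypes.c_uint32),
                ("ullTotalPhys", ctypes.c_uint64),
                ("ullAvailPhys", ctypes.c_uint64),
                ("ullTotalPageFile", ctypes.c_uint64),
                ("ullAvailPageFile", ctypes.c_uint64),
                ("ullTotalVirtual", ctypes.c_uint64),
                ("ullAvailVirtual", ctypes.c_uint64),
                ("ullAvailExtendedVirtual", ctypes.c_uint64),
            ]
        stat = MEMORYSTATUSEX()
        stat.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
        return stat.ullTotalPhys
    # POSIX fallback: page size times number of physical pages.
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")

print(f"{total_ram_bytes() / 2**30:.1f} GiB of physical RAM")
```

If the reported total matches the installed 16 GB, the next place to look is the Windows page-file size, since that is what "virtual memory" errors exhaust first.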