3D Coat Forums

Showing results for tags 'Render'.



Found 40 results

  1. mitchellchelin

    Baking texture from rendered view

    Hi everyone. I'm new to this application, but I'm loving its functionality compared to other mapping and baking programs. I stumbled across a way of baking the specular, shine, lighting, and ambient occlusion maps into one image, but I haven't managed to find a way to do it again with other models. The reason I'd like these textures baked with the lighting pre-applied, exactly as seen in the preview window, is that where the models will be placed I can't have individual maps per model; I get just one texture file, that's all (thank you, Second Life...). I know there is a way to bake exactly what you see on screen to the UV layout, but I'm struggling to figure it out again. Any help would be fantastic. Great program nonetheless, guys! Keep up the amazing work!
  2. I've very recently started playing around with 3DCoat. Now that I've finished texturing my first model, I've hit an issue. The image on the left is how I see it in 3DCoat; the right is after I have rendered it in 3ds Max. It seems to have lost a significant amount of detail and a lot of the spec. I'm using the default scanline renderer in this image in Max, but I have also tried V-Ray and had exactly the same result. Can anyone give me any pointers as to what I am likely doing wrong? I've basically spent the entire day trying to figure this out and I'm feeling a bit fried (we've all been there). Any help gratefully received. http://sta.sh/0c2rmylhn6j
  3. dimitribastos

    Cold Shadow Fan Art

    From the album: FanArt

    First render for an upcoming project. Hope to show more very soon. Hope you like!
  4. dimitribastos

    civilwarkids ironman

    From the album: FanArt

    Second illustration for the Civil War Kids series: Team Iron Man. Hope you like it!
  5. From the album: Weapons and more

    Only 3DCoat was used.
  6. I have a question: is it possible to hook up some serious render plugin, or integrate one into 3DCoat? For example KeyShot: you finish the model and its textures, choose your settings, press a button, and the model with its textures is exported to KeyShot, where a standard shader with the textures gets assigned. In general, I think this program very much needs a photorealistic rendering solution. What do you think? And have you thought about this at all?
  7. Fantome

    skull render.jpg

    From the album: Fantome artwork

    Rendered in marmoset.
  8. In the render viewport when you select a light colour it gives you the exact opposite of the colour you choose. This is using the HSL1 colour picker. I have tried others and they all give the same value. However it's not quite so simple as that. If you choose pink at the NE setting of the hue wheel you get the correct colour. Equally if you choose green in the SW quadrant it seems to be correct. Other positions are not the colour you choose. I am using 4.0.04A on a Mac Pro with Mavericks 10.9.1
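One way a picker could produce the "exact opposite" colour is a per-channel RGB inversion, which maps every hue to its complement 180 degrees across the wheel. The snippet below only illustrates that relationship; it is a hypothetical cause, not a claim about what 3DCoat's code actually does:

```python
import colorsys

def invert_rgb(r, g, b):
    """Per-channel inversion (1 - c). If a colour picker applied this
    by mistake, the user would get the complementary colour: the hue
    lands 180 degrees across the wheel. Hypothetical cause only."""
    return (1.0 - r, 1.0 - g, 1.0 - b)

red = (1.0, 0.0, 0.0)                    # pure red, hue 0.0
cyan = invert_rgb(*red)                  # (0.0, 1.0, 1.0), i.e. cyan
h_red = colorsys.rgb_to_hls(*red)[0]     # hue 0.0
h_cyan = colorsys.rgb_to_hls(*cyan)[0]   # hue 0.5, half-way round the wheel
```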
  9. Source: http://io9.com/disneys-new-rendering-technique-could-usher-in-a-new-e-1467435361
    Link to the paper: http://www.disneyresearch.com/wp-content/uploads/Project_jis_sigasia2013-paper.pdf

    An animation studio can spend days rendering a scene that features water, smoke and other substances that affect light (and its simulation) in complex ways. Now, a team led by Disney Research Zürich has developed a computational algorithm that can reduce rendering times for such scenes by a factor of up to 1,000.

    Animators call substances like smoke, fog, water, mist, and dust "participating media," due to their tendency to deflect or scatter light as it travels the distance separating its source from the camera. (The shafts of light visible in the upper right-hand corner of the Monsters University screenshot featured above are a good example of how we perceive illumination in the presence of participating media.)

    Existing rendering algorithms account for participating media by randomly sampling potential paths that light might take through a scene, and then averaging the results. But the number of potential paths is usually enormous, and many of them have little or no effect on the look and feel of the final animated sequence. Imagine, for example, a path that intersects with an object in the scene that blocks the light entirely, or a light source that is too distant from the camera, or separated by too much dense fog, to be seen at all. Calculating paths like these is not only a waste of time and processing power; it can also introduce unwanted effects in the animation. Getting rid of those unwanted effects, of course, winds up taking even more time, and introduces still further opportunities for rendering errors to emerge.

    Above: the interaction of light and dust in a rendered scene from Zero Dark Thirty, via fxguide.

    Now, a team led by Disney Research scientist Wojciech Jarosz has developed a method called "joint importance sampling" that efficiently identifies which paths of light are most likely to contribute to what the camera actually sees. As its name suggests, the algorithm helps sort out which paths are important to include when rendering the final scene, and which can be excluded. The researchers will present their findings this week at this year's ACM SIGGRAPH conference in Hong Kong.

    Researchers have looked into importance sampling (IS) techniques in the past. Some of the most sophisticated methods rely on a bidirectional process that traces potential light paths not just from the light source to the camera, but from the camera back to the light source. Traditionally, however, bidirectional techniques have sampled the light-source-to-camera and camera-to-light-source paths independently. The joint importance sampling method from Jarosz's team, in contrast, identifies potential light paths with mutual knowledge of both camera and light-source locations. The result, the researchers claim, is a boost in efficiency that reduces rendering times while improving animation quality.

    In the figure above, the results of the team's joint importance sampling method are contrasted with traditional techniques for both unidirectional and bidirectional path tracing. The boxes along the bottom of the images show close-up views of the scene. The relative lack of noise highlights the improved accuracy of the team's sampling method. "There's always going to be noise," said Jarosz in a statement, "but with our method, we can reduce the noise much more quickly, which can translate into savings of time, computer processing and ultimately money." It could also add considerably to the creative process itself.

    "Faster renderings allow our artists to focus on the creative process instead of waiting on the computer to finish," Jarosz explains. "This leaves more time for them to create beautiful imagery that helps create an engaging story."

    In an exhaustive two-part essay on the latest trends in the VFX industry (which goes into much greater detail than I can provide), fxguide's Mike Seymour cites a speech delivered by Jarosz earlier this year and highlights how novel techniques like joint importance sampling could help pave the way to a new era of animation. In "The Perils of Evolutionary Rendering Research: Beyond the Point Sample," the keynote at EGSR 2013, Jarosz argued that the way "we approach many difficult problems in rendering today is fundamentally flawed." He put forward the case that "we typically start with an existing, proven solution to a problem (e.g., global illumination on surfaces), and try to extend the solution to handle more complex scenarios (e.g., participating media rendering)." While he feels that this "evolutionary approach is often very intuitive," it can lead to algorithms that are significantly limited by their evolutionary legacy: "To make major progress, we may have to rethink (and perhaps even reverse) this evolutionary approach." He claimed that "a revolutionary strategy, one that starts with the more difficult, more general, and higher-dimensional problem," though initially more daunting, can lead to significantly better solutions.

    For more information on joint importance sampling, visit the team's research page. For more on how joint importance sampling fits into the broader field of visual effects, I highly recommend "The State of Rendering," Seymour's comprehensive overview of the latest trends in VFX, including, most notably, the industry-wide push to achieve increasingly plausible shading and lighting.
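The general idea behind importance sampling can be sketched with a toy Monte Carlo estimator. This is a minimal illustration of the plain technique, not Disney's joint algorithm; the "contribution" function and the Gaussian proposal distribution are invented for the example:

```python
import math
import random

def contribution(x):
    """Toy 'light path' contribution on [0, 1]: only paths near
    x = 0.9 carry any light; everywhere else it is essentially zero,
    mirroring how most sampled paths contribute nothing to the image."""
    return math.exp(-((x - 0.9) ** 2) / 0.001)

def naive_estimate(n, rng):
    """Uniform sampling: most samples land where the contribution
    is zero, so the estimate converges slowly (high variance)."""
    return sum(contribution(rng.random()) for _ in range(n)) / n

def importance_estimate(n, rng):
    """Draw samples from a Gaussian concentrated near the important
    region and divide by its pdf, keeping the estimator unbiased
    while spending almost every sample where light actually flows."""
    mu, sigma = 0.9, 0.05
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        pdf = (math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
               / (sigma * math.sqrt(2 * math.pi)))
        total += contribution(x) / pdf
    return total / n
```

Both estimators converge to the same integral (about 0.056 here), but the importance-sampled one does so with far less noise per sample, which is exactly the trade the article describes: same answer, much less time.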
  10. Fantome

    Great white shark

    From the album: Great white shark

  11. Fantome

    great white shark.

    Hi, I'm starting a small new project, a great white shark. I'll be modeling and texturing it in 3d coat. Once the model is done I'll try to make a nice underwater scene and render it in mental ray.
  12. New CPU-based renderer: http://www.fluidray.com/ The interface is somewhat similar to Octane's.
  13. Once I have retopologized and painted my object, is there any way to arrange it into a scene with other retopo'd meshes I have previously created (or even to duplicate my mesh)? A few trees, a wall, a gate, etc. Currently I can't see any way to do this in 3DCoat without taking the individual meshes to an external program. Is there one? Thanks, Simon
  14. alvordr

    Flickering Renders

    Is it just me, or do the real-time render and turntable renders flicker?
  15. Carlosan

    WebGL Path Tracing

    http://madebyevan.com/webgl-path-tracing/ Path tracing is a realistic lighting algorithm that simulates light bouncing around a scene. This path tracer uses WebGL for realtime performance and supports diffuse, mirrored, and glossy surfaces. The path tracer is continually rendering, so the scene will start off grainy and become smoother over time.
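The diffuse bounce at the heart of a path tracer like this is usually driven by cosine-weighted hemisphere sampling. The sketch below shows the standard mapping from two uniform random numbers to a bounce direction; it is the textbook technique, not code taken from the linked demo:

```python
import math

def cosine_sample_hemisphere(u1, u2):
    """Map two uniform random numbers in [0, 1) to a unit direction on
    the hemisphere around the +Z surface normal, distributed with
    pdf = cos(theta) / pi. Sampling proportionally to the cosine term
    of the diffuse BRDF is what keeps the render from being dominated
    by grazing-angle noise."""
    r = math.sqrt(u1)                      # radius on the unit disk
    phi = 2.0 * math.pi * u2               # angle on the unit disk
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))      # project disk up onto hemisphere
    return (x, y, z)
```

Each returned direction is unit length and points into the upper hemisphere; averaged over many samples, the z component tends to 2/3, the expected cosine under this distribution.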