3DCoat Forums

Best GPU value for 3D-Coat


probiner

Recommended Posts

  • Reputable Contributor

Two GPUs, not in SLI, with one used for the monitor.

 

Will programs like 3D-Coat automatically use the other GPU (the one not driving the monitor), or does that need to be configured?

 

Not having SLI has thrown me a bit.

 


3D-Coat is not alone among CG apps that don't utilize multiple cards. Again, the only benefit you will get from multiple cards is when using GPU-accelerated renderers; some PhysX-based simulations in 3ds Max should be able to use a 2nd card just for computing dynamics, and FumeFX and PhoenixFD might allow you to designate a 2nd card just for GPU previews (while the other card drives the regular display).
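
To illustrate what "designating" a second card actually means: a CUDA application doesn't pick a compute device automatically, it has to select one through the runtime. A minimal sketch, assuming the display card is device 0 and the spare card is device 1 (that numbering is an assumption, and nothing in 3D-Coat itself exposes this):

```
#include <cstdio>
#include <cuda_runtime.h>

// Trivial kernel: double every element, just to prove work runs on the chosen GPU.
__global__ void scale(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main()
{
    // Assumption: the display GPU is device 0, so we target device 1 for compute.
    // Real applications usually expose this as a user setting instead of hard-coding it.
    const int computeDevice = 1;
    if (cudaSetDevice(computeDevice) != cudaSuccess) {
        std::fprintf(stderr, "Device %d not available\n", computeDevice);
        return 1;
    }

    const int n = 1 << 20;
    float *d = nullptr;
    cudaMalloc(&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    scale<<<(n + 255) / 256, 256>>>(d, n);
    cudaDeviceSynchronize();   // all of this ran on device 1, independent of the display card

    cudaFree(d);
    return 0;
}
```

In practice a GPU renderer exposes that device choice as a setting in its device list, which is why nothing happens "automatically" just because a second card is installed.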

 

Having said that... at one point, Andrew had planned to port the Cycles render engine (from Blender) into 3D-Coat, but it got put on the back burner. However, I think Raul (the one who developed LiveClay) is supposed to come back to Kiev (Ukraine) for a few months. That was his assigned task before he left, so we might see that project revived if he does. That would utilize CUDA and should recognize multiple cards:

 

http://code.blender.org/index.php/2013/08/cycles-render-engine-released-with-permissive-license/
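
On the "should recognize multiple cards" point, here is a rough sketch of how any CUDA-based renderer can enumerate every installed GPU, whether or not it drives a monitor. The watchdog check at the end is only a heuristic for guessing which card is hooked to a display, not something Cycles or 3D-Coat specifically does:

```
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);   // counts every CUDA-capable card, display or not
    std::printf("CUDA devices found: %d\n", count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("  [%d] %s  %zu MB  SM %d.%d  %s\n",
                    i, prop.name,
                    prop.totalGlobalMem / (1024 * 1024),
                    prop.major, prop.minor,
                    // A kernel execution timeout usually means a display watchdog
                    // is active on this card (i.e., it is driving a monitor).
                    prop.kernelExecTimeoutEnabled ? "(likely driving a display)"
                                                  : "(no display watchdog)");
    }
    return 0;
}
```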
