CUDA + Kepler


3 replies to this topic

#1 robotbob

    Neophyte

  • Member
  • 85 posts
  • Gender:Male

Posted 14 February 2012 - 01:09 AM

A friend showed me this today and I thought some here might like to see what the new Kepler cards might get up to, especially with CUDA.

http://extrahardware...hell-kepler.png
mac 10.6.6 | 2 x 2.26 GHz quad core Xeon | NVIDIA GeForce GTX 285 | http://www.bitstate.com/

#2 robotbob

    Neophyte

  • Member
  • 85 posts
  • Gender:Male

Posted 14 February 2012 - 11:55 AM

I am hoping these numbers actually happen, but CUDA definitely rocks.
mac 10.6.6 | 2 x 2.26 GHz quad core Xeon | NVIDIA GeForce GTX 285 | http://www.bitstate.com/

#3 chingchong

    Novice

  • Member
  • 404 posts
  • Gender:Male
  • Location:Germany

Posted 14 February 2012 - 08:28 PM

http://www.tomshardw...ries,14642.html

Yeah... this is all the more reason for Andrew to recompile the CUDA support in 3D Coat to bring it up to date, and hopefully get it much more involved in the application. I have been using the app for over 3 years now and I still don't know to what extent CUDA is incorporated in the voxel workflow (only really large brush sizes, using masks?). It has a very limited role there, and I think that is one reason why Nvidia isn't as gung-ho about 3D Coat as they are about Mari. Get it involved in the Paint Room. The multi-threading there is silky smooth, but only up to a point: really large brush sizes, some brush types that are much slower than others, and large texture sizes (8k maps and beyond) all slow it down. Handing that work over to the GPU with CUDA once the number of pixels touched per brush stroke passes a set threshold could make 3D Coat really shine as a texture painting application.
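To make that last point concrete, here is a minimal sketch of the kind of dispatch being described: a circular brush stamp is applied on the CPU for small strokes, but handed to a CUDA kernel once it touches more pixels than an arbitrary threshold. Every name here (brushStampKernel, brushStampCPU, PIXEL_THRESHOLD) and the threshold value itself are purely illustrative assumptions; this is not how 3D Coat is actually implemented.

// Minimal sketch (not 3D Coat's actual code): paint a circular brush stamp
// into a single-channel texture, on the CPU for small strokes and on the GPU
// via CUDA once the stroke touches more pixels than an arbitrary threshold.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// One GPU thread per pixel of the brush's bounding box.
__global__ void brushStampKernel(float* pixels, int texW, int texH,
                                 int cx, int cy, int radius, float strength)
{
    int x = cx - radius + blockIdx.x * blockDim.x + threadIdx.x;
    int y = cy - radius + blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 0 || y < 0 || x >= texW || y >= texH) return;

    float dx = float(x - cx), dy = float(y - cy);
    float dist = sqrtf(dx * dx + dy * dy);
    if (dist > float(radius)) return;

    float falloff = 1.0f - dist / float(radius);   // soft brush edge
    int idx = y * texW + x;
    pixels[idx] = fminf(1.0f, pixels[idx] + strength * falloff);
}

// CPU fallback for small strokes, where launch and copy overhead would dominate.
static void brushStampCPU(std::vector<float>& pixels, int texW, int texH,
                          int cx, int cy, int radius, float strength)
{
    for (int y = cy - radius; y <= cy + radius; ++y)
        for (int x = cx - radius; x <= cx + radius; ++x) {
            if (x < 0 || y < 0 || x >= texW || y >= texH) continue;
            float dx = float(x - cx), dy = float(y - cy);
            float dist = std::sqrt(dx * dx + dy * dy);
            if (dist > float(radius)) continue;
            float falloff = 1.0f - dist / float(radius);
            float& p = pixels[size_t(y) * texW + x];
            p = std::min(1.0f, p + strength * falloff);
        }
}

int main()
{
    const int texW = 8192, texH = 8192;            // an 8k single-channel map
    const long long PIXEL_THRESHOLD = 256 * 256;   // arbitrary cut-over point

    std::vector<float> pixels(size_t(texW) * texH, 0.0f);
    int cx = 4096, cy = 4096, radius = 1024;       // a very large brush stamp
    float strength = 0.5f;

    long long touched = (long long)(2 * radius + 1) * (2 * radius + 1);
    if (touched < PIXEL_THRESHOLD) {
        brushStampCPU(pixels, texW, texH, cx, cy, radius, strength);
        std::printf("small stroke, stayed on the CPU\n");
    } else {
        float* dPixels = nullptr;
        size_t bytes = pixels.size() * sizeof(float);
        cudaMalloc(&dPixels, bytes);
        cudaMemcpy(dPixels, pixels.data(), bytes, cudaMemcpyHostToDevice);

        dim3 block(16, 16);
        dim3 grid((2 * radius + block.x) / block.x,
                  (2 * radius + block.y) / block.y);
        brushStampKernel<<<grid, block>>>(dPixels, texW, texH, cx, cy, radius, strength);
        cudaDeviceSynchronize();

        cudaMemcpy(pixels.data(), dPixels, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dPixels);
        std::printf("large stroke, ran on the GPU\n");
    }
    std::printf("centre pixel after the stroke: %f\n", pixels[size_t(cy) * texW + cx]);
    return 0;
}

A real painting pipeline would keep the texture resident on the GPU between strokes rather than copying the whole map across the bus for each one; the copies here are only to keep the sketch self-contained.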


I think every user out there with an Nvidia card (like me) will appreciate it when they can actually enjoy the CUDA boost. :)

#4 robotbob

    Neophyte

  • Member
  • 85 posts
  • Gender:Male

Posted 15 February 2012 - 09:52 PM

The thing I really like about the potential of CUDA is a much cheaper upgrade cycle for my hardware.
I can upgrade my video card instead of my CPU to get a much faster machine, at least for the software that uses CUDA.
That can be much, much cheaper overall. That, and the recent awful flickering ATI drivers, make Nvidia the better choice anyway.
Adobe has embraced CUDA, and I use their products every single day, which is why I am mostly interested in this.
Still a while to wait - April/May - so I guess we will know then whether these numbers are real, since nothing is official yet.
mac 10.6.6 | 2 x 2.26 GHz quad core Xeon | NVIDIA GeForce GTX 285 | http://www.bitstate.com/



