3DCoat Forums

ATI HD 4650


tess

Recommended Posts

  • New Member

Will the 3D Coat app run with an ATI HD 4650? I don't know graphics cards well.

So will it run, and how well? (I'm just a person who wants it for personal use; I don't get computers much.)

Also, what would be a better card that's under a price limit of $120?

Sorry if this is the wrong place to post this; I'm good at that.


  • Reputable Contributor


I sold my 4850 on eBay a few months ago...and only got about $80 for it. At that price point, I'd look at an NVidia GTX 260 on eBay...lots of cores for CUDA to use, and a MUCH, MUCH better card than an ATI 4650. The GTX 260 was a $400 card just a year ago...but with competition from ATI and subsequent newer lines, the price has dropped to where you should be able to get one for $150 or less.

http://cgi.ebay.com/MSI-NVIDIA-GeForce-GTX-260-896-MB-MINT-CONDITION_W0QQitemZ290386837611QQcmdZViewItemQQptZPCC_Video_TV_Cards?hash=item439c67806b

I did plenty of research on this topic when I did a recent system build, with overclocking in mind, and went with a Galaxy GTX 275...which has some serious aftermarket cooling apparatus (compared to the standard reference cooling that you find on 90% of cards)...got it for $229 just before Christmas at Fry's.

http://www.atomicmpc.com.au/Review/149347,galaxy-gtx275.aspx

Overclocked (it comes with software to let you do that within the Windows environment), it performs the same as or better than the GTX 285 (currently NVidia's fastest single-GPU card). So, if you can spring for an extra $100, it's well worth it. I upgraded from a GTS 250 (which is essentially an updated 9800 GTX), and there is a WORLD of difference...in 3DC and 3ds Max 2010.

With 3DC having CUDA support, staying with NVidia is a no-brainer. So glad I switched from ATI. The 4850 wouldn't work with Combustion (which is what I use for compositing). I can't recommend the GTX 275 highly enough. It gives you top-end performance at a low-to-mid price point. I'm extremely happy with it...working with the later builds of 3DC...particularly brush speed in voxel and surface mode, it seems every bit as fast as Mudbox, IMHO.


  • Advanced Member

With ATI's better performance at a better price, plus its future-proof nature, going ATI would be more of a no-brainer.

Just for the info, my X1950 works perfectly fine in 3D Coat. I'm not sure how much the CUDA performance scales, but having a fast CPU and graphics card could make up for that, plus there's the potential of OpenCL in the future.


  • Reputable Contributor


I'm all in favor of getting the best bang for your buck, too...but right now, for anyone using 3DC or other applications that utilize PhysX and CUDA, an NVidia card IS FUTURE PROOFING and GETTING THE MOST BANG FOR YOUR BUCK. A modicum of research/Google searching will bear that out. You're trying to nullify the advice of someone who has done so thoroughly. I have no brand loyalty whatsoever. Whatever works best for the least amount of money gets the nod for me, and that's why I'm stating what I do and giving solid references to back it up...as well as relevant personal experience.

I owned nothing but ATI cards up until this year, but when a card prohibits me from using one of my key programs...it's coming out of the computer...period. Andrew didn't spend all that time adding CUDA support for some fringe marketing gimmick. It works, and you're doing yourself a disservice if you go out and buy an ATI card knowing this, IMHO. Sure, 3DC works OK with an ATI card, but I saw a dramatic difference going from the ATI 4850 to an NVidia GTS 250 (the comparable NVidia card to the 4850)...and then an even more dramatic difference when I put the GTX 275 in a new build.

For just over $200, I was able to get a card and easily overclock it (using the software that comes with it) to perform the same as the model above it, which costs almost twice as much. With twice the number of processors of the GTS 250, CUDA operations run that much faster. With ATI you don't get jack, and you're knowingly choosing to enter a three-legged horse into the race. At its price point, the GTX 275 handles its business in sheer frame rates, no matter what application you use. The closest price competitor to the GTX 275 is the ATI 4890, and the GTX 275 blows the doors off it in overall performance.

Look at the following benchmarks...the card I chose is slightly better than the MSI GTX 275 Lightning OC in this chart (I'm running higher speeds than they have listed in this article). So, for $230 the card beats the stock GTX 285 and comes close to the new ATI 5870 (ATI's top single-GPU card), which sells for $430. You do the math: $230 + CUDA > $430 − CUDA.

GTX 275 Overclocked comparison

compare the prices here

http://www.newegg.com/Store/SubCategory.aspx?SubCategory=48&page=1

So, please don't try to sell this nonsense that a $200 NVidia card is inferior to an ATI card that costs $200. Plus, both companies routinely leapfrog each other every 6-12 months...so this whole bit about future-proofing is bunk. NVidia routinely adjusts its prices to correspond with comparable ATI cards.

Bottom line: when using 3DC, a $150 NVidia card will comfortably outperform a $150 ATI card...and the same goes for whatever price point you choose, actually. Why? CUDA...'Nuff said.


  • Reputable Contributor


CUDA vs just CPU benchmarks:

http://www.3d-coat.com/wiki/index.php/12.2_The_Benchmark

CUDA overview:

http://www.3d-coat.com/wiki/index.php/12.1_CUDA_Basics

If you did a little reading and some research, you would know...and consequently refrain from giving ill-informed advice. Andrew gets no monetary benefit from recommending NVidia cards, so it's an unbiased, objective recommendation he is making here. Again, if you had read before offering advice, you'd have seen that he states: "The real advantage of CUDA over a 4-core CPU can be achieved only if GPU has more then 64 processors, otherwise the 4-Core CPU will yield better speed."

In this test...he was using a 9800GT, which has 112 stream processors. A GTX 275 or 285 has 240...so the advantage is that much greater.
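
For anyone who wants to check where their own card falls relative to that 64-processor threshold, here is a minimal device-query sketch using the standard CUDA runtime API. It is only an illustration (not something shipped with 3DC or taken from the wiki pages above; the file name is made up), and the 8-stream-processors-per-multiprocessor figure in the comment applies to the compute-capability 1.x GeForce 8/9/200-series cards discussed in this thread:

// device_query.cu -- hypothetical file name; build with: nvcc device_query.cu -o device_query
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable NVidia GPU found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
        std::printf("  Multiprocessors: %d\n", prop.multiProcessorCount);
        // On compute-capability 1.x parts (GeForce 8/9/200 series) each
        // multiprocessor holds 8 stream processors, e.g. 30 x 8 = 240 on a
        // GTX 275/285; newer architectures use different ratios.
        if (prop.major == 1) {
            std::printf("  Stream processors: %d\n", prop.multiProcessorCount * 8);
        }
    }
    return 0;
}

If it reports well over 64 stream processors (a GTX 260/275/285 does), Andrew's note suggests CUDA should pull clearly ahead of a quad-core CPU; at or below that count, the CPU path can be just as fast.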


  • Advanced Member

I was just providing a counterpoint to your propaganda. I don't care what the person above gets, but I dislike someone espousing personal experiences as universal fact, and saying things in block capitals DOES not MAKE you ANY MORE right. PhysX and CUDA are Nvidia-exclusive; unless Nvidia becomes a monopoly or opens these up to everyone, they will not be future-proof. That's my opinion, the same as the above was your opinion.

Nvidia is having lots of problems at the moment: a ton of rebranding, and loads of cards aren't even being sold. The new cards they are making technically aren't even going to be graphics-focused. Unless this Fermi comes up to par in the not-too-distant future, they will be in a lot of trouble.

But whatever.

edit:

Those benchmarks don't seem that significant to me; you only gain a few fps, and they only work best in a few specific circumstances.

I think it's foolish to buy a card for a single piece of software when you can get a good all-round card.


  • Reputable Contributor


There's no propaganda at all. I just don't care to be corrected by someone who clearly doesn't know what they are talking about. No surprise, then, that you'd try to discount my recent experience with both brands...while you have none. I pointed out that I was an owner of multiple ATI cards before this. I know what I'm talking about. I've been building my own systems since the mid '90s. I always research the items I buy...because I don't want to shell out the cash for something based on hearsay and wish later I hadn't. I got the ATI card well before I bought a seat of 3DC. In hindsight, I regretted it. That's largely why I take the extra time when someone asks others about their experiences. Funny...if it didn't matter, then why would anyone even bring the subject up here?

I did my homework when I bought this card just a few weeks ago. Did you do yours before offering advice (telling folks to ignore what Andrew clearly stated...that the newer generations of NVidia cards generally perform better in 3DC with CUDA)? You admitted you didn't know how much CUDA would help...so why are you telling someone differently when you're not that informed yourself?

A lot has changed since you got that ATI X1950, or whatever model it was. I also pointed out how Andrew stated that the more processors the NVidia card has, the more CUDA performance scales...yet you're saying it amounts to nothing. Show us something...anything to solidify your case, and then show us how relevant it is, as this is a 3DC forum and the original poster was asking about it in regard to how it would perform in 3DC.

Show us some clear evidence that an ATI card in the $100-$250 range is more FUTURE PROOF, as you put it, than a comparably priced NVidia card. The recent ATI 5800 series only wins the performance crown at the very high end. In a month or so, that will change too...as it always does. It's cyclical. Nevertheless, that doesn't change anything with the low- to mid-range cards. I submit to the jury...exhibit A.

http://www.legitreviews.com/article/944/14/

http://www.techspot.com/review/164-radeon-4890-vs-geforce-gtx275/page2.html

A quote from this article:

As luck would have it, the GeForce GTX 275 was not just a quick and easy counter for the Radeon HD 4890, it was the perfect counter. Prior to the launch of these two cards the Radeon HD 4870 and GeForce GTX 260 were already doing battle. The Radeon HD 4890 was meant to outclass both products and conquer the $250 price range but evidently that didn't go as planned.

It doesn't take a rocket scientist to figure out that if a particular card at the top of your price range clearly dominates its competitor (as the GTX 275 does the ATI 4890) in overall performance AND offers significant acceleration capability for one or more of your programs...that is the card that makes the most sense. So, you see...had you done your homework, you'd know that the decision is not just about one particular application.

What's FOOLISH is trying to correct someone with no factual information to back it up.

Conclusion of the TechSpot comparison:

..."Even when taking overclocking into account, we don't feel the HD 4890 is worth the price premium over the Radeon HD 4870. For now we feel the Radeon HD 4870 ($185) and GeForce GTX 260 ($180) are far better value alternatives, while the GeForce GTX 275 is the best graphics card $250 can get you.

You were saying...?


  • Advanced Member

I've used computers since the days when 386s needed math co-processors purchased as optional upgrades, and I've been the owner of my first-ever ATI card for the past few months; all my other cards have been Nvidia.

I currently have an ATI 4850 in a Dell i7 920, and I have no complaints at all. I use 3D Coat, Vue 8 Infinite, Carrara 7 Pro, QuickTime Pro, FXHome, all the usual 2D editors, etc.

The only thing is that I do not use a compositor like the one mentioned above, and I also do not game on the machine.

I have had one bluescreen crash in about 5 months. No other issues or glitches, other than (I'm half-kidding here) the usual 3D Coat new-installation gremlins.

Your experience may vary, of course; there are millions of variables that come into play here.

Group Hugs everybody :)


  • Reputable Contributor

Looks like programs that have Mental Ray integrated (3ds Max, Maya, Softimage) should be getting iRay with upcoming new versions. Halfway through this video, he mentions that it works on multi-core CPUs, but for those who have NVidia cards, CUDA will provide a significant degree of acceleration.

http://www.mentalimages.com/index.php?id=634

