L'Ancien Regime (Advanced Member), posted March 5, 2015

http://www.pcworld.com/article/2892922/nvidia-surprises-launches-the-12gb-geforce-titan-x-the-most-advanced-gpu-ever.html

The most advanced GPU ever?

http://www.forbes.com/sites/jasonevangelho/2015/03/04/out-of-nowhere-nvidia-announces-12gb-geforce-titan-x-graphics-card/

> Epic head Tim Sweeney explained to the attendees that this new VR experience was so powerful it required a new graphics solution. "Does anyone have any ideas how we can do this?" he asked the audience. After a few moments of silence, Nvidia CEO Jen-Hsun Huang walked out saying "I have one." Ever the showman... "It's the most advanced GPU the world has ever seen," Huang said, and proceeded to hand Sweeney what is allegedly the company's first production unit. All we know at this point? It's based on Nvidia's Maxwell architecture, has a 12GB framebuffer, 8 billion transistors, and took "thousands of engineer-years to build." We also know that it's apparently well beyond a concept, as Huang says it "will power GDC 2015," meaning that multiple VR demos, at the very least, are being driven by the Titan X. Is it technically a dual-GPU like the Titan Z? Or is this a true, fully usable 12GB of VRAM? Nvidia isn't talking until GTC, which kicks off in a couple weeks. I'm going with the dual-GPU option, but with Maxwell at its core we can expect significant gains in performance and power efficiency. And DirectX 12 may hold some secret sauce allowing users and developers to utilize combined video RAM, namely that 2x6GB actually becomes a usable 12GB. Theoretically, this could blow past supporting 4K resolutions and power surround 4K (that's 3 screens each running at 4K).

Or really intense VR...

http://www.amazon.ca/EVGA-GeForce-Dual-Link-Graphics-12G-P4-3990-KR/dp/B00JZ4SN4C

It's got an all-black body:

http://wccftech.com/nvidia-gtx-titan-x-revealed-gdc-2015/

Edited March 5, 2015 by L'Ancien Regime
Aleksey (Advanced Member), posted March 5, 2015

$900 sounds reasonable..
digman (Reputable Contributor), posted March 5, 2015

The Titan Black series are the ones to get if you use CUDA-enabled rendering software a lot, unless you bump up to Quadro cards. Double-precision compute on the Titans is 1/3 of FP32, not 1/32 like the new 980s... Plus the Titan has a 384-bit bus. I will be curious about the final price of the 12 GB card, and I bet they release a 6 GB one as well... The current Titan Blacks are getting harder to find.

Edited March 5, 2015 by digman
Aleksey (Advanced Member), posted March 5, 2015

Technically this isn't a Titan Black. It's a Titan X, which happens to be painted black.
digman (Reputable Contributor), posted March 5, 2015

I love black for office furniture and computers... And the new Titan X 12 GB will be expensive... Funny how things have changed: a CPU costs you less now than your video card, most times...

Edited March 5, 2015 by digman
L'Ancien Regime (Advanced Member), posted March 5, 2015

> Funny how things have changed: a CPU costs you less now than your video card, most times...

Actually, for 12 GB of GDDR5 RAM that looks surprisingly inexpensive to me. Just factor it into the ultimate cost. The big question for me is how they're calculating that RAM. Is it two GPUs with 6 GB each, and they're adding them together because of DirectX 12?
digman (Reputable Contributor), posted March 5, 2015

Thanks for the correction; I was reading some information wrong at another site on the price range of the Titan X. 1000 to 1350 bucks would be a good value for the money.

Edited March 5, 2015 by digman
L'Ancien Regime (Advanced Member), posted March 8, 2015

http://www.techpowerup.com/210448/nvidia-geforce-gtx-titan-x-pictured-up-close.html

Count them: 12 x 1 GB GDDR5 memory chips, and it looks like they're arrayed around just one big honking Maxwell GPU, so no trickery here with 2x 6 GB of RAM for two GPUs under DirectX 12.

Edited March 8, 2015 by L'Ancien Regime
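As a back-of-envelope check on those figures: capacity is just chips times per-chip density, and peak bandwidth is bus width times data rate. The 7.0 Gbps effective GDDR5 rate below is an assumption for illustration; NVIDIA had not published final memory clocks at this point in the thread.

```python
# Rough sanity check of the Titan X memory figures, assuming
# 12 x 1 GB GDDR5 chips on the 384-bit bus mentioned earlier.
# The 7.0 Gbps effective per-pin data rate is assumed, not confirmed.

CHIPS = 12
GB_PER_CHIP = 1
BUS_WIDTH_BITS = 384
DATA_RATE_GBPS = 7.0  # assumed effective transfers per pin

# Capacity: chips times per-chip density.
total_vram_gb = CHIPS * GB_PER_CHIP  # 12 GB on a single GPU

# Peak bandwidth: (bus width in bytes) x (transfers per second).
bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * DATA_RATE_GBPS  # 336.0 GB/s
```

So a single-GPU 12 GB card is entirely consistent with the board photos: twelve 1 GB packages, no dual-GPU memory split needed.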
L'Ancien Regime (Advanced Member), posted March 14, 2015

http://wccftech.com/amd-r9-390x-nvidia-gtx-980ti-titanx-benchmarks/
L'Ancien Regime (Advanced Member), posted March 15, 2015

http://wccftech.com/nvidia-geforce-gtx-titan-x-pictured-full-detail-features-gm200400-gpu-pcb-unveiled/

> Here's the proof we are looking at Big Maxwell... As you can see, the GTX TITAN X has a GM200-400-A1 GPU. I can't tell you for sure this is the full chip, but I have every reason to believe it actually is the 3072-core processor. The PCB layout is quite similar to what we know from GK110 products. It would be interesting to know if Kepler waterblocks will fit onto the GM200 board; wouldn't that be great? I know many of you will go liquid with this card. I do know, however, that EVGA is making a GTX TITAN X HydroCopper. The rumor is, it will cost just one kidney this time.

Awesome.

Edited March 15, 2015 by L'Ancien Regime
L'Ancien Regime (Advanced Member), posted March 16, 2015

And the day before its official announcement, here comes AMD with a monster to rival it:

http://wccftech.com/amd-radeon-r9-390x-alleged-specifications-performance-numbers-leaked-60-faster-r9-290x/

4096 vs 3072 cores, 8.1 teraflops vs 5.1 teraflops.

Edited March 16, 2015 by L'Ancien Regime
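For what it's worth, peak FP32 throughput is roughly 2 FLOPs (one fused multiply-add) per shader core per clock, so each leaked teraflop figure implies a particular boost clock. A quick sketch using only the numbers quoted above; the implied clocks are derived, not reported:

```python
# Peak FP32 FLOPS ~= 2 x shader cores x clock (one FMA per core per
# cycle). Inverting that shows what boost clock each leaked teraflop
# figure implies. TFLOPS and core counts come from the linked article.

def implied_clock_mhz(tflops, cores):
    """Boost clock (MHz) implied by a peak FP32 figure and a core count."""
    return tflops * 1e12 / (2 * cores) / 1e6

r9_390x_mhz = implied_clock_mhz(8.1, 4096)  # ~989 MHz
titan_x_mhz = implied_clock_mhz(5.1, 3072)  # ~830 MHz
```

Both implied clocks are in a plausible range for 2015-era GPUs, which at least means the leak is internally consistent.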
Tony Nemo (Contributor), posted March 16, 2015

So AMD now has CUDA?
L'Ancien Regime (Advanced Member), posted March 16, 2015

http://wccftech.com/amd-r9-390x-8-gb-hbm/

And Tony, in answer to your question, my EE friend who is sending me this stuff says that with DirectX 12 the answer is "yes, its 4096 cores will be used as CUDA cores". But don't hold your breath for Andrew to add any updates for CUDA in 3D Coat. I think he's done with CUDA.

Edited March 16, 2015 by L'Ancien Regime
Tony Nemo (Contributor), posted March 16, 2015

My main interest is for Octane.
L'Ancien Regime (Advanced Member), posted March 16, 2015

> My main interest is for Octane.

I'd contact Octane's engineers...
digman (Reputable Contributor), posted March 16, 2015

> My main interest is for Octane.

The number of CUDA cores is important, but the capability of the CUDA cores is very, very important. Crippled CUDA cores are no fun at all... I still like that 12 GB of VRAM on the Titan X for GPU rendering, since you are limited to the amount of VRAM on your card for rendering scenes. Hybrid renderers will use your CPUs as well, but I do not know if they will use system RAM too... I have not googled it yet...

A quick Google:

> Indigo is capable of hybrid GPU + CPU rendering. So the GPU does one part of the rendering process and the CPU another! The benefit here is that the GPU doesn't do the material calculation stuff and doesn't need to load all the textures into the (limited) GPU RAM. The problem with this way of letting the GPU help you render is that if the CPU and GPU aren't "equivalent" in power, one of them needs to wait for the other to be ready with its calculation to bring "that stuff calculated together".

I use Blender Cycles, a non-hybrid renderer, so I need all the VRAM I can lay my hands on...

Edited March 16, 2015 by digman
Grimm (Advanced Member), posted March 16, 2015

Octane has out-of-core memory use for textures now, so VRAM is less of a problem. There is a 10 to 20 percent decrease in speed when it's used, though.
AbnRanger (Reputable Contributor), posted March 16, 2015

> Octane has out-of-core memory use for textures now, so VRAM is less of a problem. There is a 10 to 20 percent decrease in speed when it's used, though.

And Thea Render developed bucket rendering on the GPU to skirt around out-of-VRAM issues. Bucket rendering was a boon for CPU rendering, because it was so much more efficient at handling memory, and the rendering network could assign some buckets to other render machines (distributed rendering). The same principle is being applied to the GPU by Thea. Pretty smart. That's why I'm happy I bought a seat.
digman (Reputable Contributor), posted March 16, 2015

> And Thea Render developed "Bucket Rendering" on the GPU to skirt around out of VRAM issues. [...] Same principle is being applied to the GPU, by Thea.

Cycles gives two types of rendering: progressive, which renders the entire image at once and takes lots more VRAM, or rendering by tiles (buckets). I forgot to mention that... I like Thea myself, though I do not own it yet; I still have the trial version...

Edited March 16, 2015 by digman
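A toy sketch of the bucket (tile) idea being discussed: the frame is split into fixed-size rectangles that render independently, so only one bucket's working set has to be resident at a time, and buckets can be farmed out to other machines. The tile size and function here are illustrative, not Thea's or Cycles' actual internals.

```python
# Minimal bucket/tile scheduler sketch. Splitting the frame into
# independent rectangles is what lets a renderer bound its per-bucket
# memory and distribute work; tile size of 256 px is purely illustrative.

def buckets(width, height, tile=256):
    """Yield (x, y, w, h) rectangles exactly covering a width x height frame."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(tile, width - x), min(tile, height - y))

tiles = list(buckets(1920, 1080))
# 8 columns x 5 rows = 40 buckets; edge buckets are clipped to fit.
```

Every pixel lands in exactly one bucket, which is why the same scheme also works for distributed rendering: each machine gets a disjoint set of rectangles and the results composite back losslessly.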
L'Ancien Regime (Advanced Member), posted March 16, 2015

http://www.techpowerup.com/210740/more-radeon-r9-390x-specs-leak-close-to-70-faster-than-r9-290x.html
AbnRanger (Reputable Contributor), posted March 17, 2015

I just wish AMD would push Intel this hard on the CPU front. If they did, we'd probably be looking at CPUs with 24 cores by now.
L'Ancien Regime (Advanced Member), posted March 17, 2015

> I just wish AMD would push Intel this hard on the CPU front. If they did, we'd probably be looking at CPU's with 24 cores by now
PolyHertz, posted March 17, 2015

Just got released today: http://www.geforce.com/geforce-gtx-titan-x/buy-gpu

Looks like DP is 1/32, same as the 980.
digman (Reputable Contributor), posted March 17, 2015

> Just got released today: http://www.geforce.com/geforce-gtx-titan-x/buy-gpu Looks like DP is 1/32, same as the 980.

Could you please share another link with that information? That link does not show FP64 = 1/32 FP32. Maybe it is buried; if so, point to the location.

EDIT: OK, I found one. Their wording is somewhat confusing, but yes, it looks like FP64 = 1/32 FP32 for the CUDA cores on the Titan X is correct... The Titan X is just an expensive gaming card if this is true, with 3072 crippled CUDA cores... I will wait, though, till there are some more reviews to post here...

Edited March 17, 2015 by digman
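To put that ratio in perspective: FP64 throughput is just FP32 throughput times the ratio, so the older Titan Black (1/3 rate) comes out far ahead of the Titan X (1/32 rate) in double precision. The FP32 figures below are approximate boost-clock numbers, assumed purely for illustration:

```python
# FP64 ratios from this thread applied to rough peak FP32 numbers.
# Both FP32 TFLOPS values are approximate, assumed for illustration only.

def fp64_tflops(fp32_tflops, ratio):
    """Peak double-precision TFLOPS given FP32 TFLOPS and the FP64:FP32 ratio."""
    return fp32_tflops * ratio

titan_black_fp64 = fp64_tflops(5.1, 1 / 3)   # ~1.70 TFLOPS at 1/3 rate
titan_x_fp64     = fp64_tflops(6.1, 1 / 32)  # ~0.19 TFLOPS at 1/32 rate
```

On these assumptions the Titan Black delivers roughly nine times the double-precision throughput of the newer Titan X, which is exactly the "crippled CUDA cores" complaint above.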
Aleksey (Advanced Member), posted March 17, 2015

384-bit memory, 12 GB of (hopefully real) RAM... I think in like 4 months I'm sold.
L'Ancien Regime (Advanced Member), posted March 18, 2015
L'Ancien Regime (Advanced Member), posted March 20, 2015

http://wccftech.com/nvidia-ethical-pricing-conundrum/

> The game Nvidia Corporation (NASDAQ: NVDA) is playing is one of pure numbers, and involves (technically) legal tactics such as price skimming and utilizing the gap in its competitor's lineup, all the while skirting the edge of legality and trust. The price point strategy and positioning of the new TITAN-X and the Quadro M6000 is what we will be exploring in this opinion editorial.
>
> A TITAN Idea: some of the perks, without the full cost of going Pro
>
> Let's start with the TITAN branding first. The TITAN branding was originally designed to establish a brand new market, a market which Nvidia dubbed "semi-pro". The logic and rationale were clear. TITAN series GPUs would get unlocked double-precision performance but not the ISV and driver support of the Quadro series: a sort of midway between the mostly single-precision GPUs of the usual nomenclature. Because of the unlocked DP, the price tag had received a begrudging nod from me. Having firsthand experience of how much double precision matters in some programs (AutoCAD and the video industry being prime examples), it was something new, and something exciting. As a result, Green gained some more momentum in the industry, establishing an even stronger foothold in what was already more or less set concrete.
>
> The TITAN-X: none of the perks, but still with some of the cost of going Pro
>
> At the same time, the Quadro K6000 flagship (with all its lineup) stood proud, with full ISV and driver support. Nvidia (NASDAQ: NVDA) charged a price tag accordingly, and because of its monopolistic position in the professional industry, AMD could not compete. To be fair, the Quadro K5000 pre-dated the Quadro K6000 with 1/24 FP32 but also a relatively marginal price tag of just over 2000 USD. Something that, once again, granted the nature of the Quadro branding, could be ignored.
Aleksey (Advanced Member), posted March 20, 2015

I dunno, it's nice that at least someone makes this stuff. Nvidia could just pull an "Apple" and go: hey, more money in the consumer market, pro market be damned... and that's it, we are stuck with whatever we have. Kinda like what happened to CPUs, where it seems Intel has just completely given up on increasing speeds.
AbnRanger (Reputable Contributor), posted March 23, 2015

> i dunno, its nice that at least someone makes this stuff. [...] Kinda like what happened to CPU's where it seems intel has just completely given up on increasing speeds.

Yeah, Intel's improvements are a slow, tiny trickle compared to the pace of past development (when AMD was pushing them). I used to keep up with hardware advancements regularly, but outside of the graphics card market, it's largely dried up. I'm still using a 4-5 year old i7 970 (6 cores/12 threads) that isn't far from the performance of new i7s. What a shame.
L'Ancien Regime (Advanced Member), posted March 26, 2015

http://wccftech.com/nvidia-geforce-gtx-980-ti-coming-summer/

http://wccftech.com/review/review-titan-sli-performance/