www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable/
"How did this come about? Sources in Santa Clara tell SemiAccurate that GF100 was never meant to be a graphics chip, it started life as a GPGPU compute chip and then abandoned. When the successor to the G200 didn't pan out, the GF100 architecture was pulled off the shelf and tarted up to be a GPU. This is very similar to what happened to ATI's R500 Xenos, but that one seems to have worked out nicely in the end."
Comments
If only Intel had launched Larrabee, then nVidia would have an exit strategy. nVidia had the lead from 2006 to 2009 and was not able to make enough profit to cover the upcoming operating loss of being in 2nd place due to issues with the G80 micro-architecture. Competition is definitely needed at this point, since ATI is propping up AMD until 2011, when AMD can release its Fusion chips. If nVidia fails to deliver, we could be seeing a $600 HD 5870. The HD 5870 has already gone up $100 since release.
Sometimes Charlie is too much doom and gloom... the GPU is gonna be pretty fast, it'll be exciting. And $600.
Fast, probably. But it eats huge amounts of power, is very expensive, and you can forget dual-GPU cards right away.
Never believe anything from Semi Accurate that is Nvidia related: charliedemerjianisadouchebag.blogspot.com/
"There is no truth to this. Charlie has become a sponsored site of his sole advertiser. It's no coincidence his website looks like an AMD ad." - Ken Brown, nVidia PR
Charlie has it in for Nvidia big time.
Let's wait and see until the press get a Fermi and the real tech websites run the real tests.
I read elsewhere that dual Fermi will happen about a month after regular release.
If the card is good enough I will buy two regular Fermis rather than a dual card.
Corsair does not recommend using 2x ATI 5970s on their 1000W PSU. So an upgrade path to dual-chip cards from ATI or Fermi isn't really going mainstream, right? A single 5970 is just equal to 2x 5850; a dual Fermi will just take a similar approach.
So dual-card-wise the 5970 is a massive 31 cm or so; a regular GF100 won't be that big.
You know how much an ATI 4870X2 uses at full load? 270W. I have one. So if the high-end Fermi is, and I stress if it is, a 280W pull (and that was the "demo" card), its performance is still going to beat the 4870X2 up big time, and it will release at the same price the 4870X2 did a while back. You don't even want to know what a dual CrossFire 2900XT setup used. At the end of the day, in this particular case it's best to wait and see. Even if I personally go 2x SLI with top-end Fermis, I'm gonna be looking at a 400W pull easy.
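As an aside, here is a back-of-the-envelope PSU budget for a 2x SLI setup, using the wattage figures quoted in this thread. The card figure is the rumoured 280W board power (a worst case; actual gaming draw per card is usually lower), and the rest-of-system figure and 80% loading guideline are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope PSU budget for a 2x SLI setup.
# CARD_LOAD_W uses the rumoured 280 W board-power figure from the article above
# (a worst case; actual gaming draw per card is typically lower).
# REST_OF_SYSTEM_W and the 80% loading guideline are illustrative assumptions.

CARD_LOAD_W = 280          # rumoured high-end Fermi board power (worst case)
NUM_CARDS = 2
REST_OF_SYSTEM_W = 250     # assumed: overclocked quad-core CPU, board, RAM, drives, fans
PSU_RATED_W = 1000         # e.g. the Corsair 1000 W unit mentioned above
PSU_SAFE_FRACTION = 0.8    # rule of thumb: keep sustained load under ~80% of rating

total_load = CARD_LOAD_W * NUM_CARDS + REST_OF_SYSTEM_W
headroom = PSU_RATED_W * PSU_SAFE_FRACTION - total_load

print(f"Estimated worst-case sustained load: {total_load} W")
print(f"Headroom under the 80% guideline: {headroom:.0f} W")
```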
Charlie can't handle the new GF doing better tessellation and better performance, and can't handle not being invited to proper press events. He even says the die is massive; it is smaller than the GT200 series lol. Bloke has big issues.
Well, he's clearly biased against nVidia, but he's also happened to be right about everything he's said from bumpgate on, especially the Fermi progression. The clear bias makes it a little harder to take him at face value, but he is pretty accurate in his info.
They still haven't said how big the die is on the GF100, so you can't call him wrong on that yet. But the board manufacturers have already said it's going to be a really big chip; whether that means bigger or smaller than GT200, I dunno.
But I'm definitely a wait-and-see guy. The only thing I've really cared about is the release date predictions, and he's the only one that's been nailing those; being told back in September to wait for Fermi because it was supposed to hit 'next month' was just bullshit.
charlie takes a lot of flak but he has been right about a lot of stuff this past year.
Yea, and contrary to what some fanbois will say, he definitely knows his tech too. His bumpgate exposé was a very enlightening and interesting read. The people who trash his work aren't even in the same realm of knowledge.
nVidia needs to make proper GPUs and stop grasping at straws; this GPU cannot and will not compete, in my opinion.
"Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience"
CS Lewis
Even a single Fermi takes so much power that making a dual Fermi without massive downclocking is impossible. So a dual card isn't viable, considering the single-card cap in the PCIe standard is 300 watts.
Fermi die size = over 500 mm² at 40 nm
Radeon 5870 = 334 mm² at 40 nm
GT200 = 576 mm² at 65 nm
And some facts:
All demoed Fermis have been liquid cooled -> clear cooling problems.
Recent press conference -> no information about models, clocks or exact features -> they haven't been decided yet, i.e. the cards are far from release.
The performance you hinted at is still a myth; only Nvidia's own slides are out, which can never be taken seriously.
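To put those die areas in perspective, here is a rough, purely illustrative sketch of why a ~500 mm² die is a manufacturing worry: candidate dies per 300 mm wafer plus a simple Poisson yield model. The defect density is an assumed value for illustration, not a known TSMC figure.

```python
# Why die area matters: dies-per-wafer and yield for the areas quoted above.
# The defect density is an assumed, illustrative value, not a known TSMC figure.
from math import pi, sqrt, exp

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_MM2 = 0.004   # assumption (~0.4 defects/cm^2) for illustration only

def dies_per_wafer(die_area_mm2):
    """Standard approximation: usable wafer area minus an edge-loss correction."""
    d = WAFER_DIAMETER_MM
    return int(pi * (d / 2) ** 2 / die_area_mm2 - pi * d / sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2):
    """Simple Poisson yield model: fraction of dies with zero killer defects."""
    return exp(-DEFECT_DENSITY_PER_MM2 * die_area_mm2)

for name, area in [("GF100 (rumoured)", 500), ("Radeon 5870 (Cypress)", 334)]:
    n = dies_per_wafer(area)
    y = poisson_yield(area)
    print(f"{name}: ~{n} dies/wafer, ~{y:.0%} yield, ~{n * y:.0f} good dies")
```

Under these assumptions the bigger die gets both fewer candidates per wafer and a lower yield, which is the core of the cost argument, whatever the real numbers turn out to be.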
You do realize that ATI's 5970 dual-card offering is downclocked, right? I believe it uses something like 294W and is downclocked from 5870 x2 clocks to 5850 x2 clocks. Nvidia will probably do the exact same thing as ATI: downclock the card to meet the PCIe standard, then let people OC it over the 300W limit.
As for the "cooling" problems, that isn't a fact at all. It could be for a bunch of different reasons, like not wanting a gigantic heatsink in the case that detracts from the Fermi cards. Has there even been confirmation from a reliable source that the card runs hot? Most of what I've seen is 2nd- and 3rd-hand crap. Even the "rumor" states that 1-2 cards are FINE but after that you need extra cooling. WELL DUH. Who runs Tri/Quad SLI and doesn't take cooling into consideration?
By models do you mean the GF100 and GF104, or the GTX 360 and 380? There is obviously tons of information on what features the cards have, like a ton more geometry performance; just go read all the information that came out. Yes, there aren't clocks or prices yet, and those will be firmed up closer to launch in March.
As for performance, I wouldn't say it is a myth. There is obviously one video of it running Far Cry 2 vs the 285. Of course, until reviewers get them and bench them you have to take it all in stride.
Yea, but the 5870 was already well below the 300W limit, so it only took a little underclocking to get heat and power draw into the acceptable range. If everything is true, Fermi would take such a massive underclock to get power and heat low enough to dual them that the performance result would be worthless.
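For what it's worth, here is a crude sketch of the downclocking arithmetic both posts are arguing about. The single-card figure is the rumoured number from the article above; the shared-board overhead and the power-versus-clock scaling model are illustrative assumptions, not measurements.

```python
# How far would a dual-GPU board have to back off to fit under the 300 W PCIe limit?
# SINGLE_CARD_W is the rumoured figure from the article above; SHARED_BOARD_W and
# the scaling model (P ~ f * V^2, with voltage trimmed alongside frequency, so
# roughly P ~ f^3) are assumptions for illustration only.

SINGLE_CARD_W = 280        # rumoured single-GF100 board power
SHARED_BOARD_W = 30        # assumed: fan, VRM and memory overhead not fully doubled
PCIE_LIMIT_W = 300

gpu_power_each = SINGLE_CARD_W - SHARED_BOARD_W          # power attributable to one GPU
budget_per_gpu = (PCIE_LIMIT_W - SHARED_BOARD_W) / 2     # allowance per GPU on a dual card

clock_fraction = (budget_per_gpu / gpu_power_each) ** (1 / 3)

print(f"Each GPU must drop from ~{gpu_power_each:.0f} W to ~{budget_per_gpu:.0f} W")
print(f"Roughly a {1 - clock_fraction:.0%} clock reduction under this crude model")
```

Whether the resulting clocks are "worthless" or merely 5970-style conservative is exactly what the two sides here disagree on.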
The previous Nvidia GT200 dual card was massively downclocked too, nothing new there. The point is that Fermi would have to be downclocked so much that there would be no point in making a dual-GPU card. Even a single Fermi goes very close to the 300-watt limit.
Have you even seen any other card marketed only with liquid coolers? I haven't.
Like I said, no information regarding clocks and shader counts; far from ready.
A video made by Nvidia's promotion team is no video at all.
Thing is, Charlie knows that a regular Fermi will compete against a regular 5870 and, by all accounts, better it.
So what is the next thing he moves to? Comparing it against the dual-GPU-in-one 5970. What is the die size of that card? It is 2x 334 mm². Is that a fair comparison, dual against single? I think not, but he is just looking at it from a performance perspective.
Wasn't the draw for a 5970 widely reported at 384W pre-launch, with ATI engineers scrambling to get it under 300W? So why can't the same be done with Fermi? Just because it has a larger die size to begin with, everything beyond that is guesswork, so "it can't" is a pure guess. More reputable sites state that the GF100 draw could be 220 to 280W.
Everyone knew a 5970 would not be 2x 5870 because of the changes needed to get it under 300W. It is the largest card in existence right now. The GF100s demoed are the same length as a GTX 285; bumping the size up for a dual card and cutting things back would still leave it in a competitive position vs a 5970.
There was no water cooling on the GF100 on the main demo stage (http://www.youtube.com/watch?v=gkI-ThRTrPY), just the H50 on the CPU.
As for performance, there are a few sites out there re-doing the 5870 comparison and getting similar results to Nvidia's.
Obviously we will have to wait for the Fermi numbers to be verified externally.
Fermi is being compared to the 5970 because they are in the same price class. I didn't say a dual Fermi would be impossible, just not sensible at all with the current manufacturing process.
I understand what you're saying about them being close to 300W already, but the business side of me says, "Why would Nvidia be making them if they can't be made?" They obviously found a way past it, since they have plans to release them after the March launch.
Fermi isn't marketed as solely water cooled.
media.bestofmicro.com/M/9/235377/original/fg100_small.jpg
That's a single FERMI at CES on AIR. The Tri-SLI one was liquid cooled
www.pcgameshardware.com/aid,702823/Nvidia-Fermi-GF100-Triple-SLI-and-mass-production/News/&menu=browser&image_id=1236924&article_id=702823&page=1&show=n
Also one reason they probably haven't released clocks yet is because Fermi doesn't really run the way previous cards have.
"The core clock has been virtually done away with on GF100, as almost every unit now operates at or on a fraction of the shader clock. Only the ROPs and L2 cache operate on a different clock, which is best described as what’s left of the core clock. The shader clock now drives the majority of the chip, including the shaders, the texture units, and the new PolyMorph and Raster Engines. Specifically, the texture units, PolyMorph Engine, and Raster Engine all run at 1/2 shader clock (which NVIDIA is tentatively calling the "GPC Clock"), while the L1 cache and the shaders themselves run at the full shader clock."
Lastly those videos are not made by the NVidia team. The Nvidia team held an editors day shortly after CES and that is where the videos came from. There are multiple videos from different angles of different qualities on different sites. Is it air tight proof of performance? No but it is a decent gauge at this point and shows the performance certainly isn't a Myth. But by all means cook up conspiracy theories about how it was actually Nvidia employees filming and the editors were given the videos and paid off to keep their mouths shut!
All I can say is the proof is in the pudding. Come March we will know exactly what Fermi can and cannot do, as well as what Nvidia can or cannot manufacture. I have faith in Nvidia; they have yet to put out a graphics card that disappointed me or anyone I know. I cannot say the same for ATI, though, so that is why I am waiting. If the price/performance of Fermi is competitive I will buy one; if the ATI offerings are on par but a lot cheaper, then I might pick up one of those.
By all means be skeptical but don't let those team red tinted sunglasses alter what you see :P.
Cards have had many different clocks before Fermi too. A couple more won't make any difference; cards are still sold with clear GPU/memory/shader numbers.
The point is that Nvidia created the environment for those performance tests. Manufacturers always create situations where their products perform best. A third party is required for a reliable test.
For my part, I expect Nvidia to produce their biggest failure in a long while: a huge, expensive chip which will hardly be any faster than its competitor (if at all, we shall see). Fact is, Fermi can't compete against the 5k series unless it proves to be much faster.
And as for fanboyism, you won't find any of that in me. Before this message I hadn't said a single word about ATI. Everything is strictly speculation based on the hardware.
Intel had a big mouth! One problem it's facing is IBM (AMD's ally), and IBM isn't as dead as Intel would like. IBM sold off some businesses so they could concentrate on what they do best and think about future tech, and they are succeeding a lot faster than Intel thought. Larrabee was a good idea, but by the time it's ready ATI will still be in the lead by at least 6 months, so Intel will probably have to change its plan and make a graphics chip the size of a processor die instead, so you'll have one die for the processor and one die for the graphics chip.
Since the Larrabee idea won't be advanced enough to compete in ATI's high-end graphics card market, Intel won't launch Larrabee and will concentrate on the i3 and i5 versions instead. It looks like Intel will skip this generation too, since their high-end design can't even compete with the top Nvidia (295); against ATI's 5970 Intel isn't even there (two years behind). And AMD/ATI will launch their 32 nm version at the start of next year at the latest. ATI will probably concentrate on fine-tuning the software for their DX11 graphics cards, since they have such a huge lead, which will make the 5970 look even faster once ATI polishes the drivers for it.
Nvidia was the closest, and Nvidia can't even manufacture last year's chip size, imagine that! Forget it, the future 32 nm ATI is getting ready, and IBM (AMD/ATI's ally) is working on the next gen, 22 nm, and by the look of it IBM's 22 nm development is going very smoothly. So Intel will have to work very hard in the next 5 years just to equal ATI in graphics cards. And AMD is probably working very closely with IBM to develop the next gen of their processors. True, IBM is the one really pushing AMD on the technology front, and this should scare Intel, since IBM has had some of the greatest ideas in the past, and the fact that IBM came back to what they do best won't help Intel one bit! IBM is the sleeping dragon!
Uhm, IBM really doesn't take sides. They have been the main company forcing competition. Since most tech is made by IBM today, tech companies agree to whatever IBM says in order to use their patents. In this respect Intel and AMD are on the same playing field.
Don't feed the troll.
Especially if the troll doesn't know what they're talking about.