Anyone else notice that nearly all the tech sites used the same test setup: an Asus Crosshair V motherboard and an Nvidia GTX 570 or 580? That setup might be part of the performance problem, since results were better when using AMD GPUs.
Personally, I'm disappointed... I regret telling people to wait and see how BD turned out. I honestly wasn't expecting Phenom II to beat BD. I wanted to wait so I could see how BD performed, but that was my own time; telling others to wait was wasting theirs :( I apologize. I was honestly expecting something either on par with the current-gen i7 or somewhere between an i7 and an i5.
The biggest disappointment was the IPC; it didn't even keep up with Phenom II. As quizzy said, technically speaking it may vindicate itself, but not within the useful lifetime of this current-gen chip :/
On the bright side, they should have the fixes in for Trinity :/ but who's to say they aren't stuck on the same 32 nm node instead of 28 nm :( I think BD failed because the designers were expecting it to be built on the 28 nm node. If they had been able to release the chip at a 4.5 GHz stock speed, it might have been what the designers intended it to be, and competitive.
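Some rough back-of-the-envelope math on that (the IPC figures below are made up, purely to illustrate the trade-off between IPC and clock speed):

```python
# Back-of-the-envelope: single-thread performance ~ IPC x clock.
# The IPC figures are invented round numbers, not measured values.
def perf(ipc, ghz):
    return ipc * ghz

phenom_ii = perf(ipc=1.00, ghz=3.7)   # baseline (roughly 1100T turbo clocks)
fx_8150   = perf(ipc=0.85, ghz=3.6)   # lower IPC, barely higher clock
fx_4_5ghz = perf(ipc=0.85, ghz=4.5)   # the hypothetical 4.5 GHz stock part

for name, p in [("Phenom II", phenom_ii), ("FX-8150", fx_8150), ("FX @ 4.5 GHz", fx_4_5ghz)]:
    print(f"{name}: {p / phenom_ii:.2f}x")
# Phenom II: 1.00x / FX-8150: 0.83x / FX @ 4.5 GHz: 1.03x
```

So at least on paper, a 4.5 GHz stock clock really could have papered over a ~15% IPC deficit.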
Oh well... I guess it's a 2500K from Newegg for me. I REALLY hope Southern Islands doesn't get delayed too much.
Anyone else notice that nearly all the tech sites used the same test setup: an Asus Crosshair V motherboard and an Nvidia GTX 570 or 580? That setup might be part of the performance problem, since results were better when using AMD GPUs.
AMD shipped that motherboard with the processor, basically saying, "we guarantee that this one will work." Reviewers didn't want to use some other Socket AM3+ motherboard and have their review spoiled by a BIOS glitch that wouldn't be fixed until the day before launch.
Some reviewers worry that using an AMD video card will bias processor tests toward AMD, or at least that their readers may think it will. Those fears are mostly unfounded if you're not using an APU, but a discrete Nvidia card eliminates them. They use a high-end video card to avoid being video-card limited, and thus get a better measure of the processor.
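A toy model makes the point; the frame times here are invented purely for illustration:

```python
# Toy model of a game frame: the CPU prepares each frame and the GPU
# renders it, so the slower of the two stages sets the frame rate.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# With a midrange GPU (20 ms/frame), a fast and a slow CPU look identical:
print(fps(cpu_ms=8, gpu_ms=20), fps(cpu_ms=12, gpu_ms=20))  # 50.0 50.0

# With a high-end GPU (6 ms/frame), the CPU difference finally shows:
print(fps(cpu_ms=8, gpu_ms=6), fps(cpu_ms=12, gpu_ms=6))    # 125.0 ~83.3
```

If the review is GPU-limited, every processor posts the same number and the benchmark tells you nothing about the CPU.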
On the bright side, they should have the fixes in for Trinity :/ but who's to say they aren't stuck on the same 32 nm node instead of 28 nm :( I think BD failed because the designers were expecting it to be built on the 28 nm node.
No one is making a 28 nm process node suitable for high-end x86 processors. Use a process node meant for GPUs and maybe the processor can't clock over 3 GHz. Trinity is going to be on the same 32 nm node as Zambezi. It will have next-generation "Piledriver" cores that fix whatever went wrong in the Bulldozer cores, however. Even so, AMD won't be able to fix Windows 7's scheduling issues.
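For what it's worth, the usual stopgap for the scheduling issue is manual thread affinity. A sketch, assuming psutil is installed and that Bulldozer's logical cores pair up as modules (0-1, 2-3, 4-5, 6-7); check your actual topology before trusting it:

```python
# Bulldozer workaround sketch: each module pairs two "cores" that share a
# front end and FPU. Windows 7's scheduler doesn't know this, so it may
# stack two threads on one module while another sits idle. Pinning a
# lightly threaded app to one logical core per module sidesteps that.
# The even/odd core numbering below is an assumption, not a guarantee.
import psutil

p = psutil.Process()             # this process; pass a PID to target another
p.cpu_affinity([0, 2, 4, 6])     # one logical core per module (assumed layout)
print(p.cpu_affinity())          # verify: [0, 2, 4, 6]
```

Spreading across modules gives each thread a full front end and FPU at the cost of sharing; whether it wins depends on the workload, and it does nothing for the IPC deficit itself.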
Huge disappointment. I have been a fairly AMD/ATI-oriented PC builder for a while now, but I will probably be switching to Intel processors at least, perhaps even switching over to Nvidia graphics.
I expected a lot more out of the Zambezi release, especially since AMD talked it up like it was going to be the shit. Turns out it is just shit.
Only AMD fanbois need apply, and I'm certainly no longer one of them.
Huge disappointment. I have been a fairly AMD/ATI-oriented PC builder for a while now, but I will probably be switching to Intel processors at least, perhaps even switching over to Nvidia graphics.
Why switch to Nvidia graphics because you don't like an AMD processor? That makes even less sense than buying Zambezi anyway just because Intel's integrated graphics are so bad. After all, Sandy Bridge as a GPU is worse than Zambezi as a CPU.
I expected a lot more out of the Zambezi release, especially since AMD talked it up like it was going to be the shit. Turns out it is just shit.
Only AMD fanbois need apply, and I'm certainly no longer one of them.
Yeah, there was definitely a lot of hype behind BD, and I'm disappointed that it didn't even keep up with the 1100T. If you're comparing apples to apples, the 1100T should be going up against the 6-core BD in the reviews, not the 8-core. It's unsettling that all the optimization for overclocking, plus being on a 32 nm node, didn't even help it keep up with its MUCH older sibling.
HOWEVER, I'm not writing ATI... I mean AMD... off completely :D Southern Islands is just a die shrink, and that should offer a predictable amount of performance increase, so I'll stick with ATI graphics for now. Nvidia needs to get its performance per watt up to par before I'll look at another Nvidia card.
Expected. ATI and AMD can't keep at being good once an accountant steps in and tries cost cutting. It probably can't be fixed, since a lot of stuff was cut to save money. I give them a month to speak up; after that it will be too late. It's a shame: they were finally on the 32 nm process and they blew their chance because of an accountant. Very sad.