www.anandtech.com/video/showdoc.aspx
I didn't leave out any letters, there's a single GPU driving all of these panels. The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed.
nvidia sure got told!
Comments
Meh... Give it a week. The big N will do something to this effect as well...
Also, I don't care if it runs 6 displays, I'm not buying another ATI card again. Worst. Computer-related investment. Ever!
ATI can do this because they have so many more processing units. Even if you split their processing units six ways, each slice would still be above the rate a lot of developers plan for. Personally, I don't like the look of six monitors stacked on each other; the divide between monitors takes something out of it. I would rather mod the monitors so that divide is gone with some good old-fashioned soldering. The power draw would also compete with your computer's: that's about 240 watts when it's all turned on, unless they used LED-backlit models.
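For what it's worth, the 240 W figure is easy to reproduce if you assume roughly 40 W per panel; the per-monitor draw here is my guess at a typical CCFL-backlit figure, not a measured number:

```python
# Rough power estimate for a six-panel setup.
# The ~40 W per-monitor draw is an assumption (typical for a CCFL-backlit
# panel of that era); LED-backlit models would pull noticeably less.
MONITORS = 6
WATTS_PER_MONITOR = 40

total_watts = MONITORS * WATTS_PER_MONITOR
print(f"Approximate display power draw: {total_watts} W")  # 240 W
```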
I could never play on something like that. Having the image broken up like that just looks so tacky.
--------
"Chemistry: 'We do stuff in lab that would be a felony in your garage.'"
The most awesomest after school special T-shirt:
Front: UNO Chemistry Club
Back: /\OH --> Bad Decisions
agreed
"Good people are good because they've come to wisdom through failure. We get very little wisdom from success, you know." William Saroyan
The largest monitor you can buy is 46". Could you imagine playing on 9 of those using a crossfire setup? That would be the equivalent of a 138" screen. 9'2" wide, 6'11" tall. In full crossfire modes you have 24 possible monitors. Its actually a pretty cheap way to get a large screen that is effective even in lit environments.
I'd love to see Crysis do as much on that bastard.
It would prolly explode!
ahhh the smell of crysis in the morning. ..... crispy graphics cards for breakfast. mmmm....
I think some of you are missing the point here. Taken from the article:
The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed. I played Dirt 2, a DX11 title at 7680 x 3200 and saw definitely playable frame rates. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.
We now have a video card capable of high frame rates at an astonishing 7680 x 3200 resolution. He also states the intended end use isn't six monitors; they just used that setup to show what the GPU is capable of.
With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to provide gamers (and others in need of a single, high resolution display) the ability to piece together a display that offered more resolution and was more immersive than anything on the market today. The idea isn't to pick up six 30" displays but perhaps add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (ideal for first person shooters apparently, since you get a nice wrap around effect) still nets you a 8400 x 1050 display.
I get the point of it, and it is impressive compared to my itty bitty 32" TV, but I still prefer my TV: six huge monitors synced at high resolution and high frame rate is great, but the image is still broken up.
--------
"Chemistry: 'We do stuff in lab that would be a felony in your garage.'"
The most awesomest after school special T-shirt:
Front: UNO Chemistry Club
Back: /\OH --> Bad Decisions
lol, nvidia fanboys ftl.
I finally switched over to ATI after ~7 years of Nvidia and I couldn't be happier.
Nvidia is still years behind ATI, believe it or not.
www.youtube.com/watch
A single ATI HD 5870 2GB can run the CryEngine 3 tech demo under Direct3D 11 at 5760 x 1200 at max settings across three displays, and it looks like it's getting some impressive frame rates.
"The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed."
I doubt they tried it in Dalaran. They would probably unmount their six-screen setup and go home, waiting for an ATI 8870 to arrive.
meh, I'm still waiting for a neural interface, or at least a decent VR headset with accelerometers tied directly to the gfx engine.
"Good? Bad? I'm the guy with the gun."
Nifty and all, but an exercise in excess.
I found the perfect monitors to use this on: three 46" LG 8000 seamless LED TVs. No border.
Dalaran is going to be fixed when Microsoft releases their toy next year. Once the Windows 7 launch is finished, devs will be free to finish what they showed with the UT build MS modified to test that toy (if it doesn't just stay a Windows or Xbox 360 exclusive; I would cry if that happened).
Mmm, can I have my screen wrapped around my field of view? That would be perfect (OK, too heavy, lol).
I always thought Nvidia was the better graphics card company?
Yes and no. The two have gone back and forth a lot over the past few years. My money is on ATI, for the simple fact that they have all of AMD's R&D at their disposal. That's my $.02.
Played Aoc/DDO/FFXI/WAR / LoTRo / CO / Aion
Playing Rift
Waiting for FFXIV to be the game it should be. So sad =(
The better company is the one that has better products at the moment, whether in absolute performance, performance per dollar, performance per watt, the features you want, or whatever criteria matter to you. That changes as time passes. The general consensus seems to be that in the days of the Radeon 9700, ATI was far superior at the high end, while in the days of the GeForce 8800 GTX, Nvidia was far superior at the high end.
If you want to buy a new, high end card in the next month or so, it's all but guaranteed that ATI will be far superior, as the Radeon HD 5870 clobbers anything Nvidia has to offer, and that won't change until Nvidia releases something new. But when Nvidia does release something new, that might well change.
Even so, that doesn't necessarily even matter if you want a new video card, but are looking for something cheaper. If you want a card for $100-$200, the Radeon HD 5870 won't matter because it's out of your price range. ATI will give you better performance per dollar at the moment, but there are other factors, such as the high idle power consumption of ATI's cards in that range (which was the glaring flaw of the Radeon 4000 series, and fixed in the Radeon 5000 series). That will probably change when ATI releases Juniper to dominate that price segment, and then likely change again when Nvidia releases the derivative cards from the GT 300 line.
For that matter, for performance in a given price segment, which company is better can change if the cards stay the same and someone cuts prices. If Nvidia were to start selling GTX 275s at retail for $120 each, that would destroy anything ATI has at that price. Of course, Nvidia won't do that because it would mean losing money on every card they sell. Indeed, the very reason why ATI has better performance per dollar in the $100-$200 range right now is that they cut prices further than Nvidia from when they were about even several months ago.
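To make that price-cut point concrete, here's a toy performance-per-dollar comparison. The card names are real, but the performance indices and prices are invented placeholders, not benchmark data:

```python
# Toy illustration of how a price cut flips a performance-per-dollar ranking.
# The performance indices and prices below are invented placeholders,
# not benchmark results.
cards = {
    "Radeon HD 4890":  {"perf": 100, "price": 180},
    "GeForce GTX 275": {"perf": 105, "price": 230},
}

def perf_per_dollar(card):
    return card["perf"] / card["price"]

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card):.3f} perf/$")

# Hypothetical price cut: drop the GTX 275 to $120 and it immediately leads
# on perf/$ -- which is exactly why neither vendor prices below cost for long.
cards["GeForce GTX 275"]["price"] = 120
print(f"After cut: {perf_per_dollar(cards['GeForce GTX 275']):.3f} perf/$")
```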