One ATI GPU runs six 30" Dell monitors!

frozenvoid Member Posts: 40

 

www.anandtech.com/video/showdoc.aspx

 

I didn't leave out any letters; there's a single GPU driving all of these panels. The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed.

 

Nvidia sure got told!

 


Comments

  • soap46 Member Posts: 169

    Meh...  Give it a week.  The big N will do something to this effect as well... 

     

    Also, I don't care if it runs 6 displays; I'm not buying another ATI card again.  Worst. Computer-related investment.  Ever!

  • Cleffy Member Rare Posts: 6,414

    ATI can do this because they have so many more processing units. Even if you split those processing units six ways, each slice would still be above the performance level a lot of developers plan for.  Personally, I don't like the look of six monitors stacked on each other; the divide between monitors takes something out of it.  I would rather mod the monitors so that divide is gone, using good old-fashioned soldering.  Also, the power draw would compete with your computer's: that's about 240 watts with all six turned on, unless they used LED-backlit models.
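
    A minimal sketch of that wattage estimate, in Python (the per-panel figure is my assumption; the post only gives the 240 W total):

        # Rough power estimate for a six-panel wall.
        # Assumption: ~40 W per 30" CCFL-backlit panel (not stated in the post);
        # LED-backlit panels would draw noticeably less.
        PANELS = 6
        WATTS_PER_PANEL = 40

        total_watts = PANELS * WATTS_PER_PANEL
        print(f"{PANELS} panels x {WATTS_PER_PANEL} W = {total_watts} W")  # 240 W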

  • mrcalhou Member Uncommon Posts: 1,444

    I could never play on something like that. Having the image broken up like that just looks so tacky.

    --------
    "Chemistry: 'We do stuff in lab that would be a felony in your garage.'"

    The most awesomest after school special T-shirt:
    Front: UNO Chemistry Club
    Back: /\OH --> Bad Decisions

  • xiirot Fallen Earth Correspondent, Member Posts: 328
    Originally posted by mrcalhou


    I could never play on something like that. Having the image broken up like that just looks so tacky.

     

    agreed

    "Good people are good because they've come to wisdom through failure. We get very little wisdom from success, you know." William Saroyan

  • Cleffy Member Rare Posts: 6,414

    The largest monitor you can buy is 46".  Could you imagine playing on nine of those with a CrossFire setup?  That would be the equivalent of a 138" screen: 9'2" wide, 6'11" tall.  In full CrossFire mode you have 24 possible monitors.  It's actually a pretty cheap way to get a large screen that holds up even in lit environments.
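
    A quick check of that math (assuming 4:3 panels, which is the aspect ratio that makes the post's figures line up; with 16:9 TVs the wall would be wider and shorter, though a 3x3 grid's diagonal is still 3 x 46" = 138"):

        # Equivalent size of a 3x3 grid of 46" panels, ignoring bezels.
        # Assumption: 4:3 aspect ratio (inferred from the post's numbers).
        import math

        DIAG, W, H = 46, 4, 3                       # diagonal (inches), aspect ratio
        unit = DIAG / math.hypot(W, H)              # inches per ratio unit
        width, height = 3 * W * unit, 3 * H * unit

        print(f"wall: {width/12:.1f} ft x {height/12:.1f} ft")  # ~9.2 ft x ~6.9 ft
        print(f"diagonal: {math.hypot(width, height):.0f} in")  # 138 in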

  • remixedcat Member Posts: 27

    I'd love to see Crysis do as much on that bastard.

     

    It would prolly explode!

     

    Ahhh, the smell of Crysis in the morning... crispy graphics cards for breakfast. Mmmm...

  • frozenvoid Member Posts: 40

    I think some of you are missing the point here. Taken from the article:

     

    The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed. I played Dirt 2, a DX11 title at 7680 x 3200 and saw definitely playable frame rates. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.

     

    We now have a video card capable of high frame rates at an astonishing 7680x3200 resolution. He also states that six monitors were never the intended end use; they just use it to show what the GPU is capable of.

    With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to provide gamers (and others in need of a single, high resolution display) the ability to piece together a display that offered more resolution and was more immersive than anything on the market today. The idea isn't to pick up six 30" displays but perhaps add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (ideal for first person shooters apparently, since you get a nice wrap around effect) still nets you a 8400 x 1050 display.
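
    The arithmetic behind those combined resolutions, for anyone checking (panel resolutions are from the article; the grid shapes are the standard 3x2 and 5x1 Eyefinity arrangements):

        # Combined resolution and pixel count for the Eyefinity layouts above.
        layouts = {
            '3x2 grid of 30" 2560x1600 panels': (3, 2, 2560, 1600),
            "5x1 row of 1680x1050 panels":      (5, 1, 1680, 1050),
        }
        for name, (cols, rows, w, h) in layouts.items():
            total_w, total_h = cols * w, rows * h
            print(f"{name}: {total_w}x{total_h}, {total_w * total_h / 1e6:.1f} MP")
        # 7680x3200 is 24.6 MP, six times a single 2560x1600 panel as expected;
        # 8400x1050 is 8.8 MP.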

     


  • mrcalhou Member Uncommon Posts: 1,444

    I get the point, and it is impressive compared to my itty-bitty 32" TV, but I still prefer my TV to six huge monitors synced like that; the high resolution and high frame rate don't make up for the image being broken up.

    --------
    "Chemistry: 'We do stuff in lab that would be a felony in your garage.'"

    The most awesomest after school special T-shirt:
    Front: UNO Chemistry Club
    Back: /\OH --> Bad Decisions

  • Krilster Member Posts: 230
    Originally posted by soap46


    Meh...  Give it a week.  The big N will do something to this effect as well... 
     
    Also, I don't care if it runs 6 displays; I'm not buying another ATI card again.  Worst. Computer-related investment.  Ever!

     

    lol, Nvidia fanboys ftl.



    I finally switched over to ATI after ~7 years of Nvidia, and I couldn't be happier.



    Nvidia is still years behind ATI, believe it or not.


  • frozenvoid Member Posts: 40

    www.youtube.com/watch

     

    A single ATI HD 5870 2GB can run the CryEngine 3 tech demo under Direct3D 11 at 5760x1200 (three 1920x1200 displays) at max settings, and it looks like it's getting some impressive frame rates.


  • Volgore Member Epic Posts: 3,872

    "The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed."

     

    I doubt they tried it in Dalaran. They would probably unmount their six-screen setup and go home, waiting for an ATI 8870 to arrive.

  • heremypet Member, Newbie Common Posts: 528

    meh, I'm still waiting for a neural interface, or at least a decent VR headset with accelerometers tied directly to the gfx engine.

    "Good? Bad? I'm the guy with the gun."

  • neoterrar Member Posts: 512

    Nifty and all, but an exercise in excess.

  • Cleffy Member Rare Posts: 6,414

    I found the perfect monitor to use this on: three 46" LG 8000 seamless LED TVs.  No border.

  • drbaltazar Member Uncommon Posts: 7,856
    Originally posted by Volgore


    "The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed."
     
    I doubt they tried it in Dalaran. They would probably unmount their six-screen setup and go home, waiting for an ATI 8870 to arrive.

    Dalaran is gonna be fixed when Microsoft releases their toy next year. Once the Windows 7 launch is finished, the devs will be free to finish what they showed with UT as modified by MS to test their toy.

    (If it doesn't just stay Windows-only or an Xbox 360 game.) I would cry if that happened.

  • drbaltazar Member Uncommon Posts: 7,856
    Originally posted by heremypet


    meh, I'm still waiting for a neural interface, or at least a decent VR headset with accelerometers tied directly to the gfx engine.

    Mm, can I have my screen wrapped around my field of view? That would be perfect (OK, too heavy, lol).

  • havok527 Member Posts: 80

    I always thought Nvidia was the better graphics card company?

  • Zandora2018 Member Posts: 240
    Originally posted by havok527


    I always thought Nvidia was the better graphics card company?

     

    Yes and no. Both have gone back and forth a lot over the past few years. My money is on ATI for the simple fact that they have all of AMD's R&D at their hands. That's my $.02.

    Played AoC / DDO / FFXI / WAR / LotRO / CO / Aion
    Playing Rift

    Waiting for FFXIV to be the game it should be. So sad =(

  • Quizzical Member Legendary Posts: 25,531
    Originally posted by havok527


    I always thought Nvidia was the better graphics card company?



     

    The better company is the one that has better products at the moment, whether in absolute performance, performance per dollar, performance per watt, the features you want, or whatever criteria matter to you.  That changes as time passes.  The general consensus seems to be that in the days of the Radeon 9700, ATI was far superior at the high end, while in the days of the GeForce 8800 GTX, Nvidia was far superior at the high end.

    If you want to buy a new, high end card in the next month or so, it's all but guaranteed that ATI will be far superior, as the Radeon HD 5870 clobbers anything Nvidia has to offer, and that won't change until Nvidia releases something new.  But when Nvidia does release something new, that might well change.

    Even so, that doesn't necessarily even matter if you want a new video card, but are looking for something cheaper.  If you want a card for $100-$200, the Radeon HD 5870 won't matter because it's out of your price range.  ATI will give you better performance per dollar at the moment, but there are other factors, such as the high idle power consumption of ATI's cards in that range (which was the glaring flaw of the Radeon 4000 series, and fixed in the Radeon 5000 series).  That will probably change when ATI releases Juniper to dominate that price segment, and then likely change again when Nvidia releases the derivative cards from the GT 300 line.

    For that matter, for performance in a given price segment, which company is better can change if the cards stay the same and someone cuts prices.  If Nvidia were to start selling GTX 275s at retail for $120 each, that would destroy anything ATI has at that price.  Of course, Nvidia won't do that because it would mean losing money on every card they sell.  Indeed, the very reason why ATI has better performance per dollar in the $100-$200 range right now is that they cut prices further than Nvidia from when they were about even several months ago.
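
    A toy illustration of that last point (the $120 figure is from the post; the benchmark numbers and the $230/$160 prices are made up): a price cut alone can flip the performance-per-dollar ranking.

        # Hypothetical numbers: a price cut alone flips perf per dollar.
        def perf_per_dollar(fps: float, price: float) -> float:
            return fps / price

        gtx275_fps, ati_fps = 60.0, 55.0   # made-up benchmark figures
        print(f"GTX 275 at $230:  {perf_per_dollar(gtx275_fps, 230):.2f} fps/$")  # 0.26
        print(f"ATI card at $160: {perf_per_dollar(ati_fps, 160):.2f} fps/$")     # 0.34, wins
        print(f"GTX 275 at $120:  {perf_per_dollar(gtx275_fps, 120):.2f} fps/$")  # 0.50, now wins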
