AMD Radeon RX 480 Graphics Card With Polaris 10 Leaked – 5.5 TFLOPs Compute, 8 GB GDDR5 Memory

Comments

  • filmoret Member EpicPosts: 4,906
    Malabooga said:
    filmoret said:
    Quizzical said:
    filmoret said:
    Oh nice, did you see the AMD graphics card running faster than the GTX 1080 and using half the energy?  And it cost $200 less.  Yea, they have made it clear they focused on VR graphics chips.  I saw something about them making something comparable to the i7 chips but didn't quite grasp it.  Looked like they made something better than the i7.
    A Radeon RX 480 is not faster than a GeForce GTX 1080 apart from certain corner cases.  Those corner cases include pretty much anything that's really heavy on local memory usage, which is actually a lot of non-graphical compute tasks.  But not games.

    Also, the card starts at $200, which is $500 less than $700.  So it's not $200 less.  It's a lot better than that.
    I guess they were using dual RX 480s for the demonstration, which makes me wonder if they were showing the power usage from both cards or just one of the cards compared to the 1080.  But even with dual cards it's considerably cheaper.
    It was CrossFire and they didn't show power usage, but the 480 should use 110-130W; the 1080 uses 180W-ish.
    Start the video at 36:43 to see the comparison and power usage compared to the 1080.  I believe they are using Doom as the benchmark.
  • filmoret Member EpicPosts: 4,906
    I just watched it again - oh man, that was tricky.  If you look, the game on the left is the AMD version and the graphics are toned down quite a bit.  You can see a considerable difference in quality.  Of course you can push out higher FPS with lower quality.  Blah, I don't like marketing scams.
  • Ridelynn Member EpicPosts: 7,383
    And I agree with the sentiment - if you're depending on CF/SLI to give you the speed boost, you need to be very careful, since that only works in specific instances and not the general case.

    If you are thinking of buying two 480s to beat a 1080 all the time for less money, that definitely won't be the case. They will in some specific games (such as AotS, apparently), but not as a general rule.
  • filmoret Member EpicPosts: 4,906
    Man, this has me worried.  If AMD is willing to put up a lying sales pitch for a major announcement, then what else have they lied about?  I guess I will have to sit back and just let time show what is really happening.  For those who don't know what I'm talking about: in a recent presentation they compared their dual-card setup against an Nvidia 1080.  They ran their card at considerably lower settings and claimed it was doing better when it really wasn't.  See for yourself at 37:24.  The card on the left is the AMD and the right is the 1080.
  • Malabooga Member UncommonPosts: 2,977
    They ran at the same settings - where are you getting lower settings?

    Here is the Fury X on the same settings. You can also check videos of previous NVidia cards.

    What is actually becoming apparent is what some reviewers hinted at: either NVidia natively renders less detail in DX12, or the 1080 has some texture rendering problems, because it didn't render it correctly (it's actually the 1080 that seems to render less detail), while the 480 was running details at Extreme.

    You also have these

    http://www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/0561a980-78ce-4e24-a7c3-2749a8e33aac

    http://www.ashesofthesingularity.com/metaverse#/personas/d5d8a436-6309-4622-b9f0-c9c141052acd/match-details/f00df630-16f2-49bf-a475-241209ef4128

    And there's also this:

    "Ashes of the singularity uses some form of procedual generation for its texture generation ( aswell as unit composition/behavior to prevent driver cheats) which means that every game session and bench run will have various differences in some details."

    https://www.reddit.com/r/Amd/comments/4lz6ay/anyone_else_noticed_this_at_the_amd_livestream/d3rc5hv

  • Malabooga Member UncommonPosts: 2,977
    I don't know why people make such a fuss about it. Reviews will come out and you'll judge by them; at the moment you have a general idea of how the card performs - around GTX 980 level. Whether it's less or more, only time will tell.
  • Volgore Member EpicPosts: 3,872
    Recore said:
    AMD just confirmed the price of the 480 will be $199. 

    Over 5 TFLOPS and made for VR. 

    Uhh...that makes me rethink upgrading to a 1070.
    But then I'm unfortunately among those who had problems with AMD cards in the past.
  • Quizzical Member LegendaryPosts: 25,499
    Volgore said:
    Uhh...that makes me rethink upgrading to a 1070.
    But then I'm unfortunately among those who had problems with AMD cards in the past.
    And I had problems with an Nvidia card causing random blue screens when playing Anarchy Online.  Does that mean I should recommend that no one ever buy Nvidia?

    It was a problem with an old driver, and it was immediately fixed by updating the video driver.  (And also fixed by quitting Anarchy Online, as the game wasn't very good, anyway.)  I didn't hold it against Nvidia then, wouldn't today, and even if I did, it's ancient history (GeForce 4 series--not 400 series) to the degree of being basically irrelevant by now.

    Both GPU vendors have problems from time to time.
  • filmoret Member EpicPosts: 4,906
    Malabooga said:
    They ran at the same settings - where are you getting lower settings?

    Here is the Fury X on the same settings. You can also check videos of previous NVidia cards.

    What is actually becoming apparent is what some reviewers hinted at: either NVidia natively renders less detail in DX12, or the 1080 has some texture rendering problems, because it didn't render it correctly (it's actually the 1080 that seems to render less detail), while the 480 was running details at Extreme.

    You also have these

    http://www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/0561a980-78ce-4e24-a7c3-2749a8e33aac

    http://www.ashesofthesingularity.com/metaverse#/personas/d5d8a436-6309-4622-b9f0-c9c141052acd/match-details/f00df630-16f2-49bf-a475-241209ef4128

    And there's also this:

    "Ashes of the singularity uses some form of procedual generation for its texture generation ( aswell as unit composition/behavior to prevent driver cheats) which means that every game session and bench run will have various differences in some details."

    https://www.reddit.com/r/Amd/comments/4lz6ay/anyone_else_noticed_this_at_the_amd_livestream/d3rc5hv

    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
  • Volgore Member EpicPosts: 3,872
    Quizzical said:
    Volgore said:
    Uhh...that makes me rethink upgrading to a 1070.
    But then I'm unfortunately among those who had problems with AMD cards in the past.
    And I had problems with an Nvidia card causing random blue screens when playing Anarchy Online.  Does that mean I should recommend that no one ever buy Nvidia?

    It was a problem with an old driver, and it was immediately fixed by updating the video driver.  (And also fixed by quitting Anarchy Online, as the game wasn't very good, anyway.)  I didn't hold it against Nvidia then, wouldn't today, and even if I did, it's ancient history (GeForce 4 series--not 400 series) to the degree of being basically irrelevant by now.

    Both GPU vendors have problems from time to time.
    I did not recommend that no one should buy AMD because I had some faulty cards, but of course it affects my personal decision on which card to buy.
    If somebody had to deal with 3 RMAs of, say, a Samsung drive, he probably wouldn't walk into a store after that to buy just another Samsung drive.
    I had to buy a spare video card in order to do anything with my PC during the weeks of back-and-forth RMAing.

    And I guess nobody can deny that there have been a lot of problems over various generations of AMD cards in the past, from bad driver support to faulty boards, memory, coils...and on top of that, heat generation and power consumption.
  • Malabooga Member UncommonPosts: 2,977
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400 (or under $500, as they put it, depending on whether you opt for the 8GB versions).
  • Quizzical Member LegendaryPosts: 25,499
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
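To put rough numbers on "two cheap cards vs. one fast card", here is a toy price/performance sketch. The prices come from the posts above; the GTX 1080's relative speed and the CrossFire scaling factors are illustrative assumptions, not benchmarks:

```python
# Toy price/performance model for "two slower cards vs. one faster card".
# Prices are from the thread; the 1080's relative performance (1.8x an
# RX 480) and the CrossFire scaling factors are assumptions, not benchmarks.

RX480_PRICE, RX480_PERF = 200, 1.0       # baseline: one RX 480
GTX1080_PRICE, GTX1080_PERF = 700, 1.8   # assumed relative speed

for scaling in (2.0, 1.7, 1.4, 1.0):     # 2.0 = perfect, 1.0 = no CrossFire support
    duo_perf = RX480_PERF * scaling
    print(f"2x RX 480 @ {scaling:.1f}x scaling: perf {duo_perf:.2f} for ${2 * RX480_PRICE} | "
          f"1x GTX 1080: perf {GTX1080_PERF:.2f} for ${GTX1080_PRICE}")
```

Under these assumed numbers the pair only matches a 1080 near perfect scaling; at a more typical 1.4x it falls short, and with no CrossFire support at all you have paid $400 for single-card performance.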
  • filmoret Member EpicPosts: 4,906
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    But if they get things working properly they could offer a third card to make it even faster.  So people who cannot afford the $700 card can just buy one at a time and eventually end up with the equivalent of something much better.  Then I guess Nvidia could do the same thing and you can buy two of the $700 cards.  Man, this is giving me a headache now...

    What I'm thinking is you get the two cards, and later when another card comes out you can simply upgrade one of them.  So each upgrade will only cost you about $200 instead of the $700 for each Nvidia upgrade.  So with each upgrade you replace the oldest card, and you end up with two generations of cards but it ends up being just as fast.  Then again, this probably doesn't work unless they get the interfacing drivers for such a thing.
  • Malabooga Member UncommonPosts: 2,977
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    Well, the way things are developing, GPUs are headed the same way as CPUs: increasing performance by adding more "cores" (core = 1 GPU chip). It's even rumored that next-gen consoles might feature dual GPU chips for the same reason - you get the same performance as one big chip for cheaper, if the games properly support it.

    Also, in VR, from what has been said, one (smaller) GPU per eye is preferable to a single big GPU (with the added benefit of the same performance for cheaper, just as with consoles).
  • Quizzical Member LegendaryPosts: 25,499
    filmoret said:
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    But if they get things working properly they could offer a third card to make it even faster.  So people who cannot afford the $700 card can just buy one at a time and eventually end up with the equivalent of something much better.  Then I guess Nvidia could do the same thing and you can buy two of the $700 cards.  Man, this is giving me a headache now...

    What I'm thinking is you get the two cards, and later when another card comes out you can simply upgrade one of them.  So each upgrade will only cost you about $200 instead of the $700 for each Nvidia upgrade.  So with each upgrade you replace the oldest card, and you end up with two generations of cards but it ends up being just as fast.  Then again, this probably doesn't work unless they get the interfacing drivers for such a thing.
    I don't follow you.  If you have two of card A in CrossFire/SLI, then buy one of card B in the future, in order to use it, you pull out both of card A and now just have one of card B.
  • Thomas2006 Member RarePosts: 1,152
    filmoret said:
    Oh nice, did you see the AMD graphics card running faster than the GTX 1080 and using half the energy?  And it cost $200 less.  Yea, they have made it clear they focused on VR graphics chips.  I saw something about them making something comparable to the i7 chips but didn't quite grasp it.  Looked like they made something better than the i7.
    Yeah, it was two 480s against the single 1080, but your point still stands. For less than $500 you had two cards running at 50%+ max and doing 60+ FPS with less power. Awesome.
    That is nice and all...until you look at the number of games that do not support SLI or CrossFire (just about every UE4 game, most Unity games, and a load of others). It is part of the reason Nvidia is pushing so hard to phase out SLI. Most companies and engines are just not putting the time or effort into the support, so it's a wash.
  • Quizzical Member LegendaryPosts: 25,499
    Malabooga said:
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    Well, the way things are developing, GPUs are headed the same way as CPUs: increasing performance by adding more "cores" (core = 1 GPU chip). It's even rumored that next-gen consoles might feature dual GPU chips for the same reason - you get the same performance as one big chip for cheaper, if the games properly support it.

    Also, in VR, from what has been said, one (smaller) GPU per eye is preferable to a single big GPU (with the added benefit of the same performance for cheaper, just as with consoles).
    I object to your misuse of the word "core" here.  One could argue that the GPU equivalent of a CPU core is a graphics processing cluster or a compute unit or a sub-slice or a SIMD unit or a shader (yes, I'm deliberately mixing terminology from different vendors), but it's definitely not an entire chip.

    They could theoretically put two GPU dies into a single multi-chip module the way AMD and Intel have done at times with CPUs (e.g., AMD Magny-Cours or Intel Core 2 Quad) with a ton of bandwidth to connect the two on an interposer.  But you'd need crazy amounts of bandwidth connecting the two GPU dies for it to work well.  You'd only do that if either you're pushed hard in that direction because yields on a single large die are terrible or the single die you want is larger than foundries can physically manufacture (around 600 mm^2).  Neither of those is likely to be the case in consoles, as you'd end up with a console that is way too expensive and burns way too much power.

    On your second paragraph, that's not true.  If you have one GPU handle each eye, that will likely scale better than normal CrossFire/SLI.  Maybe two of card X is then 1.7 times as good as one of card X, rather than only 1.4 times as good.  But it's still nowhere near twice as good.

    If you have one big GPU handle everything, it can do a lot of geometry computations once (loosely, entire vertex shader up through most of the tessellation evaluation shader) to see where some vertex is relative to some point, then do separate computations for each eye after that.  If you have two smaller GPUs, everything after you split the computations for each eye scales well, but everything before it has to be replicated on each GPU.
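Quizzical's geometry argument can be put into a small Amdahl's-law-style sketch. The fractions below for work that must be replicated on both GPUs versus work that splits per eye are assumed, illustrative numbers:

```python
# Back-of-the-envelope model of per-eye VR multi-GPU scaling (illustrative).
# Per frame: a geometry fraction that runs before the per-eye split (and is
# replicated on each GPU in a two-GPU setup) plus per-eye work that divides
# cleanly between the two GPUs.

def speedup_two_gpus(geometry_fraction: float) -> float:
    """Speedup of two GPUs over one identical GPU when the geometry
    portion is replicated and only the remainder is split per eye."""
    per_eye_fraction = 1.0 - geometry_fraction
    # Each GPU does all the geometry work plus half the per-eye work;
    # a single GPU takes time 1.0 for the whole frame.
    time_two_gpus = geometry_fraction + per_eye_fraction / 2.0
    return 1.0 / time_two_gpus

for g in (0.0, 0.18, 0.3, 0.5):
    print(f"replicated geometry fraction {g:.2f}: "
          f"two GPUs = {speedup_two_gpus(g):.2f}x one GPU")
```

Only a zero replicated fraction gives the ideal 2.0x; around 0.18 you get roughly the 1.7x figure mentioned above, and heavier geometry work drags the benefit toward 1.0x.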
  • filmoret Member EpicPosts: 4,906
    Quizzical said:
    filmoret said:
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    But if they get things working properly they could offer a third card to make it even faster.  So people who cannot afford the $700 card can just buy one at a time and eventually end up with the equivalent of something much better.  Then I guess Nvidia could do the same thing and you can buy two of the $700 cards.  Man, this is giving me a headache now...

    What I'm thinking is you get the two cards, and later when another card comes out you can simply upgrade one of them.  So each upgrade will only cost you about $200 instead of the $700 for each Nvidia upgrade.  So with each upgrade you replace the oldest card, and you end up with two generations of cards but it ends up being just as fast.  Then again, this probably doesn't work unless they get the interfacing drivers for such a thing.
    I don't follow you.  If you have two of card A in CrossFire/SLI, then buy one of card B in the future, in order to use it, you pull out both of card A and now just have one of card B.
    They would have to make it so A and B are compatible.  That way you are using A and B, and later when another card comes out you get rid of A and have B and C. 
  • acidblood Member RarePosts: 878
    filmoret said:
    Quizzical said:
    Malabooga said:
    filmoret said:
    Yea IDK exactly what is wrong, but the image on the left seems like they have foliage turned down, which is where you can really see the difference in quality.  Another thing I don't understand is that with dual cards they only needed to run at 58%.  Why not just run one card at 98% and it would match the 1080, which is also running at 98%?
    Well, I guess the point they wanted to make was that you can get the same performance for less with 2x480. Just look at how NVidia marketed the 1080 as "2x GTX 980", and that was "bad", as the GTX 980 costs $450 and the 1080 is $599/$699.

    But now AMD is showing that same GTX 980 performance for $200/$400. It also puts the 1070 in the spotlight - the 1070 = $400, and 2x480, as fast as a 1080, also = $400.
    One could argue that Nvidia was saying "buy one GTX 1080 instead of two GTX 980s", while AMD was saying, "buy two RX 480s instead of one GTX 1080".  Advice of "buy one card instead of two" is not equivalent to "buy two cards instead of one".  One faster card is preferable to two slower cards.
    But if they get things working properly they could offer a third card to make it even faster.  So people who cannot afford the $700 card can just buy one at a time and eventually end up with the equivalent of something much better.  Then I guess Nvidia could do the same thing and you can buy two of the $700 cards.  Man, this is giving me a headache now...

    What I'm thinking is you get the two cards, and later when another card comes out you can simply upgrade one of them.  So each upgrade will only cost you about $200 instead of the $700 for each Nvidia upgrade.  So with each upgrade you replace the oldest card, and you end up with two generations of cards but it ends up being just as fast.  Then again, this probably doesn't work unless they get the interfacing drivers for such a thing.
    Generally it doesn't work like that, as in you need two of the same (ideally identical) cards to use them in SLI/XF. Not sure if it's still an option, but SLI did have a thing where you could run one card for PhysX and the other for rendering; I ran that setup for a while, but honestly the benefit was pretty small.

    Not saying that buying one card now and one later is a bad option (I have done it in the past), but the other thing to consider is the size of the card and support from the motherboard/case/power supply. For example, technically I can fit two full-size graphics cards in my case, and my MB/PSU is compatible, but it would mean having to take out a hard drive and blocking the 1x slot...so a single-card solution is a better option in my case.

  • Ridelynn Member EpicPosts: 7,383
    Recore said:
    Clock speed doesn't tell the whole story though.

    Otherwise, we'd all be using AMD FX-9590 CPUs rocking 5GHz stock.
  • Ridelynn Member EpicPosts: 7,383
    The other flaw with "buy one now, add a second later":

    I, too, thought the exact same thing. This was when I bought a GeForce GTX 260. I usually buy upper-tier cards, but I thought to myself, "You know, I don't really need anything faster now, and when I do, I can just SLI a second card in."  So I found a good deal on a BFG GTX 260 (which wasn't terribly old at the time), picked one up, and figured I'd pick up a second later when I needed it. And I patted myself on the back for saving about $300 over a 280 or a 5870 (I later kicked myself for not going with the 5870 like my brother did).

    And that was all true, up until the point where I did need something faster, which happened about 18 months later. And by then, GTX 260s were not terribly common any longer (Fermi had just released), and BFG was on the ropes. But I spent the money to buy an identical BFG GTX 260 so I could SLI them without having firmware issues.

    And SLI worked. In about 3 games. The same 3 games that could also use the second card as a PhysX processor. In about 3 other games, it actually performed worse than a single card, and I would have to disable SLI to play those games. And in most of the games I played, it was more or less the same experience as having the single GTX 260.

    So when I was due for another upgrade, I swore off multi-GPU (at least for the moment, that could change), and bought a 6970 the day they released. That 6970 worked very well for many years, until it was finally upgraded with a GTX 980.

    At the end of the day, I spent more on the two 260s (the first wasn't too expensive, but the second was) than I did on either the 6970 or the 980, and I only got about half of the useful life out of them - maybe 3 years total before I wasn't happy with it, largely because the SLI scaling just wasn't there.

  • Cleffy Member RarePosts: 6,414
    I wonder about multi-GPU scaling moving forward in DX12. One of the benefits being demonstrated with DX12 is that it should scale to multiple GPUs better than DX11 does, and SLI/CrossFire profiles will not be needed. In some engines you can even run an AMD and an nVidia card on the same game. However, until this is shown in practice, it's still just a guess.
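For a sense of why engine-managed work splitting (the DX12 explicit multi-adapter idea Cleffy describes) could help even with mismatched cards, here is a toy load-balancing model. The GPU throughput numbers are hypothetical, and this is a conceptual sketch, not DX12 API code:

```python
# Toy model of engine-managed multi-GPU work splitting across mismatched
# cards (e.g., one AMD and one nVidia GPU). Throughput units are arbitrary.

def frames_per_unit_time(throughputs, shares):
    """Frame rate when each GPU is given a share of every frame's work;
    the frame finishes when the slowest GPU finishes its share."""
    return 1.0 / max(share / tp for tp, share in zip(throughputs, shares))

gpus = [1.0, 0.6]  # hypothetical: one faster card, one slower card

even = frames_per_unit_time(gpus, [0.5, 0.5])  # naive even split
total = sum(gpus)
balanced = frames_per_unit_time(gpus, [tp / total for tp in gpus])

print(f"even split:     {even:.2f} frames per unit time")      # slow card bottlenecks
print(f"balanced split: {balanced:.2f} frames per unit time")  # approaches the sum
```

With an even split the slower card is the bottleneck (1.2 here); splitting work in proportion to throughput reaches the combined 1.6, which is the kind of scheduling an engine can do itself under DX12 without a driver profile.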
  • Ridelynn Member EpicPosts: 7,383
    Cleffy said:
    I wonder about multi-GPU scaling moving forward in DX12. One of the benefits being demonstrated with DX12 is that it should scale to multiple GPUs better than DX11 does, and SLI/CrossFire profiles will not be needed. In some engines you can even run an AMD and an nVidia card on the same game. However, until this is shown in practice, it's still just a guess.
    That would be a very nice step forward.