AMD Vega GPUs will launch "over the next couple of months".

Comments

  • OzmodanOzmodan Member EpicPosts: 9,726
    Well AMD just came out with a huge driver update that could make a big difference:

    https://gaming.radeon.com/en-us/introducing-radeon-software-crimson-relive-edition-17-7-2/

    Seems to me that you are tilting at windmills for a product that is not even available yet.

    Seems to me we should just wait for release and see some real reviews before condemning or lauding it.  
  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    I predict that at least one but not all three of the following will happen:

    1)  A Radeon RX Vega will be faster than a GeForce GTX 1080 Ti.
    2)  A Radeon RX Vega will be cheaper than a GeForce GTX 1080 Ti.
    3)  All of the Radeon RX Vega cards will quickly be bought up by Ethereum miners.
  • HrimnirHrimnir Member RarePosts: 2,415
    Quizzical said:
    I don't like that they chose an AMD favoured benchmark game. It's one they have chosen to use before with cards. I wonder if they demanded it of him.

    Ridelynn said:
    They chose a game at a resolution that wasn't particularly demanding, but it's something.

    That's one reason why it might be an outlier.  Still, most DX12 and Vulkan games thus far seem to be fairly pro-AMD, and that's probably the future as APIs go.

    Even so, to beat your competitor's best in a legitimate test with only mild cherry-picking of which game to use sure beats losing to your competitor even with that cherry-picking.  It's not like it was a benchmark of closed-source code written by AMD, of the sort that Nvidia pushes with GameWorks, or in another era, GPU PhysX.

    This could easily be a rerun of Fiji or Ryzen where AMD released early that their product won at certain cherry-picked but otherwise reasonable tests (4K in several games for Fiji, Cinebench for Ryzen), and other benchmarks when the full reviews arrived were somewhat less favorable to AMD's product.  Even so, that heralded Fiji and Ryzen being competitive products.  A similarly modest amount of cherry-picking wouldn't have made Bulldozer look good, and this gives reason to doubt the claims that a Radeon RX Vega was only going to be competitive with a GeForce GTX 1070, which would be a genuine Bulldozer-level fiasco.

    Sadly AMD's willingness to do these things with their last few releases doesn't give me hope for Vega.  The Fiji cherry picking was especially bad, combined with their marketing department hyping it as the fastest card in the world, etc etc.  I mean I know their backs are against the wall, but there are lies, damn lies, and cherry picked benchmarks.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • mgilbrtsnmgilbrtsn Member EpicPosts: 3,430
    Keep hope alive AMD enthusiasts.  I'm rooting for ya!

    I self identify as a monkey.

  • RidelynnRidelynn Member EpicPosts: 7,383
    Quizzical said:
    I predict that at least one but not all three of the following will happen:

    1)  A Radeon RX Vega will be faster than a GeForce GTX 1080 Ti.
    2)  A Radeon RX Vega will be cheaper than a GeForce GTX 1080 Ti.
    3)  All of the Radeon RX Vega cards will quickly be bought up by Ethereum miners.
    If you replace RX Vega with Fury X (and the 1080 Ti with the 980 Ti, and Ethereum with Bitcoin), all three of those statements proved false that generation.
  • CleffyCleffy Member RarePosts: 6,414
    Ethereum is more dependent on memory than Bitcoin. The card that sold out the fastest in this recent series was the R9 Nano. The Fury and Nano have the highest hash rates on Ethereum.
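
    That memory dependence is easy to sanity-check with some rough arithmetic. The sketch below is a minimal back-of-envelope estimate, assuming Ethash does on the order of 64 random 128-byte reads from its multi-GB DAG per hash and using approximate peak-bandwidth figures for a few cards; none of those numbers come from this thread, and real-world hash rates sit well below these ceilings.

        # Rough arithmetic: why Ethash hash rate tracks memory bandwidth.
        # The per-hash access pattern (~64 random 128-byte DAG reads) and the
        # bandwidth figures below are assumptions for illustration only.
        READS_PER_HASH = 64
        BYTES_PER_READ = 128
        BYTES_PER_HASH = READS_PER_HASH * BYTES_PER_READ  # ~8 KiB of memory traffic per hash

        cards = {
            # name: approximate peak memory bandwidth in GB/s
            "R9 Fury / Nano (HBM)": 512,
            "RX 480 (GDDR5)": 256,
            "GTX 1070 (GDDR5)": 256,
        }

        for name, bw_gbs in cards.items():
            # Upper bound if DAG reads were the only bottleneck; real rates are lower.
            ceiling_mhs = (bw_gbs * 1e9 / BYTES_PER_HASH) / 1e6
            print(f"{name:22s} ~{ceiling_mhs:.0f} MH/s bandwidth ceiling")

    Under that assumption compute throughput barely matters, which would line up with the HBM-equipped Fury and Nano hashing well despite their modest clocks.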
  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    Ridelynn said:
    Quizzical said:
    I predict that at least one but not all three of the following will happen:

    1)  A Radeon RX Vega will be faster than a GeForce GTX 1080 Ti.
    2)  A Radeon RX Vega will be cheaper than a GeForce GTX 1080 Ti.
    3)  All of the Radeon RX Vega cards will quickly be bought up by Ethereum miners.
    If you replace RX Vega with Fury X (and the 1080 Ti with the 980 Ti, and Ethereum with Bitcoin), all three of those statements proved false that generation.
    Bitcoin hasn't driven GPU sales since 2011, and Fiji certainly didn't launch in the middle of such a rush on cryptocurrency mining as we have now.
  • RidelynnRidelynn Member EpicPosts: 7,383
    Quizzical said:
    Ridelynn said:
    Quizzical said:
    I predict that at least one but not all three of the following will happen:

    1)  A Radeon RX Vega will be faster than a GeForce GTX 1080 Ti.
    2)  A Radeon RX Vega will be cheaper than a GeForce GTX 1080 Ti.
    3)  All of the Radeon RX Vega cards will quickly be bought up by Ethereum miners.
    If you replace RX Vega with Fury X (and the 1080 Ti with the 980 Ti, and Ethereum with Bitcoin), all three of those statements proved false that generation.
    Bitcoin hasn't driven GPU sales since 2011, and Fiji certainly didn't launch in the middle of such a rush on cryptocurrency mining as we have now.
    That doesn't change my original comment here at all.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188

    https://videocardz.com/71280/amd-vega-10-vega-11-vega-12-and-vega-20-confirmed-by-eec

    https://www.reddit.com/r/Amd/comments/6q2y7w/eurasian_economic_commission_site_confirms_vega/

    AMD VEGA 20 with 32 GB HBM2
    Unlike Vega 10, Vega 20 is said to have four HBM2 stacks, which enables 32 GB capacity. According to the list from EEC, Vega 20 is currently only expected as GL variant, which means Radeon Pro/Instinct series.

    AMD VEGA 11, 12
    This is the first trace of Vega 11, a GPU we discussed a few weeks ago. It’s a smaller Vega chip for the mid-range segment. Based on the charts we posted it was meant to replace Polaris, however that was long ago, so the plans could’ve changed. The list includes Vega 11 XT and Vega 11 PRO entries for AIBs, which basically means custom cards based on Vega 11. There’s also an entry for a FirePro card.

    There’s also one entry for Vega 12 XT, a model listed as a FirePro card.

    AMD VEGA 10 XTX, XT and XL
    Finally, the list confirms the rumored codenames for the upcoming Radeon RX Vega lineup. The XTX variant will launch as a liquid-cooled card, but it will also be available with air cooling. The XTX variant has a higher TDP than the XT, which is also listed.
    Vega 10 XL is the last variant listed. It’s the card for AIBs and OEMs.



  • kitaradkitarad Member LegendaryPosts: 8,178
    These Ethereum miners are pushing the prices of video cards to really insane heights.

  • RidelynnRidelynn Member EpicPosts: 7,383
    edited July 2017
    Doesn't look like Vega stock/price has much to fear from the Ethereum miners; here are a couple of hashing reviews of the Vega FE:

    http://1stminingrig.com/amd-vega-frontier-edition-mining-performance-review/

    https://www.reddit.com/r/EtherMining/comments/6k521l/vega_eth_hashrate_3035_mhs/

    http://wccftech.com/amd-radeon-vega-frontier-edition-performance-benchmarks-unboxing/

    Then again - wait until we get the ... mining edition(?) of the card before passing judgement.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    Quizzical said:
    I predict that at least one but not all three of the following will happen:

    1)  A Radeon RX Vega will be faster than a GeForce GTX 1080 Ti.
    2)  A Radeon RX Vega will be cheaper than a GeForce GTX 1080 Ti.
    3)  All of the Radeon RX Vega cards will quickly be bought up by Ethereum miners.
    https://gaming.radeon.com/en-us/vega-freesync-budapest/?sf101934658=1

    Even if we pitted Radeon RX Vega against the mightier GeForce 1080Ti, my money would still bet on a similar outcome: that you wouldn’t really be able to tell the difference with variable refresh rate.

    AMD are saying the 1080Ti is mightier ;)

    Also, looking at feedback from both events, it seems like the G-Sync monitor had better, richer, deeper colours (which they tried to cover up and mask).

    Which is why the events are a waste of time when pitched, as they were, for the sole reason of bringing monitor pricing into a graphics card decision; as someone already said in the thread, people won't buy a card and a monitor at the same time.

    I predict:

    1.) The AMD Vega XTX variant (liquid-cooled limited edition) will trade blows with the GTX 1080 Ti in AMD-favoured games. As it will have a higher TDP it will be clocked a little higher, but there will be little if any room left for anyone interested in overclocking. It will draw insane power. This card will be available from some AIBs. It will be priced the same as a GTX 1080 Ti or within 10%. 2x 8-pin PSU connectors needed.

    2.) The AMD Vega XT (the non-liquid-cooled card) will be in and around a regular GTX 1080 in most game benchmarks. As this card will come with a blower-style cooler it will be loud. It will also be more costly to run than a regular GTX 1080. This card won't come from AIBs, at least not soon. It will be priced about the same as a regular 1080. Two 8-pin PSU connectors needed.

    3.) The AMD Vega XL will be trading blows with a GTX 1070. Some benchmarks will show it knocking on the regular GTX 1080's door, and maybe one or two AMD-favoured games will show it up to 2% better. It will also consume more power. Some custom AIB designs will have it closer to equal in some games. However, with no water cooling it will be louder and more noticeable. It will be priced between a 1070 and a 1080 for vanilla cards; custom AIB ones will be closer to a 1080 in price and marketed as being the same as a 1080. 1x 6-pin and 1x 8-pin needed.

    4.) Vega will be marketed as better for gamers, inclusive of monitor language around FreeStink and the new software that just came out.

    5.) Comparable Nvidia cards on G-Sync monitors will look better. The events at Budapest and Portland will be seen as stupid once the real reviews come out using the same monitor. There will be noticeable differences to the majority.

    6.) Being generally hot and noisy cards, AMD's play with Vega will be on price. Nvidia will be waiting to react with a price cut and a pitch of less noise, less heat, and mature drivers. Most likely a holiday-season discount.

    7.) FreeSync 2 will come out and will be on par with G-Sync; because of the FreeSync v1 monitor push in AMD's marketing, this will upset folks who bought into the hype and got a new monitor right away.

    8.) AMD will take up to 6 months to sort out Vega drivers, which will increase gaming performance in some titles by up to 10%.

    9.) Nvidia Volta will come sooner than expected. The strategy will be that when Nvidia discounts the 18-month-old GTX 1080, Volta won't be far behind.

    10.) Vega on 14nm, including a refresh, will go through to the end of 2018. At the start of 2019 we'll start to see Navi on 9nm. Navi will be 8 months late to Nvidia's party (based on Vega, AMD's competitor, being so late to market with a "somewhat" competing product).


    I think it will be another Fury X: good but not amazing. If they have yields/price on the favorable side, it will sell well for them. Fury X had 2 major issues: 4GB of RAM and insane power consumption. This card looks to have even worse power consumption for its performance based on the FE, but at least the amount of memory should be future-proof enough to let it age nicely.




  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    If it plays out the way you're predicting, Vega will be worse outright than Polaris in all of the efficiency metrics.  It's one thing to expect it not to be a meaningful improvement, but worse outright?  I can't think of a time that any GPU vendor has ever made a new architecture that was worse than its predecessor in all of the efficiency metrics.  And that would be in spite of building on a far more mature version of the same process node that Polaris had.  One would expect using HBM2 rather than GDDR5 to improve energy efficiency, as well.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    Oh, Vega will be better than Polaris in performance, but the inherited perf/W efficiency hasn't really gotten any better over the past few iterations. It's worse for ASIC power and TDP.
    Things like Vega's pixel engine being a client of the on-chip L2 cache (reducing overhead for graphics workloads) and the new Draw Stream Binning Rasterizer are great. However, even with the DSBR helping with power and efficiency, Vega is still, evidently, coming out with much higher overall power usage.

    It looks like this - Vega consumes 400W to achieve a 1.6GHz clock.
    At a 1.6GHz clock, it "could" still be slower than a reference GTX 1080, which consumes 200W.
    It is 15 months late compared to the GTX 1080.
    The only saving grace for Vega at this point is price, which Nvidia can adjust accordingly at any time.

    AMD has execution issues.



  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    If you ignore Vega, then right now, Nvidia has a much better GPU architecture than AMD by any efficiency metric.  If you were to take Polaris and scale it up to Vega 10's die size and 300 W, you'd get a chip that mostly beats a GeForce GTX 1080 but is still shy of a GeForce GTX 1080 Ti in gaming--while using more die space and more power.  That would still be a clear loss for AMD by any reasonable measure of efficiency, and Vega might well do a lot better than that.

    Vega might narrow the efficiency gap with Pascal somewhat, close the gap entirely, or even take the lead.  Or it might leave AMD still way behind Nvidia on efficiency metrics.  But I'd be very surprised if it means that AMD falls further behind, as that would take an enormous screw-up.  Even the historically bad GPUs like the Radeon HD 2900 XT and the GeForce GTX 480 weren't a downgrade from their respective vendors' previous architectures.
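
    To make that back-of-envelope argument concrete, here is a minimal sketch of the scaling math. The die sizes, TDPs, and relative-performance multipliers are approximate public figures and rough guesses of my own, not numbers from this thread, and the linear perf-per-area assumption is deliberately naive.

        # Naive "scale Polaris up to Vega 10's die size" estimate.
        # All figures are approximate specs or rough relative-performance guesses.
        polaris_10 = {"area_mm2": 232, "tdp_w": 185, "rel_perf": 1.0}   # RX 480/580 class
        vega_10_budget = {"area_mm2": 484, "tdp_w": 300}                # budget assumed above
        gtx_1080_rel = 1.7     # very rough, relative to an RX 480
        gtx_1080ti_rel = 2.2   # very rough, same baseline

        # Assume gaming performance scales linearly with die area at similar clocks.
        area_scale = vega_10_budget["area_mm2"] / polaris_10["area_mm2"]
        scaled_perf = polaris_10["rel_perf"] * area_scale

        print(f"Scaled-up Polaris: ~{scaled_perf:.2f}x an RX 480, "
              f"at ~{vega_10_budget['tdp_w']} W / ~{vega_10_budget['area_mm2']} mm^2")
        print(f"GTX 1080: ~{gtx_1080_rel}x at 180 W / 314 mm^2; "
              f"GTX 1080 Ti: ~{gtx_1080ti_rel}x at 250 W / 471 mm^2")

    On those assumptions the scaled-up chip lands between a 1080 and a 1080 Ti while spending more silicon and more watts than either, which is the "clear loss by any reasonable measure of efficiency" point; the open question is how much better than that Vega actually does.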
  • OzmodanOzmodan Member EpicPosts: 9,726
    Oh, Vega will be better than Polaris in performance, but the inherited perf/W efficiency hasn't really gotten any better over the past few iterations. It's worse for ASIC power and TDP.
    Things like Vega's pixel engine being a client of the on-chip L2 cache (reducing overhead for graphics workloads) and the new Draw Stream Binning Rasterizer are great. However, even with the DSBR helping with power and efficiency, Vega is still, evidently, coming out with much higher overall power usage.

    It looks like this - Vega consumes 400W to achieve a 1.6GHz clock.
    At a 1.6GHz clock, it "could" still be slower than a reference GTX 1080, which consumes 200W.
    It is 15 months late compared to the GTX 1080.
    The only saving grace for Vega at this point is price, which Nvidia can adjust accordingly at any time.

    AMD has execution issues.
    Sorry, but until the cards launch, what you are intimating is pure speculation. Personally, most of us are hoping AMD gives Nvidia some real competition on the high end to get some price reductions.

    The vast majority of gamers are still not running 4K, and I do not see that changing any time soon due to high monitor prices, hence the need for 1080 Ti+ speeds has limited applicability.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    Ozmodan said:
    Oh, Vega will be better than Polaris in performance, but the inherited perf/W efficiency hasn't really gotten any better over the past few iterations. It's worse for ASIC power and TDP.
    Things like Vega's pixel engine being a client of the on-chip L2 cache (reducing overhead for graphics workloads) and the new Draw Stream Binning Rasterizer are great. However, even with the DSBR helping with power and efficiency, Vega is still, evidently, coming out with much higher overall power usage.

    It looks like this - Vega consumes 400W to achieve a 1.6GHz clock.
    At a 1.6GHz clock, it "could" still be slower than a reference GTX 1080, which consumes 200W.
    It is 15 months late compared to the GTX 1080.
    The only saving grace for Vega at this point is price, which Nvidia can adjust accordingly at any time.

    AMD has execution issues.
    Sorry, but until the cards launch, what you are intimating is pure speculation. Personally, most of us are hoping AMD gives Nvidia some real competition on the high end to get some price reductions.

    The vast majority of gamers are still not running 4K, and I do not see that changing any time soon due to high monitor prices, hence the need for 1080 Ti+ speeds has limited applicability.

    The TDP is known, the ASIC is known, the delivery date is known. It's late.
    The CES 2017 monitors will be with consumers any time now, including those 4K 144Hz+ HDR monitors. The knock-on effect works for the same reason as your competition statement: it will drive prices down on today's monitors and thus, over the next year, increase the number of higher-res users as the market shifts that way. If AMD was truly looking to make a statement they would, and should (with the time they had), provide high-end solutions for 4K at 60fps - that is frames per second, not "Feels Per Second" with their "blind tests".



  • QuizzicalQuizzical Member LegendaryPosts: 25,509

    Also, looking at feedback from both events, it seems like the G-Sync monitor had better, richer, deeper colours (which they tried to cover up and mask).

    5.) Comparable Nvidia cards on G-Sync monitors will look better. The events at Budapest and Portland will be seen as stupid once the real reviews come out using the same monitor. There will be noticeable differences to the majority.

    7.) FreeSync 2 will come out and will be on par with G-Sync; because of the FreeSync v1 monitor push in AMD's marketing, this will upset folks who bought into the hype and got a new monitor right away.

    What are you prattling on about?  Neither FreeSync nor G-sync has anything to do with monitor image quality.  They only affect when monitors decide to display a new image.

    FreeSync has been competitive with G-sync for as long as both have been available.  The only notable differences between them are vendor compatibility and the price tag.  Unless they're going to make it work with windowed games or multi-monitor gaming, it's not obvious what improvements could be made.  And if you try to make it work with windowed games, then what happens if you have two different games running on the same monitor with different refresh rates?
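
    For what it's worth, the difference is purely about frame timing, which a toy model makes clear. The sketch below is a minimal model of the scheduling idea only, with made-up frame times and a made-up refresh window; it is not either vendor's actual implementation.

        import math

        def fixed_refresh(frame_done, period=1 / 60):
            """Each finished frame waits for the next fixed refresh tick (v-sync style)."""
            return [math.ceil(t / period) * period for t in frame_done]

        def variable_refresh(frame_done, min_period=1 / 144):
            """The panel scans out as soon as a frame is ready, no faster than its max rate.
            Behaviour below the refresh window (very low framerates) is ignored in this toy."""
            shown, last = [], 0.0
            for t in frame_done:
                shown.append(max(t, last + min_period))
                last = shown[-1]
            return shown

        # Irregular frame completion times (seconds) from a GPU running roughly 45-90 fps.
        frames = [0.013, 0.035, 0.057, 0.071, 0.093, 0.115]
        print("fixed   :", [round(t * 1000, 1) for t in fixed_refresh(frames)])
        print("variable:", [round(t * 1000, 1) for t in variable_refresh(frames)])

    The pixels themselves are identical either way; only the scan-out times change, so any colour difference people saw at those events would have to come from the panels, not from FreeSync or G-Sync.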
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    edited July 2017
    ^ A few users in the blind tests, and AMD themselves, noticed that the image quality wasn't equal between the two monitors. They said the G-Sync monitor looked better. The IPS / VA panels used, how they were configured, and AMD's tweaking of them would all play into the perception of overall quality from an end-user perspective. That is why testing on two different monitors is stupid.
    It should be the same single monitor, one game at a time.

    Sure, FreeSync is competitive. It was used in those tests probably to mask any dips in performance. It's mainly there to prevent tearing caused by the variation, but you'd still have fewer frames being displayed. If FPS doesn't matter, then the entirety of both companies' high-end offerings doesn't matter. Humans tend to notice variability rather than two different but consistent values. If AMD was slower but more consistent, that can be a good thing. Shame the FPS was covered up; wonder why. For instance, I won't buy a FreeSync monitor until it can do frame strobing at the same time. I find persistence to be more of an issue than tearing. My monitor is 120Hz and my computer can generally deliver those frames relatively consistently. I haven't noticed tearing since I went to a high-refresh monitor. In this use case AMD's argument isn't valid.

    If Vega can't comfortably handle 4K at 144Hz (or even 60fps), then those that want that will go G-Sync, because the Nvidia card will. So if the cards are priced comparably I'd choose G-Sync.

    Both FreeSync & G-Sync were made to compensate for times when the video card can't handle everything on the screen smoothly.
    So, as gamers, we want the fastest GPU possible so that we're *not* forced to use tech that compensates for the video card's shortcomings, but that isn't what these marketing guys want the world to think when they are at a disadvantage with their product.



  • RidelynnRidelynn Member EpicPosts: 7,383
    I actually can't believe that we still have fixed refresh rates on nearly everything, now that variable refresh tech is out there, and even standardized.

    Seems the laptop crowd would have jumped on this across the board a lot harder than they have, given the potential for battery savings, instead of relegating it to just a handful of gaming laptops.
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    No live streaming from Siggraph... they must really want to sweep this turd under the rug. I don't see this as a positive sign at all. Hopefully the NDA will be lifted and facts come out. 



  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    Ridelynn said:
    I actually can't believe that we still have fixed refresh rates on nearly everything, now that variable refresh tech is out there, and even standardized.

    Seems the laptop crowd would have jumped on this across the board a lot harder than they have, given the potential for battery savings, instead of relegating it to just a handful of gaming laptops.
    The original intended use for adaptive sync was to reduce the refresh rate on laptops when nothing on the screen is moving.  The point of that is, as you said, to improve battery life.
  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    edited July 2017
    No live streaming from Siggraph... they must really want to sweep this turd under the rug. I don't see this as a positive sign at all. Hopefully the NDA will be lifted and facts come out. 
    Because it's traditional to live stream reviews getting posted online?  That's what we're looking forward to, isn't it?  Who cares about some marketing junk presentation once reviews are out?
  • AmazingAveryAmazingAvery Age of Conan AdvocateMember UncommonPosts: 7,188
    ^ Look at the comments from the Twitter announcement linked there.
    Yes it is traditional.

    http://wccftech.com/amd-polaris-10-polaris-11-launch-event/
    http://www.anandtech.com/show/10341/amd-announces-computex-2016-webcast
    http://www.techadvisor.co.uk/how-to/game/amd-e3-2015-press-conference-live-video-stream-fiji-radeon-300-gpu-launch-replay-3615542/

    A new architecture has always had some event.

    "I guess you have nothing important to show."
    "RIP RTG. Most important launch and you can't give info and won't stream it either. Failed product? With a failed launch? ..."
    "No confidence in Vega I see. Next time don't hype a substandard product 1 year before launch."
    "We waited for a year. and yet no live stream just tweet whats happening in the show. Facepalm"
    "Twitter update from AMD "we promise it's totally amazing" phew, that was a big card launch, I think we nailed it"



  • QuizzicalQuizzical Member LegendaryPosts: 25,509
    edited July 2017
    Well yes, AMD has done Vega presentations.  Quite a few of them, in fact.  For example:

    http://www.anandtech.com/show/11476/computex-2017-amd-press-event-live-blog-starts-10pm-et

    But the big presentation usually doesn't coincide with the reviews going live.  For example, for the GeForce GTX 1080, here's the big presentation on May 6, 2016:

    http://www.anandtech.com/show/10305/the-nvidia-geforce-2016-liveblog

    The cards wouldn't paper launch until three weeks later, and wide availability at MSRP was still several months away.  Does that mean that the GeForce GTX 1080 was a failure because they didn't livestream the actual launch?