Tom's Most Updated GPU Chart

[Deleted User] Posts: 12,262
The user and all related content has been deleted.

A turtle doesn't move when it sticks its neck out.

Comments

  • Ridelynn Member Epic Posts: 7,383
    No RX 480...

    As for the 1080 being king of the hill - that will last at least until Vega, at which point nVidia will probably throw out the next Titan/Ti edition and reclaim it.
  • [Deleted User] Posts: 12,262
    The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.

  • Ridelynn Member Epic Posts: 7,383
    I mean, we can make a pretty good guess where Tom would slot it; it's just a bit odd that it's not on there already.
  • Cleffy Member Rare Posts: 6,414
    edited July 2016
    They had to wait for the updated driver that takes power management into account, and this article was probably completed and in editing before that happened. AMD places it around the R9 290. If you do want an RX 480, it is probably best to wait for the third-party coolers and revisions.
  • centkin Member Rare Posts: 1,527
    If you cap a game's frame rate, will a higher-end card run it more easily (i.e., lower wattage, lower temperature, less fan use) than a lower-end card, or will it not?
  • Jamar870 Member Uncommon Posts: 573
    Well, for me, I would think that when you get a "scene" where there is a lot of stuff going on, a higher/better card will probably handle it better, but I'm not sure it will use less power. It probably will use less in less demanding "scenes".
  • Cleffy Member Rare Posts: 6,414
    More than likely, a higher-end GPU will draw more power than a GPU with a lower TDP. There are several things your board does where it can't apply power management. For instance, 4GB versus 8GB of video memory: if they are on the same process node, then the 4GB card will always draw less power regardless of load. However, this is only a couple of watts.
    You also have to remember that at idle a lower-TDP GPU will draw less power, and idle is the state the GPU is in the majority of the time. Most GPUs of a family share the same architecture; the higher-end GPU just has more of it. This means that if a game is pushing the lower-end GPU at 100% and the higher-end GPU at 50%, they are still consuming the same amount of resources, and thus the same amount of electricity.
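    To put rough numbers on that, here's a minimal sketch. The idle and TDP figures are made up, and the linear scaling of draw between idle and TDP is an assumption (real cards aren't that tidy):

    ```c
    #include <stdio.h>

    /* Assumed model: draw scales linearly from idle (load = 0) to TDP (load = 1). */
    static double gpu_power(double idle_w, double tdp_w, double load) {
        return idle_w + load * (tdp_w - idle_w);
    }

    int main(void) {
        /* Hypothetical cards: low-end 150 W TDP / 10 W idle,
           high-end 290 W TDP / 15 W idle. */
        printf("capped game, low-end at 100%%: %.1f W\n", gpu_power(10, 150, 1.0)); /* 150.0 */
        printf("capped game, high-end at 50%%: %.1f W\n", gpu_power(15, 290, 0.5)); /* 152.5 */
        printf("idle, low-end:  %.1f W\n", gpu_power(10, 150, 0.0)); /* 10.0 */
        printf("idle, high-end: %.1f W\n", gpu_power(15, 290, 0.0)); /* 15.0 */
        return 0;
    }
    ```

    Under load the two come out nearly identical, which is the point above; the lower-TDP card only clearly wins at idle.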
  • GladDog Member Rare Posts: 1,097
    Ridelynn said:
    No RX 480...

    As for the 1080 being king of the hill - that will last at least until Vega, at which point nVidia will probably throw out the next Titan/Ti edition and reclaim it.
    Yeah, I wish they had the 480 on the list. I am still on the fence about which card to buy; I am thinking I may just wait a little longer. I am using an R9 380 and it struggles on some things, but....
    I'm waiting to see if the 1070 falls to close to $300. If not, I will wait for the AMD Vega chips (RX-490, Fury II?) and see what prices do then. I was originally waiting for 2nd-gen cooling on the 480, and that might still be what I end up getting, but I want to see what prices do after the GPU Vega-nization strikes!

    The card I have isn't bad; I'll be just fine until then (XFX 7870 2GB Ghost edition).


    The world is going to the dogs, which is just how I planned it!


  • H0urg1ass Member Epic Posts: 2,380
    As someone who has owned Nvidia cards for the past few generations (and mind you, it's not out of any sense of brand loyalty), I really wish they'd pull their heads out of their asses. Why? Because their multi-monitor support is absolute trash.

    The flaming hoops you have to jump through to switch from triple monitors to spanned displays with Surround are ridiculous. If I want to span my displays right now, I have to open the control panel, select Configure Surround, and click the check mark to begin configuring. But nope: first I have to close Chrome, Galaxy Client, PIA, Battle.net, and half a dozen other programs, because for some asinine reason it can't configure properly with any of those open. Then I have to input my settings again, every single time I want to switch, and hit apply.

    There should be a hotkey that I can set that just damn remembers my settings and toggles spanned displays on and off, but nooooo, that would actually take some development effort.
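    For what it's worth, NVIDIA's public NVAPI SDK does expose the Surround/Mosaic topology, so a toggle really shouldn't be much development effort. A minimal sketch, assuming NvAPI_Mosaic_EnableCurrentTopo re-applies the last configured topology the way the SDK docs describe (untested; check it against the NVAPI headers, and you'd still have to wire it to a hotkey yourself):

    ```c
    /* Sketch of a Surround on/off toggle via NVAPI's Mosaic interface.
     * Assumes the NVAPI SDK (nvapi.h, linked against nvapi64.lib); untested. */
    #include <stdio.h>
    #include <stdlib.h>
    #include "nvapi.h"

    int main(int argc, char **argv) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s 0|1   (disable|enable spanned displays)\n", argv[0]);
            return 1;
        }
        if (NvAPI_Initialize() != NVAPI_OK) {
            fprintf(stderr, "NvAPI_Initialize failed\n");
            return 1;
        }
        /* Re-applies the most recently configured Surround/Mosaic topology,
           or drops back to independent displays, without the control panel. */
        NvU32 enable = (NvU32)atoi(argv[1]);
        NvAPI_Status st = NvAPI_Mosaic_EnableCurrentTopo(enable);
        if (st != NVAPI_OK)
            fprintf(stderr, "NvAPI_Mosaic_EnableCurrentTopo failed (%d)\n", (int)st);
        NvAPI_Unload();
        return st == NVAPI_OK ? 0 : 1;
    }
    ```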
  • Quizzical Member Legendary Posts: 25,531
    Tom's Hardware commonly releases their new "best GPU for the money" list just before a major new card launches rather than just after.  I'm not sure why, but my best guess is to avoid having their chart influenced by review-day shenanigans, and instead have a more considered opinion with actual information about availability.