The money wasted on an i7, an 8 GB GPU, and 32 GB of RAM lol... isn't worth it. If you're rich, sure, go ahead, but for the general playerbase... that PC budget is insane. Just the GeForce GTX 1080 costs as much as my current PC.
Oh, but wait, I thought you said the RX 480 is better than the GTX 1060. Looks like it's the opposite. I can't believe you posted benchmarks saying otherwise. Isn't that against your rules of conduct?
OMG, a whole 1 FPS behind in an NVidia gimpworks title, rofl. It's doing great considering it's being actively gimped by NVidia.
OTOH, just look at that poor Kepler 780 Ti, also being gimped by NVidia: it now performs at 380 level, and it cost $700 just 2 years ago lol. Meanwhile, at the same time, the 280X/380X cost $299/$239.
It's funny: in AMD-sponsored titles the 780 Ti performs as it should and beats/is on par with the 970, just like at release, but in NVidia gimpworks titles the 970 is 30+% faster lol.
And your limited capacity can't comprehend that I'm speaking about THE WHOLE picture, where the 480 is better than the 1060.
Checked out the Watch Dogs 2 benchmarks on Gamers Nexus; they make for interesting viewing, honestly, though whether they are more informative than the OP's benchmarks or not, I couldn't say. Certainly the AMD cards did not fare as well as the Nvidia ones, but apparently that is largely due to AMD not having an equivalent card at this time to the 1080 or the 1070.
Funny how the NVidia cards did almost the same/a bit better but the AMD cards did much worse across the board, interesting huh. But the Russians are not sponsored by anyone, while Gamers Nexus is heavily sponsored, NVidia included.
1080p is what matters to 97.5%+ of people, some may even care about 1440p, and 4K share is tending toward zero rofl. It matters more for consoles than PC.
And remember that those morons used to call the 1080 a 4K card rofl. That's $700 down the drain right there lol.
So when you compare dual RX 480s to the GTX 1080, for some reason you quoted the 1440p results and ignored the 1080p resolution. According to your own theory that 1080p is the most important resolution, the AMD cards are total crap compared to the Nvidia cards. Even the benchmarks from your AMD-loving website say so.
Again, the GPU comparison and the CPU comparison charts don't show anywhere near the same numbers (i7 5960X with 1080 SLI: one shows 104/121, the other shows 92/109).
Apart from the usual inconsistency - it's nice to see an SLI-enabled title, and beyond that there is a surprising amount of CPU per-core scaling as well. The FX-8150 (the original crappy Bulldozer) beating out a Haswell i5 is almost unheard of in gaming. You can also see a very clear distinction between Sandy Bridge, Haswell, and Skylake in the test, which is also somewhat unusual.
1. There's no inconsistency, you simply aren't looking as closely as you SHOULD at what's tested.
2. That's what happens when you use 8 cores. You act surprised, but in decently threaded applications the 8150 was between the i5 and i7 in most cases and faster than the i7 in others.
Games being late to use more CPU cores is just a testament to how crappy devs are, coupled with ancient DX11, which effectively only used one core for submitting draw calls (DX11.3, aka pseudo-DX12, finally opens the gates, and DX12/Vulkan are capable of using all cores without much trouble).
Also, the 6700K is a whole 13% faster than the 5-year-old 2600K, the 6600 is marginally faster than the 4670K, and the 6700K is marginally faster than the 4770K. That's how much Intel's CPUs "improved" over 5 years, that's 5 generations of CPUs, and Kaby Lake has virtually no improvement over Skylake... (quick per-generation math below)
It's all there for those who don't just skim and miss important details. Just look at the guy above trying to sound smart but getting embarrassed in the process.
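A quick back-of-the-envelope check on that claim, taking the ~13% total figure quoted above at face value and spreading it across the five generations from Sandy Bridge to Skylake:

```python
# Per-generation gain implied by ~13% total improvement from the 2600K
# (Sandy Bridge, 2011) to the 6700K (Skylake, 2015), i.e. roughly 5 generations.
# The 13% total is the figure from the post above, not an independent measurement.
total_gain = 1.13
generations = 5
per_gen = total_gain ** (1 / generations) - 1
print(f"average gain per generation: ~{per_gen * 100:.1f}%")  # ~2.5%
```

So even taken at face value, that works out to only about 2-3% per generation in these gaming tests.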
Thanks, I didn't see that, or maybe it's just that I have no idea what LLu means. Either way, I appreciate you pointing it out, because it wasn't clear to me at all.
Yeah, all those benchmarks are obsolete; they just benched the first mission, and beyond that performance is abysmal (as more and more videos of actual gameplay like the one above have started to pop up...).
I was thinking of upgrading to a 1080, but the price here in Rome is still so high, and in euros even more. I have an Alienware with an 875-watt power supply, so upgrading shouldn't be an issue. I just wish it didn't cost so much.
And there you go again, meddling with stuff you have no clue about, rofl.
https://www.youtube.com/watch?v=VyeyPCzWMQQ
Graph 1: i7 5960X, Gigabyte X99 mobo, 1080p, VHQ, 1080 SLI - 104/121 fps
Graph 2: i7 5960X, Gigabyte X99 mobo, 1080p, VHQ, 1080 SLI - 92/109 fps
So what's the difference?
2. the 5960X at stock 3 GHz
that's a 53% CPU OC
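For what it's worth, a rough sanity check on those two graphs, assuming the 53% figure is the clock difference between the two runs (which would put the overclocked run around 4.6 GHz; that clock is inferred, not stated in the charts):

```python
# Implied clocks and the resulting fps gap between the two graphs above.
# The ~4.6 GHz value is derived from "stock 3 GHz" + "53% OC", not quoted directly.
stock_clock = 3.0                      # GHz, i7-5960X base clock
oc_clock = stock_clock * 1.53          # ~4.6 GHz implied for the other run
fps_graph1, fps_graph2 = 104, 92       # first fps figure from each graph
print(f"implied OC clock: ~{oc_clock:.1f} GHz")
print(f"clock gain: +{(oc_clock / stock_clock - 1) * 100:.0f}%")   # +53%
print(f"fps gain:   +{(fps_graph1 / fps_graph2 - 1) * 100:.0f}%")  # ~+13%
```

So a 53% higher CPU clock only buys about 13% more fps in that chart, which is why the two graphs can both be "right" while showing different numbers.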
*brought to you by NVidia Gameworks lol