LOL, it's a hardware conversation... this is talking about hardware. Seems like somebody can't back up their claims... doesn't own a DK2. It's relevant because when someone comes in attacking everything and turning the conversation to their own purposes, they should be able to back up their claims. That person was shown to have lied in the past about hardware; I don't think there is anything wrong with pointing that out.
What GPU did you upgrade from, and what GPU did you upgrade to for ARMA 3 ?
From: I don't recall. To: GTX 760.
Safe to say I caught you in a lie.
No.1: You "supposedly" buy ~$1500 PCs every 3 years.
No.2: You "supposedly" bought a ~$1500 PC 3 years ago but do not remember the specifics.
No.3: You "supposedly" bought a 760 18 months later to boost your PC for ARMA 3.
Inconsistency No.1: The "supposed" ~$1500 PC had a GPU weaker than a 760?
Inconsistency No.2: You "supposedly" bought a 760 upgrade only to boost your ARMA 3 experience (fps), all while ARMA 3 is a CPU-bound game that benefits little from GPUs and has server fps caps in multiplayer.
Prime inconsistency: For a 760 to make any difference, you would have had to be using an extremely weak GPU beforehand, which makes the ~$1500 PC a confusing conundrum.
Conclusion: Either your buying choices and prowess around custom-built computers were extremely poor 3 years ago, or your narrative is Swiss cheese.
You would probably get considerably different CPU benchmarks if the test GPU were an AMD 390X or Fury X. As we know, NVIDIA cards do not properly support DX12, so they don't provide the CPU benefits an AMD card would.
Semi on-topic: A 970 is the "sweet spot" for a gamer that likes graphics. As better cards come out, obviously it will move lower.
Totally on-topic: My 970 rocks it! Game is gorgeous but I'm not digging how the cars control.
Yep, not worth a ban or warning which is why I don't often respond to that poster anymore.
I also have a 970 and it totally rocks. The game is beautiful and it's freaking hard. I also suck at racing games, but it's still a pretty hard game.
The controls feel realistic with regards to physics and how different cars handle. I don't know if I really like that or not, but it is impressive. I'm more of a Mariokart kind of player.
The video did point out that they really had to push both cards before they saw drop-offs, and it's no secret that the limited memory of the 970 doesn't scale well to 1440p or 4K. But my current rig is 1080p, so I don't have any of those problems.
This whole 3.5GB memory thing is really being overblown. Either that, or my card does not suffer from it. I play all my games at a minimum of 2351x1325 (1.5x DSR), if not 2715x1527 (2.0x DSR), at 144Hz and have had zero issues with games like GTA5, Witcher 3, and Fallout 4. None at all. I get an average of 45-70 FPS in all those games. Good enough for gaming with no stutters. I also have 16GB of DDR4-3000 and am running an i7-6700K that is not overclocked.
Am I just lucky?
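For what it's worth, those DSR resolutions check out against the factor math. A quick sketch (the driver applies its own rounding, so real values can differ by a couple of pixels, e.g. 1325 vs. 1322 on the height):

```python
import math

def dsr_resolution(base_w, base_h, factor):
    # DSR multiplies the total pixel count by `factor`,
    # so each axis scales by sqrt(factor).
    scale = math.sqrt(factor)
    return int(base_w * scale), int(base_h * scale)

print(dsr_resolution(1920, 1080, 1.5))  # ~2351x1322, near the 2351x1325 above
print(dsr_resolution(1920, 1080, 2.0))  # 2715x1527, matching the 2.0x figure
```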
Bartoni's Law definition: As an Internet discussion grows volatile, the probability of a comparison involving Donald Trump approaches 1.
I thought it was overblown too - from all the testing I saw, it only had an impact in some very contrived or specific situations that were not common at all.
In many games you can observe the R9 390 pulling ahead more as you increase resolution. Some games less, some games more, but lately there have been more and more games where the difference is drastic even at 1440p. Of course you can always reduce memory-heavy settings like AA, but that is the thing: the GPU itself could carry it if it weren't for the lack of memory.
As far as CPUs go, they don't OC anything (unless it's specified), so you can somewhat extrapolate slower CPUs from the data. It isn't perfect, but it's accurate enough. Tests are done with a 980 Ti, so if you have a slower GPU you'll naturally get lower FPS (unless you lower graphics details), but it's good to know how far the CPU can push the game before it gets CPU-bound.
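The extrapolation described above boils down to taking the slower of the two bottlenecks. A minimal sketch, with purely hypothetical numbers (not from any real benchmark):

```python
def effective_fps(cpu_fps, gpu_fps):
    # Whichever stage is slower, CPU or GPU, limits the final frame rate.
    return min(cpu_fps, gpu_fps)

# Hypothetical: a review measured 90 fps with its 980 Ti, which tells you
# the CPU can feed about 90 fps. A GPU half as fast lands you near 45 fps;
# a much faster future GPU still leaves you capped near 90 by the same CPU.
print(effective_fps(90, 45))   # 45 (GPU-bound)
print(effective_fps(90, 200))  # 90 (CPU-bound)
```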
I think in the case of Forza, and specifically on Win10, it would be interesting to see what difference there is between the two versions of the game, UWP and non-UWP. In other games where both types exist, the non-UWP version tends to run better and isn't forced to run in windowed mode.
"Does DirectX 12 and UWP support full screen exclusive mode?

Full screen exclusive mode was created back in the original release of DirectDraw to provide games with enhanced performance when using the entire screen. The downside of full screen exclusive mode is that it makes the experience for gamers who wish to do other things on their system, such as alt-tab to another application or run the Windows GameDVR, more clunky with excessive flicker and transition time.

We thought it would be cool if gamers could have the versatility of gaming in a window with the performance of full screen exclusive. So, with Windows 10, DirectX 12 games which take up the entire screen perform just as well as the old full screen exclusive mode without any of the full screen exclusive mode disadvantages. This is true for both Win32 and UWP games which use DirectX 12. All of these games can seamlessly alt-tab, run GameDVR, and exhibit normal functionality of a window without any perf degradation vs full screen exclusive."
From time to time I still play TDU2. It's years old now, of course, but when you compare it to a more recent video comparing Forza 6 on XB1 and PC-UWP (which are remarkably similar for some reason), all I can say is that while I am not surprised the XB1 can run Forza 6 at 1080/60fps, I am very surprised that the PC version isn't better, as the improvements over TDU2 seem marginal. So was Forza 6 made specifically for the XB1, and is it just a console port, and that's why the PC version doesn't look any better, or is it because UWP is still not there yet as a platform? Given that other games shipping in both UWP and non-UWP versions also display some significant differences, with the non-UWP versions being a better game experience for the most part, I am not sure you can place the blame entirely on it being just a console port.
Those games were still made with ancient APIs. DX12 is a whole other ballgame; it's not just DX11 with a few bells and whistles added.
Most AAA games today are console ports, as consoles sell a boatload more than PC. And those games which exhibit "significant differences" usually run like trash on PC relative to consoles:
From 2:06 :
Yes, we have had any number of dodgy console ports; you only have to look back to the recent Arkham Knight debacle to see how bad things have truly become. Considering consoles recently adopted a more PC-like architecture, it makes you wonder what on earth is going on. Though with the XB1's peculiar construction, I can see why porting an XB1 game to PC might seem a daunting proposition. Perhaps this is why UWP is a thing: is it perhaps Microsoft's attempt to keep the PC, whatever hardware might be present, pegged at XB1 levels of performance?
Consoles run a lot closer to the hardware. PCs have this huge, fat layer of software bloat and APIs that games sit on.
The benchmarks and visual side-by-sides are interesting and show design strengths and weaknesses, but practically speaking most people don't notice or care. When I play games on the PS4 I think wow that looks nice. When I play them on the PC I don't try and visualize a comparison. I normally don't notice a difference. I think, wow that looks nice.
With Forza it looks beautiful. I have the configuration set mostly to "auto" (I'm on 1080p) because in a racing game I want performance first. I don't want hitching or frame slog; I want to win the race. It still looks beautiful, but I'm not trying to push a setting for the sake of it unless it gives me a visual benefit. If it doesn't make a practical visual improvement, then I'm not bumping up the setting just to do it.
But that's it. Uncharted 4, for instance, has been proclaimed "stunning" visually (and a great game). And it's on a PS4, which costs $300. Yes, it is in 1080p, but by Steam stats, 1440p and higher is ~3% of the market. That means that, for now, people don't care too much about higher resolutions.
A badass TV plus a $300 console seems a better deal than a $1000+ PC, and it's hassle-free: you don't have to worry about drivers, installs, "is this CPU enough," or "is this GPU enough"...
And now they're updating consoles for 4K, to be sold alongside the 1080p ones, so even among consoles you'll have a choice, and games are more or less guaranteed to work at a certain level.
As far as PCs go, even a 10-year-old PC can handle the vast majority of stuff people usually use PCs for, so there's no incentive to buy new.
I've always thought that the software bloat that commonly gets associated with PCs was overblown, which is why I think SteamOS, or any OS or API for that matter, will never really win on the premise of "reducing the bloat".
The main attraction of consoles for developers was that they were a common, enforced hardware standard, and you could tightly optimize to that standard. You know every PS4 is going to have this CPU architecture, this GPU architecture, this memory access standard, etc. You can get compilers to do a lot of automatic optimization, and hand tuning can get you a good deal more. And the hardware designers of the consoles are competent enough to know exactly how to optimize their hardware for gaming: so many CPU cores coupled with so many GPU cores, coupled with the memory access standards, tied closely with the I/O, etc. PCs retain a very generic set of standards that are optimized for the general case, and often lopsided.
That's not necessarily the same as software bloat. Sure, that's some of it, but your standard PC has enough compute power now that it can handle a good deal of bloat and not really get bogged down. Case in point - how many games really push a computer to 100% CPU power? How many games can even push a single core on a computer to 100%?
The fun part about PCs is that you can build them yourself, tinker with the software, tinker with mods, tinker with settings, etc. With a console, you're buying into a very tightly controlled and very static ecosystem. I'm not saying that's a bad thing, but you lose a lot of the hands-on ability, which is what I find enjoyable about PCs.
The genericization (is that even a word) of the PC, which is part of what makes it wonderful for me, is also the reason why it can't compete, dollar for dollar, with a specialized machine just for gaming, or actually, a specialized machine for anything really. I think Steve Jobs actually nailed it when he said the PC was more or less the pickup truck of the computing world. There's a lot of things you can do with a pickup truck, but it does very little of it very well. Jack of all trades, master of none.
All the software in the world (or lack thereof) can't make up deficiencies in the underlying hardware.
I am not sure who you are referring to when you talk about "reducing the bloat". I don't think we are talking about software bloat.
Speaking for myself
Is Linux well known to be slower for games than a Windows machine on the same spec hardware? Yes.
Is this issue true with Steam Machines? Yes.
Software bloat? Maybe I missed it; sorry if I did.
Please do not respond to me; even if I ask you a question, it's rhetorical.
A 970/390 would be entry-level-to-high-end 1080p GPUs / capable 1440p GPUs (though the 970's 3.5GB memory starts to be an issue here, so it's generally not recommended for 1440p).
So in video cards there is something called a "sweet spot". It's usually between the very high end and the so-called "entry level to gaming" cards, or as I target it, 2-3 tiers below the best card possible.
That sweet spot is where the current iteration of manufacturing is at its peak, and it's where consumers get the best performance for the dollar.
This "sweet spot" is what I am referring to. I should have explained it better or used a better phrase.
so with all that, yeah it seems the 970 is currently the 'sweet spot' card
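The "best performance for the dollar" idea is easy to make concrete. A toy comparison below; every price and performance score here is a made-up illustration, not a real benchmark number:

```python
# Hypothetical tier list: (price in $, relative performance score).
cards = {
    "flagship": (650, 100),
    "tier 2":   (450, 85),
    "tier 3":   (300, 75),   # the "sweet spot" tier described above
    "entry":    (180, 40),
}

def perf_per_dollar(card):
    price, perf = cards[card]
    return perf / price

# The mid-range tier wins on value even though the flagship is fastest.
best = max(cards, key=perf_per_dollar)
print(best, round(perf_per_dollar(best), 3))  # tier 3 0.25
```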
The sweet spot, IMO, is whatever is a good value for $300. I've built well over 200 machines over the last 24 years, and whenever I build a "high end" machine, I ignore the bluster and hoohaw and get video cards that are under $300. Why? Because the amount of game improvement you get going with a top-of-the-line card versus what you can get for $300 is rarely enough to get excited about. And in two years, what you get for $300 will blow away what you paid $600+ for two years ago.
Right now, you can get a GTX 970 or an R9 380X for under $300 if you shop around. Those are great cards! Both of them will power an Oculus Rift. But in about 4-5 months, you can get a GTX 1070 or R9 480X powered video card for that magic $300 number. Those will be faster than anything in the current generation except the Fury X and the GTX 980, both of which are $600+ cards.
And in two years, you can spend that $300 you saved to buy something that will blow away the Fury X and the GTX 980!
I did want to mention that this is a lesson I learned the hard way, buying $500-$600 cards that are almost doorstops in 5 years.
The world is going to the dogs, which is just how I planned it!
A turtle doesn't move when it sticks its neck out.
Maybe it's gonna be better with fps cap off with the new win10 patch
http://wccftech.com/uwp-unlocked-frame-rate-gsyncfreesync-support-today-exclusive-full-screen-coming/