Thanks for posting this. Clearly DX12 isn't that big a deal if you don't have a top-end PC (which is what I've been saying for 2 years already). Basically, if you have a mid-range PC (like mine), you'd best stick to Win7/8 and keep your privacy. If you REALLY are an e-peen junkie and want to see the difference between 48FPS (FX8350/GTX1060) and 100FPS (i7/GTX1080), then Windows 10 is yours for the price of your privacy (if you can even see the difference between 48 and 100 FPS to start with...)
DX12's focus is on enabling a dramatic increase in visual richness through a significant decrease in API-related CPU overhead. Clearly it is a big deal.
True, yet the same statement was made about DX10 and DX11 being a dramatic increase over DX9. Looking at how long it took developers to actually use them, and how they're used in general (mostly fog/lighting shading on top of DX9-level graphics), I still stand by my statement on e-peen and it not being worth hopping to Windows 10...
It wasn't the video card drivers that were behind, it was the game engines and game developers. Many get locked into a specific DirectX generation. Like GW2, which is still DX 9.0c and more CPU-bound than GPU-bound.
Windows 10 IS worth jumping onto. Why? SECURITY, SECURITY, SECURITY. All the other Windows versions will only get major security updates, or none at all (like Win XP).
Security? tbh Win 10 is only as secure as the third party software you use to protect it, and just as insecure as all the previous versions if you don't. O.o
Not true. If you believe that - fine. BUT, EVEN THE SECURITY EXPERTS SAY WIN 10 IS MORE SECURE OUT OF THE BOX.
Every OS is the same, the OS is only as secure as the operator.
Sure it's safer out of the box..... because you're letting big brother in to "watch over you" thus making you more secure while having less choice in some areas because ofc big bro knows best!
LOL - every OS since Win XP did that - better stop with the kool aid, you might get even more paranoid.
Less choice? I don't get it. It has the same or more choices, it depends on the hardware manufacturers to write the drivers, not MS.
Like I said, please stop drinking that kool aid, it is meant for Trump supporters now.
LMAO, you're the one that needs to stop drinking the MS kool aid. I don't really have a beef with MS, I never did, but Win 10 has clearly moved away from freedom of choice when it comes to disabling certain things, toward a new mobile-phone-BS type of OS that tells you: "Hey, I wanna be Apple, so do things like this or don't use me at all."
If you think your privacy is anywhere near as secure as on any of the OSes they produced before, you're seriously deluding yourself.
Last but not least, if ever there was an OS that resembled Trump, it surely is Windows 10.
Yea, Windows is becoming much more invasive and is basically watching everything you do. I'm not saying they are going to steal your identity, but they are using it for marketing for sure. I remember when computers did what you told them; it was so easy to deal with a computer back then. Now they are "smarter" than you, so they set everything the way MS wants, and if you change it, it changes back the first time Windows updates. Hell, even cars are "smarter" than you now and lock themselves when they feel like it. They have good intentions, but for those of us who know what we are doing it makes things very difficult sometimes.
The problem I see in "monitoring" everything you do is that when you create something inventive while using Windows 10, Mickey$oft can use it themselves and sue your ass off when you want to take credit for it. After all, YOU DID GRANT Mickey$oft a 100% royalty-free license to (ab)use or misuse anything YOU do on YOUR COMPUTER.
As a creative programmer who thinks way outside the box, Windows 10 is a big no-no for that reason alone. At times I come up with solutions to programming problems that Mickey$oft might learn something from. Well, if they want to, they gotta pay for it and not "steal" it just because I'm using Windows 10...
GTX 1080 was 61-81% faster when an i7 processor was used.
Uhh, where did you get your numbers? He clearly posted the benchmarks. The difference between the 8370 and Core i7 ranges from -1% to 16% on the 1080.
That part of my comment was aimed at Malabooga's price/performance comparison of the GTX 1080 and RX 480. He claimed the GTX 1080 was 55-60% faster than the RX 480, when in the review he linked the GTX 1080's results on the i7 were 61-81% faster.
That's why I called bullshit on the number he used. He's posting bullshit and trusting that no one actually notices the numbers he used are not from the test he posted.
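For anyone following the percentage argument, this is how "% faster" falls out of raw FPS. A quick Python sketch with made-up FPS numbers (not the ones from the linked review):

    # Hypothetical average FPS, purely for illustration
    fps_gtx1080 = 100.0
    fps_rx480 = 62.0

    faster = (fps_gtx1080 / fps_rx480 - 1) * 100
    print(f"GTX 1080 is {faster:.0f}% faster")   # ~61% with these numbers

    # Note the asymmetry: the RX 480 is not "61% slower"
    slower = (1 - fps_rx480 / fps_gtx1080) * 100
    print(f"RX 480 is {slower:.0f}% slower")     # ~38% with these numbers

That asymmetry (plus averaging across different games) is how two people can read very different percentages out of the same charts.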
Yeah, you see, if you knew anything about AMD CPUs you would know those are all the exact same unlocked chip. But then, you don't have a clue what you're talking about, so.....lol
Only difference there is that the 8370 is 100% guaranteed to hit 4GHz and you can OC it even further, while with the 8300 it remains to be seen if you can OC it to 4GHz and keep it stable.
The 8300 is guaranteed to hit 4 GHz lol. The only thing you're not guaranteed is 5 GHz, but then, you're not guaranteed to hit that with the 8350/8370 either.
You know that these same chips OC to over 5 GHz and hold a world record at 8.4 GHz, right? lol
You know that these same chips were the first and only ones to come with 5 GHz stock, as the 9590?
And when it's only 40% faster, which is what it averages, huh genius?
It's actually really easy to pick out good bins from AMD since they have been special-binning for years. Chances are a new 8370 will not clock as well as one from when it was released, as AMD has already binned those out. But if you know their binning principles, you know where the good bins are: they separate out the good bins for low-power-consumption and overclocking applications. These would be the 8320e, 8370e, 9370, and 9590. So theoretically you can pick out an FX 8320e and overclock it to 4.0 GHz by increasing the voltage so it consumes 125W. I will test it out next year when I replace my system, as I have an FX 8370e. I don't want to do it now since I just burned out my MSI R9 290X Lightning edition, which is supposed to be a special bin.
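Rough sanity check on that 8320e plan, using the usual dynamic-power approximation (power scales roughly with V^2 x f). The voltages below are my guesses, not AMD's specs:

    # Assumed FX-8320e figures, for illustration only
    base_power_w = 95.0    # rated TDP
    base_freq_ghz = 3.2    # stock base clock
    base_vcore = 1.25      # assumed stock voltage

    target_freq_ghz = 4.0
    target_vcore = 1.30    # assumed voltage needed for 4.0 GHz

    # Dynamic power ~ V^2 * f (ignores leakage, so treat it as ballpark only)
    scale = (target_vcore / base_vcore) ** 2 * (target_freq_ghz / base_freq_ghz)
    print(f"Estimated power at 4.0 GHz: {base_power_w * scale:.0f} W")   # ~128 W with these guesses

So landing somewhere around the 125W class is plausible, which is exactly the point of picking the low-power bins.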
AMD doesn't guarantee an 8300 will hit 4.0 GHz. It might hit that, and maybe even ~probably~ will, but both of those are very different things from a guarantee. AMD only guarantees that it will run at 3.3 GHz: anything else and you're on your own, at your own risk, with a voided warranty, and certainly not guaranteed. Unless you're willing to put up the money for that...
Just because "the same chip" gets binned at a lot of different frequencies, doesn't mean that every chip can hit those frequencies. That's a bit part of the reason of why they bin them in the first place; otherwise, why wouldn't they just sell all the 8-core Visheras as 9590's? There's some metric in the binning process (usually either power draw or thermal) that causes AMD to say "You know, we aren't going to guarantee this particular piece of silicon at 4.7Ghz, but it should work just fine at 3.3 Ghz, so we'll just mark down the price a bit and sell it with that rating."
guar·an·tee
/ˌɡerənˈtē/
verb
past tense: guaranteed; past participle: guaranteed
provide a formal assurance or promise, especially that certain conditions shall be fulfilled relating to a product, service, or transaction.
EVERY single FX83xx that goes into retail will hit 4 GHz. The 8300 will go even further, to 4.2 GHz.
Because guess what? The FX8300 has a 4.2 GHz turbo lol
You can theorize about it as much as you want, but that's a guarantee lol.
I'm not really sure why you guys insist on embarrassing yourselves instead of educating yourselves.
These are CPUs that hit 5 GHz regularly. 4 GHz is like an easy, natural state for them lol.
Turbo =/= steady state. Everyone loves to tout AMD's multi-core advantage because it has 8 "true" cores in the 8x00 series. But crank all 8 of those cores to 100%, start hitting that 95W+ TDP, and see how long your Turbo stays turboed on stock settings (which is, after all, the "guarantee" from AMD).
It's not the same thing at all as base clock.
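To put rough numbers on the turbo-vs-TDP point, here's a toy estimate. Every figure is made up for illustration, not an AMD spec:

    tdp_w = 95.0
    uncore_w = 20.0       # assumed: memory controller, L3, etc.
    core_base_w = 9.0     # assumed per-core power at base clock
    core_turbo_w = 14.0   # assumed per-core power at max turbo

    cores = 8
    print("All 8 cores at base: ", uncore_w + cores * core_base_w, "W")   # 92 W, inside TDP
    print("All 8 cores at turbo:", uncore_w + cores * core_turbo_w, "W")  # 132 W, way over TDP

Which is why max turbo is only held on a core or two; load all eight and the chip has to fall back toward base clock to stay inside its power limit.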
But then again, as an educated person who doesn't like to embarrass themselves, I'm sure you knew all that anyway and... well, then that just means you're trolling because you just like to hear yourself argue.
Man, no wonder they can sell you crap, so many clueless people. And not only that, they cherish their stupidity and drive it further lol
This is you:
"it will hit 4.2 Ghz. But it isnt guaranteed to hit 4.2 Ghz"
I'll let you in on a well-kept secret: for a CPU to actually work at 4.2 GHz, it HAS to work at 4.2 GHz lol
4.2 GHz on one core is a lot different than 3.3 GHz on 8 cores. But you know what, for the sake of Christmas - you win. An 8300 will "do*" >4 GHz.
*Limited time only, only when specific core counts are not exceeded, die temperatures may prohibit actual clock speeds, power usage may fluctuate due to the dynamic nature of AMD (tm) Turbo Core (c) Technology, do not try this at home unless you are a professional, do not taunt the 8300, the 8300 may glow in the dark which is normal for certain operating conditions, if dry mouth or red rash occur consult a physician immediately, do not ingest the 8300, if swallowed induce vomiting immediately and contact your nearest Poison Control authority.
Actually, on most chips you don't even have to raise the voltage to hit 4 GHz. On the 8350/8370 you can actually undervolt them quite a bit for much lower power consumption.
@Ridelynn thanks for proving again you have absolutely no clue what you're talking about, but yeah, every 8300 will OC to >4 GHz guaranteed. Suck it up, it's Christmas, you don't need even more embarrassment lol
You should be. But then again, you won't misinform others any more.
TL;DR for those who don't want to read through all the misinformation:
EVERY single FX83xx will hit 4 GHz easily. 4-4.4 GHz is the sweet spot for these chips, depending on the chip, and, from experience and countless results, all of these chips can hit at least 4.4-4.5 GHz.
Up to 4.4 GHz can be done on air with a $20-25 cooler and a good 970-chipset mobo (you can really go higher, but with some precautions).
5 GHz is not guaranteed (except on the 9590), and for >4.5 GHz it's advisable to use a decent 990-chipset motherboard and water cooling (or a more extreme air cooler).
I would HOPE that most people realize that most of the time, peripherals are GIVEN to sites as sponsored material. It is also a FACT that often, not always but often, those items are superior to what we might buy off the shelves. Also, you cannot simply run some program to mimic EVERY game and game engine. Way too many factors come into play, including special beefed-up drivers just to raise the benchmark for certain games.

Also, on the mention of Microsoft and W10, Tim Sweeney of Epic Games made a strong comment against Microsoft and their intentions with the W10 APIs: https://www.theguardian.com/technology/2016/mar/04/microsoft-monopolise-pc-games-development-epic-games-gears-of-war

Similarly, Microsoft forced GOW2 to be an Xbox-ONLY game to help drive Xbox sales. Yes, I know exclusive gaming is happening on several platforms, but I am merely pointing out that Microsoft is every bit as greedy and is trying to monopolize as much as any other.
Never forget Three Mile Island, and never trust a government official or company spokesman.
Very true that Turbo isn't the same as steady state. My 1090T CAN hit 3.6GHz on turbo, but when it does so for a longer period it gets quite hot (still within safe margins, but close to the edge). Since I push my system to 90%+ load quite a lot when programming, or even when playing two games at the same time (BDO AFKing to train horses/process materials), I don't risk prolonged periods of 3.6GHz and high temperatures. For that reason alone I've completely disabled the turbo option and let the CPU run safely at 3.2GHz (not that I really notice that 400MHz difference anyway...)
Another invented number. In the review he linked, the smallest difference was the GTX 1080 being 48% faster than the RX 480, not 40%.
The FX 8370 used in that test costs $185 on Newegg. It's not a $100 processor.
Also, it's likely that the test used the 8GB version of the RX 480. That's not available at $200 anywhere.
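Since the argument is really about perf-per-dollar, here's the arithmetic spelled out. Prices and FPS below are placeholders, not the review's numbers:

    # (average FPS, price in USD) - hypothetical values for illustration
    cards = {
        "RX 480 8GB": (62.0, 240.0),
        "GTX 1080":   (100.0, 600.0),
    }

    for name, (fps, price) in cards.items():
        print(f"{name}: {fps / price * 1000:.0f} FPS per $1000")
    # RX 480 8GB: ~258 FPS per $1000
    # GTX 1080:   ~167 FPS per $1000

With numbers like these the RX 480 wins on FPS per dollar even though the GTX 1080 is much faster in absolute terms; swap in different card or CPU prices and the ratio moves accordingly, which is what the disagreement here is really about.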
https://www.amazon.com/AMD-8-Core-FX-8300-Processor-FD8300WMHKBOX/dp/B00TR8YL4W/ref=sr_1_1?ie=UTF8&qid=1482499902&sr=8-1&keywords=fx+8300
$105
Dude, try not to embarrass yourself so much lol
The 480 4GB is the exact same chip as the 8GB, no shenanigans like the GTX 1060 6GB vs the gimped 1060 3GB, which is a cut-down chip lol
http://www.newegg.com/Product/Product.aspx?Item=N82E16819113398
It has a 700 MHz higher clock speed than your $105 processor.
You know that these same chips OC to over 5 GHz and hold a world record at 8.4 GHz, right? lol
You know that these same chips were the first and only ones to come with 5 GHz stock, as the 9590?
you are too desperate lol
"For a CPU to actually work at 4.2 GHz it HAS to work at 4.2 GHz" - sure, under certain conditions, such as being dropped into a free fall off a cliff.
So yeah, when someone mentions that the FX83xx isn't far superior to a Phenom or something like the 1090T...