Is this CPU even worth getting at the price?


Comments

  • Quizzical Member Legendary Posts: 25,499

    wanderica said:

    Gdemami said:

    Aragoni said:

    First of all, CPUs nowadays are a bit of a gamble, as it depends on how well upcoming games will handle multi-threading.

    Not really, it is easily predictable how it will go.

    Here is the crux:

    Even if the game is well threaded, the bottleneck will still be on the GPU; additional cores, more MHz, or higher IPC won't help, because the workload we put on GPUs increases at a much faster rate than CPU load does.

    This core craze is the same core craze we had when FX launched - 'everyone' was saying games would use more cores. Nothing has happened since, and nothing will change in the foreseeable future because the paradigm above still applies.

    I'm not so sure.  It feels more like Intel Extreme vs. C2D, or C2D vs. quad cores, to me.  In both of those examples, more cores was the correct prediction.  I will admit, however, that games today are being written to scale with available threads, making 8-thread CPUs viable for much longer than single-core or C2D chips were.  At what point, however, do available threads overtake the GPU as the bottleneck?  BF1 is getting close to what we could expect to see.  Even though it's still GPU-limited, as you say, it still caps out 4c/8t chips at near 100% usage.  Doom, on the other hand, using Vulkan, doesn't have that limitation, but it does scale very well with more available cores.

    I think in the end you're correct.  A 7700K will be a fantastic CPU for quite a while, but processors with fewer than 8 threads already bottleneck GPUs in BF1.  I think those CPU requirements will creep up a bit faster than we realize due to thread scaling in modern games.


    A lot of the question is, more cores how soon?  Twenty years ago when it was assumed that CPUs would only have one core, one fast core would beat two slightly slower cores.  That's not even close to true today.  When the first quad core CPUs came out, two faster cores generally beat four slightly slower cores.  That's often not true today--but it is the reason why Intel disabled overclocking on dual cores for years to try to make their more expensive quad core processors superior in even the lightly threaded cases.  Four faster cores will often beat eight slower cores today, but will that still be true in ten years?

    It's likely that twenty years from now, a Ryzen 7 1700 will mostly be better than a Core i7-7700K.  It's also likely that they'll both be awful by then, and whatever you buy today will have long since been replaced by then.

    There's also a question of whether it matters.  If at the settings you like in some particular game, one CPU can deliver 150 frames per second and another 200, does the difference between those matter?  What if your video card can only do 80 and your monitor has a refresh rate of 60 Hz?
  • Quizzical Member Legendary Posts: 25,499

    Ozmodan said:

    Gdemami said:

    Aragoni said:

    First of all, CPUs nowadays are a bit of a gamble, as it depends on how well upcoming games will handle multi-threading.

    Not really, it is easily predictable how it will go.

    Here is the crux:

    Even if the game is well threaded, the bottleneck will still be on the GPU; additional cores, more MHz, or higher IPC won't help, because the workload we put on GPUs increases at a much faster rate than CPU load does.

    This core craze is the same core craze we had when FX launched - 'everyone' was saying games would use more cores. Nothing has happened since, and nothing will change in the foreseeable future because the paradigm above still applies.

    Completely depends on what you are playing.  I like strategy games at times, and I received a nice email from Stardock saying that all their strategy games are going to use a new engine that will benefit greatly from more cores.  They are very excited by the 8-core AMDs and are recommending them for their games.

    They were getting much better performance from the AMD cores than from the Intel i7 cores.  So my next build will probably be AMD for just that reason.


    I would expect strategy games, where the heavy CPU load is from AI computations rather than graphics, to be the first to put many cores to good use.  If a quad core CPU can't deliver playable frame rates, many people aren't going to play your game.  But players can accept game time passing only 1/2 or 2/3 as fast as it would on a CPU with more cores.
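    To make the AI angle concrete, here is a minimal sketch - hypothetical, with `plan_turn` and its workload invented for illustration, not taken from any shipping game - of why per-empire AI work spreads so naturally across cores:

    ```python
    # Hypothetical sketch: scaling strategy-game AI across CPU cores.
    # plan_turn and its workload are invented for illustration.
    import os
    from concurrent.futures import ProcessPoolExecutor

    def plan_turn(empire_id: int) -> tuple[int, int]:
        # Stand-in for expensive per-AI work (pathfinding, economy, diplomacy).
        score = sum(i * i for i in range(100_000))
        return empire_id, score

    def update_all_empires(num_empires: int) -> dict[int, int]:
        # One worker per core: doubling cores roughly doubles how many
        # AI empires can be simulated per unit of wall-clock time.
        with ProcessPoolExecutor(max_workers=os.cpu_count() or 1) as pool:
            return dict(pool.map(plan_turn, range(num_empires)))

    if __name__ == "__main__":
        print(len(update_all_empires(8)))  # prints 8
    ```

    The flip side is that the same code still runs on a quad core - you just simulate fewer empires or accept slower turns, which is exactly the trade-off described above.
    
    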
  • hatefulpeace Member Uncommon Posts: 621

    Quizzical said:

    I would expect strategy games, where the heavy CPU load is from AI computations rather than graphics, to be the first to put many cores to good use.  If a quad core CPU can't deliver playable frame rates, many people aren't going to play your game.  But players can accept game time passing only 1/2 or 2/3 as fast as it would on a CPU with more cores.


    Well, apparently you never played a 4X game like Stellaris or Sins. It works like this: they would not make a game and force everyone to use high-end settings, just like they don't with any MMOs. They would say: if you only have a quad core, we recommend you do not raise the number of AIs past, say, 10, or whatever they found in their testing would be stable with a quad core. If you have 8 cores, you can have 20 AIs.

    It is like Stellaris: if you only have a low-end computer, don't make the galaxy 3000 star systems with 100 AIs and a bunch of mods. Stellaris, if you want high-end stuff and a lot of it, can require a really high-end CPU; there are thousands of people who play Stellaris on a low-end computer, they just can't play 3000 star systems.

    No big company is going to make a game that cuts off a large majority of potential customers.
  • gervaise1 Member Epic Posts: 6,919
    @angerbeaver :

    If you find your processor in this table http://www.tomshardware.com/reviews/cpu-hierarchy,4312.html you will see that any new processor is going to give you a huge jump in performance. The chart lumps processors together so that people can get a quick comparison, and your 950 is in the 4th bracket. So a jump to ANY 1st-bracket processor will be huge.


    Some in this thread have suggested "less powerful" CPUs than the one you posted, and there is a reason for this. Back when you bought your 950, the "best consumer CPUs" were the most powerful available at the time - i7s and so on.

    Tech has moved on, and as a result the "most powerful" consumer CPUs are overkill for most people. That's why people have suggested "slower" CPUs. Don't let "slower" fool you: all Kaby Lake processors, along with the new Ryzen processors, are beasts.

    So you have lots of choice when it comes to cpus. And lots of budget options as well.

    Maybe what you might want to look at first is your motherboard, and maybe spend a little bit more on it. If looking at Intel, look at the Z270 series, e.g. MSI Z270 SLI Plus, rather than an H or even a B series chipset, e.g. MSI H110M Grenade. Intel's list price for Z is only about US$20 more, so the equivalent Z-series motherboard shouldn't be that much more than the same H or B series.

    And then focus on your graphics card.

    Then memory. I think all Z270 boards support up to 64GB, so decide how much memory you want - a minimum of 16GB probably, but you may opt for more.

    After that, see what those parts would cost and how it stacks up against what you want to spend. That might decide which CPU to go with.

    You also asked about Intel's next-gen CPUs. The expectation is that the new CPUs will use the same socket as Kaby Lake. So - if you are "concerned" - an option might be to go with an inexpensive Kaby Lake Pentium (all Kaby Lakes are beasts) and upgrade to Cannon Lake in the future. (Although I think you will find that if you have managed with a 950 until now, whatever you choose will be so much faster you won't feel the need.)

     
  • Cleffy Member Rare Posts: 6,414
    edited April 2017
    More cores allow for more actors on screen with more things to do, and that kind of work generally scales well across cores. We won't break 4 cores anytime soon, because the general consumer is just getting off of 2 cores. Game developers tend to program games for the general consumer, since it would be difficult to program a game to run on fewer cores while still taking advantage of more. We won't see game developers creating 16-thread PC games until the typical PC can compute 16 threads.
  • Ridelynn Member Epic Posts: 7,383
    It's one of those chicken-and-egg scenarios:

    Developers won't write software that needs cutting-edge hardware.
    Cutting-edge hardware doesn't sell because there's no software that needs it.

    Every now and then, you get a defining "something" that breaks through:

    I can remember the first time I saw Quake running on a Voodoo card. That was a world-altering moment for me. Not many people had dedicated 3D Accelerators, but that game right there showed me what they can do, and I saved my money and bought a Voodoo2 shortly after that.

    Crysis was probably the next title after Quake to really push the envelope. For several hardware generations after that game was released, it was still the standard question to ask: "But can it run Crysis on Max?"

    I can remember Ageia PhysX - that didn't catch on in its original implementation, but it definitely brought attention to in-game physics effects, and it lives on, not only as PhysX but as numerous other libraries that brought a new level of realism to gaming.

    I can remember my first dual-CPU build (the Abit BP6, which I was reminded of not too long ago) - running WinNT 4 (because Win98 didn't support SMP at all). Absolutely nothing really supported SMP back then, but there was potential, and today we are talking about 16-core chips in consumer devices. It took almost a decade for even just 2 cores/CPUs to reach the mainstream from then.

    The first computer I booted off an SSD - computers changed for me that moment. Such a small thing as booting up in <10 seconds and nearly eliminating all UI lag made a huge difference in how I use computers.

    The iPhone - I still have my 1.0 lying around somewhere, and it still works (even though I once ran it through the clothes washer by accident).

    If you make something truly great and game-changing, it will bring the world up to its standard. But if you're not truly great, everyone will just stop, point, say "That was stupid", and move on. For every moment I can think of that was actually game-changing, there are hundreds that thought they would be too, but just weren't there.

    It won't necessarily take every computer in the world having 16 cores to finally get a good game that needs 16 cores. And conversely, even if everyone in the world did have 16 cores, that doesn't mean developers would all of a sudden be able to jump out there and actually use them all to great effect. 

    Honestly, I think we are all discussing very small evolutionary steps in the current desktop/laptop CPU. I don't think anything there shy of thousands of cores is really all that earth-shattering or ground-breaking. What comes ~after~ we all sit around counting x86 cores - that, I think, is what is going to be exciting.
  • GrubbsGrady Member Uncommon Posts: 371

    I spent a lot of time deciding that question. If you have a 144Hz monitor I would go with the i7. If you have lots of money to waste, the 7700K as stated is the fastest, but the 6700 isn't that far behind, especially if you overclock the 6700 to 4.5, which wouldn't be too hard to do - my friend and I got his to 4.5 in like 10 minutes.

    I dunno if I could find my post on here, but I wrote it all down. Basically the Ryzen 1700 overclocked to 3.8 gets between 20-90 fps less than an i7 with a 1080 Ti. Those are max fps though; the average is more like 15-20, and the min is like 10.

    Here it is.
    http://forums.mmorpg.com/discussion/comment/7142251#Comment_7142251

    If you are doing 4K, the CPU doesn't really matter in the slightest. An 8350 will get basically the same FPS as a 7700K with a 1080 Ti, because the GPU bottlenecks.

    So there is really only one reason to get a 7700K, and that is if you use a 1080p 144Hz monitor. If not, they are a waste of money, because a Ryzen 1700 will get over 60 fps at 1080p, and you can't see any difference if you don't have a 144Hz monitor. The 7700K in and of itself is really a waste of money, to be honest, even if you're doing 144Hz, because an i5 7600K will get you the same results in gaming as the i7 but cost almost half the price. It used to be that the i7 was better at multi-threaded tasks, so people would get it because they wanted to do more than game, but if you are doing more than gaming, the Ryzen 1700 would blow an i7 away at like 2x the speed.

    I owned an i5 6600K, and I hit a bottleneck with it at the time. I was playing two EVE Online clients and Stellaris, which not too many people attempt to do. At the time there was no Ryzen, so I bought a 6800K, which got rid of the bottleneck, but I blew it up. The 6600K was so close in gaming to my friend's 6700K that it wasn't worth talking about.

    If all you are doing is streaming, playing games and YouTubing and you have a 144Hz monitor, then the i5 is the ticket. If you have 1080p 60Hz or 4K 60Hz, Ryzen is the better deal. If all you're doing is going to be 1080p or 4K 60Hz gaming, the best deal is the Ryzen 4-core, because it is like 100 bucks.


    From your experience: someone using a 1440p/144Hz monitor and a 1080 Ti who is going to be gaming, with streaming 40% of the time... does it make more sense performance-wise to go with the 7700K or a Ryzen 1700+?

    I see a lot of people say "streaming, then get Ryzen", but what if I am playing games on high settings off stream more than I am streaming? Is a Ryzen going to ruin that?

    Gaming on high settings is priority 1, streaming is priority 2.

  • toolowdown71 Member Uncommon Posts: 24
    If gaming is your #1 priority, get the 7700K. I was in the same boat a month ago and got the 7700K; I have not looked back since. I overclocked it out of the box to 4.8 without delidding, and it's a beast in games :)
  • Ridelynn Member Epic Posts: 7,383

    GrubbsGrady said:

    From your experience: someone using a 1440p/144Hz monitor and a 1080 Ti who is going to be gaming, with streaming 40% of the time... does it make more sense performance-wise to go with the 7700K or a Ryzen 1700+?

    I see a lot of people say "streaming, then get Ryzen", but what if I am playing games on high settings off stream more than I am streaming? Is a Ryzen going to ruin that?

    Gaming on high settings is priority 1, streaming is priority 2.



    At high settings at 1440p, Ryzen 7 and the Intel 7700 are more or less matched - within 5% most of the time, with Ryzen taking the lead on some occasions, and a bigger opportunity for efficiency gains moving forward.

    During streaming, I have no idea how the two match up - it blows my mind that people would even attempt to stream with CPU encoding, but apparently there are people out there who do.

    So with that in mind, it really seems like this isn't such an important question. Even with future optimizations, I wouldn't expect Ryzen to all of a sudden go from "fairly evenly matched" to "blows Intel out of the water".

    You can build a very nice gaming rig out of either CPU right now. In applications that support it (which aren't exactly common, but do exist - maybe CPU stream encoding is one of them), 8/16 is going to demolish 4/8. If you're using Quicksync for encoding, then maybe you should be looking for a Crystalwell package (Iris Pro/Plus graphics, like the i7-6970HQ), as that depends on the integrated GPU and not so much on CPU cores.
  • GrubbsGrady Member Uncommon Posts: 371

    Ridelynn said:

    At high settings at 1440p, Ryzen 7 and the Intel 7700 are more or less matched - within 5% most of the time, with Ryzen taking the lead on some occasions, and a bigger opportunity for efficiency gains moving forward.

    During streaming, I have no idea how the two match up - it blows my mind that people would even attempt to stream with CPU encoding, but apparently there are people out there who do.

    So with that in mind, it really seems like this isn't such an important question. Even with future optimizations, I wouldn't expect Ryzen to all of a sudden go from "fairly evenly matched" to "blows Intel out of the water".

    You can build a very nice gaming rig out of either CPU right now. In applications that support it (which aren't exactly common, but do exist - maybe CPU stream encoding is one of them), 8/16 is going to demolish 4/8. If you're using Quicksync for encoding, then maybe you should be looking for a Crystalwell package (Iris Pro/Plus graphics, like the i7-6970HQ), as that depends on the integrated GPU and not so much on CPU cores.


    So do you feel that it's better to hardware-encode using your GPU, or are you just saying that people streaming need a dedicated system for it and should utilize something like a capture card?
  • RidelynnRidelynn Member EpicPosts: 7,383
    It depends greatly on your rig, what game you are streaming, what software you are using to stream, and where you are streaming to.

    In general:

    GPUs can perform tasks like encoding faster than CPUs, but they can only do so with specific software, and with certain bitrate and other constraints. Your typical GPU has hundreds or thousands of processing units to work with; the CPU only has a handful. Software like NVIDIA's ShadowPlay, or plugins which use NVENC, does this.
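    As a concrete illustration of that split (a sketch, not a recommendation: the exact flags depend on your ffmpeg build, `h264_nvenc` requires an NVENC-capable NVIDIA card, and the filenames are made up), the same recording could be encoded either way with ffmpeg:

    ```shell
    # CPU encode: runs x264 on general-purpose cores - flexible, but it
    # competes with the game for CPU time.
    ffmpeg -i gameplay.mkv -c:v libx264 -preset veryfast -b:v 6000k cpu_out.mp4

    # Hardware encode: hands the work to NVIDIA's NVENC block, leaving the
    # CPU cores mostly free (requires an ffmpeg build with NVENC support).
    ffmpeg -i gameplay.mkv -c:v h264_nvenc -b:v 6000k gpu_out.mp4
    ```

    Streaming front ends like OBS expose the same choice as an encoder setting (x264 vs. NVENC).
    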

    The CPU will be more flexible, in that it's a general-purpose processor, but it will tend to encode more slowly.

    Most games aren't particularly CPU-greedy, especially older DX9/DX11 titles. They use a couple of cores fairly heavily, and the rest just get some light processing threads, so a typical game will have a lot of cores at or near idle.

    Some games are heavily GPU-bound, and trying to offload encoding to the GPU there isn't a great idea. A few games are CPU-bound, and conversely, offloading encoding to the CPU there will affect your game performance. There are also other considerations, such as disk access, PCIe limitations, RAM speeds, network access, etc.

    A good solution is often Quicksync, which uses the otherwise dormant IGP of an Intel CPU - your GPU is used to game, your CPU cores are mostly used for gaming, and the IGP does the heavy lifting of the stream encoding. This is an older thread, but it can give you an idea.

    Intel CPUs with Crystalwell GPUs (Iris Pro) have a beefier IGP than standard CPUs, which means they will encode faster/at higher quality if you are using Quicksync. You won't get quite as fast a CPU though, and because you're sharing the die (and cooling) between your CPU and IGP, your CPU may throttle more or not be able to achieve full boost clocks as often.

    If you want the best of all worlds, you'd encode with a completely separate machine. That would let you use the full resources of your gaming rig for your game, and the full resources of your encoding machine for the stream encode. I don't know if/how streaming software handles that, though, or how big of a hassle it is - I'm not a streamer, just speaking from the technology perspective.

    Another option would be to not stream live - again, I'm not a streamer, so I don't know how that would affect things like viewers or sponsorship, but if you're just dumping raw video to your hard drive and then encoding and uploading it at a later time, that's another way to avoid running both processes at once on the same machine.

    If you aren't a professional streamer bound to very high resolutions and bitrates, the easiest alternative may just be to turn down the quality of your stream a bit. Dropping the bitrate and/or resolution will have a huge impact on the resources needed to encode, and won't sacrifice the visual fidelity of the game running on your local machine.

  • GrubbsGrady Member Uncommon Posts: 371

    Thanks for that explanation, it really helps me.

  • Quizzical Member Legendary Posts: 25,499

    Ridelynn said:

    A good solution is often Quicksync, which uses the otherwise dormant IGP of an Intel CPU - your GPU is used to game, your CPU cores are mostly used for gaming, and the IGP does the heavy lifting of the stream encoding. This is an older thread, but it can give you an idea.

    Intel CPUs with Crystalwell GPUs (Iris Pro) have a beefier IGP than standard CPUs, which means they will encode faster/at higher quality if you are using Quicksync. You won't get quite as fast a CPU though, and because you're sharing the die (and cooling) between your CPU and IGP, your CPU may throttle more or not be able to achieve full boost clocks as often.



    QuickSync doesn't use a GPU.  It uses a fixed-function hardware block to do video transcoding.  That's why, when it launched, it was able to be better than video transcoding on higher end discrete video cards.
  • Cleffy Member Rare Posts: 6,414

    I think this is the current argument for Ryzen for streaming: it produces the best-quality stream without an external solution. However, Linus has been a bit of a shill for AMD lately.
  • Ozmodan Member Epic Posts: 9,726
    Well, here is another big boost for Ryzen; it seems the i7-7700 has some nasty issues....

    http://www.theregister.co.uk/2017/05/04/intel_i77700_heat_spike_problems/
  • Ozmodan Member Epic Posts: 9,726
    edited May 2017

    Ozmodan said:

    Well, here is another big boost for Ryzen; it seems the i7-7700 has some nasty issues....

    http://www.theregister.co.uk/2017/05/04/intel_i77700_heat_spike_problems/

    Mine disagrees, but then it's a "K" version...

    If you rebuild every couple of years it's probably not a problem, but the chances of your system lasting past that are not that good.  Ryzen chips run cooler, and my experience has always been that heat kills.  I keep my systems for at least 5 years, hence I won't be buying any Intel chips in the near future.  I have had to replace far too many customers' Intel CPUs lately that were supposedly safely overclocked.
  • Ridelynn Member Epic Posts: 7,383
    edited May 2017

    Ozmodan said:

    Ryzen chips run cooler and my experience has always been that heat kills.  I keep my systems for at least 5 years, hence I won't be buying any Intel chips in the near future.  I have had to replace far too many customers' Intel CPUs lately that were supposedly safely overclocked.

    Wait, what?

    Both chips (Ryzen 7 vs 7700k) pull about the same power at stock frequencies, but:

    Ryzen 7 doesn't overclock nearly as far (for R7, 4.1 is about the ceiling, whereas the 7700K goes to around 4.8), and when overclocked, it pulls a lot more power than a 7700K:

    https://www.pcper.com/reviews/Processors/Overclocking-AMD-Ryzen-7-1700-Real-Winner
    http://www.anandtech.com/show/10968/the-intel-core-i7-7700k-91w-review-the-new-stock-performance-champion/11

    Also, what exactly is a "safe" overclock?

  • Ozmodan Member Epic Posts: 9,726
    A safe overclock is no overclock!  Most overclockers burn out their CPUs within 2 years.
  • Ridelynn Member Epic Posts: 7,383

    Ozmodan said:

    A safe overclock is no overclock!  Most overclockers burn out their CPUs within 2 years.

    Is there some data to back this up? It certainly isn't my experience.

    I've overclocked a lot of different CPUs over the years. I've broken a few - most of them physically (remember trying to mount heatsinks on the raised IC surface of the T-Bird/Duron?).

    In the days before thermal throttling, I did burn up a couple through overvoltage while trying to see how far I could push them. I've seen PSUs go out and take out a CPU with them (that one is rare; CPUs tend to be well shielded by the motherboard's power circuitry these days).

    But I've never had a CPU that was otherwise working, overclocked or not, burn out on me. I've still got an Athlon 64 somewhere that runs and was OCed its entire life. I've got an i7 920 that's been overclocked to 4.0GHz for its entire life, and it still runs.

    I'm not saying you're wrong in that OCing can contribute to early silicon death, or that some people have had their CPU fail inside of 2 years. I've just never had a CPU live long enough to hit that silicon-death point, even while overclocked, nor do I know anyone else who has.
  • Vorthanion Member Rare Posts: 2,749
    edited May 2017
    One thing you might consider is that the i7-7700K might drop considerably in price with the release of the new chip toward the end of 2017.  That alone might be worth the 3- or 4-month wait.  If the new chip ends up being a huge upgrade and has been tested and reviewed as such, then go ahead and purchase it instead.  Your current setup is good enough to last you till the end of the year to find out for sure, and waiting could potentially save you a lot of money.

    Isn't Intel going to be releasing a 10 / 12 core CPU around that time to replace the current Broadwell series?

    Update:  Apparently, a roadmap was leaked, and it reveals that both Skylake-X (Broadwell-E successor) and Kaby Lake-X (Kaby Lake enthusiast segment) are releasing in Q2, starting in June at some tech show in Taiwan.  Both will use the same Intel X299 HEDT platform.

    http://wccftech.com/intel-skylake-x-kaby-lake-x-q2-2017-roadmap-leak/

  • someforumguy Member Rare Posts: 4,088
    I would never post a question about hardware on these forums, lol. I know there are people here who know a lot about hardware, but at the same time you have to wade through countless almost fan-based comments (all the tripe with anecdotal evidence about why <insert brand> is utter shit or awesome).

    - Ignore all posts claiming shit because of some anecdote.
    - Ignore all posts from people who clearly didn't read about the requirements you have for your new pc.
    - Ignore all posts from people who can't compare sensibly (some intel cpu vs amd cpu with huge price difference or stuff like that).

    - Ignore this post too.

    I wonder how many posts will be left. Someone count, please. And OP, good luck with your purchase. May the gods protect your brain from getting hurt by these forums.
  • Ridelynn Member Epic Posts: 7,383
    Yeah, I saw that roadmap - interesting to note that Kaby Lake-X is limited to 4/8 (basically a 7700 without the IGP). Skylake-X will get up to a 12/24-core model (up from Broadwell's current 10/20).

    That being said, apart from retailer inventory reduction, chipmakers very rarely discount existing hardware when their new generation comes out - particularly Intel, who tends to actually raise prices as chips move to End-Of-Life.

    I would expect X299 to be similar to the current X99 - quad-channel DDR4, more PCIe lanes than the H/B/Z series, maybe more robust RAID/M.2 options, but all in all not a lot past that. Exciting if you're working on multi-card SLI or have some particular need for a lot of RAM bandwidth; not so much if you're not.
  • Robsolf Member Rare Posts: 4,607
    For me, the choice of CPU has really been more about buying time than about raw performance.  Not just in the immediate sense - "X loads in 20 seconds, compared to 23 seconds with the other CPU" - but in the time you'll have before you have to upgrade.

    I've seen a few arguments for just buying a previous-generation CPU for much cheaper, and it's probably true that you won't see much of a difference between those and a 7700 on a 60Hz LCD... now.  But you're going to be short that time, in generations, in terms of system viability.  If you buy a 1-year-old CPU design, you're going to be short 1 year, more or less, of system usefulness.  There's still a perfectly good argument for it, though.

    Up until this time, when I went with the 7700, that was pretty much my system-building strategy.  I'd buy a good $100-ish CPU, buy as future-proof a mobo as I could afford, and replace the CPU when the system either got too slow or the socket was retired.  If the latter, I'd buy the best compatible CPU at a fraction of the cost of when it was the newest gen.  As a result, I'd end up with a better, faster system for about the same money as if I had bought the latest gen in the first place.

    Personally, I found that AMD was best for this.  They maintained socket compatibility far longer than Intel did, and this may still be the case today.

    Now, the reason I went with the 7700 this time is that I was tired of some of the quirky crap I got from the AMD from day one.  Games like LotRO and Marvel Heroes would hitch terribly - something neither of my Intel laptops did, despite having weaker video cards and slower HDDs.  Other games would inexplicably go from 60 FPS to a slideshow in certain areas, whether the system was 0 years old or 8 years old.  A few weeks now with the 7700 has yielded 0 surprises like this, and that makes me happy.
  • Ozmodan Member Epic Posts: 9,726

    Ozmodan said:

    Well, here is another big boost for Ryzen; it seems the i7-7700 has some nasty issues....

    http://www.theregister.co.uk/2017/05/04/intel_i77700_heat_spike_problems/

    Mine disagrees, but then it's a "K" version...

    Come by my place sometime; I have a box of burnt-out processors that I have had to replace.