Why you shouldn't buy the newest hardware.

l2avism2 Member Uncommon Posts: 38
I'm in my thirties and I've seen a lot of hardware come and go.
I did my first online gaming on a 100 MHz Pentium with a PCI graphics card and 56 kbps dial-up.
I've spent a lot of cash on hardware replacements since then, and there is one thing that has always been true:
The newest hardware isn't always the best option for your gaming PC.
I'm not saying that from a performance perspective, but rather from a cost-efficiency perspective.
You can usually play modern games on 1 to 2 year old hardware at max or near-max settings and get really great results.
So instead of buying the best hardware from today, or settling for mediocre hardware from today, buy the best hardware from a year ago after it's been marked down by 75%.
Many types of hardware, like CPUs and graphics cards, come in many tiers. The lower tiers are often significantly slower than the higher tiers from 3 years ago.
Usually only the feature sets improve, while the processor clocks on CPUs and video cards improve very little over the years.
It's been nearly 20 years since the first desktop 5 GHz CPU, and we are still buying new CPUs with sub-5 GHz clock rates. Really they have just added more multithreading features, and AMD and later Intel moved the memory controller onto the CPU. But even then, we had the ability to run 8 cores back in 1996 with the 133 MHz Pentium Pro by using an 8-CPU motherboard. (Not saying that an 8-core Pentium Pro would run Crysis, though.)
For example, my little sister plays modern MMOs on my 5-year-old gaming PC that I built using an older AMD Phenom CPU and a Radeon card, and she doesn't have to turn the settings down at all.
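
As a rough illustration of the cost-per-performance idea, here is a minimal Python sketch; the prices and frame rates are made-up placeholders, not real card prices or benchmark numbers.

def dollars_per_fps(price, avg_fps):
    # cost efficiency = what you pay per average frame per second
    return price / avg_fps

cards = {
    "hypothetical new flagship": dollars_per_fps(price=700, avg_fps=120),
    "hypothetical last-gen high end": dollars_per_fps(price=300, avg_fps=95),
    "hypothetical current budget card": dollars_per_fps(price=250, avg_fps=55),
}

for name, cost in cards.items():
    print(f"{name}: ${cost:.2f} per average fps")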

Comments

  • Quizzical Member Legendary Posts: 25,499
    The problem with saying to wait until prices drop by 75% is that they don't drop by 75%.  For Intel CPUs, the prices commonly don't drop at all; Intel just discontinues the old parts and slots in the new ones at about the same price as the old.  For AMD CPUs or GPUs from either vendor, prices may drop a little, but they prefer to discontinue the old lineup to make room for the new rather than continuing to sell the old at cheaper prices.  At most, you'll see the old lineup drop in price by about enough to bring its price/performance in line with the new, but at the expense of the old still using more power.

    There are rare exceptions where a company produced too much of some particular GPU and then needs to get rid of it.  AMD did have clearance pricing on their Cypress GPU for roughly the month of April 2011, and their Hawaii GPU for a while some years later.  But those are unusual.

    An eight-socket server is not at all comparable to an eight-core CPU.  For starters, going multiple sockets creates a ton of NUMA problems.  It also makes a mess of your memory configuration, as every socket has to have its own memory.
  • IceDark Member Uncommon Posts: 207
    l2avism2 said:
    I'm in my thirties and I've seen a lot of hardware come and go.
    I did my first online gaming on a 100 MHz Pentium with a PCI graphics card and 56 kbps dial-up.
    I've spent a lot of cash on hardware replacements since then, and there is one thing that has always been true:
    The newest hardware isn't always the best option for your gaming PC.
    I'm not saying that from a performance perspective, but rather from a cost-efficiency perspective.
    You can usually play modern games on 1 to 2 year old hardware at max or near-max settings and get really great results.
    So instead of buying the best hardware from today, or settling for mediocre hardware from today, buy the best hardware from a year ago after it's been marked down by 75%.
    Many types of hardware, like CPUs and graphics cards, come in many tiers. The lower tiers are often significantly slower than the higher tiers from 3 years ago.
    Usually only the feature sets improve, while the processor clocks on CPUs and video cards improve very little over the years.
    It's been nearly 20 years since the first desktop 5 GHz CPU, and we are still buying new CPUs with sub-5 GHz clock rates. Really they have just added more multithreading features, and AMD and later Intel moved the memory controller onto the CPU. But even then, we had the ability to run 8 cores back in 1996 with the 133 MHz Pentium Pro by using an 8-CPU motherboard. (Not saying that an 8-core Pentium Pro would run Crysis, though.)
    For example, my little sister plays modern MMOs on my 5-year-old gaming PC that I built using an older AMD Phenom CPU and a Radeon card, and she doesn't have to turn the settings down at all.
    Well, I don't agree.

    Of course you are not going to upgrade on a yearly basis, and most gamers will build a computer that will run for around 5 years (with some upgrades in the meantime if necessary).

    From a cost-efficiency perspective, for example, I bought a GTX 1060 right around when it was released for EUR 347,99, and almost 2 years later the price is EUR 375,80.

    Also, CPU GHz has an effect on gaming performance but it is not the only thing to consider, and so your theory about this "issue" is not valid.

    "It's been nearly 20 years since the first desktop 5 GHz CPU" - I did Google this, but can't seem to find anything close to it. All I found is:

    On March 6, 2000, AMD reached the 1 GHz milestone a few months ahead of Intel

    In 2002, an Intel Pentium 4 model was introduced as the first CPU with a clock rate of 3 GHz

    As of mid-2013, the highest clock rate on a production processor is the IBM zEC12, clocked at 5.5 GHz, which was released in August 2012

    So if you take the 2002 P4 3 GHz and compare it with the Ryzen 5 1600, for example, which has a base clock of 3.2 GHz, you will see a massive difference, even though the GHz are almost the same.

    If anything, we are on a very good track in terms of computers these days.
    The Ice is dark and full of terror.
  • Vrika Member Legendary Posts: 7,989
    edited May 2018
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.


    EDIT: There's a good case to be made for not buying the most expensive hardware, but advising people to buy previous-generation hardware is stupid. Some parts are occasionally available at cheap prices, but those are exceptions; usually the latest-gen hardware is the best choice unless you're buying used.
     
  • Scorchien Member Legendary Posts: 8,914
    edited May 2018
    I keep 3 PCs in the house: my office, the wife's office, and the kid's. I buy a new PC every 3 years and pass them down... works out fine.

      I pass mine down to my wife, hers down to the kid... I have given the extra one away to friends with kids that need one half a dozen times.
  • H0urg1ass Member Epic Posts: 2,380
    Scorchien said:

      I have given the extra one away to friends with kids that need one half a dozen times.
    That's pretty awesome of you
  • l2avism2 Member Uncommon Posts: 38
    IceDark said:
    l2avism2 said:
    I'm in my thirties and I've seen a lot of hardware come and go.
    I did my first online gaming on a 100 MHz Pentium with a PCI graphics card and 56 kbps dial-up.
    I've spent a lot of cash on hardware replacements since then, and there is one thing that has always been true:
    The newest hardware isn't always the best option for your gaming PC.
    I'm not saying that from a performance perspective, but rather from a cost-efficiency perspective.
    You can usually play modern games on 1 to 2 year old hardware at max or near-max settings and get really great results.
    So instead of buying the best hardware from today, or settling for mediocre hardware from today, buy the best hardware from a year ago after it's been marked down by 75%.
    Many types of hardware, like CPUs and graphics cards, come in many tiers. The lower tiers are often significantly slower than the higher tiers from 3 years ago.
    Usually only the feature sets improve, while the processor clocks on CPUs and video cards improve very little over the years.
    It's been nearly 20 years since the first desktop 5 GHz CPU, and we are still buying new CPUs with sub-5 GHz clock rates. Really they have just added more multithreading features, and AMD and later Intel moved the memory controller onto the CPU. But even then, we had the ability to run 8 cores back in 1996 with the 133 MHz Pentium Pro by using an 8-CPU motherboard. (Not saying that an 8-core Pentium Pro would run Crysis, though.)
    For example, my little sister plays modern MMOs on my 5-year-old gaming PC that I built using an older AMD Phenom CPU and a Radeon card, and she doesn't have to turn the settings down at all.
    Well, I don't agree.

    Of course you are not going to upgrade on a yearly basis, and most gamers will build a computer that will run for around 5 years (with some upgrades in the meantime if necessary).

    From a cost-efficiency perspective, for example, I bought a GTX 1060 right around when it was released for EUR 347,99, and almost 2 years later the price is EUR 375,80.

    Also, CPU GHz has an effect on gaming performance but it is not the only thing to consider, and so your theory about this "issue" is not valid.

    "It's been nearly 20 years since the first desktop 5 GHz CPU" - I did Google this, but can't seem to find anything close to it. All I found is:

    On March 6, 2000, AMD reached the 1 GHz milestone a few months ahead of Intel

    In 2002, an Intel Pentium 4 model was introduced as the first CPU with a clock rate of 3 GHz

    As of mid-2013, the highest clock rate on a production processor is the IBM zEC12, clocked at 5.5 GHz, which was released in August 2012

    So if you take the 2002 P4 3 GHz and compare it with the Ryzen 5 1600, for example, which has a base clock of 3.2 GHz, you will see a massive difference, even though the GHz are almost the same.

    If anything, we are on a very good track in terms of computers these days.
    We've been stuck at 3-4 GHz for 16+ years.

    Also:

    https://www.engadget.com/2007/01/24/pentium-4-overclocked-to-8ghz-lets-see-your-fancy-core-2-try-t/

    https://www.computerworld.com/article/2490182/computer-processors/intel-s-new-core-i7-chip-hits-5ghz.html

    https://www.techradar.com/news/computing-components/processors/amd-on-its-5ghz-processor-you-don-t-buy-a-ferrari-for-the-mpg-1158675

    Also, the P4s topped out at 3.8 GHz out of the box in 2005. The AMD processors at the time were 64-bit, clocked at 2 GHz, and were a lot faster because of the faster memory bus (dual-channel DDR2) and more cache.
    At the time Intel was working on an entirely incompatible 64-bit CPU called Itanium, and AMD released the Athlon 64, which could run all old 32-bit versions of Windows, run an extended Intel x86 instruction set called AMD64, and run Windows XP 64-bit. This caused a huge shakeup at Intel, because the Pentiums were too slow to compete with the Athlon chips and Intel could not run 64-bit desktop Windows (Itanium required a special version of Windows Server or Linux).
    Anyway, after Intel adopted the AMD instruction set and produced the Pentium Ds, which could run 64-bit Windows XP, they ran at clock speeds comparable to the AMD chips (around 2 GHz).

    We have been bouncing around between 3 and 4 GHz for basically 16 years. They just keep adding more cores (which most games won't use) or sometimes a little more cache. The memory bus has increased significantly.
    However, the progress after 2003 is embarrassingly slow compared to that of 1990 to 2003. It's like we get excited over the most insignificantly small gains, rush out and spend a thousand dollars, and then come home disappointed.

  • l2avism2 Member Uncommon Posts: 38
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
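
    As a rough illustration of "operations per second" as a unit, here is a minimal Python sketch that times a fixed batch of floating-point work. It mostly measures interpreter overhead, so treat it as an illustration of the metric, not a real benchmark like PCMark or LINPACK.

    import time

    def estimate_ops_per_second(n=5_000_000):
        x = 1.0
        start = time.perf_counter()
        for _ in range(n):
            x = x * 1.000001 + 0.000001  # two floating-point operations per iteration
        elapsed = time.perf_counter() - start
        return (2 * n) / elapsed

    print(f"~{estimate_ops_per_second() / 1e6:.1f} million floating-point ops/sec (single thread)")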
  • Quizzical Member Legendary Posts: 25,499
    l2avism2 said:
    IceDark said:
    l2avism2 said:
    I'm in my thirties and I've seen a lot of hardware come and go.
    I did my first online gaming on a 100 MHz Pentium with a PCI graphics card and 56 kbps dial-up.
    I've spent a lot of cash on hardware replacements since then, and there is one thing that has always been true:
    The newest hardware isn't always the best option for your gaming PC.
    I'm not saying that from a performance perspective, but rather from a cost-efficiency perspective.
    You can usually play modern games on 1 to 2 year old hardware at max or near-max settings and get really great results.
    So instead of buying the best hardware from today, or settling for mediocre hardware from today, buy the best hardware from a year ago after it's been marked down by 75%.
    Many types of hardware, like CPUs and graphics cards, come in many tiers. The lower tiers are often significantly slower than the higher tiers from 3 years ago.
    Usually only the feature sets improve, while the processor clocks on CPUs and video cards improve very little over the years.
    It's been nearly 20 years since the first desktop 5 GHz CPU, and we are still buying new CPUs with sub-5 GHz clock rates. Really they have just added more multithreading features, and AMD and later Intel moved the memory controller onto the CPU. But even then, we had the ability to run 8 cores back in 1996 with the 133 MHz Pentium Pro by using an 8-CPU motherboard. (Not saying that an 8-core Pentium Pro would run Crysis, though.)
    For example, my little sister plays modern MMOs on my 5-year-old gaming PC that I built using an older AMD Phenom CPU and a Radeon card, and she doesn't have to turn the settings down at all.
    Well, I don't agree.

    Of course you are not going to upgrade on a yearly basis, and most gamers will build a computer that will run for around 5 years (with some upgrades in the meantime if necessary).

    From a cost-efficiency perspective, for example, I bought a GTX 1060 right around when it was released for EUR 347,99, and almost 2 years later the price is EUR 375,80.

    Also, CPU GHz has an effect on gaming performance but it is not the only thing to consider, and so your theory about this "issue" is not valid.

    "It's been nearly 20 years since the first desktop 5 GHz CPU" - I did Google this, but can't seem to find anything close to it. All I found is:

    On March 6, 2000, AMD reached the 1 GHz milestone a few months ahead of Intel

    In 2002, an Intel Pentium 4 model was introduced as the first CPU with a clock rate of 3 GHz

    As of mid-2013, the highest clock rate on a production processor is the IBM zEC12, clocked at 5.5 GHz, which was released in August 2012

    So if you take the 2002 P4 3 GHz and compare it with the Ryzen 5 1600, for example, which has a base clock of 3.2 GHz, you will see a massive difference, even though the GHz are almost the same.

    If anything, we are on a very good track in terms of computers these days.
    We've been stuck at 3-4 GHz for 16+ years.

    Also:

    https://www.engadget.com/2007/01/24/pentium-4-overclocked-to-8ghz-lets-see-your-fancy-core-2-try-t/

    https://www.computerworld.com/article/2490182/computer-processors/intel-s-new-core-i7-chip-hits-5ghz.html

    https://www.techradar.com/news/computing-components/processors/amd-on-its-5ghz-processor-you-don-t-buy-a-ferrari-for-the-mpg-1158675

    Also, the P4s topped out at 3.8 GHz out of the box in 2005. The AMD processors at the time were 64-bit, clocked at 2 GHz, and were a lot faster because of the faster memory bus (dual-channel DDR2) and more cache.
    At the time Intel was working on an entirely incompatible 64-bit CPU called Itanium, and AMD released the Athlon 64, which could run all old 32-bit versions of Windows, run an extended Intel x86 instruction set called AMD64, and run Windows XP 64-bit. This caused a huge shakeup at Intel, because the Pentiums were too slow to compete with the Athlon chips and Intel could not run 64-bit desktop Windows (Itanium required a special version of Windows Server or Linux).
    Anyway, after Intel adopted the AMD instruction set and produced the Pentium Ds, which could run 64-bit Windows XP, they ran at clock speeds comparable to the AMD chips (around 2 GHz).

    We have been bouncing around between 3 and 4 GHz for basically 16 years. They just keep adding more cores (which most games won't use) or sometimes a little more cache. The memory bus has increased significantly.
    However, the progress after 2003 is embarrassingly slow compared to that of 1990 to 2003. It's like we get excited over the most insignificantly small gains, rush out and spend a thousand dollars, and then come home disappointed.

    You said we hit 5 GHz nearly 20 years ago.  IceDark said, no we didn't.  You gave links from 2007, 2013, and 2014.  None of those are nearly 20 years ago.  And the link from 2007 was liquid nitrogen overclocking, anyway.

    The first CPUs that could realistically hit 5 GHz apart from exotic cooling were Sandy Bridge in 2011, and even that took an enormous overclock and some lucky silicon.  Even with liquid nitrogen, the first that could hit 5 GHz was surely some sort of Pentium 4, likely either Northwood (2002) or Prescott (2004).  That's not 20 years ago, either.
  • Grunty Member Epic Posts: 8,657
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    And here you are complaining about hertz. 
    "I used to think the worst thing in life was to be all alone.  It's not.  The worst thing in life is to end up with people who make you feel all alone."  Robin Williams
  • Quizzical Member Legendary Posts: 25,499
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
  • Octagon7711 Member Legendary Posts: 9,004
    In general more expensive systems go longer without needing upgrades or replacements.  Moderately priced systems generally need to be upgraded or replaced sooner. 

    "We all do the best we can based on life experience, point of view, and our ability to believe in ourselves." - Naropa      "We don't see things as they are, we see them as we are."  SR Covey

  • Dvora Member Uncommon Posts: 499
    edited May 2018
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    Ya, sorry dude, hertz is a pretty piss-poor measure of performance nowadays, unless you're comparing CPUs of the same generation with the same instruction set, architecture, etc.

    Your example of a Pentium 4 at 3.8 GHz... it would hardly run modern games at the lowest resolution and lowest settings (if it ran at all, which is doubtful), compared to my old first-gen i7 that only clocks 3.6 GHz and still runs pretty well.  Then there are newer-gen i7s that are about the same clock speed as my i7, but are 30% or more faster in single-core performance.

    It's true that stuff isn't increasing as fast as it used to, but eh, your examples are so far off the charts they aren't even applicable.

    I'd also challenge you to try to run Black Desert or Bless on that 5-year-old PC without turning everything down to low.  Even in WoW I bet you get no more than 20 fps on high settings.
  • l2avism2 Member Uncommon Posts: 38
    Quizzical said:
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
    No, what I was saying is that the metrics that actually measure real performance are usually the ones that get ignored.
    Measuring instructions per second is far more useful. A computer program is a series of instructions.
    Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs (in gaming, the CPU has to copy the textures and vertex buffers to the graphics card, which is nearly 70% of the work performed by the CPU in a game; the graphics card is responsible for most of the work performed in any game, as the CPU really just tracks player coordinates and handles game logic).
  • Vrika Member Legendary Posts: 7,989
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    Yes, but the problem is that a processor's clock speed is not the only factor that affects its real-world performance.

    Using clock speed to compare different processors would be like using top speed to compare different cars, and then concluding that a Ferrari is better for transporting stuff than a delivery truck: you've got a valid unit, but you're ignoring other factors that are also important, and arriving at wrong conclusions.

    Hertz was a very good unit for comparing the speeds of different processors back in the 1990s, but it isn't any more. Nowadays other factors that affect CPU speed are too significant, and using hertz alone will not give you good info on a CPU's real-world performance.
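
    To put the analogy in numbers, here is a minimal sketch of the usual first-order model: time for a fixed workload ≈ instructions / (IPC × clock). The IPC and clock figures below are made-up illustration values, not measurements of real CPUs.

    def seconds_for_workload(instructions, ipc, clock_hz):
        # time = instructions / (instructions-per-cycle * cycles-per-second)
        return instructions / (ipc * clock_hz)

    workload = 3e10  # 30 billion instructions, arbitrary

    old_chip = seconds_for_workload(workload, ipc=1.0, clock_hz=3.8e9)  # high clock, low IPC
    new_chip = seconds_for_workload(workload, ipc=4.0, clock_hz=3.6e9)  # lower clock, much higher IPC

    print(f"old: {old_chip:.1f} s, new: {new_chip:.1f} s, speedup: {old_chip / new_chip:.1f}x")

    Despite the lower clock, the higher-IPC chip finishes the same work several times sooner.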
     
  • FlyByKnight Member Epic Posts: 3,967
    Meh. People who have PC gaming hardware in and out of their lives are doing it for personal hobby reasons. I doubt it has to do with needing to have top performance. 
    I think we all can admit endorphins release when we smell new computer parts and see the styrofoam peanuts.

    The truth is game developers aren't even utilizing the hardware to its full potential, so unless you're in some industry that requires it there's no real, REAL reason to be upgrading every 2-3 years. The cycle can be way more reasonable, but you know;

    Mmmmmmmm, new PC parts smell...


    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • IceDark Member Uncommon Posts: 207
    edited May 2018
    l2avism2 said:
    Quizzical said:
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
    No, what I was saying is that the metrics that actually measure real performance are usually the ones that get ignored.
    Measuring instructions per second is far more useful. A computer program is a series of instructions.
    Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs (in gaming, the CPU has to copy the textures and vertex buffers to the graphics card, which is nearly 70% of the work performed by the CPU in a game; the graphics card is responsible for most of the work performed in any game, as the CPU really just tracks player coordinates and handles game logic).
    You are still wrong, really. Your idea that GHz is basically all that matters is wrong. Take the best CPU from 2000 and the best CPU from 2018 with the same GHz and see the difference.

    But I did a test. I compared a P4 at 2.66 GHz with an i7 920 at 2.66 GHz. The P4's release date is 2002, while the i7 920's is 2008. Six years' difference, yes?

    So in 6 years of technology we have:

    Single-core difference - P4 2.66 GHz vs. i7 920 2.66 GHz = around 170% better (for the i7)
    Multi-core difference - around 1000%

    Now, what did technology accomplish in 6 years? Around 200%+ better single-core speed, and A LOT more multi-core speed. Plus many other good things. (And I didn't take the best CPU of '08, while the P4 at 2.4 to 2.8 GHz was high-end.)

    I say that technology from 2000 to 2010 has evolved more than it did from 1990 to 2000.

    About your benchmarks remark, you do know that there are plenty of benchmarks that stress-test single/multi-core CPU performance for uses other than games, right?
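
    For anyone converting those percentages: "X% better" means a score X% higher, i.e. a (1 + X/100)× speedup. A minimal sketch with made-up scores shaped like the comparison above (not real benchmark results):

    def percent_better(new_score, old_score):
        return (new_score / old_score - 1) * 100

    # hypothetical benchmark scores, for illustration only
    p4_single, i7_single = 100, 270
    p4_multi, i7_multi = 100, 1100

    print(f"single-core: {percent_better(i7_single, p4_single):.0f}% better, "
          f"{i7_single / p4_single:.1f}x faster")
    print(f"multi-core: {percent_better(i7_multi, p4_multi):.0f}% better, "
          f"{i7_multi / p4_multi:.1f}x faster")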
    The Ice is dark and full of terror.
  • Dvora Member Uncommon Posts: 499
    l2avism2 said:
    Quizzical said:
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
    No, what I was saying is that the metrics that actually measure real performance are usually the ones that get ignored.
    Measuring instructions per second is far more useful. A computer program is a series of instructions.
    Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs (in gaming, the CPU has to copy the textures and vertex buffers to the graphics card, which is nearly 70% of the work performed by the CPU in a game; the graphics card is responsible for most of the work performed in any game, as the CPU really just tracks player coordinates and handles game logic).
    MMOs, newer ones at least, are typically more CPU-bound than GPU-bound.  If you are trying to say that the 3.8 GHz P4 would run a modern MMO, or even a shooter, as well as an i7 at the same 3.8 GHz and with the same video card, you are completely wrong, especially with MMOs.
  • Onji12 Member Uncommon Posts: 4
    edited May 2018
    The CPU manufacturers had to move to a multi-core model. Enhancing the speed of a single core was quickly approaching what they term "the power wall". Power consumption, not to mention heat levels on the units, was quickly rising beyond feasible limits.

    Here is a link for reference on GPUs but CPUs suffer the same issue.
    https://cs.nyu.edu/courses/spring12/CSCI-GA.3033-012/lecture12.pdf 
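
    As a rough illustration of the power wall (not taken from the linked slides), here is a minimal sketch of the first-order dynamic-power relation P ≈ C·V²·f. Higher clocks generally also require higher voltage, so power climbs much faster than frequency; the capacitance and voltage values below are made up.

    def dynamic_power_watts(c_farads, volts, freq_hz):
        # first-order CMOS dynamic power: P = C * V^2 * f
        return c_farads * volts ** 2 * freq_hz

    C = 1e-9  # effective switched capacitance, made-up value

    base = dynamic_power_watts(C, volts=1.0, freq_hz=3.0e9)       # one core at 3 GHz
    overclock = dynamic_power_watts(C, volts=1.3, freq_hz=4.5e9)  # 1.5x clock, higher voltage
    two_cores = 2 * dynamic_power_watts(C, volts=1.0, freq_hz=3.0e9)

    print(f"3 GHz core:   {base:.1f} W")
    print(f"4.5 GHz core: {overclock:.1f} W ({overclock / base:.1f}x power for 1.5x the clock)")
    print(f"2 x 3 GHz:    {two_cores:.1f} W ({two_cores / base:.1f}x power for up to 2x throughput)")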
  • Kyleran Member Legendary Posts: 44,057
    I held off upgrading this year because so far I'm not playing any games that are overtaxing my 4-year-old gaming laptop.

    I don't play any of the newest titles, however; most are at least 2 to 4 years post-launch, which is likely why.

    There isn't anything in the MMORPG space I've been unable to play, even the newer titles, because they code to a lower hardware standard these days.


    "True friends stab you in the front." | Oscar Wilde 

    "I need to finish" - Christian Wolff: The Accountant

    Just trying to live long enough to play a new, released MMORPG, playing New Worlds atm

    Fools find no pleasure in understanding but delight in airing their own opinions. Pvbs 18:2, NIV

    Don't just play games, inhabit virtual worlds™

    "This is the most intelligent, well qualified and articulate response to a post I have ever seen on these forums. It's a shame most people here won't have the attention span to read past the second line." - Anon






  • IceDark Member Uncommon Posts: 207
    edited May 2018
    Kyleran said:
    I held off upgrading this year because so far I'm not playing any games that are overtaxing my 4-year-old gaming laptop.

    I don't play any of the newest titles, however; most are at least 2 to 4 years post-launch, which is likely why.

    There isn't anything in the MMORPG space I've been unable to play, even the newer titles, because they code to a lower hardware standard these days.

    Exactly. 5 years before upgrading is very good.

    Heck, I've had an i7 920 for the last 7-8 years. The only upgrades were a GPU and an SSD (makes sense), and I could play almost any game out there on mid/high settings, which is fine.

    Now I just upgraded to a 6800K, 16 GB of RAM, and an M.2 drive. I kept the GPU since it's a GTX 1060. As I know myself, it will be around 2-3 years before I upgrade my GPU and around 5-6 years before I upgrade the CPU/MB/RAM.
    The Ice is dark and full of terror.
  • H0urg1ass Member Epic Posts: 2,380
     I take a multi-layered approach to this situation.

    If it's an iteration of an existing technology, and that iteration far exceeds what I'm currently working with, then I'll typically buy it.  For instance, if I'm working with a GTX 460 video card and a GTX 560 comes out, then I probably won't upgrade.  The difference between the two cards performance-wise was well under 20%, IIRC.

    The performance difference just isn't worth the money.  But when the GTX 670 came out, I ponied up the dough because that card was a significant upgrade over the GTX 460 and the technology is iterative and not completely new, therefore the chance of it being a complete failure is low.

    Now let's take a look at VR technology.  It's brand new, it's not established as a "must have" product yet, it's clunky, very few games use it and, good fucking grief, the decent VR systems are expensive as shit.  So no, I'm not dipping into that technology pool yet.  Give it a few iterations, see if it even catches on at all, and somewhere around the Gen 3 mark, I'll buy in.

    Lastly, if one technology relies too heavily on another technology catching up to it to be useful, then I'll generally wait.  For instance, the price of GPUs that can push 4K monitors at high frame rates with great settings is pretty high right now.  Not only that, but I do a lot of gaming on triple monitors.  Swapping out all of my monitors to 4K and finding a GPU that will push them is just off the table.  When they both come down significantly in price, then I'll buy in, but for now 1080p triple-monitor gaming with a single mid-range card is completely satisfactory.

    Essentially, if it gives a 25% or better increase in speed/performance, then I'm up for swapping.
  • Ozmodan Member Epic Posts: 9,726
    l2avism2 said:
    Quizzical said:
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
    No, what I was saying is that the metrics that actually measure real performance are usually the ones that get ignored.
    Measuring instructions per second is far more useful. A computer program is a series of instructions.
    Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs (in gaming, the CPU has to copy the textures and vertex buffers to the graphics card, which is nearly 70% of the work performed by the CPU in a game; the graphics card is responsible for most of the work performed in any game, as the CPU really just tracks player coordinates and handles game logic).
    Yes, you can run many of today's games on PCs older than 3 years, but you end up stepping down the resolution and the number of objects displayed.  You are also very limited on multiprocessing.  Try running a streaming app while you are playing a game on one of these older PCs and you will have issues.  I usually have 3 or 4 other applications running when playing a game, sometimes more.  Good luck with an old PC.

    So in the end I disagree with your point of view.
  • l2avism2 Member Uncommon Posts: 38
    Ozmodan said:
    l2avism2 said:
    Quizzical said:
    l2avism2 said:
    Vrika said:
    Some people get smarter as they grow older.

    Others fail to notice changes and still measure CPU speeds by a standard that was very popular in the 1990s. Over time they develop strange superstitions as a result.

    You are aware that the unit of measurement for a processor's clock speed is hertz, right?
    Maybe you should have said something like computation rate, which is expressed in OPS or FLOPS (operations per second or floating-point operations per second), or maybe instructions per second, in which case that information is rarely available to you at the time of purchase. Those would be actual performance benchmarks.
    I don't know about you, but when I talk about a processor being "fast", I don't mean the clock speed.  I mean "does the computations that it's asked to do in a short period of time".  Otherwise, you'd have to think of a Pentium 4 as being faster than an Athlon 64, or a Bulldozer as being faster than a Sandy Bridge.

    But if you really want to be pedantic, you are aware that a single chip commonly has different regions that run at different clock speeds, aren't you?  If all that matters is the clock speed, then which one?  Whichever number is largest?
    No, what I was saying is that the metrics that actually measure real performance are usually the ones that get ignored.
    Measuring instructions per second is far more useful. A computer program is a series of instructions.
    Comparing PCMark benchmarks would be more useful for games because they take into account the moving around of large memory buffers that happens in games but not in other programs (in gaming, the CPU has to copy the textures and vertex buffers to the graphics card, which is nearly 70% of the work performed by the CPU in a game; the graphics card is responsible for most of the work performed in any game, as the CPU really just tracks player coordinates and handles game logic).
    Yes, you can run many of today's games on PCs older than 3 years, but you end up stepping down the resolution and the number of objects displayed.  You are also very limited on multiprocessing.  Try running a streaming app while you are playing a game on one of these older PCs and you will have issues.  I usually have 3 or 4 other applications running when playing a game, sometimes more.  Good luck with an old PC.

    So in the end I disagree with your point of view.
    3 years old isn't really old enough to have to turn the resolution down. If you are multitasking, that's really a RAM issue.
    In fact, the processors are of comparable speeds; they just have more cores now.
    Really, the performance difference between hardware generations is exaggerated.
    It's really more of a luxury to have the newest than a necessity.
  • Mikeha Member Epic Posts: 9,196
    The last build I did was 5 years ago: AMD FX 8350 / Radeon 7970 GHz.


    The only thing I've done since was upgrade from 8 gigs of RAM to 16 gigs, and my 7970 died last year so I got a GTX 1060. I only play MMOs on PC and I have no problems playing those with what I have. :)
  • l2avism2 Member Uncommon Posts: 38
    H0urg1ass said:
     I take a multi-layered approach to this situation.

    If it's an iteration of an existing technology, and that iteration far exceeds what I'm currently working with, then I'll typically buy it.  For instance, if I'm working with a GTX 460 video card and a GTX 560 comes out, then I probably won't upgrade.  The difference between the two cards performance-wise was well under 20%, IIRC.

    The performance difference just isn't worth the money.  But when the GTX 670 came out, I ponied up the dough because that card was a significant upgrade over the GTX 460 and the technology is iterative and not completely new, therefore the chance of it being a complete failure is low.

    Now let's take a look at VR technology.  It's brand new, it's not established as a "must have" product yet, it's clunky, very few games use it and, good fucking grief, the decent VR systems are expensive as shit.  So no, I'm not dipping into that technology pool yet.  Give it a few iterations, see if it even catches on at all, and somewhere around the Gen 3 mark, I'll buy in.

    Lastly, if one technology relies too heavily on another technology catching up to it to be useful, then I'll generally wait.  For instance, the price of GPUs that can push 4K monitors at high frame rates with great settings is pretty high right now.  Not only that, but I do a lot of gaming on triple monitors.  Swapping out all of my monitors to 4K and finding a GPU that will push them is just off the table.  When they both come down significantly in price, then I'll buy in, but for now 1080p triple-monitor gaming with a single mid-range card is completely satisfactory.

    Essentially, if it gives a 25% or better increase in speed/performance, then I'm up for swapping.
    I kind of agree here.
    For example, I'm in the market for a Vulkan test rig, so I'll probably buy a little newer than I'm used to this year.
    I'm more of an AMD fanboi than an Nvidia fanboi. The last Nvidia card I owned was that classic FX 5200 (the first video card to require 2 slots and a separate power supply), which replaced my GeForce3 Ti 500, which I held on to because it was faster than the GeForce4s.
    I switched to the AMD64 processors after holding out on the Cyrix CPUs when they stopped making them at 1.3 GHz. That first-gen Athlon 64 + Nvidia FX 5200 + DDR2 Windows XP 64-bit machine was about the only time I actually just bought the best of the best (I even had a WD Raptor HDD with 10k RPM disks).