I know, I know!! Upgrading is good.. right? Why do you upgrade? Want that edge in competitive gameplay?
At this stage of technology I see no point in upgrading a computer, when you can buy a fully synced, graphically divine system for less than a PS4. Yes, current systems are like that.
The only thing worth upgrading in the current technological state is perhaps your ISP connection. Other than that, this talk of video cards and which is better, etc., is an early-2000s storyline, and the video card companies want that talk so they can keep suckering people into buying from an outdated category and thus stay in business.
If you run a pre-2012 system, then perhaps you should replace the whole thing, as it is cheaper. Regardless of what upgrades you make, you will still be behind and waste a lot of money trying to stay current and synced.
Video cards are dinosaurs nowadays!
Just my two cents worth!
Comments
I'll admit that we are hitting diminishing returns on computing power versus software. We are also hitting diminishing returns on hardware: we no longer see 50-70% speed increases between generations; now it's closer to 15%.
The gains aren't like the jump from Wolf3D to Doom with Glide anymore; now we're seeing a small effect like hair animation take upwards of 20% additional compute resources.
But I won't deny it all adds up to a better experience, and there is still room to improve.
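To put those percentages in perspective, here's a minimal sketch (Python; the generation counts are arbitrary and purely for illustration) of how the old ~60% jumps compound compared to today's ~15% bumps:

```python
# Hypothetical illustration: cumulative speedup after N hardware generations,
# comparing the old ~60% per-generation jumps to today's ~15% bumps.
def cumulative_speedup(per_gen_gain: float, generations: int) -> float:
    return (1.0 + per_gen_gain) ** generations

for gens in (1, 3, 5):
    old = cumulative_speedup(0.60, gens)  # early-2000s-style jumps
    new = cumulative_speedup(0.15, gens)  # current ~15% bumps
    print(f"{gens} generations: ~{old:.1f}x then vs ~{new:.1f}x now")
```

After five generations that's roughly a 10x cumulative gain back then versus about 2x now, which is a big part of why the diminishing returns feel so stark.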
If you're happy with whatever rig you have, great. There is no reason to upgrade until that rig can't play or perform something you want it to, which could be tomorrow or could be years from now. But it will happen someday, even with diminishing returns playing a factor. And then there are some people who are only happy if they can turn on every single feature and option in a game, and sometimes there is no hardware available with enough performance to do so.
I gotta agree. The diminishing returns in hardware are quite visible at this point; you could see them 3-4 years ago already.
However, I usually upgrade my computer only if it's really needed. I recently upgraded my graphics card because a GTX 660 wasn't enough to play recent games in good quality anymore, but I'm still running an i5-2500K because I am not willing to pay 200 bucks or more for a minor upgrade of maybe 10-15% performance at most. The only other reason I would upgrade it would be power consumption, and I cannot justify spending that much money to save 20-30 bucks a year.
In the past, many gaming hobbyists bought cheaper gaming computers, knowing there was no reason to waste money on expensive parts that would be obsolete in two years anyway. Now, with the pace of tech development slowing down, it makes much more sense to put your money into a more expensive build.
I don't upgrade the whole computer; I upgrade parts, and only when I see a significant improvement and/or benefit for me. I'm not your average user: I'm heavily into flight simulation and home cockpit building, and my setup for that requires a network of 3 computers. I also have my gaming machine; my latest upgrade was moving up to 1440p, which required a new monitor and a new GPU. I can say that the difference between 1080p and 1440p is black and white, and that I'm very happy with the upgrade.
It's a hobby for me, and there are a hell of a lot of worse things I could spend my money on.
"Be water my friend" - Bruce Lee
I would say that upgrading CPU/Motherboards isn't really a big deal anymore. They've been pretty much fine since the Core 2 days. Video cards are definitely still in the upgrade every year or two category. I thought my GTX 770 that I bought a year ago would last a good while, GTA 5 has proven me wrong.
The current problem with video cards is texture resolution, and every couple years the memory requirements for new games get higher and higher. When that slows down, then the need for card upgrades will slow down too.
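For a rough sense of why texture resolution drives memory requirements up so fast, here's a back-of-the-envelope sketch (Python; the texture sizes are illustrative and ignore the block compression real games use):

```python
# Rough VRAM cost of one uncompressed RGBA8 texture (4 bytes per texel),
# plus ~33% extra for a full mipmap chain. Real games use block compression
# (DXT/BC), but the scaling with resolution is the point here.
def texture_mib(width: int, height: int, bytes_per_texel: int = 4) -> float:
    base = width * height * bytes_per_texel
    with_mips = base * 4 / 3  # the mip chain adds roughly one third
    return with_mips / (1024 * 1024)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB")
# 1024x1024: ~5 MiB, 2048x2048: ~21 MiB, 4096x4096: ~85 MiB
```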
You won't have to upgrade for a long time if you play 1080P.
Because silicon tech has hit a noticeable roadblock trying to miniaturize.
And because if you don't play 4k, you don't need the upgrade anyway.
Don't listen to the OP, lol. You can take a PC made in the past 3 years, spend 100-300 bucks, and keep it gaming-worthy for another 1-3 years, versus consoles that only last 3-5 years before the next new shiny console is here. Nine times out of ten, upgrade your video card, CPU, or RAM (one or all three of these) and you're rocking. Most times you only need to upgrade one of the three. Then add in the fact that you can use gaming keyboards and a gaming mouse, and MMOs are just better on a PC. If you PvP, a PC makes it an even better option.
As for buying a new rig being cheaper than upgrading: if you hunt sales, this is far from true. I have gotten a video card worth 400 bucks for 120. You just need to take your time and watch the sales. In the end, upgrading is almost always the cheaper option.
We are a long way off from 4k gaming being any kind of standard.
When I started to treat my hobby more seriously, I upgraded my system almost every 8 months to a year. Yes, I know, very foolish back then; I just wanted to have the best of the best.
A few years later, it became once every 2 years.
I upgraded to a completely new gaming system at the end of 2014, almost 4 years after my previous upgrade. That felt very comfortable, since I now actually notice how big a difference it makes. With the way I upgraded in the beginning, I hardly felt or saw any difference.
Last year, on my old system, I could still play most games on the highest settings, but I was missing out on 64-bit exclusive games. And playing at the highest settings on a 4-year-old rig compared to a brand-new gaming system was, to me, like apples and oranges.
The reason I upgrade is simple: I am a gamer and want to play most current games, hopefully at the highest and most detailed settings possible. At the same time, I feel it's better, at least for me, to go with the 4-year cycle so I actually notice the difference.
"Possibly we humans can exist without actually having to fight. But many of us have chosen to fight. For what reason? To protect something? Protect what? Ourselves? The future? If we kill people to protect ourselves and this future, then what sort of future is it, and what will we have become? There is no future for those who have died. And what of those who did the killing? Is happiness to be found in a future that is grasped with blood stained hands? Is that the truth?"
I beg to differ; the 980 Ti was a night-and-day difference compared to my SLI'd 760s. Night and day.
The issue is you're not accounting for the margins; you're looking at it in a vacuum. If you have a game you can't play very well, say it gets only 15-20 fps, and a new video card with maybe only a 40% speed increase lets you play said game at 25-30 fps, you can now play a game you really couldn't before (quick arithmetic sketch below).
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
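A quick sketch of that arithmetic (a minimal Python illustration; the 15-20 fps baseline and 40% uplift are just the numbers from the post above, and real scaling also depends on CPU limits):

```python
# Minimal illustration: if frame rate scales roughly with GPU throughput,
# a ~40% faster card moves a 15-20 fps game up noticeably.
for base_fps in (15, 20):
    new_fps = base_fps * 1.40
    print(f"{base_fps} fps -> ~{new_fps:.0f} fps with a 40% faster card")
# 15 -> ~21 fps, 20 -> ~28 fps: in the ballpark of the 25-30 fps mentioned above
```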
The GeForce GTX 980 Ti and the GeForce GTX Titan X share one thing in common: it is going to be years before the video game industry releases a game that will truly put these two video cards to the test.
"Possibly we humans can exist without actually having to fight. But many of us have chosen to fight. For what reason? To protect something? Protect what? Ourselves? The future? If we kill people to protect ourselves and this future, then what sort of future is it, and what will we have become? There is no future for those who have died. And what of those who did the killing? Is happiness to be found in a future that is grasped with blood stained hands? Is that the truth?"
The only time I upgrade is if a component dies. For example: if the GPU dies, I replace it with a better one.
Old systems get used for other things, like file servers, continuous-test boxes, random OSes, etc. Although virtualization has largely replaced the test boxes.
Epic Music: https://www.youtube.com/watch?v=vAigCvelkhQ&list=PLo9FRw1AkDuQLEz7Gvvaz3ideB2NpFtT1
https://archive.org/details/softwarelibrary_msdos?&sort=-downloads&page=1
Kyleran: "Now there's the real trick, learning to accept and enjoy a game for what it offers rather than pass on what might be a great playing experience because it lacks a few features you prefer."
John Henry Newman: "A man would do nothing if he waited until he could do it so well that no one could find fault."
FreddyNoNose: "A good game needs no defense; a bad game has no defense." "Easily digested content is just as easily forgotten."
LacedOpium: "So the question that begs to be asked is, if you are not interested in the game mechanics that define the MMORPG genre, then why are you playing an MMORPG?"
1440p would be pushing your card pretty hard, and you could forget about 4K at a playable FPS. The only reason to upgrade to these top-end GPUs today is to make the switch to the higher resolutions for everyday gaming. I do have a simulation that will bring your 290X, or my Titan X for that matter, to its knees at 1080p.
It takes a lot of horsepower to 3D-render the real world: http://www.prepar3d.com/
"Be water my friend" - Bruce Lee
Somebody, somewhere has better skills than you have, more experience than you have, is smarter than you, has more friends than you do, and can stay online longer. Just pray he's not out to get you.
I can't see the difference between 30 fps and 60 fps, but I sure can feel it. The responsiveness is black and white.
"Be water my friend" - Bruce Lee
For PC gaming it's not really about "standard". You can play pretty much any modern game in 4k.
I first built my current system back in 2009 and am still using it to this day. It is an Intel Core i7-920 processor running at 2.66GHz with 64GB of DDR3 RAM (max ram is 256GB) powered with an ATX 1.1 power supply. I originally had two 500GB Seagate hard drives and two AMD Radeon HD 4890 video cards.
My system today has the same mainboard, processor, RAM, and power supply. The only things I have kept upgrading are my video cards and hard drives. Currently I have four Samsung 850 EVO 500GB solid state drives, and I only recently upgraded my old Nvidia GeForce GTX 660 Ti to an Nvidia GeForce GTX 980; the difference is like night and day.
The one thing I did when I first built my system was go with a liquid cooling system instead of a fan and heat sink, and I will never go back.
/facepalm
You do realize both quantum and optical computing have very specialized uses. You need to stop watching so much Star Trek.
Quantum computers are great at what they call "needle in the haystack" problems, but they are atrociously bad at just about everything else.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
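For what it's worth, the "needle in the haystack" strength usually refers to unstructured search, where Grover's algorithm gives only a quadratic speedup, not the exponential one people imagine from sci-fi:

```latex
% Query complexity of unstructured search over N items:
% classical brute force vs. Grover's quantum algorithm.
\text{classical: } \Theta(N) \text{ queries} \qquad \text{Grover: } O(\sqrt{N}) \text{ queries}
```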
You can also pretty much walk from Boston to Miami, but you'll probably wish you took a plane about 30 minutes into the hike.
Sure, you can "play" in 4K now, in the sense that the game will run and most video cards will support the video resolution. But if you attempt to do it without some high end components, you'll probably just wish you had stuck with 1080 or whatever you had before.
We aren't quite to the point where there's enough general compute power to make 4K a foregone default resolution (like 1080 is today). But it's getting there; we are just now getting to the point where single-GPU cards can push that many pixels.
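For reference, the raw pixel arithmetic behind that point (Python; this is pure resolution math, and the actual frame-rate cost also depends on the game and the rest of the system):

```python
# Pixels per frame at common resolutions, and pixels per second at 60 fps.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} Mpx per frame "
          f"({pixels / base:.1f}x 1080p), {pixels * 60 / 1e6:.0f} Mpx/s at 60 fps")
# 4K pushes 4x the pixels of 1080p, which lines up with the ~3x FPS cost
# people report once other bottlenecks are factored in.
```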
Yes and no. And if it's a standard or not is kind of irrelevant for the point I'm making.
The point is that if you stay at 1080p, you get roughly 3 times the FPS you would at 4K (we had a long thread about how much FPS 4K actually costs; it came out to around 3x less).
Also, as even gamers know by now, the speed increases from generation to generation are not 2x or even 3x anymore.
It's 10-15% increases in speed nowadays from generation to generation.
Moore's law has come to an end.
Yes, there will still be speed increases, but they will come more slowly, and costs will go up even more, which will slow things down further; there are few competitors left because of the costs involved.
There are alternatives like graphene and others, but they are years away.
The area where we still see large speed increases is SSD and Internet connections. But in terms of CPU and GPU, increases in speed are extremely small.
Another way to increase speed is through software, which is why we now see things like Mantle, DX12, and the new Mac OS.
Well, if games keep getting developed for the lowest common denominator, then people will of course stick with their hardware longer as well. This is even more true for this generation than the last one. Unless you up the resolution to 1440p or even higher, there's really no need for a powerful PC until developers build the PC version first and then downscale the game for the rest of the platforms.
This in turn hurts hardware creators, which is of course bad.