I wonder: would an MMO that natively ran as a 64-bit client, with minimum specs of 16 GB of RAM and a GTX 580, be successful?
Think about what would be possible with these minimum specs. Seamless worlds, super-high res textures, realistic worlds...
Ultra settings could require an SLI setup.
Would you be able/think about upgrading your PC to play?
Comments
I'm not going to upgrade my hardware just for one video game. So no for me...
Neither would my friends, so there's even less reason for me to consider it or care about the game.
1. For god's sake mmo gamers, enough with the analogies. They're unnecessary and your comparisons are terrible, dissimilar, and illogical.
2. To posters feeling the need to state how f2p really isn't f2p: Players understand the concept. You aren't privy to some secret the rest are missing. You're embarrassing yourself.
3. Yes, Cpt. Obvious, we're not industry experts. Now run along and let the big people use the forums for their purpose.
Yeah, I wish some games would take proper advantage of my kit... and all games these days should be native x64.
Dragon's Prophet is designed as a native x64 game. In fact, I think the beta is 64-bit only... unless they have gotten their 32-bit client running, that is.
It depends on the game. While decent graphics are a plus, there are more important things a game needs to have. If those other features are extremely good and really make me want to play the game, it would be very likely for me to go as far as upgrading my PC for it (within certain limits: upgrading my PC is one thing, spending more money than a standard upgrade costs is another).
As usual, it depends on the game. Graphics alone do not make an MMO great.
- Al
Personally the only modern MMORPG trend that annoys me is the idea that MMOs need to be designed in a way to attract people who don't actually like MMOs. Which to me makes about as much sense as someone trying to figure out a way to get vegetarians to eat at their steakhouse. - FARGIN_WAR
I think the market is predominantly set to cater to those with medium settings. Which makes sense, since that probably enables minimum-spec PCs to play as well - which means a ton of people can play in total, which equals good earnings.
But I would honestly like to have an MMO that really was pushing the boundaries. Something for the high-end specs to drool over. The game would need to be play-worthy, of course; it wouldn't do it for me if you got some Vanguard crap that ran like a donkey for months and months.
I imagine the Oculus Rift will be purchasable in its finished form within the next 2 years - and once that happens, anyone who spends the 300-500 dollars or so that it will cost isn't going to want to play a 3D MMO that looks, well, bland and boring. All the more reason for developers to either make or enable high-end graphics in their next games.
My hope is that in the next 5 years, an AA developer takes the Oculus and runs with it. Makes a great MMO that has the option for Oculus and really good graphics.
One can dream
It's about the gameplay.
If the design is good - a browser based text game can be a winner.
Nothing says irony like spelling ideot wrong.
My experience with MMOs is that the game experience is as good as the quality of the game engine.
I've run beautiful MMOs with top-notch graphics only to see them freeze up or the framerate plummet because the server couldn't handle it.
Buy a top-of-the-line machine? No problem - as long as the server can deliver the quality I want with many characters in one place at any time, without problems.
"going into arguments with idiots is a lost cause, it requires you to stoop down to their level and you can't win"
I found out Star Citizen will be such a game!
-It is running on the x64 version of CryEngine 3
-The minimum spec for the graphics card will be a GTX 460 (SLI recommended)
And a poll in the forums showed that more than 90% of the people already run a 64-bit OS
If the size of the intended audience was realistically assessed, and both design budget and future revenue plans were viable within that window of players, and the game was enjoyable to the target crowd, then yes.
There isn't a "right" or "wrong" way to play, if you want to use a screwdriver to put nails into wood, have at it, simply don't complain when the guy next to you with the hammer is doing it much better and easier. - Allein
"Graphics are often supplied by Engines that (some) MMORPG's are built in" - Spuffyre
In some alternate dimension there's an MMORPG company saying, "In the last decade, MMORPGs have been plagued by too many subscribers. You can't launch a beta without 10 million players showing up. What methods to reduce subscriber counts do we have left?"
"We could try raising the min spec, sir..."
"Fantastic idea, Dodson! HR, promote that man!"
"What is truly revealing is his implication that believing something to be true is the same as it being true. [continue]" -John Oliver
The problem is, what do you get for those high minimum specs? Having high minimum hardware specs for the sake of having them just restricts your player base for no good reason.
There's no good reason for a game to require huge amounts of video memory in the minimum specifications. If a game is going to use a lot of video memory, then that's mostly eaten up by high resolution textures. But converting high resolution textures to lower resolution textures to save on video memory is a pretty trivial thing to do. In fact, video cards commonly already use mipmaps to dynamically use a lower resolution version of a texture for faraway objects. Mipmapping all textures down a level before passing them along to the video card is fairly trivial to do, and reduces the video memory requirements of textures by 75%. For technical reasons, you likely don't quite want to do that for all textures (some textures don't meaningfully correspond to pictures), but your video memory savings for only mipmapping down textures where it makes sense probably won't be far shy of 75%. And if 75% isn't enough, then you can mipmap textures down two levels and reduce video memory requirements by 94%.
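The 75% and 94% figures above fall straight out of the arithmetic: each mip level is half the width and half the height of the one above it, so a quarter of the texels. A minimal sketch, assuming an uncompressed RGBA8 format (4 bytes per texel); real games use compressed formats, but the ratios are identical:

```python
# Texture memory for one level of a mip chain, assuming 4 bytes per texel.
def texture_bytes(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel

full = texture_bytes(4096, 4096)      # full-resolution texture
one_down = texture_bytes(2048, 2048)  # dropped one mip level
two_down = texture_bytes(1024, 1024)  # dropped two mip levels

print(f"one level down saves {1 - one_down / full:.0%}")   # prints "one level down saves 75%"
print(f"two levels down saves {1 - two_down / full:.1%}")  # prints "two levels down saves 93.8%"
```

The 94% figure in the post is the same 93.75%, rounded.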
Requiring huge amounts of system memory doesn't make much sense, either. The only use I can think of for ridiculous amounts of system memory, other than allowing game programmers to be really inefficient, is caching art assets so that you can load things off of a hard drive from further away without having to immediately use up video memory. But that would be trivial to turn off, so there's no reason to put it in the minimum specs.
There's no good reason to require an extremely powerful GPU in the minimum specs, either. Even modern integrated graphics has plenty of shader power to process as many vertices as necessary to make everything look as smooth as you care to, at least if you use tessellation. The way to use up massive amounts of GPU power is with fancy GPU physics effects or fancy lighting effects in pixel/fragment shaders. But neither of those affect gameplay, so they should both be easy to turn off. Even if you offer such effects, you make them optional so that you can keep the minimum requirements low.
The only reasons for a game to require high single-threaded CPU performance are that the programmers are terrible, the game was started before it was realized that many CPU cores was the future rather than extremely fast single-core processors and changing the threading model later is too much of a pain, or the game makes extensive use of some algorithm that isn't practical to thread. I can't imagine any game mechanics that would force the third upon you. The nearest that I can think of would be needing to pass ridiculous amounts of data to the video card so that you turn the PCI Express bus into a bottleneck.
Now, I could imagine requiring an SSD to make it easy to give the game a seamless world that still has AAA quality graphics, or requiring several CPU cores in order to do a bunch of processing on the CPU. But "you need to add a $100 SSD to play this game" is far shy of "you need to add a $300 video card to play this game". Game designers that try to push the latter in their minimum settings would be simply incompetent, and that's likely to cause a lot of problems elsewhere in the game. (I'm allowed to say that without insulting anyone, aren't I, if it's something that no game designer has ever done for any recent release?)
^^This plus everything else he said.
You don't design a game for Oculus Rift in particular. Even if working well on Oculus Rift is a priority, you implement stereoscopic 3D in such a way that any implementation of it will work, not just one particular one. If you're using an industry standard graphics API that supports stereoscopic 3D, then the difference in work between "works well on Oculus Rift" and "works well on Oculus Rift and Nvidia 3D Vision and AMD 3DHD and iZ3D and everything else" is just a bit of testing and debugging to see if the others do what they theoretically ought to.
The server generally isn't even aware of what's going on with graphics. Higher or lower quality graphical settings are done purely client-side.
There's a big difference between "game launches today and requires a GeForce GTX 460 or better" and "game launches in four years and requires a GeForce GTX 460 or better, which means that the latest integrated graphics at launch day are also good enough".
If any game launches with exceptionally high system requirements, it tells me that either
a) The devs don't know how to optimize their code, or
b) The devs didn't have any good ideas on what fun to build in their game, so instead they just focused on graphics
Be successful? Maybe.
Become a blockbuster hit? Highly doubtful.
The main reason MMOs try to focus on aesthetics over graphics is that "ultra good and realistic graphics" tend to become obsolete very quickly. Two years down the line, when it is no longer visually impressive, it will lose much of the playerbase that played it just because it had great graphics at the time. This is why art style is emphasized more than graphics (aside from trying to make the game look pretty and still run on low-end machines to get a wider audience). The art style of WoW is still beautiful to many - graphically obsolete without a doubt - but the new areas that come with the expansions are exceedingly gorgeous, as they update the engine on a bi-yearly basis to go along with the theme.
The problem with making an MMO with high-end graphics will be exceedingly high costs along with exceptionally high risks. It will have to be a near-perfect game in every way to get people to want to buy new computers, which may cost five times what it already takes to release with as much content as many will expect. If it is imperfect in any way - let alone in a major way (take a look at FFXIV 1.0) - then it would just be a costly flop with a very small P2P community. Also, you have to recognize that MMOs are more demanding, and the higher resolution the art is, the less foliage and overall detail the maps can afford.
What are you planning to do with that extra power? What difference will it make?
Maybe the brain behind my eyes is getting too old and set in its ways, but I find that these days, I associate an increase in specs with a drop in game performance more than I associate it with the increase in graphics.
OP...
You are being presumptuous. It comes down to standards and the operating systems they encompass. There is a big reason developers have not released 64-bit clients: it wouldn't have been successful at the time.
Today..? Easy..
The crux is, these developers have to up their game, hire uber-pro coders, and commit themselves to their game. Even a $400 laptop runs Windows 7 (or 8) and has at least 4 GB, if not 8 GB, of memory... which current games do not even remotely touch.
The limitations on games today are not graphical; the problem was the lack of a unified architecture throughout the PC ecosystem. DX11 is in everyone's house now and will be utilized heavily in the next 5 years. Anyone who owns a PC built in the last 2 years essentially has a virtual Xbox via Microsoft Windows.
The PlayStation 4 & Xbox 720 will be DX11 feature-rich. The PlayStation comes standard with 8 GB of RAM (Xbox specs have not been released, but are rumored to be 8 or 16).
So the point isn't "are we willing to upgrade" - it's "when are these games coming?"
Even a $700 computer this Xmas with Haswell or AMD's new APU will be able to handle any large-scale MMORPG you are describing. It just takes building it on a 64-bit client, which allows developers to address more RAM within their games. What's brought to your screen is still the same... the "expanse" is sitting in your 8 GB of memory as you spin around.
DX11
64-bit OS (Win 7/8)
8 GB of RAM
What video card you have will have little effect on gameplay in the near future. It will only determine how much detail you want to dial in - not how vast or open the world is.
"No they are not charity. That is where the whales come in. (I play for free. Whales pays.) Devs get a business. That is how it works."
-Nariusseldon
Computers are already capable of displaying millions of colors... can you name them all?
The reality is, you don't need as much detail as you think to get the picture across... a lot of what a computer is capable of is overkill, because you only need a fraction of the information to process it.
So while they can make a game that projects things at a level of detail your brain will mostly ignore, there really is no point in doing so other than marketing.
They are red-green-blue values of 0-0-0, 0-0-1, 0-0-2, 0-0-3...
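For the record, the "millions of colors" both posters are gesturing at is just 256 levels per channel. A one-liner makes the count concrete:

```python
# 24-bit color: 8 bits (256 levels) each for red, green, and blue.
levels = 256
total = levels ** 3
print(total)  # prints 16777216 - about 16.8 million distinct RGB colors
```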
Back to my second sentence: I pretty much only play GW2. That means I paid the cost of my PC ($1200) plus $40 for the game, just to play GW2. Yes, I try out other games, but I usually stick to GW2. So you could say I paid $1240 for GW2. When new games come out, if I like them, that effective price will go down - but right now, I bought a PC to play one game: GW2.
It's easy to release a 64-bit client. The question is whether there's a point to it. If a game never uses more than 1 GB of system memory because that's all that it needs in order to do what it wants, then does the 2 GB cap of a 32-bit program really matter?
The API jump that I'm looking forward to is when game designers are willing to assume that everyone has geometry shaders available (DirectX 10 or OpenGL 3.2 or later). That could be sooner than you think, as it only needs GeForce 8000 or Radeon HD 2000 series cards or later. But alas, Intel didn't launch their first graphics with OpenGL 3.2 support until less than a year ago, and that's what's holding everything back.
Tessellation is great, but whatever game features you use it for, you can probably port back to not use tessellation. Geometry shaders let you do entire new classes of work on the GPU, and if you do a ton of stuff that heavily uses geometry shaders, porting it back to a version that doesn't use geometry shaders is likely to be an enormous pain and also carry such a huge performance hit that the game isn't playable on the older hardware, anyway.