Often I read about how you need this graphics card or that to play certain MMORPGs. Lately, I purchased a new computer. It's nothing major, and I'll give you the specs. I thought that because I did not have one of those expensive graphics cards I would not be able to play certain games. But so far every single MMORPG I play, including LOTRO, SWTOR, RIFT, EQ, EQ2, and Vanguard, has no issues.
Are the new computers just that good these days?
Here are the specs:
Processor: Intel(R) Pentium(R) CPU G645T @ 2.50GHz
4 GB RAM
System type: 64-bit operating system, x64-based processor
Intel(R) HD Graphics
Windows 8 Premium
That's it, nothing more, nothing less. I run the above titles with no issues. I also run Pirate101, Wizard101, EVE, and the one I thought I would have trouble with, Age of Conan.
All of them run perfectly, if not better than on my old system with its R8250 (HD) graphics setup.
Any clue as to why they run so well? I used to dread seeing Intel graphics on a computer, but on this machine it seems a graphics card is not even needed.
The computer itself is only around 8 weeks old.
Comments
Some people aren't happy if they don't get 60fps on max settings. I've heard people say "runs perfect for me," then I ask them to check their FPS and they say "it says 12fps." I mean, 12? Really? That's a true story, by the way; it was in Vanguard. 12fps is unplayable to me, but he thought it was perfect.
Laptop brand and model name? No 3D chip on board, seriously?
I doubt you can run SWTOR or EQ2 (which is hardly optimized >.>) "well" on those specs.
I can run Crysis 2, AC3, or Skyrim with lots of mods at max no problem, and I still have FPS problems in those games.
And yeah, for me 60fps maxed is perfect now, but I can play at 30fps+; I played older games at 25-30fps.
Ohh, the doubters!!
First, it's not a laptop. I play all the games on high. I looked up my Intel graphics; it's an Intel HD Graphics 2000.
Here are a couple of short videos of what it can do, for our doubters: http://www.youtube.com/watch?v=cEpfC6ycJJY and http://www.youtube.com/watch?v=wQ-VgLpFgIo
Also a Skyrim video using the Intel HD 2000: http://www.youtube.com/watch?v=aVLq7LvzxJQ
Which means I really do not need an expensive graphics card. Intel, with its latest version of graphics, took that need away :-D
Moore's Law says that about every two years, you get to double the number of transistors on a chip. That's exponential growth, so chips today can have about 32 times as many transistors as they did 10 years ago, and about 1000 times as many as they did 20 years ago.
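If it helps to see that arithmetic spelled out, here is a trivial sketch of the doubling (the two-year doubling period is the assumption from the paragraph above):

// Moore's law arithmetic from the paragraph above: one doubling of the
// transistor budget roughly every two years.
#include <cmath>
#include <cstdio>

int main() {
    const int spans_in_years[] = {10, 20};
    for (int years : spans_in_years) {
        double growth = std::pow(2.0, years / 2.0); // doublings = years / 2
        std::printf("After %2d years: ~%.0fx the transistors\n", years, growth);
    }
    return 0;
}

It prints roughly 32x for 10 years and 1024x for 20 years, which is where the numbers above come from.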
It's gotten to the point where everyone says, well, we don't really need this many transistors for the CPU, so what else can we put in the same die? So you get an onboard memory controller, PCI Express bus, power management, GPU, or whatever else they feel like tossing in. And they can dedicate quite a few transistors to the GPU, too.
Putting a GPU in the same chip as the CPU has some big advantages, and both AMD and Intel have started making integrated graphics a couple notches higher in the performance scale than they used to.
Meanwhile, the bare minimum graphical performance needed just to get a game to run hasn't increased nearly that fast. Whatever you could do ten years ago to make a game playable on the video cards of that day (mostly turning the settings down), you can still do today, so a game doesn't demand hardware much faster than those old cards just to run at all.
Having a ton of extra GPU power available means you can offer more effects and make games look much better. But it sounds like you're running games at low settings, seeing that they're playable, and happy with that. Some of the games that you list aren't very demanding; in some cases, this is because they're very old. If you pick up a fairly demanding game and try to max settings, you'll see the GPU choke in a hurry.
Intel graphics are still awful compared to the competition. AMD Radeon HD 7660D integrated graphics would probably get you about quadruple the performance of what you got. Intel drivers are also rather problematic. If a nasty bug in AMD or Nvidia video drivers is found, they'll probably release a beta fix within weeks and get it into their routine driver updates within months. If a nasty bug in Intel video drivers is found, it probably won't be fixed. Ever.
There's also the issue of API compliance, and Intel lags behind very badly here. You've probably only tried running games that use DirectX 9.0c or earlier, and DirectX 9.0c launched back in 2004. Newer APIs can do a ton of cool stuff that game designers have mostly been hesitant to use--and your graphics chip basically doesn't support it. The oldest Nvidia cards to support OpenGL 3.3 launched in late 2006. The oldest AMD cards to support it launched in early 2007. The oldest Intel graphics to support it launched in 2012.
Meanwhile, the graphics chip you got doesn't support it at all. If a game decides to require a more recent API in order to run, then even requiring an API old enough that every discrete video card from the last six years handles it fine could easily leave the game unable to run at all on your graphics. When will that happen? It's not clear, but it could become common pretty soon.
Most games do use DirectX and not OpenGL. But DirectX (or rather, the Direct3D part of DirectX) and OpenGL mostly do about the same things. DirectX also has some "feature level" nonsense, where a GPU chip can claim that it supports DirectX 11 feature level 9_3, and people think it supports DirectX 11, but that really only means the functionality present in DirectX 9.0c. If a game needs anything more recent, I don't know if it would just disable some features and make the game look horribly broken (e.g., a bunch of things simply don't draw at all), or if it would crash or otherwise refuse to run.
OpenGL doesn't allow those shenanigans; if you want to claim to support OpenGL 4.2, you have to support everything in it. But since DirectX and OpenGL mostly do the same things (I'd be absolutely shocked if video cards don't commonly pair a DirectX command with its OpenGL equivalent and treat them exactly the same way), if video drivers don't support recent OpenGL versions, I'd question how complete the support for recent DirectX versions is.
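For anyone curious, here is a minimal, Windows-only sketch of how an application can ask Direct3D 11 which feature level the installed GPU actually supports; getting back feature level 9_3 from a "DirectX 11" device is exactly the situation described above. Link against d3d11.lib; error handling is kept to a bare minimum.

// Ask Direct3D 11 for the highest feature level the hardware supports.
// A chip can create a DirectX 11 device yet only report feature level 9_3,
// i.e. roughly DirectX 9.0c functionality.
#include <d3d11.h>
#include <cstdio>

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
        D3D_FEATURE_LEVEL_9_1,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                        // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &got, nullptr);

    if (SUCCEEDED(hr)) {
        std::printf("Highest supported feature level: 0x%04x\n",
                    static_cast<unsigned>(got));
        device->Release();
    } else {
        std::printf("No hardware Direct3D 11 device available.\n");
    }
    return 0;
}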
Game requirements have stagnated. I have had my ATI 5770 for three years and it still plays every game. It wasn't even a high-end card to begin with; it's a mid-to-low-end card.
I think it's because of consoles. When the new consoles come out, they'll upgrade graphics again and we'll need new hardware; until then, everything works.
OP! The thing is, when people say these specs won't cut it, most of them are elitists!
But an average person could buy half the computer an elitist would buy and get very close to the same quality. How?
By gaming at 1080i instead of 1080p. Sounds stupid? It isn't. The issue most of the time isn't 1080i itself; the issue is setting your computer up properly for 1080i, which also means looking into the Windows color system, color profile, and so on.
It's like a console: when a console runs at 1080i, everything is automatically set to 1080i by the machine (not like on a computer, where you have to set everything manually), and I bet most people would be hard-pressed to see the difference. Most of the time the visible difference is caused by the sRGB profile, not the interlacing, but marketers tend to play on this!
Oh, there will be a difference! But often you have to zoom in so far on the image (with special software) to see it that in practice there is no difference.
The main issue is that if it isn't on a console, it is next to impossible for the average user to set a computer up for 1080i; there are too many variables that need to be changed. But if you're tech-savvy enough, instead of needing that 690 or that 7990 you could do about as well with a 680 or a 7970.
You have to remember that a 1080p frame is around 2 million pixels while a 1080i field is around 1 million; that is a huge difference (the quick calculation after this post spells it out). Most often the real issue is bad timing in the various software.
Like I say, 1080i can look so close to 1080p that you'll wonder why people use 1080p at all, because in theory 1080p looks better. But if you stream, the quality gets lowered anyway because there is too much data (bitrate). So yeah, 1080p is mostly a scam unless you don't stream or you have an uber computer. Stick to 1080i and learn how to set up your computer; it's the best compromise for the average user.
It's worse than that: even if you were to win the lottery, I would still tell you to stick to 1080i. Why? Because the web on average is limited to about 3,500 kb/s, which means you probably won't want to go higher than that. So yeah, today your best bet is 1080i.
Yeah, when you talk about online gaming, it's full of compromises, some better than others!
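For what it's worth, the pixel counts in that post do check out; a quick back-of-the-envelope calculation:

// Pixels drawn per refresh: a full 1080p frame versus a single 1080i field
// (interlacing only updates every other scan line per refresh).
#include <cstdio>

int main() {
    const long width = 1920, height = 1080;
    long progressive_frame = width * height;        // 2,073,600 pixels
    long interlaced_field  = width * (height / 2);  // 1,036,800 pixels
    std::printf("1080p frame: %ld pixels\n", progressive_frame);
    std::printf("1080i field: %ld pixels\n", interlaced_field);
    return 0;
}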
In an earlier post, a poster stated that I cannot run EQII on high settings. All the games I listed above run on high graphics except Vanguard, which runs on medium. I imagine next year's Intel graphics will make most other graphics cards useless. Unless there is a significant breakthrough in technology (like VHS to DVD) and programming, I do not expect to see anything useful from the graphics tech market.
When I can go out and buy a 500 dollar computer and it equals a monster machine (that someone paid monster money for) from as little as two years ago, that is saying something!
I'm sorry, but I don't believe the OP in the slightest. A good friend of mine at work has a chip with Intel HD Graphics 4000 and even that chip can't touch any modern game on "high" settings. I have seen him try with my own eyes. He had to crank the settings down to the bare minimum just to get almost-playable FPS.
But then again, we don't know what monitor resolution he's using. Maybe he has an 800x600 CRT screen. Sure, your chip will play games on high at that resolution, but it won't come close to playing games at high settings at 1920x1080. Not a chance.
Uh... that Skyrim video... yeah, I think you might need a trip to your optometrist. If you want to really see what Skyrim is capable of, just check out some of the heavily modded PC versions with some of those fancy ENBs enabled. I am not saying you cannot have fun playing Skyrim the way it looks in your video, but... OK, it's awful.
That Skyrim video looks downright awful. The close-up textures have to be on the absolute minimum settings; the pixels look like something from the late 90s. The view-distance detail is really low too. Those trees in the background look like trees from one of the earlier Tiger Woods PGA Tour games for the PlayStation 2. Skyrim can look so, so, SO much better than that.
As for the other games, they're shown in 480p resolution, so it's hard to tell much of anything. We also don't know what resolution monitor they were played on.
Don't get me wrong, I've never been one to have the latest, greatest, most powerful gadgets. I had a Radeon 5770 for 3 years and only just upgraded; it still served me fine until the end. I now have a GeForce 560 Ti, which I got for just over $100. You would never see me spend more than that on any PC part. So I'm totally on board with being economical. But don't trick yourself into thinking you've got something better than you do.
Actually, scratch that. If you can deceive yourself and be blissfully happy with what you have, why not? Whatever your requirements are to enjoy the games you play.
I find it difficult to believe.
EverQuest 2 launched way back in 2004. If a new computer today can't keep pace with computers that people had in 2004, something is wrong.
But EverQuest 2 is a massive outlier in game performance. Two key ways to improve game performance are to:
1) offload as much work to the video card as is practical, and
2) thread the CPU code to scale well to many cores.
EverQuest 2 does neither. They did a ton of graphical work on the CPU, and single-threaded the game. The idea was that the game would run well on the future 10 GHz Pentium 4 CPUs that Intel was promising around the time the game launched. Then the laws of physics got in the way and Intel still hasn't released a consumer CPU with a stock clock speed of even 4 GHz. So while it is a badly coded game, it won't penalize you for having a weak GPU or only two CPU cores, as it isn't able to exploit stronger gaming systems.
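To illustrate point (2) above, here is a rough sketch (not how EverQuest 2 or any particular engine actually does it) of spreading per-entity CPU work across however many cores the machine has; simulate_entity is just a made-up stand-in for per-entity game logic:

// Sketch: split a CPU-heavy per-entity update loop across all available
// cores with std::thread. A single-threaded engine runs the same loop on
// one core and leaves the rest idle.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

// Made-up stand-in for per-entity work (AI, animation, physics).
static double simulate_entity(int id) {
    double v = 0.0;
    for (int i = 1; i < 5000; ++i) v += std::sin(id * 0.001 + i);
    return v;
}

int main() {
    const int entities = 20000;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> results(entities);
    std::vector<std::thread> pool;

    // Each thread takes every cores-th entity so the work is spread evenly.
    for (unsigned t = 0; t < cores; ++t) {
        pool.emplace_back([&, t] {
            for (int i = static_cast<int>(t); i < entities; i += static_cast<int>(cores))
                results[i] = simulate_entity(i);
        });
    }
    for (auto& th : pool) th.join();

    std::printf("Updated %d entities on %u threads\n", entities, cores);
    return 0;
}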
-----
There's also a question of how high of frame rates you want. If you're perfectly happy with 20 frames per second on average and a lot of hitching, then that lets you get away with much weaker hardware than if you insist on a steady 60 frames per second. If you're perfectly happy with 60 frames per second in fairly vacant areas dropping into the teens when it gets crowded, then you can get much weaker hardware than if you insist on 60 frames per second in even the harshest, most crowded conditions.
You should also realize that game programmers can detect the GPU you're using and adjust settings for you, though what game designers do about this varies wildly. The "high" graphics preset on Intel HD Graphics could easily be lower graphical settings in an absolute sense than a "medium" graphics preset on a more capable gaming card. They could also hard-disable certain features if they decide that your GPU can't handle it--especially if it is necessary to prevent a crash. The maximum settings that they'll allow your GPU to attempt isn't necessarily the max that they'll allow someone else to attempt.
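As a purely hypothetical sketch of that idea (the GPU names, string matching, and presets below are made up for illustration; real engines key off device IDs, driver versions, VRAM, and benchmarks):

// Sketch: map the GPU name reported by the driver to a default graphics
// preset. "High" on a weak chip can mean less, in absolute terms, than
// "Medium" on a capable discrete card.
#include <cstdio>
#include <string>

enum class Preset { Low, Medium, High };

static Preset default_preset(const std::string& gpu_name) {
    if (gpu_name.find("Intel HD Graphics") != std::string::npos) return Preset::Low;
    if (gpu_name.find("Radeon HD 76") != std::string::npos)      return Preset::Medium;
    return Preset::High; // optimistic fallback for unknown hardware
}

int main() {
    const std::string detected = "Intel HD Graphics 2000"; // would come from a driver/OS query
    const Preset p = default_preset(detected);
    std::printf("Detected \"%s\" -> preset %d\n", detected.c_str(), static_cast<int>(p));
    return 0;
}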
There are also options with how some features get implemented in drivers. In some cases, the DirectX and OpenGL specification will say that a GPU has to have this exact behavior. In other cases, it will only say that it needs to meet certain criteria, but let the exact details vary from one GPU to another. That means that the same game at the same (absolute) settings on different video cards can look substantially different--and any differences are probably going to be in the direction of "Intel graphics look worse".
-----
You can get a decent enough GPU on a severe budget. Desktop Llano systems are basically on clearance and have been for quite a while, so you could grab that and probably get a viable severe budget gaming desktop on a $400 budget excluding peripherals. But what you got isn't a decent enough GPU. It sounds like you just haven't picked up any games that require substantial GPU power at the settings you want.
Wow... that Skyrim video...
I had the same experience only once, back when Oblivion came out and I had to play it on my old Ti4200. Luckily some dude made a patch for the game with dumbed-down textures and called it OldBlivion... I played it for a week, then bought a better card.
I am not that surprised that that card can handle those old games at decent settings, and even modern games with everything set to low.
Even so, the FPS won't be that good, and it certainly won't be as enjoyable as playing games with a dedicated card, even a lower-spec dedicated card.
Me, I like to play all my games maxed out, so I have a decent system. But most of the time I do small upgrades, so I don't end up spending a shitload all at once.