Hi,
I have a dilemma. I'm currently running an i5-2500 machine with a single GTX 670 2GB card and an Eizo EV2436W (1920x1200) as my main monitor, assisted by a 1680x1050 Dell on the side.
I would like to upgrade to either a triple 1920x1200 or a single 3440x1440 monitor setup, and I wonder which will be better and cheaper. I use the rig about 50% of the time for work (while watching films on the other monitor at the same time) and the other 50% for gaming: MMORPGs, Elite: Dangerous, RPGs, some FPSes.
I can get another two Eizos for 850 Euro / 970 USD, but I'm afraid I'd have to upgrade the graphics card as well, or at least find a used second GTX 670 for SLI. Or I can get a Dell U3415W for 950 Euro / 1085 USD.
Any ideas what will be better for my situation?
Comments
I run 3x 1080p monitors, so a couple of things:
1. Three is great for productivity, being able to separate tasks into their own windows and such.
2. For gaming, when it works it's great and very immersive, but many, many games have issues with the FOV at that resolution. The other thing to note with Eyefinity/Surround configurations is that the far left and right sides of your screen are basically fish-eyed, i.e. stretched, so you're not really getting any more detail in the scene so much as something for the corner of your eye to react to.
3. You can safely take whatever FPS you get now and cut it in half or more by going multi-monitor.
4. Multi-monitor: every single game's UI will be stretched. It's horrible.
I suspect a 4K monitor would give a much richer picture over the entire surface of your screen and probably more screen real estate, but all those apps would be windowed.
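The fish-eye stretch in point 2 falls straight out of flat-screen perspective projection: screen position grows as tan(θ) of the view angle, so the pixels spent per degree of scene grow as sec²(θ) toward the edges. A quick back-of-the-envelope sketch in Python (the angles are illustrative, not from any particular game):

```python
import math

def stretch_factor(theta_deg):
    """Horizontal stretch of a flat-screen perspective projection at an
    off-axis angle theta, relative to the center of the screen.

    Screen position is proportional to tan(theta), so the local scale
    is d(tan theta)/d(theta) = sec^2(theta)."""
    t = math.radians(theta_deg)
    return 1.0 / math.cos(t) ** 2

# A triple-monitor surround setup often pushes past 120 degrees of
# total horizontal FOV, i.e. +/-60 degrees off center.
print(f"30 deg off-axis: {stretch_factor(30):.2f}x")  # 1.33x
print(f"60 deg off-axis: {stretch_factor(60):.2f}x")  # 4.00x
```

So at the outer edge of a wide surround setup, a degree of scene can eat four times the pixels it does at screen center without showing you any more detail.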
TSW - AoC - Aion - WOW - EVE - Fallen Earth - Co - Rift - || XNA C# Java Development
I'd personally dismiss the 3440x1440 out of hand as being obviously stupid. Nearly everything I do on a computer (games, web browsing, e-mail, spreadsheets, programming) is limited mainly by vertical pixels, not horizontal. The wider resolutions are meant for watching movies, which is fine for movie theaters or television sets, but is stupid on a computer monitor.
I've long thought that 1920 pixels is a stupid width for a computer monitor. It's too wide for most things, but not wide enough to comfortably fit two things side by side. I've used two 1920x1200 monitors and I mostly end up using the right side of the left monitor and the left side of the right monitor. If they were 1600x1200 instead, I wouldn't miss the extra width. You can do that with two monitors, but it doesn't work so well with three: which side of the center monitor do you use?
2560 pixels wide is more often than not enough to have two things comfortably side by side. 3440x1440 doesn't really gain you much over that, but it sure does inflate the price tag.
What I'm likely going to do soonish for an upgrade is to get three 2560x1440 monitors in portrait mode. 2560 pixels of height is all the vertical space you want and then some, and 1440 pixels wide is appropriate to most programs. If you need more width, spread something across multiple monitors.
I'd strongly advise against buying another GTX 670 in SLI. SLI, like CrossFire, is very heavily dependent on drivers to make things work on a per-game basis. Nvidia is now pushing Maxwell cards, not Kepler, so it's not likely that they're still putting in the work to make SLI work properly with new games on three year old cards--and even if they are, they won't be for much longer. So if you get a second video card for SLI, it often won't gain you much except for heat and noise. If you need a faster video card, get a faster video card--in place of, not in addition to, your old GTX 670.
If you're going to spend a lot of money on monitors, I'd make sure to pick out something that supports adaptive sync. That allows the monitor to refresh when it makes sense rather than at fixed intervals. When playing a game, that means refreshing when it has a new frame to display. That means both smoother frame rates and reduced display latency. Your current video card doesn't support adaptive sync, but your next one likely will, and you don't want to replace the monitors you buy today the next time you get a new video card.
At the moment, AMD supports adaptive sync and Nvidia doesn't. Nvidia supports G-sync instead, which is basically the same thing, except that it's proprietary to Nvidia and requires the monitor vendor to buy some $100 hardware from Nvidia. That inflates the cost of building a monitor, and the higher prices get passed on to you. Adaptive sync doesn't add to the cost of building a monitor at all, and allows the monitor vendor to buy scalers from any of several standard industry vendors rather than specifically from Nvidia. Nvidia presumably could--and likely will--support adaptive sync at some point with a driver update, but that likely won't extend support all the way back to your current GTX 670.
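The latency win from refreshing "when it makes sense" is easy to see in a toy model of vsync. With a fixed refresh, a finished frame sits in the buffer until the next refresh tick; with adaptive sync the monitor refreshes the moment the frame is ready (within its supported range), so that wait disappears. A rough sketch, with made-up timings for illustration:

```python
import math

def fixed_refresh_delay(frame_done_ms, refresh_ms):
    """Toy model of vsync on a fixed-refresh monitor: a frame that
    finishes rendering at frame_done_ms waits for the next refresh
    tick before it is shown. With adaptive sync this delay is ~0,
    since the refresh happens when the frame is ready."""
    next_tick = math.ceil(frame_done_ms / refresh_ms) * refresh_ms
    return next_tick - frame_done_ms

# 60 Hz panel (16.7 ms ticks): a frame finished at t = 20 ms sits in
# the buffer until t = 33.3 ms before anyone sees it
print(round(fixed_refresh_delay(20, 1000 / 60), 1))  # 13.3
```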
Even if you don't go to three monitors, two is a lot better than one. Two monitors lets you have two completely independent things up that each have a monitor all to themselves.
The typical way that games are rendered is that things are perspective-correct for all monitors in a single plane and some viewpoint unnaturally close to the monitor--e.g., 8 inches away. People with multiple monitors virtually never put them all in the same plane. We've had geometry shaders since 2007, and that lets you fix the perspective issues if a game developer wants to. That I'm not aware of any game that has even tried this is, to me, pretty compelling evidence that they're not hiring people with the proper math background to make graphics engines. So it's something that can and should be fixed in software, but you choose from among the games that do exist, not the ones that should exist.
That's one argument for portrait mode over landscape mode in a multi-monitor setup: it keeps the window closer to square, so you don't get the stupidly distorted stuff at the extreme edges.
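To make the per-monitor correction concrete, here's a rough Python sketch of the idea: give each physical monitor its own camera, rotated to match that monitor's actual orientation, then do an ordinary perspective divide. The conventions (z forward, yaw negative to the left) and the angles are made-up illustrative values, not from any real engine:

```python
import math

def project_on_monitor(point, monitor_yaw_deg, half_fov_deg):
    """Project an eye-space point (x, z) onto one monitor of a surround
    rig by rotating the point into that monitor's own view space and
    then doing an ordinary perspective divide.

    Returns a normalized horizontal coordinate, with the panel's
    edges at -1 and +1."""
    a = math.radians(monitor_yaw_deg)
    x, z = point
    # rotate the point into the monitor's view space
    xr = x * math.cos(a) - z * math.sin(a)
    zr = x * math.sin(a) + z * math.cos(a)
    # perspective divide, scaled so the panel edge lands at +/-1
    return (xr / zr) / math.tan(math.radians(half_fov_deg))

# a point 45 degrees to the left of straight ahead...
p = (-math.sin(math.radians(45)), math.cos(math.radians(45)))
# ...lands dead center on a side monitor that is itself angled
# 45 degrees to the left, as it should
print(project_on_monitor(p, -45, 30))
```

Rendering this way means each panel shows the scene as seen through its own "window", which is exactly what an angled multi-monitor setup physically is.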
I like the 16x10 aspect ratio for games, I do use the extra horizontal space while playing, and I miss it greatly on the rare occasion I'm in 4x3 ratio again (older games that don't scale mostly). But like Quiz says, for things like web browsers, mail, and other mostly productive tasks, I keep the windows in a relative 4x3 aspect ratio, and use the extra horizontal space to tile them and flip between them.
I use two 16x10 monitors side by side - one for my game, the other for web browser, widgets, music player, etc. The web browser monitor could be 4x3 and I wouldn't miss the extra space, but I would miss it if it were gone on my gaming screen.
Two monitors don't lend themselves to multi-monitor gaming in my case. I've tried it, but it was lackluster--mostly for the obvious reason that a bezel sits square in the middle of the FOV. I can't imagine 3 monitors side by side in a landscape orientation; that would be a very wide FOV, and while it ~may~ increase immersion, I suspect it would be more a distraction than a help, for me anyway. But then again, I tend to think any monitor larger than 27" is too big (because I have to move my entire head to see the entire surface if I'm sitting at a desk, and I don't like that).
I would like a 4k monitor eventually - I have Retina on my laptop and it's nice (for text in particular), although that laptop can't game well, and chokes utterly if you try to game at the screen's native resolution.
I suspect you could get 3 16x9 panels for cheaper than an ultra-wide would run (I didn't actually look)--but my preference would lean strongly toward a single panel of whatever, so you don't have to deal with bezels, monitor alignment, and a lot of display and power cables. If you wanted to game at anything higher than your current resolution, you could do it with your current card, but yeah, you would have to significantly adjust your in-game settings, and you may run across some more recent titles where you're at or near minimum settings if you want to run at the higher resolution.
I'm running a triple monitor setup and it's great for general work; like someone else said, it's nice to have three screens to spread programs around on. As for games, if the game has been optimised for three monitors it's great. GW2 did a great job, as the UI stays on the center monitor, so there's no looking off to the edges of the other two monitors. ESO looks and plays well also. Games that let you move windows all over the place work well too: I can leave inventory windows or other windows open on the side if I want and still see what's going on with no problems. I have a 55 inch flat screen TV I use for gaming from time to time, but when 3 monitors work well with a game, it's better IMO.
A few games look terrible on 3 monitors so I just use a single monitor for those. Watch Dogs looks bad on 3.
"We all do the best we can based on life experience, point of view, and our ability to believe in ourselves." - Naropa "We don't see things as they are, we see them as we are." SR Covey
Any menus that are fundamentally 2D also don't suffer from the distortion of 3D stuff far off to the sides.
I wouldn't recommend that 34 inch ultra-wide monitor.
Even an ultra-wide monitor that large isn't good if you're planning to have your work and a movie open at the same time; you'll need multiple monitors for that. And if you're going to have multiple monitors side by side, you don't really want an ultra-wide, because it would be hard to position: for gaming and other single-monitor use the largest monitor should be exactly in the middle, but for work you can't have a 34 inch UW monitor in the middle and another monitor beside it, or you'll get neck pains from trying to look at a monitor that's too far off to the side.
It's actually really easy for a developer to set up for 3 monitors properly. They just choose not to: it would increase the performance overhead, and the number of people it affects is negligible. All you would do is make 3 cameras instead of 1, right next to each other and pointing in appropriate directions to prevent overlap. Similarly, for stereoscopic you have 2 cameras next to each other, one for each eye.
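The angle bookkeeping for those 3 cameras is simple: yaw each camera by one monitor's worth of horizontal FOV, so the frusta sit edge to edge. A minimal sketch (the 60 degree figure is just an example):

```python
def surround_camera_yaws(per_monitor_hfov_deg, monitors=3):
    """Yaw angle for each camera in a surround rig so the view frusta
    sit edge to edge: no overlap, no gap. Assumes an odd number of
    monitors with the middle one facing straight ahead."""
    mid = monitors // 2
    return [(i - mid) * per_monitor_hfov_deg for i in range(monitors)]

# three monitors, each covering 60 degrees horizontally
print(surround_camera_yaws(60))  # [-60, 0, 60] -> 180 degrees total
```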
Like others, I don't recommend ultra-wides. I would go with more monitors. I find having 2 monitors to be a dream; I don't know what I would do with 3. An ultra-wide typically has resolution mismatch issues, and it's also not wide enough for doing 2 things at the same time. I would consider 3:1 the minimum aspect ratio for having things side by side--27:9 instead of 21:9.
I think the best application of an ultra-wide is a curved display. With it you can effectively use 1 camera to get a wider field of view in a game by simply adjusting the FOV. But it would be very much a novelty.
On the first part, I think you underestimate how involved it would be. Sure, you could take the naive approach: have three separate camera views, use stencil buffers to block most of the image at a time, and make three separate passes--at a cost of tripling the work you have to do to render an image. It's possible to do quite a bit better than that by having separate culling for each monitor and only sending the draw calls for a pass to the appropriate monitor(s), but that still means you have more uniforms to switch back and forth and more things that you want to sort by simultaneously. Some objects will have to be drawn multiple times for multiple monitors. So it can be done, but it's a pretty big performance hit.
A cleaner approach would be to do clipping yourself manually in geometry shaders. At that point, however, you're trying to do computations in what fundamentally is a quotient space. I don't think it would be that hard for someone with a strong math background. But if it is attempted by a typical computer science major for whom "quotient" doesn't have any mathematical meaning outside of arithmetic, it's not going to end well.
On the second part, if you want to project properly onto a curved display, then you've left linear algebra behind entirely. You'd end up with something fundamentally incompatible with the way fixed-function rasterization is done, and you'd likely get all sorts of weird artifacting as a result. You could avoid this by ditching the usual graphics APIs entirely and using OpenCL instead, but then you lose access to some important fixed-function parts of the pipeline, notably tessellation, the fixed-function depth buffer, and blending. Tessellation may not get used much because the math is too involved for most professional game developers, but it's a whole lot easier than making your own modern 3D graphics engine without using any of the graphics APIs. And that's even if we ignore all the complications that projecting onto a curved monitor brings.
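To illustrate why the curved case leaves linear algebra behind: on a cylindrical screen the horizontal coordinate is proportional to the ray's angle, while a rasterizer produces the tangent of that angle, so a straight 3D edge no longer projects to a straight line of pixels. A small numeric check in plain Python (the coordinates are arbitrary illustrative values):

```python
import math

def flat_project(p):
    """Standard pinhole projection onto a flat screen."""
    x, y, z = p
    return (x / z, y / z)

def cylinder_project(p):
    """Projection onto a cylindrical screen: horizontal coordinate is
    the ray's azimuth angle, vertical is height over distance."""
    x, y, z = p
    r = math.hypot(x, z)
    return (math.atan2(x, z), y / r)

def collinear(a, b, c, eps=1e-9):
    """True if three 2D points lie on one line (2D cross product ~ 0)."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (b[1] - a[1]) * (c[0] - a[0])) < eps

# three points on one straight 3D edge
line = [(t, 1.0 + 0.5 * t, 3.0 + t) for t in (0.0, 1.0, 2.0)]

print(collinear(*map(flat_project, line)))      # True: stays a line
print(collinear(*map(cylinder_project, line)))  # False: bends on cylinder
```

That's exactly the property fixed-function rasterization relies on: triangle edges stay straight after projection, so they can be interpolated linearly. On the cylinder they don't, which is where the artifacting comes from.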