This is going to take some explanation, so bear with me. The short version is that, if it works right, g-sync could greatly increase the smoothness of movement in games.
The way that a 60 Hz monitor works right now is that it refreshes every 1/60 of a second. If it has a new frame to draw, it changes each pixel's color from the old frame to the new one. If not, it just redraws the old frame. If a new frame comes in halfway between refresh cycles, it waits until the next refresh rolls around before it starts changing the pixels to the new frame's colors. (For simplicity, I'm ignoring the time gap between when a video card starts sending a frame to a monitor and when it finishes sending that frame.)
If all games rendered at exactly 60 Hz, with a new frame exactly every 16 2/3 ms, this would work fine. The problem is that games don't work like that, for about as many reasons as there are for why you don't fall asleep at exactly the same time every night.
Imagine a game that renders at a steady 50 Hz, taking exactly 20 ms for each frame. Let's say that the monitor receives one frame at exactly the moment that it needs it in order to start a refresh. It displays that frame immediately. The next frame comes 20 ms later, too late for the refresh tick that occurs only 16 2/3 ms in. So the new frame just sits there until the following tick, 33 1/3 ms after the start. That adds an extra 13 1/3 ms of display latency for that frame. The next frame finishes 40 ms after we started and gets displayed 50 ms after we started, so the second frame only appeared for 16 2/3 ms.
This continues on, with frames being displayed for 33 1/3, 16 2/3, 16 2/3, 16 2/3, and 16 2/3 ms, then going through the cycle again. Every fifth frame is displayed for twice as long as the others. However, the game engine sees it as a new frame every 20 ms, so just as much movement takes place from one frame to the next. The monitor just displays the frames unevenly, and this judder makes the game look substantially worse.
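To make the arithmetic concrete, here's a quick back-of-the-envelope simulation (my own sketch, nothing from Nvidia) of that steady 50 Hz game on a 60 Hz monitor with vertical sync on: each frame's completion time just gets rounded up to the next refresh tick.

```python
import math

REFRESH_MS = 1000 / 60   # a refresh tick every 16 2/3 ms (60 Hz)
FRAME_MS = 20.0          # the game finishes a new frame every 20 ms (50 fps)

def next_tick(t, refresh=REFRESH_MS):
    """First refresh tick at or after time t."""
    return math.ceil(t / refresh) * refresh

render_done = [i * FRAME_MS for i in range(12)]       # when each frame is ready
shown_at = [next_tick(t) for t in render_done]        # when vsync actually shows it
on_screen = [b - a for a, b in zip(shown_at, shown_at[1:])]
added_latency = [s - r for r, s in zip(render_done, shown_at)]

for i, (dur, lat) in enumerate(zip(on_screen, added_latency)):
    print(f"frame {i}: on screen {dur:5.2f} ms, extra latency {lat:5.2f} ms")
# Output repeats the 33.33, 16.67, 16.67, 16.67, 16.67 ms pattern described above.
```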
And that's for a game that renders at a completely smooth 50 Hz. That doesn't happen; games take longer for some frames than others unless the game engine is artificially padding the time it takes to render each frame with idle time between frames.
The increased display latency can often be avoided by turning off vertical sync and displaying the latest partial frame, so as to have the monitor display part of one frame and part of another simultaneously. But this tearing also looks terrible, especially if anything is moving fast or the screen is turning.
The real solution would be for monitors to stop insisting on refreshing at fixed times of their own choosing, without regard to what is going on in the game. Rather, a monitor should refresh when it makes sense. If a game is rendering at 50 frames per second, let the monitor refresh 50 times per second--whenever a new frame is ready to display. If consecutive frames take 20 ms, 24 ms, 21 ms, 25 ms, and 21 ms, then refresh the monitor at those times, whenever each new frame is ready. Let the timing be dictated by the game, not by the monitor refreshing whenever it feels like it. That would both provide much smoother motion and reduce display latency.
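For a rough illustration of the difference (again, just a sketch of the idea, not G-sync's actual protocol), here's the same simulation run on those uneven frame times, once with a fixed 60 Hz refresh and once with the monitor refreshing the moment each frame is ready:

```python
import math

REFRESH_MS = 1000 / 60
frame_times_ms = [20, 24, 21, 25, 21]                  # uneven render times from above

# Times at which each frame finishes rendering (cumulative).
ready = [sum(frame_times_ms[:i]) for i in range(len(frame_times_ms) + 1)]

# Fixed 60 Hz refresh with vsync: wait for the next 16 2/3 ms tick.
fixed_shown = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in ready]

# Variable refresh: the monitor refreshes the instant each frame is ready.
adaptive_shown = ready

for name, shown in (("fixed 60 Hz", fixed_shown), ("variable", adaptive_shown)):
    on_screen = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
    added_latency = [round(s - r, 1) for r, s in zip(ready, shown)]
    print(f"{name:12s} on-screen ms: {on_screen}  added latency ms: {added_latency}")
# The fixed schedule shows frames for a lurching mix of 33.3 and 16.7 ms and adds as
# much as ~13 ms of latency on some frames; the variable schedule shows each frame
# for exactly as long as it took to render, with no added latency.
```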
That is precisely the promise of Nvidia's G-sync. As the changes are mostly to the monitor, not the video card, G-sync is a physical card that has to be inserted into the monitor. Nvidia says that some monitors next year will include it, and later this year, they'll sell the card for enthusiasts to install into existing monitors on their own.
The problem with proprietary features like this is usually compatibility. Doing physics computations on a GPU or having a low-level API for more direct access to the hardware aren't intrinsically bad ideas. But when Nvidia's GPU PhysX or AMD's Mantle requires a lot of extra coding that only a handful of players will benefit from, most developers won't bother, and the features provide no benefit to the games that don't implement them--meaning nearly all games.
But G-sync could well fare better. I don't see any intrinsic reason why Nvidia can't do the software side of this entirely in video drivers, without game engines having to change anything at all. The video card already knows when a frame is finished, and just has to pass that along to the monitor and say when it's time to refresh. That means that essentially universal game compatibility is quite possible--whether DirectX, OpenGL, or browser-based; brand new or many years old; still in development or abandoned years ago.
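Here's a conceptual sketch with made-up names (this is not Nvidia's driver API, only an illustration of where the hook already exists): every game already tells the driver when a frame is done, so the driver can forward that moment to the monitor without the game changing anything.

```python
# Hypothetical names throughout -- this only illustrates the argument that the
# video driver already sits between the game's "frame is done" call and the monitor.

class VariableRefreshMonitor:
    def refresh_now(self):
        # In a G-sync-style design the monitor starts a refresh here, instead of
        # waiting for its own fixed 16 2/3 ms tick.
        print("monitor: refreshing now")

class Driver:
    def __init__(self, monitor):
        self.monitor = monitor

    def present(self, frame):
        # Every game already calls some form of "present" (SwapBuffers, Present, ...)
        # when it finishes a frame; the driver just forwards that moment.
        self.send_to_monitor(frame)
        self.monitor.refresh_now()

    def send_to_monitor(self, frame):
        pass  # scan-out of the finished frame, elided

Driver(VariableRefreshMonitor()).present(frame="<finished frame>")
```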
The hardware side of things is where the compatibility problems show up. For starters, it needs a monitor with a G-sync card in it. But there have been better and worse monitors for a long time, and this is just one more feature that monitors can add.
Worse is that Nvidia says it will require a GeForce GTX 650 Ti Boost or better. I don't see any intrinsic reason why it can't work with lower end, older, or non-Nvidia cards. But I also don't see any reason why Nvidia wouldn't use this to push Nvidia cards. If they were willing to push artificial vendor lock-in with completely stupid things like chipset compatibility for SLI, then I see no reason why they wouldn't do the same with a potentially great feature.
It's entirely possible that AMD or some independent vendor will come up with their own competing solution in time, and then eventually there will be industry standards that work with everything. Think of how Nvidia's 3D Vision led to AMD's HD3D, which led to OpenGL 4.2 and then DirectX 11.1, and now someone who wants to implement stereoscopic 3D can implement it once and have it work on hardware from both vendors.
All this, of course, assumes that G-sync works properly. And that's a big assumption. But if it does what it ought to, Nvidia's calling it "revolutionary" isn't just stupid marketing blather.
Comments
I tend to agree, it could be a big deal. Kinda like when SSDs came out - they didn't sound like much (and a lot of people still poo-poo them) but they have drastically changed PCs. A lot of people will poo-poo this because it doesn't increase frame rates, but tearing is the #1 most annoying thing I contend with on an LCD (Source-based games are horrible for this).
This would be enough to make me consider nVidia over AMD even at a slight price premium for performance, if it works as advertised.
The only problem I see with it is that it requires a supporting monitor. Would be nice if you could get a passthrough device that lets it work with existing monitors, but not knowing exactly how the tech works, that may not be possible.
The software side already supports existing late-series GPUs... it probably could support nearly every GPU (AMD included, if they cared to), but they are limiting it to later generations to try to drive sales, which I can understand. And if you are willing to go out and buy a new monitor for this (one of the most durable parts of a PC build), you could probably afford to drop enough cash on at least an entry-level GPU to support it.
SSDs are a good comparison--but let's not forget that the early SSDs were mostly junk.
Nvidia's G-sync involves putting a particular physical card with several chips on it into the monitor. It's not just a software or firmware hack, but different hardware.
Sounds like a great idea, but I'm a bit worried about picture quality.
As far as I know, in theory an LED monitor should keep its picture stable and have no problems no matter how long the time between refreshes is. But since all modern computer monitors are designed to refresh constantly at 60 Hz, I'm not ready to trust that they'd work perfectly at every refresh rate slower than that until I see it myself.
G-sync is not at all similar to V-sync.
Three potential problems:
1) Screen tearing, as the monitor displays part of one frame and part of another simultaneously, with an obvious line between them
2) Increased display latency, as the video card makes the monitor wait for a while after a frame is completed before the monitor can do anything with it
3) Display stuttering, as monitor timing issues make it so that some frames are displayed much longer than others, to a much greater degree than the variation in times of rendering frames
Your options:
a) Vertical sync on gives you problems (2) and (3), but not (1).
b) Vertical sync off gives you problems (1) and (3), but not (2).
c) G-sync gives you none of the three problems.
I like option (c) best, assuming it works right.
A CRT had an analog electron gun; you could set the sweep rate (in Hz). The gun started at the top (aimed via a magnetic field), scanned left to right in a single row, then moved down one row, and proceeded that way until it got to the bottom, at which point it would start over again at the top. The picture was continuously being repainted because the electron gun never stopped moving, and the time it took to paint the entire image from top to bottom determined the refresh rate. Crazy things would happen if the input signal and the refresh rate didn't sync up (remember how old TVs' pictures would roll up off the screen?).
In LCDs, it's just a clock: the panel takes whatever is on the input and has the pixels change to what they're supposed to be according to that input. There's a maximum refresh rate (and there are cases where you could go so fast the pixels don't have time to respond completely), but in theory the clock should be able to scale downwards with no ill effects at all. Some monitors may have fixed clocks, and then you'd be stuck (and that may be why we don't see a dongle or pass-through device, although I'm pretty sure the clock is on the video card and is included digitally in the signal to the monitor), but since Nvidia is forcing this G-Sync card to be paired with a monitor, the monitor manufacturer can ensure the LCD clock they use can scale to whatever G-Sync needs it to be.
It's implemented in monitor hardware, not game engines. All of the software support will probably be done in video drivers, but the hard part is hardware that gets built into the monitor.
I do expect Nvidia to lock it to recent GeForce cards, and most monitors won't support it. But if you buy a monitor and video card that do support it, then it should work with every game. There's a big difference between that and GPU PhysX or 3D Vision.
And all of it will cost a soul and a kidney?
No, thanks. I will stick with the old technology until this one is proven and tested and, more importantly, cheap.
Wow. Instead of beating the middleman, they decided to be a much better middleman.
Nice play, Nvidia. I hope it works.
"There are at least two kinds of games.
One could be called finite, the other infinite.
A finite game is played for the purpose of winning,
an infinite game for the purpose of continuing play."
Finite and Infinite Games, James Carse
All I see brewing is another Beta vs. VHS, Blu-ray vs. HD DVD fight coming down the pipeline. Nvidia is positioning itself to enter the display market in hopes of capitalizing on getting its hardware built into monitors before Radeon can get its foot in the door. Think of the sum of Apple iPhone parts. Soon there will be two types of monitors: those with Nvidia modules installed and those with Radeon modules installed. Compatibility issues will arise, but the average PC user will not know the difference. Enthusiasts will be the first to fight these wars. Eventually you will see exclusive partnerships with monitor manufacturers and a slight rise in monitor cost. Yay! More things to go wrong when you're putting together a PC.
"Small minds talk about people, average minds talk about events, great minds talk about ideas."