Nobody really talks about how high to set AA. I understand that higher resolutions on smaller screens mean less AA is needed.
What do you recommend?
Comments
Use 4x if AA impacts your frame rate; if not, crank it as high as it will go without hurting your performance.
Someone on some other forums says that anything above 8x AA is just a marketing gimmick. Do you find any difference with 8x AA vs. 16x AA?
I'm not too worried about the performance hit. I'm getting a 6950 or a 6970; I'd just rather have FPS than too much AA.
It's not just the number, but also the type of anti-aliasing. For that matter, different cards offer different types of anti-aliasing. The superior geometry performance of GeForce cards means that Nvidia has offered CSAA for years, something that AMD just introduced on their high end cards yesterday. And I do mean yesterday in the literal sense. AMD calls it EQAA, but I think CSAA is a better name for it.
Meanwhile, the superior shader performance in Radeon cards means that AMD offers MLAA that Nvidia doesn't, though that isn't true anti-aliasing. AMD cards also handle SSAA better than Nvidia cards. Nvidia offers some of the benefits of SSAA at a lighter performance hit with transparency AA, which AMD doesn't offer.
How high you should set anti-aliasing varies by game, and by what your hardware can handle. I don't see much of a point in going above 4x SSAA. Often there are tradeoffs between settings, where you can get smooth performance with either setting A or setting B on, but not both at once.
One thing to note about anti-aliasing is that it only adds load on the video card, not the processor. If you're processor bound, you can go ahead and crank up anti-aliasing without impacting your frame rate at all.
Also, there's no such thing as a 5950, and I don't see much of a point in getting a 5970 now.
I can tell you I see a difference from 8x to 16x AA in Age of Conan; at the same time, it looks like the best game out there to me at 32x AA :P (forced through the Nvidia Control Panel)
GeForce cards don't offer a true 32x AA, or even 16x AA. What Nvidia calls 8x and 16x AA are both really 4x AA plus CSAA. What Nvidia calls 32x AA is really 8x AA plus CSAA. Nvidia uses inflated numbers to try to make it sound like it's better than it actually is.
That's not to say that CSAA is bad. But 4 real samples plus 12 coverage samples isn't at all the same thing as 16 real samples, or even 8, for that matter.
Regardless, that's all in the territory where you won't be able to tell a practical difference in actual games with things moving around the screen rapidly. Maybe if you take screenshots and compare them side by side, you might be able to tell a difference. Maybe.
You have good advice for the most part here.
I use 4x AA because I do not stand around looking at a leaf to see if I can find jaggies. Besides, for me 4x looks great.
If 4x is too much, use 2x.
You will see a difference between 4x and 2x. However, if you are getting an HD 5870 you can run almost any game at 4x AA. The same can be said for the HD 5850. The HD 5870 has a little more power.
If you are looking at the HD 5870, take a look at the benchmarks for the 6870 and 6950, and compare prices.
Intel Core i7 7700K, MB is Gigabyte Z270X-UD5
SSD x2, 4TB WD Black HDD, 32GB RAM, MSI GTX 980 Ti Lightning LE video card
Yeah, I play Bad Company 2 on my 5770, and:
x1 AA = very jagged
x2 AA = much crisper, but fine objects like vehicles, leaves, trees, homes, telephone cables, and mountain ridges are all still noticeably jagged.
x4 AA = very clear; telephone lines, mountain edges, cars, homes, and grass all look seamless. You would need to look closely, deliberately hunting for jaggedness, to find it.
x8 AA = with the 5770 this starts to slow me down, and I do not notice a significant difference from x4. I can tell that EXTREMELY fine details like leaves, phone cables, etc. all look slightly sharper at a distance. Not worth the extra resources imo, because my eye never catches the difference when I am not specifically looking for it.
x16 AA = same crap as the previous, but much slower, and I don't have a microscope at hand.
IMO, for 'most' games, anything above x4 is sort of pointless, with a few games being exceptions. x4 really is the sweet spot. For me, anything below that isn't crisp enough, and anything higher doesn't make enough of a noticeable difference to be worth the performance cost.
mmm pizza
The higher the resolution, the less you need AA, depending on the pixel density in your field of view. But at 720p, I can only barely tell the difference between 4x and 8x on my 56" DLP from 8 feet away.
If you were using a lower res than 1080p on something 2 feet from your face, I'd say definitely strive for 8x, but at 1080p it isn't going to matter much, and definitely not 16x. Waste of processing power.
Thanks everyone! And to avoid confusion, I meant 6950/6970. Lol sorry.
I'll be using 4x from now on.
I can't wait for my 6950 now xD.
What is the best type of AA from a visual perspective?
SSAA or Morphological Anti-Aliasing?
Edit: Nvm SSAA is still the best
Well lol.....
If you must know, MSAA is faster, but SSAA does a little more.
IMO I have never seen, or at least never noticed, surfaces that need AA; only the jagged edges of objects need it.
No question, stick either one up to x16 and they will both be superb, but for the sake of today's strenuous gaming, I would say go for the faster MSAA. I never set it more than x4, and often I never need it; I think it is more relevant to games of old. But for the sake of MMORPG gaming, there are older games using low-end graphics with 64x64 stretched textures. WoW could use AA, FFXI could use AA, heck even EQ2, most certainly EQ1. MANY F2P games could use some AA help as well.
You just don't see that pixelated look anymore in graphics, even with AA turned off.
Never forget Three Mile Island and never trust a government official or company spokesman.
The 6950 and 6970 seem to be priced more or less correctly compared to the 570 and 580 (if maybe a hair too high), and the 6950 is at least at a fair price compared to the 5870 (though really no better). But at the moment I find that the 6970 often doesn't give the boost in performance over the 6950 that would warrant it.
Another option is, of course, to do something like put two 6850s in Crossfire. That'll only set one back about the price of a single 6970 (a little less, in fact), but will deliver vastly better performance than a 6970. Taking Guru3d's 6970 review, it looks like two 6850s would give roughly 30% better performance, even just at 80% scaling.
Entirely depends on your GPU. Start with 4 and move up till it chokes. Personally I could max it but I don't because it's not a good "bang for your buck" tweak.
A quick explanation of anti-aliasing:
Open Paint and draw a not quite horizontal black line on a white background. It will look jagged, as you can see exactly where it jumps up by one pixel. That's the sort of thing that anti-aliasing is intended to fix.
If the pixels at the boundary of the line are gray instead of white, with light gray if a little further from the center of the line and dark gray if a little nearer to the center of the line, it looks much better to the human eye. Well, at least it does when you zoom out. Zoomed in enough that each pixel is pretty big, it looks terrible to blur like this. Anti-aliasing acts by blurring such jagged edges.
The naive way to do anti-aliasing is SSAA. If you run a game on a 1920x1080 monitor with 4x SSAA on, the game internally renders at a resolution of 3840x2160. To compute each pixel that actually gets displayed on the monitor, the card averages four pixels in a 2x2 block from what the card rendered.
In the above example of a diagonal line in Paint, pixels at the edge but a little further from the center of the line would be an average of 3 white pixels and 1 black pixel, and appear as light gray. Pixels a little nearer to the center of the line would be an average of 3 black pixels and 1 white, and appear as dark gray. Some in between would be an average of 2 white and 2 black and appear as an intermediate shade of gray. This blurs jagged edges very effectively.
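To make that concrete, here is a toy sketch in Python of what the SSAA resolve does (purely illustrative; the real work happens on the GPU, and the line-drawing code here is just made up for the example): render a not-quite-horizontal black line into a double-resolution white image, then average each 2x2 block down to one displayed pixel.

```python
# Toy 4x SSAA: render at 2x width and 2x height, then box-average each
# 2x2 block down to one displayed pixel. Purely illustrative; a real GPU
# does this in its resolve step.
W, H = 16, 8                                  # pretend "monitor" resolution
hi = [[255] * (2 * W) for _ in range(2 * H)]  # white high-res buffer

# Draw a 1-pixel-thick, not-quite-horizontal black line.
for x in range(2 * W):
    hi[x * 2 // 7][x] = 0

# Downsample: each displayed pixel is the average of its 2x2 block.
lo = [[(hi[2*r][2*c] + hi[2*r][2*c+1] + hi[2*r+1][2*c] + hi[2*r+1][2*c+1]) // 4
       for c in range(W)] for r in range(H)]

# Blocks the line only partly covers come out gray instead of black or white,
# which is exactly the edge blur described above.
shades = {0: "#", 63: "+", 127: ":", 191: ".", 255: " "}
for row in lo:
    print("".join(shades[v] for v in row))
```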
There are two big advantages to SSAA. First, it's very easy to implement. Radeon HD 5000 series and later cards can do SSAA on games that don't support anti-aliasing at all. Thus, you can go back and play 10 year old games that didn't implement anti-aliasing because cards weren't powerful enough, and get anti-aliasing on those games.
Second, it's the best possible image quality. All jagged edges get blurred, no matter how they got there in the picture. Some fractal type patterns can mess it up, but that's about it.
But there's one huge drawback to SSAA: it's very computationally expensive. SSAA doesn't add to the geometry or physics processing requirements, but does put a heavy load on the shaders, texture units, probably ROPs, likely memory bandwidth, and especially memory capacity. Running a game at 3840x2160 is just a huge resolution that will put a huge strain on a lot of cards that might otherwise run the game smoothly.
This computational expense is the reason why there are other forms of anti-aliasing. The basic goal is to take more samples where there are jagged edges, but not where there aren't jagged edges and you wouldn't be able to tell the difference. If you can get 80% of the benefit of SSAA at 20% of the extra computational cost, that can be great for cards that can't handle true SSAA.
One common approach to this is MSAA. Both AMD and Nvidia advocate implementing MSAA in the same way. Basically, a 3D image is made up of lots of polygons (typically triangles), with texture maps on each. Often the jagged edges arise at borders of polygons, as happens at the boundary of an object (in the intuitive sense that a rock or a mob is an object, not the OOP sense). The approach of MSAA is to take the extra samples and average the pixels at the borders of polygons, but not elsewhere. This fixes jagged edges at borders of polygons. Most pixels are not at borders of polygons, so most pixels don't get anti-aliased, which keeps the computational expense down.
One problem with MSAA is that it requires extensive knowledge of the geometry of a scene to do anything. If game developers don't specifically code MSAA support into a game, then it can't be used with that game.
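As a rough illustration of what "coding MSAA support in" means on the game side, here's a minimal sketch using the Python glfw and PyOpenGL packages (the window size, title, and sample count are just example values): the game has to ask for a multisampled framebuffer up front, before anything is rendered.

```python
# Minimal sketch of a game requesting a 4x MSAA framebuffer (GLFW + OpenGL).
# Illustrative only; a real engine also sets up shaders, geometry, etc.
import glfw
from OpenGL.GL import glEnable, GL_MULTISAMPLE

glfw.init()
glfw.window_hint(glfw.SAMPLES, 4)   # ask the driver for 4 samples per pixel
window = glfw.create_window(1920, 1080, "msaa-example", None, None)
glfw.make_context_current(window)
glEnable(GL_MULTISAMPLE)            # enable multisample rasterization

# ... render loop here: polygon edges drawn into this framebuffer get MSAA ...

glfw.terminate()
```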
Another problem with MSAA is that it misses a lot of jagged edges. One common type of jagged edge that it misses is transparencies. For example, if you have a fence, rather than having a bunch of polygons for each bar of the fence, it is simpler to have one large square, with a texture map that is transparent at the "holes" in the fence. The bars of the fence are not at the edges of a polygon, so MSAA doesn't know that they ought to be anti-aliased, and doesn't anti-alias them. That leaves jagged edges and looks bad.
Another type of jagged edges that MSAA will miss is when you have sharp jumps in color on a texture map. For example, go back to the not quite horizontal black stripe on a white background. If that's the texture map for a polygon, then the stripe won't be at the edge of the polygon, so MSAA won't know to anti-alias it, and will just leave it jagged.
One attempt at improving this is that recent Nvidia cards offer transparency anti-aliasing. This is basically MSAA, but when it finds transparencies in a texture map, it anti-aliases those, too. This fixes jagged edges caused by transparency, but not the black stripe on a white background issue. AMD cards do not offer this type of anti-aliasing.
Another more recent effort at fixing this issue is MLAA on recent AMD cards. What MLAA does is that, after a frame is completely done, it goes through and looks for places where there is a sharp jump in color between consecutive pixels, and blurs them together. If it finds a place where one pixel is white and the next one is black, for example, it will blend them. This fixes jagged lines due both to transparency issues and to the black-line-on-a-white-background issue. Thus, MLAA completely fixes jagged-line issues. Because the blending only has to be applied at detected edges, it is far less expensive than SSAA, though still typically more expensive than MSAA. MLAA doesn't have a 4x or 8x or whatever modifier. Either MLAA is off or on.
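A greatly simplified sketch of that kind of post-pass, in Python (this only shows the "find sharp jumps between consecutive pixels and blend" idea; AMD's actual MLAA also classifies edge shapes before blending, and the threshold here is an arbitrary made-up value):

```python
# Very simplified MLAA-style post-process on a grayscale image (list of rows).
# Real MLAA classifies edge shapes before blending; this toy version just
# blends wherever horizontally neighboring pixels differ sharply.

THRESHOLD = 96   # how big a jump counts as an "edge" (arbitrary choice)

def postprocess(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w - 1):
            a, b = img[y][x], img[y][x + 1]
            if abs(a - b) > THRESHOLD:      # sharp horizontal jump found
                mid = (a + b) // 2          # blend both sides toward the middle
                out[y][x], out[y][x + 1] = (a + mid) // 2, (b + mid) // 2
    return out

# A hard white/black boundary gets softened into intermediate grays:
img = [[255, 255, 0, 0]]
print(postprocess(img))   # [[255, 191, 63, 0]]
```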
MLAA isn't really a type of anti-aliasing at all, though its intuitive effects may look similar to anti-aliasing. Rather MLAA is a post-processing effect. This offers the advantage that MLAA can be applied to any image, without any need for game developers to do something. You could take a screenshot of a spreadsheet and apply MLAA to it, for example. This also means that MLAA can be combined with any other type of anti-aliasing that you can use. For example, you could play a game and have MSAA on while computing the initial game image, then apply MLAA to that image before it gets displayed on the screen.
There is one huge problem with MLAA, though. Sometimes jagged edges really should be jagged. Applying anti-aliasing to text makes the text look blurry and looks really bad. When an AMD card takes the completed image and applies MLAA to it, it doesn't know what parts of the image come from 3D renders and need to be anti-aliased, and what parts are the HUD (e.g., your skillbar) and done in 2D and pasted on top of what it rendered in 3D. The latter should not have MLAA applied to it, but the video card can't tell the difference. I hope that in the future there will be a way for a game to say, this part of the picture is done in 2D, so don't apply MLAA to it. AMD might well be lobbying for the inclusion of that in DirectX 12.
MLAA is very shader-heavy, and AMD cards have far superior shader power to otherwise comparable Nvidia cards. MLAA isn't offered on Nvidia cards (yet), and even if it were, the cards wouldn't be very good at it.
The final anti-aliasing method worth mentioning is CSAA. Nvidia cards have offered CSAA for years, and AMD just now started offering it (under the name EQAA) on new cards. Let's say that you have a boundary between two triangles, one of which is solid white and the other solid black. If you use MSAA to take several samples per pixel, the ones in one triangle will be white, and the ones in the other triangle will be black. The pixel will be a shade of gray determined by the ratio of samples in each triangle.
What CSAA does is to say, let's only take a few samples that get the exact color, but take several more that ask only what triangle the point is in. The latter type of sample is much less computationally expensive than the former. What Nvidia calls 16x AA is really 4 real samples plus 12 coverage samples that only ask what triangle the point is in. It then assumes that each point that it doesn't check the color on is the same color as the other points in the same triangle for which it does check the color.
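Here's a toy model of that resolve in Python, just to show the sample bookkeeping (the real hardware logic isn't public, so treat this purely as a conceptual sketch): four color samples each know their triangle and color, twelve coverage samples know only their triangle, and each known color gets weighted by how many of the 16 samples landed in its triangle.

```python
# Toy model of a 16x CSAA resolve: 4 color samples + 12 coverage-only samples.
# Conceptual sketch only; the actual hardware algorithm is not public.

def csaa_resolve(color_samples, coverage_samples):
    # color_samples: list of (triangle_id, color 0..255)
    # coverage_samples: list of triangle_id (no color information)
    n = len(color_samples) + len(coverage_samples)  # 16 in "16x" mode

    # Count how many of the n samples landed in each triangle.
    count = {}
    for tri, _ in color_samples:
        count[tri] = count.get(tri, 0) + 1
    for tri in coverage_samples:
        count[tri] = count.get(tri, 0) + 1

    # Assume all samples in a triangle share the color we sampled there.
    colors = {}
    for tri, c in color_samples:
        colors.setdefault(tri, []).append(c)

    pixel = 0.0
    for tri, k in count.items():
        if tri in colors:  # weight the sampled color by the triangle's coverage
            pixel += (k / n) * (sum(colors[tri]) / len(colors[tri]))
        # A triangle hit only by coverage samples is the case where CSAA has
        # to guess; this toy simply drops its contribution.
    return pixel

# White triangle "A" covers 5 of 16 samples, black triangle "B" covers 11:
print(csaa_resolve([("A", 255), ("B", 0), ("B", 0), ("B", 0)],
                   ["A"] * 4 + ["B"] * 8))   # ~79.7, i.e. a dark gray
```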
In the case of solid color triangles, this can work really well. The only time that this sort of 16x CSAA would be any different from a true 16x MSAA is when all four color samples are in the same triangle, but one or more of the coverage samples are in a different one.
If the textures are messy, however, this 16x CSAA doesn't necessarily look any better than mere 4x MSAA, but is more computationally expensive. I'm sure there are pathological cases where Nvidia's 16x CSAA will actually look worse than 4x MSAA, but those aren't representative of real world usage.
The superior geometry performance of the recent GeForce architectures means that they're better at CSAA than Radeon cards. The Radeon HD 6950 and 6970 are the only AMD cards that even offer it, though I'd expect that future cards of the same and later architectures will offer it as well.
holy crap O_o thanks.
Nice post Quizzical. Isn't the Adaptive MSAA option in Catalyst ATI's version of Transparency anti-aliasing?
In order for anti-aliasing to work, blurring has to occur in the image. Anisotropic filtering is what makes the image quality a lot better. If you are playing competitively, forget anti-aliasing altogether... Otherwise, at 1920x1080, going above 4xAA actually results in more performance burning and over-blurring of the image. Supersampling should not be called "anti-aliasing" as it has nothing to do with eliminating aliasing, but it is actually an amazing technology for image quality.
Supersampling takes your game and renders it at double the resolution, then scales the image down for your monitor. The curves look more correct because the detail is smaller and finer, making it harder for the eye to spot the aliasing, but if you look hard enough it's still there... just not as noticeable.
You have not provided a compelling reason for AA to be a disadvantage in competitive play. The jagged edges of non-AA are actually less representative of the hit boxes than the smoothed edges of AA; jaggies do not equal accuracy to the internal geometry of the scene. The scene has much finer granularity than can be represented on a monitor, and a jagged line represents a polygon outside the dimensions it actually occupies. Anti-aliasing actually removes false data from the scene by reducing its contrast to the nearby sampled colors.
As you can see in the visual aid I just made in Paint :P
The black line represents the jagged edge of non-AA occupying large pixels; the yellow line represents the actual geometry of the scene the game would be using for hit detection (if it even uses the actual geometry for it).
AA actually improves image fidelity, even though it may sound detrimental when you throw around words like 'blur'. This helps when enemies are behind heavy foliage, where jaggies can obscure neighboring pixels, or at long distances, where few pixels are available to represent objects. Though if you are playing a very graphically limited game like Quake, it's not going to matter, because there is nothing to obscure a player's pixels.
I'm not sure what adaptive AA is. If it is just transparency anti-aliasing, then that's a terrible name for it.
"SuperSampling Should not be called "Anti Alliasing" as it has nothing to do with eliminating aliasing"
SSAA takes what a game does for anti-aliasing and does it to the entire scene instead of only selected portions. That sure sounds like anti-aliasing to me.
"If you are playing competitively,"
Playing competitively? Bah. It's a game.
If the only thing that mattered was the highest possible frame rates, we'd all turn most graphical settings off because all they do is make the game look nicer. I have higher standards on frame rates than most people, but even I think that would be ridiculous.
SSAA simply takes an image at double the resolution and downsamples it to your monitor. This is not an "anti-aliasing" method because it does nothing to solve the "exhaustion" problem (formal mathematics)*; it just makes the aliasing smaller to the eye (by four times) to try to keep the eye from recognizing it. SSAA uses four times the space, while the processing itself is O( n )^x (exponential growth function), while its optimization reduces the exponential growth time by a lot.
AMD made MLAA the first 100% full-screen anti-aliasing with the 6xxx series; all prior AA forms were partial. It suffered problems due to being first generation, but it had promise. I can't wait for it to be perfected.
As far as gaming goes, you either are competitive or you aren't. The demands are different, as are the settings. The beauty is we can go either way without fault and show maturity. Sometimes you really want to win; other times one wants to shoot the breeze.
Personally, I hate AA. It makes things blurry and I prefer the detail. I recommend none. AF on the other hand... Crank that up all the way. No modern high end video card will have trouble with AF on maximum.
"SSAA uses four times the space, while the processing itself is O( n )^x (exponential growth function)"
That is so wildly wrong as to be absurd.
When the big O notation is used, most commonly n is the variable. The variable can be something else (or there could be more than one), as you really just use whatever variable(s) you used in the algorithm for which you're trying to evaluate the computational complexity. But you definitely don't use it when you don't already have some variable that you're talking about.
O(n) is a set of functions. More precisely, a function f(n) is in O(n) if there are real numbers M and N such that f(n) < Mn for all n > N. For either of the inequalities, it doesn't matter if the inequality is strict or not. If you want to allow f(n) to be negative, then you may want an absolute value; often f(n) is counting something, and nonnegative by definition.
That seems like a goofy definition at first glance, and it is a fairly subtle concept, but it makes a lot more sense if you've got a real analysis background. What it's really saying is that the lim sup of f(n)/n exists and is finite. That is, f(n)/n may be messy, but is bounded above for sufficiently large n, rather than going off to infinity.
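In symbols, the two equivalent formulations above (writing it with the absolute value, per the earlier caveat) are:

$$
f \in O(n) \iff \exists\, M, N \in \mathbb{R} \ \text{such that}\ |f(n)| \le M n \ \text{for all}\ n > N \iff \limsup_{n \to \infty} \frac{|f(n)|}{n} < \infty.
$$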
But back to the point. O(n) is a set. And you're trying to take a set to the x power. You can take a Cartesian product of a set with itself x times if x is an integer. If x is not an integer, I'm not sure if it's possible to make sense of it. But even if it is, taking a Cartesian product of a set of functions with itself some arbitrary number of times is just a goofy thing to do.
The notation could make sense if you make it O(n^x) instead. If you plug in a positive integer for x, this is a set of functions that doesn't grow faster than some polynomial. In particular, if f(n) is in O(n^x) for some integer x, then f(n) is not exponential in n unless the base is at most 1. And if the base is at most 1, then it's not growing at all.
If you use nx SSAA (where n is some number), then the number of samples as a function of n is in O(n). The amount of work done to compute such a scene is in O(n). And in particular, it's not exponential in n.
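To spell the arithmetic out with concrete numbers (a quick sanity check, using 1920x1080 as the base resolution): the sample count for nx SSAA is n times a constant, which is linear growth, not exponential.

```python
# Sample counts for nx SSAA at 1920x1080: linear in n, not exponential.
W, H = 1920, 1080
for n in (1, 2, 4, 8):
    print(f"{n}x SSAA: {n * W * H:,} samples")
# 1x: 2,073,600   2x: 4,147,200   4x: 8,294,400   8x: 16,588,800
```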
If you read this far, congratulations.
So basically, if you can't defend your claims on their own merits, you go for personal attacks at whoever points out that you're wrong?
But this is ridiculously off topic. So I'm going to conclude by answering the original question with: it depends on the game you're playing.
I don't think he claimed to be. But he clearly has a very strong background in it.
Actually, I think Quizzical IS a mathematician. He had noted in another thread once that that's where his background was, and I assume he meant his formal education.