Saturday, October 15, 2011

Anti-Aliasing, GPU Murder, Justified, or No?

Anti-Aliasing, multi-sampling, 8x, 16x? Analytical? FSAA? What the hell is this shit, why can't I max it, and what am I looking for that makes it worth turning my GPU into an EZ-Bake oven? It can be kind of annoying to beat your framerate into the floor for something that's pretty hard to spot unless you know what to look for.

Lucky for you, you don't need to look any further to find at least some of the answers. Simply put, AA is designed to make angled lines on square pixels not look like a staircase. Since each pixel can only have one RGB value at any given time, you can't just have it be half and half with two different colors. The way anti-aliasing fixes this, roughly, is to take multiple color samples for some or all of the pixels along an edge and blend them in a way that lends itself to both sides of the line, making the transition appear smooth.
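If it helps to see the idea in code, here's a toy Python sketch of the brute-force version (supersampling). It's a minimal illustration, not how any real GPU pipeline works; the scene, function names, and sample pattern are all made up for the example:

```python
# Toy supersampling sketch: average a grid of sub-pixel samples so a pixel
# the edge passes through comes out as a blend instead of a hard step.

def shade(x, y):
    """Made-up scene: white below the diagonal line y = x, black above it."""
    return (255, 255, 255) if y < x else (0, 0, 0)

def pixel_color(px, py, grid=4):
    """Average a grid x grid pattern of sub-pixel samples (4x4 = 16 samples)."""
    total = [0, 0, 0]
    for i in range(grid):
        for j in range(grid):
            # offset each sample to the center of its sub-cell inside the pixel
            sx = px + (i + 0.5) / grid
            sy = py + (j + 0.5) / grid
            sample = shade(sx, sy)
            for c in range(3):
                total[c] += sample[c]
    return tuple(channel // (grid * grid) for channel in total)

# A pixel sitting right on the line blends toward grey, (95, 95, 95) here,
# instead of snapping to pure black or pure white:
print(pixel_color(10, 10))
```

Roughly speaking, multisampling (MSAA) gets a similar edge result much more cheaply by only taking the extra samples where geometry edges cross a pixel instead of shading every sample for every pixel, which is why it's the common default.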

One of the reasons this is difficult to spot for a lot of people is pixel density. Most desktop displays have fairly high density, which basically just refers to how tightly the pixels are packed into the screen area. A 22" 1920x1080 display has the same number of pixels to play with as a 30" 1920x1080 display. That means at the same resolution, the bigger your display, the larger each pixel, which can exaggerate the staircase shape on angled lines.

The fun thing here is that the better your pixel density, the less you need AA. You might need 4x or so at 22" 1080p, but at 32", you should be trying to max it to get lines that look similarly smooth. The different varieties and multiples involved, like 8xMSAA or 4xSSAA, are just different methods and sample counts used to determine what the resulting pixel data will be.
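To put some quick numbers on that density difference, here's the back-of-napkin math (a simple sketch, assuming square pixels and the display sizes mentioned above):

```python
# Pixel density (PPI) = length of the pixel diagonal / physical diagonal.
import math

def ppi(diagonal_inches, width_px, height_px):
    return math.hypot(width_px, height_px) / diagonal_inches

for size in (22, 30, 32):
    print(f'{size}" 1920x1080: {ppi(size, 1920, 1080):.0f} PPI')
# 22" ~100 PPI, 30" ~73 PPI, 32" ~69 PPI: same pixels, much bigger squares
```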

[Three screenshots: 0xAA, 4xAA, 8xAA]
These three images were originally taken at 1080p in Dragon Age 2, DX11, with everything else at highest quality; the only thing changing in each shot is the level of anti-aliasing, as labeled above.

Now that I've explained anti-aliasing, spotting it should be easy, right?

I'm guessing a lot of you out there actually have to work really hard to spot it. Like I said, the smaller the pixel, the smaller the square shape, and the smoother the line, even without anti-aliasing.

On my 32" TV, I can barely see the difference between 4x and 8x at desk viewing distance.


So, now that we've gone over exactly how big of a difference it makes, you can see that trying to max this setting may not really matter for your overall gaming experience. Given the massive GPU horsepower needed for high AA with decent-quality textures, truly maxing every game that comes out can be an expensive prospect.

But if you can live without something you can't see, or can barely see, you can generally get by with a lot less power. The exact difference will vary a good bit from game to game, but frequently, if you're getting slightly jerky framerates at 8xAA, dropping to 4xAA will get you fairly smooth ones, in my experience. That's a big performance difference for a barely visible change.
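As a rough illustration of why that drop helps so much, here's the worst-case arithmetic, assuming pure supersampling where shading cost scales linearly with sample count (MSAA is cheaper than this in practice, so treat it as an upper bound, not a benchmark):

```python
# Samples shaded per frame at 1080p under brute-force supersampling.
PIXELS_1080P = 1920 * 1080

for samples in (1, 4, 8):
    print(f"{samples}xAA: {PIXELS_1080P * samples:,} samples per frame")
# 8x shades twice as many samples as 4x; if 8x is jerky, 4x frees up
# a lot of GPU headroom for a difference you can barely see.
```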

Now obviously, this is one of those things where everyone needs to draw their own conclusions. I'll keep using insane PCs, because getting performance out of them is half the fun. But for people wanting performance on a budget, check the AA level used in benchmarks, and you might just find that you'll be okay with a card that doesn't look quite as impressive when all you're going by is how long the bars are.

2 comments:

  1. Perhaps you could also do a post on the different types of AA?

  2. It's in the works. I'm trying to keep subjects separate mostly. Glad you liked it though.
