
Monday, November 14, 2011

Beta Testers, the Bane of Gamers.

So this weekend I got to spend some time on the Star Wars Old Republic beta. Obviously, it's still under NDA, so I'm going to stick with discussing things that are officially released common knowledge, plus the testers themselves.

Fact: a lot of people testing for Bioware don't have a clue how this works. The game releases in, what, a month and change? So why would you suggest changes to the underlying structure of major systems, like space combat? I'm pretty sure there was a design meeting at some point where they discussed the merits of rail shooter vs. open space combat. Do I wish it was more like Tie Fighter and X-Wing? Yes, of course, I loved those games. Do I think it's going to look like that in 40 days when it's nothing like that now? No, I don't know where to find drugs that good. Stick to useful feedback.

Second: I don't care that you liked KOTOR. SW:TOR is not KOTOR. It doesn't matter that it's Bioware with a lot of similarities. If it was KOTOR 3, they probably would have named it KOTOR 3. It's not a single player game. It won't be a single player game. And come on, people, grow a damn brain stem already. MMO does NOT mean WoW clone. Get a damn clue. It's a type of multiplayer experience. Nobody asking Bioware not to fuck this up by turning it into a single player game where we pay the upkeep on DRM servers (the single player game being exactly what the KOTOR fanboys are asking for) wants this game to be WoW. We just want the MMO to have the MM in it. It's easier to ignore other people in an MMO setting than it is to get an MMO experience in a single player game.

Third: Running from point A to point B won't kill you. You don't need to start off with a speeder bike. Hell, try doing some of the quests, you might even like the game if you do the content that isn't the storyline. Oh, but wait, this is a single player game with inconvenient other people in it.

And here's a big one, folks. Quit trying to give technical feedback if your PC makes the original Game Boy look hot. I literally saw one idiot raving about the "amazing" graphics right after he upgraded to a graphics card that let him play at native resolution. That's right, folks: he's used to every game looking like Wolfenstein 3D, so he must know good graphics when he sees them! The graphics, as they are, are objectively crap, and the performance was pretty bad even on those graphics. Yes, I've seen worse, and it didn't detract too much from the gameplay, but that doesn't mean it performed well or looked all that pretty. Don't give feedback above your level of ignorance. You say opinion, I say you're too stupid to recognize an objective fact if it bit you on the ass.

If this is the sort of feedback designers get from their betas, and the sort of people they inevitably listen to, it's hard to blame the devs for the plethora of mediocre games lately. Let's blame the twelve year old kids who haven't played anything besides WoW, on their dad's old graphing calculators. They're destroying games for the rest of us. Don't stand for it. Sign up for betas, give good feedback, make your voices heard.

And devs, please, please, please start instituting some form of testing for potential beta testers. Pick a demographic, get some baseline knowledge levels on certain subjects for that demographic, and quiz people. It won't work completely, but it should weed out the worst batch of idiots, the ones who can't even use Google.

Monday, November 7, 2011

EA, or, the Charmin Jock Strap of the Tech "Support" World.

Ok, EA. I get it. Big company, just released a new game, and your online distribution platform takes more work to push than heroin in Salt Lake City. But come on. The outsourcing couldn't be more obvious if the phone got answered "Thanks for calling the International House of Curry...". The first person you talk to knows roughly enough about computers to know they require electricity. The script-reading is as blatant as a first-grade school play.

Maybe if you had the common sense to not release Origin until it actually works as a Steam ripoff, things wouldn't be so rough right now. Yesterday, one of your "techs" was so moronic I asked if I could talk to somebody who understood third grade English and knew at least as much about computers as me. He didn't know enough English to be offended. I'd rather not be right about that, guys.

So, since I'm sure you're actively ignoring everything coming through normal channels as much as you're passively ignoring this blog, I'll tell you about how broken your stupid shit is here, where there's probably a higher chance of you hearing about it than from your bottom tier, $0.05/hr people.

For starters: There is absolutely zero excuse for requiring me to have Origin AND a browser open at the same damn time to play a game. In fact, this level of incompetence makes the shit in Dilbert sound sane. It was clearly invented by some jackass in upper management who thinks Minesweeper is a hardcore, competitive game.

Next up, we have the fact that Origin doesn't set its default download path to the drive it's installed on. In this day and age, that's beyond mandatory, and into "someone should get slapped across the face with their pink slip for getting this wrong" territory. And if you're really going to screw that up, there shouldn't also be a bug in the options menu that repeatedly fails to change the file path to the one the user designates, while still showing the new path as selected when you hit the install button.

Maybe these issues don't seem huge, but when you can't talk to somebody with a multiple digit IQ score about them, it starts to be a serious problem. In fact, you corporate morons should stop rolling in the money from releasing "Madden Clone Whatever Year" with zero technical changes, and hire a consultant who's actually played a video game in their life to tell you how you're being idiots.

So, on to today's tech support joys. I'm having a stupid software conflict where, for some reason, various third party voice chat apps don't want to work with Battlefield 3. This game kind of revolves around teamwork in the multiplayer modes, if you didn't know, so this is an intolerable problem. Luckily, the guy I talk to comprehends the concept of "escalate me to someone who knows more about computers than the monkey you evolved from". So he asks me for a DxDiag dump, copy/pasted into the live chat, and disappears, supposedly to escalate me, I guess...

Sorry, guy, but your silly little input limit on the live chat means I'd be copy-pasting and digging through for the spot I left off for about an hour. How about you take your Ctrl-C/Ctrl-V and cram it, and give me a way to upload that massive wall of text. And while you're at it, can I maybe be escalated in a rational amount of time? I've been waiting over an hour now. Last I heard was... "Amresh: Just copy and paste it." That was an hour and a half ago at this point. Maybe at least a confirmation I'm actually waiting on a human that didn't fail the Turing test? I know there's someone above you who isn't busy, because everyone else who contacted your support already mercy-killed themselves after the 73rd bash of their face against the brick wall you call "service". Cheers. Die in a fire.

Wednesday, October 26, 2011

PC World, Bad Article, Bad Benching.

Sorry, PC World, I know you're a big mainstream magazine and website, and I'm just a little independent blogger, but this article of yours blows. The opening discussion of SLI, Eyefinity, and multi-display support has more holes in it than my spaghetti strainer. Call it nit-picking if you want; I call it writing a halfway intelligent article. I'm insulted that you get paid to produce this pathetic, mindless drivel.

My two-year-old son does better fact-checking than you. He knows damn well whether his cup has juice, water, or milk in it, and he can tell the difference between cereal, cookies, and cake.

Where to begin? How about the part where you say you need SLI to run more than two displays with Nvidia cards? No, you just need two graphics cards. SLI is running multiple identical graphics cards together for a 3D performance boost. You can run two dissimilar cards together, not linked with a bridge, and use the extra outputs.

But hey, enough about the fun details. How about this fact: I'm the next best thing to an EVGA fanboy. The only thing I'm missing is the compulsion to suggest their products when they'd be inappropriate, or to otherwise stop being objective about things like price/performance. That means I invariably have Nvidia cards. And I STILL say your benchmarks are over-the-top biased, moronic, pathetic, idiotic, and otherwise pure and utter garbage.

Crysis 2? Dirt? A grand total of two games and a couple of synthetic benches? What the hell are you smoking? I would be embarrassed to call that research. And that's completely ignoring the part where you picked games that Nvidia tends to do well in. And the part where you benched the 560 Ti against a 6870, when a 6950 is much closer to the same price point, and the HD 6950 goes pretty much blow for blow against a 560 Ti.

Which card do I personally prefer? The 560 Ti, definitely. But your methodology is trash. Of course, you turn around and show your bipartisan idiocy by declaring the HD 6970 superior to the GTX 570 based on features. If you're running multi-display gaming, whether it's 3D Surround or Eyefinity, you probably don't want 5760x1080 running at low details, or at 3 frames per decade, which is what's going to happen if you run single-card Eyefinity. Maybe with iPhones for the displays.

Last but not least... GTX 570 vs HD 6970 is somehow the "mainstream gamer" market? That's hysterical. Last I checked, mainstream is usually... mainstream. The best gaming hardware census out there, the Steam survey, shows those two cards combining to make up roughly 5% of DX11-capable GPUs. That, of course, is ignoring all the DX9 and DX10 cards still in use since console ports have killed the hardware industry. The GTX 570 makes up 0.99% of all GPUs, and the 6970 falls under "Other", so who knows how much it makes up. That's not mainstream.

Article is laughable, guys. But at least it doesn't sound like advertising, since it's stupid all around. Or if it does, it's attempting to advertise direct competitors in the same article. Which would be in keeping with the quality you showed.

Wednesday, October 19, 2011

Sampling Different Types of Anti-Aliasing

Anyone who pays much attention to settings that can be forced in drivers or games has probably noticed by now that there's a whole lot of different types of Anti-Aliasing. The main two things people know about it though, are that it murders GPU and VRAM, and that it makes things look smoother. The method behind the madness, however, is generally regarded with the same apprehension usually reserved for Voodoo rituals.

In a recent post, I discussed the benefits of Anti-Aliasing and how they apply to the average gamer. That doesn't change the fact that there are a whole helluva heap of acronyms, and apparently some of them are easier than others, and they theoretically all do the same thing, in different ways, using different amounts of resources, and for some reason, we always refer to it in multiples, like x4, x8, and so on.

Just as a quick recap, aliasing is what we call it when a line that isn't directly vertical or horizontal is depicted in pixels and gets a little staircase effect. Anti-aliasing is just there to make your eye think that isn't happening, so things look pretty.

One of the earliest forms of AA in gaming was SuperSampling AA, or SSAA. It just happened to be a bit too brutal for the graphics cards of the day, and got phased out for a while, but is making a comeback now. Supersampling basically means rendering the scene at a higher resolution so that each pixel you'll see is composed of more pixels. These pixels then get blended based on various algorithms, which were either determined by throwing darts or by someone way smarter than me. There's also adaptive supersampling, which mostly seems to involve a combination of witchcraft and tarot to determine which pixels actually need to be fully supersampled, which means your GPU takes longer to explode trying to do all that work.
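The basic mechanism is simple enough to sketch in code. This is a toy illustration, not how any real driver does it: `render_pixel` is a made-up stand-in for the actual renderer, and we just average 2x2 blocks of a double-resolution image down into the final pixels.

```python
# Minimal sketch of supersampling: render at a higher resolution,
# then average each block of sub-pixels down to one final pixel.
# render_pixel() is a hypothetical stand-in for the real renderer.

def render_pixel(x, y):
    # Toy scene: a diagonal edge, white on one side, black on the other.
    return 255 if y < x else 0

def supersample(width, height, factor=2):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            # Average factor x factor sub-samples for this pixel.
            total = 0
            for sy in range(factor):
                for sx in range(factor):
                    total += render_pixel(px * factor + sx,
                                          py * factor + sy)
            row.append(total // (factor * factor))
        image.append(row)
    return image

# Pixels sitting right on the diagonal come out grey instead of a
# hard black/white step, which is the whole point.
image = supersample(4, 4)
```

Adaptive supersampling is, roughly, this same averaging run only on the pixels the driver decides actually sit on an edge, instead of on every pixel in the scene.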

FSAA, or Full Scene AntiAliasing, is just another name for Supersampling, since they needed something new to call it to not scare the pants off of people who watched SSAA turn games into slideshows back in the day.

MultiSampling AntiAliasing, or MSAA, one of the versions we see more often, is essentially a refinement of SSAA that uses less GPU horsepower by only sampling certain portions of textures and polygons, based on depth and location in the scene. As best I've managed to understand, the specifics imply some sort of mathematical formula involving the cosine of the square root of negative infinity minus pi. Or some such nonsense. Basically, it isn't quite as pretty, does part of the same job, and beats less of the shit out of your graphics card. Got it? Good, now help me figure it out, it gets more confusing every time I try to understand it.

Of course, there's still one thing we haven't covered. Where the hell does the x4, x8, etc. come from? Well, roughly, that tells it how many "samples" you want rendered for each pixel it decides needs them. Then it promptly goes back to the roulette wheel to decide which pixels to make prettier, and hey presto, it automagically looks better!
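To make the multiplier concrete, here's a hedged little sketch. The sample positions here are random jitter for simplicity; real GPUs use fixed, carefully chosen patterns. The point is just that x4 means four sample points per pixel, x8 means eight, and more samples means more possible shades at an edge.

```python
# What the x4 / x8 actually buys you: with N samples per pixel, an
# edge pixel can only take N+1 distinct shades. More samples means
# finer steps and a smoother-looking staircase.
import random

def coverage(px, py, inside, samples):
    """Fraction of sample points in pixel (px, py) that land inside
    the shape. `inside` is a hypothetical hit-test function."""
    rng = random.Random(0)  # fixed seed so results are repeatable
    hits = 0
    for _ in range(samples):
        if inside(px + rng.random(), py + rng.random()):
            hits += 1
    return hits / samples

# Toy shape: everything on one side of the line y = x.
below_line = lambda x, y: y > x

c4 = coverage(0, 0, below_line, 4)   # one of 0, .25, .5, .75, 1
c8 = coverage(0, 0, below_line, 8)   # steps of .125 instead
```

So an 8x edge pixel can land on shades a 4x edge pixel simply can't represent, which is why higher multipliers look smoother and cost more.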

I hope this has been either educational or entertaining. If not, go back and re-read the parts that confused you (paragraphs 1-7?) while I go take a Tylenol.

Saturday, October 15, 2011

Anti-Aliasing: GPU Murder, Justified, or No?

Anti-Aliasing, multi-sampling, 8x, 16x? Analytical? FSAA? What the hell is this shit, why can't I max it, and what am I looking for that makes it worth turning my GPU into an EZ-Bake oven? It can be kind of annoying when you're beating your framerate into the floor for something that's pretty hard to spot specifically unless you know what to look for.

Lucky for you, you don't need to look any further to find at least some of the answers. Simply put, AA is designed to make angled lines on square pixels not look like a staircase. Since each pixel can only have one RGB value at any given time, you can't just have it be half-and-half with two different colors. The way anti-aliasing fixes this, roughly, is to take raw texture data for some or all of the pixels involved and blend the colors in a way that lends itself to both sides of the line, making the transition appear smooth.
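That blend is simple enough to write down. A minimal sketch, assuming the coverage fraction comes from whatever AA method is in use:

```python
# Sketch of the blend described above: if an edge covers, say, half
# of a pixel, the final color is half of one side's color and half
# of the other's. Colors are (R, G, B) tuples; `coverage` is assumed
# to be supplied by the AA algorithm.

def blend(color_a, color_b, coverage):
    """Mix two RGB colors; coverage is the fraction of the pixel
    covered by color_a (0.0 to 1.0)."""
    return tuple(
        round(a * coverage + b * (1 - coverage))
        for a, b in zip(color_a, color_b)
    )

white = (255, 255, 255)
black = (0, 0, 0)

# A half-covered pixel on the edge comes out mid-grey, which is what
# tricks your eye into seeing a smooth line.
edge_pixel = blend(white, black, 0.5)
```

Every AA method on the box, from SSAA to MSAA, ultimately ends in some version of this mix; they just differ in how much work they spend figuring out the coverage.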

One of the reasons this is difficult to spot for a lot of people is pixel density, which just refers to how many pixels are packed into a given area. A 22" 1920x1080 display has the same number of pixels to play with as a 30" 1920x1080 display. That means that the bigger your display, the larger each pixel, which can exaggerate the staircase shape on angled lines.
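The arithmetic behind that is quick to check. Pixel density is usually quoted in PPI (pixels per inch): the diagonal of the pixel grid divided by the physical diagonal of the screen.

```python
# Pixel-density math for the claim above: the same 1920x1080 grid
# spread over different diagonal sizes gives different PPI.
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: pixel diagonal over physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Same resolution, bigger screen, bigger pixels, more visible stairs.
small = ppi(1920, 1080, 22)   # roughly 100 PPI
big = ppi(1920, 1080, 32)     # roughly 69 PPI
```

Same pixel count, but the 32" panel's pixels are almost half again as large, which is exactly why the staircase shows up more on big screens.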

The fun thing here is that the better your pixel density, the less you need AA. You might need 4x or so at 22" 1080p, but at 32", you should be trying to max it to get lines that look similarly smooth. The different varieties and multiples involved, like 8xMSAA or 4xSSAA, are just different methods that can be used to determine what the resulting pixel data will be.

0xAA
4xAA
8xAA
These three images were originally taken in 1080p from Dragon Age 2, DX11, Highest quality everything, with the only thing changing in each shot being the level of Anti-Aliasing, as seen in the captions.

Now that I've explained anti-aliasing, spotting it should be easy, right?

I'm guessing a lot of you out there actually have to work really hard to spot it, right? Like I said, the smaller the pixel, the smaller the square shape, and the smoother the line, even without anti-aliasing.

On my 32" TV, I can barely see the difference between 4x and 8x at desk viewing distance.


So, now that we've gone over exactly how big of a difference it makes, you can see that trying to max this setting may not really matter for your overall gaming experience. Given the massive GPU horsepower needed for high AA with decent quality textures, it can be an expensive prospect to truly max every game that comes out.

But, if you can live without something you can't see or can barely see, you can generally get by with a lot less power. The exact difference will vary a good bit from game to game, but frequently, if you're getting slightly jerky FPS at 8xAA, dropping to 4xAA will get you fairly smooth, in my experience. That's a big performance difference for a barely visible change.

Now obviously, this is one of those things where everyone needs to draw their own conclusions. I'll keep using insane PCs, because getting performance out of them is half the fun. But if you want performance on a budget, check the AA settings used in benchmarks, and you might just find you'll be ok with a card whose bars aren't quite as long, since all you're really giving up is a setting you can barely see.

Practical DX11, What It Does, and What It Means

As we all know, the current generation of Microsoft's ubiquitous and ambiguous DirectX runtime is DX11. Just to make that sound less like techno-babble designed to keep software engineers in a job, DirectX is the overall software package for handling graphics in Windows. (There's also some sound stuff, but that's outside the scope of this article.) Since DirectX includes features for 3D gaming, it can be kind of important to gamers and enthusiasts to have hardware supporting at least recent versions.

Most games right now, mind you, only require DX9 to play, thanks to gaming's red-headed step brother: consoles. DX10, well, yeah, that got released at some point, and did some stuff, and nobody really gave a shit. DX11, on the other hand, has this wonky tessellation, which mostly sounds like it fell out of a science fiction novel, possibly as some really painful way to die.

Basically, to explain the portions of tessellation that make sense beyond "ooh, shiny" (without two or three different degrees in graphics design, software, and who knows what else): it's just a different way of representing shapes. Tessellation, roughly, lets you break shapes down into smaller shapes. The primary use for it is smoothing out certain kinds of detail.

Roughly, tessellation breaks shapes down into tiny triangles, letting you make what's called a displacement map. To make this make sense, do you remember the little plexiglass box toy with a whole lot of pins that could slide up and down, so when you set it on top of something, from the top it would be a 3-D representation of the object?

Roughly, this is what displacement mapping does in 3D rendering. Not exactly, but it lets you take a shape and break it down into smaller shapes that can be worked with like this. That way, as you get closer to something, it can render its way toward a full model, instead of trying to render all the models and textures in full detail all the way out to max viewing distance.
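A crude sketch of the pin-box idea: one round of subdivision splits a triangle into four smaller ones, and a displacement map then pushes each new vertex up or down. The `heightmap` function here is a made-up stand-in for the real displacement texture.

```python
# Toy sketch of tessellation plus displacement: midpoint subdivision
# turns one triangle into four, then each vertex gets moved along z
# by a height sampled from a (hypothetical) displacement map.

def midpoint(a, b):
    return tuple((p + q) / 2 for p, q in zip(a, b))

def tessellate(tri):
    """Split one triangle (three (x, y, z) vertices) into four."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def displace(tri, heightmap):
    """Move each vertex along z by the height sampled at (x, y)."""
    return tuple((x, y, z + heightmap(x, y)) for x, y, z in tri)

flat = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
bump = lambda x, y: 0.1 * x * y  # made-up height function

# One pass quadruples the triangle count; each extra vertex can then
# pick up its own displacement, adding surface detail like the pins.
detailed = [displace(t, bump) for t in tessellate(flat)]
```

Run the subdivision again on each piece and the "pins" get finer, which is exactly how the GPU can dial detail up close to the camera and down far away.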

I hope this makes some sense, this is all sounding smarter in my head than it looks once I type it, but I can't decide if lack of caffeine is affecting my reading or my writing. Maybe I'm just crazy. Or all of the above.




Don't get me wrong, this is a very basic and generalized explanation, which by nature will be somewhat wonky. If you want to read the fancy version, you can check it out here.

Ok, so, here's the question: what does this mean for me? Smoother textures and better poly mapping, making things look better all around, for starters. Pretty simple, and nice. The second major thing you'll see out of it is textures being able to look good up close without soaking up a ton of GPU resources at a distance. No, this doesn't sound huge, but what it does is allow an increased view distance, since it can avoid rendering so much detail far away, meaning less stuff pops out of thin air.

I hope this has cleared this up reasonably well for some people, I know it's not that useful for some, but understanding graphics can help you understand what sort of system you personally need for the graphics quality you desire. I'm hoping to do some other similar ones over the next few days, so stay tuned.