Monday, November 14, 2011

Beta Testers, the Bane of Gamers.

So this weekend I got to spend some time on the Star Wars Old Republic beta. Obviously, it's still under NDA, so I'm going to stick with discussing things that are officially released common knowledge, plus the testers themselves.

Fact: a lot of people testing for Bioware don't have a clue how this works. The game releases in what, a month and change? So why would you suggest changes to the underlying structure of major systems, like space combat? I'm pretty sure there might have been a design meeting at some point where they discussed the merits of rail shooter vs open space combat. Do I wish it was more like Tie Fighter and X-Wing? Yes, of course, I loved those games. Do I think it's going to look like that in 40 days when it's nothing like that now? No, I don't know where to find that good of drugs. Stick to useful feedback.

Second: I don't care that you liked KOTOR. SW:TOR is not KOTOR. Doesn't matter that it's Bioware with a lot of similarities. If it was KOTOR 3, they probably would have named it KOTOR 3. It's not a single player game. It won't be a single player game. And come on people, grow a damn brain stem already. MMO does NOT mean WoW clone. Get a damn clue. It's a type of multiplayer experience. Nobody asking them not to fuck it up by turning it into a single player game where we pay the upkeep on the DRM servers (which is what the KOTOR fanboys are asking for) wants this game to be WoW. We just want the MMO to have the MM in it. It's easier to ignore other people in an MMO setting than it is to get an MMO experience out of a single player game.

Third: Running from point A to point B won't kill you. You don't need to start off with a speeder bike. Hell, try doing some of the quests, you might even like the game if you do the content that isn't the storyline. Oh, but wait, this is a single player game with inconvenient other people in it.

And here's a big one, folks. Quit trying to give technical feedback if your PC makes the original Game Boy look hot. I literally saw one idiot talking about the "amazing" graphics. After he upgraded to a graphics card that let him play in native resolution. That's right, folks. He's used to every game looking like Wolfenstein 3D, so he must know what good graphics are when he sees them! The graphics, as is, are objectively crap, and the performance was pretty bad even with those graphics. Yes, I've seen worse, and it didn't detract too much from the gameplay, but that doesn't mean it performed well or looked all that pretty. Don't give feedback above your level of ignorance. You say opinion, I say you're too stupid to recognize an objective fact if it bit you on the ass.

If this is the sort of feedback designers get from their betas, and the sort of people they inevitably listen to, it's hard to blame the devs for the plethora of mediocre games lately. Let's blame the twelve year old kids who haven't played anything besides WoW, on their dad's old graphing calculators. They're destroying games for the rest of us. Don't stand for it. Sign up for betas, give good feedback, make your voices heard.

And devs, please, please, please start instituting some form of testing for potential beta testers. Pick a demographic, get some baseline knowledge levels on certain subjects for that demographic, and quiz people. It won't work completely, but it should weed out the worst batch of idiots, the ones who can't even use Google.

Monday, November 7, 2011

EA, or, the Charmin Jock Strap of the Tech "Support" World.

Ok, EA. I get it. Big company, just released a new game, and your online distribution platform takes more work to push than heroin in Salt Lake City. But come on. The outsourcing couldn't be more obvious if the phone got answered "Thanks for calling the International House of Curry...". The first person you talk to knows roughly enough about computers to know they require electricity. The script following is as blatant as a first grade school play.

Maybe if you had the common sense to not release Origin until it actually works as a Steam ripoff, things wouldn't be so rough right now. Yesterday, one of your "techs" was so moronic I asked if I could talk to somebody who understood third grade English and knew at least as much about computers as me. He didn't know enough English to be offended. I'd rather not be right about that, guys.

So, since I'm sure you're actively ignoring everything coming through normal channels as much as you're passively ignoring this blog, I'll tell you about how broken your stupid shit is here, where there's probably a higher chance of you hearing about it than from your bottom tier, $0.05/hr people.

For starters: There is absolutely zero excuse for requiring me to have Origin AND a browser open at the same damn time to play a game. In fact, this level of incompetence makes the shit in Dilbert sound sane. It was clearly invented by some jackass in upper management who thinks Minesweeper is a hardcore, competitive game.

Next up, we have the fact that Origin doesn't set its default download path to the drive it's installed on. In this day and age, that's beyond mandatory, and into "someone should get slapped across the face with their pink slip for getting this wrong" territory. And if you're really going to screw that up, there shouldn't be a bug in the options menu that can repeatedly fail to change the filepath to the one the user designates, while indicating it changed when you hit the install button.

Maybe these issues don't seem huge, but when you can't talk to somebody with a multiple digit IQ score about them, it starts to be a serious problem. In fact, you corporate morons should stop rolling in the money from releasing "Madden Clone Whatever Year" with zero technical changes, and hire a consultant who's actually played a video game in their life to tell you how you're being idiots.

So, on to today's tech support joys. I'm having a stupid software conflict where for some reason various third party voice chat apps don't want to work with Battlefield 3. This game kind of revolves around teamwork in the multiplayer modes, if you didn't know. This is an intolerable problem. Luckily, the guy I talk to comprehends the concept of "Escalate me to someone who knows more about computers than the monkey you evolved from". So he asks me for a DxDiag dump, copy/pasted into the live chat, and disappears, supposedly to escalate me I guess...

Sorry, guy, but your silly little input limit on the live chat means I'd be copy pasting and digging through for the spot I left off for about an hour. How about you take your Ctrl-C/Ctrl-V and cram it, and give me a way to upload that massive wall of text. And while you're at it, can I maybe be escalated in a rational amount of time? I've been waiting over an hour now. Last I heard was... "Amresh: Just copy and paste it." That was an hour and a half ago at this point. Maybe at least a confirmation I'm actually waiting on a human that didn't fail the Turing test? I know there's someone above you who isn't busy, because everyone else who contacted your support already mercy-killed themselves after the 73rd bash of their face against the brick wall you call "Service". Cheers. Die in a fire.

Wednesday, November 2, 2011

Origin vs Steam, or, What?

Ok, EA, some tips. If you're going to pull all your games down from the best online content provider for gamers, you might want to have a viable replacement. For some weird reason, I don't consider it very good when I get ~70% of the download speeds I get from Steam. Yeah, it's probably good that more people are looking into digital retail channels for games. Even though roughly 0% of us believe that "convenience" is actually anything more than a weak excuse for anti-piracy DRM, it's still convenient.

But please, do it properly. Another thing: you should probably have a system that isn't clunky, awkward, and stupid to replace the handy Steam client. Hate to tell you this, but centralizing us on a damned web browser, on a site that loads slower than Steam pages, and even forcing us to LAUNCH from the web browser, is kind of annoying. Especially when we have to relaunch the game to switch servers.

I mean, is it really so hard for gaming's monopolistic, monolithic evil empire to hire maybe ONE whole person who actually plays games in some sort of decision making capacity? I mean, this is in the category of idiocy. The only reason people use Origin for online purchases at all is because you force us to. Here's a news flash. We like Steam as a whole service, not just as a store. You didn't improve on it, and you suck.

By the way, if you try to take away my access to my games because you don't like this opinion, like you did to the one guy after DA2 came out, I promise to sue you so bad your stockholders' mamas feel it.

Friday, October 28, 2011

Keyboards, Had Enough, Popped my Cherry.

If that title doesn't sound like a delicious pun to you, and instead sounds dirty, A: get your mind out of the gutter, freak, and B: you clearly don't keep up with computer stuff in the neurotic way I do.

Here's some simple backstory. About 6-8 months ago, my old cheapo $20 Microsoft rubber dome keyboard died. I decided I wanted something all pretty and backlit, so I bought the Razer Lycosa Mirror. Now, theoretically, all their issues (supposedly hardware issues from a bad production run) had been fixed and the bad units recalled, so the one I bought should have been fine.

So, within a couple of days, my fancy keyboard is spastically attacking me by loading a new instance of Windows Media Player roughly every 7 seconds, relegating whatever window I happened to be looking at to the background, and generally being a pain in the ass until I hard rebooted or unplugged the keyboard. Funny thing, my media player set up via the Razer software was iTunes.

I contacted Razer support. They told me "Oh, that's a hardware issue, initiate RMA." Well, it didn't seem much like a hardware issue to me, because it was easily fixed by forcing the drivers to reload, and would randomly occur at another time, whether 5 seconds or 2 days. So, I contacted them again, and told them it felt like software. After quite a bit of insanity, I managed to convince them their Lycosa software was broken, as evidenced by the fact that uninstalling it and running off of generic drivers fixed 100% of my hardware problems.

So, as the weeks go on, they keep contacting me, asking me for more information, since I'm clearly more competent than their QA/QC group, and eventually the software is actually reasonably safe to use. Of course, by this time, my audio and USB passthroughs in the keyboard had died, but hey, I'm a real geek, those are gimmicks I don't need or care about much, I get better signal quality when I'm plugged straight in anyway.

So, last night rolls around. My not so cheap keyboard, theoretically good for half a quadrillion keystrokes or some such, decides that it's time to go on strike. Specifically, my spacebar. Since I do occasionally type more than URLs, this seemed rather inconvenient. A look at the key showed me that the rather cheaply designed (read: piece of shit) spring wasn't going to function again. The little retaining clips that made the stupid thing stay attached were made out of the same plastic as those little green army men I used to play with, except less of it.

Well, that was the final straw. Something about owning an $80 "gaming" keyboard for less than a year and having more headaches than the morning after an Alcoholics Anonymous class reunion with an open bar just kind of pissed me off. So, when I drove down to my friendly neighborhood Fry's, I wasn't looking at anything Razer, and avoided all the rubber domes. I'm now fighting to get my hands retrained to work with my SteelSeries 6GV2, which uses nice mechanical Cherry MX Black switches.

I already love it; the return to an old-school feeling keyboard is like a slice of nostalgia that doesn't come with drawbacks like 8-bit sound and 640x480 max resolution. Granted, I'm making a few more typos than usual, but that will change soon enough.

Of course, you can't expect Razer to make a keyboard this solid. If they did, they'd be putting a perfect blunt object into the hands of the people with the most cause to want them dead: their customers. So, Razer, go soak your heads until you figure out that quality control means more than keeping half your stock in a warehouse to be able to handle the flood of RMA requests.

Wednesday, October 26, 2011

PC World, Bad Article, Bad Benching.

Sorry, PC World, I know you're a big mainstream magazine and website, and I'm just a little independent blogger, but this article of yours blows. The opening discussion of SLI, Eyefinity, and multi-display support has more holes in it than my spaghetti strainer. Call it nit-picking if you want, I call it writing a halfway intelligent article. I'm insulted that you get paid to produce this pathetic, mindless drivel.

My two year old son does better fact checking than you. He knows damn well whether his cup has juice, water, or milk in it, and he can tell the difference between cereal, cookies, and cake.

Where to begin? How about the part where you say you need SLI to run more than two displays with Nvidia cards? How about just needing two graphics cards. SLI is running multiple identical graphics cards together for a 3D performance boost. You can run two dissimilar cards together, not linked with a bridge, and use the extra outputs.

But hey, enough about the fun details. How about this fact. I'm the next best thing to an EVGA fanboy. The only thing I'm missing is the compulsion to suggest their products when they'd be inappropriate, or otherwise not be objective about things like price/performance. That means I invariably have Nvidia cards. And I STILL say your benchmarks are over the top biased, moronic, pathetic, idiotic, and otherwise pure and utter garbage.

Crysis 2? Dirt? A grand total of two games and a couple of synthetic benches? What the hell are you smoking? I would be embarrassed to call that research. And that's completely ignoring the part where you picked games that Nvidia tends to do well in. And the part where you benched the 560 Ti against a 6870, when a 6950 is much closer to the same price point. The HD 6950 goes pretty much blow for blow against a 560 Ti.

Which card do I personally prefer? 560 Ti, definitely. But your methodology is trash. Of course, you turn around and show your equal-opportunity idiocy by declaring the HD 6970 superior to the GTX 570 based on features. If you're running multi-display gaming, whether it's 3D Surround or Eyefinity, you probably don't want 5760x1080 running at low details, or at 3 frames per decade. Which is what's going to happen if you run single card Eyefinity. Maybe with iPhones for the displays.

Last but not least... GTX 570 vs HD 6970 is somehow the "mainstream gamer" market? That's hysterical. Last I checked, mainstream is usually... mainstream. The best gaming hardware census out there, the Steam survey, shows those two cards combining to make up roughly 5% of DX11 capable GPUs. That, of course, is ignoring all the DX9 and DX10 cards still in use since console ports have killed the hardware industry. The GTX 570 makes up 0.99% of all GPUs, and the 6970 falls under "Other", so who knows how much it makes up. Last time I checked, that's not mainstream.

Article is laughable, guys. But at least it doesn't sound like advertising, since it's stupid all around. Or if it does, it's attempting to advertise direct competitors in the same article. Which would be in keeping with the quality you showed.

Rainmeter, Customize your Desktop.

If you spend much time at your computer, you probably occasionally wish there was a way to stick certain things on your desktop so you could keep an eye on them while you do something else. If you're like me, you've noticed that Microsoft desktop gadgets are invariably ugly, and are massive RAM whores, like everything else they've ever made. Gadgets are also violently limited by the fact that they suck worse than a 10 cent hooker.

So how can you fix this? Well, the good people over at www.rainmeter.net have a solution for you. It's a free desktop scripting program. It runs fairly lightly in the background, comes with tons of default themes, and can be modified heavily. Track anything from your system resource usage to your Google Calendar to your webmail inbox. Have a clock, a regular calendar, notes, an RSS reader, and a media player controller. All on your desktop. You can set it up so you can click through to your icons, you can make it hide on mouseover, you can arrange it all to your liking, and you can make your own skins if you can't find fifteen or twenty that do what you want.

It's not really all that complicated to use, whether you just want basic download plug and play stuff, or if you want to tinker. Plug and play is done with an installer, and it even has a self extractor that can work on skins and themes that are set up properly.

If you want to tinker, you open up your skins folder, make a new text document, save it as a .ini file, and edit away. There's tons of instructions available, and with all the skins already out there, there's tons of reference material available. Yesterday I modded a skin rather heavily to fit in with my new desktop background of a sexy EVGA motherboard. Today, I'm making a better CPU monitor for it.
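To give you an idea of what you're actually editing, here's a minimal sketch of a skin. The section names (MeasureCPU, MeterCPUText) and the file location are just names I picked; the Measure=CPU and Meter=String options are standard Rainmeter fare, but treat this as a rough starting point, not gospel.

```ini
; Minimal CPU readout skin -- save as something like
; Skins\CPUSketch\CPUSketch.ini and load it from the Rainmeter tray icon.

[Rainmeter]
Update=1000              ; refresh once a second

[MeasureCPU]
Measure=CPU              ; built-in measure, reports total CPU usage 0-100

[MeterCPUText]
Meter=String
MeasureName=MeasureCPU
FontSize=12
FontColor=255,255,255
AntiAlias=1
Text=CPU: %1%            ; %1 is the value of the first bound measure
```

From there it's just more measures and more meters. Crack open any downloaded skin and you'll see the same pattern repeated.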

With all my stuff running, with all the monitors that have to update quickly, my Rainmeter is only using 18MB of RAM. In this day and age, that's nothing. Literally. So get it, try it, and play with it. If you don't like it, you don't have to use it, but there's no excuse for not seeing what it can do for you.

Here is a link to the skin I customized yesterday if you want to look. Try it, have fun with it. (Note: requires Rainmeter to work. Otherwise it's just a bunch of plaintext files.)

Here is a link to the self-extracting Rainstaller file for the skin pack.

This pretty picture displays it without needing Rainmeter.

Wednesday, October 19, 2011

Sampling Different Types of Anti Aliasing

Anyone who pays much attention to settings that can be forced in drivers or games has probably noticed by now that there's a whole lot of different types of Anti-Aliasing. The main two things people know about it though, are that it murders GPU and VRAM, and that it makes things look smoother. The method behind the madness, however, is generally regarded with the same apprehension usually reserved for Voodoo rituals.

In a recent post, I discussed the benefits of Anti-Aliasing and how they apply to the average gamer. That doesn't change the fact that there are a whole helluva heap of acronyms, and apparently some of them are easier than others, and they theoretically all do the same thing, in different ways, using different amounts of resources, and for some reason, we always refer to it in multiples, like x4, x8, and so on.

Just as a quick recap, Aliasing is what we call it when a line that isn't directly vertical or horizontal is depicted in pixels, and gets a little staircase effect. Anti-Aliasing is just there to make your eye think that isn't happening, so things look pretty.

One of the earliest forms of AA in gaming was SuperSampling AA, or SSAA. It just happened to be a bit too brutal for the graphics cards of the day, and got phased out for a while, but is making a comeback now. Supersampling basically means rendering the scene at a higher resolution so that each pixel you'll see is composed of more pixels. These pixels then get blended based on various algorithms, which were either determined by throwing darts or by someone way smarter than me. There's also adaptive supersampling, which mostly seems to involve a combination of witchcraft and tarot to determine which pixels actually need to be fully supersampled, which means your GPU takes longer to explode trying to do all that work.
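To make the averaging part concrete, here's a toy sketch in Python. This is just the box-filter idea (every 2x2 block of the oversized render averages down to one final pixel); real drivers use fancier sample patterns and weights, but the principle is the same.

```python
# Toy 2x supersampling: render at double resolution, then average each
# 2x2 block down to one final pixel (a simple "box filter").

def downsample_2x(hi_res):
    """Average 2x2 blocks of grayscale values into one pixel each."""
    height, width = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            total = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A hard diagonal edge at 2x resolution: 0 = black, 255 = white.
hi = [
    [255, 255, 255, 255],
    [  0, 255, 255, 255],
    [  0,   0, 255, 255],
    [  0,   0,   0, 255],
]
print(downsample_2x(hi))  # pixels the edge crosses land between 0 and 255
```

The pixels the edge passes through come out gray instead of snapping to black or white, which is exactly the smoothing your eye perceives.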

FSAA, or Full Scene AntiAliasing, is just another name for Supersampling, since they needed something new to call it to not scare the pants off of people who watched SSAA turn games into slideshows back in the day.

MultiSampling AntiAliasing, or MSAA, one of the versions we see more often, is essentially a refinement of SSAA that uses less GPU horsepower by only sampling certain portions of textures and polygons, based on depth and location in the scene. The best I've managed to understand of the specifics implies some sort of mathematical formula involving the cosine of the square root of negative infinity minus pi. Or some such nonsense. Basically, it isn't quite as pretty, does part of the same job, and beats less of the shit out of your graphics card. Got it? Good, now help me figure it out, it gets more confusing every time I try to understand it.

Of course, there's still one thing we haven't covered. Where the hell does the x4, x8, etc. come from? Well, roughly, that tells it how many "samples" you want rendered for each pixel it decides needs them. Then it promptly goes back to the roulette wheel to decide which pixels to make prettier, and hey presto, it automagically looks better!

I hope this has been either educational or entertaining; if not, go back and re-read the parts that confused you (paragraphs 1-7?) while I go take a Tylenol.

Tuesday, October 18, 2011

Anatomy of an E-Sport

E-Sports are starting to be something that people occasionally hear about. Big international gaming tournaments, with prizes of thousands, or tens of thousands of dollars. That's a lot of damn money for a video game. Especially if you're a top player, salaried, on a team, we ain't in momma's basement no more.

Just a couple of days ago, I was sitting in a sports bar, watching the finals of Major League Gaming's Orlando event. Dozens of other people there, cheering for some awesome Starcraft 2 players, having a couple of beers, yelling, big screen TVs, the works. This is getting big, and with the increasing emphasis on technology in the modern world, it's not going anywhere.

So what makes for a game that can be an E-Sport? We have various games to look at, Halo, Call of Duty, League of Legends, DOTA, HoN, Starcraft 2. Not all of these games seem to have much in common. From the shooters, to the real time strategy, to the Multiplayer Online Battle Arena.

The first factor they need, obviously, is a means of direct competition. A means of pitting people against each other, rather than just the game. This lets you see whose tactics, strategy, mechanics, and game knowledge are actually superior in a tangible sort of way. Team play isn't required, but it does add some depth to some games.

Next, they need little to no random factors. People don't watch professional Yahtzee, for some weird reason. Might have something to do with the fact that random chance can have too much of an adverse effect on skill. I still don't have a clue why people enjoy watching poker tournaments. Sure, there's some cool dynamics, but when the best player can be crushed by the vagaries of fate, it's kind of detrimental to enjoyment, at least for me.

Another important notion is a high skill cap. If it's easy to keep pace and do everything, it takes the challenge out. This is why certain games, particularly fighters and shooters, will occasionally put limits on what is allowed in competitive play, whether it's disallowing specific weapons, characters, or anything else. This is there so that something that requires demonstrably less skill to be effective doesn't skew the competition by forcing everyone to use the stupid overpowered stuff to be competitive at all.

Also important is a certain amount of tension. There needs to be some sort of edge, a palpable means of pressure building on the players. American football has third and long, or fourth and inches, field goals, and other moments where one exceptional play can make or break a team. If the game is capable of hanging by a thread, or balancing on a narrow ledge, where it can go either way at any second, it draws in attention.

Finally, for an E-Sport to be successful, there needs to be some sort of exterior community. People gathering, whether it's online or in sports bars, at tournaments or in houses, there needs to be sufficient gatherings of people to draw the sponsors. Without sponsors, you don't have player salaries, you don't have big tournament payoffs.

There may be other factors you can think of, but this is, I think, the true essence of an E-Sport. Just about anything that combines these factors is likely to succeed.

Saturday, October 15, 2011

Anti Aliasing, GPU Murder, Justified, or No?

Anti-Aliasing, multi-sampling, 8x, 16x? Analytical? FSAA? What the hell is this shit, why can't I max it, and what am I looking for that makes it worth turning my GPU into an EZ-Bake oven? It can be kind of annoying when you're beating your framerate into the floor for something that's pretty hard to spot specifically unless you know what to look for.

Lucky for you, you don't need to look any further to find at least some of the answers. Simply put, AA is designed to make an angled line on square pixels not look like a staircase. Since each pixel can only have one RGB value at any given time, you can't just have it be half and half with two different colors. The way Anti-aliasing fixes this, roughly, is to take raw texture data for some or all of the pixels involved, and blend the colors in a way that lends itself to both sides of the line, making the transition appear smooth.
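As a toy illustration (my own simplification, not any specific hardware algorithm), you can think of each edge pixel getting a coverage-weighted mix of the colors on either side of the line:

```python
# Toy edge blending: a pixel the line passes through gets a mix of the
# two side colors, weighted by how much of the pixel each side covers.
# This is a simplification of what AA hardware actually computes.

def blend(color_a, color_b, coverage_a):
    """coverage_a: fraction of the pixel covered by side A (0.0 to 1.0)."""
    return tuple(round(a * coverage_a + b * (1 - coverage_a))
                 for a, b in zip(color_a, color_b))

black, white = (0, 0, 0), (255, 255, 255)
print(blend(black, white, 1.0))   # fully covered by black: stays black
print(blend(black, white, 0.5))   # split down the middle: mid gray
```

Instead of a pixel snapping to all-black or all-white, the half-covered ones come out gray, and the hard stairstep turns into a gradient.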

One of the reasons this is difficult to spot for a lot of people is pixel density. Most displays have fairly high density, which basically just refers to how many pixels are packed into a given area. A 22" 1920x1080 display has the same number of pixels to play with as a 30" 1920x1080 one. That means that the bigger your display, the larger each pixel, which can exaggerate the staircase shape on angled lines.

The fun thing here is that the better your pixel density, the less you need AA. You might need 4x or so at 22" 1080p, but at 32", you should be trying to max it to get lines that look similarly smooth. The different varieties and multiples involved, like 8xMSAA or 4xSSAA, are just different methods that can be used to determine what the resulting pixel data will be.

0xAA
4xAA
8xAA
These three images were originally taken in 1080p from Dragon Age 2, DX11, Highest quality everything, with the only thing changing in each shot being the level of Anti-Aliasing, as seen in the captions.

Now that I've explained anti-aliasing, spotting it should be easy, right?

I'm guessing a lot of you out there actually have to work really hard to spot it, right? Like I said, the smaller the pixel, the smaller the square shape, the smoother the line, even without anti-aliasing.

On my 32" TV, I can barely see the difference between 4x and 8x at desk viewing distance.
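If you want to put numbers on that, pixel density is usually measured in pixels per inch (PPI): the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch:

```python
import math

# Pixels per inch: length of the screen's pixel diagonal divided by
# the physical diagonal of the panel.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

# Same 1080p resolution, different panel sizes:
print(round(ppi(1920, 1080, 22)))   # ~100 PPI on a 22" monitor
print(round(ppi(1920, 1080, 32)))   # ~69 PPI on a 32" TV
```

Roughly 30% fewer pixels per inch on the 32" panel means every staircase step is that much bigger, which is exactly why the bigger screen wants more AA for the same smoothness.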


So, now that we've gone over exactly how big of a difference it makes, you can see that trying to max this may not really matter for your overall gaming experience. Given the massive GPU horsepower needed for high AA with decent quality textures, it can be an expensive prospect to truly max every game that comes out.

But, if you can live without something you can't see or can barely see, you can generally get by with a lot less power. The exact difference will vary a good bit from game to game, but frequently, if you're getting slightly jerky FPS at 8xAA, dropping to 4xAA will get you fairly smooth, in my experience. That's a big performance difference for a barely visible change.

Now obviously, this is one of those things where everyone needs to draw their own conclusions. I'll keep using insane PCs, because getting performance out of them is half the fun. But for people wanting performance on a budget, check the AA used in benchmarks, and you might just find that you'll be ok with a card that doesn't look quite as pretty if all you see is how long the bars are.

Practical DX11, What It Does, and What It Means

As we all know, the current generation of Microsoft's ubiquitous and ambiguous DirectX runtime is DX11. Just to make that sound less like techno-babble designed to keep software engineers in a job, DirectX is the overall software package for handling graphics in Windows. (There's also some sound stuff, but that's outside the scope of this article.) Since DirectX includes features for 3D gaming, it can be kind of important to gamers and enthusiasts to have hardware supporting at least recent versions.

Most games right now, mind you, only require DX9 to play, due to the red-headed step brother, consoles. DX10, well, yeah, that got released at some point, and did some stuff, and nobody really gave a shit. DX11, on the other hand, has this wonky tessellation, which mostly sounds like it fell out of a science fiction novel, possibly as some really painful way to die.

Basically, to explain the portions of tessellation that really made sense beyond "ooh shiny" without two or three different degrees in graphics design, software, and who knows what else, it's just a different way of representing shapes. Tessellation, roughly, lets you break shapes down into smaller shapes. The primary use for it is for smoothing out certain kinds of detail.

Roughly, tessellation breaks shapes down into tiny triangles, letting you make what's called a displacement map. To make this make sense, do you remember the little plexiglass box toy with a whole lot of pins that could slide up and down, so when you set it on top of something, from the top it would be a 3-D representation of the object?

Roughly, this is what displacement mapping does in 3-D rendering. Not specifically, but it lets you take a shape and break it down into smaller shapes that can be worked with like this. That way, as you get closer to something, it can render its way toward a full model, instead of trying to render all the models and textures in full detail all the way out to max viewing distance.
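Here's a deliberately dumbed-down sketch of the subdivide-then-displace idea, in 1-D. The bump function is made up for illustration; real tessellation subdivides triangles and samples a displacement texture, but the two steps are the same: add vertices, then push them around.

```python
# Toy displacement mapping in 1-D: tessellate a flat edge into more
# vertices, then lift each vertex by a "height map" value, like the
# pins in the pin-box toy.

def tessellate(p0, p1, segments):
    """Split the edge from p0 to p1 into evenly spaced vertices."""
    return [p0 + (p1 - p0) * i / segments for i in range(segments + 1)]

def displace(xs, height_map):
    """Lift each vertex by the height map sampled at its position."""
    return [(x, height_map(x)) for x in xs]

# A flat edge from x=0 to x=1, split into 4 segments, displaced by a bump
# that peaks at 1.0 in the middle.
bump = lambda x: 4 * x * (1 - x)
print(displace(tessellate(0.0, 1.0, 4), bump))
```

Crank `segments` up and the same flat edge turns into a smooth curve; crank it down for distant objects and you spend almost nothing. That's the whole trick.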

I hope this makes some sense, this is all sounding smarter in my head than it looks once I type it, but I can't decide if lack of caffeine is affecting my reading or my writing. Maybe I'm just crazy. Or all of the above.




Don't get me wrong, this is a very basic and generalized explanation, which by nature will be somewhat wonky. If you want to read the fancy version, you can check it out here.

Ok, so, here's the question: what does this mean for me? Smoother textures and better poly mapping, making things look better all around, for starters. Pretty simple, and nice. The second major thing you'll see out of it is textures being able to look good up close without soaking up a ton of GPU resources at long distance. No, this doesn't sound huge, but what it does is allow an increased view distance, since the card can avoid rendering so much detail far away, meaning less stuff pops out of thin air.
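That "less detail at a distance" part is just level-of-detail selection. A toy version (the falloff curve here is invented for illustration; real engines tune this per game) might pick a tessellation count like this:

```python
# Toy level-of-detail pick: subdivide finely up close, coarsely far away,
# so distant geometry costs fewer triangles. The falloff curve is made up.

def lod_segments(distance, max_segments=64, falloff=10.0):
    """Roughly halve the detail every `falloff` units of distance."""
    return max(int(max_segments / (1 + distance / falloff)), 1)

for d in (0, 10, 50, 200):
    print(d, "units away ->", lod_segments(d), "segments")
```

Up close you get the full 64 subdivisions; a couple hundred units out, the same object is a handful of triangles, and your GPU never knows the difference.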

I hope this has cleared this up reasonably well for some people, I know it's not that useful for some, but understanding graphics can help you understand what sort of system you personally need for the graphics quality you desire. I'm hoping to do some other similar ones over the next few days, so stay tuned.

Friday, October 14, 2011

Sunbeam Rheobus Extreme Fan Controller, Half a Review.

So, the Sunbeam Rheobus. Great little toy. I've been using it on my rig for a while. Each channel can handle up to 30w, with 6 channels, letting it handle pretty much any fans you want to throw at it. It retails below $30 usually, making it extremely affordable.

Now on the looks, there's a couple of things that are a bit hit or miss. The LEDs in the knobs run off the same variable resistor as the fans. Now, this is amazing for telling where your fans are set at a glance, but they're so damn bright. It's only a minor issue unless you sleep in the same room as your PC and don't shut down at night, though.

Also, the front is glossy, instead of some fancy little LCD touch panel built for about three cents. That means the front panel is less of an issue if it breaks, but it also looks really inconsistent in most black cases, since those tend to be matte. Except for maybe a few really shitty things made out of plastic that would be vastly improved if duct tape and baling wire were included in the design.

That being said, it's nice. It's a little tight when you're fitting it into the 5.25" bay, but it fits without a hammer, and the screw holes line up, so physical installation of the panel itself is all good. The fan control wires are another story. The connectors are placed kind of awkwardly if you've got, say, an optical drive or the top of your case above it. But that should only matter once.

Speaking of wires, it comes with a lot more than most controllers, but still not enough. You get two each of 3-pin extensions, 3-pin-to-Molex adapters, and 3-pin extensions with a motherboard signal lead. It also comes with its own power cable, but that's neither here nor there. It pulls off the 12V rail via Molex, so it's got what you need to tickle bigger fans, although you do have to turn them up higher than you'd like to get them going from a stop.

Overall, I really like this thing, although I haven't used a whole lot of controller panels, so you might take that with a grain of salt. Photos shamelessly stolen from the manufacturer's website. But judging by the manual, they probably can't read this blog to complain.


Overall, this product rates an official JingleHellTech 
Non-Turd Award!

Thursday, October 13, 2011

AMD Responds to Bad Reviews.

You know, since the AMD FX CPUs haven't been for sale for 48 hours at the time I'm writing this, I have to wonder if they just went ahead and prepared this particular response in advance, since they knew they were releasing overhyped, overpriced paperweights.

But, in the interest of fairness, even though I'm sure I'm the least of their worries, I'm going to show their rebuttal and respond to it. After all, I hate when people can't be objective.


Our Take on AMD FX

by akozak

This week we launched the highly anticipated AMD FX series of desktop processors. Based on initial technical reviews, there are some in our community who feel the product performance did not meet their expectations of the AMD FX and the “Bulldozer” architecture. Over the past two days we’ve been listening to you and wanted to help you make sense of the new processors. As you begin to play with the AMD FX CPU processor, I foresee a few things will register:
In our design considerations, AMD focused on applications and environments that we believe our customers use – and which we expect them to use in the future. The architecture focuses on high-frequency and resource sharing to achieve optimal throughput and speed in next generation applications and high-resolution gaming.
Here’s some example scenarios where the AMD FX processor shines:
Playing the Latest Games
A perfect example is Battlefield 3. Take a look at how our test of AMD FX CPU compared to the Core i7 2600K and AMD Phenom™ II X6 1100T processors at full settings:
Map     Resolution                   AMD FX-8150   Sandy Bridge i7 2600K   AMD Phenom™ II X6 1100T
MP_011  1650x1080x32, max settings   39.3          37.5                    36.3
MP_011  1920x1200x32, max settings   33.2          31.8                    30.6
MP_011  2560x1600x32, max settings   21.4          20.4                    19.9
Benchmarking done with a single AMD Radeon™ HD 6970 graphics card
Creating in HD
Those users running time intensive tasks are going to want an AMD FX processor for applications like x264, HandBrake, Cinema4D where an eight-core processor will rip right along.
Building for the Future
This is a new architecture. Compilers have recently been updated, and programs have just started exploring the new instructions like XOP and FMA4 (two new instructions first supported by the AMD FX CPU) to speed up many applications, especially when compared to our older generation.
If you are running lightly threaded apps most of the time, then there are plenty of other solutions out there. But if you’re like me and use your desktop for high resolution gaming and want to tackle time intensive tasks with newer multi-threaded applications, the AMD FX processor won’t let you down.
We are a company committed to our customers and we’re constantly listening and working to improve our products. Please let us know what questions you have and we’ll do our best to respond.
Adam Kozak is a product marketing manager at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites, and references to third party trademarks, are provided for convenience and illustrative purposes only. Unless explicitly stated, AMD is not responsible for the contents of such links, and no third party endorsement of AMD or any of its products is implied.

For starters, I'd buy the bit about design considerations based on what users need, or will need in the future, a bit more if FX were being marketed as a professional CPU or a processor for college kids, and either way priced a good bit lower. When you start trying to sell it as a consumer CPU, it needs to not be a step backwards in tasks people are doing right now. Like gaming, perhaps. Yes, it did well in a few games, generally the ones threaded the way software should be these days. I don't think any enthusiast will argue against better threading in games. But that threading doesn't exist NOW, which is when FX went on the market. And frankly, since the majority of games are GPU bound anyway, better threading would only help in a portion of the market.

Generally, I look for inconsistencies when I read stuff like this. Notice, they show exact bench FPS from their tests in BF3. Talking about encoding, they just comment on how well threaded encoding tends to be. Why, exactly? Well, because the FX8150 kind of chills with higher end Intel. Which is soon to be surpassed by better Intel. That kind of sucks for AMD.

Now don't get me wrong, FX isn't technically bad. It's just horrendously overpriced. If the 8150 were at a price point with multiplier-locked i5s like the 2300 or 2400, and the motherboards a bit cheaper, it would be an outstanding buy for several types of users. I could see college students doing video- or software-related tasks tossing those to 6 cores and doing light gaming on another 2. Cheap streaming for e-sports types would also be plausible.

So, sorry, AMD, but I'm not buying the statement. No, they aren't technically as terrible as they're getting made out to be, but between the hype and the pricing at the retail channels, this is a joke.

Defending AMD FX: A Losing Battle.

So, since yesterday's release of AMD FX CPUs, including the FX-8150 octo-core, there have been many discussions online referencing benchmarks, value, quality, and performance. As can be expected, there are actually still some fanboys dumb enough to think they have a chance of defending these piles of shit. So, in the interest of... fairness, I've decided to discuss some of the... reasoning.

Excuse 1: It's not technically an 8 core CPU blah de blah.
As much as I normally love being technically accurate, there is such a thing as having shit for brains, and this excuse shows us the people who do. AMD advertised FX-8150 as an 8 core CPU, why shouldn't I judge threaded performance by the standard of 8 cores? Are you accusing the company you love of fraudulent advertising?

Excuse 2: It's really only intended to be a server architecture though!
Sorry, it comes down to the same damn thing. AMD spent a fortune sponsoring IGN Pro League and shoving their AMD FX ads down our throats, trying to convince people that an 8 core CPU was good for gaming. You market it for gaming, you get benched on gaming performance. Not our fault it fell completely flat on its face. Most people using Intel would prefer that AMD could compete; it would be awesome for CPU prices.

Excuse 3: Well, it's worse clock for clock, but it overclocks way higher!
Sadly, this means nothing to the vast majority of users. Why? Because, during testing, Anandtech found that the FX-8150 can only hit the same clocks on air as a 2500k or 2600k. It goes a bit higher on water, but quickly ends up needing extreme cooling. And sorry, the clock you could hit if you kept liquid nitrogen hanging around doesn't mean a whole lot if you don't.

Excuse 4: It isn't working well with Windows yet!
Yes, we know, you've run that one into the ground. Unfortunately, the tasks it does worst on are less-threaded tasks, like gaming, where it has to try and compete with Phenom II, its utterly obsolete predecessor.

Excuse 5: But it does compete with Sandy Bridge on multi-threaded tasks!
And it wants a cookie, I assume? For the vast majority of consumers this means... oh yeah, nothing. AMD FX is replacing Phenom II in bad product placement: the poorly priced, late-to-the-game CPU you might get if you really need physical cores and don't want to pay for better hardware that costs anywhere from slightly less to slightly more.

Excuse 6: All those benchmarks are biased, and not representative of anything.
On the second one, welcome to benchmarks, jackass. Find a good analog for performance that gets benched frequently, or bench things yourself. As for bias, I really doubt all the reviewers were so biased that they had a damn conference at some executive resort in the Swiss Alps just to plan out how BD would do on each test.

So, AMD Fanboys, for your tenacity in the face of logic, for your stubborn pride and arrogance in the face of benchmarks, and for your inability to listen to reason, I salute you. Someone has to keep AMD in business so Intel can't really monopolize the market.

Wednesday, October 12, 2011

What the Hell, AMD, or: Fanboy Tears, Tonic for the Soul.

Bulldozer got released. Many people willing to think critically and be objective have been just a bit skeptical, due to delays, leaked benchmarks, and the fact that AMD basically hasn't turned out a CPU that was actually worth a damn in years. All the speculation peaked in recent weeks, when they released a YouTube video bragging about an overclock record (that they didn't benchmark), talking like clock speed was the be-all end-all of performance. Shortly after this, they released an ad that played way too many times during the IGN Pro League finals.

Skepticism and mockery of the way they've been dancing verbally around performance, going along with all the other factors, had people eager for vindication after debates and arguments.

Well, folks, with the actual release of AMD FX CPUs, they've finally lifted the NDA. Kind of late, yeah? Well, you can hardly blame them. Even in the tests it does best in, the FX-8150 comes out only slightly ahead of the i7 2600K. Twice the physical cores, higher stock clock, and the best results are slightly ahead of i7. The worst results, in applications that can't benefit from extra physical cores? Well, in those, it's mostly struggling to compete with the more recent Phenom II offerings. And those were obsolete the day they were released. Owch.

Here's the intended lineup:

  • FX-8150: Eight cores, 3.6 GHz CPU base (3.9 GHz Turbo Core, 4.2 GHz Max Turbo), $245 suggested retail price (U.S.)
  • FX-8120: Eight cores, 3.1 GHz CPU base (3.4 GHz Turbo Core, 4.0 GHz Max Turbo), $205 suggested retail price (U.S.)
  • FX-6100: Six cores, 3.3 GHz CPU base (3.6 GHz Turbo Core, 3.9 GHz Max Turbo), $165 suggested retail price (U.S.)
  • FX-4100: Four cores, 3.6 GHz CPU base (3.7 GHz Turbo Core, 3.8 GHz Max Turbo), $115 suggested retail price (U.S.) 
Now I don't know about you, but I don't see these prices as being all that interesting for the majority of consumers. Or even close. The FX-8150, in gaming performance, falls on its face, competing with Phenom II. This puts it well behind the cheaper i5 2500k, which OCs to about the same clock on air as an 8150... while being faster clock for clock.

I'm sure some enthusiasts will go nuts trying to OC these things, but in my opinion, just wait for Intel's new enthusiast socket. It won't suck.

Hopefully in the next couple of days I'll compile enough things fanboys have to say in defense of FX to have a reasonable post ripping into that.

Monday, October 10, 2011

How Not to Suggest PC Components.

So it strikes me, there are a lot of people who manage to do bad jobs of suggesting components, from "reviewers" to forumgoers, and everywhere in between. The reasons are usually obvious if you already know the product is bad or inappropriate for you, or if you can read between the lines well, but what if you can't? You might end up wasting money, without even knowing it.

So what are the reasons for bad suggestions? I generally list ignorance, fanboyism, fantasy building, malice, and feeling honor bound to suggest certain components. Ignorance is fairly obvious. Sometimes people just don't know what they're talking about, whether it's outdated information, or something anecdotal they took on faith. Not much needs to be said about this, beyond two major factors: be willing to admit you were wrong, and try not to be ignorant.

Fanboyism, however, due to its close similarity to brand loyalty, deserves some discussion and definition. What is fanboyism? Fanboyism is when brand loyalty gets taken to an extreme, beyond reason, causing prejudice and inaccurate suggestions and belief in the face of evidence. Brand loyalty, however, is preferential treatment for a brand based on personal reasons, and usually doesn't come out in suggestions as a strong biased arguing point. Example being: I've got brand loyalty to EVGA. Awesome customer support, excellent warranty, and most of their products are well thought out, and designed in a way that's beneficial to my use. I'm not going to suggest them to someone that doesn't need those things, though, and, since their new motherboard team isn't quite up to my standard for EVGA, I wouldn't suggest their motherboards right now.

Fantasy building is what I call it when people try to get someone else to build the PC they wish they could build. This will come out in suggestions of multi-GPU, flagship motherboards, and top-end CPUs for people who would be perfectly happy with less, and not even use it in a way to see a difference. This is bad, because you waste people's money doing it, and that's kind of a dick move. This almost pisses me off more than ignorance, just because ignorance will generally get worse performance for a price point, but fantasy builds will convince someone to drastically overspend.

Malice. How can you give someone malicious advice on building a PC? Basically, by doing the opposite of fantasy building. You don't want that guy to have better than you, you don't want to suggest the stuff you wish you had, so you suggest worse. Congratulations, if you do this, you're in the running for asshat of the year. Die in a fire. Don't screw someone else over out of jealousy.

Finally, there are the honor-bound people. This most frequently applies to reviewers, bloggers, and fans of certain things being supported by a brand. A good example: some of the bigger sponsors for competitive SC2 are AMD, Kingston HyperX, and iBuyPower. But over at teamliquid.net, the best community site for competitive Starcraft, on the Tech Boards, we refuse to suggest stuff made by these and other big sponsors if it's bad, not cost efficient, or wrong for the buyer for any other reason. Reviewers and bloggers will frequently rave about the quality of a product and look for ways to make it look good, in hopes of getting more stuff to review.

Guess what. PC components are almost always objective. Preference on certain factors will be subjective, but performance, quality, warranty, and support are just plain facts. Misrepresenting them for any reason, particularly to kiss ass, is just stupid. If you misrepresent them too badly, you'll leave people with a sour taste. If you show the actual quality, and then leave the value of the support of x organization up to the prospective buyer, everybody ends up happy.

AMD's Bulldozer, Bad Signs for CPU Prices.

So the ads during IGN Pro League SC2 finals yesterday got me thinking, along with various discussions I've been a part of, and different promotional stuff. AMD is, possibly, about to finally release the Bulldozer FX CPUs, starting with their high-end offerings, including some Octo-Core stuff. Now there's some stuff that sounds promising for overclockers and benchmark fans alike, but there's a lot more that scares the piss out of me.

For starters, we're almost six months late getting these CPUs now. Now obviously, in the chip world, this sort of thing isn't a shock on its own, but combined with some other factors, it makes me think the delays are due to being completely incapable of competing with current (or even recent) Intel offerings, sticking with the trend of the last few years.

Then, of course, there's the video AMD released, talking about setting a world record for overclocking. Now this should be awesome for enthusiasts, right? Well, except for a quote around the 45 second mark. "We're not running any benchmarks, we're just shooting for the highest CPU-Z." Sorry, but in my humble opinion, a clock you don't run shit at is kind of pointless. I can get all kinds of random ass numbers in CPU-Z and have my PC bluescreen 2 seconds into any stress test or benchmark under the sun.

That brings me to another point. Who in hell brags about clock speed if their CPU is actually better than the competition's? No, they'd run benchmarks, even benchmarks specifically favoring their hardware due to number of physical cores or whatever else, and announce the "Fastest Desktop CPU Ever*" "*large quantity of small print defining fastest to the point that it's useless." But then, look at the stuff people have leaked.

Using Google Translate (sorry) on (purported) leaked benchmarks, ". In this case the Intel Core i7 2600K to run at 3.4GHz base and could climb up to 3.8GHz was used when one thread, while the AMD FX-8150 worked at 3.6GHz can climb same conditions up to 4.2GHz." In other words, running stock v stock, even in tasks like Handbrake encodes, the 8150 and its 8 physical cores were barely keeping up with a 2600k and HT technology. Owch. The rest of them aren't any better, with the 3DMark physics score being < 80% of the 2600k's.


All this is only confirming what I've been saying for a while. When you've been getting smashed in almost every performance category for as long as AMD has (not price/performance, which I'll admit they took at certain points), you don't withhold a product for 6 months if it's looking like it might compete. You certainly don't withhold a product that can set overclocking speed records, unless it can't compete. 


Sorry, AMD, looks like Intel has nothing to worry about any time soon, and you've sealed Intel's ability to price wherever the hell they want for the foreseeable future, again. And I'm sure some people think I'm an Intel fanboy at this point. Sorry, I'm not. I'm a performance fanboy, and I'd LOVE for Intel to have some competition and have to consider price points for once. So please, feel free to prove me wrong.

Sunday, October 9, 2011

First-time PC Builder FAQ

This FAQ is only going to apply after you've picked parts. I'm assuming you're reading this after either asking on your forum of choice, or getting a list from the family member who works in IT and thinks workstation components are good for gaming rigs, and you wouldn't listen to me on what to use anyways. Besides which, picking components is all about deciding what you want out of your PC and buying the best parts for that, which is mostly about prices and benchmarks.

Source X (includes GPU manufacturer) says I need a higher wattage PSU than this, will it work?

Well, I can't answer for any configuration without making one up, defeating the purpose, but keep in mind, GPU manufacturers have to account for all kinds of PC configurations and low quality PSUs. A high quality PSU in a typical configuration will generally work fine well below the "minimum wattage". Bear in mind, some PSU calculators exist to sell products, and others have to be used properly or they aren't helping at all. For example, the calculator at extreme.outervision.com, used correctly, gives the wattage your PSU needs to be rated for so that your full system load sits at the load percentage you designate. It doesn't list the wattage your PSU will actually draw under load.
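That distinction, rated capacity versus actual draw, is easy to see with some arithmetic. Here's a rough sketch of the kind of math a PSU calculator does; the component wattages below are made-up illustrative figures, not measurements of any real build.

```python
# Rough sketch of PSU-calculator math. Component draws are invented
# placeholder figures for illustration, not real measurements.

def psu_size_needed(component_watts, target_load_pct=0.8):
    """Return the PSU rating needed so that full system load sits at
    target_load_pct of the PSU's capacity (roughly where efficiency
    tends to peak on decent units)."""
    full_load = sum(component_watts.values())
    return full_load / target_load_pct

build = {
    "cpu": 95,               # TDP-class figure, illustrative
    "gpu": 170,
    "board_ram_drives": 60,
    "fans_misc": 25,
}

print(round(psu_size_needed(build)))
```

So a system drawing around 350W at full load points at roughly a 440W unit to sit at 80% load; the PSU still only ever pulls what the components ask for, which is the point the GPU vendors' inflated "minimum wattage" numbers gloss over.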

How hard is it really?

Assuming you can read, and have the ability to match rather specific shapes, it's pretty simple. Yes, you need to be moderately careful, but as long as you take your time, read instructions, and pay attention, it's pretty hard to screw up, especially in a way that causes permanent damage.

I was installing my Intel CPU, got resistance, and heard a crunch. Did I break it?

As long as you lined up the notches, yes, the weird grindy pressure noise thingy is normal. It's the tactile response version of hearing fingernails on a blackboard, but it's ok.

My PC won't boot, and I'm hearing a bunch of beeps, what's wrong?

Well, the specifics depend on your motherboard, but those are called beep codes. If you either look in your motherboard manual, or Google your motherboard's model number and the words "beep code", you can find a translation, and it will help you troubleshoot. It's usually a sign you installed something slightly wrong, or have a DOA (Dead on Arrival) CPU or RAM stick, but it can mean other things.
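The "translation" you find is really just a lookup table, something like this sketch. Every pattern and meaning below is a hypothetical placeholder; real codes vary by BIOS vendor and board, so check your own manual rather than trusting these.

```python
# Hypothetical beep-code lookup. These pattern/meaning pairs are
# placeholders; real codes differ between BIOS vendors and boards.

BEEP_CODES = {
    (1,): "POST passed, normal boot",
    (3,): "RAM not detected or failed",
    (5,): "CPU error",
    (1, 3): "Video card not detected",
}

def diagnose(beeps):
    """Look up a beep pattern (sequence of beep counts between pauses)."""
    return BEEP_CODES.get(tuple(beeps),
                          "Unknown pattern - check your motherboard manual")

print(diagnose([3]))
```

The takeaway is just that the pattern (counts of beeps between pauses) is the whole message, which is why writing it down exactly before you go searching saves you a round trip.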

My PC seems to boot, but the screen stays black, what's wrong?

There are several common causes for this issue. One of the more frequent is plugging the video cable into the motherboard's video out with a discrete GPU installed. Another can be PCIE power cables that aren't plugged in, or are plugged in incorrectly. A poorly seated video card can also be an issue.

I got my computer to boot, and installed my OS, but now I can't connect to the internet and nothing is working, what did I do wrong?

Look in the box your motherboard came in, and find the CD. You just need to install chipset and ethernet drivers. No big deal at all.

I got an SSD, what should I put on it?

Your OS and anything you want to load at startup. PDF readers and office software are also good, as they tend to load slowly. Games with lots of single player loading time are ok, but multi-player games, there's no real point, as your loading will be restricted by the slowest loader anyways, and he invariably has a massively fragmented USB 2.0 5400 RPM external HDD.

Do you suggest a particular guide?

I personally direct people to the Hardware Canucks video guide. Other people like other ones, but a lot of the other ones I see suggested come from etailers, and I refuse to send people to a resource provided by someone with a conflict of interest.

Saturday, October 8, 2011

Multi-GPU and Multi-Monitor Un-Confused

Eyefinity, SLI, 3DSurround, 3DVision, and CrossfireX. Multiple monitors. What does it all mean? What's the difference, in this epic pile of confusing terminology, and what do I need to make this work?

SLI and CrossfireX, for starters. These are multiple-GPU configurations designed for extra graphics power on a single display. These are the ones most people think of first in multi-GPU configurations, and the terms most likely to be generically thrown in as a placeholder value. SLI is the nVidia variant, CrossfireX is AMD. CrossfireX only requires multiple PCIE slots, SLI requires an SLI ready motherboard, which, as of recently, can include AMD motherboards.

3DVision: This is using funky technology and woogy glasses to make things pop out of the screen, like a bad '80s sci-fi flick. This hasn't been implemented all that well in a lot of games. It does, however, occasionally get mixed up with 3DSurround, due to the stupid similarity in naming.

3DSurround and Eyefinity are the methods of playing games across multiple displays. Not gaming on one and having a browser or other stuff going in a spare display, that's just a normal multiple display configuration. Some AMD cards will support Eyefinity off a single card, but it's pretty stupid. No nVidia solutions do that, which is really perfectly reasonable, since playing across multiple displays doesn't make a whole lot of sense if you're having to drop the settings to "fuzzy blob" to get the extra screen real estate. By the way, use an odd number of displays per row for this, you don't want lines in the middle of the game area.

Multiple monitor setups just include using two monitors, with at most one for gaming. This is pretty much supported by almost everything these days, since it's not all that harsh for non-3D use. It can be handy if you actually have a use for the screen real estate, but for a lot of people it's basically just a fancy thing.

Remember, for SLI, Eyefinity, 3DSurround, and CrossfireX, you need identical cards. You can't SLI a GTX 560Ti and a GT 9800. You could use the 9800 for dedicated PhysX, or for extra monitor outs, but you can't get the two different cards running SLI for alternate frame rendering.

No, this isn't long, really, but it's something that drives me nuts occasionally.

Bad Ripoff PC Seller.

Some guy decided to try and advertise on Team Liquid today, talking about his "friend's" hot gaming rigs.
http://jestercreativesolutions.com/gameprotech

I'd tell you to buy Cyberpower before this shit. AMD in this day and age? A 980 BE is in the i5 2400 price point, for 65% of the performance. He sells that with a 6850 for $1200. You can outperform that for $800...

Don't spam a site whose tech board I hang out on with ads for SHIT builds at STUPID prices. Dig?

Saturday, October 1, 2011

Airflow, Cases, Coolers, or DIY Wind Tunnel Kits.

Ok, so I've been thinking about a rant on this topic for a while anyway, but a good friend's predicament made me decide to go ahead and do it. For the sake of this post, we'll call him "Thomas", to avoid any potential embarrassment.

If you've ever seen a high end enterprise workstation, or an older PC, you might remember something being vaguely different from a lot of PCs these days. If you can guess what I'm talking about, raise your hand. Then feel like a moron, since nobody can see your hand. The thing I'm talking about, of course, is the fact that those computers don't move enough air to achieve escape velocity. Holy shit, crazy talk, I know. No turbofan sitting next to your desk? How dare I blaspheme the pantheon of PC case manufacturers that way?

Don't get me wrong, there's absolutely nothing wrong with a case with a decent amount of airflow. But a decent amount of airflow and a decent amount of moving air that doesn't do a damn thing are entirely different animals. My friend "Thomas" is a great example of this. He has some cheap-ass gamer case that Ali-Baba and the CyberPower Thieves stuck him with. It has two front 120mm intakes, top-mounted isolated PSU intake, optional (in this case installed) 140mm top exhaust, the radiator for his (ugh) Corsair H50 mounted as an exhaust in the back, and one or two side 120mm fans. That's a metric asston of air moving around, in case you hadn't noticed. Somehow, though, he can't keep SLI GTX 460s very cool. Granted, that's not always easy, but it's not insanely difficult, either, and he's hitting dangerous temps under load.

So, why am I bringing this up? Well, airflow is complicated, and I'm not going to claim to know everything (or even close) about aerodynamics. What I do know, though, is that a lot of things people think they know about PC cooling are very, very wrong. A popular myth: "Get a bigger case to keep things cooler." Horribly wrong. Get a case that's got room for your components. Large amounts of dead airspace with no components create a path of least resistance for airflow that doesn't serve a purpose.

There's a couple of (general) rules to follow when trying to figure out your airflow.
1: Know your components. Nothing more important than knowing your configuration's needs. Granted, these can change later on with upgrades and whatnot, but it's still good to know.
2: Keep dead air space to a minimum within reason. Don't cramp your components, trapping heat is bad. But once your components fit reasonably, too much more space makes for bad airflow to where you need it.
3: No competing airflow. Avoid cases where your PSU is going to be fighting your CPU cooler or GPU airflow.
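One quick sanity check that falls out of these rules is comparing total intake to total exhaust, which tells you whether your case runs positive or negative pressure. Here's a trivial sketch; the CFM figures in the example are invented, since real numbers come off your fans' spec sheets (and drop once you add filters and restriction).

```python
# Sketch of the positive/negative pressure check: sum intake CFM
# vs. exhaust CFM. The example fan figures are invented.

def case_pressure(intake_cfm, exhaust_cfm):
    """Compare total intake airflow to total exhaust airflow."""
    net = sum(intake_cfm) - sum(exhaust_cfm)
    if net > 0:
        return "positive pressure ({} CFM net in)".format(net)
    if net < 0:
        return "negative pressure ({} CFM net out)".format(-net)
    return "balanced"

# Two 120mm front intakes vs. one 140mm top + one 120mm rear exhaust:
print(case_pressure([74, 74], [89, 74]))
```

Positive pressure means spare air leaks out through the cracks (dust stays out, and you're actively pushing air where you aimed the intakes); negative pressure means unfiltered air gets sucked in through every gap.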

Here's the thing. Air follows the path of least resistance. Hot air rises, and heat is bad stuff. Because of this, if you can stand the slightly higher noise, External Exhaust graphics cards are good stuff, since they blow the hot air out of the case, unlike radial coolers that just get the heat away from the GPU and let it rise up through the CPU cooler. Negative pressure cases are ok for systems with limited restriction on airflow, but if there's a lot of stuff in there, actively directing the airflow with a positive pressure system is frequently better.

Don't forget to prioritize cooling. RAM doesn't need cooling as much as your CPU, so don't stick some retarded fan over your memory that thrashes airflow for your CPU. If you have a side panel, push air into your GPU if it's external exhaust, or pull air away before it can rise if it's radial cooling. If you're using a closed loop liquid cooler, I don't care how much it helps your CPU temps to use the radiator as an intake, just say no to pulling heat into your case.

Above all, apply common sense. The more directly cool air gets to your components, and hot air gets away from them, the better. Always. Avoid dead air, and avoid competing airflow. Avoid pointless restriction.

Remember, a lot of moving air doesn't mean good cooling. A space shuttle launch moves a shitload of air, and that's not cold at all.

Monday, September 26, 2011

Benchmarking for Fun (Or for Info.)

So if you frequent tech forums, you might hear something along the lines of "...only matters in synthetic benchmarks." or "You can't tell the difference without a benchmark." And, if you like the idea of having a slightly overpowered system, you may just want to know how to go about those benchmarks. Even if that's not the case, you may be wondering what will help your performance in a particular game the most, or wanting to find out just how effective your CPU or GPU is for a certain game.

If you read a lot of reviews, you've probably seen all the methodology, all the different names of software, and seen tons of numbers with pretty bars and graphs. Well, I hate to tell you this, but generally, the pretty bars and graphs don't just come with the software. Luckily, it's the numbers that matter anyway.

So, for starters, let's discuss benching for fun. The most commonly used benches for gaming performance are Futuremark's 3DMark series. The "standard" comparison points are the default settings, which you can run in the free trial version. That lets a bigger database of comparable scores build up.

If you're going to bench for fun, the most important thing to remember is that you need a consistent set of benching processes. The best way to do this is to have a secondary account on your PC with the absolute minimum of automatic services, to make sure memory use is consistent. Keep drivers updated, although you usually won't want to use Beta drivers unless you need a specific feature. (Always test beta drivers prior to sustained use, some graphics drivers have been known to cause thermal issues.)

So, now that you have your consistent setup, you get a baseline. That's just how things perform at your normal settings. After that, you can try tweaking things to see if they gain you performance. Whatever you decide to tweak, make sure you test with the same processes as before. It really can make a huge difference in scores, or at least enough to skew data to the point of unreliability.

Generally, the GPU is the easiest thing to tweak, since there are simple GUI-based overclocking utilities that work reasonably well, like MSI Afterburner, or (to a point) EVGA Precision. Remember to push it up a little at a time, and test for stability before starting the benchmark. If you raise your GPU overclock and suddenly lose performance, you either need to raise your GPU voltage (at your own risk) or lower the clock back down a bit. In graphics cards, memory clock generally buys you less performance per volt than core and shader clocks, so you usually won't want to bother with it.

The CPU is, of course, one of those things people really think about overclocking, and a lot of people are scared of it. If you do your homework, have the right components, and proceed carefully, it can be a perfectly safe way to gain performance. You only void your warranty if you do damage that can definitely be attributed to overclocking, so you're usually safe if you keep the overclock within the CPU manufacturer's specified voltage range.

This isn't an overclocking guide, but I will say that on recent i7s, it's usually best to raise the clock to the desired level first, then see if you can squeeze Hyper-Threading back on safely. Hyper-Threading will raise your score in 3DMark, but less than most clock speed gains. RAM speed and timings can also affect it, as can the IMC clock, and basically all the usual culprits.

But now, I'm sure, you're wondering about the benching for information thing. After all, E-peenery is fun, but there's only so much you can say about it. If you want to learn something through benchmarking, you're going to need a few things.

1: Consistent Monitoring. Be it FRAPS for FPS/Frametimes, HWMonitor for temps, or something else, you must use software. Eyeballing it is worse than useless.

2: Consistent Playthrough. You need a replay, or a specific save file, or something that you always use for benchmarking. If you're getting an FPS recording of different things, it doesn't tell you much.

3: Objectivity. This is the key. If you aren't objective, you're useless. You need to be willing to get results you didn't want or expect.
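To show what consistent monitoring buys you, here's a rough sketch of boiling a logged run down to numbers. FRAPS can dump per-frame times, but the function and the sample data below are mine, not any particular tool's output format:

```python
# Summarize a run of per-frame times, in milliseconds. The sample numbers are
# invented: 97 smooth frames at ~60 FPS plus three 50 ms stutters.

def summarize_frametimes(frametimes_ms):
    """Return (average FPS, FPS across the worst 1% of frames)."""
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    # Worst 1% of frames (at least one frame), taken from the sorted tail.
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    worst_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, worst_fps

sample = [16.7] * 97 + [50.0] * 3
avg, low = summarize_frametimes(sample)
print(f"avg {avg:.1f} FPS, worst 1% {low:.1f} FPS")
```

Notice the stutters barely dent the average. That's the point of logging frametimes with software instead of eyeballing an FPS counter: the worst frames are what you actually feel.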

As for methodology, it depends a bit on what exactly you want to test, but essentially you need to isolate a component. For CPU/RAM testing, turn down the graphics settings that don't load the CPU, including resolution. This ensures that the limiting factor will be your CPU. For RAM, you just get a baseline at a "typical" setting, like CL9 1333 for DDR3, and work from there, using the same CPU clock.

If you want to isolate the GPU, you crank the desired graphics settings, turn down anything you can involving the CPU, and hope you aren't trying to isolate the graphics card in an RTS. That won't happen unless you have a really screwy rig.

If you want to monitor temps, you need a consistent ambient temperature, and consistency in everything except what you're testing, including what you use to heat up the PC and how long you run it prior to measuring.

I know this hasn't really been long on details, but it's really more to get you thinking the right way.

Saturday, September 24, 2011

Piracy: Or, Biting the Hand That Feeds You.

Ok, I've heard all the completely retarded excuses for stealing software, movies, music, and whatever else happens to tickle your fancy. They're all bullshit. You aren't some nifty political movement, sticking it to the evil corporate empire. You aren't protesting intrusive DRM, and if they released a demo, you'd still steal the full game.

Notice I'm saying steal. Not pirate, not copy, not download, not torrent. I've heard that load of crap too, and nobody is actually buying the arguments about it not being stealing. Shall we check the definition of steal, and see?

1. to take (the property of another or others) without permission or right, especially secretly or by force: A pickpocket stole his watch.
2. to appropriate (ideas, credit, words, etc.) without right or acknowledgment.

First two definitions for steal. Neither of those mentions physical possession. Intellectual PROPERTY. Rocket science, much? Hell, the second one even covers non-physical stuff, like ideas, or credit... So, apparently, you can steal without denying the rightful owner a physical object. All you have to do is to take something without permission or right. That sounds like what you call Piracy, to me.

So, my next question. If it's a protest of whatever, for whatever (made up) reason, you pretty much acknowledge you're getting it the wrong way, or else the act of getting it wouldn't be a protest, right? The only other way to protest would be to not get it at all. So, if you succeeded in justifying it as just being a copy, and not stealing, it would actually remove any value from the notion of protesting, because your action wasn't wrong at all, and you got the product.

Let's face it. The majority of people aren't stealing a game to play for free to see if they want to buy it. How do I know? Because that's moronic. There are other ways to find out if you want to play a game, especially in this day and age. You aren't stealing it because of intrusive DRM, either. I'd guess something like 0.001% of the people who steal a product to get around intrusive DRM have actually encountered an issue and known it at the time. Even counting me and all my friends, the worst issues I've seen with DRM have been occasional new CDs not playing in old players. I've never had China take over my computer to sell to the Daleks or whatever, as a result of installing a game.

The only thing you're really protesting is the right of people to earn money by selling games. Yes, you deny money to greedy stockholders. But those game companies wouldn't exist without greedy stockholders, you idiot. The greedy stockholders are all rich and can move their money anyway, so the people you're hurting are the people who lose jobs. The people who make it happen. The common people. So, you're just a prick, you thief.

Long story short, you're making the games industry less profitable than it should be, and now we get nothing but shitty console ports for PC. Thanks a lot, assholes. Rot in whatever hell is most applicable to your religious preference. Yes, they still make money off the buyers, the honest people. But we pay MORE for games, they put worse and worse DRM in place, and everything you claim to be fighting comes as a direct result of your actions. Or at least is justified by them. So go fuck yourselves.

Monday, September 19, 2011

Mac, Why I Hate It, and Why That's OK.

So, there's this funny misconception amongst Mac users. Apparently, there's something wrong with people hating Mac, even for those of us with a legitimate reason. After all, disliking Mac is clearly a personal attack against all Mac users, or that's how they respond. Now, I'll admit, there are PC users who exacerbate the problem, and I've been one of them. But these days, I'd say my reasons to hate Mac are legit, and people who disagree should be willing to accept them.

For starters, the iMac. Take it out of the box, plug it in, and turn it on. Great. But it's a laptop GPU pushing more display than it should be. You can't open the case, you can't stick in a new GPU, you can't do anything with it, aside from turning it on and maybe setting it on your coffee table as some sort of centerpiece. But it is decorative. Can't change anything, can't tinker... yuck.

Second, OpenGL and game support. You just can't game on a Mac. Now I know not everybody plays a lot of games, and since most Valve and Blizzard games are available for Mac, some people don't need more. I do, and I need it to look good. DX11 with a high framerate doesn't happen on a Mac.

Pricing. The most frequently cited reason, and the least understood. Mac comes with good support, good build quality, good cooling, and good displays. But me, I'm fine with manufacturer warranty for components, and I can arrange good cooling. As far as displays? I don't need perfect color, just decent color and low response times. That's it. I can do that without a Mac, and get the stuff I do want with it.

Hardware... Yeah, you can't upgrade it, you can't customize it much, and it costs a fortune for what you get, from a pure performance standpoint.

Now, to revisit that feature thing. Can't tinker, but it just works. That's good for some people. Expensive, but you get awesome customer service. Can't upgrade, but you get amazing build quality. OpenGL and game support? Not everybody needs that, and for what Mac does, it's awesome. It just doesn't game.

Saturday, September 17, 2011

Cable Management: The Good, The Bad, and the OCD.

Cable management and routing in a PC can be a weird topic, because there's such a huge difference between pure substance and smooth, sexy style. Now obviously, cable management in your computer can be very important for airflow and cooling, but it can have another function as well. If your cables are out of the way but accessible, you can have a much easier time moving things around, adding things, or troubleshooting.

Now I'm sure a lot of enthusiasts are freaking out right now, because I'm suggesting that there might be a drawback to having all but the last 3cm of cable invisible. That's ok, it still looks sexy, and unless you're actually adding a new component, it's not really that much more difficult.

For the lazy, understand: I'm not endorsing some tangled rat's nest that looks like it was wired by a spaghetti chef and turns into a semi-solid wall between your fans and your components. If you do this, trained killer robot mongooses will track you down and stab you in the eyeballs with a rusty wooden spork. And you'll deserve it.

So we're clear, I'm all for functional cable management. Get those puppies out of the way of airflow. Of course, that means you have to understand the direction and purpose of airflow in your case, but that's a whole different topic. Use zip ties, stuff things behind the motherboard tray, hide unused cables, use sleeved cables. I don't care how you do it, but let the air move in the directions it needs to in your PC.

On the other hand, I refuse to do the OCD cable management. Gasp and shudder in horror if you so desire, but I just can't see the point in spending so much time hiding cables that my PC is obsolete by the time I install the OS. (Granted, it probably will be anyway, but that's a different rant.) Now I'll be the first to admit, it looks damn sexy, but it's just ridiculous, particularly considering the fact that it's really not helping that much, unless you need the room for plumbing.

Back to the rat's nest people... did you know your PC will stay cleaner, operate cooler, make less noise, and possibly even perform better if you fix your wires? (Individual results may vary, rant shown with optional equipment. Professional airflow in a closed case, do not try this at home.)

Friday, September 16, 2011

There Are No Stupid Questions... really?

Ever heard the old adage, there are no stupid questions? I'm here to tell you, that's as big a load of crap as a C-130 carrying fertilizer. I've heard quite a few impressive ones, between various forums, and people I've met. Sadly, I see about twenty times more stupid answers than questions. I'm far from perfect, and I know it, but I usually try to say when I'm not sure about something, and I can admit it when I'm proven wrong.

For starters, a couple of the dumbest questions. Just today, I saw a good one. "I want to put the liquid cooler back into the computer. I bought an extra syringe of thermal paste, and the instructions tell me to remove old thermal compound reside with isopropyl. Is this really necessary? I was planning on just heating up the old paste with a hairdryer, adding a bit of new paste, and "gluing" my computer back together. Is it bad to mix compounds? Even if I do remove my old paste, I'm not going to use this isopropyl stuff."

Now for starters, how do you get old enough to be trusted with tools and electronics and not know how to google on the very slim off chance you've never seen a bottle of rubbing alcohol? More importantly, a hair dryer? Last time I checked, moving around hot air, if it's dry enough, can cause some pretty hefty static. Static, of course, being one of those things we try to avoid having near our CPU for some batshit crazy reason. I think we're trying to keep balloons from sticking to it or something.

Another time, back in the Army, I got asked this gem: "I have my PC hooked up to my bigscreen TV with HDMI, but the sound won't come out the TV speakers." So, of course, my first question: "Is it connected to an HDMI port on your graphics card, or are you using an adapter?" I'm sure you can figure the rest of this one out. I guess since HDMI carries audio, it should be able to extrapolate appropriate audio from a video signal, and play it, right? Makes sense to me.

And now, for a collection of a tiny fraction of the dumbest advice I've ever seen given on various tech boards.

Here's one talking about paging issues and HDD speed. One of my all-time favorites.

"They USED to be a lot slower than ram. But the fundamentals are no longer like this. You see, RAM bus is outside the chip, hence subject to abysmally slow speeds compared to intra-chip solutions. It also means that the speed growth of the connections is limited. And while harddrive read/write speeds are increasing EXPONENTIALLY, similarly how hard drive sizes are, the linear growth of ram speed cant keep up. So hard drives ultimately have come close to the speed of RAM read and write, only being limited by the same outside-chip fundamental problems."

Advice on cleaning old TIM:

"tbh you are ok with a slightly damp cloth as long as you are gentle+careful and dont run any power through the cpu until its totally dry"

And, the best solution for scareware I've ever seen!

"try to update drivers! if not, just clean all unnecesary files from youre hdd, or reinstall (reinstall is the last option) ofcours you may need to buy a new hdd, cause these kinfs of slowdowns are happening cause of HDD or drivers, sometimes it maybe something like a soundcard, or overheating!
change the thermo paste on youre CPU, cause it might be overheating! its called autothrottling, you can disable that, just google it, but it maybe risky!"

So, let's get into explaining this really complicated concept, for anybody considering giving out tech advice. If you aren't sure, GOOGLE IT. Please. You'll save time explaining why you're wrong, and lower the potential of turning somebody's PC into a collection of overpriced paperweights. It isn't rocket science, hell, it isn't even computer science. It's a combination of common sense and common courtesy. It isn't your PC, and you don't have the right to ruin it by being a moron.

If you aren't willing to do a photo guide of whatever advice you're giving with YOUR PC, don't tell someone else to do it. I mean seriously, Information Technology isn't an event in the Special Olympics, so why try out?

Thursday, September 15, 2011

A Fool and His Money: Old CPUs.

One of the hot topics always popping up in discussion is, of course, price/performance, value, and pure performance of CPUs. People try to generalize and say AMD is always better price/performance, even if Intel is slightly better. Or people point to clock speed and number of cores, like those are the only two relevant statistics in a CPU.

Since nobody with three functional brain cells is going to say AMD can outperform Intel these days, we can skip who has pure performance. That just leaves value and price/performance comparisons. I'm going to start those two off with value.

What is value, in terms of a CPU, or computer parts in general? Well, there are several categories that have to be weighed here: performance, price, needs, additional associated costs (motherboard, PSU, etc.), and upgrade path. The additional associated costs come into this because, let's face it, it doesn't do you any good to save $20 on a CPU and pay an additional $40 on a motherboard. It does make you look kind of dumb, but that's about it.

Now, only you can determine the weight of the factors in value, but it's good to think about them anyway when considering new components. Is it worth an additional $20 to have a socket with more CPUs coming for it? Are you willing to pay 15% more for a 30% improvement? Do you actually need the six physical cores of that Phenom II when you're only planning to play games, most of which are optimized for two?

While you're deciding on the subjective value of components, price/performance is pretty much guaranteed to come up. There's only one real way to handle that. You need direct comparison benchmarks for the task you want to do, assign the lower value 100%, and determine the percentage of the higher performer. Then, do the same with the prices of CPU and motherboard combined, and see whether the additional performance is a higher percentage than additional price. For example, if the Intel configuration is 25% better for your primary task, and costs 15% more than the AMD configuration, the Intel build is better price/performance.
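To put numbers on that, here's a tiny sketch of the comparison, using made-up figures matching the example above (an "Intel" config scoring 125 vs an "AMD" config's 100 in your benchmark, at $345 vs $300 for CPU plus board):

```python
# Normalize the slower/cheaper config to 100% and compare the faster config's
# performance premium against its price premium. All numbers are hypothetical.

def premium_pct(higher, lower):
    """How much bigger `higher` is than `lower`, as a percentage."""
    return (higher / lower - 1.0) * 100.0

perf_premium = premium_pct(125, 100)   # benchmark scores for your primary task
price_premium = premium_pct(345, 300)  # CPU + motherboard cost, combined

if perf_premium > price_premium:
    print(f"{perf_premium:.0f}% faster for {price_premium:.0f}% more money: "
          "better price/performance")
else:
    print("the extra performance costs more than it's worth")
```

Same math works for any pair of configs, as long as the benchmark reflects the task you actually care about.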

Why do I say you have to combine the motherboard price with the CPU price? Because we're determining price/performance, and since the two CPUs can't go in the same motherboard, the only logical way to compare is to combine all the factors that will be different, and compare price/performance that way.

Granted, the motherboard can affect performance, depending on features, but most of the features that really affect performance, like SATA 6Gb/s or extra power phases for overclocking, cost more. Once you're adding to the price for raw performance, you should stop comparing price/performance and just shoot for pure performance again.

My Gaming History

Now I know this may seem a little bit niche, but I think the development of gaming habits in a person can be fairly telling. To start off, I have to qualify: I gave my parents their first TV after I moved out. Any console games prior to the PS2 and Xbox, I played at a friend's house, and may not have finished, even if I enjoyed them.

So, let's start with the earliest video games. At my friend's house when I was growing up: Rampage, Mario, Duck Hunt, and Battletoads on the NES. We played those quite a bit. I also played Sonic on my friend's Game Gear, but I never really got into it. I didn't get to play Nintendo much back then, and we're talking '90-'91ish. I was damn young.

I got a little bit older, and got a Game Boy, along with Ninja Turtles: Fall of the Foot Clan. I played that one quite a lot, and liked it. I also played Tetris, F1, Donkey Kong Land, and Super Mario Land. The Super Nintendo was reasonably common at this point, and on that, I mostly played a lot of Super Mario World.

Well, around this time, my dad had a computer, but it was a Linux box (and this was still the early '90s), so the only things I could play were various old arcade games on MAME, Nethack, and Zork. Until I discovered MUDs. I was looking for Linux games based on Tolkien's work, and I found one called The Burning Eye. Instantly hooked. I couldn't possibly remember all my characters, although I do remember sucking at the game at that point.

Somewhere along in there, Pokemon Red and Blue came out in the US. I got Red for my birthday, and played it a TON. Me and all my friends had it, and we did the typical young gamer nerd thing. During this part of my life, I played occasional Star Fox 64 with a friend, Ecco the Dolphin on a borrowed Game Gear, and Rampart, Zaxxon, and Rygar (I think I'm getting these names right...) on an old Atari Lynx my dad got me at a church auction. Other than that, still MUDding, and playing Nethack.

Well, as I got a bit older, I stopped sucking so bad at the MUDs (The Burning Eye had been renamed Rebirth of Arda at this point). By then, my Game Boy games of choice were Pokemon Gold, Golden Sun, and Tactics Ogre: The Knight of Lodis. I also finally had a Windows box that I bought with summer job money. I played Alien vs Predator 1 and 2, Age of Empires 2, Quake 3 Arena, Mechwarrior 2 and 3, and Tie Fighter.

Well, I slowly but surely reached 2004, still playing pretty much the same games, although less often, and went to Basic Training. Skip to 2005-2006: while I was in Iraq, I had a laptop, and went back to some old favorites, including Nethack, and some new stuff, like Galactic Civilizations 2. I was on one of the better camps, with real electricity, so my Xbox and PS2 were getting love: Kingdom Hearts 2, Magna Carta: Tears of Blood, Halo 1 and 2, Forza. Anything.

End of 2006 to early 2007, back Stateside: God of War 2, Tekken 5, Soul Calibur 3. But I wasn't really playing all that often; I was busy with... other pursuits. I didn't really play many games between 2007 and 2009, but in '09, I finally built myself a new PC, and was playing Neverwinter Nights 2 and WoW. WoW did its thing for a while, then I got Crysis, Killing Floor, AvP 2010, and a few other shooters to break up the WoW monotony.

Once Starcraft 2 came out, I got it and played a bit, got sidetracked by League of Legends, smashed through DA2, shelved Crysis 2, and now I mostly just watch competitive SC2 when I'm not casually playing whatever.

Out of all of those, I'd say the most memorable, and my most loved, would be: The Burning Eye/Rebirth of Arda, followed by Nethack, Tie Fighter, Golden Sun, and Starcraft 2. Even though I barely play SC2, and I suck at it, I still love it. The games are fast paced, the casters are entertaining, and the community is excellent. I spend a lot of time over at Teamliquid.net, particularly on the Tech Support board there.

I hope this reminds some of you about all those oldies but goodies, the games that shaped you and your gaming tastes. It's fun to look back on it now, and try to figure out which games developed my gaming interests. I hope this wasn't too excessively long or boring, but for me, it's been a fun trip down memory lane.