Saturday, October 15, 2011

Anti-Aliasing, GPU Murder: Justified, or No?

Anti-aliasing, multi-sampling, 8x, 16x? Analytical? FSAA? What the hell is this shit, why can't I max it, and what am I looking for that makes it worth turning my GPU into an EZ-Bake oven? It can be kind of annoying to beat your framerate into the floor for something that's pretty hard to spot unless you know what to look for.

Lucky for you, you don't need to look any further to find at least some of the answers. Simply put, AA is designed to keep angled lines on square pixels from looking like a staircase. Since each pixel can only have one RGB value at any given time, you can't just have it be half-and-half with two different colors. The way anti-aliasing fixes this, roughly, is to take extra samples for some or all of the pixels along the edge and blend the colors so each pixel reflects both sides of the line, making the transition appear smooth.
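
If you're the type who'd rather see it in code, here's a minimal Python sketch of the supersampling flavor of the idea: take a few sub-pixel samples, check which side of the edge each one lands on, and blend by coverage. The edge, the colors, and the 2x2 sample grid are all made up for illustration; real GPUs do this in hardware with smarter sample patterns.

```python
# Minimal sketch of the supersampling idea behind AA: sample each pixel
# at several sub-pixel points, ask which side of the edge each sample
# falls on, and average the colors. Everything here (the edge, the two
# colors, 4 samples per pixel) is a made-up illustration, not any real
# renderer's pipeline.

def inside_edge(x, y):
    # A hypothetical diagonal edge: everything below the line y = x
    # counts as the "shape" side.
    return y < x

def shade_pixel(px, py, fg=(255, 255, 255), bg=(0, 0, 0), grid=2):
    """Average a grid x grid pattern of sub-samples inside pixel (px, py)."""
    covered = 0
    for i in range(grid):
        for j in range(grid):
            # Sample at the center of each sub-cell of the pixel.
            sx = px + (i + 0.5) / grid
            sy = py + (j + 0.5) / grid
            if inside_edge(sx, sy):
                covered += 1
    frac = covered / (grid * grid)
    # Blend the two colors by coverage: a half-covered pixel comes out gray.
    return tuple(round(f * frac + b * (1 - frac)) for f, b in zip(fg, bg))

for px in range(4):
    print(px, shade_pixel(px, 2))  # pixels along a row crossing the edge
```

Run it and you get pure background on one side, pure foreground on the other, and in-between shades where the edge crosses the pixel. That in-between shade is the whole trick.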

One of the reasons this is difficult to spot for a lot of people is pixel density, which just refers to how many pixels are packed into a given area. A 22" 1920x1080 display has the same number of pixels to play with as a 30" 1920x1080 one. That means the bigger your display at a given resolution, the larger each pixel, which exaggerates the staircase shape on angled lines.
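
For the curious, the density math is dead simple: pixels per inch is just the diagonal resolution over the diagonal size. A quick sketch, with the display sizes being just examples:

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    # Pixels per inch: pixel count along the diagonal / diagonal length.
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

for size in (22, 27, 32):
    print(f'{size}" at 1920x1080: {ppi(1920, 1080, size):.0f} PPI')
# 22" comes out around 100 PPI; 32" drops to roughly 69 PPI,
# so each pixel is noticeably bigger.
```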

The fun thing here is that the better your pixel density, the less you need AA. You might need 4x or so at 22" 1080p, but at 32", you should be trying to max it to get lines that look similarly smooth. The different varieties and multiples involved, like 8xMSAA or 4xSSAA, are just different methods of deciding how many samples to take and how to turn them into the resulting pixel data.

[Three screenshots, captioned 0xAA, 4xAA, and 8xAA]

These three images were taken in 1080p from Dragon Age 2, DX11, highest quality everything, with the only thing changing in each shot being the level of anti-aliasing, as noted in the captions.

Now that I've explained anti-aliasing, spotting it should be easy, right?

I'm guessing a lot of you out there actually have to work really hard to spot it. Like I said: the smaller the pixel, the smaller the square, the smoother the line, even without anti-aliasing.

On my 32" TV, I can barely see the difference between 4x and 8x at desk viewing distance.
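
If you want to put numbers on that, what matters is the angle a pixel covers at your eye. Here's a rough sketch; the 30-inch viewing distance is my assumption (plug in your own), and the ~1 arcminute figure is the usual rule of thumb for normal visual acuity:

```python
import math

def pixel_arcmin(diagonal_inches, h=1920, v=1080, distance_inches=30):
    # Size of one pixel, in inches, then the angle it subtends at the eye.
    pixel_size = diagonal_inches / math.hypot(h, v)
    angle_rad = 2 * math.atan(pixel_size / (2 * distance_inches))
    return math.degrees(angle_rad) * 60  # degrees -> arcminutes

for size in (22, 32):
    print(f'{size}" 1080p: {pixel_arcmin(size):.2f} arcmin per pixel')
# 22" lands right around the ~1 arcmin acuity limit; 32" sits well
# above it, which is why jaggies are easier to spot on a big panel.
```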


So, now that we've gone over exactly how big a difference it makes, you can see that maxing this setting may not really matter for your overall gaming experience. Given the massive GPU horsepower needed for high AA with decent quality textures, it can be an expensive prospect to truly max every game that comes out.

But, if you can live without something you can't see or can barely see, you can generally get by with a lot less power. The exact difference will vary a good bit from game to game, but frequently, if you're getting slightly jerky FPS at 8xAA, dropping to 4xAA will get you fairly smooth, in my experience. That's a big performance difference for a barely visible change.

Now obviously, this is one of those things where everyone needs to draw their own conclusions. I'll keep using insane PCs, because getting performance out of them is half the fun. But for people wanting performance on a budget, check the AA settings used in benchmarks, and you might just find that a card with slightly shorter bars is perfectly fine, since those bars were measured at AA levels you can barely see anyway.

Practical DX11, What It Does, and What It Means

As we all know, the current generation of Microsoft's ubiquitous and ambiguous DirectX runtime is DX11. Just to make that sound less like techno-babble designed to keep software engineers in a job, DirectX is the overall software package for handling graphics in Windows. (There's also some sound stuff, but that's outside the scope of this article.) Since DirectX includes features for 3D gaming, it can be kind of important to gamers and enthusiasts to have hardware supporting at least recent versions.

Most games right now, mind you, only require DX9 to play, thanks to the PC's red-headed stepbrother: consoles. DX10, well, yeah, that got released at some point, and did some stuff, and nobody really gave a shit. DX11, on the other hand, has this wonky tessellation, which mostly sounds like it fell out of a science fiction novel, possibly as some really painful way to die.

Basically, to explain the parts of tessellation that make sense beyond "ooh, shiny" without two or three different degrees in graphics design, software, and who knows what else: it's just a different way of representing shapes. Tessellation, roughly, lets you break shapes down into smaller shapes, and its primary use is smoothing out certain kinds of detail.

Roughly, tessellation breaks shapes down into tiny triangles, which lets you apply what's called a displacement map. To picture it, do you remember that little plexiglass box toy with a whole lot of pins that slide up and down, so when you set it on top of something, the top becomes a 3-D representation of the object?

Roughly, that's what displacement mapping does in 3-D rendering. Not exactly, but it lets you take a shape and break it down into smaller shapes that can be pushed around like those pins. That way, as you get closer to something, the engine can render its way toward full detail, instead of trying to render every model and texture in full detail all the way out to max viewing distance.
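
For the code-minded, here's a toy sketch of the subdivide-then-displace idea in one dimension. The height function stands in for a displacement map and is completely invented; real tessellation works on triangle patches in GPU hardware, not Python lists:

```python
import math

def height(u):
    # Hypothetical displacement map, sampled at parameter u in [0, 1].
    return 0.1 * math.sin(u * math.pi * 4)

def tessellate(p0, p1, level):
    # Subdivide the edge p0->p1 into 2**level segments, then push each
    # new vertex up by the displacement map. More level = more detail.
    n = 2 ** level
    verts = []
    for i in range(n + 1):
        u = i / n
        x = p0[0] + u * (p1[0] - p0[0])
        y = p0[1] + u * (p1[1] - p0[1]) + height(u)
        verts.append((round(x, 3), round(y, 3)))
    return verts

print(len(tessellate((0, 0), (1, 0), 1)), "vertices far away")   # 3
print(len(tessellate((0, 0), (1, 0), 5)), "vertices up close")   # 33
```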

I hope this makes some sense; it all sounds smarter in my head than it looks once I type it, but I can't decide if lack of caffeine is affecting my reading or my writing. Maybe I'm just crazy. Or all of the above.

Don't get me wrong, this is a very basic and generalized explanation, which by nature will be somewhat wonky. If you want to read the fancy version, you can check it out here.

Ok, so, here's the question: what does this mean for me? Smoother surfaces and better poly detail, making things look better all around, for starters. Pretty simple, and nice. The second major thing you'll see out of it is models that look good up close without soaking up a ton of GPU resources at long range. That doesn't sound huge, but it allows an increased view distance, since the GPU can skip rendering so much detail far away, meaning less stuff pops out of thin air.
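
The "less detail at a distance" part boils down to picking a tessellation level per object based on how far away it is. Here's a sketch of that logic; every constant is a made-up tuning knob, not any engine's actual values:

```python
def tess_level(distance, near=5.0, far=100.0, max_level=6):
    # Full detail inside `near`, no extra detail beyond `far`,
    # and a linear falloff in between.
    if distance <= near:
        return max_level
    if distance >= far:
        return 0
    t = (distance - near) / (far - near)
    return round(max_level * (1 - t))

for d in (2, 20, 50, 90, 150):
    print(f"distance {d:>3}: tessellation level {tess_level(d)}")
```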

I hope this has cleared things up reasonably well. I know it's not that useful for everyone, but understanding graphics can help you understand what sort of system you personally need for the graphics quality you desire. I'm hoping to do some similar posts over the next few days, so stay tuned.

Friday, October 14, 2011

Sunbeam Rheobus Extreme Fan Controller, Half a Review.

So, the Sunbeam Rheobus. Great little toy. I've been using it on my rig for a while. Each of its six channels can handle up to 30w, letting it handle pretty much any fans you want to throw at it. It usually retails below $30, making it extremely affordable.

Now, on looks, there are a couple of things that are a bit hit or miss. The LEDs in the knobs run off the same variable resistor as the fans, which is amazing for telling where your fans are set at a glance, but they're so damn bright. That's only a minor issue unless you sleep in the same room as your PC and don't shut down at night, though.

Also, the front is glossy. At least it's not some fancy little LCD touch panel built for about three cents, so the front panel breaking is less of a worry, but the gloss looks really inconsistent in most black cases, since those tend to be matte. Except maybe a few really shitty plastic ones that would be vastly improved if duct tape and baling wire were included in the design.

That being said, it's nice. It's a little tight going into a 5.25" bay, but it fits without a hammer and the screw holes line up, so installing the panel itself is all good. The fan control wires are another story: the connectors are placed kind of awkwardly if you've got, say, an optical drive or the top of your case right above it. But that should only matter once.

Speaking of wires, it comes with a lot more than most controllers, but still not enough: two each of 3-pin extensions, 3-pin-to-Molex adapters, and 3-pin-to-3-pin cables with a mobo signal lead. It also comes with its own power cable, but that's neither here nor there. It pulls off the 12v rail via Molex, so it's got what you need to tickle bigger fans, although you do have to turn them up higher than you'd like to get them going from a stop.
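
If you're wondering what 30w per channel actually buys you on a 12v rail, the math is just power over voltage. Quick sketch; the 0.25A "beefy fan" figure is my own ballpark, so check your fan's label:

```python
CHANNELS = 6
WATTS_PER_CHANNEL = 30
RAIL_VOLTS = 12

amps_per_channel = WATTS_PER_CHANNEL / RAIL_VOLTS
print(f"{amps_per_channel:.1f} A per channel")           # 2.5 A
print(f"{CHANNELS * WATTS_PER_CHANNEL} W total budget")  # 180 W

fan_amps = 0.25  # assumed draw for a beefier 120 mm fan; yours may differ
print(f"~{int(amps_per_channel // fan_amps)} such fans per channel")
```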

Overall, I really like this thing, although I haven't used a whole lot of controller panels, so take that with a grain of salt. Photos shamelessly stolen from the manufacturer's website, but judging by the manual, they probably can't read this blog to complain.


Overall, this product rates an official JingleHellTech 
Non-Turd Award!

Thursday, October 13, 2011

AMD Responds to Bad Reviews.

You know, since the AMD FX CPUs haven't been on sale for 48 hours at the time I'm writing this, I have to wonder if they just went ahead and prepared this particular response in advance, since they knew they were releasing overhyped, overpriced paperweights.

But, in the interest of fairness, even though I'm sure I'm the least of their worries, I'm going to show their rebuttal and respond to it. After all, I hate when people can't be objective.


Our Take on AMD FX

by akozak

This week we launched the highly anticipated AMD FX series of desktop processors. Based on initial technical reviews, there are some in our community who feel the product performance did not meet their expectations of the AMD FX and the “Bulldozer” architecture. Over the past two days we’ve been listening to you and wanted to help you make sense of the new processors. As you begin to play with the AMD FX CPU processor, I foresee a few things will register:
In our design considerations, AMD focused on applications and environments that we believe our customers use – and which we expect them to use in the future. The architecture focuses on high-frequency and resource sharing to achieve optimal throughput and speed in next generation applications and high-resolution gaming.
Here’s some example scenarios where the AMD FX processor shines:
Playing the Latest Games
A perfect example is Battlefield 3. Take a look at how our test of AMD FX CPU compared to the Core i7 2600K and AMD Phenom™ II X6 1100T processors at full settings:
Map     Resolution (max settings)   AMD FX-8150   Sandy Bridge i7 2600K   AMD Phenom™ II X6 1100T
MP_011  1650x1080x32                39.3          37.5                    36.3
MP_011  1920x1200x32                33.2          31.8                    30.6
MP_011  2560x1600x32                21.4          20.4                    19.9
(Benchmarking done with a single AMD Radeon™ HD 6970 graphics card)
Creating in HD
Those users running time intensive tasks are going to want an AMD FX processor for applications like x264, HandBrake, Cinema4D where an eight-core processor will rip right along.
Building for the Future
This is a new architecture. Compilers have recently been updated, and programs have just started exploring the new instructions like XOP and FMA4 (two new instructions first supported by the AMD FX CPU) to speed up many applications, especially when compared to our older generation.
If you are running lightly threaded apps most of the time, then there are plenty of other solutions out there. But if you’re like me and use your desktop for high resolution gaming and want to tackle time intensive tasks with newer multi-threaded applications, the AMD FX processor won’t let you down.
We are a company committed to our customers and we’re constantly listening and working to improve our products. Please let us know what questions you have and we’ll do our best to respond.
Adam Kozak is a product marketing manager at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites, and references to third party trademarks, are provided for convenience and illustrative purposes only. Unless explicitly stated, AMD is not responsible for the contents of such links, and no third party endorsement of AMD or any of its products is implied.

For starters, I'd buy the bit about design considerations based on what users need or will need in the future a bit more if FX were being marketed as a professional CPU, or a processor for college kids, and either way priced a good bit lower. When you try to sell it as a consumer CPU, it needs to not be a step backwards in the tasks people are doing right now. Like gaming, perhaps. Yes, it did well in a few games, generally the ones that are threaded the way software should be these days. I don't think any enthusiast will argue against better threading in games, but that software mostly doesn't exist NOW, which is when FX went on the market. And frankly, since the majority of games are GPU-bound anyway, better threading would only help a portion of the market.

Generally, I look for inconsistencies when I read stuff like this. Notice, they show exact bench FPS from their tests in BF3. Talking about encoding, they just comment on how well threaded encoding tends to be. Why, exactly? Well, because in encoding the FX-8150 merely keeps pace with higher-end Intel. Which is soon to be surpassed by newer Intel. That kind of sucks for AMD.
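
Just to put numbers on how thin that BF3 "win" is, here's the margin computed straight from the table AMD posted above. The FPS values are theirs, not mine:

```python
# (FX-8150, i7 2600K, Phenom II X6 1100T) average FPS, from AMD's table.
results = {
    "1650x1080": (39.3, 37.5, 36.3),
    "1920x1200": (33.2, 31.8, 30.6),
    "2560x1600": (21.4, 20.4, 19.9),
}
for res, (fx, i7, x6) in results.items():
    print(f"{res}: {100 * (fx / i7 - 1):.1f}% over the 2600K, "
          f"{100 * (fx / x6 - 1):.1f}% over the 1100T")
# Every "lead" is single digits, in a GPU-bound test on one HD 6970.
```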

Now don't get me wrong, FX isn't technically bad. It's just horrendously overpriced. If the 8150 sat at a price point with multiplier-locked i5s like the 2300 or 2400, and the motherboards were a bit cheaper, it would be an outstanding buy for several types of users. I could see college students doing video or software work tossing those tasks to six cores and doing light gaming on the other two. Cheap streaming for e-sports types would also be plausible.

So, sorry, AMD, but I'm not buying the statement. No, they aren't technically as terrible as they're getting made out to be, but between the hype and the pricing at the retail channels, this is a joke.

Defending AMD FX: A Losing Battle.

So, since yesterday's release of AMD FX CPUs, including the FX-8150 octo-core, there have been many discussions online, referencing benchmarks, value, quality, and performance. As can be expected, there are actually still some fanboys dumb enough to think they have a chance of defending these piles of shit. So, in the interest of... fairness, I've decided to discuss some of the... reasoning.

Excuse 1: It's not technically an 8 core CPU blah de blah.
As much as I normally love being technically accurate, there is such a thing as having shit for brains, and this excuse shows us the people who do. AMD advertised the FX-8150 as an 8-core CPU, so why shouldn't I judge threaded performance by the standard of 8 cores? Are you accusing the company you love of fraudulent advertising?

Excuse 2: It's really only intended to be a server architecture though!
Sorry, it comes down to the same damn thing. AMD spent a fortune sponsoring IGN Pro League and shoving their AMD FX ads down our throats, trying to convince people that an 8-core CPU was good for gaming. You market it for gaming, you get benched on gaming performance. Not our fault it fell completely flat on its face. Most people using Intel would prefer that AMD could compete; it would be awesome for CPU prices.

Excuse 3: Well, it's worse clock for clock, but it overclocks way higher!
Sadly, this means nothing to the vast majority of users. Why? Because, during testing, Anandtech found that the FX-8150 can only hit the same clocks on air as a 2500k or 2600k. It goes a bit higher on water, but quickly ends up needing extreme cooling. And sorry, the clock you could hit if you kept liquid nitrogen hanging around doesn't mean a whole lot if you don't.

Excuse 4: It isn't working well with Windows yet!
Yes, we know, you've run that one into the ground. Unfortunately, the tasks it does worst in are less-threaded tasks, like gaming, where it has to try to compete with Phenom II, its utterly obsolete predecessor.

Excuse 5: But it does compete with Sandy Bridge on multi-threaded tasks!
And it wants a cookie, I assume? For the vast majority of consumers this means... oh yeah, nothing. AMD FX is replacing Phenom II in bad product placement: the poorly priced, late-to-the-game CPU that you might get if you really need physical cores and don't want to pay for better hardware that costs anywhere from slightly less to slightly more.

Excuse 6: All those benchmarks are biased, and not representative of anything.
On the "not representative" part: welcome to benchmarks, jackass. Find a good analog for performance that gets benched frequently, or bench things yourself. As for bias, I really doubt all the reviewers were so biased that they held a damn conference at some executive resort in the Swiss Alps just to plan out how BD would do on each test.

So, AMD Fanboys, for your tenacity in the face of logic, for your stubborn pride and arrogance in the face of benchmarks, and for your inability to listen to reason, I salute you. Someone has to keep AMD in business so Intel can't really monopolize the market.

Wednesday, October 12, 2011

What the Hell, AMD, or: Fanboy Tears, Tonic for the Soul.

Bulldozer got released. Many people willing to think critically and be objective have been just a bit skeptical, due to delays, leaked benchmarks, and the fact that AMD basically hasn't turned out a CPU that was actually worth a damn in years. All the speculation peaked in recent weeks, when AMD released a YouTube video bragging about an overclock record (that they didn't bench), talking like clock speed was the be-all, end-all of performance. Shortly after this, they released an ad that played way too many times during the IGN Pro League finals.

Skepticism and mockery of the way they've been verbally dancing around performance, along with all the other factors, had people eager for vindication after the debates and arguments.

Well, folks, with the actual release of the AMD FX CPUs, they've finally lifted the NDA. Kind of late, yeah? Well, you can hardly blame them. Even in the tests it does best in, the FX-8150 comes out only slightly ahead of the i7 2600k. Twice the physical cores, a higher stock clock, and the best results are slightly ahead of i7. The worst results, in applications that can't benefit from extra physical cores? Well, in those, it's mostly struggling to compete with the more recent Phenom II offerings. And those were obsolete the day they got released. Owch.

Here's the intended lineup:

  • FX-8150: Eight cores, 3.6 GHz CPU base (3.9 GHz Turbo Core, 4.2 GHz Max Turbo), $245 suggested retail price (U.S.)
  • FX-8120: Eight cores, 3.1 GHz CPU base (3.4 GHz Turbo Core, 4.0 GHz Max Turbo), $205 suggested retail price (U.S.)
  • FX-6100: Six cores, 3.3 GHz CPU base (3.6 GHz Turbo Core, 3.9 GHz Max Turbo), $165 suggested retail price (U.S.)
  • FX-4100: Four cores, 3.6 GHz CPU base (3.7 GHz Turbo Core, 3.8 GHz Max Turbo), $115 suggested retail price (U.S.) 
Now I don't know about you, but I don't see these prices as being all that interesting for the majority of consumers. Or even close. The FX-8150, in gaming performance, falls on its face, competing with Phenom II. This puts it well behind a cheaper i5 2500k, which OCs to about the same clock on air as an 8150... while being faster clock for clock.

I'm sure some enthusiasts will go nuts trying to OC these things, but in my opinion, just wait for Intel's new enthusiast socket. It won't suck.

Hopefully in the next couple of days I'll compile enough things fanboys have to say in defense of FX to have a reasonable post ripping into that.

Monday, October 10, 2011

How Not to Suggest PC Components.

So it strikes me that there are a lot of people who do a bad job of suggesting components, from "reviewers" to forumgoers, and everywhere in between. The reasons are usually obvious if you already know the product is bad or inappropriate for you, or if you can read between the lines well, but what if you can't? You might end up wasting money without even knowing it.

So what are the reasons for bad suggestions? I generally list ignorance, fanboyism, fantasy building, malice, and feeling honor-bound to suggest certain components. Ignorance is fairly obvious. Sometimes people just don't know what they're talking about, whether it's outdated information or something anecdotal they took on faith. Not much needs to be said about this, beyond two major points: be willing to admit you were wrong, and try not to be ignorant.

Fanboyism, however, due to its close similarity to brand loyalty, deserves some discussion and definition. What is fanboyism? Fanboyism is brand loyalty taken to an extreme, beyond reason, causing prejudice, inaccurate suggestions, and belief in the face of contrary evidence. Brand loyalty, by contrast, is preferential treatment for a brand based on personal reasons, and it usually doesn't come out in suggestions as a strongly biased arguing point. Example: I've got brand loyalty to EVGA. Awesome customer support, excellent warranty, and most of their products are well thought out and designed in a way that benefits my use. I'm not going to suggest them to someone who doesn't need those things, though, and since their new motherboard team isn't quite up to my standard for EVGA, I wouldn't suggest their motherboards right now.

Fantasy building is what I call it when people try to get someone else to build the PC they wish they could build. This will come out in suggestions of multi-GPU, flagship motherboards, and top-end CPUs for people who would be perfectly happy with less, and not even use it in a way to see a difference. This is bad, because you waste people's money doing it, and that's kind of a dick move. This almost pisses me off more than ignorance, just because ignorance will generally get worse performance for a price point, but fantasy builds will convince someone to drastically overspend.

Malice. How can you give someone malicious advice on building a PC? Basically, by doing the opposite of fantasy building. You don't want that guy to have better than you, you don't want to suggest the stuff you wish you had, so you suggest worse. Congratulations, if you do this, you're in the running for asshat of the year. Die in a fire. Don't screw someone else over out of jealousy.

Finally, there's the honor-bound people. This most frequently applies to reviewers, bloggers, and fans of certain things being supported by a brand. A good example: some of the bigger sponsors for competitive SC2 are AMD, Kingston HyperX, and ibuypower. But over at teamliquid.net, the best community site for competitive Starcraft, on the Tech Boards, we refuse to suggest stuff made by these and other big sponsors if it's bad, not cost efficient, or otherwise wrong for the buyer. Reviewers and bloggers will frequently rave about the quality of a product, and look for ways to make it look good, in hopes of getting more stuff to review.

Guess what: PC components can almost always be judged objectively. Preference on certain factors will be subjective, but performance, quality, warranty, and support are just plain facts. Misrepresenting them for any reason, particularly to kiss ass, is just stupid. If you misrepresent them too badly, you'll leave people with a sour taste. If you show the actual quality, and then leave the value of supporting organization X up to the prospective buyer, everybody ends up happy.

AMD's Bulldozer, Bad Signs for CPU Prices.

So the ads during IGN Pro League SC2 finals yesterday got me thinking, along with various discussions I've been a part of, and different promotional stuff. AMD is, possibly, about to finally release the Bulldozer FX CPUs, starting with their high-end offerings, including some Octo-Core stuff. Now there's some stuff that sounds promising for overclockers and benchmark fans alike, but there's a lot more that scares the piss out of me.

For starters, we're almost six months late getting these CPUs now. Now obviously, in the chip world, this sort of thing isn't a shock on its own, but combined with some other factors, it makes me think the delays are due to being completely incapable of competing with current (or even recent) Intel offerings, sticking with the trend of the last few years.

Then, of course, there's the video AMD released, talking about setting a world record for overclocking. Now this should be awesome for enthusiasts, right? Well, except for a quote around the 45 second mark. "We're not running any benchmarks, we're just shooting for the highest CPU-Z." Sorry, but in my humble opinion, a clock you don't run shit at is kind of pointless. I can get all kinds of random ass numbers in CPU-Z and have my PC bluescreen 2 seconds into any stress test or benchmark under the sun.

That brings me to another point. Who in hell brags about clock speed if their CPU is actually better than the competition's? No, they'd run benchmarks, even if they were benchmarks specifically favoring their hardware due to the number of physical cores or whatever else, and announce the "Fastest Desktop CPU Ever*" "*large quantity of small print defining fastest to the point that it's useless." But then, there's the stuff people have leaked.

Using Google Translate (sorry) on (purported) leaked benchmarks: ". In this case the Intel Core i7 2600K to run at 3.4GHz base and could climb up to 3.8GHz was used when one thread, while the AMD FX-8150 worked at 3.6GHz can climb same conditions up to 4.2GHz." In other words, running stock vs. stock, even in tasks like Handbrake encodes, the 8150 and its 8 physical cores were barely keeping up with a 2600k and its Hyper-Threading. Owch. The rest of them aren't any better, with the 3DMark physics score coming in below 80% of the 2600k's.


All this is only confirming what I've been saying for a while. When you've been getting smashed in almost every performance category for as long as AMD has (not price/performance, which I'll admit they took at certain points), you don't withhold a product for 6 months if it's looking like it might compete. You certainly don't withhold a product that can set overclocking speed records, unless it can't compete. 


Sorry, AMD, looks like Intel has nothing to worry about any time soon, and you've sealed Intel's ability to price wherever the hell they want for the foreseeable future, again. And I'm sure some people think I'm an Intel fanboy at this point. Sorry, I'm not. I'm a performance fanboy, and I'd LOVE for Intel to have some competition and have to consider price points for once. So please, feel free to prove me wrong.

Sunday, October 9, 2011

First-time PC Builder FAQ

This FAQ is only going to apply after you've picked parts. I'm assuming you're reading this after either asking on your forum of choice, or getting a list from the family member who works in IT and thinks workstation components are good for gaming rigs, and you wouldn't listen to me on what to use anyways. Besides which, picking components is all about deciding what you want out of your PC and buying the best parts for that, which is mostly about prices and benchmarks.

Source X (includes GPU manufacturer) says I need a higher wattage PSU than this, will it work?

Well, I can't answer for any configuration without making one up, which would defeat the purpose, but keep in mind that GPU manufacturers have to account for all kinds of PC configurations and low-quality PSUs. A high-quality PSU in a typical configuration will generally work fine well below the "minimum wattage." Bear in mind that some PSU calculators exist to sell products, and others have to be used properly or they aren't helping at all. For example, the calculator at extreme.outervision.com, used correctly, gives the PSU wattage you'd need for your PC at full load to sit at the load percentage you designate; it doesn't tell you the wattage your PSU will actually use under load.
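
To make that concrete, here's the kind of math a decent calculator is doing under the hood. Every component draw below is an invented placeholder, so substitute figures for your actual parts:

```python
draws_watts = {
    "CPU at full load": 125,
    "GPU at full load": 250,
    "Motherboard, RAM, drives, fans": 75,
}
total = sum(draws_watts.values())
target_load = 0.60  # aim to run the PSU at ~60% of its rating at full tilt

print(f"Estimated full-system draw: {total} W")
print(f"PSU rating for {target_load:.0%} load: {total / target_load:.0f} W")
# That's the number the calculator spits out -- NOT what your PC pulls
# from the wall, which is closer to the `total` line (plus PSU inefficiency).
```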

How hard is it really?

Assuming you can read and have the ability to match rather specific shapes, it's pretty simple. Yes, you need to be moderately careful, but as long as you take your time, read instructions, and pay attention, it's pretty hard to screw up, especially in a way that causes permanent damage.

I was installing my Intel CPU, got resistance, and heard a crunch. Did I break it?

As long as you lined up the notches, no, you didn't break it; the weird grindy pressure noise thingy is normal. It's the tactile response version of hearing fingernails on a blackboard, but it's ok.

My PC won't boot, and I'm hearing a bunch of beeps, what's wrong?

Well, the specifics depend on your motherboard, but that's called beep code. If you either look in your motherboard manual, or google your motherboard's model number and the words "beep code", you can find a translation, and it will help you troubleshoot. It's usually a sign you installed something slightly wrong, or have a DOA (Dead on Arrival) CPU or RAM stick, but it can mean other things.

My PC seems to boot, but the screen stays black, what's wrong?

There are several common causes for this issue. One of the more frequent problems is plugging the video cable into the motherboard's video out when a discrete GPU is installed. Another cause can be PCIE power cables that are missing or incorrectly plugged in. A poorly seated video card can also be the issue.

I got my computer to boot, and installed my OS, but now I can't connect to the internet and nothing is working, what did I do wrong?

Look in the box your motherboard came in, and find the CD. You just need to install chipset and ethernet drivers. No big deal at all.

I got an SSD, what should I put on it?

Your OS and anything you want to load at startup. PDF readers and office software are also good, as they tend to load slowly. Games with lots of single-player loading time are ok, but for multi-player games there's no real point, as your loading will be restricted by the slowest loader anyways, and he invariably has a massively fragmented USB 2.0 5400 RPM external HDD.

Do you suggest a particular guide?

I personally direct people to the Hardware Canucks video guide. Other people like other ones, but a lot of the other ones I see suggested come from etailers, and I refuse to send people to a resource provided by someone with a conflict of interest.