Insomnia | Commentary

The Nuts and Bolts are as Important as the Ones and Zeros

By Alex Kierkegaard / March 19, 2008


In a previous commentary, I wrote: "The hardware -- the nuts and bolts, so to speak -- is not as important as game journalists and people who post on online message boards would have you believe. [...] eventually the console will become like a DVD player, and we'll all be able to concentrate on the experience of playing games, instead of obsessing about the systems they run on".

Today I am going to ostensibly change my tune, and talk about why hardware should be very important, at least to the discerning gamer, and why we should all pay much more attention to it than we currently do (and -- why not? -- even obsess about it a little).

I say "ostensibly" because there's no contradiction between these two positions; it comes down to semantics. In the first instance I was using the term "hardware" to refer specifically to consoles, whereas I am now using it in a much broader sense, to encompass things like TVs and monitors, controllers and sound equipment, arcade cabinets, converters, modchips and all sorts of cables -- generally, all that expensive junk that sits on our desks and "entertainment centers", covers our floors, or fills up our closets and storage areas.

My position remains that, in order to get the most out of gaming, comparisons between consoles as to which ones are "better" are not really useful (since you'll need to buy or emulate all of them anyway). However, as I will now endeavor to explain, general hardware knowledge is indispensable.

A medium ever-reliant on technology
It is important to realize that electronic games, in contrast to other kinds of games, have, by definition, always been reliant on technology, and always will be. To put it simply, without hardware there can be no games.

In fact, since the early '90s, innovations in the development of computer systems (a term which also includes consoles and arcade systems, which are effectively computers lacking keyboards and user-friendly operating systems) have oftentimes been driven by the insatiable hunger of game developers for more and more processing power. PC gamers, who are forced to more or less continuously upgrade their machines, know this very well, as do microprocessor manufacturers such as Intel and AMD, which for well over a decade have been expending considerable resources in an effort to woo them over to their respective product lines.

The quest for power has always been partly about jazzing up the graphics of course, but to view it simply as such would be missing the point. The real benefit of more advanced technology is that it enables developers to create new kinds of games. Metal Gear Solid and Katamari Damashii, for example, would not have been possible on 16-bit consoles -- not only because those consoles could not have coped with the necessary calculations, but also because they lacked the high-volume storage of optical drives and the versatility of modern controllers.

And when the extra power or new features of the latest technologies are not used to create totally new experiences, they are used to evolve existing genres and take them in new directions. Herzog Zwei, the first real-time tactics game, was developed in the late '80s to run on the 16-bit Mega Drive, but massive-scale RTT and RTS games such as Creative Assembly's Total War series or Chris Taylor's Supreme Commander would not have been possible without the power of today's computer systems.

So this is what additional processing power allows us to accomplish. But what do ever-larger displays and higher-fidelity, multiple-speaker sound setups give us?

This is a funny question, if indeed anyone is wondering about it while reading these words. And I say it's funny because anyone who has ever chosen to see a movie in the cinema instead of at home (and I presume this includes every single one of the readers of this website) knows the answer very well, though I suspect some of you know it only subconsciously. To clear this up once and for all, I'll let Pulitzer-winning film critic Roger Ebert do the talking:


From time to time I'll meet someone who was underwhelmed by "2001: A Space Odyssey". Because I consider it one of the great moviegoing experiences of my life, I ask them how they saw it. They invariably saw it on home video. Just as there are movies -- "Moulin Rouge" seems to be one -- that benefit from return visits via DVD, so there are a few movies that should not be seen that way -- not the first time, anyway.

Stanley Kubrick's masterpiece is above all a big-screen experience. To work, it must dominate and overwhelm the viewer. Its power resides in the immensity and emptiness of its images of outer space, and in the frailty of man, dwarfed even by his tools. The first awesome shot of the shuttle approaching the orbiting space station is humbling and exhilarating, one of those rare moments when you really do believe the movies can lift you into another dimension of experience.

...

Seeing "2001" on a big screen in 70mm is one of a handful of obligatory experiences during a film lover's lifetime.

...

I said I've seen the movie many times on 70mm, and so I have [...]. During all of those experiences, the film has never grown old or lost its power -- perhaps because it is not a narrative but an experience. Just as it doesn't matter how many times you have approached Venice by sea at dawn, or crept to the edge of the Grand Canyon, it doesn't matter how often you've seen "2001" on the big screen. It is one of the noblest and most awesome works of film.

Of film, but not of video. "2001" on a TV set is like the Grand Canyon on a postcard.


So there you have it: the world's foremost film critic is in effect telling his readers that to really get 2001: A Space Odyssey they have to watch it on a freaking MASSIVE screen, preferably in tandem with a very expensive sound system (he didn't say this last part, but he might as well have). So if Ebert can get away with this, you'll hopefully allow me to say that what Halo did more than anything else (and it did many other things) was to take the FPS experience away from desks and tiny monitors and put it on large TVs and decent sound systems. Moreover, given that the power of many electronic games (and especially modern ones) resides in our fascination with the images they create (in sharp contrast to most films, whose power resides in our fascination with the ideas they contain), one can clearly see that most games stand to gain a lot more from the use of expensive audiovisual hardware than most movies.

Of course with the theoretical issues out of the way we come face to face with the practical ones. The problem here is that, at least for the time being, videogame theatres do not exist. So while practically everyone can afford the eight or ten bucks for a movie ticket, a thousand dollars plus for a large screen or a projector remains out of the question for most people. This is something that can't be helped. It is a fact that games like Halo gain much from the use of such equipment, and it is also a fact that very few people will experience them that way. But this situation is only temporary. This is consumer electronics hardware we are talking about, after all. Prices are always falling, and doubtless the day will come when gigantic screens will be as ubiquitous as cellphones (i.e. everyone will have three). Until then, depending on your temperament, you can always get a job or rob a bank.

But there is a lot more to it than that
Naturally, as those of us who enjoy the odd handheld game while lying in bed know very well, not all games stand to benefit from extra screen real estate or surround sound setups, and besides, screen size and sound quality are not the only hardware issues that should concern us. What are some of the others?

A very important issue is that of controllers. It is an unfortunate fact that most people who play videogames do not realize that certain kinds of games work better with certain kinds of controllers -- or, in extreme cases, work only with certain kinds of controllers. In fact many kinds of games were invented only after the appropriate kinds of controllers became available (or, in other cases, the games themselves were the driving force that led to those controllers' invention), and the subsequent development of these games became thereafter linked with that of their respective controllers, in a process mirroring the development of many real-life sports. The power and speed of modern professional tennis would not have been possible with the flimsy wooden racquets of the '60s, for example, nor could anyone hope to get towed into 35-foot waves on an oldschool longboard and come out alive. Of course videogames rarely ever get as technical, and therefore as demanding, as real-life sports (though it's early days still -- give them a couple more decades and you'll see), so naturally their hardware demands are not nearly as stringent. Nevertheless, the fact remains that you can have a masterfully conceived, designed and executed game, and a willing and enthusiastic player, but if the wrong hardware is used the experience will be much degraded, or in some cases even outright worthless. This is one of the reasons why most long-time fans of a specific genre have so much trouble enjoying games that belong to a completely different one. Time and experience have taught them to only accept the best and most appropriate control methods for their preferred genre, and yet, as soon as they cross over into something new, they assume that just about anything at hand will do.

Of course, once you set down the road that leads to audiovisual and control perfection, you begin to develop an interest in an ever-increasing number of related topics, from the laws of physics that underpin everything, to the endless and highly technical comparisons between different kinds of cables, switchboxes, adapters and adaptors, et cetera, which to the uninitiated and the naive always seem somewhat pointless and pedantic. Taken as far as it will go, this road eventually leads to elaborate gaming rigs, home-made, custom solutions and improvisation, which can sometimes surprise us with wonderful, unexpected results.

And of course simply owning the right hardware for each genre of games -- or for each game, even -- is not by any means the end of the story. Knowing how to set everything up correctly, and how to adjust the settings to achieve the best results according to the demands of each scenario, is equally important. These, then, are some of the issues worth exploring in some detail, and there are many others, including tricks for bypassing region-locking techniques (which can open up whole new libraries of previously inaccessible games), video capturing methods (for recording and sharing your virtual adventures), and of course the field of emulation, inexhaustible and ever-growing in both range and importance.

Does all this sound like too much of a headache to you, dear reader? Well, it's not my fault you picked such a demanding hobby to mess around with. I guess it's never too late to go back to backgammon or gardening, or whatever the hell pathetic hobby you filled your spare time with before you began playing videogames. Alternatively, just keep playing whatever mainstream junk corporate journalists tell you to play, on whatever console or computer your mom and dad happened to pick up from the sales counter of the local supermarket, using whatever shitty cables and controllers you happened to find in the box. Then go on the first crummy online message board you come across, and start some spastic rant about why the only thing that matters in videogames is the gameplay. At the very least, you'll make a lot of new friends.

Newsflash: Ignorance is not bliss, dumbass -- it's stupid
In yet another previous commentary, I bitched and moaned about the complete and total lack of decent gaming-specific hardware coverage nowadays, not only in the specialist gaming press, but also on publications such as CNET and PC Magazine, which really should know better. But to truly demonstrate the depths of ignorance the average gamer dwells in these days, yet another example is in order.

Tim Rogers, my chosen example, is certainly not your average gamer. The man has been into games for at least two decades, if not longer, learned Japanese because of his love for Japanese video games, and eventually moved to Japan to pursue a career in game journalism, among other things. Since then he has helped build Insert Credit, one of the handful of gaming websites I consider worth reading, has become the best-known advocate of New Games Journalism (a way of writing about games which, though pointless and infantile, at least demonstrates that its true adherents are fairly interested in games), has written about games for a number of respected publications including Wired, Edge and GamesTM (respected by others, that is, not by me), and has long been a member of a number of videogame forums frequented by hundreds of gamers who consider themselves highly knowledgeable. If you are still not impressed with this man's qualifications, I'll personally vouch that, on the whole, he knows far more about games than almost any other corporate game journalist, not to mention he is far better educated and more intelligent.

Did you get all that? Good.

And yet he doesn't know why it should be considered a crime to play a game such as Gunstar Heroes on his 40-inch widescreen 1080p-native display. If you are thinking that perhaps he does know but simply chooses to use that display for reasons of convenience (perhaps he has no space in his apartment for a decent CRT, for example), think again. I remember back when the PS2 Gunstar Heroes: Treasure Box came out, he was going around telling everyone that the image quality on his Sony Bravia was out of this world, even going as far as to post pictures of the game in action to prove his point. And as far as those hundreds of videogame-expert friends of his were concerned, he certainly did prove it, since not a single one of them came forward to object to the whole travesty.

Dear reader, if I need to explain to you why a low-res 2D game will look like utter shit when stretched to a widescreen aspect ratio and then blown up to a resolution of 1920x1080 (not to mention all the while displayed through composite video!), then you clearly need to follow this website's new hardware section religiously. Trust me on this: Whatever you do, do not miss a single update.
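In the meantime, the arithmetic alone tells most of the story. Here is a quick sketch (the 320x224 native resolution and the 4:3 intended aspect ratio are my assumptions about the Mega Drive original, not figures stated above):

```python
# Back-of-the-envelope math for blowing a low-res 2D game up to 1080p.
# Assumed: 320x224 native framebuffer, art drawn for a 4:3 display.

SRC_W, SRC_H = 320, 224      # assumed native framebuffer
DST_W, DST_H = 1920, 1080    # 1080p panel

sx = DST_W / SRC_W           # horizontal scale factor
sy = DST_H / SRC_H           # vertical scale factor
print(f"scale: {sx:.2f}x horizontal, {sy:.2f}x vertical")
# The vertical factor is not an integer, so some source rows map to
# five panel rows and others to four: uniform pixel art comes out uneven.

intended = 4 / 3             # aspect ratio the art was drawn for
stretched = 16 / 9           # what a widescreen "stretch" mode shows
print(f"horizontal distortion: {stretched / intended:.0%} of intended width")
```

Non-integer scale factors plus a 33-percent horizontal stretch, before the composite-video smearing even enters the picture: that is what "out of this world" image quality actually amounts to here.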

The point is that, if this guy (who, if there could ever be such a thing as the "gaming lifestyle" (lol, etc.), would certainly be among its most iconic stars) is still not aware of the basic principles of operation of digital displays, then the so-called "beginners" or "average gamers" might have trouble installing a game on their PCs or turning on their freaking consoles without calling a helpline, for all I know. You might think I am exaggerating or being too hysterical about all this, but I am sure the average moviegoer would have said the same thing about Ebert when he called them "dummies" for watching 4:3 movies squashed on their widescreen TV sets. But then, hey, what do we know, eh? We are only the experts in our respective fields. I am sure you and your friends know much better.

So anyway, enough with the ridiculousness. There are a few people out there who know their shit when it comes to hardware, and it is my goal to try and get them to come on here and explain to the rest of us what we are doing wrong, and what we should be doing. I myself have amassed a fair amount of knowledge on the subject over the years, and certainly plan to put down in writing as much of it as possible. I know I've been promising to do this for years, but now, at last, it's for real. Until the first hardware update, then, let me recommend my era-defining 2006 article on the HDTV fiasco, which is required reading. If you haven't read it yet, this would be a good time to do so. There will be a quiz afterwards.