Commentary


PS3 and 360: not nearly as powerful as they should be

By Alex Kierkegaard / September 6, 2006


Consider this: when you are watching a movie on your TV you are seeing a standard resolution image; assuming you are in an NTSC country that would be 640x480i. When you are playing an Xbox/PS2/GameCube game on the same TV you are seeing the exact same resolution.


Which looks better,[1] the movie or the game?


The movie of course.


Now go out and buy an Xbox 360, an HDTV and Tecmo's Dead Or Alive 4. On your way back stop over at a Blockbuster and pick up a DVD for comparison -- say Zhang Yimou's 2004 movie Hero (a martial arts flick to go with the martial arts game). Go home, unpack and set everything up, and get ready for quite a shock.


Though DOA 4 runs at the insanely high resolution of 1920x1080i on your brand-new, top-of-the-line, exorbitantly expensive HDTV (you did remember to buy an expensive one, didn't you?), what you end up seeing is a far cry from the lush 640x480i glory of Hero.


[Screenshots: stills from Hero (hero_a.jpg, hero_b.jpg) alongside Dead Or Alive 4 (doa4_a.jpg, doa4_b.jpg)]

Compare the hair on the two ladies or the details in the clothes of the fighters. There is a huge gap in image quality, and though DOA 4 offers much higher definition, what it helps define is an obvious lack of detail. But how can this be? How can it be that even the tiniest, decade-old TV and any cheap-ass DVD player can put your 360/huge-HDTV combo to shame?


The answer to this question is that increasing the resolution doesn't help much, if the system pumping out the visuals isn't correspondingly powerful. It seems obvious once you sit down and think about it: you can blow up a Nintendo 64 game to whatever resolution you want and it will still look as primitive as it normally does -- you might get rid of some jaggies in this way but the game will not look fundamentally better. But there's much more to it than that.


A resolution of 1080i corresponds to 1,036,800 pixels, while 480i is only 153,600 pixels[2] -- DOA 4 is, in fact, crazy as it may sound, showing almost seven times as much detail as Hero at any given moment. So why does it look so much worse in comparison?


It's the effects, stupid

The truth is that, contrary to what Sony and Microsoft would have you believe, resolution is not the most important factor in graphics quality. As anyone who is into first-person shooters on the PC will tell you, the effects are much more important.


PC gamers have been able to switch resolutions with a few clicks for well over a decade; at the same time they've also been able to experiment with various detail settings: things like advanced shader models, anti-aliasing and bump mapping; and more recently with transparency supersampling, HDR lighting effects and subsurface scattering (I could be making these up and you probably wouldn't know the difference, but that's exactly the point). Because of this, they've long since realized a few things that escape many of the rest of us.


Whenever a new blockbuster arrives, be it Doom 3 or Far Cry or whatever, they have to make a choice between going for higher detail settings or higher resolutions. What they've come to understand is that, if you don't have enough horsepower to go for both, it's always preferable to jack up all detail levels to the max, rather than to go for the highest resolution possible.[3]


To convince yourself of this beyond any doubt, try the following. Get hold of Valve's Half-Life and Half-Life 2 games, install them on your PC, and start messing around with the video settings. You will quickly realize that no matter how high you jack up Half-Life's resolution -- try 1600x1200, for example -- it will never look anywhere near as good as Half-Life 2 running at even the lowest possible setting. The reason for this is that the Source engine of HL2 supports many more, and much more advanced, effects than that of the original game.


So we've established thus far that obscenely high resolutions are relatively unimportant. I will now explain why pursuing them above all else can deal a serious setback to the overall improvement of graphics quality on the next-generation consoles.


We've already seen that a 480i image is composed of around 150,000 pixels, whereas 1080i contains more than one million, and the PS3's much-touted 1080p resolution is in fact made up of just over two million pixels. Now the way graphics cards work is that they have to perform all necessary calculations for every single pixel shown on screen. This means that a 1080i image requires, in the best-case scenario,[4] approximately seven times more calculations -- i.e. processing power -- than a 480i one (because it has approximately seven times more pixels). Accordingly, a 1080p image requires almost fourteen times more processing power.
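

If you want to check this arithmetic yourself, here is a minimal sketch -- plain Python, used purely for illustration, with nothing in it specific to either console -- that reproduces the per-field pixel counts from footnote [2] and the rough seven-times and fourteen-times figures used above.

    # Back-of-the-envelope check of the pixel counts discussed above.
    # Interlaced modes draw only half of their lines per update (see [2]).
    MODES = {
        "480i":  (640, 480, True),     # width, height, interlaced?
        "1080i": (1920, 1080, True),
        "1080p": (1920, 1080, False),
    }

    def pixels_per_update(width, height, interlaced):
        lines = height // 2 if interlaced else height
        return width * lines

    baseline = pixels_per_update(*MODES["480i"])
    for name, mode in MODES.items():
        count = pixels_per_update(*mode)
        print(f"{name}: {count:,} pixels, {count / baseline:.2f}x the 480i workload")

    # Prints:
    # 480i: 153,600 pixels, 1.00x the 480i workload
    # 1080i: 1,036,800 pixels, 6.75x the 480i workload
    # 1080p: 2,073,600 pixels, 13.50x the 480i workload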


In other words, if you take any currently available Xbox game, which has been designed to run at 480i, and you try to display it at 1080i, you will need a system approximately seven times as powerful as the original Xbox. Similarly, to run a regular PlayStation 2 game at 1080p you will need a system fourteen times as powerful -- simply to run the same old game, with the same old number of polygons and the same old effects, at the higher resolution. If you want to double the amount of polygons and upgrade the graphics with newer, fancier effects, that will cost you extra. Say twice as much for a double improvement in overall visual quality. And now I have to ask:


Is the Xbox 360 fourteen times more powerful than the Xbox?


Is the PS3 twenty-eight times more powerful than the PS2?


Maybe they are -- I don't know, and neither does anyone else at this point. Multi-core architectures are not very well understood at the moment -- not even at an academic level -- so any comparisons right now will necessarily be vague and misleading (witness Microsoft's and Sony's empty war of words about which system is more powerful).


But let's assume for a moment that the PS3 is, in fact, twenty-eight times more powerful than the PS2. For games running at 1080p half of that power will always be spent in order to show fine, minuscule details that simply aren't there, because the remaining power isn't enough to produce them. Can you imagine the kind of effects, or the number of polygons, developers could utilize if they were allowed to design a PS3 game running at the normal 480i resolution? I am certain they could come very close to the level of detail seen in a DVD movie, like the aforementioned Hero. And I don't know about you, but I'd rather be playing something that looks like Hero on my old TV than Dead Or Alive 4 on one of these newfangled LCD or plasma monitors, most of which can't even display all the pixels the console is pumping out to them! (Don't even get me started on that subject.)


Not exactly a conspiracy, but close

If you are still wondering where I am going with all this let me make my position clear: I believe that the PS3 and the Xbox 360 are not powerful enough to handle true HDTV resolutions, and at the same time deliver the large variety of new effects necessary to approach photorealism. The quest for photorealistic graphics is, after all, the main reason for designing new consoles every four or five years -- once it is accomplished the console cycle will grow longer and everyone will start concentrating more on peripherals and human/machine interfaces (Nintendo simply jump-started this process with the touch screen of the DS and the remote of the Wii, because they didn't have the R&D budget to stay in the graphics game).


In view of the above, I believe that the extra processing power of the new consoles should not be wasted trying to push such a huge number of pixels around as is required for HDTV resolutions. The extra power should be used for higher polygon counts and more complex effects, so that standard resolution games can become indiscernible from standard resolution movies. Only when we have achieved this should we move to vastly more powerful hardware and higher resolutions.


Unfortunately, Microsoft is not allowing the developers to try this, and it seems that Sony has decided to follow its lead. So what we will end up getting instead is a flood of games like Ninety-Nine Nights, with fairly detailed characters perhaps, but barren, featureless backgrounds.


Of course, there will be better-looking games than Ninety-Nine Nights and even Dead Or Alive 4 by the end of this generation, but they won't be such a huge leap forward. Consider that Dead Or Alive 3 was a launch title for the original Xbox and is still one of the system's best-looking games. In fact, Ninja Gaiden is often cited as the best-looking Xbox game, and it only looks marginally better.


There is a fair chance that I am wrong in the previous statement, and that third- or fourth-generation PS3 and Xbox 360 games will end up being a huge leap forward compared to what we are seeing now. This will only come about if academics and game developers come up with algorithms that make efficient use of the multi-core architecture of the new systems. If this happens, no one can predict with any certainty what the net processing power gain will be. It could be merely significant, or it could be huge. I would love to see this, but I am not optimistic about it.


I want to point out that I am not against higher resolutions altogether at this point. There are valid reasons, mostly of a marketing nature, why Sony and Microsoft decided to push HDTV with such wild abandon. The latest console cycle happened to coincide with the beginning of a wider adoption of the HDTV standards, and it only made sense that the new consoles should support them. Digital applications and web browsing are awful at standard NTSC resolutions, and there are kinds of games -- strategy or text-heavy RPG titles, for example -- that can really benefit from the increased resolutions, effects be damned.


The mistake, however, was to force all developers to design for 720p minimum, regardless of the kind of game they are making, and to launch sophisticated marketing campaigns to convince consumers that the jump in resolution would amount to a corresponding jump in image quality. As I've explained, this last assertion couldn't be further from the truth.


You won't hear anything about all this from developers -- at least not on the record -- because they have to work with Sony and Microsoft on a daily basis, and they don't want to risk that relationship. Game journos are not much help, either. The days when they cared about hardware to the point where they'd count elephants in the SNES and Genesis versions of Street Fighter II are long past -- they are now expending their gray matter writing eloquent reviews instead.


But talk to anyone who knows their tech and they'll confirm what I am saying here. Progress in visual quality is being held back because of a policy cooked up by marketing people to suit their needs, not the developers'. In one sense this is not such a big deal; graphics aren't everything, and we'll get there eventually one way or another. What upsets me more is all the "ZOMG HDTV!" nonsense that is being poured into people's ears, while no one goes on the record to refute it.


Oh well. At least now you know.




[1] What looks better is subjective, of course. There are those who find that they prefer the look of Ninja Gaiden on the NES to that of the Xbox game of the same name. However, in this article I use the term 'better' to mean "closer to photorealistic" (that is to say, indistinguishable from a photograph). This is, indeed, "better," because if you have a system capable of rendering photorealistic graphics it can also render anything else you want, no matter how simple.


[2] For these calculations the number of lines (the second number) must be halved, because these are interlaced resolutions and only half of the lines are drawn in each field. So for 480i the multiplication is 640 x 240, and for 1080i it is 1920 x 540. For 1080p, however, which I will mention later on, the line count remains unchanged because it is a progressive resolution, so in that case the multiplication is 1920 x 1080.


[3] This is always true if your goal is to achieve the best image quality you possibly can. However, for gameplay reasons, especially in multiplayer situations, you are sometimes forced to trade off image quality for higher resolutions -- to make distant targets easier to pick out (in first-person shooters), or to increase the viewable area (in real-time strategy games).


[4] I say best-case scenario because the processing power required for certain effects -- some full-screen anti-aliasing algorithms, for example -- grows faster than the raw pixel count alone would suggest: with supersampling, the work done per frame is multiplied by the number of samples taken for each pixel, on top of the increase in resolution.
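

To make this concrete, here is a small illustrative calculation in the same spirit as the sketch earlier in the article. The 4x sample count is an arbitrary example value, not a claim about any particular console or game, and real costs also depend on memory bandwidth and the specific algorithm used.

    # Illustrative only: how supersampling multiplies the per-frame workload.
    def shaded_samples(width, lines_drawn, samples_per_pixel=1):
        # Full-screen supersampling shades the scene at samples_per_pixel
        # times the output pixel count before downsampling.
        return width * lines_drawn * samples_per_pixel

    base  = shaded_samples(640, 240)        # 480i field, no AA:        153,600
    hd    = shaded_samples(1920, 1080)      # 1080p frame, no AA:     2,073,600
    hd_ss = shaded_samples(1920, 1080, 4)   # 1080p frame, 4x SSAA:   8,294,400

    print(hd / base, hd_ss / base)          # => 13.5 54.0

In other words, under these example settings the jump from a plain 480i field to an anti-aliased 1080p frame costs several times more than the resolution increase alone would suggest.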