Cost's Two Cents: The Graphics Asymptote

How much better can we make our games look?

NOTE: Articles labeled "Cost's Two Cents" are NOT intended to be cited or interpreted as factual pieces like the other News on this site. These are my opinions, observations, and the like, often with some facts sprinkled in. Enjoy.

The video game industry has been around long enough that bystanders with sufficient interest and an ability to make connections can begin to spot patterns and predict where the medium is going. Since I'm not qualified to give an in-depth analysis of every trend video games have undergone in the span of one ~1000-word article, I'm going to home in on one enduring and prevalent phenomenon: console makers consistently trying to outdo themselves and each other on graphics, more than on just about any other facet of the product.

Back in the 1980s, when the NES and Super Mario Bros. took the world by storm and secured a niche for the industry beyond the arcades and in the home of the average Joe, the visual competence of video games was extremely limited compared to every other medium of the time, save for the old-fashioned printed word. Comics could run detailed illustrations, movies could wow audiences with (at the time) astounding special effects, but games could only push a handful of moving sprites around the screen before flicker and slowdown set in. With a starting point like that, it's no wonder console developers focused on fixing the most obvious flaw in their budding medium, and consequently competed to see who could fix it better.

This led to the waging of the "bit wars," fought over the graphical capabilities of the competing home consoles of the day. Some of these machines and their attachments were even named after the bit-width of their processors, the most notable examples being Sega's 32X add-on and the Nintendo 64. Needless to say, the fight was fierce, and the over-the-top commercials made to show off visual superiority over the competition illustrate just how important looking good was at the time.

While the bit wars may have "ended" around the time of the Xbox and GameCube, anyone who's taken a cursory glance at an E3 press conference or an IGN forum knows that stellar graphics are still sought after and boasted about ad nauseam. And yet most people can't seem to decide who's telling the truth.

The difference in visual quality between consoles in the 1990s was, for the most part, well defined; most people could see that the N64 (for the most part) looked better than the PlayStation, that the SNES looked better than the NES, and that watching a man give birth looked better than the CD-i [hey, I owned one of those -ed]. However, around the previous generation - that of the PS3/360/Wii - those lines started to blur, so much so that debates were waged over which console actually had better graphics, and the arrival of the PS4/XBONE/Wii U generation has expanded the debate to whether graphics have improved much at all over their predecessors. People simply aren't being wowed by visual innovation like they used to be; there are no clear boundaries anymore, and the truth is obscured and abstract. Some think it's because we've become spoiled. Others think most of the confusion comes from angry, nostalgia-blinded gamers of the bit-wars era. I, however, have a different theory.

Within the realm of psychology there is an amalgamation of theories on sensory perception known as the Weber-Fechner Law. Without going into anything too convoluted, the law can be summed up like this: the smallest difference a person can reliably notice between two stimuli grows in proportion to the intensity of those stimuli. Say you can just barely tell a 100g weight from a 105g one - a threshold of 5g. Double the intensity and the threshold doubles with it: you can no longer tell 200g from 205g, but you can once again tell 200g from 210g. Now, you may wonder what this mumbo-jumbo has to do with something as simple as how nice a console's visuals look, and understandably so. Let me elaborate.
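
If you'd rather see the idea in code than in grams, here's a minimal Python sketch. The 5% "Weber fraction" is an illustrative value picked for this example, not a measured constant:

    # Toy model of Weber's law: the just-noticeable difference (JND)
    # grows in proportion to the baseline intensity.
    WEBER_FRACTION = 0.05  # illustrative value, not a measured constant

    def noticeable(base, other):
        # True if the gap from base clears the JND at that intensity.
        return abs(other - base) >= base * WEBER_FRACTION

    print(noticeable(100, 105))  # True:  5g on top of 100g is just enough
    print(noticeable(200, 205))  # False: the same 5g vanishes at 200g
    print(noticeable(200, 210))  # True:  the threshold doubled with the weight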

People were astonished by graphical innovation in gaming's early years because the jumps in the number of bits on the label kept clearing an ever-doubling threshold: 8-bit to 16-bit (a threshold of 8 bits), 16-bit to 32-bit (now a threshold of 16 bits), and so on. I'll bet that a theoretical 48-bit console in place of a 64-bit one would have had far less effect on the public than the 8-to-16 jump did. These jumps all ended with the introduction of 64-bit consoles, seeing as every console since has just advanced the 64-bit formula. In essence, the threshold keeps growing geometrically, while our attempts to clear it are leveling out toward a theoretical asymptote: the point where advancements are so minute they're invisible to the human eye, or at least unremarkable enough to ignore entirely.
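
For a back-of-the-envelope version of that argument - and it is a simplification, treating the advertised bit count as the "intensity" - what matters under Weber-Fechner is the ratio of each jump, not the raw difference:

    # Score each console-generation jump by its ratio. Under Weber-Fechner,
    # equal ratios feel equally big, and anything under 2x feels smaller.
    for old, new in [(8, 16), (16, 32), (32, 64), (32, 48)]:
        print(f"{old}-bit -> {new}-bit: {new / old:.2f}x")
    # The first three are clean 2.00x doublings; the hypothetical jump to a
    # 48-bit console is only 1.50x, which is why it would have underwhelmed.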

If you want to see this effect for yourself, find an HD video on YouTube. Set the quality to 240p (I know, it hurts), then step up to 360p and then 480p. The jump from 240p to 360p is obvious; the step from 360p to 480p is noticeably less dramatic. Now look at the difference between, say, 720p and 1080p. There's a tiny shift in quality, but nothing compared to the jump between 240p and 480p. The exact same thing is happening to games; adding a few more pores to a character's face is nothing compared to seeing Super Mario Bros. 3 next to Super Mario World for the first time, even with nostalgia removed from the equation entirely. We are leveling out; our jumps are shrinking to hops, our hops to skips, and so on.
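
The YouTube test falls right out of the same arithmetic. Counting pixels per frame at the (roughly) 16:9 frame sizes YouTube uses:

    # Pixels per frame at common 16:9 resolutions.
    res = {"240p": 426 * 240, "480p": 854 * 480,
           "720p": 1280 * 720, "1080p": 1920 * 1080}
    print(f'240p -> 480p: {res["480p"] / res["240p"]:.2f}x the pixels')    # ~4.01x
    print(f'720p -> 1080p: {res["1080p"] / res["720p"]:.2f}x the pixels')  # 2.25x

Quadrupling the pixels reads as a transformation; a bit over doubling them reads as a touch-up.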

The simple truth is that we can no longer be impressed by graphical advancements the way we used to be, yet companies are still basing much of their marketing strategy, and betting considerable resources, on the notion that we can.

So I ask: where do we go from here? Have we reached the pinnacle of graphical capability in video games? I'd love to see some discussion in the comments below, but as for me, I say that depending on graphics to sell your console is becoming less and less sound as a marketing strategy, and that we may, within our lifetimes, see a convergence of visual quality across the board. Perhaps then the focus will fall entirely on the games themselves.

Here's to wishful thinking.


"It ain't bragging if you've done it. There's nothing wrong with being proud of doing something well. In fact, if you intend to do something creative for a living, it's absolutely essential." - James A. Owen

Follow me on Twitter for updates on articles, artwork, and podcasts, all in glorious 64-bit! @NoBeanChiliCost

Griffin Cost
