Editorials

720p? 1080p? ESRAM? Why it matters and why it doesn’t

The press has been foaming at the mouth for the last few weeks, spewing venom over resolutions, rumors about memory, and miscellaneous falsehoods and conjecture. I’m not about to sit here and pretend that none of these things matter; they do. But to explain why, you need to understand the technical pieces that make up the differences.

Below is a picture. Without right-clicking on it to check its resolution, can you readily tell what that resolution is? Now how about the picture below it?

[Image: Battlefield 4 at 720p, Ultra settings]

[Image: Battlefield 4 at 1080p, Ultra settings]

Without putting them side by side, can you tell me which looks better? It’s not hard to pick the cleaner of the two in a still image, but in motion the differences become far less clear. The fact of the matter is simple: unless you have two copies of a game running side by side, the difference between the two resolutions is all but imperceptible to many people, and any comparison depends on a great many things beyond the console the game runs on, including upscaling and the resolution of the HDTV it appears on. Let’s dig into the details that make some of these comparisons even muddier territory.

Size matters. Speed matters. i matters.
Making the situation more complicated, your shiny new console’s power may be somewhat wasted on anything less than a 46” TV. Suggested viewing distances are also something many people overlook when selecting a television. Let’s stir the mud a little further: do you happen to know the refresh rate of your TV? If it’s 120Hz or less, you might be shooting yourself in the foot as a gamer before you ever get to the conversation about resolution.

Refresh rate is the number of times per second your TV redraws the image sent to it. At 720p (the p stands for progressive), the image is drawn one line at a time, in sequence: line one, then two, then three, and so forth. At 1080i (the i stands for interlaced), the image is rendered at a higher resolution, but only every other line is drawn in each pass. On the first refresh, lines one, three, five, and so on are drawn; on the second refresh, lines two, four, six, and so on. The more often the image is refreshed, the cleaner the picture on your screen, so a faster refresh rate is always going to be better for gaming.
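If the interlacing bit is hard to picture, here’s a tiny sketch (Python, purely for illustration) of the order in which scan lines get drawn in each mode:

```python
# Illustration only: the order scan lines are drawn.
def scan_order(lines, interlaced=False):
    if not interlaced:
        return list(range(1, lines + 1))      # progressive: 1, 2, 3, ...
    odd = list(range(1, lines + 1, 2))        # first pass: odd lines
    even = list(range(2, lines + 1, 2))       # second pass: even lines
    return odd + even

print(scan_order(6))        # [1, 2, 3, 4, 5, 6]
print(scan_order(6, True))  # [1, 3, 5, 2, 4, 6]
```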

Frame rate, on the other hand, is the speed at which a display shows unique consecutive frames. Most analog cinema is displayed at a crisp 24 frames per second, meaning there are 24 unique images for every second of film. The refresh rate, however, is actually three times that: each unique frame is illuminated by the projector three times. Follow the math and that’s 24 frames multiplied by 3, for a refresh rate of 72Hz.

At 720p (there is no such thing as 720i), a screen displays 1280 pixels horizontally and 720 pixels vertically, progressively redrawing them roughly every 1/50th of a second as described above, which works out to roughly 46 million pixels drawn per second. 1080i pushes 1920 pixels horizontally and 1080 pixels vertically, but draws them at half the effective rate (due to the alternating interlacing) for a result of just 52 million pixels per second. You could easily conclude that there is very little difference between 720p and 1080i. Moving to 1080p is an incredible jump, though, as the math suggests: 1920 pixels horizontally, multiplied by 1080 pixels vertically, multiplied by 50 (the 50 full redraws per second in our example), comes out to roughly 104 million pixels drawn per second at 1080p. You can argue all day, but the math is sound: 104 million is and will always be better than 46 million.
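If you want to check those figures yourself, here’s the arithmetic in a few lines of Python, using the same assumptions as above (50 passes per second, with interlacing drawing half the lines each pass):

```python
# Pixels drawn per second: pixels per pass times passes per second.
def pixels_per_second(width, height, refreshes=50, interlaced=False):
    per_pass = width * height // (2 if interlaced else 1)
    return per_pass * refreshes

print(pixels_per_second(1280, 720))                    # 720p:  46,080,000 (~46 million)
print(pixels_per_second(1920, 1080, interlaced=True))  # 1080i: 51,840,000 (~52 million)
print(pixels_per_second(1920, 1080))                   # 1080p: 103,680,000 (~104 million)

print(24 * 3)  # cinema: 24 unique frames, each flashed 3 times = 72Hz
```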

So why is upscaling bad?
Upscaling, sometimes referred to as ‘upsampling’, takes an image rendered at a lower resolution and stretches it to a higher one. For instance, Call of Duty: Ghosts runs at 720p (that’s 1280×720) on the Xbox One and is all but confirmed to be upscaled to 1080p. The result is slightly blurrier textures than you’d find on the PlayStation 4, which renders at a native 1080p (1920×1080). You are taking something that, as mathematically demonstrated above, pushes 46 million pixels per second and attempting to stretch it to fill 104 million. It’s easier to show than to explain, so take a look at the images below:

[Image: comparison of 720p, 1080i, and 1080p]

Used with permission from user mongycore on Reddit

Think of it a different way: if you took a picture at 1 megapixel on an old phone, the result is an image that is 1000×1000 pixels. Stretch that same image to 3 megapixels and the result is grainy and unnatural at roughly 1732×1732 pixels. Doing this with video only makes it worse. It also means the show CSI has been lying to you for quite a while – zoom and enhance is bullshit.
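For the curious, here’s roughly what an upscaler does, sketched with the Pillow imaging library (the filenames here are just placeholders for the example):

```python
from PIL import Image

# Stretch a 720p frame to 1080p. The scaler interpolates between
# existing pixels to invent the missing ones; no new detail appears,
# which is why the result looks softer than a native 1080p render.
frame = Image.open("frame_720p.png")                   # a 1280x720 source image
upscaled = frame.resize((1920, 1080), Image.BILINEAR)
upscaled.save("frame_1080p_upscaled.png")
```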

Putting lipstick on the pig
Building textures is thirsty work, and their drink of choice is RAM. Your texture sizes depend greatly on the resolution you are targeting. For the Unreal Engine 3 powered games of this current console generation, the team at Epic suggested that most character and world normal and texture maps be authored at 2048×2048. Textures are made up of an array of texels (much as a picture is made up of pixels), and the rule of thumb is to pick the closest power of two that gives you a ratio of 1 texel per pixel on screen. Those textures are then wrapped across polygons, the basic building blocks of all 3D models. Without getting into some complex math, suffice it to say that building a character like Marcus Fenix in Gears of War 2, at 20,195 polygons, is complex and RAM-intensive work. Stuffing all of that graphical density into the 512MB of shared memory on the Xbox 360 meant doing it, most commonly, at 30 frames per second at 720p. Some more detailed titles like Halo 3 were forced to render at 640p (1152×640) and upscale to 720p to fit high dynamic range lighting into that same memory space. You can start to see three things emerge from all this: resolution and texture size are tricky business, memory bottlenecks are painful to the development process, and games seem to be far more fun to play than to make.
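To put some rough numbers on that memory pressure, here’s a back-of-the-envelope sketch; it assumes uncompressed RGBA at 4 bytes per texel, so treat the results as worst-case upper bounds (real engines compress textures heavily):

```python
# Worst-case texture memory: size * size * bytes per texel.
def texture_bytes(size, bytes_per_texel=4):   # 4 bytes = uncompressed RGBA
    return size * size * bytes_per_texel

one_map = texture_bytes(2048)                 # a single 2048x2048 map
budget = 512 * 1024 * 1024                    # the Xbox 360's entire shared pool

print(one_map // (1024 * 1024))               # 16 (MB per map)
print(budget // one_map)                      # just 32 such maps would exhaust
                                              # all 512MB, leaving nothing for
                                              # geometry, audio, or the game itself
```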

Welcome to the next generation
Developers rejoiced at having the handcuffs taken off with the Xbox One and PlayStation 4. Sony put 8GB of GDDR5 memory running at an effective 5500MHz in its upcoming console, delivering a staggering 176GB/s of memory bandwidth to crunch through the high-polygon-count titles coming shortly. Microsoft went a different direction, using 8GB of DDR3 memory that hits a bandwidth of 68.3GB/s, but adding a 32MB ESRAM memory subsystem. This “embedded static” RAM sits directly on the graphics die and provides a high-speed pathway between the components in the device at a theoretical 192GB/s. Recent word from developers is that it also carries a bit of a learning curve, making for more complex programming than the unified memory system of the PlayStation 4. As was the case with the Cell processor in the early days of the PlayStation 3, this is likely a short-term issue that’ll be mitigated with a bit of practice on the hardware. In the meantime, however, it’s going to make higher resolutions and cleaner textures more difficult on Microsoft’s new platform.
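Those headline bandwidth numbers fall straight out of the memory specs. Assuming the commonly reported 256-bit bus on both consoles, the math looks like this:

```python
# Peak bandwidth = transfers per second * bytes moved per transfer.
def bandwidth_gb_s(mega_transfers, bus_bits=256):
    return mega_transfers * (bus_bits // 8) / 1000  # MT/s * 32 bytes -> GB/s

print(bandwidth_gb_s(5500))  # PS4 GDDR5 at 5500MT/s: 176.0 GB/s
print(bandwidth_gb_s(2133))  # Xbox One DDR3 at 2133MT/s: ~68.3 GB/s
```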

So what does it all mean?
I’ve filled your head with terminology, postulations, theoretical computational limits, and even made you do some math. What does all of it mean? In the end… nothing. Sure, some multiplatform games might look marginally better or worse on one platform or another, but unless you have them running side by side you’ll likely never notice. Your takeaway should be this: buy the games you want to play and stop worrying so much about the platform. The debate will rage on, but in the end it is and always will be about gameplay. Both systems will have their ups and downs, Microsoft and Sony fans will lash out at one another, Nintendo will do its own thing, and the PC crowd will lord their 4K resolutions over all of them.

Executive Director and Editor-in-Chief | [email protected]

Ron Burke is the Editor in Chief for Gaming Trend. Currently living in Fort Worth, Texas, Ron is an old-school gamer who enjoys CRPGs, action/adventure, platformers, music games, and has recently gotten into tabletop gaming.

Ron is also a fourth degree black belt, with a Master's rank in Matsumura Seito Shōrin-ryū, Moo Duk Kwan Tang Soo Do, Universal Tang Soo Do Alliance, and International Tang Soo Do Federation. He also holds ranks in several other styles in his search to be a well-rounded fighter.

Ron has been married to Gaming Trend Editor, Laura Burke, for 27 years. They have three dogs - Pazuzu (Irish Terrier), Atë, and Calliope (both Australian Kelpie/Pit Bull mixes).
