Video Modes, Quality, and Frames-Per-Second
Posted: November 30th, 2013, 4:02 am
When you want to compare a game's quality, often the first things that come to mind are its resolution and how fast it runs.
Comparing games often boils down to boasting about how many FPS a game can get, or "it does 1080p!!", or other such stuff. But in reality, that's never the whole story. You have to take the monitor's capabilities, and the graphics card, into account before you can make any kind of judgement about a game's quality.
There are a few main factors that affect game performance and quality from the "computer/TV screen" side of things. Here's what each factor is, and how it affects the game:
Factor 1: Screen Refresh Rate!
Any game that is worth playing has to have some fixed rate at which to display its graphics; otherwise the game will look choppy. A higher refresh rate means the game will look smoother, while a lower refresh rate will make it look "slow" or "unresponsive".
Typically, a 2D game will choose to run at some common frame rate like 30 or 60 to make the game math easier - but the game's frame rate may not be the same as the screen's physical refresh rate.
To combat this, games use something called "delta timing" to make the game run at the same speed no matter how fast the monitor refreshes. But doing this only solves half of the problem - delta timing will not guarantee a smooth game if the game cannot keep up with the screen refresh.
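Here's a minimal C++ sketch of the delta-timing idea. The names and numbers (player_x, speed, the loop itself) are made up for illustration - this is not DELTA's actual API, just the general technique: define movement in pixels per second and scale it by how much time actually passed since the last frame.

```cpp
// A minimal delta-timing sketch (hypothetical names, not a real engine API).
#include <chrono>

int main() {
    using clock = std::chrono::steady_clock;

    double player_x = 0.0;          // something the game moves around
    const double speed = 120.0;     // movement speed in pixels PER SECOND
    auto last = clock::now();

    for (int frame = 0; frame < 600; ++frame) {   // stand-in for the real game loop
        auto now = clock::now();
        // dt = seconds that actually passed since the last frame
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;

        // Scaling by dt keeps the on-screen speed identical whether the loop
        // runs at 30, 60, or 144 iterations per second.
        player_x += speed * dt;

        // render(player_x);  // drawing and vsync would happen here
    }
}
```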
To keep a smooth display, the game is forced to put video frames onscreen at some whole-number fraction of the screen's refresh rate - that is, one frame every 1, 2, 3... refreshes. On a 60Hz screen, for example, as the game gets slower and slower, the steps that keep it smooth while it slows down look like this in Frames Per Second: [60, 30, 20, 15, 12, 10, etc...] so that there is always an even amount of time between each displayed frame. If you don't keep an even amount of time between frames, the game will feel like it is "jerking" or "jittering", and reacting to it becomes more difficult.
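Those steps aren't arbitrary - they are just 60 divided by the number of refreshes each displayed frame stays up for. A quick sketch:

```cpp
// The "smooth slowdown" steps on a 60Hz screen: presenting a frame once every
// n refreshes gives 60/n FPS, so the spacing between displayed frames stays even.
#include <cstdio>

int main() {
    const int refresh_hz = 60;
    for (int n = 1; n <= 6; ++n)
        std::printf("one frame every %d refresh(es) -> %d FPS\n", n, refresh_hz / n);
    // prints 60, 30, 20, 15, 12, 10 - the same steps listed above
}
```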
That's not the end of the refresh rate story, though. A lot of gamers think Frames Per Second is the ultimate measuring stick with which to compare games, but it's not. Frames Per Second has less to do with the game you are playing, and more to do with the hardware the game is actually being played on.
Frames Per Second (FPS) cannot physically go any faster than the screen's refresh rate. If the game did try to serve up more graphics frames per second anyway, it wouldn't matter because the screen can only display images at its refresh rate.
So, having a game run faster than the screen refresh is pointless. Also, rendering graphics at rates other than the screen refresh can cause "tearing", where you get glimpses of part of the current frame and part of the next frame onscreen at the same time. Some programmers might say this is "not noticeable" and shrug it off, but no. It's definitely noticeable, and it is not normal for a game to have tearing. It makes any game look badly done and unprofessional.
Tearing can also happen if the game does not do its video display (that is, actually showing the image) in sync with the refresh periods of the screen. This is where the term vsync (vertical sync) comes from. A lot of gamers bash vsync because "it makes the game go slower"... but this notion of vsync being bad is actually a bunch of nonsense.
A professional game ABSOLUTELY MUST vsync so that frames get copied to the screen during the blanking period, while the screen is not in the middle of drawing a frame. Otherwise, the game will end up doing this copy in the middle of a frame display, and it will cause tearing in exactly the same way rendering at the wrong rate does.
This can be "fixed" (not really) by Double Buffering*: making a copy of the rendered graphics first onto some off-screen "buffer" and then after that copy to the actual screen, which reduces the chances of tearing happening, but it doesn't guarantee that the game will actually do the copy at just the right time for each frame. So, this Double Buffering method gets us a little bit closer, but doesn't really fix anything.
One of the working methods to fix the tearing problem (when copying by itself is too slow) is to have the game draw the frame into an off-screen buffer, and then, at vsync, tell the graphics card to show that off-screen buffer directly instead of copying it; the buffer that used to be on screen becomes the one the game draws into the next time around. This is called Page Flipping*. There's also another method called Triple Page Flipping* which, if supported by the graphics card, lets the game start working on the next frame immediately instead of waiting for the current vsync to finish.
*Watch Out: NVidia and OpenGL like to call the Page Flipping mode "Double Buffering". Be careful not to get the 3 modes confused!
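Here's a minimal C++ sketch contrasting the two approaches. The buffer struct and the display calls are stand-ins for whatever your graphics API actually provides (they're hypothetical, not a real API) - the point is when and how the finished image reaches the screen.

```cpp
// Stand-in buffers and display calls (hypothetical, not a real graphics API).
#include <utility>

struct Buffer { int pixels[320 * 240]; };     // pretend frame buffer

static Buffer front, back;                    // visible buffer and off-screen buffer

void render(Buffer&)               { /* draw the next frame into this buffer */ }
void wait_for_vsync()              { /* block until the screen's blanking period */ }
void copy_to_screen(const Buffer&) { /* blit this buffer onto the visible screen */ }
void flip_to(Buffer&)              { /* tell the card to scan out this buffer */ }

// Double Buffering: draw off-screen, then copy during the blanking period.
// If the copy itself runs past the blanking period, tearing is still possible.
void double_buffered_frame() {
    render(back);
    wait_for_vsync();
    copy_to_screen(back);
}

// Page Flipping: no copy at all - point the card at the finished buffer, and
// draw the next frame into the buffer that was previously visible.
// (Triple Page Flipping adds a third buffer so rendering the next frame can
// start right away instead of waiting on wait_for_vsync().)
void page_flipped_frame() {
    render(back);
    wait_for_vsync();
    flip_to(back);
    std::swap(front, back);       // conceptually; the hardware just swaps addresses
}

int main() {
    for (int i = 0; i < 3; ++i) page_flipped_frame();
}
```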
Factor 2: Color Depth!
This is simply the number of colors that each pixel can have.
Usually, the number of colors will be a power of two, because each color is stored as a fixed number of binary bits (ones and zeroes).
That's why we refer to color depths as 16bit, 32bit, etc - the number of bits usually determines how much color detail the game's graphics can be displayed with. A game may actually have more colors in its graphics than the screen you are playing it on can show, so don't be too quick to judge a game based on the colors you see. The game might actually be limited by the old screen you have!
On today's PCs, you will most likely have 24 bits of color plus 8 bits of transparency (we call it alpha) information, which we call 32bit color (it's still really 24bit color, though). There are also 15bit and 16bit modes, and if you are extremely lucky, you might still find 24bit hardware (the kind without those 8 bits of alpha) out there in the wild.
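To put numbers on that, the color count is just 2 raised to the number of color bits. A quick sketch:

```cpp
// Color count is 2 raised to the number of color bits.
#include <cstdio>

int main() {
    const int depths[] = {15, 16, 24};
    for (int bits : depths)
        std::printf("%2dbit color -> %lu possible colors\n", bits, 1UL << bits);
    // 15bit -> 32,768   16bit -> 65,536   24bit -> 16,777,216
    // "32bit" modes still have 16,777,216 colors - the extra 8 bits are alpha.
}
```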
A game may use a higher color depth while actually showing fewer colors overall. When the number of showable colors is 256 or fewer, the hardware will usually let the game programmer set up a "palette": a table whose entries are higher-bit colors, even though only that many of them can be shown onscreen at once. This can result in higher-quality graphics even on a very limited system, and it is very common in old console games. For instance, the SNES has 15bit color, but can only show 256 of those colors at a time.
(DELTA doesn't support hardware palette modes, by the way.)
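Here's a minimal sketch of that idea, loosely following the SNES example above (the sizes and the color picked are made up for illustration): each on-screen pixel stores only an 8bit index, but the palette entries it points at are full 15bit colors.

```cpp
// A 256-entry palette of 15bit (5-5-5) colors; pixels store palette indexes.
#include <cstdint>
#include <cstdio>

int main() {
    uint16_t palette[256] = {};        // 256 selectable colors, each a full 15bit value
    uint8_t  frame[320 * 240] = {};    // each on-screen pixel is just a palette index

    // Entry 1: any of the 32,768 possible 15bit colors (5 bits per channel).
    uint16_t r = 31, g = 10, b = 4;    // hypothetical color values
    palette[1] = static_cast<uint16_t>((r << 10) | (g << 5) | b);

    frame[0] = 1;                      // pixel 0 shows whatever palette entry 1 holds

    std::printf("pixel 0 -> palette entry %u -> 15bit color 0x%04X\n",
                frame[0], palette[frame[0]]);
}
```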
Factor 3: Screen Resolution!
This is the number of pixels (dots) on your screen, in width x height. This could be anything, but TVs are typically one of a few standard sizes. CRT computer monitors might support any reasonable size, and LED/LCD screens are usually limited to standard sizes (just like with TVs).
A screen might be able to show more than one size, but the resolution that it was designed for is considered the screen's Native Resolution.
Games will look the best when played at your screen's Native Resolution, because anything else has to be scaled or stretched to fit the screen's fixed pixel grid, which blurs the image.
Factor 4: Interlaced vs Progressive Scan!
This is what TV advertisements are talking about when they say 1080p compared with 1080i. The "i" or "p" indicates whether the screen shows images interlaced, or all at once (progressive).
Interlacing means that the screen only shows the odd lines on one refresh and the even lines on the next. Progressive Scan shows all the lines of a frame in a single refresh. A screen that's running at 60Hz, but is interlaced, is only really showing 30 full frames per second, not 60.
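A toy sketch of how the rows split across refreshes (just an 8-row "screen" for illustration):

```cpp
// Each interlaced refresh only carries half the rows, so a full picture takes
// two refreshes - hence 30 full frames per second on a 60Hz interlaced screen.
#include <cstdio>

int main() {
    const int rows = 8;
    for (int refresh = 0; refresh < 2; ++refresh) {
        std::printf("refresh %d draws rows:", refresh);
        for (int row = refresh; row < rows; row += 2)   // one field per refresh
            std::printf(" %d", row);
        std::printf("\n");
    }
}
```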
Simply put, interlacing makes for a very fuzzy picture, and usually isn't suited to games that are made for PC. Some old console games were designed around it, however, and they look great.
LCD and LED screens are all Progressive Scan by design, while TVs may only do interlaced or may be capable of both modes. When given the choice, choose Progressive Scan. It will almost always look better.
Factor 5: Screen Pixel Type/Size!
TVs have fuzzier pixels (dots) than their computer screen relatives. Also, computer screens have much smaller pixels than digital TVs, so they will look a whole lot "sharper" or "clearer" than a TV of the same size. Games usually look better on a computer screen for these reasons.