Display resolution versus Image Size

Display resolution versus Image Size (especially in older games)

There is a tendency to think the largest available texture image size is always preferable. This is not always true. A larger texture does produce a more detailed image, but can those details make a material difference on your display screen, or are you wasting video processing resources and possibly creating "stutter" for yourself? Much depends upon the game in question and the display you are using, as well as the texture image resolution (often called "image size").

Image size

The "vanilla game" default texture image resolution size for Bethesda is 512x512 pixels (an array of individually addressable dots on the display screen). The largest "high resolution" textures used at the time of older games such as "Fallout New Vegas" (released in 2010) were 2048x2048 pixels. Each pixel ("picture element") includes a "color" component. The intensity of each pixel is variable. In color imaging systems, a color is typically represented by three or four component intensities such as red, green, and blue (RGB), or cyan, magenta, yellow, and black (CMYB). The number of distinct colors that can be represented by a pixel depends on the number of bits per pixel (bpp). A 1 bpp image uses 1-bit for each pixel, so each pixel can be either on or off. Each additional bit doubles the number of colors available, so a 2 bpp (2^2 bits) image can have 4 colors, and a 3 bpp (2^3) image can have 8 colors:

  • 8 bpp, 2^8 = 256 colors
  • 16 bpp, 2^16 = 65,536 colors ("Highcolor")
  • 24 bpp, 2^24 = 16,777,216 colors ("Truecolor")

- (Source: Wikipedia: Pixel).

This becomes important when considering the impact of your choice of image resolution: the resulting image data size is not just width x height in pixels, but width x height x bpp.
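
A quick sketch in Python makes the arithmetic concrete (a minimal illustration; real texture files such as DDS add mipmaps and often use block compression, so treat these figures as upper-bound estimates for raw data):

    # Rough texture-memory arithmetic for uncompressed images.

    def color_count(bpp: int) -> int:
        """Distinct colors a pixel can represent at a given bit depth."""
        return 2 ** bpp

    def texture_bytes(width: int, height: int, bpp: int) -> int:
        """Uncompressed image data size: width x height x bits-per-pixel."""
        return width * height * bpp // 8

    for side, bpp in [(512, 24), (2048, 24), (4096, 16)]:
        mb = texture_bytes(side, side, bpp) / (1024 * 1024)
        print(f"{side}x{side} @ {bpp} bpp: {color_count(bpp):,} colors, {mb:.2f} MB")

A single 2048x2048 "Truecolor" texture is already 12 MB of raw data, sixteen times the 0.75 MB of the vanilla 512x512 size.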

Display resolution

In 2010, 70% of LCD monitors were 19 inches or less. (A further 6% were 20-21 inches, and only 1% were 25-27 inches, the largest size released that year.) - Statista.

The highest resolution a 19 inch monitor of that era was capable of was 1680x1050 pixels (width x height) at 104.3ppi ("pixels per inch") in a 16:10 aspect ratio, requiring 1.68MP ("mega-pixels") in total for the display. Much more common was 1280x1024 at 86.3ppi in a 5:4 ratio, requiring 1.25MP. At the upper end were 21-22 inch units at 1920x1080 and 100.1-102.5ppi in a 16:9 ratio, requiring 1.98MP. - Desktop LCD Display Comparison.
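
Those ppi and "MP" figures follow directly from the monitor geometry. Here is a minimal sketch reproducing them (note the source counts one "mega-pixel" as 1024x1024 = 1,048,576 pixels, which is why 1680x1050 comes out as 1.68MP rather than 1.76):

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch: diagonal pixel count over diagonal inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    def megapixels(width_px: int, height_px: int) -> float:
        """Total pixels, counted in binary 'mega-pixels' (1024x1024)."""
        return width_px * height_px / (1024 * 1024)

    for w, h, diag in [(1680, 1050, 19), (1280, 1024, 19), (1920, 1080, 22)]:
        print(f'{w}x{h} @ {diag}": {ppi(w, h, diag):.1f} ppi, {megapixels(w, h):.2f} MP')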

The larger "Cinamatic HD" TV screens (30+ inches) available today were not even considered.

Impact of choice of Image size on Display

Consequently, a texture image of a given size gets rendered on screen in the screen resolution and aspect ratio demanded by the monitor. (Multiple individual images make up the total rendered screen image.) The video card takes care of this conversion and passes along the final rendered display of the various images in that resolution, as that many total "mega-pixels". When the image size of a model is smaller than the number of pixels needed to fill its respective area of the screen, the graphics card fills in the gaps by "interpolation" of the surrounding pixels. The result can be a "grainy" appearance to that model's image. Higher resolution images require less interpolation and so look less grainy, but consume more video processing memory per image. (The screen display requirements remain the same.)
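
To make "interpolation" concrete, below is a simplified CPU sketch of bilinear filtering, the most basic of the magnification filters graphics cards commonly apply when a texture has fewer texels than the screen area it must cover (real hardware also offers trilinear and anisotropic variants):

    def bilinear_sample(texture, u: float, v: float) -> float:
        """Sample a grayscale texture (a list of rows of floats) at
        normalized coordinates u, v in [0, 1]."""
        h, w = len(texture), len(texture[0])
        x, y = u * (w - 1), v * (h - 1)              # map to texel space
        x0, y0 = int(x), int(y)
        x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
        fx, fy = x - x0, y - y0                      # fractional position
        top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
        bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
        return top * (1 - fy) + bottom * fy

    # Upscale a 2x2 'texture' to 4x4: the in-between values are blends
    # of the four nearest texels rather than genuine detail.
    tex = [[0.0, 1.0],
           [1.0, 0.0]]
    for row in range(4):
        print([round(bilinear_sample(tex, col / 3, row / 3), 2) for col in range(4)])

The blended values fill the gaps smoothly but add no real detail, which is why a heavily magnified low-resolution texture reads as soft or "grainy" on screen.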

Large displays can make the "grainy" image more apparent than a smaller display would. Smaller screens, in effect, "waste" processing power and time on higher resolution images they can't fully display anyway. (This latter is the basic problem with VWD/LOD images in general that are not optimized to reduce the polygon count and texture details.)

There are basically two types of graphic improvements: foreground and background. Everything you see up close (such as weapons, armor, and NPC bodies) is "foreground". Their image scope (the relative size of the object in question, not the texture size of the image itself) tends to be small, as they are individual models. Everything in the middle to far distance is "background" (technically Visible When Distant/Level of Detail (VWD/LOD)). Background images encompass the entire horizon and objects seen at a great distance, so their image scope tends to be larger. Both count against video memory; choose between them carefully.

Image size matters. A 4096x4096 image with 16-bit color resolution is 32 MB of data. If your screen is displaying 100 different models with textures that size, that's 3.2 GB of data just for processing the textures. Since a 32-bit program can only address up to 4 GB of data in total, you can see how trying to display a lot of high resolution models is going to run the game out of memory very quickly. (Thank you for the perspective, madmongo.) Now, VRAM is not directly addressed by the program; it's controlled by the video card, which can hide the actual addressing of more than 4GB from the program, but you get the idea. If you are running on a laptop (which usually doesn't have dedicated VRAM but uses system RAM for video instead), that video memory comes out of the memory available to Windows (the OS) and the game; and if you are running a 32-bit version of Windows, the 4GB overall limitation still applies.
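
As a back-of-the-envelope check on those numbers (a sketch only; a real game shares that address space with code, meshes, sound, and everything else, so the practical texture budget is far smaller):

    ADDRESS_SPACE_MB = 4 * 1024   # a 32-bit process can address at most 4 GB

    def texture_mb(side_px: int, bpp: int) -> float:
        """Uncompressed size of a square texture, in megabytes."""
        return side_px * side_px * bpp / 8 / (1024 * 1024)

    per_texture = texture_mb(4096, 16)   # 32.0 MB, as stated above
    total = 100 * per_texture            # 3,200 MB for 100 such textures
    print(f"{per_texture:.0f} MB each; {total:,.0f} MB of the "
          f"{ADDRESS_SPACE_MB:,} MB address space ({total / ADDRESS_SPACE_MB:.0%})")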

Another thing to consider is that larger images take longer to pass through the video pipeline. When image sizes larger than 2048x2048 were not considered in the game's design (and they probably weren't at the time of older games), bottlenecks are to be expected. The use of "ENBoost" can help if you have a 64-bit version of Windows:

The idea is to reduce [32-bit] RAM and as result - fewer CTDs. Memory transfered to enbhost.exe processes and you can use up to 128 GB of [video] memory for x64 OS instead of a bit less than 4 with default game. What this means? You can install more mods, especially if you have cool videocard with 3+ GB of VRAM and want 8K textures everywhere. - (Source: ENBoost mod description)

ENBoost is independent of, and installed separately from, the "ENB Series" graphical post-processor package. It is also bundled with the various game-specific "patch" files of the "ENB Series", available on the "ENB Downloads" page, but does not require them.
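
For reference, ENBoost's memory behavior is configured through the [MEMORY] section of its enblocal.ini file. The fragment below is illustrative only: which keys are available and what values make sense vary by game, hardware, and ENBoost version, so follow the documentation bundled with your download rather than copying these example numbers.

    [MEMORY]
    ExpandSystemMemoryX64=true
    ReduceSystemMemoryUsage=true
    ReservedMemorySizeMb=256
    VideoMemorySizeMb=4096

The first two switches enable the memory-sharing behavior described in the quote above; the two sizes are example values only and should be tuned to your own RAM and VRAM.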

Frame Rate (FPS) versus Monitor Refresh Rate (Hz)

This issue is related to the "display versus image" question in that it similarly confuses many people. The article Frame Rate (FPS) vs Refresh Rate (Hz) gives an excellent explanation.

Conclusion

The size of the monitor you are displaying the game on should dictate the upper limit of your choice of image size. 4K resolution images are wasted on a 15 inch display: you can't really see the difference. They are intended for large screen TVs, which would otherwise make the "grainy" aspects of lower resolution images more apparent.

The trade-off is that older games are not designed to deal with images of that size, which have to be rendered on those larger "mega-pixel" displays. The video pipeline will choke and stutter. In addition, the more pixels the video card needs to render, the more "art assets" it needs to pull into its buffer to process efficiently. When it has to retrieve those assets from the hard disk, that is orders of magnitude slower and introduces stutter. This is where a faster video card with more dedicated VRAM becomes essential. If the video card can't keep up with the display demands, you get stutter you can't overcome except by using "ENBoost" to increase the buffer size (which will still be slower than VRAM) and by reducing the size of the images, and therefore the memory demands.

It's all inter-related.

References