We can generally think of this as a marketing strategy, and that is precisely what it is: manufacturers have built better monitors and want to sell them, so they tell us that what we have now is no longer good enough and that we need to spend more money on their new product. But are we really getting better image quality as well… or not?
Does higher resolution mean better picture quality?
If we stick to the numbers, a Full HD display has 1920 x 1080 pixels, which is just over 2 million pixels. An Ultra HD display has a resolution of 3840 x 2160 pixels, or about 8.3 million pixels: four times as many. This means that we pack many more pixels into the same space, so we get higher definition, as long as we are talking about the same screen size.
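As a quick check of the arithmetic, a minimal Python snippet (purely illustrative) confirms the pixel counts:

```python
# Total pixel count is horizontal resolution times vertical resolution.
full_hd = 1920 * 1080    # 2,073,600 pixels (just over 2 million)
ultra_hd = 3840 * 2160   # 8,294,400 pixels (about 8.3 million)

print(ultra_hd / full_hd)  # 4.0 -- Ultra HD has exactly four times the pixels
```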
And screen size is a crucial piece of information, because what actually determines definition is not the resolution or the number of pixels, but the pixel density.
Pixel density is the key factor in definition
Pixel density in a monitor is often quoted in dots per inch (DPI, for "dots per inch"), but since that term strictly refers to the number of dots in a one-inch printed line, it has largely been dropped in favor of PPI (pixels per inch). Although PPI is the correct term for monitors, the two are often used interchangeably.
Pixel density is important because it is what determines image quality, in the sense that, in general, higher density produces sharper images. Let's look at a few examples using a 27-inch monitor as a reference, which is a very common size today (the short sketch after this list shows how these figures are calculated):
- A 27-inch 720p monitor would have around 54 PPI.
- A 27-inch 1080p monitor has a density of around 81 PPI.
- If the monitor has a resolution of 1440p, its density would be around 108 PPI.
- If we switch to 4K resolution, the density rises to 163 PPI.
- A 27-inch 8K monitor would have a density of 326 PPI.
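These densities follow directly from dividing the screen's diagonal resolution in pixels by its diagonal size in inches. A minimal Python sketch (the pixels_per_inch helper is just an illustrative name, not a standard function) reproduces the figures above:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by the screen diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Reproducing the 27-inch figures from the list above
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in resolutions.items():
    print(f"{name} at 27 inches: {pixels_per_inch(w, h, 27):.1f} PPI")
# 720p: 54.4, 1080p: 81.6, 1440p: 108.8, 4K: 163.2, 8K: 326.4
```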
To put this data in perspective, imagine you have two monitors side by side, both with Full HD resolution. If one of them is 32 inches and the other is 27 inches, you will see an obvious difference in pixel size if you look closely: although both monitors have the same number of pixels, the density of the 27-inch monitor is higher, its pixels are smaller, and it therefore offers better definition.
Another example in terms of density: imagine an (exaggerated) 1000-inch 4K monitor. Up close we would see pixels of a considerable size, because the density would be very low, and the image quality would therefore be poor despite the 4K resolution.
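Plugging both scenarios into that same formula gives roughly these numbers (again, just an illustrative snippet):

```python
import math

# Same Full HD resolution, different sizes: density drops as the panel grows
print(math.hypot(1920, 1080) / 27)    # ~81.6 PPI on the 27-inch monitor
print(math.hypot(1920, 1080) / 32)    # ~68.8 PPI on the 32-inch monitor
print(math.hypot(3840, 2160) / 1000)  # ~4.4 PPI on the exaggerated 1000-inch 4K screen
```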
So, the higher the pixel density, the better?
The answer is yes and no. In general, higher pixel density is better because it provides sharper image definition, but there comes a point of diminishing returns. As the density keeps increasing, the observable benefits become less and less obvious, to the point that they are imperceptible to the human eye.
In the example above, the 27-inch Full HD monitor has a density of around 81 PPI, while the 32-inch monitor has around 69 PPI. In this situation, it is safe to say there will be some observable difference between the two monitors. But if we were talking about two 24-inch monitors, one with 4K resolution and the other with 8K, the difference would be imperceptible, and yet rendering images at 8K obviously has a much higher performance cost than rendering at 4K.
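Running the same calculation for these hypothetical 24-inch monitors makes the point concrete (illustrative figures, rounded):

```python
import math

# Two 24-inch panels: one 4K (3840x2160), one 8K (7680x4320)
print(math.hypot(3840, 2160) / 24)  # ~183.6 PPI
print(math.hypot(7680, 4320) / 24)  # ~367.2 PPI -- twice the density, four times the pixels to render
```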
The exact density at which the human eye stops perceiving any change is subject to debate: some experts put the figure at around 400 PPI, others say 1000, and most users are content with even less than 200. Either way, what is clear is that there comes a point at which a higher pixel density is simply no longer noticeable.
Image quality: related to, but not determined by, resolution
Coming back to the main topic, according to Dolby, the image quality that people perceive (because after all it’s a matter of subjective perception) depends mainly on three factors:
- The number of pixels (and their density, as we explained).
- The frame rate per second.
- Pixel performance.