HDR vs. Ultra HD vs. 4K: these are the differences
HDR, Ultra HD, 4K and the like: you have probably stumbled across these terms before. In this practical tip, you will find out what the differences are.
Ultra HD, 4K and 8K: what's the difference?
The term "4K resolution" actually comes from the cinema area and describes a resolution of 4096 × 2160 pixels. However, since an aspect ratio of 16: 9 is generally used for TVs, manufacturers also use the designation "4K" for televisions for the resolution of 3840 × 2160 pixels.
- "UHD" stands for "Ultra High Definition". A distinction is made between UHD-1 and UHD-2. The UHD-1 also uses a resolution of 3840 × 2160 pixels. The classic cinema 4K (4096 × 2160 pixels) is also called "DCI 4K" or just "4K". "DCI" stands for "Digital Cinema Initiatives". 3840 × 2160 pixels are called "UHD 4K" or "UHDTV1". Another name would be "2160p / i" or "QFHD (Quad Full High Definition)" because half the width and length of the resolution in turn gives the classic Full HD resolution. As a result, Full HD is only a quarter as big.
- In addition to UHD-1, there is also UHD-2, which describes a resolution of 7680 × 4320 pixels and is usually referred to as "8K UHD" or "UHDTV2".
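If you want to check these relationships yourself, the following minimal Python sketch compares the pixel counts, using only the resolution values quoted above: UHD-1 has exactly four times as many pixels as Full HD, and UHD-2 four times as many again.

```python
# Pixel counts for the resolutions mentioned above (illustrative only).
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "UHD-1 (4K UHD)": (3840, 2160),
    "DCI 4K": (4096, 2160),
    "UHD-2 (8K UHD)": (7680, 4320),
}

full_hd_pixels = 1920 * 1080  # reference value: 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {width} x {height} = {pixels:,} pixels "
          f"({pixels / full_hd_pixels:.2f}x Full HD)")
```

Running this confirms the ratios: UHD-1 comes to 8,294,400 pixels (exactly 4.00× Full HD), DCI 4K to 8,847,360 pixels, and UHD-2 to 33,177,600 pixels (16× Full HD).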
What is HDR?
"HDR" stands for "High Dynamic Range" and describes a technology that can reproduce contrasts and differences in brightness in great detail.
- To do this, several pictures with different exposures are taken in quick succession and combined into a single image. This largely avoids overexposed or underexposed areas in the picture.
- Conventional SDR uses a color depth of 8 bits per color channel, whereas HDR uses 10 bits. HDR images therefore not only offer significantly better contrast, but usually also look much more natural than a classic photo (the short sketch below shows what this difference in bit depth means in numbers).
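As a rough illustration of what the jump from 8 to 10 bits means, this minimal Python sketch counts the brightness gradations per color channel and the resulting number of representable colors (assuming, for simplicity, the same bit depth for all three RGB channels):

```python
# Brightness levels and total colors per bit depth (illustrative only).
for name, bits in [("SDR (8 bit)", 8), ("HDR (10 bit)", 10)]:
    levels = 2 ** bits    # gradations per color channel
    colors = levels ** 3  # three channels: red, green, blue
    print(f"{name}: {levels} levels per channel, {colors:,} possible colors")
```

With 10 bits, each channel offers 1024 instead of 256 gradations, for roughly 1.07 billion instead of 16.7 million colors overall, which is why fine brightness transitions show far less visible banding.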
In the next practical tip, you will learn how you can easily create 3D models yourself.