What is HDR, and what are the criteria for defining it?

HDR, or High Dynamic Range, was originally a term from the television industry, but in recent years it has become increasingly common in mobile devices: smartphones ship with HDR camera modes, some manufacturers claim their phones have HDR displays, and video platforms have announced support for HDR playback. HDR is a technology that touches the entire film and television industry chain. So what exactly is HDR, and how much of the marketing around it is substance rather than hype? To truly enjoy HDR content, certain conditions must be met. This series covers two questions:

1. What is HDR, and what are the criteria for defining it?
2. What conditions are needed to watch authentic HDR content?

This article is the first part: what HDR is, and what the criteria for defining it are.

HDR stands for High Dynamic Range. The term refers mainly to image brightness and contrast, that is, the ratio between the maximum and minimum brightness values. When a scene reaches a screen, it passes through several stages: shooting, compression, encoding, transmission, decoding, and display. If the final image could completely reproduce what the human eye observes in the original scene, there would be no need for the term "high dynamic range"; you could think of that as 100% dynamic range. In reality, the human eye can perceive a brightness range of roughly 10⁻³ nits to 10⁶ nits, with an instantaneous contrast ratio of up to about 10,000:1, while a typical screen reaches only around 300–400 nits with a contrast ratio of about 2,000:1. Current screens clearly still have a long way to go in replicating what the human eye sees. (A back-of-the-envelope comparison of these ranges appears in the first sketch at the end of this section.)

HDR Color Depth

Before discussing the quality improvements HDR brings, it helps to understand Standard Dynamic Range (SDR), the current mainstream standard for video production. Whether it is a film studio creating content or a screen manufacturer setting product parameters, both follow this standard. Every on-screen color is mixed from red, green, and blue. Under the SDR standard, each primary color is divided into 256 levels (0–255), yielding about 16.7 million colors (256 × 256 × 256). This is referred to as 8-bit color depth. In reality, however, a color has far more than 256 shades between its brightest and darkest values; it can be subdivided almost infinitely. HDR instead uses a 10-bit color depth: each primary color is divided into 1,024 levels, yielding about 1.07 billion colors, a dramatic improvement over SDR. Comparing 8-bit and 10-bit gradients side by side makes the difference clear: under 10-bit, color transitions are visibly smoother and more layered. (The arithmetic behind these color counts is verified in the second sketch at the end of this section.)

The HDR mode in phone cameras works by taking three photos, one underexposed, one normally exposed, and one overexposed, and merging them with software algorithms to recover detail in both highlights and shadows. This merging is why early HDR camera modes needed noticeable processing time on weaker processors. On recent iPhones such as the iPhone 8, HDR is enabled by default, and thanks to processors like the A11 Bionic the delay is almost imperceptible. (A toy version of the merging step is sketched at the end of this section.)
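To make the dynamic-range gap concrete, here is a minimal Python sketch that turns the luminance figures quoted above into contrast ratios and photographic stops. The 0.2-nit screen black level is an assumption, chosen only so the result matches the article's 2,000:1 figure.

```python
import math

# Figures quoted in the text above (the screen black level is assumed).
EYE_MIN_NITS = 1e-3      # dimmest luminance the eye can perceive
EYE_MAX_NITS = 1e6       # brightest luminance the eye can handle
SCREEN_MAX_NITS = 400.0  # typical peak brightness of a mainstream screen
SCREEN_MIN_NITS = 0.2    # assumed black level, giving a 2,000:1 ratio

def stops(lo: float, hi: float) -> float:
    """Dynamic range expressed in photographic stops (doublings)."""
    return math.log2(hi / lo)

print(f"Eye, total range: {EYE_MAX_NITS / EYE_MIN_NITS:,.0f}:1 "
      f"(~{stops(EYE_MIN_NITS, EYE_MAX_NITS):.0f} stops)")
print(f"Typical screen:   {SCREEN_MAX_NITS / SCREEN_MIN_NITS:,.0f}:1 "
      f"(~{stops(SCREEN_MIN_NITS, SCREEN_MAX_NITS):.0f} stops)")
```

The eye's total range works out to roughly 30 stops against about 11 stops for the screen, which is the gap HDR tries to narrow.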
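The color counts in the color-depth section fall directly out of the bit depth; a quick check:

```python
def channel_levels(bits: int) -> int:
    """Number of distinct shades per primary at a given bit depth."""
    return 2 ** bits

def total_colors(bits: int) -> int:
    """Three primaries (R, G, B), each quantized independently."""
    return channel_levels(bits) ** 3

for bits in (8, 10):
    print(f"{bits}-bit: {channel_levels(bits):>4} levels per channel, "
          f"{total_colors(bits):,} total colors")
# 8-bit:   256 levels per channel, 16,777,216 total colors (~16.7 million)
# 10-bit: 1024 levels per channel, 1,073,741,824 total colors (~1.07 billion)
```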
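Finally, here is a toy sketch of the exposure-merging idea behind HDR camera modes. Real pipelines (including Apple's) are far more sophisticated, with frame alignment, deghosting, and tone mapping; the function name and weighting constants below are illustrative, not any vendor's actual API. The core notion, favoring well-exposed pixels from each bracketed shot, looks roughly like this:

```python
import numpy as np

def fuse_exposures(under: np.ndarray, normal: np.ndarray,
                   over: np.ndarray) -> np.ndarray:
    """Blend three bracketed 8-bit shots, weighting each pixel by how
    well exposed it is, so clipped highlights and crushed shadows in
    one frame are filled in from the others."""
    stack = np.stack([under, normal, over]).astype(np.float64) / 255.0
    # Bell-shaped weight centered on mid-gray: near-black and
    # near-white pixels contribute little to the blend.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    fused = (weights * stack).sum(axis=0)
    return np.clip(fused * 255, 0, 255).astype(np.uint8)

# Usage: three grayscale shots of the same (here random) scene.
shots = [np.random.randint(0, 256, (4, 4), dtype=np.uint8)
         for _ in range(3)]
print(fuse_exposures(*shots))
```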
HDR Color Gamut

Another key concept alongside color depth is color gamut: the range of colors a device can display. Current commercial LCD and OLED screens typically cover the BT.709 color gamut, which corresponds to roughly 72% of the NTSC gamut. Recently, many mobile phones and laptops have started supporting the wider DCI-P3 gamut, while the HDR standard targets BT.2020, which covers a much wider range still. Plotted on a chromaticity diagram, the BT.2020 gamut used by HDR significantly exceeds the existing standards, allowing richer and more accurate color reproduction. (A rough size comparison of these three gamuts is sketched at the end of this article.) HDR therefore represents a leap in both the number of colors and the range of colors that can be displayed. With HDR, images retain detail in both bright and dark areas, with fuller color and stronger texture, appearing more vivid and closer to what the human eye sees.

Different HDR Standards

As with any new technology, the spread of HDR has set off a battle over standards. Much as the phone industry lacks a unified definition of "full screen," different organizations have developed different HDR standards to define whether a display or piece of content qualifies as HDR. For example, some TVs and monitors advertise support for HDR10. HDR10, defined by the Consumer Technology Association, is one of the most common HDR standards: it requires 10-bit color depth and the BT.2020 color space. It is an open standard with no licensing fees and is widely supported by major manufacturers such as Dell, LG, Samsung, and Sony. In 2017, Samsung and Amazon introduced HDR10+, which adds dynamic metadata so that brightness can be adjusted scene by scene or frame by frame. Another important standard is Dolby Vision, developed by Dolby Laboratories; unlike HDR10, it requires certification and licensing from Dolby, which makes it more expensive and less common. For mobile devices there is also the Mobile HDR Premium standard, which is similar to HDR10 but with somewhat lower requirements, covering about 90% of the P3 color gamut at around 500 nits of peak brightness.

In summary, HDR is a display technology that dramatically improves color reproduction across the brightness range. It is an attempt to bring displays closer to the true visual experience of the human eye, and it represents progress in the quality of each pixel rather than in pixel count. As HDR moves from TVs to mobile devices, it is poised to become a significant step in the evolution of display technology. Because some manufacturers use "HDR" loosely as a marketing label, understanding the principles behind it is essential. In the next article, we will explore authentic versus fake HDR.
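As promised in the color-gamut section, here is a minimal sketch comparing the sizes of the three gamuts. Each gamut is a triangle in the CIE 1931 xy chromaticity diagram spanned by its red, green, and blue primaries (the published coordinates below); note that triangle area in xy space is only a rough, perceptually non-uniform proxy for gamut size.

```python
# CIE 1931 xy chromaticities of each standard's R, G, B primaries.
PRIMARIES = {
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts) -> float:
    """Shoelace formula for the triangle spanned by three xy points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(PRIMARIES["BT.709"])
for name, pts in PRIMARIES.items():
    area = triangle_area(pts)
    print(f"{name:8s} area = {area:.4f}  ({area / ref:.2f}x BT.709)")
```

Running this shows BT.2020's triangle is nearly twice the area of BT.709's, with DCI-P3 in between, which matches the ordering described above; since the xy diagram exaggerates greens, treat the ratios as indicative rather than perceptual.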
