What is HDR TV (and why should you care)?

Digital Trends has a great deep dive on all things HDR and why you want it.

HDR: The basics

Contrast is the difference between the brightest whites and the darkest blacks a TV can display, with brightness measured in candelas per square meter (cd/m²), a unit better known as the nit. The ideal low end is completely black, or zero nits, which is currently only possible on OLED displays because they can switch individual pixels off entirely. The high end is a different story: standard dynamic range TVs generally produce 300 to 500 nits of peak brightness, but some HDR TVs aim much, much higher, reaching into the thousands of nits.
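To put those numbers in perspective, here is a minimal sketch in Python (using made-up, illustrative figures rather than measurements of any particular TV) of how peak brightness and black level combine into a contrast ratio:

    def contrast_ratio(peak_nits: float, black_nits: float) -> float:
        # Contrast compares peak luminance to black level, both in cd/m^2 (nits).
        if black_nits == 0:
            return float("inf")  # a pixel switched fully off emits no measurable light
        return peak_nits / black_nits

    # Illustrative figures only, not measurements of any specific model:
    print(contrast_ratio(400, 0.10))   # typical SDR LCD   -> 4000.0  (a 4,000:1 ratio)
    print(contrast_ratio(1000, 0.05))  # bright HDR LCD    -> 20000.0 (20,000:1)
    print(contrast_ratio(700, 0.0))    # OLED, pixels off  -> inf

The takeaway: raising peak brightness or deepening the blacks both widen the range, and OLED's ability to shut pixels off completely is why its contrast is effectively infinite.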

Multiple formats for displaying HDR are possible, but currently there are two major players: the proprietary Dolby Vision format and the open standard HDR10. Dolby was first to the party, demonstrating a prototype TV capable of displaying up to 4,000 nits of brightness. For a short time, Dolby Vision was essentially synonymous with HDR, but not every manufacturer wanted to play by Dolby’s rules (or pay its fees), and many started working on their own alternatives. Companies quickly realized that this could lead to madness, and many popular manufacturers, including LG, Samsung, Sharp, Sony, and Vizio, eventually agreed on an open standard called HDR10.

In April 2016, the UHD Alliance — an industry group made up of companies like Samsung, LG, Sony, Panasonic, Dolby, and many others — announced the Ultra HD Premium certification for UHD Blu-ray players. This benchmark sets some baseline goals for HDR, such as the ability to display at least 1,000 nits of peak brightness and a minimum of 10-bit color depth. Both HDR10 and Dolby Vision meet the standards set by the certification, but how the two go about it varies greatly.
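For a rough sense of what the 10-bit requirement buys you, the sketch below (a simple Python calculation, not part of any certification test) compares how many shades each bit depth can describe; the 12-bit row reflects the higher depth Dolby Vision is designed to carry:

    def color_levels(bits_per_channel: int):
        shades = 2 ** bits_per_channel  # distinct steps per color channel
        total = shades ** 3             # combined across red, green, and blue
        return shades, total

    for bits in (8, 10, 12):
        shades, total = color_levels(bits)
        print(f"{bits}-bit: {shades:,} shades per channel, {total:,} colors")
    # 8-bit:  256 shades per channel, 16,777,216 colors (standard dynamic range)
    # 10-bit: 1,024 shades per channel, 1,073,741,824 colors (the Ultra HD Premium floor)
    # 12-bit: 4,096 shades per channel, 68,719,476,736 colors (Dolby Vision's maximum depth)

Those extra shades per channel are what let HDR content ramp brightness and color smoothly instead of showing visible banding across its much wider range.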
