The HDMI 2.0 series of updates arrived just as electronics manufacturers began releasing 4K TVs. Video shot at this resolution became available, and a faster interface was needed to carry it from an external device to the TV. Each of the updates – HDMI 2.0, 2.0a, and 2.0b – progressively expanded the capabilities of the interface without changing the physical characteristics of the ports and cables.

HDMI 2.0 (2013)

HDMI 2.0 provided full support for 4K resolution at 60 frames per second for the first time. Bandwidth increased to 18 Gbps, enabling higher-quality video transmission, including full 4:4:4 chroma sampling at 8-bit color depth. The version also added transmission of up to 32 audio channels and support for the cinematic 21:9 aspect ratio. However, this version of the protocol did not yet offer official HDR support, which limited the use of the interface with newer video formats featuring extended dynamic range.
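As a rough sanity check, the 18 Gbps figure can be tied to the 4K60 numbers above. The sketch below uses the standard CTA-861 total-frame timing for 4K60 (4400×2250 pixels including blanking) and the TMDS 8b/10b encoding overhead, neither of which is quoted in the article:

```python
# Back-of-the-envelope check that 4K60 with 8-bit 4:4:4 color
# fits within HDMI 2.0's 18 Gbps aggregate bandwidth.

H_TOTAL, V_TOTAL = 4400, 2250  # total pixels per frame incl. blanking (CTA-861 4K60 timing)
FPS = 60
BITS_PER_COMPONENT = 8
COMPONENTS = 3                 # 4:4:4 -- every pixel carries all three color components

pixel_clock_hz = H_TOTAL * V_TOTAL * FPS                              # 594 MHz
payload_gbps = pixel_clock_hz * COMPONENTS * BITS_PER_COMPONENT / 1e9 # 14.256 Gbps
line_rate_gbps = payload_gbps * 10 / 8                                # TMDS 8b/10b overhead

print(round(pixel_clock_hz / 1e6))  # -> 594 (MHz)
print(round(line_rate_gbps, 2))     # -> 17.82, just under the 18 Gbps limit
```

The same arithmetic shows why 4K60 at 10-bit 4:4:4 does not fit: the line rate would grow by 25%, to about 22.3 Gbps.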

HDMI 2.0a (2015)

The HDMI 2.0a specification introduced official support for HDR. It allowed the transmission of the additional static metadata required to correctly display an image with extended dynamic range. HDR10 was the first supported format. Physically, cables and ports remained the same – to move to 2.0a, it was enough to update the device firmware. HDMI 2.0a was the first mass-market solution to deliver HDR content over standard HDMI connections.
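To make "static metadata" concrete: HDR10 describes the whole stream once with the SMPTE ST 2086 mastering display color volume plus two content light levels (MaxCLL/MaxFALL) from CTA-861.3. A minimal sketch of that structure (field names are my own; the values describe a hypothetical mastering display):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HDR10StaticMetadata:
    # SMPTE ST 2086: mastering display color volume
    red_primary: tuple[float, float]    # CIE 1931 (x, y) chromaticity
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_luminance_nits: float           # mastering display peak brightness
    min_luminance_nits: float           # mastering display black level
    # CTA-861.3: content light level info
    max_cll_nits: int                   # brightest single pixel in the stream
    max_fall_nits: int                  # highest frame-average light level

# e.g. content mastered on a 1000-nit display with BT.2020 primaries
meta = HDR10StaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_luminance_nits=1000.0, min_luminance_nits=0.0001,
    max_cll_nits=1000, max_fall_nits=400,
)
```

"Static" means this block is sent once and applies unchanged to the entire stream, which is exactly what HDMI 2.0a added the signaling for.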

HDMI 2.0b (2016)

HDMI 2.0b extended HDR support by adding the HLG (Hybrid Log-Gamma) format. This standard was designed for broadcast television and does not require metadata transfer, making it convenient for streaming services and satellite TV. Otherwise, 2.0b is identical in features and capabilities to HDMI 2.0a and uses the same cables and ports.
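HLG can work without metadata because the behavior is baked into the transfer function itself: the lower half of the curve is gamma-like, so an SDR display shows a reasonable picture, while an HDR display interprets the logarithmic upper half as highlights. A sketch of the BT.2100 HLG OETF (the published constants, not figures from the article):

```python
import math

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e in [0, 1] to an HLG signal value.
    Below 1/12 the curve is a square root (gamma-like, SDR-compatible);
    above it, a logarithmic segment compresses the HDR highlights."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(round(hlg_oetf(1 / 12), 3))  # -> 0.5, where the two segments join
```

The constants B and C are chosen precisely so the two segments meet with matching value and slope at e = 1/12, which is what makes the signal self-describing.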

HDR support and hardware requirements

With the introduction of HDMI 2.0a and 2.0b, full HDR video playback became possible, but only under a number of conditions. The devices at both ends of the connection – both the signal source and the display – must support the appropriate HDMI version and be able to process HDR metadata. In addition, the display itself must have a panel with a color depth of at least 10 bits and sufficient brightness and contrast. TVs with OLED screens and quantum-dot (QLED) models usually meet these requirements, while budget IPS and VA panels usually do not deliver adequate HDR picture quality.
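How does the source know the display meets these conditions? The display advertises its HDR capabilities in its EDID: the CTA-861 extension block can carry an HDR Static Metadata Data Block (extended tag 6) whose first payload byte is a bitmap of supported EOTFs. A minimal sketch of such a check, assuming the raw 128-byte CTA extension block has already been read from the display:

```python
def supports_hdr10(cta_block: bytes) -> bool:
    """Scan a 128-byte CTA-861 extension block for the HDR Static
    Metadata Data Block and test the SMPTE ST 2084 (PQ) EOTF bit,
    which HDR10 requires."""
    assert cta_block[0] == 0x02      # CTA extension block tag
    dtd_offset = cta_block[2]        # where detailed timing descriptors begin
    i = 4                            # data blocks start at byte 4
    while i < dtd_offset:
        tag = cta_block[i] >> 5      # upper 3 bits: block tag
        length = cta_block[i] & 0x1F # lower 5 bits: payload length
        # tag 7 = extended block; extended tag 6 = HDR static metadata
        if tag == 0x07 and length >= 2 and cta_block[i + 1] == 0x06:
            eotf_bitmap = cta_block[i + 2]
            return bool(eotf_bitmap & 0x04)  # bit 2 = SMPTE ST 2084 (PQ)
        i += 1 + length
    return False                     # no HDR data block: an SDR-only display
```

A source that finds no such block falls back to SDR output, which is the behavior described below.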

If the TV does not support HDR, the metadata is simply ignored when such content is played, and the image is displayed as standard (SDR) UHD video.
