Ken Eagle, director of field training/technical sales at Atlona, provides his take on where we are with HDR and what installers need to know.
Seeing is believing. So while we can talk to clients about HDR until we’re red-green-blue in the face, all it’s going to take to convince most people that HDR is worth the fuss is one good side-by-side demo. After all, one way to think about HDR is that it is the key remaining qualitative difference between the rich, vivid images you see in a theatrical release and what you see in today’s high-end home theatre.
But we’re getting ahead of ourselves. It’s true that there’s already a lot of information out there about HDR, HDR displays and HDR content, but I’ll say it’s also true that there’s not enough out there (yet) that puts this information into a concise context for installers.
Before delving deeper, there is one question you’ll want to be prepared to answer: is HDR a fad? After all, look at 3D (ironically, many new TVs have eliminated 3D support in favour of HDR).
While no one can say with certainty what’s going to succeed, it’s a pretty safe bet that HDR is going to be with us for a while. HDR represents a major advancement in displaying more accurate, more realistic and more vivid colour, with differences that are clearly visible. So much so that viewers consistently choose HDR over non-HDR as the better picture, even when the HDR content is shown on lower-resolution displays.
Also, in terms of the range of colours it can render, HDR is already a standard feature of cinema and TV video cameras; so while HDR content has so far been slow in coming to consumers, we can expect a gradual if not rapid increase.
So what exactly is HDR?
To start, we’ll review some of the key specifications we need for background in understanding the improvements HDR brings, starting with 4K and Ultra HD.
Unfortunately, the terms 4K and UHD get used interchangeably. 4K is shorthand that technically refers to the digital cinema specification for displays measuring 4096 pixels wide by, typically (though the height is not part of the spec), 2160 pixels tall, creating an image area with an aspect ratio of approximately 17:9. This is a wider-format image than the 16:9 ratio used as the basis for the 1920 x 1080 pixel area associated with HDTV.
While a wide, 4K 17:9 image ratio might be desirable for cinematic productions, displays designed for home entertainment, if built to a 17:9 specification, would waste roughly seven percent of their pixel area when showing content shot for TV and other 16:9 formats. Scalers could resize the image to use the entire screen area, but would introduce a modest yet undesirable distortion of the picture’s proportions.
Ultra HD, or UHD, stems from a set of broadcast, telecom and consumer electronics industry specifications for 4K and 8K digital TV, including a CTA (Consumer Technology Association) specification that Ultra HD displays provide an input for a minimum native signal resolution of 3840 x 2160. This is twice the vertical and horizontal resolution of full HDTV, preserving the 16:9 aspect ratio while quadrupling the number of pixels used to render the picture compared to full HD (roughly 8.3 million vs. 2.07 million pixels).
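The pixel-count comparison above is easy to verify. A quick illustrative sketch in Python:

```python
# Pixel counts for a full HD frame vs. a UHD frame.
hd_pixels = 1920 * 1080    # 2,073,600 -> roughly 2.07 million
uhd_pixels = 3840 * 2160   # 8,294,400 -> roughly 8.3 million

# UHD doubles each dimension, so the total pixel count quadruples.
print(uhd_pixels // hd_pixels)  # 4
```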
The UHD specifications also call for other improvements in bit-depth, colour gamut (both key parameters for rendering HDR video) and refresh rates.
In recent years, technologies were developed for increasing the dynamic range of pixels used in LCD video displays (as well as the dynamic range of sensors in video capture and post-production equipment) and last year the CTA released HDR10, an open standard for high dynamic range UHD video supported by major manufacturers.
Having all these pixels, whether 4K or UHD, is great for rendering more detailed images, but moving the digital data comprising them from source to display requires more bandwidth than HD. Just how much more is determined not only by the number of pixels, but also by the image refresh rate and colour depth.
Refresh rates for cinema and consumer video typically don’t exceed 60 frames per second, with 24, 30 and 60 fps being the common rates. This parameter won’t change as video moves from HD to 4K/UHD screens.
The HDTV specification for colour depth is eight bits. Interestingly, a video signal of 4K or UHD resolution and 8-bit colour depth at 60 fps (with 4:2:0 chroma subsampling) fits within the 10.2 Gb/s bandwidth of HDMI 1.4 (as well as HDBaseT) technology. This allowed manufacturers to start selling 4K sets for that generation of source and switching devices, but their only advantage was higher resolution, not the enhancements to dynamic range, which produce the most visible improvements in quality. Even if those sets had been capable of HDR performance, they could not accept the data needed. They also came to market before high-speed, 18 Gb/s HDMI 2.0 was available, resulting in a generation of 4K TVs that won’t work with new generations of Ultra HD Blu-ray players, AVRs, switchers and distribution amps.
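These bandwidth figures can be approximated from the three parameters above: resolution, refresh rate and colour depth. A rough back-of-the-envelope sketch in Python (uncompressed pixel data only; it ignores blanking intervals and the link-encoding overhead that raise the actual on-wire rate):

```python
def data_rate_gbps(width, height, fps, bits_per_sample, samples_per_pixel=3):
    """Uncompressed video data rate in Gb/s (ignores blanking and line coding)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# UHD, 8-bit, 60 fps, full 4:4:4 colour: ~11.9 Gb/s of pixel data alone,
# already beyond HDMI 1.4's 10.2 Gb/s ceiling.
print(round(data_rate_gbps(3840, 2160, 60, 8), 1))   # 11.9

# With 4:2:0 chroma subsampling, the average drops to 1.5 samples per pixel,
# so the same signal needs only ~6.0 Gb/s and fits within 10.2 Gb/s.
print(round(data_rate_gbps(3840, 2160, 60, 8, samples_per_pixel=1.5), 1))  # 6.0

# 10-bit colour at 60 fps, 4:4:4: ~14.9 Gb/s, which is why full UHD HDR
# signals call for 18 Gb/s HDMI 2.0 hardware.
print(round(data_rate_gbps(3840, 2160, 60, 10), 1))  # 14.9
```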
While HDTV calls for 8-bit colour, the UHD specifications call for 10-bit. This difference is more significant than the small delta in digits suggests, because each pixel comprises three ‘sub-pixels’ of red, green and blue, with each element defined by its colour code. For HDTV, this means each sub-pixel can render 256 (2⁸) shades, allowing each whole pixel to display up to 16.7 million (256³) colours. However, with a display based on 10-bit colour, each whole pixel becomes capable of rendering over 1 billion, or (2¹⁰)³, colours.
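The jump from 8-bit to 10-bit colour is simple to confirm numerically. A short illustrative sketch:

```python
def colours_per_pixel(bits):
    """Shades per sub-pixel and total colours per pixel for a given bit depth."""
    shades = 2 ** bits          # levels per red/green/blue sub-pixel
    return shades, shades ** 3  # combinations across the three sub-pixels

print(colours_per_pixel(8))    # (256, 16777216)    -> ~16.7 million colours
print(colours_per_pixel(10))   # (1024, 1073741824) -> over 1 billion colours
```

Two extra bits per sub-pixel multiply the total palette by 64, which is what makes the smoother gradients and wider tonal range of HDR possible.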
One might think this just allows 4K and UHD screens to display more shades within the existing colour range of HDTV, but here is where Ultra HD with HDR makes the difference.
For more information about 4K workflows and HDR, please contact a Piston Media Group representative (888) 829-7320 or email us at email@example.com