Lack of HDR Standards Threatens 4K Market
That’s what Lightreading.com’s Brian Santo believes:
Ultra HD TV has a problem. Even as UHD/4K TVs are being sold and companies like Netflix, YouTube and DirecTV are pushing 4K content, the standards required for getting the best 4K viewing experience still need work. A lot of work, according to a recent report from SMPTE.
The ugly truth about Ultra HD is that the increase in resolution isn’t worth it. Under most viewing circumstances, the human eye simply cannot perceive the difference. There is a step improvement in Ultra HD video, but it derives from other properties: high dynamic range (HDR) and wide color gamut (WCG).
The standards for HDR and WCG are still lacking. It’s an issue that most TV technologists have known about for months, if not years, but there’s no telling how big a problem it might turn out to be. After all, when consumers found out their first “HD” TVs were only 720 lines instead of 1080, most of them just shrugged. Furthermore, millions of people bought HD TVs, watched only SD video on them, and didn’t notice the lack of improvement.
But that’s no guarantee that plunging into the 4K era despite a lack of HDR and WCG standards won’t come back to bite the TV industry in the ass.
The 4K market, already in progress, cannot be put on hiatus while HDR/WCG standards are developed. LG Electronics Inc. (London: LGLD; Korea: 6657.KS), Samsung Corp. and Sony Corp. (NYSE: SNE) already have TVs on the market that support HDR. Comcast Corp. (Nasdaq: CMCSA, CMCSK) is planning to introduce a set-top box in 2016, the Xi6, that supports Samsung HDR sets, according to Fierce Cable. Chip companies are beginning to jump into the act as well. (See Sigma’s UHD Chip With HDR Ships in Volume.)
Yet forging ahead with 4K, and especially with non-standard HDR, before HDR/WCG standards are settled increases the likelihood that more early adopters will be left with products that become obsolete long before they should.