Dolby Vision vs. HDR10

Post by infernovia » 12 Dec 2016 23:55

This is definitely something you want to look for in your next big TV purchase.

First off, here is a background on why Dolby Vision was created:

http://www.theverge.com/2014/1/6/527693 ... lly-bright
The unit of measurement is colloquially known as a "nit", and CRT displays back in the day had an average peak brightness of 100 nits; that’s still the same reference level used today. Modern day televisions take that signal and stretch it to match their own peak brightness (usually between 400–500 nits), but that has its limits. If you push the image much brighter, it starts to fall apart.

The problem is that the human eye is used to seeing a much wider range in real life. The sun at noon is about 1.6 billion nits, for example, while starlight comes in at a mere .0001 nits; the highlights of sun reflecting off a car can be hundreds of times brighter than the vehicle’s hood. The human eye can see it all, but when using contemporary technology that same range of brightness can’t be accurately reproduced. You can have rich details in the blacks or the highlights, but not both.

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better.

DOLBY BUILT A LIQUID-COOLED EXPERIMENTAL DISPLAY

With that knowledge in hand, Dolby then built a 1080p, liquid-cooled experimental display with a backlight made up of 18,000 RGB LEDs (in comparison, its standard reference monitor uses a mere 4,500). With a peak brightness of 4,000 nits, it allowed the company to color grade footage with vastly improved contrast and dynamic range — and it was the same kind of monitor Dolby brought out for journalists in a side-by-side shootout with its current professional display.

While both used the same LCD panel, the difference was staggering. It wasn’t just a higher-quality version of the same image; it was a new kind of imagery. With the ability to reproduce a wider range of the color gamut, images glowed luxuriously. A worker welding looked like a clipped, diffuse blur of white on the standard display; on the Dolby Vision monitor it was a sharp punch of luminescent detail. A person stood in silhouette, and when the sun peeked out from around their head, I actually squinted. Granted, a demo is always a best-case scenario when evaluating new technology, but the combination of increased detail, color reproduction, and ultrabright highlights recreated reality in a way I’d simply never seen on a television before.
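
Just to put those numbers in perspective, here is a quick back-of-the-envelope calculation (the nit figures are the ones quoted above; the conversion to photographic stops is mine):

import math

# Luminance figures quoted above, in nits (cd/m^2)
starlight = 0.0001        # dimmest example in the article
crt_reference = 100       # classic CRT / SDR reference white
typical_tv = 500          # top of the "400-500 nits" range
dolby_prototype = 4000    # Dolby's liquid-cooled experimental display
noon_sun = 1.6e9          # brightest example in the article

def stops(lo, hi):
    # Dynamic range between two luminance levels, in photographic stops (doublings)
    return math.log2(hi / lo)

print(f"starlight to noon sun: {stops(starlight, noon_sun):.1f} stops")                       # ~43.9
print(f"100-nit reference to a 500-nit TV: {stops(crt_reference, typical_tv):.1f} stops")     # ~2.3
print(f"100-nit reference to the 4,000-nit prototype: {stops(crt_reference, dolby_prototype):.1f} stops")  # ~5.3

In other words, even the 4,000-nit prototype only claws back a few extra stops of the enormous range the eye handles, which is why the benefit shows up in highlights rather than the whole picture simply getting brighter.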


Onto the format war:

http://www.digitaltrends.com/home-theat ... explained/
While Dolby Vision may have been first, it isn’t currently the most popular format. Instead, the current most popular format is HDR10, an open standard backed by companies including Samsung, LG, Panasonic, and Hisense, who also happens to own the rights to the Sharp brand name for TVs in the U.S. It isn’t as technologically advanced as Dolby Vision’s theoretical specs, but then again, neither are the Dolby Vision-enabled TVs you can go out and buy right now.

The HDR10 standard currently uses 10-bit color, while Dolby Vision uses 12-bit color depth. Both of these offer millions of colors per pixel, and the difference will be difficult to spot depending on how a given movie or TV show is mastered. Since one of the goals of HDR is to offer greater color volume, a higher color depth is desirable, at least in theory, but even 10-bit color depth is a major step up from the 8-bit color depth used in standard dynamic range TVs.

[...]

While HDR10 plays it safer than Dolby Vision when it comes to technology, it is also more feasible for TV manufacturers to implement right now, rather than in the future, so it has become the more popular of the two formats. This is also helped by the fact that HDR10 is an open standard — TV manufacturers can implement it freely. It is also recommended by the UHD Alliance, which generally prefers open standards to proprietary formats like Dolby Vision.

[...]

HDR10 might currently be in more TVs, but that might not necessarily always be the case down the road. In terms of sheer technological might, Dolby Vision has a clear advantage, even with TVs available today. Looking toward the future, the gap between Dolby Vision and HDR10 becomes even more apparent.

As mentioned earlier, Dolby Vision offers 12-bit color depth, as opposed to the 10-bit color depth offered by HDR10. It also offers higher theoretical brightness. HDR10 currently maxes out at 1,000 nits, while Dolby Vision can currently handle 4,000 nits, with Dolby saying that future upgrades to its tech could allow it to handle up to 10,000 nits of maximum brightness. Current displays can’t display anywhere near that, but that won’t always be the case.
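
To make the bit-depth comparison concrete, here is the arithmetic (a quick Python sketch, assuming the usual three channels per pixel):

for name, bits in [("SDR (8-bit)", 8), ("HDR10 (10-bit)", 10), ("Dolby Vision (12-bit)", 12)]:
    levels = 2 ** bits           # shades per color channel
    colors = levels ** 3         # possible colors per pixel across R, G, B
    print(f"{name}: {levels:,} levels per channel, {colors:,} colors per pixel")

# SDR (8-bit): 256 levels per channel, ~16.8 million colors per pixel
# HDR10 (10-bit): 1,024 levels per channel, ~1.07 billion colors per pixel
# Dolby Vision (12-bit): 4,096 levels per channel, ~68.7 billion colors per pixel

So strictly speaking, 10-bit and 12-bit land in the billions of colors rather than millions; the practical point is that more gradations mean less visible banding across the much wider brightness range HDR has to cover.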


Knowing this, I would recommend picking a display that supports both. However, there simply isn't much in the pipeline that supports Dolby Vision, so it might be easy to ignore for now.

Post by infernovia » 13 Dec 2016 00:13

The updated "4K-ready" consoles support HDR10:

http://www.techradar.com/news/hdr10-vs- ... format-war
Both 4K game consoles, Xbox One S and PS4 Pro, support HDR10, as do many set top boxes including the Roku Premiere+ and Roku Ultra, as well as Ultra-HD Blu-ray players from Samsung and Panasonic.


However, note that a lot of the TVs that are HDR/Dolby Vision capable can't actually output the needed brightness:

http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm
Contrary to popular belief, the purpose of HDR (high dynamic range) mastering is to expand the available luminance range rather than elevate the overall brightness of HDR videos. High-end display calibration software maker Light Illusion has published this exact quote from SMPTE’s ST.2084:2014 standard “High Dynamic Range Electro-Optical Transfer Function of Mastering Reference Displays” on its website:

This EOTF (ST2084) is intended to enable the creation of video images with an increased luminance range; not for creation of video images with overall higher luminance levels. For consistency of presentation across devices with different output brightness, average picture levels in content would likely remain similar to current luminance levels; i.e. mid-range scene exposures would produce currently expected luminance levels appropriate to video or cinema.

What this means is that for most scenes, 4K Blu-ray’s Average Picture Level (APL) in HDR should not deviate drastically from that of a regular 1080p Blu-ray in SDR (standard dynamic range). Indeed, that’s what we found in our own 4K Ultra HD Blu-ray vs Blu-ray comparisons.
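
For anyone curious, the ST.2084 "PQ" transfer function the article refers to is straightforward to compute. Here is a minimal Python sketch using the constants from the spec (the 50% example is my own illustration of the average-picture-level point):

# ST.2084 (PQ) EOTF: decode a normalized signal value in [0, 1] to absolute luminance in nits
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))   # 10000.0 -- the very top of the signal range maps to a 10,000-nit peak
print(pq_eotf(0.5))   # ~92 -- a mid-level signal still lands around SDR-like brightness

That is the point of the SMPTE passage above: the extra range is reserved mostly for highlights, so average picture levels stay close to what SDR already delivers.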


So it's not the "true" implementation of the tech like the Dolby experiment mentioned before, but it can still make a difference. The proof is in the actual games, though, and unfortunately I don't have an HDR-capable display to test things with.

