So it doesn't really matter on higher-end TVs, but budget TVs get a bigger boost.
You guys should really start testing in all formats to see if TVs are consistent across HDR10/HDR10+/DV. I remember the U8G had terrible EOTF tracking in HDR, but upon detecting a DV signal it would start tracking the EOTF a lot more accurately. I always wanted to know how that affected the Rec. 2020 color gamut and volume, contrast, and peak brightness.
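For anyone wondering what "tracking the EOTF" means in comments like the one above: the reference curve for HDR10 and Dolby Vision is the SMPTE ST 2084 PQ function, and tracking error is just the TV's measured luminance versus the PQ target at each stimulus level. Here's a minimal Python sketch of that idea; the measured value in it is made up purely for illustration, not a real result for any TV.

```python
# SMPTE ST 2084 (PQ) EOTF: map a nonlinear signal value in [0, 1] to absolute
# luminance in nits. "EOTF tracking" reviews compare a TV's measured output
# against this reference curve at each test stimulus level.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Reference luminance (nits) for a PQ-encoded signal value in [0, 1]."""
    v = signal ** (1 / m2)
    return 10000 * (max(v - c1, 0.0) / (c2 - c3 * v)) ** (1 / m1)

if __name__ == "__main__":
    stimulus = 0.50          # 50% PQ stimulus; reference is roughly 92 nits
    measured_nits = 120.0    # hypothetical measurement, for illustration only
    target = pq_eotf(stimulus)
    print(f"target {target:.0f} nits, measured {measured_nits:.0f} nits, "
          f"error {measured_nits - target:+.0f} nits")
```

A TV that lands well above the target at mid stimulus levels (like the made-up number here) is tracking the EOTF too bright, which is the kind of error people mean when they say a set "fixes" its tracking in Dolby Vision mode.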
I have a midrange OLED (LG C1) and the biggest difference for me with Dolby Vision (streamed content) is more detail in low light scenes and less black crush.
This is the clearest explanation and comparison of HDR10, HDR10+, and Dolby Vision yet! Since Dolby Vision is the preferred format among 4K Blu-ray enthusiasts, including YouTube's best physical media experts like Jeff Rauseo, MovieGuy365, and Midlevel Media, I watch the Dolby Vision encode whenever a disc includes one. One thing that's hard to get used to is my Sony X90CL holding back and rarely reaching its full peak brightness when set to Dolby Vision Dark, but time seems to be helping me adjust. While those are my favorite physical media experts, my favorite home theater experts include YouTubers like the folks from Rtings Home Theater, Brian from Brian's Tech Therapy, Caleb Denison from Digital Trends, Quantum TV, Stop the FOMO, and Vincent from HDTVTest.
It's just great that you don't only test or compare high-end devices but also use a low-budget device and show the differences. Really great 👍🏼
On my projector, an XGIMI Horizon Ultra, the results nearly exactly match what you see on the cheap TV. Dolby Vision looks notably better than the generic 'HDR' setting. It's pretty dramatic.
It's also worth noting that Kaleidescape shows the file sizes for Dolby Vision and HDR10. Just picking Mad Max Furiosa as an example, the Dolby Vision version is 5.3GB larger than the HDR10 version - that's a lot of data! I can't say what it would be for HDR10+ as there aren't many movies finished in that format.
The problem with Dolby Vision vs. HDR10+ is that most colorists actually putting out HDR prefer to use Dolby. That's it. It's simply used more by the experts who know what they're doing, and HDR10+ gets largely neglected in that world. At least it did in the past; that may be changing now. Creating HDR10+ currently isn't as nice as creating Dolby Vision: there aren't many professional tools to make the process as easy or productive. Dolby has more money to push its format and build better tools to create it, so pros use it more. The rest of the world isn't really touching HDR yet. Even in the wedding and corporate video worlds, most people completely ignore its existence and don't understand it very well.

Dolby Vision is also incredibly expensive to create. In DaVinci Resolve it used to cost $2,500 a year to export content with proper Dolby Vision dynamic metadata. It has since dropped to a one-time fee of $1,000, likely because HDR10+ is finally being used a little more. That's really expensive, and another reason it was essentially exclusive to Hollywood and other professionals.

Fun fact: the dynamic metadata is essentially a text file with timestamps and the tone value the picture should use at each one. It's not any more complex than closed-captioning data, just a large set of timestamps with a brightness value for each. The file tells the video to adjust its gamma to that value once it hits that timestamp, much like closed captioning displays certain words when it hits a certain timestamp. That's what all this extra cost is for. Of course the TVs cost more because they have to know how to use that data and adjust the picture accordingly based on Dolby standards, but most of it is just licensing the Dolby name, like that ridiculous $2,500/year fee to create the content.

It literally just lets us set the values per scene, or even per frame if we want to. We don't even have to use it that much. A movie that rarely changes in tone might set it once and never again; sometimes a movie may only change a few times. It depends on the content and whether a normal static gamma would look worse. Changing it per frame is possible but very extreme, and nobody does it. Even per shot isn't usually a thing, because most shots in a scene share the same environment and brightness.
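To make the "timestamps plus values" idea above concrete, here's a rough Python sketch of how per-scene dynamic metadata could be represented and looked up during playback. This is only a toy illustration of the concept; it is not the actual Dolby Vision or HDR10+ metadata format, and the names and numbers in it are invented.

```python
# Toy illustration: per-scene dynamic metadata as (timestamp, value) pairs
# that playback consults as it goes. Not the real Dolby Vision format.
from bisect import bisect_right

# Hypothetical metadata: (scene start time in seconds, target max luminance in nits)
scene_metadata = [
    (0.0, 600),     # bright opening scene
    (92.5, 150),    # dim interior
    (340.0, 1000),  # sunlit exterior
]

timestamps = [t for t, _ in scene_metadata]

def target_for(playback_time: float) -> int:
    """Return the tone-mapping target active at a given playback time."""
    idx = bisect_right(timestamps, playback_time) - 1
    return scene_metadata[max(idx, 0)][1]

if __name__ == "__main__":
    for t in (10.0, 120.0, 400.0):
        print(f"t={t:>6.1f}s -> tone-map toward {target_for(t)} nits")
```

The point of the sketch is just that the metadata is a small, cheap-to-store lookup over time; the expensive part is the mastering work and licensing around it, plus the TV knowing how to act on those hints.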
How about a similar video comparing HDR10, HDR10+, and DV on disc? You mention it's different and point us to the article, which is cool, but it would also be nice to have a similar comparison video for on-disc content for those of us who want to spend money on a "premium player" and experience. Please.
Honestly, if I'm paying multiple thousands of dollars for a TV, it better support all of the formats. Yes, they have to pay a couple dollars for a DV chip, but on a TV that expensive, they can afford the "hit".
TCL 85C805K user here - my set can reach up to 1500 nits in HDR highlights. Since it supports all the formats, I found DV gives the highest peak highlights and overall brightness with highly saturated colors, HDR10+ gives the most natural-looking image with duller highlights and more reserved colors, and HDR10 is serviceable but noticeably less contrasty than the other two, leaving an overall less impressive picture. As Abby says in the video, overall bit rate will make the most noticeable difference to quality: a 100GB HDR10 Blu-ray will look crisper than a 20GB DV Amazon stream ever could.
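To put rough numbers on the bitrate point in the comment above, here's a back-of-envelope Python sketch using the 100GB and 20GB figures from that comment and an assumed ~2-hour runtime; these are illustrative averages, not measurements of any specific title, and the disc size also includes audio and extras.

```python
# Back-of-envelope average bitrate for a given file size and runtime.
# Purely illustrative; actual discs and streams vary a lot.

def avg_bitrate_mbps(size_gb: float, runtime_minutes: float) -> float:
    """Average bitrate in Mbps for a file of size_gb (decimal gigabytes)."""
    bits = size_gb * 8e9               # GB -> bits
    seconds = runtime_minutes * 60
    return bits / seconds / 1e6

if __name__ == "__main__":
    runtime = 120  # minutes, assumed
    print(f"100 GB disc  : ~{avg_bitrate_mbps(100, runtime):.0f} Mbps")
    print(f" 20 GB stream: ~{avg_bitrate_mbps(20, runtime):.0f} Mbps")
```

Under those assumptions the disc averages roughly 111 Mbps versus about 22 Mbps for the stream, which is why the delivery bitrate can matter more than which HDR format is attached to it.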
People in this community can get weird about their allegiance to HDR and DV. In reality, having that logo on something is no guarantee that it was applied well. The people mastering and authoring a disc still have to take the time to configure the title to reach its full potential... yet sometimes shortcuts are taken.
I have an old Hisense U6G, and just by changing some color settings the picture quality is greatly improved in all HDR formats.
It wasn't mentioned, but the Apple TV+ streaming service has supported HDR10+ for a few months. I've also noticed that many titles I purchased digitally from Apple's movie store that were once listed as "HDR10" are now "HDR10+." It's possible those titles already had HDR10+ on other platforms, e.g., Amazon, but obviously Apple needed to update the firmware on their Apple TV devices to support HDR10+ before they could show that designation in the user interface. But who knows; it's possible the studios only recently added HDR10+ metadata because Samsung commands ~60% of the TV market and has steadfastly refused to license Dolby Vision.

Incidentally, the licensing cost to have Dolby Vision on a TV is marginal -- it's estimated that if Samsung added Dolby Vision to their TVs, it would probably only be $3-$4 per TV. In other words, it's totally a pride thing. On a tangent, Dolby Vision licensing for discs is higher than the royalties paid to Dolby Labs for streaming, so you run into situations where the 4K Blu-ray disc for a movie does NOT have Dolby Vision but the same movie is available with Dolby Vision via streaming - this applies to many of Disney's titles.

Cheap TVs like the ones in this video aside, where they saw differences not present on more premium TVs, I'm of the strong opinion that unless content is deliberately mastered with Dolby Vision in mind, it's not likely to make any difference, assuming you're comparing apples to apples. I don't think it makes sense to compare a Dolby Vision stream of a movie with the same movie coming from an HDR10 4K Blu-ray. The bitrate coming off a disc is always noticeably higher; depending on the streaming platform, a 4K Blu-ray disc could be delivering 3x as much data.

The best video I've seen showing that Dolby Vision DOES make a difference for 4K Blu-ray playback (HDR10 vs. Dolby Vision) is this one from Linus Tech Tips: https://youtu.be/na1hqx4Yi68?si=BqWLMQSHk9BkplNA When you get to the Dolby Vision part at around 7 minutes, Linus did a simple test with his staff: they played content from a 4K Blu-ray disc in HDR10 on one TV and the same content from a second copy of the disc in Dolby Vision on another TV. With the TVs side by side, every single person picked the Dolby Vision TV as looking better. So unless you plan on buying a dedicated 4K Blu-ray player, which HDR format is used is probably NOT going to make much difference for you.

When it comes to streaming, take this blog post from Netflix: "All of Netflix's HDR video streaming is now dynamically optimized" https://netflixtechblog.com/all-of-netflixs-hdr-video-streaming-is-now-dynamically-optimized-e9e0cb15f2ba My inference from that article is that Netflix uses HDR10 and Dolby Vision as general containers and doesn't specifically target the strengths of Dolby Vision. That said, it might still be possible to notice a difference between formats for the same show, particularly when things are done wrong. I recall a review of Netflix's "Marco Polo" series where the reviewer was surprised to find the Dolby Vision stream inferior to the HDR10 stream. I myself recently noticed a difference in "The Umbrella Academy," where an indoor scene (a banquet hall) was brighter in HDR10 than in Dolby Vision - the HDR10 version of the scene looked too bright. And this was on a very expensive TV, I might add.
This is such a great and informative comparison, but you guys missed one: RTX HDR. Let's just say this feature has saved me a lot of money. If you decide to try it, make sure to use HGIG so the graphics card does all the tone mapping. Also, using the Color Control app to force it onto a preset other than Game Optimizer is a must, to avoid that preset's more aggressive ABL.
I don't think HDR has a single industry standard yet. Every manufacturer interprets HDR differently, and that's why we have so many HDR types. Modern TVs are great for calibrated SDR, and you still get a great-looking picture.
Finally a channel made a video on this! While I wish you had compared HDR10+ and Dolby Vision on 4K Blu-ray too, this gave good insight into the formats. Hope to see a comparison with 4K Blu-rays in the future.
There are other things to consider too; for example, HDR10+ on one panel can look better than Dolby Vision on another. When I was cross-shopping for a TV, I was deciding between the LG C3 and the Samsung S90C. The C3 has Dolby Vision support, and the S90C "only" has HDR10+ support. However, the panel on the S90C is brighter and more colorful and has better response (as it's QD-OLED), so HDR10+ on the S90C actually looks better than Dolby Vision on the C3. So although Dolby Vision is "technically" superior to HDR10+, the panel itself makes a big difference too.
One thing I don't see mentioned is how subtitles are handled, especially in dynamic-metadata formats like DV. It can actually be quite annoying, as they change from white to various shades of grey in an attempt to maintain the same brightness.