Sen
I happened to read a short, quick explanation of HDR.
I'm copying and pasting it here, in case anyone is interested:
I'm going to provide a response, but also wanted to CC /u/evilmnky204
-----------------------------
> How does this monitor fail HDR?
There are two currently competing HDR standards: Dolby Vision and HDR10. I won't go over the specs and pros/cons of each, just understand that they are different standards that require certification.
In order for HDR to work, you need a certified chain. So, for HDR10, the Console/PC, the game, and the display must all be certified for HDR10. Same applies for Dolby Vision.
The PS4 uses HDR10 as its standard. That means all HDR-compatible games for the PS4 will use HDR10. Therefore, your display MUST be HDR10 certified. If it's not HDR10 certified, your PS4 won't detect the presence of an HDR display, period. As far as the PS4 is concerned, this monitor does not support HDR.
That said, I did say that it was fake HDR, and while evilmnky204 felt this was a bit harsh, I'm going to elaborate and tell you why I'm correct.
The whole point of HDR is to show a high range of color and luminance that is dynamic in nature (hence "high dynamic range"). Wide gamut has been possible for a long time, but luminance is the big issue due to the limited contrast of non-OLED displays. Because of this, all content is mastered using a technique called tone-mapping that maps luminance into this limited range. HDR allows that range to be expanded.
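To make the tone-mapping idea concrete, here's a minimal sketch (my own illustration, not from the original post) using a simple Reinhard-style operator; the 100-nit and 1000-nit white points are assumed mastering targets, and real mastering pipelines are far more involved:

```python
# Sketch: squeezing absolute scene luminance into a display range.
# The SDR curve (100-nit white point) crushes everything above ~100 nits
# toward 1.0, while the HDR curve (1000-nit white point) keeps highlights distinct.

def reinhard_tone_map(luminance_nits, white_point_nits):
    """Map an absolute luminance into [0, 1] relative to a display white point."""
    x = luminance_nits / white_point_nits
    return x / (1.0 + x)

scene_nits = [0.05, 100, 1000, 4000]   # shadow, midtone, highlight, bright highlight
sdr_white, hdr_white = 100, 1000       # assumed mastering targets

for nits in scene_nits:
    sdr = reinhard_tone_map(nits, sdr_white)
    hdr = reinhard_tone_map(nits, hdr_white)
    print(f"{nits:7.2f} nits -> SDR {sdr:.3f}, HDR {hdr:.3f}")
```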
Let's say that we have a display with a max brightness of 500 nits. Now, let's say that this is a standard VA panel (common in HD/UHD TVs), so not OLED. And let's compare two, one using standard edge-lit LED backlighting, and one using FALD (Full Array Local Dimming). For the record, the Samsung display in this thread is edge-lit.
Now, let's take an all-black image with a glowing white orb floating across from one side to the other. To properly display HDR, the FALD display will show the orb at or near 500 nits, the all-black zones will be off or at very low light, and any zones that mix the orb and black will sit at a mid-level of brightness. The edge-lit display? It runs a single uniform level of brightness across the whole panel, because it cannot properly display HDR content.
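Here's a rough sketch of that orb scenario (again, my own illustration with assumed numbers, not something from the original post): each FALD zone is driven by the brightest pixel it contains, while an edge-lit panel only has one backlight level for the whole frame.

```python
# Sketch: per-zone local dimming vs. a single global backlight level.

def fald_backlight(zone_peaks, max_nits=500):
    """Each local-dimming zone is driven by the brightest pixel it contains."""
    return [peak * max_nits for peak in zone_peaks]

def edge_lit_backlight(zone_peaks, max_nits=500):
    """Edge lighting has one level for the whole frame, so every zone gets it."""
    frame_level = max(zone_peaks) * max_nits
    return [frame_level for _ in zone_peaks]

# Peak pixel value per zone (0..1): a mostly black frame with a white orb in zone 3.
zone_peaks = [0.0, 0.0, 1.0, 0.1, 0.0]

print("FALD:    ", fald_backlight(zone_peaks))      # [0, 0, 500, 50, 0]
print("Edge-lit:", edge_lit_backlight(zone_peaks))  # [500, 500, 500, 500, 500]
# FALD turns the dark zones off and pushes the orb's zone to peak brightness;
# the edge-lit panel lights the entire frame, washing out the blacks.
```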
So, "HDR" displays that aren't certified for HDR10 and/or Dolby Vision, and that use Edge Lighting, [like the Dell S2718D that I reviewed here] (https://www.reddit.com/r/Monitors/comments/676xts/dell_s2718d_impressions_ultrathin_fake_hdr_fake/), are just adjusting the entire screen's brightness on the fly. This doesn't look good and I prefer it off.
You might know it by its old name, "Dynamic Contrast," which is a feature that most monitors have. But marketing has seized the opportunity to rename it "HDR."
This is fake HDR. And no, it's not harsh to call it out as such. It's harsh that marketing thought we'd fall for it. It's harsher that many of us have.