4K Displays for HTPCs: A Consumer Checklist

by Ganesh T S on 6/4/2015 11:00 PM EST

68 Comments

  • mike55 - Friday, June 5, 2015 - link

    I don't understand what HDR means for the new Blu-ray standard. Current Blu-rays can already display different intensities of colors from pure white to pitch black and everything in between. What exactly does HDR do for the new standard?
  • Taneli - Friday, June 5, 2015 - link

    Current standards don't allow pitch blacks or pure whites. Dynamic range is the distance between black and white, practically the same as contrast ratio. Strictly speaking, the current HDTV standard REC.709 is only about 5 stops (1:32 contrast ratio) from black to white, which is inherited from the analog era and is a really low value. Film and newer video cameras have significantly higher DR, and the dynamic range of HDTVs themselves falls between 7 and 9 stops (1:128 to 1:512). Content is usually adjusted to fit this range and digitally coded at 8 bits per channel, which looks kinda OK, but without a proper common standard it's still kind of a workaround.

    The new HDR standard will raise the dynamic range to somewhere between 12 (1:4096) and 14 (1:16384) stops. Then we'll have a standard that makes better use of current digital technology (and allows more accurate presentation of film). Compared to current TVs, HDR allows the blacks to be blacker and the whites to be whiter.
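    As a quick sanity check of those figures (one stop being a doubling of luminance), a throwaway Python sketch:

        # One photographic "stop" doubles the luminance, so a range of
        # N stops corresponds to a contrast ratio of 1:2^N.
        for stops in (5, 7, 9, 12, 14):
            print(f"{stops} stops -> contrast ratio 1:{2 ** stops}")
        # 5 -> 1:32, 7 -> 1:128, 9 -> 1:512, 12 -> 1:4096, 14 -> 1:16384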
  • hawtdawg - Friday, June 5, 2015 - link

    "the current HDTV standard REC.709 is only about 5 stops (1:32 contrast ratio) from black to white,"

    Then how am I able to do a 20 pt greyscale reading on my HDTV?
  • theSuede - Saturday, June 6, 2015 - link

    Because you're confusing DR with tone resolution. A range with a five-stop DR (end-to-end brightness difference) may still have several hundred discernible steps, i.e. the minimum step-to-step differentiation can be fine.
  • blue_urban_sky - Friday, June 5, 2015 - link

    I'm guessing that if you have 8 bits per channel for color information, you get the 256 levels currently used (00,00,00 --> ff,00,00); an extra 2 bits of data for HDR would give you 1024 levels. A TV could then have a higher max brightness without it interfering with the content we have at the moment.
  • Taneli - Friday, June 5, 2015 - link

    Dynamic range is independent of bit depth. With high dynamic range, higher bit depth is preferable to avoid banding, but HDR is possible at lower bit depths too.
  • asanagi - Saturday, June 6, 2015 - link

    Dynamic range is literally the exact same thing as bit depth. Bit depth is a measure of dynamic range. Why do you write nonsense things?
  • Timbrelaine - Sunday, June 7, 2015 - link

    Nope. Bit depth is the number of bits used to store color information, with more bits giving you the ability to distinguish between more shades of a color. But it doesn't tell you anything about the dynamic range, which is the difference between the lowest and highest possible luminance value that the display can actually reproduce. A display with a huge bit depth and tiny dynamic range might be able to distinguish between billions of different colors, but they'd all be shades of gray.
  • iAPX - Tuesday, June 9, 2015 - link

    +1 @Timbrelaine

    That's not even taking into account the gamma curve that is used: even with 8-bit source information, you might need a 10-bit or 12-bit display once you apply gamma correction or calibration.

    Rec.2020 on UltraHD "4K", with its larger gamut, will make it possible to build better panels and to separate the crap from the high-end products.
  • Kjella - Friday, June 5, 2015 - link

    It's a pure white, but not a bright white. Imagine brightness as a floating point value between 0 and 1, where 1 is currently set rather low. If you redefine 1 to be as bright as staring into the sun, all old content would be way too bright and all new content would look pitch dark on old TVs. So HDR can maybe be explained as "extended brightness" that would be above 1 in the old encoding.

    It also brings more levels, because if you have too few bits to describe the brightness you get posterization - distinct jumps in brightness, the same way you get color banding if you have jumps in colors. But that's a side effect of the "distance" between 0 and 1 increasing: you also want more precision.
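    To put toy numbers on the posterization point (assuming, purely for illustration, a linear encoding with an SDR peak of 100 nits and an HDR peak of 1000 nits - real video uses gamma/PQ curves), a Python sketch:

        # Luminance step size when a linear 0..1 signal is quantized at a given
        # bit depth. Toy model only: real video is gamma/PQ coded, not linear.
        def step_nits(bits, peak_nits):
            return peak_nits / (2 ** bits - 1)

        print(step_nits(8, 100))    # 8-bit over an SDR-ish range: ~0.39 nits per step
        print(step_nits(8, 1000))   # 8-bit stretched over HDR:    ~3.9 nits per step -> banding
        print(step_nits(10, 1000))  # 10-bit over the same range:  ~0.98 nits per step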
  • hawtdawg - Friday, June 5, 2015 - link

    It's basically an industry standard that is finally forcing television manufacturers to stop making TVs with shit contrast ratios.
  • Mark_gb - Saturday, June 6, 2015 - link

    Very disappointed that there are no DisplayPort ports. I have not seen any video cards that have more than 1 HDMI 2.0 port, but some do now have 3 DisplayPort outputs.

    As much as I would love to grab one of these, I think I will wait for a 10-bit panel and DisplayPort before I make the jump. I am expecting that a reasonably priced monitor with those is probably a year or so away, and by then, we should have a lot more video cards out there capable of easily doing 4K 10-bit color over DisplayPort.
  • 457R4LDR34DKN07 - Friday, June 5, 2015 - link

    Any way to determine what the panel is in a Sony XBR-65X850C? It's not to be found in any brochure or specification sheet.
  • sheh - Friday, June 5, 2015 - link

    Spec sheet?! Manufacturers never publish what panels they use, even in cases where it's a specific panel per model number.
  • 457R4LDR34DKN07 - Sunday, June 7, 2015 - link

    Apparently the AVclub forums suggest it has a 10-bit panel.
  • nevcairiel - Friday, June 5, 2015 - link

    From an HTPC perspective, unfortunately I'm not convinced we will be able to make proper use of HDR. UHD Blu-rays may contain HDR metadata, and our software video players may even be able to read that metadata from the discs... but then we're at an impasse.

    The HDMI output is controlled by the GPU, so how do we give this HDR information to the GPU to send it to the display? Unfortunately, I do not think that the GPU vendors really care to invent a standard interface for this.
  • ganeshts - Friday, June 5, 2015 - link

    Hendrik,

    Have you tried to talk to Intel / NVIDIA / AMD about this? I would first approach NVIDIA, since they already have HDMI 2.0 cards and are also promising a firmware update to HDMI 2.0a; I think we still have time - let the first UHD Blu-rays come out... Maybe Cyberlink (the only licensed software Blu-ray player at the moment, I think) will solve the problem for us :)
  • nevcairiel - Friday, June 5, 2015 - link

    If Cyberlink solves it, then it will only be usable with Cyberlink software, so good luck with that. :)
  • ganeshts - Friday, June 5, 2015 - link

    I meant that Cyberlink would have worked with the GPU vendors to transfer the metadata - so the GPU vendors would have to come up with some APIs in the drivers, right? It shouldn't be difficult to convince the GPU vendors to open up those details, hopefully... Maybe I should start asking around at Intel / NVIDIA about what they plan to do about this.
  • cjb110 - Friday, June 5, 2015 - link

    If it's CyberLink, it'll only be in the pro/premium/"super duper" versions, and they won't mention that their "Blu-ray player" software can't do HD audio, more than 5.1 channels, HDR, or >1080p until after you buy it.
  • geekfool - Friday, June 5, 2015 - link

    If you want to look into these options, you are advised to look at the Doom9 forums where the real devs and testers hang out; see
    http://forum.doom9.org/search.php?searchid=7276627

    http://forum.doom9.org/showthread.php?t=171219&...

    For a full review, AnandTech might be wise to ask NikosD etc. to clarify testing for the current and upcoming 10-bit/12-bit options.
  • geekfool - Friday, June 5, 2015 - link

    Oops, the first URL didn't add the search parameters; use the second URL and

    http://forum.doom9.org/showthread.php?p=1706784#po...

    .....NikosD "Right GTPVHD.

    I took a quick look on over 10 reviews, nobody has the knowledge or will to test its HTPC/Media capabilities...." :)
  • douglord - Friday, June 5, 2015 - link

    The Shield console seems to have outputs that are made for HDR and 10-bit color. So NVIDIA is paying attention. When will that translate to a GPU?
  • CapablancaFan - Monday, June 8, 2015 - link

    Whoever solves it may monopolize it and crank up costs, rather than it becoming a common standard where everyone with the appropriate hardware benefits.
  • ajp_anton - Friday, June 5, 2015 - link

    The TV specs you posted only mention 4:4:4, but what about RGB?
    You also say "RGB 4:4:4" frequently in the article - what does this mean? 4:4:4 is meant for YUV, and those numbers don't really make any sense for RGB.

    Also, while 60Hz is nice for viewing the desktop/UI/anything computer related, for movie watching I'm more interested in whether there's support for (a multiple of) 24Hz and 25Hz.
  • sheh - Friday, June 5, 2015 - link

    It means that the display doesn't blur the color information.
  • ajp_anton - Friday, June 5, 2015 - link

    Ah, I seem to remember some TVs convert RGB to subsampled YUV for processing. So "RGB 4:4:4" means it doesn't do this (at least the subsampled part)?
  • sheh - Friday, June 5, 2015 - link

    Yeah. Regardless of the exact terminology, it's supposed to mean full pixel information retained all the way to the panel.

    Why do TVs degrade the input signal? No idea. I'm sure the hardware is capable. Laziness in programming, I assume. Sadly, it's the norm for firmware to be stupid, and things don't necessarily improve over time. Sometimes they even get worse.
  • knutinh - Friday, June 5, 2015 - link

    Q:"Why would tvs process their input in 4:2:2/4:2:0 formats?"

    A: Because it is a significant reduction in the size of temporary buffers needed and the number of bytes that has to be processed, and because the signal they are mainly being used for is restricted to 4:2:0 anyways.
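    As a rough illustration of that buffer saving, for a single 3840x2160 frame at 8 bits per component (raw sample counts only, ignoring padding and alignment), a Python sketch:

        # Bytes per frame: 4:4:4 keeps full-resolution chroma, 4:2:2 halves it
        # horizontally, 4:2:0 halves it both horizontally and vertically.
        W, H = 3840, 2160

        def frame_mib(chroma_fraction, bytes_per_sample=1):
            luma = W * H
            chroma = 2 * W * H * chroma_fraction    # two chroma planes
            return (luma + chroma) * bytes_per_sample / 2**20

        print(frame_mib(1.0))    # 4:4:4 -> ~23.7 MiB
        print(frame_mib(0.5))    # 4:2:2 -> ~15.8 MiB
        print(frame_mib(0.25))   # 4:2:0 -> ~11.9 MiB (half the 4:4:4 size)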
  • nevcairiel - Friday, June 5, 2015 - link

    TVs degrade the input because the processing chips they source are usually only designed for 4:2:2 YUV processing. It's a long and sad story...
  • sheh - Friday, June 5, 2015 - link

    Any references for that? I find it hard to believe. Computer monitors, cellphones, and other TVs have all been capable, for some years now, of 1080p or higher without color detail loss. We're not talking about video decoding, but about getting HDMI input through some/no processing, and from there to the panel.
  • nevcairiel - Saturday, June 6, 2015 - link

    Most/many TVs have a 4:4:4 PC mode which can pass the content through untouched. It's just when they start to do image processing that it's going to be degraded. Monitors and cellphones don't have movie-style processing chips in them that would do this.

    Unfortunately, this image processing is quite nice to have in many circumstances, so it's always a trade-off :(
  • sheh - Saturday, June 6, 2015 - link

    I thought we were talking about PC/game modes. The thing is, many TVs don't have 4:4:4 even if you turn off all processing or even in PC/game mode. That's what I meant by lazy firmware/software.
  • Shadowmaster625 - Friday, June 5, 2015 - link

    What about 1440p @ 120?
  • Gunbuster - Friday, June 5, 2015 - link

    Come on Hisense, make a 39" or 40" model and you'll be the new Seiki!!
  • icrf - Friday, June 5, 2015 - link

    I have the 39" Seiki, and I have to say I wouldn't do it over again. Sure, there are lots of fiddly user experience things that make life difficult, but the end result is 39" is just too big for the distance I sit at my desk.

    I used to have a 30" 2560x1600 Dell display, and really enjoyed that. I think I'd be happier with a 32" 4K display than the 39" TV I have now. Maybe even a pair of 32" displays over the single 39". Just too much.

    Do 4K desktop displays still require MST?
  • Gunbuster - Friday, June 5, 2015 - link

    I guess you need a bigger desk ;) I love my Seiki as a monitor.
  • Coldsnap - Friday, June 5, 2015 - link

    That Hisense is interesting, mostly for the 1080p @ 120Hz. For my HTPC use, I'm still a bit far away from 4K gaming at 60fps out of a single video card. I wish it wasn't a 4K panel though. Does anyone make a straight 1080p 120Hz panel?
  • mars2k - Friday, June 5, 2015 - link

    Thanks for the deepish dive. One question: why is ARC enabled on only the HDMI 1.4 port? I could see this as a concession to entry-level adopters with receivers or HTPCs that aren't HDMI 2.0a enabled, but what would engineering an ARC-enabled HDMI 2.0 port entail?
  • romrunning - Friday, June 5, 2015 - link

    Probably just cheaper. After all, audio data doesn't need nearly as much bandwidth as video data, so there isn't a need for ARC to travel over a higher-bandwidth port like an HDMI 2.0 or 2.0a port.
  • Guspaz - Friday, June 5, 2015 - link

    My ideal computer monitor would support both 4K60 and 1440p120 with G-Sync. It's nice to see a display combining 4K at 60Hz with lower resolutions at higher refresh rates, but it'd also be nice to see that in computer monitors. With G-Sync.
  • pvgg - Friday, June 5, 2015 - link

    In my humble opinion, any 1080p OLED set is more future-proof than the most advanced, feature-rich, standards-supporting 4K LCD.
    I'm perfectly aware that it's the big numbers that sell, but I do hope that LG can drive OLED prices down enough to force all the other manufacturers to bring out their own OLED products and bring back a widespread era of true image quality.
  • douglord - Friday, June 5, 2015 - link

    Disagree. Wide color gamut and HDR are the future, and OLED is gimped for both. Not trying to bash you - but you haven't seen it. Only a small group of people have. I'm not talking about what Samsung and Sony showed at CES. I'm talking 3,000-nit, Rec.2020, 4K with native content. OLED cannot do this. OLED can't get super bright, and won't come down in price for 4K at size. For 4K we are mainly talking about 70", 80", or in my case 110" screens. Trust me - you want 4K at 8 feet from a 110" screen.
  • douglord - Friday, June 5, 2015 - link

    FYI, I have a 110" 1080p screen in the living room with a couch 8 feet away. I have a 65" high-end Panny plasma in the bedroom about 15 feet away. So I understand the screen-size-to-distance ratio better than almost anyone. I want the quality of my plasma at 15 feet with the experience of sitting 8 feet from a 110" screen. Oh, and I want it in glasses-free 3D. That's where OLED falls down, because you really need 8K for that.
  • nevcairiel - Saturday, June 6, 2015 - link

    OLED certainly is the future, but in the here and now it's just not there yet.
    But OLED can produce much better contrast, which is what HDR really is: high contrast. And it can probably also reach the color ranges required for Rec.2020.

    I wouldn't discount OLED on the state of current consumer tech.

    Luckily we're also still a year away from proper native 4K consumer content with wide availability, and if you're being totally honest, most current 4K TVs are still lacking in several key areas...
  • bryanlarsen - Friday, June 5, 2015 - link

    Does it use headache-inducing low-frequency PWM for backlight brightness control, like most TVs do?
  • Rocket321 - Friday, June 5, 2015 - link

    I hope Hisense has improved their panels significantly in the past couple of years. When I was shopping for an HDTV recently, I went to a few stores to see panels in person. At Wal-Mart I looked at two Hisense models side by side with several other brands, and they were *significantly* worse than every other TV on display: poor color, poor contrast, high motion blur, etc. It was obvious within about 2 seconds of entering the electronics department. I'm not trying to bash them too much, but I would advise against ordering until you have seen the display in person.
  • knutinh - Friday, June 5, 2015 - link

    (about 4:2:0) "the effects of the sub-sampling are not very evident in rapidly changing images."

    I disagree with this claim. The effects of sub-sampling tend not to be very evident in "natural" images (photography, video, etc.). Images generated by a computer (such as user interfaces) often contain razor-sharp colored detail that is visibly degraded by chroma resampling.
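    A toy illustration in Python: 4:2:0 keeps one chroma sample per 2x2 block of pixels, so a one-pixel-wide saturated red line on black ends up sharing its chroma with its black neighbours (approximate limited-range BT.601 values, ignoring the actual filters encoders use):

        # Y, Cb, Cr for saturated red and for black (roughly, limited-range BT.601)
        red, black = (81, 90, 240), (16, 128, 128)
        block = [red, black, red, black]    # 2x2 block: a red column next to a black column

        avg_cb = sum(p[1] for p in block) / len(block)   # the single Cb stored for the block
        avg_cr = sum(p[2] for p in block) / len(block)   # the single Cr stored for the block
        print(avg_cb, avg_cr)   # 109.0 184.0 - neither pure red nor neutral, so the edge smears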
  • douglord - Friday, June 5, 2015 - link

    1 - You need to start including all of this stuff in your reviews.

    2 - A 10-bit panel is nice, but what you need to review is the % of Adobe RGB / DCI space covered and the % of Rec.2020.

    3 - HDR is also not a checkmark feature. The newer 4K LED-LCDs can hit 1,000 nits. LG has a PROTOTYPE OLED that hits 800. Dolby wants 3,000. Current TVs are around 200.

    4 - You need to review video card capabilities for outputting 10-bit color, Adobe RGB, and HDR.
  • Senti - Friday, June 5, 2015 - link

    4: 10-bit support on GPUs is pathetic. Not because they can't do it, but because it's considered a "pro" feature and is only enabled on Quadro/FirePro cards in the drivers. On the AMD side you can convince a Radeon to output 10-bit with some registry hacks, but it's totally unsupported and often produces a black screen (for example in CrossFire).

    Software support is even worse: almost nothing can draw 10-bit. The only common software I can think of is Photoshop, and even that doesn't always work.
  • nevcairiel - Saturday, June 6, 2015 - link

    This is not true. You can output 10-bit through DirectX fullscreen modes on most consumer GPUs.
    What's considered a "pro" feature is 10-bit OpenGL.
  • Senti - Saturday, June 6, 2015 - link

    Are you sure it actually outputs 10-bit in that mode? Yes, it can create the surface but that doesn't mean that output won't be clamped to 8-bit.
  • nevcairiel - Saturday, June 6, 2015 - link

    Yes, I am sure. It's easy enough to test this with the proper test patterns.
  • CapablancaFan - Monday, June 8, 2015 - link

    From what I've read, the difference between 8-bit and 10-bit color is subtle - subtle enough that 10-bit monitors aren't worth the extra cost over 8-bit.
  • Nintendo Maniac 64 - Friday, June 5, 2015 - link

    It is important to note that just because a TV accepts 4:4:4 chroma input does not mean it can actually display it. Many TVs cannot do 4:4:4 if HDCP 2.2 is used but are perfectly capable of 4:4:4 when it is not.

    The easiest way to test this is to simply use the test pattern posted in the MadVR thread on Doom9 and open it in good old MSpaint (you will notice that said test pattern has been reposted on the above-linked AVSforum thread and is also used by hdtvtest.co.uk):
    http://forum.doom9.org/showthread.php?p=1640299#po...
  • willis936 - Friday, June 5, 2015 - link

    I'd be very interested to see a deep dive into color spaces like the one that was done with memory. What are the actual numbers for 4:4:4 8b vs 4:2:2 10b and 4:2:0 12b in terms of image quality and bitrate?
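    The bitrate part is easy to rough out; here's a back-of-the-envelope Python sketch of the raw (uncompressed, ignoring HDMI blanking) numbers at 3840x2160, 60 fps:

        # Average bits per pixel and raw data rate for the three combinations above.
        combos = {
            "4:4:4  8-bit": (3.0, 8),    # full-resolution chroma
            "4:2:2 10-bit": (2.0, 10),   # chroma halved horizontally
            "4:2:0 12-bit": (1.5, 12),   # chroma halved in both directions
        }
        pixels_per_second = 3840 * 2160 * 60
        for name, (samples_per_pixel, bits) in combos.items():
            bpp = samples_per_pixel * bits
            gbps = pixels_per_second * bpp / 1e9
            print(f"{name}: {bpp:.0f} bpp, ~{gbps:.1f} Gbit/s raw")
        # 4:4:4 8-bit: 24 bpp (~11.9 Gbit/s), 4:2:2 10-bit: 20 bpp (~10.0 Gbit/s),
        # 4:2:0 12-bit: 18 bpp (~9.0 Gbit/s); image quality is the harder half to quantify.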
  • Senti - Friday, June 5, 2015 - link

    I so don't believe in the true 120Hz refresh. Even if they accept 120Hz in, it absolutely doesn't mean that they can draw all 120 frames: my monitor does exactly this - it will accept a large range of inputs up to 75Hz+, but the actual panel refresh is restricted to around 60Hz. And the standard marketing bullshit of "480 Hz with High Ultra Smooth Motion" only reinforces the suspicion.

    Would be nice though if I'm proven wrong and that thing can output real 120 frames on that panel.
  • geok1ng - Friday, June 5, 2015 - link

    Hisense makes the panel? Highly unlikely. About true 1080p@120Hz without dropping frames: do not believe it until it is tested. Finally, the 4Kp60 4:4:4 claim: no 4K TV so far has offered both game mode and 4Kp60 4:4:4 at the same time; 4:4:4 has meant 80+ms input lag. Of all the claims, the most relevant is 1080p@120Hz: no Netflix-4K-approved TV is capable of true 1080p@120Hz.
  • hMunster - Saturday, June 6, 2015 - link

    The checklist for this Hisense doesn't list the panel technology - which is it?
  • Gothmoth - Saturday, June 6, 2015 - link

    No consumer graphics card has 10-bit output in the drivers as far as I know... so why does the panel need to be 10-bit?

    To get 10-bit output for my Eizo I needed to buy an NVIDIA Quadro (or a FireGL, but I'm an NVIDIA guy).
  • Nintendo Maniac 64 - Saturday, June 6, 2015 - link

    AMD's consumer Radeon drivers allow for 10bpc output:
    https://www.youtube.com/watch?v=X-MXyvD3iRA
  • Gothmoth - Saturday, June 6, 2015 - link

    lol...... just having it enabled doesn't mean it's working....
  • Senti - Saturday, June 6, 2015 - link

    Radeons definitely can output 10-bit with OpenGL over DP. I can say so because I'm using exactly that right now.
  • Gothmoth - Saturday, June 6, 2015 - link

    There is a 10-bit path in DirectX, to be honest, but that doesn't help except for games.
    You won't be able to use 10-bit with a Radeon and Photoshop, for example.

    There are a lot of registry hacks and such; none of them work, trust me.
    I searched the internet for three weeks for info and tried everything before deciding to bite the bullet and buy a Quadro 4000.

    And if you don't believe me... believe AMD:

    https://www.amd.com/Documents/10-Bit.pdf
  • Senti - Saturday, June 6, 2015 - link

    That setting in AMD's drivers is the "link mode", not the "output mode". If you set it higher than your output, nothing noticeable happens; if you set it lower, your card will use dithering. So it's NOT what people mean by 10-bit output.
  • knightspawn1138 - Tuesday, June 9, 2015 - link

    I'd like to see how this stands up as a gaming monitor. A 50" 4K display for just under $600? Not bad, considering most 4K monitors seem to top out at 32" and usually run between $500 and $900. I'd really like to know if it's compatible with NVIDIA 3D Vision. I just purchased a pair of 24" 144Hz monitors for use with 3D Vision. The monitors normally sell for upwards of $260; I was lucky to get them for $70 off their normal price.
  • wiyosaya - Tuesday, June 9, 2015 - link

    I am not sure anyone else has mentioned this, but if you are using a pass-through device like an HTR (home theater receiver), make sure it supports HDMI 2.0 or above and also HDCP 2.2. There are at least some pricey home theater receivers on the market that support HDMI 2.0 but not HDCP 2.2 at the moment, though this will change relatively soon. So anyone considering a new HTR might want to hold off until HDMI 2.0 and HDCP 2.2 support is present in the model they are considering.
  • Gunbuster - Friday, June 19, 2015 - link

    Real-life testers are indicating that this skips every other frame at 4K 60Hz and 1080p 120Hz. Seems like Hisense flat-out faked the specs on this TV. http://slickdeals.net/f/7903719-hisense-50h7gb-50-...
