Dynamic refresh rate technologies like AMD’s FreeSync and NVIDIA’s G-Sync have become de facto standards for gaming PCs and displays. Last year the HDMI Forum introduced a more industry-standard approach to variable refresh rate as part of the HDMI 2.1 package, and consumer electronics makers have recently started to add VRR support to their products. At Computex the consortium demonstrated VRR operation using a Samsung QLED TV and a Microsoft Xbox One X, but the demonstration was somewhat inconclusive.

Select Samsung QLED TVs launching this year are set to support a 120 Hz maximum refresh rate, HDMI 2.1’s VRR, and AMD’s FreeSync technology, the company announced earlier this year. The two technologies do essentially the same thing, but they are not the same method – AMD’s FreeSync-over-HDMI is a proprietary approach – and as such are branded differently. From a technological point of view, both methods require hardware and firmware support on both the source (i.e., an appropriate display controller) and the sink (i.e., the display scaler). It appears that Samsung has decided to add support for both methods.

As an added wrinkle, AMD sees VRR and FreeSync as two equal technologies, which is why it intends to keep relying on its own brand even as it adds support for both to its products over time. An example of such universal support is Microsoft’s Xbox One X console, which, according to a Microsoft rep at the HDMI Consortium booth at Computex, supports both technologies. Meanwhile, during its own press event at Computex, AMD demonstrated a Radeon RX Vega 56-based system with FreeSync working on a 1080p QLED TV from Samsung. Unless said GPU already supports HDMI 2.1’s VRR (something AMD would logically have announced), this more likely demonstrates that Samsung supports both VRR and FreeSync on select TVs. It does not seem like Samsung’s TVs support LFC (low framerate compensation), at least not right now.

The somewhat convoluted demonstration of HDMI 2.1’s VRR capabilities reveals the complexities of the HDMI 2.1 technology package in general, and the difficulties with HDMI 2.1 branding in particular.

As reported a year ago, the key feature that the HDMI 2.1 specification brings is 48 Gbps of bandwidth, which is set to enable the longer-term evolution of displays and TVs. To support that bandwidth, new 48G cables will be required. The increased bandwidth of HDMI 2.1’s 48G cables will enable new UHD resolutions and higher refresh rates, including 4Kp120, 8Kp100/120, and 10Kp100/120 (some of which will require compression). In addition, the increased bandwidth will enable support of the latest and upcoming color spaces, such as BT.2020 (Rec. 2020) with 10, 12, or even 16 bits per color component.
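
For a rough sense of why 48 Gbps (and, for some modes, compression) is needed, the back-of-the-envelope sketch below tallies raw pixel data rates for a few of the modes above. It counts active pixels only and ignores blanking intervals and link-level coding overhead, both of which push real-world requirements higher; it is an illustration, not the HDMI Forum's timing formula.

```python
# Back-of-the-envelope only: raw uncompressed pixel data rates, counting
# active pixels and ignoring blanking intervals and link coding overhead.

def raw_gbps(width, height, fps, bits_per_component, components=3):
    return width * height * fps * bits_per_component * components / 1e9

modes = [
    ("4Kp60,  8-bit",  3840, 2160,  60,  8),
    ("4Kp120, 10-bit", 3840, 2160, 120, 10),
    ("8Kp120, 10-bit", 7680, 4320, 120, 10),
]

for name, w, h, fps, bpc in modes:
    print(f"{name}: ~{raw_gbps(w, h, fps, bpc):5.1f} Gbps of raw pixel data")

# Output: roughly 11.9, 29.9, and 119.4 Gbps respectively -- 4Kp120 at 10-bit
# fits within a 48 Gbps link, while 8Kp120 at 10-bit clearly needs compression.
```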

Finally, HDMI 2.1 supports a number of capabilities not available previously, including QMS (quick media switching), eARC (enhanced audio return channel), QFT (quick frame transport), and ALLM (auto low latency mode). The list of improvements the HDMI 2.1 spec brings is significant; furthermore, some of the new features require the new cable, while others do not. Therefore, the HDMI Forum made no secret from the start that some of the new features might be supported on some devices, whereas others might not be. Meanwhile, the HDMI 2.1 branding will be used for all of them, but with an appropriate disclosure of which capabilities are supported.

There is a reason why the HDMI Forum wants to use the HDMI 2.1 brand for hardware that will support only one or two new features from the package, even if this comes with a certain amount of confusion. While the key features of HDMI 2.1 are its higher cable bandwidth and the resulting support for 8K resolutions, the Forum realizes that only a couple of countries in the world are currently experimenting with 8K UHD TV broadcasting, so there is not much need for high bandwidth/8K support in TVs sold in Europe or the U.S. today. Meanwhile, things like VRR and ALLM make sense for gamers right now, but since they have to be supported by both sinks and sources, proper labeling is required so that people who want these features know to get the right hardware.

Microsoft says that it has plans to expand the feature set of its Xbox One X consoles going forward, so it is possible that the console will gain HDMI 2.1 capabilities eventually. Obviously, such innovations are good for hardware owners, but while HDMI 2.1 remains in its infancy, this approach causes confusion for people in the market for new hardware.

Comments

  • Lindegren - Wednesday, June 20, 2018 - link

    what i would find useful is a usb over hdmi part. it would be nice to have keyboard, soundcard and mouse connected to the tv, and the pc in another room, one cable only.

    what i find irrelevant is 16bit colourspace. Who needs 65536 colour gradients per channel? honestly, 1024 is more than a normal person would see anyway
  • Kevin G - Wednesday, June 20, 2018 - link

    It isn't necessarily about the raw number but how those gradients per channel are divided. An 8 bit smooth gradient per RGB channel is simple to handle in terms of processing and handing it off to a display but doesn't necessarily visually look that good compared to what can be done with 8 bits in other color spaces.

    The other nice thing about higher bit depths is that conversion between color spaces can be more precise. Speaking of which, video content is often mastered in a YUV color space vs. the RGB that displays use.
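
For a sense of what that extra precision buys, here is a minimal numeric sketch of the conversion point above, using simplified full-range BT.709 math rather than any product's actual pipeline: it quantizes a colored ramp in YCbCr at several bit depths, converts back to RGB, and reports the worst round-trip error at each depth.

```python
# Minimal sketch (full-range BT.709 math, not a real video pipeline):
# round-trip RGB -> YCbCr -> RGB at different bit depths and measure the loss.

def rgb_to_ycbcr(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b

def quantize(x, bits):
    levels = (1 << bits) - 1           # code values for an unsigned channel
    return round(x * levels) / levels

def worst_roundtrip_error(bits, steps=2048):
    worst = 0.0
    for i in range(steps + 1):
        v = i / steps
        r0, g0, b0 = v, 0.5, 1.0 - v   # a simple colored ramp, values in [0, 1]
        y, cb, cr = rgb_to_ycbcr(r0, g0, b0)
        y = quantize(y, bits)
        cb = quantize(cb + 0.5, bits) - 0.5   # chroma stored with +0.5 offset
        cr = quantize(cr + 0.5, bits) - 0.5   # so it fits an unsigned code range
        r1, g1, b1 = ycbcr_to_rgb(y, cb, cr)
        worst = max(worst, abs(r1 - r0), abs(g1 - g0), abs(b1 - b0))
    return worst

for bits in (8, 10, 12):
    err = worst_roundtrip_error(bits)
    print(f"{bits:2d}-bit YCbCr round trip: worst RGB error ~ {err:.5f} "
          f"({err * 255:.2f} 8-bit RGB steps)")
```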
  • mode_13h - Thursday, June 21, 2018 - link

    8 bits per channel is always going to come up short, no matter the color space.
  • SydneyBlue120d - Wednesday, June 20, 2018 - link

    I wonder if we'll ever see a TV with DisplayPort 1.4 input...
  • wr3zzz - Wednesday, June 20, 2018 - link

    What is the point of setting a standard and then saying manufacturers don't need to adhere to it 100%? If certain requirements are not ready in all markets but are in demand for some, then call one HDMI 2.1a and the other HDMI 2.1b.
  • nevcairiel - Wednesday, June 20, 2018 - link

    If you try to distinguish by version number, you get into a real mess really fast. Sure, with HDMI 2.1 it may only be "a" and "b", but with HDMI 2.2 you add another one, and now you're at abcd, and it just gets worse from there.

    Optional features are fine, as long as manufacturers properly document which features are available.
  • wr3zzz - Thursday, June 21, 2018 - link

    If a function can be optional then it is technically not "standard". Anything that is optional in 2.1 should not be part of 2.1 but left for consideration for the 2.2 "standard". Manufacturers can and do brag about features all the time but should not be allowed to use the 2.1 moniker if they are not 100% certified.
  • mode_13h - Friday, June 22, 2018 - link

    That's missing the point. The standard is there to ensure interoperability between all devices which *do* implement it. If you don't have the standard dictating how to implement a given feature, there's pretty much 0% chance of two different manufacturers' devices being compatible.

    As an example, I point to lip-synch auto-calibration, before HDMI finally tackled this. Early HDMI support for lip-sync was apparently useless without auto-calibration, which the standard didn't address. Manufacturers arrived at proprietary solutions for this, which helped virtually no one. It wasn't until HDMI took another swing at lip-sync that the problem was finally solved for people.
  • mode_13h - Friday, June 22, 2018 - link

    You're also missing the fact that some of these new features are quite high-end. So, if you force any 2.1 implementations to tackle *all* of it, then what will happen is the industry will virtually ignore 2.1 and remain stuck at 2.0.

    You'd think the high-end products could move forward, but even the high-end counts on the economies of scale driven by lower-end products, since some of the silicon is often shared between them. Having a regime where products can gradually adopt the new features creates a smooth transition path and affords greater economies to high-end products that share some silicon with the lower-end. This paves the way for more features to trickle down to the low end.
  • vanilla_gorilla - Wednesday, June 20, 2018 - link

    WTF is VRR? I don't see it defined anywhere.
