Display processors aren’t a common topic in the press, and only a few companies advertise their capabilities beyond a simple mention of the maximum resolution. As the ecosystem evolves, however, an increasing number of new features is being added into the mix, adding nuances to the discussion that go beyond resolution, colour depth, or gamut.

Two years ago, we saw the release of Arm’s Mali-D71 display processor, which represented a brand new architecture and the foundation for the company’s upcoming DP IP blocks. The D71 brought to market the bulk of the feature requirements to drive most of today’s higher resolution or higher framerate displays, along with robust and smart composition capabilities.

Today’s announcement covers the new D77, an evolutionary upgrade to the D71. The new IP generation brings features that go beyond what one would normally expect of a display processor, expanding its capabilities and, in particular, opening up a slew of new possibilities for AR and VR use-cases.

Currently, display processors mostly act as the compositing engines inside of SoCs: they take in the pixel data generated by the GPU or other SoC blocks and composite it into a single surface, handling all the processing required to achieve this.
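As a very rough illustration of what such compositing involves (not Arm’s actual implementation), a minimal software sketch of "over" alpha blending across a stack of layers might look like this:

```python
import numpy as np

def composite(layers):
    """Blend a stack of RGBA layers (bottom layer first) into a single
    RGB surface using standard 'over' alpha compositing."""
    h, w, _ = layers[0].shape
    out = np.zeros((h, w, 3), dtype=np.float32)
    for layer in layers:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        # Each layer covers what's below it in proportion to its alpha.
        out = rgb * alpha + out * (1.0 - alpha)
    return out

# Two 2x2 test layers: an opaque red base and a half-transparent blue overlay.
base = np.zeros((2, 2, 4), dtype=np.float32)
base[..., 0] = 1.0  # red
base[..., 3] = 1.0  # fully opaque
top = np.zeros((2, 2, 4), dtype=np.float32)
top[..., 2] = 1.0   # blue
top[..., 3] = 0.5   # half transparent
result = composite([base, top])  # every pixel -> (0.5, 0.0, 0.5)
```

A hardware display processor does this per scanline at pixel clock rates, but the arithmetic is the same.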

Typically, today’s display controllers lie towards the end of the display pipeline in an SoC, just before the actual physical interface blocks which transform the data into signals for, say, HDMI or MIPI DSI, at which point we find ourselves outside of the SoC and connected to a display panel’s DDIC SoC. Here Arm promises to provide straightforward solutions and to work closely with third-party vendors which provide IP further down the chain.

Being based on the D71, the new Mali-D77 comes with all of its predecessor’s capabilities, with a large emphasis on AR and VR features that promise to vastly improve the experience in products employing the IP.

Among the main features are “Asynchronous Timewarp”, “Lens Distortion Correction” and “Chromatic Aberration Correction”, which enable some unique new use-cases for display processors, alongside further improvements in the baseline capabilities of the IP, such as more layers as well as higher resolutions and framerates.

Asynchronous timewarp is an interesting technique for AR and VR whose main goal is to reduce motion-to-photon latency. In a normal GPU-to-display operation, the display simply shows the last frame rendered by the GPU. The problem with this approach is that the update interval is limited by the actual rendering framerate, which is a characteristic of the GPU’s performance capabilities. This poses a hard limitation for AR and VR workloads, which require very high visual framerates in order to provide a better experience and, most importantly, to avoid side-effects such as motion sickness caused by delayed images.

Timewarp is able to decouple the GPU render from what is actually scanned out to the display. Here, the D77 is able to integrate position updates, such as from motion sensors in HMDs, into the most recently rendered GPU frame, post-process it with the motion data, and deliver an updated image to the display. In this new process, the user effectively sees two different frames displayed even though the GPU will have only rendered one.

Effectively, this massively reduces motion-to-photon latency in AR and VR use-cases, even though the actual rendering framerate doesn’t change. Avoiding doing the work on the GPU also reduces its processing workload, which in turn frees up more performance to be dedicated to the actual rendering of content.
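The reprojection step above can be sketched in a heavily simplified form. This is not Arm’s algorithm: it assumes pure horizontal head rotation and approximates the reprojection as a pixel shift, where real hardware performs a full resampling of the frame.

```python
import numpy as np

def timewarp(frame, yaw_at_render, yaw_now, fov_deg=90.0):
    """Crude asynchronous-timewarp sketch for pure horizontal head rotation:
    shift the last rendered frame by the pixel offset corresponding to the
    yaw change since render time (small-angle approximation). Real hardware
    resamples the image rather than wrapping it around."""
    h, w = frame.shape[:2]
    px_per_deg = w / fov_deg
    shift = int(round((yaw_now - yaw_at_render) * px_per_deg))
    return np.roll(frame, -shift, axis=1)

# A 1x8 test "frame"; the head turned 22.5 degrees since the render, which
# at 90 degrees FOV over 8 pixels corresponds to a 2-pixel shift.
frame = np.arange(8).reshape(1, 8)
warped = timewarp(frame, yaw_at_render=0.0, yaw_now=22.5)
```

The key property is that `timewarp` runs per displayed frame using only the latest sensor sample, independently of how often the GPU delivers a new `frame`.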

In addition to ATW, the D77 is also able to correct for several optical characteristics of VR lenses, such as the pincushion effect. The IP can be programmed with the characteristics of a given HMD system and will correct for distortions by applying an inverse effect (in this case a barrel distortion) to compensate for the distortion of the lenses.
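The idea of an inverse pre-distortion can be sketched with a first-order radial model. The coefficient below is made up for illustration; real HMDs are characterised with measured, higher-order lens profiles.

```python
def barrel_predistort(x, y, k1=-0.2):
    """Pre-distort a normalized image coordinate (origin at the lens centre)
    with a barrel warp so that the lens's pincushion distortion cancels it.
    k1 is a hypothetical first-order radial distortion coefficient."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# Points further from the centre are pulled inward more strongly, which is
# exactly what counteracts a pincushion lens that stretches the edges out.
near_centre = barrel_predistort(0.1, 0.0)  # barely moved
near_edge = barrel_predistort(0.9, 0.0)    # pulled in noticeably
```

Applied per output pixel, this determines where in the source surface each displayed pixel samples from.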

This optical compensation also applies to chromatic aberration correction. Similarly, the D77 needs to be aware of the optical characteristics of the lens in use, and will post-process the output image with an inverse effect, eliminating the resulting image artefacts when viewed through the lens. It’s to be noted that the spatial resolution of the correction achievable here is limited by the actual resolution of the display, as it can’t correct anything smaller than a pixel in dimensions.
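Chromatic aberration correction extends the same idea per colour channel: since the lens refracts each wavelength slightly differently, each channel is sampled at a slightly different radius so they recombine correctly at the eye. The scale factors below are illustrative, not real lens data.

```python
def ca_predistort(x, y, channel_scale):
    """Radially scale a normalized coordinate for one colour channel so the
    lens's wavelength-dependent refraction recombines the channels."""
    return x * channel_scale, y * channel_scale

# Hypothetical per-channel factors: red is refracted less than blue, so red
# samples slightly further out and blue slightly closer in.
scales = {"r": 1.01, "g": 1.00, "b": 0.99}
samples = {c: ca_predistort(0.5, 0.5, s) for c, s in scales.items()}
```

The sub-pixel limitation mentioned above shows up here directly: the three sample positions may differ by less than one pixel, in which case the correction is bounded by the display grid.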

The benefit of running these new techniques on the DP is that they enable significant processing savings on the part of the GPU, which consumes considerably more power.

What this also opens up is a possible new generation of “dumber” HMDs in the future without a GPU, powered by some other external system, yet providing the same latency and optics advantages as described above in a cheaper integrated HMD SoC.

Performance characteristics of the D77 are as follows: 4K60 with 8 layers, or 4K120 with 4 layers. In Android smartphones the higher layer count is a requirement, so I don’t envision 4K120 becoming a thing beyond special use-cases.
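As a quick sanity check on those figures, and assuming every layer is a full-resolution surface (an assumption on our part), the two modes imply the same aggregate input pixel throughput:

```python
# Back-of-the-envelope input pixel throughput for the quoted D77 modes,
# assuming all layers are full-resolution 4K (3840x2160) surfaces.
def gpix_per_s(width, height, fps, layers):
    return width * height * fps * layers / 1e9

mode_4k60_8layers = gpix_per_s(3840, 2160, 60, 8)    # ~3.98 Gpix/s
mode_4k120_4layers = gpix_per_s(3840, 2160, 120, 4)  # ~3.98 Gpix/s
```

In other words, the two configurations trade layer count for refresh rate at a constant compositing budget.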

For VR use-cases in which the new features (ATW, distortion correction, etc.) are active, the D77 is able to handle up to 4 VR layers, at resolutions of up to 1440p120 or 4K90 with 4+4 or 2+2 layers respectively.

Overall, the new Mali-D77 is exciting news for AR and VR. While we’re expecting the IP to be used in smartphones in the next few years, the most exciting news today, in my opinion, is that it enables higher quality standalone HMDs, expanding Arm’s market beyond the typical smartphone SoC. Arm unofficially described the VR ecosystem as currently being in the “trough of disillusionment” after the last few years of peak expectations.

The next few years however will see significant progress made in terms of improving the VR experience and bringing more consistent experiences to more people. Certainly the D77 is a first step towards such a future, and we’re excited to see where things will evolve.


  • jgraham11 - Wednesday, May 15, 2019 - link

    Who gives a crap about this, where is the article on the new Intel bugs: Fallout, RIDL and ZombieLoad. Apparently a new class of speculative execution bugs that only affect Intel CPUs. Intel is even saying, if you care about security, disable Hyper Threading! Holy crap that is huge!
  • JoeyJoJo123 - Wednesday, May 15, 2019 - link

    Intel on suicide watch? It just keeps getting worse. Just learned about it after you posted that and google'd a bit. And to think we might have actually had hardware-baked fixes for meltdown/spectre soon, only to see that yeah, we'd be waiting even longer for Fallout/RIDL/ZombieLoad fixes. Kind of seals the deal that my next CPU will be a Zen2/Ryzen3000 chip.

    Also, back to the article topic, the Mali-D71 chip is nice. I do think an all-in-one chip that can act as a way to both correct visual aberrations and possibly power VR headsets without a GPU could be cool. Still think that the 3 biggest problems for VR aren't solved by this.
    1) Video/audio/power cables feeding content/power to the HMD.
    2) Lack of compelling content created for VR-type HMDs.
    3) Relatively poor price/performance offered by current gen HMDs. They're either too expensive for an adequately convincing experience, or they're affordable (ex: google cardboard) but provide a terrible experience overall.
  • Kamus - Wednesday, May 15, 2019 - link

    Thanks for your useless input.
  • sa666666 - Wednesday, May 15, 2019 - link

    Ooooh, you've just triggered HStewart. He'll be showing up to defend Intel soon.
  • mode_13h - Wednesday, May 22, 2019 - link

    Lol.
  • Ian Cutress - Wednesday, May 15, 2019 - link

    As stated in previous comments, we're waiting for answers to our questions before we publish.
  • ballsystemlord - Thursday, May 16, 2019 - link

    Answer from Intel: "Please go easy on us we're losing to AMD and TSMC." :)
  • Kamus - Wednesday, May 15, 2019 - link

    This is the future of VR.... we're heading to an ASIC world. But with that said, I think this SoC is missing a key feature:

    ASW/PTW (Async Space Warp / Positional Time Warp). ATW isn't even all that taxing on current mobile processors. But we need hardware ASW if we're ever going to overcome display interface bandwidth limitations.

    Right now, the biggest limitation to pushing higher and higher refresh rates is the display interface if we want to extrapolate fake frames. But if we had hardware ASW, we could theoretically get a KHz display, and beyond.

    This is going to be an important point moving forward, because as we move to AR, motion to photon latency will become even more important than it is on VR, because the whole world actually moves with the objects you are seeing in real time.

    So to combat this latency, we're going to need super-high refresh rates that we won't have enough power to render, or enough bandwidth. This is where an ASIC that can do PTW/ASW would come in and save the day, and give us extrapolated frames beyond KHz.

    Extrapolated frames would also give us the advantage of not having to strobe the backlight, or do BFI (Black frame insertion) on an OLED.

    This technology already exists on the PC, but it's bandwidth constrained by DisplayPort, so the only way to do this right is to do it directly in the SoC, bypassing such bandwidth limitations.

    Either way, it's good to see that there are people working on this. And I wouldn't be surprised at all, if Oculus designed their own SoC for their next VR-AR headsets in a few years. They kind of have to go the Apple route and design their own SoC, because there's a lot of features they need that no one is working on.
  • webdoctors - Wednesday, May 15, 2019 - link

    The problem is all these VR chips ruins ppl's opinion of VR.

    You need a GTX1070 for a reasonable VR experience and these vendors are claiming to be able to do it with meager cellphone SoCs? Obviously that's not true.

    Unless it's pre-rendered video footage or just vector graphics, it's set up for failure.

    Putting some of the brains in the display to reduce GPU power/compute overhead is great though
  • jordanclock - Thursday, May 16, 2019 - link

    The D77 isn't an SoC.

    I'm not sure what you're asking for out of something like the D77 that it isn't already addressing. It is clearly allowing for headset movement that will update the display without a new frame from the GPU.
