Here is how motion engines bolster video and gaming content on small LCDs and OLEDs by optimizing color accuracy and tone mapping.
Content formats lag mobile device capabilities, and on top of that, visual quality is degraded by the varied lighting conditions of mobile viewing environments. Moreover, even when rich content is available, its resolution and frame rate are throttled down to fit into the limited throughput of wireless pipes.
Here is a sneak peek into how motion processing works and why it’s so hard to do.
The motion engine in visual processors transforms video that was never intended for the small screen and ensures that mobile users can watch images and videos without judder or blur. It's worth noting that most video content is still standard dynamic range (SDR), so even if the resolution is there, color and depth are not, and that's where motion processing adds the most value.
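The judder the article mentions comes from low-frame-rate content being shown on a higher-refresh-rate panel; motion engines fix it by synthesizing intermediate frames. As a minimal sketch of the idea, the snippet below generates in-between frames by linear blending. This is an assumption-laden stand-in: real motion-compensated interpolation estimates per-block motion vectors and warps pixels along them rather than cross-fading.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, num_intermediate):
    """Generate intermediate frames by linear blending — a simplified
    stand-in for motion-compensated interpolation, which would warp
    pixels along estimated motion vectors instead of cross-fading."""
    frames = []
    for i in range(1, num_intermediate + 1):
        t = i / (num_intermediate + 1)            # blend weight in (0, 1)
        blended = (1.0 - t) * frame_a + t * frame_b
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Example: two tiny 2x2 grayscale frames, one synthesized midpoint frame
a = np.zeros((2, 2), dtype=np.float32)
b = np.full((2, 2), 100.0, dtype=np.float32)
mids = interpolate_frames(a, b, 1)
```

Blending is why cheap interpolation looks ghosted on fast motion; the hardware IP exists precisely because vector search and occlusion handling are too expensive to do well in software at panel refresh rates.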
Figure 1 Anatomy of a display processor placed between the application processor and smartphone display. Source: Pixelworks
On small mobile screens, colors are washed out, especially lighter colors in the foreground. The hardware IP in visual processors upscales SDR content toward high dynamic range (HDR) quality. In other words, visual processors perform real-time SDR-to-HDR conversion to expose more detail and shades of color in non-HDR video and gaming content.
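To make the SDR-to-HDR idea concrete, here is a naive inverse-tone-mapping sketch: gamma-decode SDR code values to linear light, then expand the curve toward a brighter HDR peak. The function name, gamma, expansion exponent, and peak luminance are all illustrative assumptions; shipping visual processors use content-adaptive curves, not a fixed exponent.

```python
import numpy as np

def sdr_to_hdr_nits(sdr, gamma=2.2, expansion=1.3, hdr_peak_nits=1000.0):
    """Naive inverse tone mapping: gamma-decode normalized SDR code
    values to linear light, then stretch toward an HDR peak luminance.
    All constants here are illustrative, not any vendor's tuning."""
    linear = np.clip(np.asarray(sdr, dtype=np.float64), 0.0, 1.0) ** gamma
    # expansion exponent > 1 restrains midtones while letting
    # highlights reach the new, much brighter peak
    return (linear ** expansion) * hdr_peak_nits

# Example: black, mid-gray, and white SDR code values
nits = sdr_to_hdr_nits(np.array([0.0, 0.5, 1.0]))
```

The design point to notice: simply scaling everything to the HDR peak would blow out the whole image, so the midtone-restraining exponent is what keeps the conversion from looking uniformly brighter rather than higher in dynamic range.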
AI-enabled visual processor
Pixelworks, which has a heritage of more than 20 years in display and video processing, started in projectors and TVs and then figured out how to take this technology to mobile form factors. According to Peter Carson, VP of marketing at Pixelworks, five out of six gaming phones launched recently use the company's display processing solutions. That includes Black Shark's gaming smartphone, the Black Shark 2.
Figure 2 Visual processors automatically adapt the display to the ambient light and color content. Source: Pixelworks
For mobile devices, the visual processor employs the motion engine for local contrast enhancement and sharpness enhancement, the main high-level features that require the hardware IP for power efficiency and performance. The rest—tone mapping, color calibration, and an adaptive display feature set for different lighting conditions—can all run in software.
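Of the hardware-accelerated features named above, sharpness enhancement is the easiest to illustrate. The sketch below uses classic unsharp masking: subtract a blurred copy to isolate fine detail, then add the detail back scaled. The 3x3 box blur and the gain value are assumptions for illustration; the actual hardware filters are proprietary.

```python
import numpy as np

def unsharp_sharpen(img, amount=0.6):
    """Sharpness enhancement via unsharp masking on a 2-D grayscale
    image in [0, 1]. A 3x3 box blur stands in for whatever filter
    the hardware actually uses; `amount` is an illustrative gain."""
    pad = np.pad(img, 1, mode='edge')
    # average of the nine shifted windows == 3x3 box blur
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    detail = img - blur           # high-frequency component
    return np.clip(img + amount * detail, 0.0, 1.0)

# A flat region contains no detail, so it passes through unchanged
flat = np.full((4, 4), 0.5, dtype=np.float64)
sharpened = unsharp_sharpen(flat)
```

Per-pixel filtering like this over every frame at 90 Hz or 120 Hz is exactly the kind of workload that burns power on a CPU or GPU, which is why the article's split — filters in hardware IP, policy (tone mapping, calibration) in software — is the common one.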
For instance, software provides color management and tone mapping for standard and customized color spaces. It also facilitates ambient light and color temperature adaptation, as well as custom tuning across different mobile use cases.
Pixelworks’ i6 Pro visual processor provides system-level software optimization that ensures superior visual quality and power efficiency at the highest possible refresh rates and resolutions by intelligently mapping feature sets and modes.
Likewise, the company’s i6 visual processor features lightweight AI display inferencing. It augments the company’s fuzzy logic IP to adaptively and intelligently optimize overall picture quality for video, games, and photos at low power. The AI capability in i6 enables mobile displays to analyze ambient light conditions, settings, and content to intelligently adjust daylight view controls under various lighting conditions.
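The ambient-light adaptation described above can be reduced to its simplest form: map a lux reading from the light sensor to a brightness target on a logarithmic curve, roughly matching how eye sensitivity scales. The function and its constants are hypothetical illustrations, not Pixelworks' tuning, and the real daylight-view logic also adapts tone curves and color, not just brightness.

```python
import math

def daylight_boost(lux):
    """Map an ambient-light sensor reading (lux) to a normalized display
    brightness target on a log curve. The floor (0.05), slope (0.18),
    and ceiling (1.0) are illustrative constants only."""
    return min(1.0, max(0.05, 0.18 * math.log10(lux + 1.0)))

indoor = daylight_boost(300)       # typical office lighting
sunlight = daylight_boost(100000)  # direct sunlight pushes toward maximum
```

A log curve is the standard choice here because perceived brightness is roughly logarithmic in luminance, so equal sensor ratios should produce equal visible steps.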
Both visual processors—i6 and i6 Pro—feature flicker-free DC dimming, brightness control, and skin tone accuracy across all color gamuts and viewing modes. Figure 3 offers more details about the i6 and i6 Pro visual processors’ display features.
ASUS has incorporated Pixelworks' display technology in its recently launched ZenFone 7 phone. Other phone makers to have employed the company's visual processors include TCL Communication and HMD Global, the Finnish company now developing and producing Nokia phones.
The i6 processor is expected to be in commercial production in Q4 2020. The company’s existing visual processor, Iris 5, has become part of the X series and is renamed the X5 processor.
Majeed Ahmad, Editor-in-Chief of EDN, has covered the electronics design industry for more than two decades.