RAVEN: Reducing Power Consumption of Mobile Games without Compromising User Experience


In the last decade, mobile gaming has grown into a huge industry. According to Newzoo, the global mobile games market will reach $46.1 billion in 2017, a 19.4% increase from the year before.

Players can enjoy amazing gaming experiences on mobile devices thanks to the increasingly powerful processing capabilities of modern mobile GPUs. However, this gaming experience comes at a big cost: power consumption. The power consumption of mobile GPUs increases linearly with the amount of graphics computation. As a result, high-end mobile games with rich graphics content are extremely power hungry and drain batteries very quickly.


To solve this problem, researchers from Microsoft Research Asia (MSRA) and the Korea Advanced Institute of Science & Technology (KAIST) have developed a new system, called RAVEN, that reduces the power consumption of mobile games without compromising user experience.

RAVEN is based on a key observation about mobile games: many of the frames rendered in succession are perceptually identical or very similar. The differences between those frames are too small for players to perceive. However, mobile games always render frames at a high rate of 60 frames per second (FPS), no matter how similar the frames are. According to the researchers' measurement study, these perceptually redundant frames can make up more than 50% of the total frames in many games. Clearly, eliminating the rendering of these redundant frames could significantly reduce power consumption.

RAVEN is a novel system that leverages human visual perception to scale the frame-rendering rate. To accomplish this, RAVEN introduces perception-aware scaling (PAS) of frame-rendering rates. This energy-saving technique reduces a game’s frame-rendering rate whenever the upcoming frames are predicted to be perceptually similar enough to the current one.

RAVEN works by setting up a side channel that tracks the rendered frame sequence and estimates a user’s perception of graphics changes during gameplay. In this way, RAVEN opportunistically reduces GPU power consumption.

The RAVEN system consists of three major components that collectively scale the rate of game-frame rendering: the Frame Difference Tracker (F-Tracker), the Rate Regulator (R-Regulator), and the Rate Injector (R-Injector). The system works in a pipelined fashion. First, F-Tracker measures the perceptual similarity between the two most recent frames. Then, R-Regulator predicts the level of similarity between the current frame and the next frame(s), based on how similar the current frame is to the previous frame(s). If the next frames are predicted to be similar enough to the current one (determined by a threshold), R-Injector limits the frame-rendering rate by injecting a delay into the rendering loop and skipping graphics processing for the unnecessary frames. Presently, RAVEN can skip up to three frames, allowing the frame rate to drop to as low as 15 FPS.
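The paper gives the full details of this pipeline. Purely as an illustration, the control loop could look something like the sketch below; the class name, the 0.95 similarity threshold, and the two-frame history are assumptions made for this example, not values taken from RAVEN.

```java
// Hypothetical sketch of a perception-aware scaling (PAS) control loop.
// Names and thresholds are illustrative, not RAVEN's actual implementation.
public final class PerceptionAwareScaler {

    private static final long BASE_FRAME_INTERVAL_MS = 1000 / 60;  // 60 FPS baseline
    private static final int MAX_SKIPPED_FRAMES = 3;               // floor of ~15 FPS
    private static final double SIMILARITY_THRESHOLD = 0.95;       // assumed threshold

    private double previousSimilarity = 0.0;

    /**
     * Called once per rendered frame with the similarity score that the
     * frame-difference tracker measured between the two most recent frames
     * (0..1). Returns the extra delay the rate injector should add to the
     * rendering loop before the next frame.
     */
    public long nextFrameDelayMs(double measuredSimilarity) {
        // Rate regulation: predict that upcoming frames will look like the
        // current one if recent frames have been consistently similar.
        boolean nextFramesLikelySimilar =
                measuredSimilarity >= SIMILARITY_THRESHOLD
                && previousSimilarity >= SIMILARITY_THRESHOLD;
        previousSimilarity = measuredSimilarity;

        if (!nextFramesLikelySimilar) {
            return 0;  // keep rendering at the full 60 FPS
        }
        // Rate injection: delay through the next few frame slots, skipping
        // their graphics processing and dropping the effective rate toward 15 FPS.
        return BASE_FRAME_INTERVAL_MS * MAX_SKIPPED_FRAMES;
    }
}
```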

The key challenge RAVEN addresses is how to determine frame similarity at a low computational cost. The most direct way to compare two frames is to compute their structural similarity (SSIM) score. However, computing SSIM is computationally intensive and therefore consumes a lot of power, particularly for large frames. Today’s mobile devices, including smartphones, usually have a display resolution of 1920×1080 pixels or higher, which makes computing an SSIM score for every frame infeasible for RAVEN.
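For reference, the standard SSIM index between two image windows x and y is

$$\mathrm{SSIM}(x, y) = \frac{(2\mu_x\mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}$$

where the μ terms are window means, the σ² terms are variances, σ_xy is the covariance, and c1, c2 are small stabilizing constants. Because these statistics are computed over sliding windows covering every pixel, evaluating SSIM on a full 1920×1080 frame 60 times per second is far too expensive on a phone.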

To address this challenge, the researchers employed two novel techniques. First, they developed an energy-efficient method for measuring perceptual similarity based on the sensitivity of human eyes to color differences. The method leverages the difference in the luminance component (i.e., the Y component in the YUV color space) between frames. They evaluated the method extensively by comparing it with SSIM under various settings. The results showed that the luminance-based method measures perceptual similarity well at a low computational cost.
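As an illustration of the idea (not RAVEN’s exact metric, which is described in the paper), a luminance-based comparison of two low-resolution frames can be as simple as a mean absolute difference over their Y planes:

```java
// Illustrative luminance-difference measure between two low-resolution
// frames, given their Y (luma) planes as unsigned 8-bit samples.
public final class LumaSimilarity {

    /**
     * Returns a similarity score in [0, 1]: 1 means identical luminance,
     * values near 0 mean a large average per-pixel luminance change.
     */
    public static double similarity(byte[] yPlaneA, byte[] yPlaneB) {
        if (yPlaneA.length != yPlaneB.length) {
            throw new IllegalArgumentException("Frames must have the same size");
        }
        long totalAbsDiff = 0;
        for (int i = 0; i < yPlaneA.length; i++) {
            int ya = yPlaneA[i] & 0xFF;   // interpret bytes as unsigned luma
            int yb = yPlaneB[i] & 0xFF;
            totalAbsDiff += Math.abs(ya - yb);
        }
        double meanAbsDiff = (double) totalAbsDiff / yPlaneA.length;  // 0..255
        return 1.0 - meanAbsDiff / 255.0;
    }
}
```

On a low-resolution copy of a frame, this touches only a few thousand samples, versus the millions of pixels a full-resolution comparison would require.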

Demonstration at MobiCom 2017. Left: Chanyou Hwang; Middle: Yunxin Liu; Right: Saumay Pushp.

Second, the researchers built a virtual display, cloned from the mobile device’s main display but with a much lower resolution (e.g., 80×45 pixels). The system reads the graphical content of the virtual display for the similarity measurement. Because the resolution of the virtual display is much lower, the computational and energy overheads are also much smaller. Together, these two techniques effectively reduce the energy overhead of RAVEN.
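RAVEN creates this virtual display inside the platform’s graphics stack. Purely as an illustration of the same idea at the application level, an Android MediaProjection-based mirror could look roughly like the following; the 80×45 size follows the example above, the class name is made up, and obtaining the screen-capture permission is omitted.

```java
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;

// Illustrative, app-level approximation of a low-resolution virtual display:
// the main display is mirrored into an 80x45 surface whose pixels can be read
// cheaply for similarity measurement.
public final class LowResMirror {

    private ImageReader reader;
    private VirtualDisplay display;

    public void start(MediaProjection projection, int densityDpi) {
        // Tiny mirror surface: 80x45 pixels instead of 1920x1080.
        reader = ImageReader.newInstance(80, 45, PixelFormat.RGBA_8888, 2);
        display = projection.createVirtualDisplay(
                "lowres-mirror", 80, 45, densityDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), /* callback */ null, /* handler */ null);

        reader.setOnImageAvailableListener(r -> {
            try (Image image = r.acquireLatestImage()) {
                if (image != null) {
                    // Hand the 80x45 pixel buffer (image.getPlanes()[0].getBuffer())
                    // to a similarity measure such as the luminance-based one above.
                }
            }
        }, /* handler */ null);
    }

    public void stop() {
        if (display != null) display.release();
        if (reader != null) reader.close();
    }
}
```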

As a next step, the researchers implemented the RAVEN system on a Nexus 5X smartphone. In an 11-person user study, they conducted comprehensive experiments with various games to evaluate RAVEN’s performance. The results showed an average reduction of 21.8% (and up to 34.7%) in energy per game session while maintaining the quality of the user experience.

RAVEN is the first system designed to achieve frame-rate scaling and energy savings for mobile games based on perceptual similarity. The paper describing the system, “RAVEN: Perception-aware Optimization of Power Consumption for Mobile Games,” was published at MobiCom 2017, where the system was also demonstrated. The authors are Chanyou Hwang, Saumay Pushp, Changyoung Koh, Jungpil Yoon, Seungpyo Choi, and Junehwa Song from KAIST, and Yunxin Liu from MSRA.
