A NOVEL CLOUD GAMING FRAMEWORK USING JOINT VIDEO AND GRAPHICS STREAMING
With the growing popularity of smartphones and tablets, users have an
increasing desire for ubiquitous game playing. Emerging cloud gaming
services turn this desire into reality, enabling users to play games
anywhere, on any device. However, because of the large volume of data
that must be transmitted, it is challenging to provide a high-quality
gaming experience under limited bandwidth. In this paper, we propose a
novel cloud gaming framework that introduces two synchronized graphics
buffers, one on the server side and one on the client side. The server
not only streams compressed frames captured from the game scene, but
also progressively transmits graphics data.
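To make the dual-stream idea concrete, the following minimal sketch (our own illustration, not the paper's implementation; the class and function names are hypothetical) pairs each encoded frame with the next chunk of graphics data, which is appended on both sides so that the two buffers stay synchronized:

```python
from dataclasses import dataclass, field

@dataclass
class GraphicsBuffer:
    """Accumulates progressively received graphics data (e.g. mesh or texture chunks)."""
    chunks: list = field(default_factory=list)

    def append(self, chunk: bytes) -> None:
        self.chunks.append(chunk)

def transmit_step(encoded_frame: bytes, next_chunk: bytes,
                  server_buf: GraphicsBuffer, client_buf: GraphicsBuffer) -> None:
    # The server appends the chunk it sends and the client appends the
    # chunk it receives, so the two buffers evolve in lockstep.
    server_buf.append(next_chunk)
    client_buf.append(next_chunk)  # stands in for delivery over the network
    # encoded_frame is decoded on the client side as in ordinary video streaming.
```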
The received graphics data are used to generate reference frames on
both sides. When compressing the next frame, the cloud server chooses,
between the previously decoded frame and the current frame rendered
from the graphics buffer, the reference that yields the lower residual
error. As graphics data accumulate, the frame rendered from the
graphics buffer approaches the captured frame, which greatly reduces
the required transmission bit rate.
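As a minimal sketch of this selection rule (our own illustration under assumed inputs; the paper does not prescribe an implementation, and the sum-of-squared-differences metric here is an assumption), the server can compare both candidate references against the captured frame and keep the cheaper one:

```python
import numpy as np

def select_reference(captured, prev_decoded, graphics_rendered):
    """Pick the candidate reference frame with the lower residual error.

    All three frames are assumed to be equally sized numpy arrays
    (e.g. luma planes). Residual error is measured here as the sum of
    squared differences; any distortion metric could be substituted.
    """
    err_prev = np.sum((captured.astype(np.int64) - prev_decoded.astype(np.int64)) ** 2)
    err_gfx = np.sum((captured.astype(np.int64) - graphics_rendered.astype(np.int64)) ** 2)
    return prev_decoded if err_prev <= err_gfx else graphics_rendered
```

In a practical encoder this comparison would typically be made per block inside the motion-estimation loop rather than over whole frames.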
Based on the proposed framework, we study the rate-allocation problem,
in which we optimize the split of the available bit rate between the
compressed frames and the graphics data so as to minimize the total
distortion under a bandwidth constraint.
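In symbols (our notation; the paper's exact formulation may differ), the problem can be written as

\[
\min_{R_v,\, R_g} \; D(R_v, R_g) \quad \text{subject to} \quad R_v + R_g \le R_{\max},
\]

where \(R_v\) and \(R_g\) are the bit rates allocated to the compressed frames and to the graphics data, respectively, \(D(\cdot)\) is the resulting total distortion, and \(R_{\max}\) is the available bandwidth.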
Experimental results demonstrate that the proposed framework optimally
allocates bit rates, achieving lower distortion for cloud gaming than
traditional video-streaming and graphics-streaming approaches.