From NYU Video Lab

HomePage: 360-degree Video View Prediction

360-degree video has become popular in recent years with advances in virtual reality (VR) and augmented reality (AR) technologies and has been rapidly commercialized in many applications, such as immersive cinema, gaming, education and training, healthcare, social media, and 360-degree video streaming.

Unlike traditional 2D video, in 360-degree video (also referred to as “omni-directional video”), only a portion of the entire scene is watched at a time, and viewers constantly explore and navigate to new view directions according to the video content and their own interests. However, the viewing behaviors of 360-degree videos have not been thoroughly investigated.

In this project, we explore user viewing behaviors when watching different 360-degree videos or VR content on different devices, such as a head-mounted display (HMD) or a computer with mouse navigation. Building on these behavioral studies, advanced view prediction algorithms can be designed to facilitate 360-degree video applications, such as 360-degree video coding and streaming and VR media design (including gaming, advertising, film-making, etc.). Our current research topics include, but are not limited to, the following:

1. Desktop360: A 360-degree video view trajectory dataset of viewers watching 360-degree videos in a desktop/laptop environment.

2. View Prediction Solutions:
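To give a concrete flavor of what a view prediction algorithm does, the sketch below shows one of the simplest possible baselines: linear extrapolation of the viewer's recent yaw/pitch trajectory. This is only an illustrative example, not the project's actual method; the `(time, yaw, pitch)` trajectory format and the function name are assumptions for the sake of the sketch.

```python
def predict_view(trajectory, horizon):
    """Predict a future viewing direction by linear extrapolation.

    trajectory: list of (t, yaw_deg, pitch_deg) samples, t in seconds
                (assumed format for this illustration).
    horizon:    how far ahead to predict, in seconds.
    Returns (yaw_deg, pitch_deg), with yaw wrapped to [-180, 180)
    and pitch clamped to [-90, 90].
    """
    (t0, y0, p0), (t1, y1, p1) = trajectory[-2], trajectory[-1]
    dt = t1 - t0
    # Unwrap yaw so a jump from 179 to -179 counts as +2 degrees, not -358.
    dyaw = (y1 - y0 + 180.0) % 360.0 - 180.0
    yaw = y1 + dyaw / dt * horizon
    pitch = p1 + (p1 - p0) / dt * horizon
    yaw = (yaw + 180.0) % 360.0 - 180.0          # wrap back to [-180, 180)
    pitch = max(-90.0, min(90.0, pitch))          # pitch cannot exceed the poles
    return yaw, pitch
```

For example, `predict_view([(0.0, 170.0, 0.0), (0.1, 178.0, 5.0)], 0.1)` continues the motion across the 180-degree seam and returns a yaw near -174 degrees. Real view prediction solutions additionally exploit video content and cross-viewer statistics, which a per-viewer extrapolation like this cannot capture.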

Related Publications:

[C1] F. Duanmu, Y. Mao, S. Liu, S. Srinivasan and Y. Wang, “A Subjective Study of Viewer Navigation Behaviors When Watching 360-degree Videos on Computers,” IEEE International Conference on Multimedia and Expo (ICME), San Diego, California, USA, 2018 (Accepted).

Last Update by Fanyi Duanmu, 02/14/2018

Page last modified on March 19, 2018, at 09:11 AM EST