Equatorial part segmentation model for 360-deg video projection
Anas Jallouli, Fahmi Kammoun, Nouri Masmoudi
Abstract
A 360-deg (virtual reality) video captures the entire scene around the observer simultaneously and lets the viewer select any part of the surroundings for display, either on a head-mounted display for a single user or on smart TVs, curved screens, and video projectors placed in a 360-deg projection room for multiple users. To encode these spherical videos with standard codecs, a projection step is required to transform the original 3-D scene into a regular 2-D video sequence. Different geometric layouts can be used, such as equirectangular projection (ERP), cubemap projection (CMP), and segmented sphere projection (SSP). The proposed model maps the 360-deg video onto eight square faces: six faces for the equatorial part and two faces for the top and bottom views. This layout yields a better projection of the video because the aspect ratio of the lateral faces is very close to that of the televisions and video projectors used in 360-deg rooms. Quality-metric comparisons show average gains over the ERP and SSP models, and the encoding time is greatly reduced compared with the CMP model.
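The abstract describes mapping the sphere onto six equatorial faces plus a top and a bottom face but does not give the exact face boundaries. As a rough illustration only, the following minimal Python sketch classifies a spherical direction into one of the eight faces, assuming each equatorial face spans 60 deg of longitude and the polar faces cover latitudes beyond ±45 deg; both values are assumptions for illustration, not taken from the paper.

```python
# Hypothetical face layout for an equatorial part segmentation:
# six equatorial faces, each assumed to span 60 deg of longitude, plus a
# top and a bottom face for latitudes beyond an assumed +/-45 deg cutoff.
# The paper's actual face boundaries and per-face sampling may differ.

POLAR_CUTOFF_DEG = 45.0  # assumed latitude limit of the equatorial band


def classify_face(lon_deg: float, lat_deg: float) -> str:
    """Map a spherical direction (longitude, latitude in degrees) to a face label."""
    if lat_deg > POLAR_CUTOFF_DEG:
        return "top"
    if lat_deg < -POLAR_CUTOFF_DEG:
        return "bottom"
    # Equatorial faces 0..5, each covering 60 deg of longitude.
    face = int((lon_deg + 180.0) // 60.0) % 6
    return f"equator_{face}"


if __name__ == "__main__":
    # A few sample directions and the faces they would fall on.
    for lon, lat in [(0.0, 0.0), (90.0, 10.0), (-170.0, -20.0), (30.0, 80.0)]:
        print((lon, lat), "->", classify_face(lon, lat))
```

In a full pipeline, each face would then be resampled into a square image and the eight squares packed into a single 2-D frame before encoding with a standard codec.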
© 2019 SPIE and IS&T. 1017-9909/2019/$25.00.
Anas Jallouli, Fahmi Kammoun, and Nouri Masmoudi "Equatorial part segmentation model for 360-deg video projection," Journal of Electronic Imaging 28(1), 013019 (4 February 2019). https://doi.org/10.1117/1.JEI.28.1.013019
Received: 30 September 2018; Accepted: 3 January 2019; Published: 4 February 2019
CITATIONS
Cited by 2 scholarly publications.
KEYWORDS
Video
Cubemap projection
Computer programming
Distortion
Image segmentation
Video coding
Projection systems
