Title: Scalable light field coding with support for region of interest enhancement
Authors: Conti, C.
Soares, L. D.
Nunes, P.
Keywords: Light field
Field of view scalability
Region of interest
Image compression
Issue Date: 2018
Publisher: IEEE
Abstract: Light field imaging based on microlens arrays - also known as holoscopic, plenoptic, or integral imaging - has recently emerged as a feasible and promising technology for future image and video applications. However, deploying actual light field applications will require more powerful representation and coding solutions that support emerging manipulation and interaction functionalities. In this context, this paper proposes a novel scalable coding approach that supports a new type of scalability, referred to as Field of View (FOV) scalability, in which enhancement layers can correspond to regions of interest (ROI). The proposed scalable coding approach comprises a base layer compliant with the High Efficiency Video Coding (HEVC) standard, complemented by one or more enhancement layers that progressively enable richer versions of the same light field content in terms of content manipulation and interaction possibilities, for the whole scene or just for a given ROI. Experimental results show the advantages of the proposed scalable coding approach with ROI support in catering to users with different preferences and requirements in terms of interaction functionalities.
Peer reviewed: yes
DOI: 10.23919/EUSIPCO.2018.8553608
ISBN: 978-9-0827-9701-5
ISSN: 2076-1465
Accession number: WOS:000455614900373
Appears in Collections:IT-CRI - Comunicações a conferências internacionais

Files in This Item:
File: PID5428053.pdf (Post-print, 1.74 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.