Please use this identifier to cite or link to this item: http://hdl.handle.net/10071/16344
Author(s): Verlekar, T. T.
Correia, P. L.
Soares, L. D.
Date: 2017
Title: Gait recognition using normalized shadows
Pages: 936-940
ISSN: 2076-1465
ISBN: 978-0-9928626-7-1
DOI (Digital Object Identifier): 10.23919/EUSIPCO.2017.8081345
Keywords: Shadow biometrics
Gait recognition
Abstract: Surveillance of public spaces is often conducted with the help of cameras placed at elevated positions. Recently, drones with high resolution cameras have made it possible to perform overhead surveillance of critical spaces. However, images obtained in these conditions may not contain enough body features to allow conventional biometric recognition. This paper introduces a novel gait recognition system which uses the shadows cast by users, when available. It includes two main contributions: (i) a method for shadow segmentation, which analyzes the orientation of the silhouette contour to identify the feet position along time, in order to separate the body and shadow silhouettes connected at such positions; (ii) a method that normalizes the segmented shadow silhouettes, by applying a transformation derived from optimizing the low rank textures of a gait texture image, to compensate for changes in view and shadow orientation. The normalized shadow silhouettes can then undergo a gait recognition algorithm, which in this paper relies on the computation of a gait energy image, combined with linear discriminant analysis for user recognition. The proposed system outperforms the available state-of-the-art, being robust to changes in acquisition viewpoints.
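As a rough illustration of the recognition stage mentioned in the abstract (not the authors' implementation), the Python sketch below averages a sequence of aligned, size-normalized binary silhouettes into a gait energy image and fits a linear discriminant classifier on the flattened images; the function names and the use of NumPy and scikit-learn are assumptions made purely for this example.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def gait_energy_image(silhouettes):
        # Average a sequence of equally sized binary silhouettes (non-zero = foreground)
        # over one gait cycle; the result is a single grayscale gait energy image (GEI).
        stack = np.stack([(np.asarray(s) > 0).astype(np.float32) for s in silhouettes])
        return stack.mean(axis=0)

    def train_lda(geis, labels):
        # Flatten one GEI per training sequence into a feature vector and fit LDA
        # so that new sequences can be assigned to a known user.
        X = np.stack([g.ravel() for g in geis])
        clf = LinearDiscriminantAnalysis()
        clf.fit(X, labels)
        return clf

Recognizing a probe sequence would then amount to computing its GEI and calling clf.predict on the flattened image.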
Peer reviewed: yes
Access type: Open Access
Appears in Collections: IT-CRI - Comunicações a conferências internacionais

Files in This Item:
File: Gait Recognition Using Shadow Biometrics.pdf
Description: Post-print
Size: 767,9 kB
Format: Adobe PDF


