Use this identifier to reference this record: http://hdl.handle.net/10071/22877
Authors: Jardim, D.; Nunes, L.; Dias, M.
Editors: Verikas, A., Radeva, P., Nikolaev, D. P., Zhang, W., and Zhou, J.
Date: 1-Jan-2017
Title: Predicting human activities in sequences of actions in RGB-D videos
Volume: 10341
Event title: 9th International Conference on Machine Vision, ICMV 2016
ISSN: 0277-786X
ISBN: 978-1-5106-1132-0
DOI (Digital Object Identifier): 10.1117/12.2268524
Keywords: Human motion analysis; Recognition; Segmentation; Clustering; Labeling; Kinect; Prediction; Anticipation
Abstract: In our daily activities we perform prediction or anticipation when interacting with other humans or with objects. Computer-based prediction of human activity has several potential applications: surveillance systems, human-computer interfaces, sports video analysis, human-robot collaboration, games, and health-care. We propose a system capable of recognizing and predicting human actions using supervised classifiers trained with automatically labeled data, evaluated on our human-activity RGB-D dataset (recorded with a Kinect sensor), and using only the positions of the main skeleton joints to extract features. Conditional random fields (CRFs) have been used before to model the sequential nature of actions in a sequence, but where other approaches try to predict an outcome or to anticipate a fixed interval ahead in time (seconds), we try to predict the subject's next action. Our results show an activity prediction accuracy of 89.9% using an automatically labeled dataset.
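Note: as an illustration of the CRF-over-skeleton-features setup described in the abstract, a minimal sketch in Python is given below. It assumes the third-party sklearn-crfsuite package and hypothetical joint names, feature choices, and label set; it is not the authors' implementation.

import sklearn_crfsuite

def frame_features(joints):
    # joints: {joint_name: (x, y, z)} for one frame; hypothetical representation.
    feats = {}
    for name, (x, y, z) in joints.items():
        feats[name + "_x"] = x
        feats[name + "_y"] = y
        feats[name + "_z"] = z
    return feats

def train_crf(X_sequences, y_sequences):
    # X_sequences: list of sequences, each a list of per-frame feature dicts.
    # y_sequences: list of sequences of action labels (one label per frame/segment).
    crf = sklearn_crfsuite.CRF(
        algorithm="lbfgs",              # L-BFGS training
        c1=0.1, c2=0.1,                 # L1/L2 regularization strengths
        max_iterations=100,
        all_possible_transitions=True,  # learn transitions between all label pairs
    )
    crf.fit(X_sequences, y_sequences)
    return crf

# Usage sketch: crf.predict([feature_dicts_of_a_partial_sequence]) returns the most
# likely label sequence; the label assigned to the latest segment gives a guess at
# the ongoing/next action when the sequence is cut mid-activity.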
Peer reviewed: yes
Access: Open Access
Appears in collections: ISTAR-CRI - Comunicações a conferências internacionais

Files in this record:
File: conferenceobject_42663.pdf | Description: Accepted Version | Size: 418.62 kB | Format: Adobe PDF


