Please use this identifier to cite or link to this item: http://hdl.handle.net/10071/28434
Author(s): Tavares, A. J.
Silva, J. L.
Ventura, R.
Editor: Chen, F., and Billinghurst, M.
Date: 2023
Title: Physiologically attentive user interface for improved robot teleoperation
Book title/volume: IUI '23: Proceedings of the 28th International Conference on Intelligent User Interfaces
Pages: 776 - 789
Event title: IUI '23: 28th International Conference on Intelligent User Interfaces
Reference: Tavares, A. J., Silva, J. L., & Ventura, R. (2023). Physiologically attentive user interface for improved robot teleoperation. In F. Chen, & M. Billinghurst (Eds.), IUI '23: Proceedings of the 28th International Conference on Intelligent User Interfaces (pp. 776-789). Association for Computing Machinery. https://doi.org/10.1145/3581641.3584084
ISBN: 979-8-4007-0106-1
DOI (Digital Object Identifier): 10.1145/3581641.3584084
Keywords: Attentive user interface
Recycling user interfaces
Human-robot interaction
Mental state classification
Neural networks
Robot teleoperation
Abstract: User interfaces (UI) are shifting from being attention-hungry to being attentive to users' needs during interaction. Interfaces developed for robot teleoperation can be particularly complex, often displaying large amounts of information, which can increase the cognitive overload that degrades the operator's performance. This paper presents the development of a Physiologically Attentive User Interface (PAUI) prototype, preliminarily evaluated with six participants. A case study on Urban Search and Rescue (USAR) operations involving robot teleoperation was used, although the proposed approach aims to be generic. The robot considered provides an overly complex Graphical User Interface (GUI) whose source code is not accessible. This represents a recurring and challenging scenario in which robots are still in use but technical updates are no longer offered, which usually leads to their abandonment. A major contribution of the approach is the possibility of recycling old systems while improving the UI made available to end users, taking their physiological data as input. The proposed PAUI analyses physiological data, facial expressions, and eye movements to classify three mental states (rest, workload, and stress). An Attentive User Interface (AUI) is then assembled by recycling a pre-existing GUI, which is dynamically modified according to the predicted mental state to improve the user's focus during mentally demanding situations. In addition to the novelty of the proposed PAUIs that take advantage of pre-existing GUIs, this work also contributes with the design of a user experiment comprising mental state induction tasks that successfully trigger high and low cognitive overload states. Results from the preliminary user evaluation revealed a tendency towards improvement in the usefulness and ease of use of the PAUI, although without statistical significance due to the reduced number of subjects.
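For illustration of the pipeline described in the abstract (multimodal physiological features classified into rest, workload, or stress, which then drives UI adaptation), the sketch below is a minimal, hypothetical outline in Python. It is not taken from the paper: the feature layout, classifier, and UI policy names are assumptions, and the real system uses a trained neural network rather than the random stand-in weights shown here.

import numpy as np

STATES = ["rest", "workload", "stress"]

def classify_mental_state(features: np.ndarray,
                          weights: np.ndarray,
                          bias: np.ndarray) -> str:
    """Toy softmax classifier over a fused feature vector
    (e.g., physiological, facial-expression, and eye-movement features).
    Stands in for the paper's neural-network mental state classifier."""
    logits = features @ weights + bias
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return STATES[int(np.argmax(probs))]

def adapt_ui(state: str) -> dict:
    """Hypothetical adaptation policy for the recycled GUI:
    reduce displayed information as cognitive demand increases."""
    if state == "rest":
        return {"show_all_panels": True, "highlight_alerts": False}
    if state == "workload":
        return {"show_all_panels": False, "highlight_alerts": False}
    return {"show_all_panels": False, "highlight_alerts": True}  # stress

# Example usage with random stand-in values (real weights would be learned).
rng = np.random.default_rng(0)
features = rng.normal(size=16)        # fused multimodal feature vector
weights = rng.normal(size=(16, 3))
bias = np.zeros(3)
state = classify_mental_state(features, weights, bias)
print(state, adapt_ui(state))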
Peer reviewed: yes
Access type: Open Access
Appears in Collections: ISTAR-CRI - Comunicações a conferências internacionais

Files in This Item:
conferenceobject_95534.pdf (967.75 kB, Adobe PDF)


