Article

Single robot - Multiple human interaction via intelligent user interfaces

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 21, Issue 6, Pages 458-465

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2008.03.008

Keywords

human-robot interaction; mobile robots; navigation; intelligent user interfaces


Abstract

This project addresses research issues concerning the design of intelligent user interfaces for improving human-robot interaction. In some critical applications, users interact with robots via Graphical User Interfaces (GUIs), which typically contain standard components intended to serve a large number of users. Depending on a user's preferences and capabilities, and on the context in which the robot is used, some of these interface components may be redundant or even confusing. This paper describes an adaptive system that enables a mobile robot to learn its users' preferences and capabilities so that it can offer a dynamic, efficient GUI tailored to each user rather than a standard GUI for all users. The system predicts users' future actions by generating models from their previous interactions with the robot. The system was implemented and evaluated on a Pioneer 3-AT mobile robot. About 20 participants, assessed on spatial ability, directed the robot in simple spatial navigation tasks to evaluate the effectiveness of the adaptive interface. Time to complete the task, the number of steps, and the number of errors were recorded. The results showed that although spatial reasoning ability plays an important role in mobile robot navigation, it matters less when controlling the robot with the adaptive interface than with the non-adaptive one. (c) 2008 Elsevier B.V. All rights reserved.
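The abstract states that the system predicts a user's future actions from models built on their previous interactions, but it does not specify the modelling technique. A minimal illustrative sketch, assuming a simple first-order (bigram) frequency model over logged GUI commands, might look like this; the class and method names are hypothetical, not from the paper:

```python
from collections import defaultdict

class ActionPredictor:
    """Illustrative sketch: predict a user's next GUI action from their
    interaction history using observed transition frequencies.
    The actual modelling approach in the paper may differ."""

    def __init__(self):
        # counts[prev_action][next_action] = how often that transition occurred
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, prev_action, next_action):
        """Record one transition from the user's interaction log."""
        self.counts[prev_action][next_action] += 1

    def predict(self, prev_action, top_k=3):
        """Return up to top_k most frequent follow-up actions, e.g. so an
        adaptive GUI could surface those controls more prominently."""
        followers = self.counts.get(prev_action, {})
        ranked = sorted(followers.items(), key=lambda kv: -kv[1])
        return [action for action, _ in ranked[:top_k]]

# Example: learn from a hypothetical navigation command log
log = ["forward", "forward", "turn_left", "forward", "turn_right", "forward"]
predictor = ActionPredictor()
for prev, nxt in zip(log, log[1:]):
    predictor.observe(prev, nxt)

print(predictor.predict("forward"))
```

Under such a model, controls for the predicted actions could be enlarged or reordered per user, while rarely used components are hidden, which is consistent with the adaptive-versus-standard GUI contrast the abstract describes.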
