Article

Labeling Out-of-View Objects in Immersive Analytics to Support Situated Visual Searching

Journal

IEEE Transactions on Visualization and Computer Graphics

Publisher

IEEE Computer Society
DOI: 10.1109/TVCG.2021.3133511

Keywords

Object labeling; mixed / augmented reality; immersive analytics; situated analytics; data visualization


Abstract

Augmented Reality (AR) embeds digital information into objects of the physical world. Data can be shown in situ, enabling real-time visual comparisons and object search in everyday user tasks, such as comparing products or looking up scores in a sports game. While there have been studies on designing AR interfaces for situated information retrieval, there has been only limited research on AR object labeling for visual search tasks in the spatial environment. In this article, we identify and categorize different design aspects of AR label design and report on a formal user study of labels for out-of-view objects to support visual search tasks in AR. We design three visualization techniques for out-of-view object labeling in AR, which encode, respectively, the relative physical position (height-encoded), the rotational direction (angle-encoded), and the label values (value-encoded) of the objects. We further implement two traditional in-view object labeling techniques, where labels are placed either next to their respective objects (situated) or at the edge of the AR field of view (FoV) (boundary). We evaluate these five label conditions in three visual search tasks for static objects. Our study shows that out-of-view object labels are beneficial for searching objects outside the FoV, for spatial orientation, and for comparing multiple spatially sparse objects. Angle-encoded labels with directional cues to the surrounding objects show the best overall performance and the highest user satisfaction. We discuss the implications of our findings for future immersive AR interface design.
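The angle-encoded condition described above depends on knowing each object's direction relative to the user's current heading, so that a directional cue can be drawn even when the object is outside the FoV. A minimal sketch of that computation is shown below; the function names, the 2D ground-plane simplification, and the example FoV width are illustrative assumptions, not the authors' implementation:

```python
import math

def relative_bearing(user_pos, user_yaw_deg, obj_pos):
    """Signed angle in degrees from the user's facing direction to an object,
    in (-180, 180]; negative means the object is to the user's left.
    Positions are (x, z) on the ground plane; yaw 0 faces +z."""
    dx = obj_pos[0] - user_pos[0]
    dz = obj_pos[1] - user_pos[1]
    obj_angle = math.degrees(math.atan2(dx, dz))  # 0 deg = straight ahead (+z)
    # Wrap the difference into (-180, 180] so cues point the shorter way around.
    return (obj_angle - user_yaw_deg + 180.0) % 360.0 - 180.0

def is_out_of_view(bearing_deg, fov_deg=43.0):
    """True if the object lies outside a symmetric horizontal FoV.
    43 deg is only a placeholder for a HoloLens-class headset."""
    return abs(bearing_deg) > fov_deg / 2.0
```

In a real AR application the same bearing would drive where the angle-encoded label sits on its circular layout, and `is_out_of_view` would decide whether to show an out-of-view cue or a conventional in-view label.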

