Article

Mobile Eye-Tracking Data Analysis Using Object Detection via YOLO v4

Journal

SENSORS
Volume 21, Issue 22, Pages: -

Publisher

MDPI
DOI: 10.3390/s21227668

Keywords

eye movements; eye tracking; object detection; YOLO; Faster R-CNN; physics experiments


Summary

Remote eye tracking is a crucial tool for the online analysis of learning processes. Mobile eye trackers, especially in combination with object recognition models, can simplify the analysis process and enable real-time system responses to the user's gaze, although several challenges in applying object detection to mobile eye-tracking data still need to be addressed.

Abstract

Remote eye tracking has become an important tool for the online analysis of learning processes. Mobile eye trackers can even extend the range of opportunities (in comparison to stationary eye trackers) to real settings, such as classrooms or experimental lab courses. However, the complex and sometimes manual analysis of mobile eye-tracking data often hinders the realization of extensive studies, as this is a very time-consuming process and usually not feasible for real-world situations in which participants move or manipulate objects. In this work, we explore the opportunities of using object recognition models to assign mobile eye-tracking data to real objects during an authentic student lab course. In a comparison of three different Convolutional Neural Networks (CNNs), a Faster Region-Based CNN (Faster R-CNN), You Only Look Once (YOLO) v3, and YOLO v4, we found that YOLO v4, together with an optical flow estimation, provides the fastest results with the highest accuracy for object detection in this setting. The automatic assignment of the gaze data to real objects simplifies the time-consuming analysis of mobile eye-tracking data and offers an opportunity for real-time system responses to the user's gaze. Additionally, we identify and discuss several problems in using object detection for mobile eye-tracking data that need to be considered.
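To illustrate the gaze-to-object assignment the abstract describes, below is a minimal Python sketch. All names (`Detection`, `assign_gaze`) and the smallest-box tie-breaking rule are illustrative assumptions, not the authors' actual pipeline; the sketch also omits the optical flow step the paper uses to propagate detections between detector runs.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Detection:
    """One object detection from a YOLO-style model (hypothetical structure)."""
    label: str
    confidence: float
    x1: float  # bounding box corners in scene-camera pixel coordinates
    y1: float
    x2: float
    y2: float

    def contains(self, x: float, y: float) -> bool:
        """True if the gaze point (x, y) lies inside this bounding box."""
        return self.x1 <= x <= self.x2 and self.y1 <= y <= self.y2

    def area(self) -> float:
        return (self.x2 - self.x1) * (self.y2 - self.y1)


def assign_gaze(gaze_x: float, gaze_y: float,
                detections: List[Detection]) -> Optional[Detection]:
    """Assign a gaze sample to the smallest detected box containing it.

    Preferring the smallest box is one plausible tie-breaker when boxes
    overlap; the paper may use a different heuristic.
    """
    hits = [d for d in detections if d.contains(gaze_x, gaze_y)]
    return min(hits, key=Detection.area) if hits else None


# Example: two overlapping objects from a physics lab scene, one gaze sample.
frame = [
    Detection("multimeter", 0.92, 100, 120, 300, 360),
    Detection("resistor", 0.81, 180, 200, 230, 250),
]
print(assign_gaze(200, 220, frame))  # -> the smaller "resistor" box wins
```

In a full pipeline along the lines the abstract sketches, frames without a detector pass could have their boxes shifted by an optical flow estimate (e.g., OpenCV's Farneback flow) before assignment, which is what makes the per-frame gaze mapping fast enough for real-time use.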

