
Title:
Consider the Head Movements! Saccade Computation in Mobile Eye-Tracking
Publisher Information:
Zenodo
Publication Year:
2022
Collection:
Zenodo
Document Type:
Conference object
Language:
unknown
DOI:
10.1145/3517031.3529624
Rights:
Creative Commons Attribution 4.0 International ; cc-by-4.0 ; https://creativecommons.org/licenses/by/4.0/legalcode
Accession Number:
edsbas.98D230FE
Database:
BASE

Further Information

Saccadic eye movements are known to serve as a suitable proxy for task prediction. In mobile eye-tracking, saccadic events are strongly influenced by head movements. Common attempts to compensate for head-movement effects either neglect saccadic events altogether or fuse gaze and head-movement signals measured by IMUs in order to simulate the gaze signal at head level. Using image-processing techniques, we propose a solution for computing saccades based on frames of the scene-camera video. In this method, fixations are first detected based on gaze positions specified in the coordinate system of each frame, and the respective frames are then stitched together. Lastly, pairs of consecutive fixations (forming a saccade) are projected into the coordinate system of the stitched image using the homography matrices computed by the stitching algorithm. The results show a significant difference in length between projected and original saccades, with an error of approximately 37% introduced by employing saccades without head-movement compensation.

This is the implementation of the proposed algorithm in Python. Sample data are also provided for running the code.
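The core projection step described above can be sketched as follows. This is a minimal illustration, not the published implementation: it assumes each fixation comes with a 3x3 homography (as produced by a stitching algorithm) mapping its frame's coordinates into the stitched image, and the function names `project_point` and `saccade_length` are hypothetical.

```python
import numpy as np

def project_point(H, point):
    """Project a 2D point through a 3x3 homography using
    homogeneous coordinates: p' = H @ [x, y, 1], then dehomogenize."""
    x, y = point
    v = H @ np.array([x, y, 1.0])
    return v[:2] / v[2]

def saccade_length(H1, H2, fix1, fix2):
    """Length of a saccade between two consecutive fixations, each given
    in its own frame's coordinate system, measured after projecting both
    into the common coordinate system of the stitched image."""
    p1 = project_point(H1, fix1)
    p2 = project_point(H2, fix2)
    return float(np.linalg.norm(p2 - p1))

# Illustration of the head-movement effect: the same gaze position in two
# frames has zero in-frame displacement, but a pure-translation homography
# (simulating a head rotation between frames) reveals a nonzero saccade
# in stitched-image coordinates.
H_identity = np.eye(3)
H_shifted = np.array([[1.0, 0.0, 50.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0]])
length = saccade_length(H_identity, H_shifted, (100.0, 100.0), (100.0, 100.0))
```

Here `length` is 50 pixels even though the gaze position is identical in both frames, which mirrors the paper's point that ignoring head movements misestimates saccade amplitude.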