@inproceedings{alinaghi2022can,
title = {I Can Tell by Your Eyes! Continuous Gaze-Based Turn-Activity Prediction Reveals Spatial Familiarity},
author = {Negar Alinaghi and Markus Kattenbeck and Ioannis Giannopoulos},
url = {https://drops.dagstuhl.de/opus/volltexte/2022/16887/},
doi = {10.4230/LIPIcs.COSIT.2022.2},
isbn = {978-3-95977-257-0},
year = {2022},
date = {2022-08-22},
urldate = {2022-08-22},
booktitle = {15th International Conference on Spatial Information Theory (COSIT 2022)},
pages = {2:1--2:13},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum für Informatik},
address = {Dagstuhl, Germany},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
abstract = {Spatial familiarity plays an essential role in the wayfinding decision-making process. Recent findings in the wayfinding activity recognition domain suggest that wayfinders' turning behavior at junctions is strongly influenced by their spatial familiarity. By continuously monitoring wayfinders' turning behavior as reflected in their eye movements during the decision-making period (i.e., immediately after an instruction is received until reaching the corresponding junction for which the instruction was given), we provide evidence that familiar and unfamiliar wayfinders can be distinguished. By applying a pre-trained XGBoost turning activity classifier on gaze data collected in a real-world wayfinding task with 33 participants, our results suggest that familiar and unfamiliar wayfinders show different onset and intensity of turning behavior. These variations are present not only between the two classes (familiar vs. unfamiliar) but also within each class. The differences in turning behavior within each class may stem from multiple sources, including different levels of familiarity with the environment.},
keywords = {eye tracking, human activity recognition, machine learning, spatial familiarity},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{alinaghi_et_al:LIPIcs.GIScience.2021.II.5,
title = {Will You Take This Turn? Gaze-Based Turning Activity Recognition During Navigation},
author = {Negar Alinaghi and Markus Kattenbeck and Antonia Golab and Ioannis Giannopoulos},
editor = {Krzysztof Janowicz and Judith A. Verstegen},
url = {https://drops.dagstuhl.de/opus/volltexte/2021/14764},
doi = {10.4230/LIPIcs.GIScience.2021.II.5},
issn = {1868-8969},
year = {2021},
date = {2021-01-01},
urldate = {2021-01-01},
booktitle = {11th International Conference on Geographic Information Science (GIScience 2021) - Part II},
volume = {208},
pages = {5:1--5:16},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum für Informatik},
address = {Dagstuhl, Germany},
series = {Leibniz International Proceedings in Informatics (LIPIcs)},
abstract = {Decision making is an integral part of wayfinding, and people increasingly use navigation systems to facilitate this task. The primary decision, which is also the main source of navigation error, concerns the turning activity, i.e., deciding whether to turn left, turn right, or continue straight ahead. The fundamental step in dealing with this error, before applying any preventive approaches (e.g., providing more information) or any compensatory solutions (e.g., pre-calculating alternative routes), could be to predict and recognize the potential turning activity. This paper aims to address this step by predicting the turning decision of pedestrian wayfinders, before the actual action takes place, using primarily gaze-based features. Applying Machine Learning methods, the results of the presented experiment demonstrate an overall accuracy of 91% within three seconds before arriving at a decision point. Beyond the application perspective, our findings also shed light on the cognitive processes of decision making as reflected by the wayfinder’s gaze behaviour: incorporating environmental and user-related factors into the model results in a noticeable change with respect to the importance of visual search features in turn activity recognition.},
keywords = {eye tracking, human activity recognition, machine learning, wayfinding},
pubstate = {published},
tppubtype = {inproceedings}
}