Two hours in Hollywood: A manually annotated ground truth data set of eye movements during movie clip watching. Journal of Eye Movement Research, Vol. 13, Issue 4, Oct. 2020.
Supersaliency: A Novel Pipeline for Predicting Smooth Pursuit-Based Attention Improves Generalisability of Video Saliency. IEEE Access, Vol. 8, 2020, pp. 1276–1289.
360-degree Video Gaze Behaviour: A Ground-Truth Data Set and a Classification Algorithm for Eye Movements. In Proc. 27th ACM International Conference on Multimedia (MM '19), 2019, pp. 1007–1015.
Characterising and automatically detecting smooth pursuit in a large-scale ground-truth data set of dynamic natural scenes. Journal of Vision, Vol. 19, Issue 14, Dec. 2019, pp. 1–25.
A novel gaze event detection metric that is not fooled by gaze-independent baselines. In Proc. 11th ACM Symposium on Eye Tracking Research & Applications (ETRA '19), 2019.
Classifying Autism Spectrum Disorder Based on Scanpaths and Saliency. In Proc. IEEE International Conference on Multimedia & Expo Workshops (ICMEW), 2019, pp. 633–636.
1D CNN with BLSTM for automated classification of fixations, saccades, and smooth pursuits. Behavior Research Methods, Vol. 51, Issue 2, Nov. 2018, pp. 556–572.
Deep learning vs. manual annotation of eye movements. In Proc. 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA '18), 2018, Article No. 101.
Supersaliency: Predicting Smooth Pursuit-Based Attention with Slicing CNNs Improves Fixation Prediction for Naturalistic Videos. 2018.
360-aware saliency estimation with conventional image saliency predictors. Signal Processing: Image Communication, Vol. 69, Nov. 2018, pp. 43–52.