Eye movement analysis for real-world settings using segmented linear regression

Title:
Eye movement analysis for real-world settings using segmented linear regression
Journal Title:
Computers in Biology and Medicine
Publication Date:
02 April 2024
Citation:
Johari, K., Bhardwaj, R., Kim, J.-J., Yow, W. Q., & Tan, U.-X. (2024). Eye movement analysis for real-world settings using segmented linear regression. Computers in Biology and Medicine, 174, 108364. https://doi.org/10.1016/j.compbiomed.2024.108364
Abstract:
Eye movement analysis is critical to studying human brain phenomena such as perception, cognition, and behavior. However, in uncontrolled real-world settings, the recorded gaze coordinates (commonly used to track eye movements) are typically noisy, making it difficult to track changes in the state of each phenomenon precisely, primarily because the expected change is usually a slower transient process. This paper proposes an approach, Improved Naive Segmented Linear Regression (INSLR), which approximates the gaze coordinates with a piecewise linear function (PLF) referred to as a hypothesis. INSLR improves on the existing NSLR approach by employing a hypotheses clustering algorithm, which redefines the final hypothesis estimation in two steps: (1) at each timestamp, measure the likelihood of each hypothesis in the candidate list using its least-squares fit score and its distance from the means of the hypotheses in the list; (2) filter hypotheses based on a pre-defined threshold. We demonstrate the significance of the INSLR method in addressing the challenges of uncontrolled real-world settings, such as gaze denoising and minimizing gaze prediction errors from cost-effective devices like webcams. Experimental results show that INSLR consistently outperforms the baseline NSLR in denoising noisy signals from three eye movement datasets and reduces the gaze prediction error of a low-precision device for 71.1% of samples. Furthermore, the improvement in denoising quality is validated by the improved accuracy of the oculomotor event classifier NSLR-HMM and by enhanced sensitivity in detecting variations in attention induced by a distractor during an online lecture.
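The two-step hypothesis filtering described in the abstract can be sketched in Python. This is an illustrative reconstruction, not the authors' implementation: the function name `filter_hypotheses`, the exponential likelihood weighting, and the relative-threshold convention are all assumptions made for the sketch.

```python
import numpy as np

def filter_hypotheses(candidates, t, gaze, threshold=0.5):
    """Score and filter candidate piecewise-linear hypotheses (illustrative).

    candidates: list of callables, each mapping timestamps (N,) to
                predicted gaze coordinates (N, 2)
    t:          timestamps, shape (N,)
    gaze:       recorded gaze coordinates, shape (N, 2)
    threshold:  keep hypotheses whose likelihood is at least this
                fraction of the best likelihood
    """
    preds = np.stack([h(t) for h in candidates])            # (H, N, 2)

    # Step 1a: least-squares fit score per hypothesis
    # (lower squared error -> higher score)
    sse = ((preds - gaze) ** 2).sum(axis=(1, 2))            # (H,)
    fit_score = np.exp(-sse / (sse.mean() + 1e-12))

    # Step 1b: distance of each hypothesis from the mean of all candidates
    mean_pred = preds.mean(axis=0)                          # (N, 2)
    dist = np.linalg.norm(preds - mean_pred, axis=2).mean(axis=1)
    dist_score = np.exp(-dist / (dist.mean() + 1e-12))

    # Combined likelihood per hypothesis
    likelihood = fit_score * dist_score

    # Step 2: filter by a pre-defined (here, relative) threshold
    keep = likelihood >= threshold * likelihood.max()
    return [h for h, k in zip(candidates, keep) if k], likelihood
```

For example, given one hypothesis that tracks the recorded gaze and one with a constant offset, the offset hypothesis receives a much lower likelihood and is filtered out, leaving the well-fitting hypothesis as the final estimate.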
License type:
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Funding Info:
This research/project is supported by the A*STAR AME Programmatic Funding Scheme (Grant Reference No. A18A2b0046).
ISSN:
0010-4825
Files uploaded:
eye-mov-analysis-in-nat-scen-slr-cbm-r1-2812.pdf (3.16 MB, PDF; request a copy)