Full text
Peer-reviewed
  • Validation of SOBI‐DANS met...
    Sun, Rui; Chan, Cynthia; Hsiao, Janet; Tang, Akaysha C.

Psychophysiology, February 2021, Volume 58, Issue 2
    Journal Article

Neurophysiological investigations of neural processes are hindered by the presence of large artifacts associated with eye movement. Although blind source separation (BSS)‐based hybrid algorithms are useful for separating, identifying, and removing these artifacts from EEG, it remains unexamined to what extent neural signals can remain mixed with these artifact components, potentially resulting in unintended removal of critical neural signals. Here, we present a novel validation approach to quantitatively evaluate to what extent horizontal and vertical saccadic eye movement‐related artifact components (H and V Comps) are indeed ocular in origin. To automate the identification of the H and V Comps recovered by second‐order blind identification (SOBI), we introduced a novel Discriminant ANd Similarity (DANS)‐based method. Through source localization, we showed that over 95% of variance in the scalp projections of the SOBI‐DANS‐identified H and V Comps was ocular in origin. Through the analysis of saccade‐related potentials (SRPs), we found that the H and V Comps' SRP amplitudes were jointly modulated by eye movement direction and distance. SOBI‐DANS' component selection was in 100% agreement with human experts' selection and was 100% successful in component identification across all participants, indicating high cross‐individual consistency and robustness. These results set the stage for future work to transform the to‐be‐thrown‐away artifacts into signals indicative of gaze position, thereby providing readily co‐registered eye movement and neural signals without using a separate eye tracker.

We validated the SOBI‐DANS method for extracting artifact components associated with horizontal and vertical eye movements from EEG.
We (1) set an upper bound on the maximum leakage of neural signals into these components; (2) quantified modulation of these components by saccade direction and distance; (3) demonstrated cross‐individual consistency; and (4) raised the possibility of transforming EEG artifacts into signals of gaze position. The present study offers a starting point for future development of EEG‐based virtual eye tracking applications.
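The method hinges on second‐order blind identification, which separates mixed signals by exploiting differences in their temporal structure. As a rough illustration only, not the authors' implementation, the sketch below uses the single‐lag AMUSE variant of second‐order BSS (SOBI proper jointly diagonalizes lagged covariances at many lags) to unmix two synthetic signals; the function name `amuse` and the lag choice `tau` are hypothetical.

```python
import numpy as np

def amuse(X, tau=50):
    """Single-lag second-order BSS (AMUSE), a simplified relative of SOBI.
    X: (channels, samples) mixed data. Returns (sources, unmixing matrix)."""
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: decorrelate channels and scale to unit variance.
    C0 = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(C0)
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T     # whitening matrix
    Z = W @ X
    # Symmetrized lagged covariance of the whitened data.
    C_tau = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
    C_tau = (C_tau + C_tau.T) / 2
    # Its eigenvectors give the rotation separating sources with
    # distinct autocorrelation at lag tau (up to sign and order).
    _, U = np.linalg.eigh(C_tau)
    return U.T @ Z, U.T @ W

# Two sources with different temporal structure, linearly mixed.
t = np.linspace(0, 10, 2000)
S_true = np.vstack([np.sin(2 * np.pi * 1.0 * t),       # 1.0 Hz sine
                    np.sign(np.sin(2 * np.pi * 0.3 * t))])  # 0.3 Hz square
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                              # mixing matrix
S_est, _ = amuse(A @ S_true)
```

In a hybrid pipeline like the one validated here, the recovered components would then be screened (by DANS in the authors' method, whose details the abstract does not specify) to decide which components are ocular before removal or reuse as gaze signals.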