EEG Artifact Removal: Does It Help Or Hinder?

Published in Communications Biology (2025) [1], R. Kessler, A. Enge & M.A. Skeide explored 'How EEG Preprocessing Shapes Decoding Performance'. The authors systematically varied each step of a standardised EEG preprocessing pipeline, yielding 2,593 replications across unique forking paths, executed as a SLURM job submission to an HPC cluster.
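The combinatorics behind such a "multiverse" analysis can be sketched in a few lines. The option names and levels below are hypothetical, not those of [1]; the point is only how independent preprocessing choices multiply into forking paths:

```python
import itertools

# Hypothetical preprocessing steps and their candidate settings.
# The actual steps and levels varied in [1] differ; this only
# illustrates how forking paths multiply combinatorially.
options = {
    "highpass_hz": [None, 0.1, 0.5],
    "lowpass_hz": [None, 40],
    "ica_artifact_removal": [False, True],
    "baseline_correction": [False, True],
    "rereference": ["none", "average"],
}

# Every combination of settings is one forking path / one pipeline.
forking_paths = [
    dict(zip(options, combo))
    for combo in itertools.product(*options.values())
]

print(len(forking_paths))  # 3 * 2 * 2 * 2 * 2 = 48 pipeline variants
```

Each dictionary would then be submitted as one decoding run, which is why array-style SLURM submission on an HPC cluster is a natural fit.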

Of note, "most artifact corrections led to a decrease in decoding performance. However, removing artifacts may also remove the neural signal of interest, especially when using the threshold-based [independent component analysis] employed in the present study. Future studies should further disentangle the influence of artifact correction steps into the unique contribution of the removed artifacts versus the unintentionally removed neural signal. In some BCI use cases in which the source of the signal may be less relevant, an analyst may even refrain from removing predictive artifacts" ([1], p8).

In another paper, published in Scientific Reports (2023) [2], A. Delorme "compared optimised pipelines for preprocessing EEG data maximising ERP significance using the leading open-source EEG software: EEGLAB, FieldTrip, MNE, and Brainstorm. Only one pipeline performed significantly better than high-pass filtering the data" ([2], p1). The author concludes that "for relatively clean EEG data acquired in laboratory conditions, preprocessing techniques had little effect on data quality ... [which] might not be the case for more noisy data acquired in other conditions" ([2], p8).

Others have been applying the rapid developments in deep learning to the problem of multi-channel EEG artifact removal. R. Jiang et al. published in Scientific Reports (2025) [3] 'A Novel EEG Artifact Removal Algorithm Based on an Advanced Attention Mechanism.' Their paper introduced CLEnet, a dual-scale CNN and LSTM network with an improved EMA-1D attention mechanism. "Innovatively, CLEnet decouples morphological features from temporal features, thereby maximizing the preservation of neurophysiological information while eliminating noise... laying the foundation for EEG decoding in dynamic noise environments." ([3], p17).

Classical EEG artifact removal therefore incurs information loss, which can be minimised either by (a) simplifying the preprocessing stage for applications where the signal source is not critical, or (b) adopting more sophisticated artifact-removal methods better able to handle dynamic noise environments.

However, where model features are to be interpreted spatially or temporally, removing artifacts remains worthwhile despite the risk of discarding some neural signal.

References
[1] https://doi.org/10.1038/s42003-025-08464-3
[2] https://doi.org/10.1038/s41598-023-27528-0
[3] https://doi.org/10.1038/s41598-025-98653-1
