Daniel Scott

Beyond The Transformer

Published in 2017 by Google Brain, 'Attention Is All You Need' [1] established the Transformer neural network architecture, which parallelised previously serial computation through its multi-head self-attention mechanism, described in the language of linear algebra.
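The core of that mechanism is scaled dot-product attention, which each head computes independently. A minimal NumPy sketch (toy shapes and random data chosen here for illustration, not taken from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of each query to each key
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # each output row is a convex mix of V's rows

# Toy example: 3 tokens with dimension d_k = 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Because every token attends to every other token in one matrix multiply, the whole sequence is processed in parallel rather than step by step as in a recurrent network.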

Daniel Scott

EEG Artifact Removal: Does It Help Or Hinder?

Published in Communications Biology (2025) [1], R. Kessler, A. Enge & M.A. Skeide explored 'How EEG Preprocessing Shapes Decoding Performance'. The authors systematically varied each step of a standardised EEG preprocessing pipeline, yielding 2,593 replications across unique forking paths. This was executed as a SLURM job submission to an HPC cluster.
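The "forking paths" arise from the Cartesian product of the choices available at each preprocessing step. A minimal sketch of the idea, using hypothetical option names and counts (the real pipeline's steps differ and produce 2,593 paths):

```python
from itertools import product

# Hypothetical preprocessing choices; each key is one step in the pipeline
# and each list holds the variants tried for that step.
options = {
    "highpass_hz":   [None, 0.1, 0.5],
    "ica_artifacts": [False, True],
    "baseline":      ["none", "prestim"],
    "reference":     ["average", "mastoids"],
}

# One dict per unique forking path through the pipeline
forks = [dict(zip(options, combo)) for combo in product(*options.values())]
print(len(forks))  # 3 * 2 * 2 * 2 = 24 unique forking paths

# On a cluster, each fork could be dispatched as one SLURM array task,
# e.g. selected by the SLURM_ARRAY_TASK_ID environment variable.
```

Enumerating the paths up front like this is what makes the multiverse embarrassingly parallel: each combination is an independent job, which is why a SLURM array submission to an HPC cluster fits naturally.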
