Structure-preserving transformers for sequences of SPD matrices

Mathieu Seraphim, Alexis Lechervy, Florian Yger, Luc Brun & Olivier Etard

In recent years, Transformer-based self-attention mechanisms have been successfully applied to the analysis of a variety of context-reliant data types, from texts to images and beyond, including data from non-Euclidean geometries. In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite (SPD) matrices while preserving their Riemannian geometry throughout the analysis. We apply our method to automatic sleep staging on time series of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
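To make the input data concrete, the following is a minimal sketch (not the authors' pipeline) of how a sequence of SPD matrices can be derived from a multichannel EEG recording, as the abstract describes: each fixed-length epoch yields one covariance matrix. The epoch length, channel count, and shrinkage regularisation used here are illustrative assumptions; shrinkage toward the identity is a common way to guarantee positive definiteness for short epochs.

```python
import numpy as np

def epoch_covariances(eeg, epoch_len, shrinkage=1e-3):
    """eeg: (n_channels, n_samples) array; returns an (n_epochs, C, C) SPD stack.

    Hypothetical helper for illustration; epoch_len and shrinkage are assumptions.
    """
    n_channels, n_samples = eeg.shape
    n_epochs = n_samples // epoch_len
    covs = np.empty((n_epochs, n_channels, n_channels))
    for i in range(n_epochs):
        segment = eeg[:, i * epoch_len:(i + 1) * epoch_len]
        cov = np.cov(segment)                           # symmetric PSD by construction
        covs[i] = cov + shrinkage * np.eye(n_channels)  # shrink toward identity -> PD
    return covs

# Toy usage: an 8-channel signal split into thirty 100-sample epochs.
rng = np.random.default_rng(0)
covs = epoch_covariances(rng.standard_normal((8, 3000)), epoch_len=100)
assert all(np.all(np.linalg.eigvalsh(c) > 0) for c in covs)  # SPD check
```

A Transformer operating on such a sequence must respect the fact that SPD matrices live on a Riemannian manifold rather than in a flat vector space, which is the structure-preservation property the paper's mechanism is designed around.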