
Dual-branch cross-dimensional self-attention-based imputation model for multivariate time series

Journal contribution posted on 2024-02-08, 00:52, authored by Le Fang, Wei Xiang, Yuan Zhou, Juan Fang, Lianhua Chi, Zongyuan Ge
In real-world scenarios, partial information loss in multivariate time series degrades time series analysis. Hence, time series imputation techniques have been adopted to compensate for the missing values. Existing methods focus on investigating temporal correlations, cross-variable correlations, and bidirectional dynamics of time series, and most rely on recurrent neural networks (RNNs) to capture temporal dependency. However, RNN-based models suffer from slow speed and high complexity when dealing with long-term dependency. While some self-attention-based models without recurrent structures can tackle long-term dependency through parallel computing, they do not fully learn and exploit correlations across the temporal and cross-variable dimensions. To address these limitations, we propose a novel dual-branch cross-dimensional self-attention-based imputation (DCSAI) model for multivariate time series, which performs global and auxiliary cross-dimensional analyses when imputing missing values. In particular, the model contains masked multi-head self-attention-based encoders, aligned with auxiliary generators, that obtain global and auxiliary correlations in the two dimensions; these correlations are then combined into one final representation through three weighted combinations. Extensive experiments show that our model outperforms state-of-the-art benchmark methods on three real-world public datasets under various missing rates, and ablation study results demonstrate the efficacy of each component of the model.
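The paper's full DCSAI architecture is not reproduced here, but the core operation the abstract names, masked self-attention over time steps used to fill missing values, can be sketched as follows. This is a minimal single-head NumPy illustration under stated assumptions: the random projection matrices stand in for learned weights, and the function name and masking scheme are illustrative, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def masked_self_attention_impute(x, observed, d_k=8, seed=0):
    """Impute missing entries with one masked self-attention pass (sketch).

    x        : (T, D) series with missing entries set to 0
    observed : (T, D) binary mask, 1 where the value is observed
    """
    rng = np.random.default_rng(seed)
    T, D = x.shape
    # Random projections stand in for learned weights (illustration only).
    Wq = rng.normal(scale=d_k ** -0.5, size=(D, d_k))
    Wk = rng.normal(scale=d_k ** -0.5, size=(D, d_k))
    Wv = rng.normal(scale=1.0, size=(D, D))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d_k)
    # Mask attention *to* time steps with no observed variables at all,
    # so reconstructions are driven by actually observed data.
    step_observed = observed.any(axis=1)
    scores[:, ~step_observed] = -1e9
    attn = softmax(scores, axis=-1)           # (T, T) attention weights
    recon = attn @ v                          # reconstructed series
    # Keep observed values as-is; fill only the missing entries.
    return observed * x + (1 - observed) * recon
```

The same mechanism applied to the transposed series (attention over variables rather than time steps) gives the cross-variable branch the abstract refers to; DCSAI combines such representations through learned weighted combinations.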

History

Publication Date

2023-11-04

Journal

Knowledge-Based Systems

Volume

279

Article Number

110896

Pagination

10 pages

Publisher

Elsevier

ISSN

0950-7051

Rights Statement

© 2023 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
