Posted on 2023-07-05, 05:23. Authored by T. Nguyen, Hien Nguyen, F. Chamroukhi, F. Forbes.
Mixture of experts (MoE) models are a popular class of statistical and machine learning models that have gained attention over the years due to their flexibility and efficiency. In this work, we consider Gaussian-gated localized MoE (GLoME) and block-diagonal covariance localized MoE (BLoME) regression models to represent nonlinear relationships in heterogeneous data with potential hidden graph-structured interactions between high-dimensional predictors. These models pose difficult statistical estimation and model selection questions, from both a computational and a theoretical perspective. This paper is devoted to the problem of model selection among a collection of GLoME or BLoME models characterized by the number of mixture components, the complexity of the Gaussian mean experts, and the hidden block-diagonal structures of the covariance matrices, within a penalized maximum likelihood estimation framework. In particular, we establish non-asymptotic risk bounds that take the form of weak oracle inequalities, provided that lower bounds on the penalties hold. The good empirical behavior of our models is then demonstrated on synthetic and real datasets.
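To make the model-selection idea concrete, below is a minimal sketch, not the paper's method. It assumes NumPy and scikit-learn are available, uses a small synthetic two-regime regression dataset (all variable names are illustrative), and substitutes BIC for the paper's theoretically calibrated penalties. It relies on the standard construction that the conditional density of Y given X under a joint Gaussian mixture on (X, Y) is a Gaussian-gated mixture of experts, which is the GLoME setting.

```python
# Minimal sketch: penalized-likelihood selection of the number of mixture
# components for a Gaussian-gated MoE, via a joint Gaussian mixture on (X, Y).
# BIC is used here as a stand-in penalty; the paper derives sharper penalties
# with non-asymptotic oracle guarantees.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic heterogeneous data: two hidden regression regimes.
n = 500
x = rng.uniform(-3, 3, size=(n, 1))
regime = (x[:, 0] > 0).astype(int)
y = np.where(regime == 1, 2.0 * x[:, 0] + 1.0, -x[:, 0] - 1.0)
y = (y + rng.normal(scale=0.3, size=n)).reshape(-1, 1)
z = np.hstack([x, y])  # joint sample (X, Y)

# Fit candidate models and select K by minimizing the penalized criterion.
candidates = range(1, 6)
models = {
    k: GaussianMixture(n_components=k, covariance_type="full",
                       random_state=0).fit(z)
    for k in candidates
}
best_k = min(candidates, key=lambda k: models[k].bic(z))
print("Selected number of mixture components:", best_k)
```

On this toy data the criterion typically recovers two components, one per regression regime; the paper's penalties additionally account for the complexity of the mean experts and the block-diagonal covariance structures, which this sketch does not model.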
Funding
TrungTin Nguyen is supported by a "Contrat doctoral" from the French Ministry of Higher Education and Research. Faicel Chamroukhi is funded by the French National Research Agency (ANR) grant SMILES ANR-18-CE40-0014. Hien Duy Nguyen is funded by Australian Research Council grant number DP180101192. This research is funded directly by the Inria LANDER project.
History
Publication Date
2022-06-01
Journal
Electronic Journal of Statistics
Volume
16
Issue
2
Pagination
pp. 4742-4822
Publisher
Institute of Mathematical Statistics and Bernoulli Society