On the asymptotic behavior of the leading eigenvector of Tyler’s shape estimator under weak identifiability

Abstract

We consider point estimation in an elliptical Principal Component Analysis framework. More precisely, we focus on the problem of estimating the leading eigenvector β₁ of the corresponding shape matrix. We consider this problem under asymptotic scenarios that allow the difference rₙ = λₙ₁ − λₙ₂ between the two largest eigenvalues of the underlying shape matrix to converge to zero as the sample size n diverges to infinity. Such scenarios make the problem of estimating β₁ challenging, since this leading eigenvector is then not identifiable in the limit. In this framework, we study the asymptotic behavior of β̂₁, the leading eigenvector of Tyler's M-estimator of shape. We show that consistency and asymptotic normality survive scenarios where √n·rₙ diverges to infinity with n, although the faster the sequence (rₙ) converges to zero, the poorer the corresponding consistency rate is. We also prove that consistency is lost if rₙ = O(1/√n), but that β̂₁ still bears some information on β₁ when √n·rₙ converges to a positive constant. When √n·rₙ diverges to infinity, we provide asymptotic confidence zones for β₁ based on β̂₁. Our non-standard asymptotic results are supported by Monte Carlo exercises.
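As a minimal illustration of the estimand, the following sketch computes Tyler's M-estimator of shape by its standard fixed-point iteration and extracts the leading eigenvector β̂₁. The function name `tyler_shape`, the tolerance, and the simulated Gaussian design (one particular elliptical distribution, with known location zero and an eigenvalue gap r = λ₁ − λ₂ = 1) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def tyler_shape(X, max_iter=200, tol=1e-8):
    """Tyler's M-estimator of shape via fixed-point iteration.

    X: (n, p) data matrix, assumed centered at the known location.
    Returns the shape estimate, normalized to have trace p.
    """
    n, p = X.shape
    V = np.eye(p)
    for _ in range(max_iter):
        Vinv = np.linalg.inv(V)
        # Squared "Mahalanobis" norms x_i' V^{-1} x_i
        w = np.einsum('ij,jk,ik->i', X, Vinv, X)
        # Fixed-point map: (p/n) * sum_i x_i x_i' / (x_i' V^{-1} x_i)
        V_new = (p / n) * (X.T * (1.0 / w)) @ X
        V_new *= p / np.trace(V_new)  # trace normalization
        if np.linalg.norm(V_new - V, ord='fro') < tol:
            return V_new
        V = V_new
    return V

rng = np.random.default_rng(0)
p, n = 3, 2000
# Shape matrix with leading eigenvector e1 and eigenvalue gap 2 - 1 = 1
Sigma = np.diag([2.0, 1.0, 1.0])
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
V_hat = tyler_shape(X)
eigvals, eigvecs = np.linalg.eigh(V_hat)
beta1_hat = eigvecs[:, -1]  # leading eigenvector (eigh sorts ascending)
# Agreement with the true beta_1 = e1, up to sign; typically close to 1
print(abs(beta1_hat[0]))
```

In the weak-identifiability regimes studied in the paper, the gap above would instead shrink with n (e.g. rₙ ∝ n^{-a}), degrading the alignment of β̂₁ with β₁ as the abstract describes.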

Publication
Robust and Multivariate Statistical Methods: Festschrift in Honor of David E. Tyler