On the asymptotic behavior of the leading eigenvector of Tyler’s shape estimator under weak identifiability

Abstract

We consider point estimation in an elliptical Principal Component Analysis framework. More precisely, we focus on the problem of estimating the leading eigenvector $\boldsymbol{\beta}_1$ of the corresponding shape matrix. We consider this problem under asymptotic scenarios that allow the difference $r_n := \lambda_{n1}-\lambda_{n2}$ between the two largest eigenvalues of the underlying shape matrix to converge to zero as the sample size $n$ diverges to infinity. Such scenarios make the problem of estimating $\boldsymbol{\beta}_1$ challenging, since this leading eigenvector is then not identifiable in the limit. In this framework, we study the asymptotic behavior of $\hat{\boldsymbol{\beta}}_1$, the leading eigenvector of Tyler’s M-estimator of shape. We show that consistency and asymptotic normality survive in scenarios where $\sqrt{n}r_n$ diverges to infinity with $n$, although the faster the sequence $(r_n)$ converges to zero, the slower the corresponding consistency rate. We also prove that consistency is lost if $r_n=O(1/\sqrt{n})$, but that $\hat{\boldsymbol{\beta}}_1$ still carries some information about $\boldsymbol{\beta}_1$ when $\sqrt{n}r_n$ converges to a positive constant. When $\sqrt{n}r_n$ diverges to infinity, we provide asymptotic confidence zones for $\boldsymbol{\beta}_1$ based on $\hat{\boldsymbol{\beta}}_1$. Our non-standard asymptotic results are supported by Monte Carlo exercises.
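
For readers who want to experiment with these regimes, below is a minimal Python sketch (not the authors' code) of Tyler's M-estimator of shape via its standard fixed-point iteration, followed by a toy Monte Carlo draw in which the eigengap shrinks as $r_n = c/\sqrt{n}$. The function name `tyler_shape`, the Gaussian data-generating choice (one member of the elliptical family), and all tuning constants are illustrative assumptions.

```python
import numpy as np

def tyler_shape(X, tol=1e-8, max_iter=500):
    """Tyler's M-estimator of shape, normalized so that trace = p.

    X: (n, p) array of observations, assumed centered (known location).
    Iterates V <- (p/n) * sum_i x_i x_i' / (x_i' V^{-1} x_i) to a fixed point.
    """
    n, p = X.shape
    V = np.eye(p)
    for _ in range(max_iter):
        Vinv = np.linalg.inv(V)
        # Quadratic forms x_i' V^{-1} x_i for all observations at once
        w = np.einsum('ij,jk,ik->i', X, Vinv, X)
        V_new = (p / n) * (X / w[:, None]).T @ X
        V_new *= p / np.trace(V_new)  # restore the trace-p normalization
        if np.linalg.norm(V_new - V, ord='fro') < tol:
            V = V_new
            break
        V = V_new
    return V

# Toy weak-identifiability setup: eigengap r_n = c / sqrt(n)
rng = np.random.default_rng(0)
n, p, c = 2000, 5, 1.0
r_n = c / np.sqrt(n)
eigvals = np.array([1.0 + r_n, 1.0] + [0.5] * (p - 2))
# Diagonal shape matrix: true leading eigenvector is the first basis vector
X = rng.multivariate_normal(np.zeros(p), np.diag(eigvals), size=n)

V_hat = tyler_shape(X)
evals, evecs = np.linalg.eigh(V_hat)   # eigenvalues in ascending order
beta1_hat = evecs[:, -1]               # leading eigenvector of Tyler's estimator
beta1 = np.eye(p)[:, 0]                # true leading eigenvector
print("cosine of angle between estimate and truth:", abs(beta1_hat @ beta1))
```

Under this design, repeating the draw over many replications and comparing the regimes $\sqrt{n}r_n \to \infty$ versus $r_n = O(1/\sqrt{n})$ should reproduce the qualitative picture described in the abstract: the cosine concentrates near one in the first regime and stays bounded away from one in the second.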

Publication
Robust and Multivariate Statistical Methods: Festschrift in Honor of David E. Tyler