We consider point estimation in an elliptical Principal Component Analysis framework. More precisely, we focus on the problem of estimating the leading eigenvector $\boldsymbol{\theta}_1$ of the corresponding shape matrix. We consider this problem under asymptotic scenarios that allow the difference $r_n:=\lambda_{n1}-\lambda_{n2}$ between the two largest eigenvalues of the underlying shape matrix to converge to zero as the sample size $n$ diverges to infinity. Such scenarios make the problem of estimating $\boldsymbol{\theta}_1$ challenging, since this leading eigenvector is then not identifiable in the limit. In this framework, we study the asymptotic behavior of $\hat{\boldsymbol{\theta}}_1$, the leading eigenvector of Tyler's M-estimator of shape. We show that consistency and asymptotic normality survive in scenarios where $\sqrt{n}r_n$ diverges to infinity with $n$, although the faster the sequence $(r_n)$ converges to zero, the poorer the corresponding consistency rate. We also prove that consistency is lost if $r_n=O(1/\sqrt{n})$, but that $\hat{\boldsymbol{\theta}}_1$ still bears some information on $\boldsymbol{\theta}_1$ when $\sqrt{n}r_n$ converges to a positive constant. When $\sqrt{n}r_n$ diverges to infinity, we provide asymptotic confidence zones for $\boldsymbol{\theta}_1$ based on $\hat{\boldsymbol{\theta}}_1$. Our non-standard asymptotic results are supported by Monte Carlo exercises.
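The estimator $\hat{\boldsymbol{\theta}}_1$ discussed above can be computed in practice via the standard fixed-point iteration for Tyler's M-estimator of shape, followed by an eigendecomposition. The sketch below is purely illustrative and not the paper's code: the sample size, dimension, spiked covariance, and Gaussian (hence elliptical) data-generating process are all hypothetical choices, and the data are assumed centered at a known location (the origin).

```python
import numpy as np

def tyler_shape(X, tol=1e-8, max_iter=500):
    """Fixed-point iteration for Tyler's M-estimator of shape.

    X: (n, p) array of observations centered at the origin.
    Returns the shape estimate, normalized to have trace p.
    """
    n, p = X.shape
    V = np.eye(p)
    for _ in range(max_iter):
        Vinv = np.linalg.inv(V)
        # weights x_i' V^{-1} x_i for each observation
        w = np.einsum('ij,jk,ik->i', X, Vinv, X)
        V_new = (p / n) * (X / w[:, None]).T @ X
        V_new *= p / np.trace(V_new)  # trace-p normalization
        if np.linalg.norm(V_new - V, ord='fro') < tol:
            return V_new
        V = V_new
    return V

def leading_eigenvector(V):
    """Unit-norm leading eigenvector of a symmetric matrix."""
    _, vecs = np.linalg.eigh(V)  # eigenvalues in ascending order
    return vecs[:, -1]

# Hypothetical illustration: Gaussian data with a spiked diagonal
# covariance, so the true leading eigenvector theta_1 is e_1 and the
# eigenvalue gap r_n = 3.0 - 1.5 is bounded away from zero.
rng = np.random.default_rng(0)
n, p = 500, 5
Sigma = np.diag([3.0, 1.5, 1.0, 1.0, 1.0])
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
V_hat = tyler_shape(X)
theta1_hat = leading_eigenvector(V_hat)
print(abs(theta1_hat[0]))  # close to 1 when theta_1 is well identified
```

In the challenging regimes studied in the paper one would instead let the gap between the two leading diagonal entries shrink with $n$ (e.g. at rate $1/\sqrt{n}$), which degrades the accuracy of `theta1_hat`.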