Sankhya: The Indian Journal of Statistics

1995, Volume 57, Series A, Pt. 3, pp. 384--392

A NECESSARY CONDITION FOR THE CONSISTENCY OF $L_1$-ESTIMATES

By

X. R. CHEN, University of Science and Technology of China

Y. WU, York University

and

L. C. ZHAO, University of Science and Technology of China

SUMMARY.   Let $y_i = \mathbf{x}_i'\beta_0 + e_i$, $i = 1, \ldots, n$, be a linear model, where $\mathbf{x}_1, \mathbf{x}_2, \ldots$ are known $p$-vectors and $e_1, e_2, \ldots$ are independent random errors with median zero. Write $T_n^2 = \|\mathbf{x}_1\|^2 + \cdots + \|\mathbf{x}_n\|^2$ and denote by $\hat{\beta}_n$ an $L_1$ estimate of $\beta_0$, that is, $\hat{\beta}_n$ minimizes $\sum_{i=1}^{n} |y_i - \mathbf{x}_i'\beta|$. It is shown that if there exist constants $l_1 > 0$ and $l_2 > 0$ such that $P(-h \leq e_i \leq 0) \leq l_1 h \geq P(0 \leq e_i \leq h)$ for $i \geq 1$ and $0 \leq h \leq l_2$, then $T_n(\hat{\beta}_n - \beta_0)$ cannot be $o_p(1)$. In particular, if $\sum_{i=1}^{\infty} \|\mathbf{x}_i\|^2 < \infty$, then $\hat{\beta}_n$ is not weakly consistent. It is also shown that the condition imposed on $\{e_i\}$ concerning the existence of $l_1$ and $l_2$ is necessary as well for the truth of this result.
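To make the setting concrete in the scalar case ($p = 1$), the following is a minimal illustrative sketch, not part of the paper: when $\beta$ is scalar, minimizing $\sum_i |y_i - x_i\beta| = \sum_i |x_i|\,|y_i/x_i - \beta|$ reduces to taking a weighted median of the ratios $y_i/x_i$ with weights $|x_i|$. The design $x_i = 1/i$ below is a hypothetical choice satisfying $\sum_i x_i^2 < \infty$, and uniform errors on $(-1, 1)$ are one example of median-zero errors meeting the density condition on $\{e_i\}$.

```python
import random

def weighted_median(values, weights):
    # Return a minimizer v of sum_i weights[i] * |v - values[i]|:
    # the smallest value at which the cumulative weight reaches half
    # of the total weight.
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum >= total / 2:
            return v

def l1_estimate(xs, ys):
    # For the scalar model y_i = x_i * beta + e_i (x_i != 0), the L1
    # estimate minimizing sum |y_i - x_i * beta| is the weighted
    # median of y_i / x_i with weights |x_i|.
    ratios = [y / x for x, y in zip(xs, ys)]
    weights = [abs(x) for x in xs]
    return weighted_median(ratios, weights)

random.seed(0)
beta0 = 2.0
n = 5000
xs = [1.0 / i for i in range(1, n + 1)]          # sum of x_i^2 is finite
es = [random.uniform(-1.0, 1.0) for _ in range(n)]  # median-zero errors
ys = [x * beta0 + e for x, e in zip(xs, es)]
print(l1_estimate(xs, ys))
```

With noiseless data the weighted median recovers $\beta_0$ exactly; with errors and this summable design, the paper's result says the estimate does not concentrate around $\beta_0$ as $n$ grows.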

AMS (1991) subject classification.   62J05, 62F35, 62G05.

Key words and phrases. $L_1$ estimate, linear regression model, consistency.
