Sankhya: The Indian Journal of Statistics

2000, Volume 62, Series A, Pt. 2, 193--202

IDENTIFIABILITY OF DISTRIBUTIONS OF INDEPENDENT RANDOM VARIABLES BY LINEAR COMBINATIONS AND MOMENTS

By

G. J. SZÉKELY, Bowling Green State University, Ohio
Alfréd Rényi Institute, Hungarian Academy of Sciences

and

C.R. RAO, Pennsylvania State University

SUMMARY. Let $X_1, X_2, \ldots, X_n$ be independent random variables. Given the moments $EX_j^s$ $(s=1,2,\ldots,m;\ j=1,2,\ldots,n)$, the joint distribution function of the linear forms $$Y_i=\sum_{j=1}^n a_{ij}X_j, \qquad i=1,2,\ldots,k,$$ with an arbitrary nonvanishing joint characteristic function uniquely determines the distributions of $X_1, X_2, \ldots, X_n$ (with trivial exceptions) iff $n \le \binom{k+m}{m+1}$. For example, four moments and four linear combinations under general conditions (specified later) determine the distributions of $n=56$ independent random variables, but not of 57.
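To see how the stated example follows from the bound, take $k=4$ linear combinations and $m=4$ moments; the binomial coefficient then evaluates as $$\binom{k+m}{m+1}=\binom{8}{5}=\frac{8!}{5!\,3!}=56,$$ so the distributions of up to 56 independent random variables are determined, but not of 57.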

AMS (1991) subject classification. Primary 62E10; secondary 60E10.

Key words and phrases. Characterization of distributions; functional equations; linear structural relations; moments; nonvanishing characteristic functions.
