Title: Asymptotics for Regression Models Under Loss of Identifiability
Author(s): Joseph Rynkiewicz
Pages: 155 -- 179
This paper discusses the asymptotic behavior of regression models under general conditions, especially when the set of true parameters has dimension larger than zero and the true model is therefore not identifiable. First, we give a general inequality for the difference between the sum of squared errors (SSE) of the estimated regression model and the SSE of the theoretical true regression function. A set of generalized derivative functions is the key tool in deriving this inequality. Under a suitable Donsker condition for this set, we provide the asymptotic distribution of the SSE difference. We show how to obtain this Donsker property for parametric models even when the parameters characterizing the best regression function are not unique. This result is applied to neural network regression models with redundant hidden units, where loss of identifiability occurs, and gives some hints on how to penalize such models to avoid over-fitting.
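The loss of identifiability mentioned above can be made concrete with a minimal sketch (not from the paper; the network architecture and parameter values are illustrative assumptions): a one-hidden-layer network with two hidden units fitting a target that needs only one unit admits many distinct parameter vectors realizing the same regression function, so the set of true parameters has positive dimension.

```python
import numpy as np

def mlp(x, params):
    """One-hidden-layer network: f(x) = sum_i a_i * tanh(b_i * x + c_i)."""
    a, b, c = params
    return np.tanh(np.outer(x, b) + c) @ a

x = np.linspace(-3, 3, 101)

# The target regression function needs only a single hidden unit.
target = np.tanh(2 * x + 1)

# Two distinct parameterizations of a 2-unit model that both realize it:
# (i) the two units duplicate each other and split the output weight,
p1 = (np.array([0.3, 0.7]), np.array([2.0, 2.0]), np.array([1.0, 1.0]))
# (ii) the second unit is switched off, its inner weights arbitrary.
p2 = (np.array([1.0, 0.0]), np.array([2.0, -5.0]), np.array([1.0, 3.0]))

# Both parameter vectors give the same function, hence zero SSE:
sse1 = np.sum((mlp(x, p1) - target) ** 2)
sse2 = np.sum((mlp(x, p2) - target) ** 2)
print(np.isclose(sse1, 0.0), np.isclose(sse2, 0.0))  # True True
```

Because such continuous families of equivalent parameters exist, classical asymptotics based on a unique true parameter fail, which is the situation the paper's generalized-derivative approach addresses.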