Abstract
Byeong Park, Seoul National University
"Smooth backfitting in errors-in-variables additive models"
We study nonparametric additive regression models in which the covariates are contaminated by measurement errors. We introduce a new deconvolution normalized kernel that is suitable for smooth backfitting of the additive component functions in the presence of errors-in-variables. We prove that the smooth backfitting iterative algorithm converges and that the smooth backfitting estimators of the component functions achieve univariate deconvolution accuracy. We find that the effect of contamination on smooth backfitting is confined to a term whose magnitude is negligible, with an accelerated rate, for a certain range of smoothness of the measurement error distributions. For this class of measurement error distributions, the component function estimators are asymptotically normal with the oracle variance, that is, the variance attainable when the other component functions are known. We present finite sample properties of the deconvolution smooth backfitting estimators in comparison with a naive application of the standard smooth backfitting technique that ignores measurement errors. A Monte Carlo simulation demonstrates that the proposed method gives smaller mean integrated squared errors than the naive one.
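For readers unfamiliar with deconvolution smoothing, the following is a minimal sketch of the classical deconvolution kernel construction that underlies errors-in-variables kernel estimation; it is background only and is not the paper's new normalized kernel for smooth backfitting. With contaminated observations $W_j = X_j + U_j$, an ordinary kernel $K$ with Fourier transform $\phi_K$, and a measurement error characteristic function $\phi_U$ assumed known, the classical deconvolution kernel and the corresponding density estimator are
\[
K_U(x; h) = \frac{1}{2\pi} \int e^{-itx}\,\frac{\phi_K(t)}{\phi_U(t/h)}\, dt,
\qquad
\hat{f}_X(x) = \frac{1}{nh} \sum_{j=1}^{n} K_U\!\Big(\frac{x - W_j}{h}; h\Big),
\]
where $h$ is the bandwidth. Dividing by $\phi_U(t/h)$ undoes the blurring induced by the measurement error in the Fourier domain; the smoothness of the error distribution governs how fast $\phi_U$ decays and hence the attainable deconvolution rate referred to in the abstract.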