
Robust Statistical Modeling through Nonparametric Bayesian Methods

Posted on: 2011-06-13 | Degree: Ph.D. | Type: Thesis
University: The Ohio State University | Candidate: Lee, Ju Hee | Full Text: PDF
GTID: 2440390002456789 | Subject: Statistics
Abstract/Summary:
Nonparametric Bayesian models are commonly used to obtain robust statistical inference, and the most popular nonparametric Bayesian model is, arguably, the mixture of Dirichlet processes (MDP) model. In this study, we examine the question of how to obtain greater robustness than a conventional MDP model provides. In answer to this question, we develop two models from a nonparametric Bayesian viewpoint and investigate their properties: (i) the limiting Dirichlet process (limdir) model, and (ii) the local-mass preserving mixture of Dirichlet processes (LMDP) model.

The limdir model addresses the question of how to perform a "noninformative" nonparametric Bayesian analysis. Rather than being noninformative, the model requires a small amount of input, and so provides a minimally informative prior distribution with which to conduct a nonparametric Bayesian analysis. The limdir prior distribution can be viewed as the limit of a sequence of mixture of Dirichlet processes models. Despite its modest input requirements, it yields posterior behavior with a number of important qualitative features, including robustness.

The LMDP prior distribution focuses on local mass (defined in the thesis). To specify such a prior distribution, we carefully consider the behavior of the parameters of interest in some small region and then select a prior distribution which preserves mass in that region. Local mass preservation ties the mass of the base measure to its dispersion, resulting in robust inference. These two strategies for constructing a prior distribution can be applied to any model based on the Dirichlet process, and calibration of the prior distribution is considered. We use the limdir model for the compound decision problem and the one-way analysis of variance problem, and compare its performance to that of mixture of Dirichlet processes models and of parametric Bayesian models on actual data sets. We apply the LMDP model to the one-way analysis of variance problem and compare its performance to that of a mixture of Dirichlet processes model with a conventional prior structure.

In addition to developing these robust nonparametric Bayesian models, in the latter part of the study we describe a general form of consistency which does not require correct specification of the likelihood. We carefully investigate issues of consistency and inconsistency for a variety of functions of interest, such as equality of subsets of treatment means, without the assumption that the model is correct. We prove that Bayes estimators achieve (asymptotic) consistency under suitable regularity conditions on the assumed likelihood. More importantly, we find a need to distinguish between the notions of two parameters being "equal to one another" and "close to one another", and we illustrate the differences in asymptotic inference for these two statements. This distinction carries implications for Bayesian tests of a point null hypothesis.
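All of the models above build on the Dirichlet process. As a point of reference only, and not as the thesis's limdir or LMDP construction, the Python sketch below draws a random measure from a generic DP(alpha, G0) prior via a truncated stick-breaking representation and then simulates data from the corresponding Dirichlet process mixture of normals; the concentration alpha, the normal base measure G0, its dispersion tau, and the truncation level are illustrative assumptions, not values from the study.

import numpy as np

rng = np.random.default_rng(0)

def stick_breaking(alpha, num_atoms, base_draw):
    # Truncated stick-breaking draw of G = sum_k w_k * delta(theta_k), with G ~ DP(alpha, G0).
    betas = rng.beta(1.0, alpha, size=num_atoms)                       # stick-breaking proportions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))  # prod_{j<k} (1 - beta_j)
    weights = betas * remaining                                        # w_k = beta_k * prod_{j<k} (1 - beta_j)
    atoms = base_draw(num_atoms)                                       # atom locations theta_k drawn from G0
    return weights / weights.sum(), atoms                              # renormalize the truncated weights

tau = 5.0  # illustrative dispersion of the base measure

def base_draw(k):
    # Illustrative base measure G0 = N(0, tau^2).
    return rng.normal(0.0, tau, size=k)

# One random measure G ~ DP(alpha = 1, G0), truncated at 50 atoms.
weights, atoms = stick_breaking(alpha=1.0, num_atoms=50, base_draw=base_draw)

# Simulated data from the corresponding mixture of normals:
#   theta_i | G ~ G,   y_i | theta_i ~ N(theta_i, 1).
n = 20
theta = rng.choice(atoms, size=n, p=weights)
y = rng.normal(theta, 1.0)

print("distinct components used by the sample:", len(np.unique(theta)))
print("first few simulated observations:", np.round(y[:5], 2))

Read in this notation, the local-mass idea in the abstract concerns the link between the mass of the base measure (alpha here) and its dispersion (tau here).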
Keywords/Search Tags: Bayesian, Model, Robust, Prior distribution, Inference, Dirichlet process