Penalized likelihood regression comprises a class of widely used regularization methods, including regression splines and the LASSO. This dissertation has two major parts, both within the framework of penalized likelihood regression.

The first part presents a direct extension of penalized likelihood regression with an RKHS penalty to the situation in which the observed covariates are probability distributions. We prove that the penalized likelihood estimate exists under a mild condition. For computation, we propose a dimension-reduction technique to minimize the penalized likelihood and derive a GACV score to choose the smoothing parameter. Our methods apply directly to incomplete-data problems such as covariate measurement error and partially missing covariates.

The second part concerns estimating the degrees of freedom for penalized likelihood regression. We show that the degrees of freedom can be estimated by (1) the trace of the influence matrix of the mean and (2) the GACV. With these results in hand, various model selection criteria (AIC, BIC, GACV, and BGACV) are available for selecting the regularization parameter. Our methods extend to the variable selection problem with multivariate Bernoulli observations.
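For orientation, the penalized likelihood objective common to both parts can be written in generic form (the notation here is illustrative, not the dissertation's):

```latex
% Minimize a negative log-likelihood plus a roughness penalty over an RKHS H:
%   y_i  : responses,  f : unknown regression function,  \lambda > 0 : smoothing parameter
\min_{f \in \mathcal{H}} \; -\frac{1}{n} \sum_{i=1}^{n} \ell\big(y_i, f(x_i)\big) \;+\; \lambda J(f),
% where \ell is the log-likelihood term and J(f) is typically the squared RKHS norm
% J(f) = \|P_1 f\|_{\mathcal{H}}^2 of the "smooth" component of f.
```

The smoothing parameter \(\lambda\) trades fidelity to the data against smoothness; choosing it is the role of criteria such as GACV discussed above.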
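To make the trace-of-the-influence-matrix idea concrete, here is a minimal sketch for the special case of a quadratic (ridge-type) penalty with Gaussian data, where the fitted mean is exactly linear in the response, \(\hat{y} = S(\lambda)y\). This is an illustration of the general principle only, not the dissertation's estimator; all names and the ridge setup are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))  # toy design matrix
lam = 2.0                        # regularization (smoothing) parameter

# With a quadratic penalty lam * ||beta||^2 and Gaussian likelihood, the fit
# is linear in y with influence (smoother) matrix
#   S(lam) = X (X'X + lam I)^{-1} X'.
S = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)

# Effective degrees of freedom = trace of the influence matrix of the mean.
df = np.trace(S)
print(df)  # lies between 0 and p; shrinks toward 0 as lam grows
```

As `lam` decreases toward zero, `df` approaches `p` (the unpenalized least-squares degrees of freedom), illustrating how the trace interpolates between a fully regularized and an unregularized fit.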