
Reliability inference based on degradation and time to failure data: Some models, methods and efficiency comparisons

Posted on: 2005-09-09
Degree: Ph.D
Type: Dissertation
University: University of Michigan
Candidate: Gopikrishnan, Ajita
GTID: 1452390008495773
Subject: Statistics
Abstract/Summary:
There has been considerable interest in quality and reliability improvement methods among researchers and practitioners over the last twenty years. The traditional approach to reliability estimation is based on the analysis of time-to-failure data. However, very high-reliability applications yield few failures, so reliability assessment and improvement during product development is a challenging task. Fortunately, recent advances in sensing and measurement technologies, as well as in computing power, have made it easier to collect and analyze degradation and other performance-related data.

This dissertation considers several models for degradation data that lead to tractable failure-time distributions. Failure time is defined as the first passage time to a specified degradation threshold. The models are: (a) random/mixed-effects models with random slope and/or intercept, and (b) a Wiener process with linear drift. The data sources include one or more of the following: (i) failure-time data with right or interval censoring; (ii) regular degradation data, where measurements remain observable even after the degradation process exceeds the threshold; and (iii) failure-censored degradation data, where measurements are available only up to the time of failure, together with information about the interval in which failure occurred. The last case is particularly interesting and leads to new inference problems involving censored degradation data.

The dissertation studies methods of inference based on maximum likelihood estimation of the underlying model parameters. The maximum likelihood estimates (MLEs) are then used to estimate quantities of interest associated with the failure-time distribution, such as quantiles or design life. For the random-effects models, both direct maximization of the likelihood and the EM algorithm are studied as ways of computing the MLEs.
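To make the first-passage definition of failure time concrete, the following is a minimal sketch (not the dissertation's code) of a linear random-effects degradation model: each unit's degradation path has a random intercept and random slope, and the unit "fails" when the noise-free path crosses a fixed threshold. All parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a linear mixed-effects degradation model:
# unit i degrades as y_i(t) = (b0 + u0_i) + (b1 + u1_i) * t, with random
# intercept u0_i and random slope u1_i. All values are illustrative only.
b0, b1 = 0.0, 1.5          # fixed effects (initial level, mean degradation rate)
sd0, sd1 = 0.2, 0.3        # std devs of random intercept and random slope
threshold = 10.0           # degradation level defining "failure"
n_units = 10_000

u0 = rng.normal(0.0, sd0, n_units)
u1 = rng.normal(0.0, sd1, n_units)
intercept = b0 + u0
slope = b1 + u1

# First-passage time of the linear path to the threshold; units whose
# slope is non-positive never reach the threshold in this model.
fails = slope > 0
T = (threshold - intercept[fails]) / slope[fails]

print(f"fraction eventually failing: {fails.mean():.3f}")
print(f"median failure time: {np.median(T):.2f}")
```

Because the path is linear, the failure-time distribution is induced directly by the joint distribution of the random intercept and slope, which is what makes this family of models tractable.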
Although the EM algorithm is not the most efficient method computationally, the structure of the conditional distributions in the E-step provides useful insight into the information content of, and trade-offs among, the three data sources. The information matrices for the various settings are also derived and used to compare the efficiencies of the quantile MLEs across data sources. The problem of predicting failure at the device level is also considered; degradation data have a clear advantage over failure-time data in this problem. The estimation and prediction problems for the Wiener process model differ considerably from those for the finite-dimensional random-effects models. Various extensions, including nonlinear degradation shapes and the presence of multiple failure modes, are also briefly discussed.
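For the Wiener-process model, a standard fact is that the first-passage time of a Wiener process with positive linear drift to a fixed threshold follows an inverse Gaussian distribution. The sketch below (illustrative only, with assumed parameter values, not the dissertation's implementation) simulates one degradation path observed on a regular time grid, computes the closed-form MLEs of the drift and diffusion from the Gaussian increments, and plugs them into the inverse Gaussian distribution to estimate a low quantile of the failure-time distribution, a common "design life" summary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative setup: degradation Y(t) = nu*t + sigma*W(t); failure is the
# first passage of Y to threshold a, so T ~ inverse Gaussian with
# mean a/nu and shape parameter a^2/sigma^2.
nu_true, sigma_true, a = 2.0, 0.5, 10.0
dt = 0.1
n_steps = 2000

# One long path observed on a regular grid: independent Gaussian increments.
increments = nu_true * dt + sigma_true * np.sqrt(dt) * rng.normal(size=n_steps)

# Closed-form MLEs from the increments.
nu_hat = increments.sum() / (n_steps * dt)
sigma2_hat = np.mean((increments - nu_hat * dt) ** 2) / dt

# Plug-in estimate of the 10th percentile of the failure-time distribution.
# SciPy's invgauss(mu=m, scale=lam) is IG(mean=m*lam, shape=lam).
mean_T = a / nu_hat
lam = a ** 2 / sigma2_hat
t10 = stats.invgauss.ppf(0.10, mu=mean_T / lam, scale=lam)
print(f"nu_hat={nu_hat:.3f}, sigma2_hat={sigma2_hat:.3f}, t10={t10:.2f}")
```

Unlike the random-effects models, where inference targets a finite-dimensional distribution of unit-specific coefficients, here the MLEs summarize the increments of a stochastic process, which is why the estimation and prediction problems take a different form.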
Keywords/Search Tags: Failure, Degradation, Data, Models, Reliability, Time, Methods, Inference