
Gross Error Detection Method Based on Measurement Data Redundancy

Posted on: 2014-01-14    Degree: Master    Type: Thesis
Country: China    Candidate: W W Song    Full Text: PDF
GTID: 2248330395477454    Subject: Control Science and Engineering
Abstract/Summary:
Reliable process data are the key to the efficient operation of chemical plants. With the increasing use of on-line digital computers, large amounts of data are acquired and used for on-line optimization and control, and these data are frequently used to improve process performance. Because many factors influence an actual chemical process, the measurements obtained often contain errors. Inaccurate and unreliable measurements cannot correctly reflect the true state of the process; this is the so-called measurement data imbalance. Data reconciliation is the method used to solve this problem (a standard formulation is sketched below). Generally speaking, data reconciliation is applied to steady-state chemical processes; however, because of the influence of instruments, equipment and other factors in the plant, obtaining steady data is itself a prerequisite for data reconciliation and one of the important conditions for it to proceed smoothly.

This thesis studies gross error detection based on temporal and spatial redundancy. Under temporal redundancy, the effect of data accumulation is considered, and improved wavelet packet and improved Bayesian methods are applied respectively. Under spatial redundancy, gross errors in the measurement network are the main concern: the traditional detection method is improved, the influence of the measurement variance is taken into account, and the data correction process is completed with an improved NT-MT method. On the basis of the Bayesian and wavelet packet methods, combined with the improved NT-MT method, the data correction technique is modified. Measurement data are thus corrected in both time and space, reducing the influence of measurement errors as far as possible, so that a set of reliable, accurate, steady data can be obtained.

The main contents of this thesis are as follows:
1) The principles and contents of data reconciliation are discussed, in particular the basic concepts and main methods of data reconciliation and gross error detection.
2) An improved wavelet packet method for removing random errors under temporal redundancy is proposed. The Shannon criterion and the heursure threshold rule are combined to decompose the signal into wavelet packets and denoise it, and simulation experiments on gross error detection are carried out (a denoising sketch is given below).
3) A Bayesian classification approach under temporal redundancy is proposed. Historical data from the database are taken as prior information, the posterior probability is computed with the Bayesian rule, and data containing errors are removed. This method reduces data noise quickly, and its advantages are analyzed through simulation (a posterior sketch is given below).
4) Data under spatial redundancy are analyzed. The traditional gross error detection methods are discussed, and on this basis an improved NT-MT method is proposed, together with an improved calculation of the measurement covariance. Finally, a simulation combined with the control process demonstrates the feasibility and validity of the method (a measurement-test sketch is given below).

Finally, the thesis is summarized, data reconciliation technology is analyzed further, and some of the author's own views are presented.
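The conventional steady-state linear data reconciliation problem that the thesis builds on, and the nodal-test/measurement-test statistics behind the NT-MT method, can be stated in their standard textbook form (this is the generic formulation, not the improved variants developed in the thesis):

\[
\hat{x} = \arg\min_{x}\;(y-x)^{\top}\Sigma^{-1}(y-x)
\quad\text{s.t.}\quad Ax = 0,
\qquad
\hat{x} = y - \Sigma A^{\top}\left(A\Sigma A^{\top}\right)^{-1}Ay,
\]

where \(y\) is the measurement vector, \(\Sigma\) its covariance and \(A\) the linear balance-constraint matrix. The nodal test (NT) examines the constraint residuals and the measurement test (MT) examines the adjustments:

\[
r = Ay,\quad V = A\Sigma A^{\top};\qquad
a = y-\hat{x},\quad W = \Sigma A^{\top}V^{-1}A\Sigma,\qquad
z_j = \frac{|a_j|}{\sqrt{W_{jj}}},
\]

and measurement \(j\) is suspected of a gross error when \(z_j\) exceeds a critical value from the standard normal distribution.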
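As a rough illustration of the wavelet-packet denoising step in point 2), the following is a minimal sketch assuming the PyWavelets (pywt) and NumPy libraries; the thesis's Shannon-entropy best-basis selection and heursure threshold rule are simplified here to a full-tree decomposition with a universal soft threshold:

import numpy as np
import pywt

def wavelet_packet_denoise(signal, wavelet="db4", level=3):
    # Full wavelet packet decomposition of the (time-redundant) measurement series.
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet,
                            mode="symmetric", maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    # Noise scale estimated from the detail nodes; the universal threshold
    # stands in for the heursure rule used in the thesis.
    details = np.concatenate([node.data for node in nodes[1:]])
    sigma = np.median(np.abs(details)) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Soft-threshold every detail node (the lowest-frequency node is kept).
    for node in nodes[1:]:
        node.data = pywt.threshold(node.data, threshold, mode="soft")
    return wp.reconstruct(update=True)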
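Point 3) rests on computing a posterior probability that a measurement contains an error; a minimal two-class Gaussian sketch of that idea is shown below (the prior probability and the variance-inflation factor for the faulty class are illustrative assumptions, not values from the thesis):

import numpy as np
from scipy.stats import norm

def gross_error_posterior(residual, sigma, prior_gross=0.05, inflation=10.0):
    # Likelihoods under the "normal" and "gross error" hypotheses; a gross error
    # is modelled here as the same Gaussian with an inflated standard deviation.
    p_normal = norm.pdf(residual, scale=sigma) * (1.0 - prior_gross)
    p_gross = norm.pdf(residual, scale=inflation * sigma) * prior_gross
    return p_gross / (p_normal + p_gross)

# Samples whose posterior exceeds a chosen cutoff (e.g. 0.5) would be removed
# from the history window before reconciliation.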
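The improved NT-MT method of point 4) is not reproduced here, but the conventional measurement test it starts from can be sketched as follows (a simplified NumPy/SciPy illustration; the constraint matrix A, covariance Sigma and significance level alpha are placeholders):

import numpy as np
from scipy.stats import norm

def measurement_test(y, A, Sigma, alpha=0.05):
    # Nodal-test quantities: constraint residuals and their covariance.
    r = A @ y
    V = A @ Sigma @ A.T
    # Measurement adjustments a = y - x_hat and their covariance.
    a = Sigma @ A.T @ np.linalg.solve(V, r)
    W = Sigma @ A.T @ np.linalg.solve(V, A @ Sigma)
    z = np.abs(a) / np.sqrt(np.diag(W))
    # Bonferroni-corrected critical value for n simultaneous tests.
    critical = norm.ppf(1.0 - alpha / (2.0 * len(y)))
    return z > critical, y - a   # suspect flags, reconciled measurements

Flagged measurements are typically removed (or their variances inflated) and the test repeated, which is the iterative structure that NT-MT-type schemes build on.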
Keywords/Search Tags: Discrete-time data reconciliation, Wavelet packet transform, Threshold processing, Bayesian classification, NT-MT method