
Study On Dynamic Data Reconciliation And Gross Error Detection Methods For Chemical Process

Posted on: 2015-11-22  Degree: Master  Type: Thesis
Country: China  Candidate: G X Li  Full Text: PDF
GTID: 2181330467471111  Subject: Chemical engineering
Abstract/Summary:
Reliable and accurate measurements are a basic precondition for equipment control, process simulation, optimization, and management. However, measurements often contain errors, which can be classified as random errors and gross errors. Random errors are mutually independent and follow a normal distribution, whereas gross errors occur rarely and do not follow a normal distribution. The fundamental principle of data reconciliation is to use process redundancy to adjust the data so that the measurements satisfy the mass-balance, energy-balance, and other physical constraints of the process. Depending on the actual operating conditions of an industrial process, data reconciliation techniques can be divided into steady-state data reconciliation and dynamic data reconciliation. In this thesis, the commonly used dynamic data reconciliation methods, including wavelet filtering, robust data reconciliation, and Kalman filtering, are studied and modified.

Wavelet filtering can effectively reduce the random error in measured data from a chemical process, but it cannot identify gross errors in those data. For this reason, this thesis summarizes the relationship among the correction value, the decomposition level, and the gross error, and expresses it as a formula. Based on this formula, a gross error detection method for wavelet filtering is proposed and combined with soft-threshold wavelet filtering. Reconciliation of measurement data simulated from a reaction kettle system shows that the proposed method can effectively detect gross errors in the measurements.

In robust data reconciliation, different algorithms for solving the robust mathematical model lead to different reconciliation results. In this thesis, a particle swarm optimization algorithm is used to solve the robust model, and the orthogonal collocation method is used to handle the constraints of the chemical process. The reconciliation results obtained with particle swarm optimization are then compared with those obtained with sequential quadratic programming. For measurement data produced by a CSTR, solving the robust model with particle swarm optimization gives better reconciliation than solving it with sequential quadratic programming, and for small-scale data the computation times of the two methods are similar.

In the last section, robust estimators are used to modify the variance of the measurements in order to improve the robustness of Kalman filtering. In the variance-correction step, this thesis introduces a parameter and a critical value as a judgment standard that decides whether the variance needs to be modified. The method is applied to the extended Kalman filter and the unscented Kalman filter, respectively. Simulation results for a nonlinear example show that the modified Kalman filters based on robust estimators perform well in gross error detection, and that the modified unscented Kalman filter achieves better data reconciliation than the modified extended Kalman filter.
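As an illustration of the wavelet filtering idea described above, the following is a minimal Python sketch of soft-threshold wavelet filtering with a simple gross-error flag. It assumes the PyWavelets package; the universal threshold and the 3-sigma criterion on the correction value are illustrative choices, not the formula derived in the thesis.

    # Minimal sketch: soft-threshold wavelet filtering with a simple gross-error flag.
    import numpy as np
    import pywt

    def wavelet_filter(y, wavelet="db4", level=3, k=3.0):
        coeffs = pywt.wavedec(y, wavelet, level=level)
        # Estimate the random-error scale from the finest detail coefficients (MAD).
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2.0 * np.log(len(y)))            # universal threshold
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        y_hat = pywt.waverec(coeffs, wavelet)[: len(y)]
        correction = y - y_hat                                  # filter correction value
        gross = np.abs(correction) > k * sigma                  # suspected gross errors
        return y_hat, gross

    # Example: a noisy ramp with one injected gross error at sample 50.
    t = np.linspace(0.0, 10.0, 256)
    y = 2.0 * t + np.random.normal(0.0, 0.2, t.size)
    y[50] += 5.0
    y_filtered, gross_flags = wavelet_filter(y)
    print("suspected gross errors at samples:", np.where(gross_flags)[0])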
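The particle swarm optimization approach can be sketched as follows. For brevity, a static mass balance F1 + F2 = F3 with a quadratic constraint penalty stands in for the thesis's CSTR model with orthogonal collocation, and the Fair robust estimator with c = 1.5 is an assumed choice of robust objective.

    # Minimal sketch: robust data reconciliation solved by particle swarm optimization.
    import numpy as np

    y_meas = np.array([10.3, 5.4, 14.9])        # measured flows; F3 carries an error
    sigma = np.array([0.2, 0.2, 0.2])           # measurement standard deviations

    def objective(x):
        r = (x - y_meas) / sigma
        c = 1.5                                  # Fair-function tuning constant
        fair = c**2 * (np.abs(r) / c - np.log(1.0 + np.abs(r) / c))
        balance = x[0] + x[1] - x[2]             # mass-balance constraint residual
        return fair.sum() + 1e4 * balance**2     # quadratic penalty on the constraint

    def pso(fun, dim, n=40, iters=200, w=0.7, c1=1.5, c2=1.5):
        pos = y_meas + np.random.uniform(-1.0, 1.0, (n, dim))
        vel = np.zeros((n, dim))
        pbest, pbest_f = pos.copy(), np.array([fun(p) for p in pos])
        gbest = pbest[np.argmin(pbest_f)].copy()
        for _ in range(iters):
            r1, r2 = np.random.rand(n, dim), np.random.rand(n, dim)
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = pos + vel
            f = np.array([fun(p) for p in pos])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)].copy()
        return gbest

    print("reconciled flows:", pso(objective, dim=3))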
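The robust variance correction for Kalman filtering can be illustrated on a scalar linear model; the thesis applies it to the extended and unscented Kalman filters for a nonlinear example. The Huber-style critical value of 1.345 and the variance inflation rule below are illustrative assumptions, not the thesis's exact judgment standard.

    # Minimal sketch: Kalman measurement update with a robust variance correction.
    import numpy as np

    def robust_kalman_update(x_pred, p_pred, z, r, crit=1.345):
        innov = z - x_pred                       # innovation (measurement residual)
        s = p_pred + r                           # innovation variance
        t = abs(innov) / np.sqrt(s)              # judgment parameter
        if t > crit:                             # suspected gross error:
            r = r * (t / crit) ** 2              # inflate the measurement variance
            s = p_pred + r
        k = p_pred / s                           # Kalman gain
        x_upd = x_pred + k * innov
        p_upd = (1.0 - k) * p_pred
        return x_upd, p_upd

    # Example: constant-state model tracked through one injected gross error.
    np.random.seed(0)
    truth, q, r = 5.0, 0.01, 0.04
    x, p = 0.0, 1.0
    for k in range(30):
        z = truth + np.random.normal(0.0, np.sqrt(r))
        if k == 15:
            z += 3.0                             # inject a gross error
        x, p = robust_kalman_update(x, p + q, z, r)
    print("final estimate:", round(x, 3))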
Keywords/Search Tags: dynamic data reconciliation, gross error detection, wavelet filtering, particle swarm optimization algorithm, unscented Kalman filter