
Convergence Analysis Of Parameter Estimation Based On Hausdorff Metric

Posted on: 2015-07-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: C Y Chen
Full Text: PDF
GTID: 1220330452458656
Subject: Control Science and Engineering
Abstract/Summary:
This dissertation studies convergence properties of parameter estimation, together with some related issues of experiment design, in the identification of linear time-invariant discrete-time systems. In the prediction error method, as the length of the input-output data tends to infinity, the sequence of criterion functions converges to a limiting criterion function uniformly in the parameter on a compact set. The parameter estimates are the minimizing arguments of the criterion function on this compact set, and their convergence has received much attention in the literature. Among these studies, however, one interesting fact seems not to have received sufficient attention: the minimizing arguments of the criterion functions, or of the limiting criterion function, may not be unique. In that case the parameter estimate is represented by a set rather than by a single point. A mathematical feature of the convergence problem of parameter estimation is therefore that, from the convergence of a sequence of functions, we need to infer the convergence of the sequence of their sets of minimizing arguments. The Hausdorff metric is suggested as the measure of distance between parameter estimates, and the convergence properties of parameter estimation and some related issues are then discussed in the sense of the Hausdorff metric.

First, situations in which the minimizing arguments of the criterion functions or of the limiting criterion function on a compact set are not unique are exhibited through a concrete model and theoretical analysis. When these minimizing arguments are not unique, the shortcomings of the existing theory in dealing with the convergence of parameter estimation are pointed out.
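For reference (the notation here is ours, not necessarily the dissertation's), the Hausdorff metric between two nonempty compact sets $A, B \subseteq \mathbb{R}^n$ is

```latex
d_H(A, B) \;=\; \max\Bigl\{\, \sup_{a \in A}\, \inf_{b \in B} \lVert a - b \rVert,\;\; \sup_{b \in B}\, \inf_{a \in A} \lVert a - b \rVert \,\Bigr\}
```

so $d_H(A_N, A) \to 0$ requires every point of $A$ to be approached by points of $A_N$ and vice versa; it is exactly the first requirement that can fail when the limiting argmin set contains points that the argmin sets along the sequence miss.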
The Hausdorff metric is then suggested as a tool for studying the convergence properties of parameter estimation.

Second, provided that a sequence of continuous functions converges to a limiting function uniformly in the independent variable on a compact set, the convergence properties of the sets of minimizing arguments of this function sequence are studied in the sense of the Hausdorff metric. In general, such convergence is not guaranteed. A condition guaranteeing it is given, namely that the minimizing argument of the limiting function is unique. When the general function sequence is taken specifically as the criterion function sequence, these results reveal some interesting properties of the parameter estimates. Since the parameter estimates computed by numerical methods are local minimizing arguments of the criterion function, the convergence properties of the sets of local minimizing arguments of the criterion function are also discussed in the sense of the Hausdorff metric. If the local minimizing arguments of the limiting criterion function are all isolated, one can select a set of local minimizing arguments of each criterion function such that the corresponding sequence converges, in the sense of the Hausdorff metric, to the set of local minimizing arguments of the limiting criterion function.

Then, two cases are presented in which the set of minimizing arguments of the limiting criterion function is a continuum and an isolated set, respectively. The asymptotic behavior of the corresponding parameter estimates is analyzed and illustrated by examples. When the set is a continuum, the parameter estimate sequence depends on the choice of the input signal and on the realization of the noise, and exhibits complex asymptotic behavior.
When the set is an isolated set, the parameter estimate sequence is absorbed into a nonempty subset of the set of minimizing arguments of the limiting criterion function and exhibits interesting oscillatory behavior.

Finally, in order to guarantee the convergence of parameter estimation, we introduce necessary and sufficient conditions under which the minimizing argument of the limiting criterion function is unique. For some classical model structures, we then give experiment design conditions, in both open-loop and closed-loop identification, that guarantee such uniqueness. When the parameter estimate converges, we clarify in what sense closed-loop experiment design is better than, or strictly better than, open-loop design. According to practical needs, we give three different comparison criterion functions and compare the corresponding closed-loop and open-loop experiment designs.
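The central phenomenon above, that uniform convergence of the criterion functions need not imply Hausdorff convergence of their argmin sets when the limiting minimizer is not unique, can be illustrated numerically. The sketch below is not taken from the dissertation; the criterion family, grid, and tolerances are our own hypothetical choices. The limit V(θ) = (θ² − 1)² has argmin set {−1, +1}, while each perturbed V_N, whose perturbation vanishes uniformly as N → ∞, has its minimizer pushed alternately toward −1 or +1 by the tilt (−1)ᴺ·θ/N:

```python
import numpy as np

def hausdorff(A, B):
    """Hausdorff distance between two finite sets of points on the real line."""
    d = np.abs(A[:, None] - B[None, :])          # pairwise distance matrix
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def argmin_set(values, grid, tol):
    """All grid points whose value is within tol of the minimum."""
    return grid[values <= values.min() + tol]

theta = np.linspace(-2.0, 2.0, 4001)             # parameter grid, step 0.001

# Limiting criterion V(theta) = (theta^2 - 1)^2: its argmin set is {-1, +1},
# i.e. the minimizing argument is NOT unique.
V_lim = (theta**2 - 1) ** 2
S_lim = argmin_set(V_lim, theta, tol=1e-9)

# V_N -> V uniformly (the perturbation is O(1/N)), yet the sign of the tilt
# (-1)^N selects the minimizer near -1 or near +1 alternately, so the argmin
# sets oscillate and do not converge to S_lim in the Hausdorff metric.
for N in (10, 11, 100, 101):
    V_N = (theta**2 - 1) ** 2 + (-1) ** N * theta / N
    S_N = argmin_set(V_N, theta, tol=1e-6)
    print(N, round(hausdorff(S_N, S_lim), 3))    # distance stays near 2
```

With a unique limiting minimizer the same computation would show the distance shrinking to zero, which is exactly the uniqueness condition the dissertation identifies as sufficient for convergence.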
Keywords/Search Tags: Parameter Estimation, Prediction Error Method, Convergence Analysis, Hausdorff Metric, Experiment Design