This paper considers linear relations whose input and output data contain measurement errors. Parameters of these so-called \emph{errors-in-variables} (EIV) models can be estimated by minimizing the \emph{total least squares} (TLS) of the input-output disturbances. Such an estimate is highly non-linear. Moreover, in many realistic situations the errors cannot be considered independent. \emph{Weakly dependent} (α- and φ-mixing) disturbances, which are neither necessarily stationary nor identically distributed, are considered in the EIV model. Asymptotic normality of the TLS estimate is proved under reasonable stochastic assumptions on the errors. The derived asymptotic properties provide the necessary basis for the validity of block-bootstrap procedures.
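For orientation, the TLS estimate admits a well-known closed form via the singular value decomposition of the augmented data matrix. This classical computation is standard and not specific to the dependent-error setting studied in the paper; a minimal sketch:

```python
import numpy as np

def tls(A, b):
    """Total least squares for A x ≈ b via the SVD of the augmented matrix [A | b]."""
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]               # right singular vector of the smallest singular value
    return -v[:-1] / v[-1]   # classical closed-form TLS solution
```

With error-free data the TLS solution coincides with the exact one; with noisy inputs and outputs it perturbs both A and b, unlike ordinary least squares, which attributes all error to b.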
Interpolating and approximating polynomials have been living separate lives for more than two centuries. Our aim is to propose a general parametric regression model that incorporates both interpolation and approximation. The paper first introduces a new r-point transformation that yields a function with a simpler geometrical structure than the original function. It uses r≥2 reference points and decreases the polynomial degree by r−1. Then a general representation of polynomials is proposed based on r≥1 reference points. The two-part model, which is suited to piecewise approximation, consists of an ordinary least squares polynomial regression and a reparameterized one. The latter is the central component, where the key role is played by the reference points. It is constructed from the proposed representation of polynomials, which is derived using the r-point transformation Tr(x). The resulting polynomial passes through the r reference points and approximates the remaining ones. Appropriately chosen reference points ensure a quasi-smooth transition between the two components and decrease the dimension of the LS normal matrix. We show that the model provides estimates with statistical properties such as consistency and asymptotic normality.
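A minimal illustration of the core idea, using a simple parameterization rather than the paper's Tr(x)-based one: write the polynomial as p(x) = q(x) + w(x)·s(x), where q interpolates the r reference points and w vanishes at them, so p passes through every reference point exactly while s is fitted by ordinary least squares to approximate the remaining data.

```python
import numpy as np

def fit_through_refs(x, y, x_ref, y_ref, extra_deg):
    """Fit a polynomial forced through (x_ref, y_ref) while LS-approximating (x, y).

    Illustrative sketch only; the paper's actual reparameterization differs.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    x_ref, y_ref = np.asarray(x_ref, float), np.asarray(y_ref, float)
    r = len(x_ref)
    # q: degree-(r-1) interpolant of the reference points
    q_coef = np.polyfit(x_ref, y_ref, r - 1)
    # w vanishes at every reference point
    w = np.prod(x[:, None] - x_ref[None, :], axis=1)
    # Least squares fit of s to the scaled residuals: y - q(x) ≈ w(x) * s(x)
    V = w[:, None] * np.vander(x, extra_deg + 1)
    s_coef = np.linalg.lstsq(V, y - np.polyval(q_coef, x), rcond=None)[0]

    def p(t):
        t = np.asarray(t, float)
        wt = np.prod(t[..., None] - x_ref, axis=-1)
        return np.polyval(q_coef, t) + wt * np.polyval(s_coef, t)

    return p
```

Because w is zero at each reference point, p interpolates there by construction, and the least squares step only has to estimate the coefficients of s, a lower-dimensional problem than fitting the full polynomial.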
$\sqrt{n}$-consistency of the least trimmed squares estimator is proved under general conditions. The proof is based on deriving the asymptotic linearity of the normal equations.
Asymptotic normality of the least trimmed squares estimator is proved under general conditions. The paper concludes with a discussion of the applicability of the estimator, including a discussion of an algorithm for its evaluation.
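For context, the least trimmed squares estimator minimizes the sum of the h smallest squared residuals. A standard way to approximate it numerically is the concentration-step (C-step) iteration of the FAST-LTS algorithm of Rousseeuw and Van Driessen; the paper's own algorithm may differ. A hedged sketch:

```python
import numpy as np

def lts_fit(X, y, h, n_starts=50, n_csteps=20, seed=None):
    """Approximate the LTS estimator by C-steps from random elemental starts."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    best_beta, best_obj = None, np.inf
    for _ in range(n_starts):
        idx = rng.choice(n, size=p, replace=False)  # random elemental start
        beta = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
        for _ in range(n_csteps):
            r2 = (y - X @ beta) ** 2
            keep = np.argsort(r2)[:h]               # h smallest squared residuals
            beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
        obj = np.sort((y - X @ beta) ** 2)[:h].sum()
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta
```

Each C-step cannot increase the trimmed objective, so the iteration converges to a local optimum; multiple random starts are used because the LTS objective is non-convex.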