A robust version of Ordinary Least Squares that weights the order statistics of the squared residuals (rather than the squared residuals themselves) is recalled and its properties are studied. The existence of a solution to the corresponding extremal problem and consistency under heteroscedasticity are proved.
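The idea of weighting ordered squared residuals can be sketched as follows. This is only an illustrative sketch, not the paper's estimator: the data, the trimming-type weights, and the crude grid search are all made-up assumptions.

```python
import numpy as np

def lws_objective(beta, X, y, weights):
    """Least-weighted-squares-type objective: the weights are applied to
    the ORDER STATISTICS of the squared residuals, not to the residuals
    of particular observations."""
    r2 = (y - X @ beta) ** 2
    return float(np.sum(weights * np.sort(r2)))

# Hypothetical data with one gross outlier.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=20)
y[-1] += 50.0                                  # contaminate one observation

# Zero weight on the five largest squared residuals (trimming-type weights).
w = np.where(np.arange(20) < 15, 1.0, 0.0)

# Crude grid search over the slope (intercept held fixed), just to show
# that the criterion is unaffected by the single outlier.
slopes = np.linspace(0.0, 4.0, 401)
best = min(slopes, key=lambda s: lws_objective(np.array([1.0, s]), X, y, w))
```

Because the weights are attached to ranks of the squared residuals, the outlier always ends up in the zero-weight tail, whichever observation it happens to be.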
For data sampled from a low-dimensional nonlinear manifold embedded in a high-dimensional space, such as the Swiss roll and the S-curve, the Self-Organizing Map (SOM) tends to get stuck in local minima and consequently produces topological defects in the final map. To avoid this problem and obtain more faithful visualization results, a variant of SOM, the Dynamic Self-Organizing Map (DSOM), is presented in this paper. DSOM dynamically increases the map size as the training data set is expanded according to its intrinsic neighborhood structure, starting from a small neighborhood in which the data points lie on or close to a linear patch. Owing to the locally Euclidean nature of the manifold, the map can be guided onto the manifold surface, and globally faithful visualization results can then be achieved step by step. Experimental results show that DSOM discovers the intrinsic manifold structure of the data more faithfully than SOM. In addition, as a new manifold learning method, DSOM obtains more concise visualization results and is less sensitive to the neighborhood size and to noise than typical manifold learning methods such as Isometric Mapping (ISOMAP) and Locally Linear Embedding (LLE), which is also verified experimentally.
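For readers unfamiliar with the baseline, a minimal classical SOM training step (the algorithm DSOM extends) can be sketched as follows. The grid size, decay schedules, and toy data are illustrative assumptions, not the paper's settings, and the map-growing step of DSOM is deliberately omitted.

```python
import numpy as np

def som_train(data, grid_w=5, grid_h=5, epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal SOM: for each sample, move the best-matching unit (BMU)
    and its grid neighbours toward the sample. DSOM additionally grows
    the map; this sketch shows only the classical step it builds on."""
    rng = np.random.default_rng(seed)
    weights = rng.normal(size=(grid_h, grid_w, data.shape[1]))
    # Grid coordinates, used by the neighbourhood function.
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    coords = np.stack([gy, gx], axis=-1).astype(float)
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1.0 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in data:
            # Best-matching unit = closest codebook vector.
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood on the grid, centred at the BMU.
            gdist2 = np.sum((coords - np.array(bmu, float)) ** 2, axis=-1)
            h = np.exp(-gdist2 / (2.0 * sigma ** 2))[..., None]
            weights += lr * h * (x - weights)
    return weights

# Toy data on the unit square; the codebook should end up inside it.
rng = np.random.default_rng(1)
data = rng.uniform(0.0, 1.0, size=(200, 2))
W = som_train(data)
```

The topological defects mentioned above arise precisely because this neighbourhood update can fold the grid over itself on curved manifolds, which is what DSOM's incremental growth is designed to prevent.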
Point estimators based on minimization of information-theoretic divergences between the empirical and hypothetical distributions induce a problem when working with continuous families that are measure-theoretically orthogonal to the family of empirical distributions. In this case, the ϕ-divergence always equals its upper bound, and the minimum ϕ-divergence estimates are trivial. Broniatowski and Vajda \cite{IV09} proposed several modifications of the minimum divergence rule to resolve this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by α∈R in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different ϕ-divergence parameters.
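For discrete distributions, where the orthogonality problem does not arise, the power divergence family can be computed directly. The sketch below uses the standard Cressie–Read parametrization (with KL-divergence limits at α = 0 and α = −1); the example distributions are made up.

```python
import numpy as np

def power_divergence(p, q, alpha):
    """Cressie-Read power divergence between discrete distributions p, q:
    (1 / (alpha * (alpha + 1))) * sum_i p_i * ((p_i / q_i)**alpha - 1).
    alpha = 0 and alpha = -1 are the Kullback-Leibler limits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if abs(alpha) < 1e-12:
        return float(np.sum(p * np.log(p / q)))   # KL(p || q)
    if abs(alpha + 1.0) < 1e-12:
        return float(np.sum(q * np.log(q / p)))   # KL(q || p)
    return float(np.sum(p * ((p / q) ** alpha - 1.0)) / (alpha * (alpha + 1.0)))

p = [0.2, 0.3, 0.5]
q = [0.25, 0.25, 0.5]
d1 = power_divergence(p, q, 1.0)   # alpha = 1: Pearson chi-square / 2
```

At α = 1 the family reduces to half the Pearson chi-square statistic, and the divergence vanishes exactly when p = q, for every α.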
The special theory of relativity holds significant interest for scientific perspectivists. In this paper, I distinguish between two related meanings of “perspectival,” and argue that reference frames are perspectives, provided that perspectival means “being conditional” rather than “being partial.” Frame-dependent properties such as length, time duration, and simultaneity are not partially measured in a reference frame; rather, their measurements are conditional on the choice of frame. I also discuss whether the constancy of the speed of light depends on perspectival factors such as the idealized definition of the speed of light in a perfect vacuum and the Einstein synchronization convention. Furthermore, I argue for the view that the constancy of its speed is a robust property of light under the conditions of currently accepted experimental setups pertaining to special relativity, and conclude that this view supports perspectivism.
In applications of stochastic programming, optimization of the expected outcome need not be an acceptable goal. This has been the reason for recent proposals aiming at the construction and optimization of more complicated nonlinear risk objectives. We survey various approaches to risk quantification and optimization, mainly in the framework of static and two-stage stochastic programs, and comment on their properties. It turns out that the polyhedral risk functionals introduced by Eichhorn and Römisch \cite{Eich-Ro} have many convenient features. We complement the existing results by an application of the contamination technique to stress testing, or robustness analysis, of stochastic programs with polyhedral risk objectives with respect to the underlying probability distribution. The ideas are illuminated by numerical results for a bond portfolio management problem.
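Conditional Value-at-Risk is one standard example of a polyhedral risk functional, and the contamination technique mixes the nominal distribution with a stress distribution in proportion λ. The sketch below illustrates both on empirical scenario sets; the Gaussian nominal/stress scenarios are made-up assumptions, not the paper's bond portfolio data.

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Empirical Conditional Value-at-Risk: the mean of the worst
    (1 - alpha) fraction of losses."""
    losses = np.asarray(losses, float)
    var = np.quantile(losses, alpha)          # empirical Value-at-Risk
    return float(losses[losses >= var].mean())

# Contamination stress test: replace a fraction lam of nominal scenarios
# by stress scenarios, approximating (1 - lam) * F + lam * G, and track
# how the risk objective responds.
rng = np.random.default_rng(3)
nominal = rng.normal(0.0, 1.0, size=10000)    # nominal loss scenarios
stress = rng.normal(3.0, 2.0, size=10000)     # stressed loss scenarios
risks = {}
for lam in (0.0, 0.1, 0.2):
    n_stress = int(lam * 10000)
    mixed = np.concatenate([nominal[: 10000 - n_stress], stress[:n_stress]])
    risks[lam] = cvar(mixed)
```

Monotonicity of `risks` in λ is the kind of behaviour the contamination bounds quantify for the optimal value of the full stochastic program.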
The paper investigates generalized linear models (GLMs) with binary responses, such as the logistic, probit, log-log, complementary log-log, scobit and power logit models. It introduces a median estimator of the underlying structural parameters of these models based on statistically smoothed binary responses. Consistency and asymptotic normality of this estimator are proved. Examples of the derivation of the asymptotic covariance matrix under the above-mentioned models are presented. Finally, some comments concerning a method called enhancement and the robustness of the median estimator are given, and results of a simulation experiment comparing the behavior of the median estimator with that of other robust estimators for GLMs known from the literature are reported.
In this paper, we introduce methods for robust principal component analysis (PCA), including cases where there are missing values in the data. PCA is a widely applied standard statistical method for data preprocessing, compression, and analysis. It is based on the second-order statistics of the data and is optimal for Gaussian data, but it is often applied to data sets having unknown or other types of probability distributions. PCA can be derived from minimization of the mean-square representation error or maximization of variances under orthonormality constraints. However, these quadratic criteria are sensitive to outliers in the data and to long-tailed distributions, which may considerably degrade the results given by PCA. We introduce robust methods for estimating either the PCA eigenvectors directly or the PCA subspace spanned by them. Experimental results show that our methods often provide better results than standard PCA when outliers are present in the data. Furthermore, we extend our methods to incomplete data with missing values. The problems arising in such cases have several features typical of nonlinear models.
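The sensitivity of the quadratic criterion to outliers, and one simple way to robustify it, can be illustrated as follows. This is an illustrative robustification (trimming points far from the coordinate-wise median before ordinary PCA), not the paper's estimators; the data, trimming fraction, and outlier positions are assumptions.

```python
import numpy as np

def robust_pca_subspace(X, k, trim=0.1):
    """Illustrative robustified PCA: discard the fraction `trim` of points
    farthest from the coordinate-wise median, then run ordinary PCA on
    the remaining points. Returns the robust mean and top-k eigenvectors."""
    X = np.asarray(X, float)
    med = np.median(X, axis=0)
    d = np.linalg.norm(X - med, axis=1)
    keep = d <= np.quantile(d, 1.0 - trim)     # trim the farthest points
    Xk = X[keep]
    mu = Xk.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xk, rowvar=False))
    return mu, vecs[:, ::-1][:, :k]            # decreasing eigenvalue order

# Inliers on a line plus a few gross outliers placed off the line.
rng = np.random.default_rng(2)
u = np.array([1.0, 1.0]) / np.sqrt(2.0)
inliers = rng.normal(0.0, 3.0, size=100)[:, None] * u \
    + rng.normal(0.0, 0.05, size=(100, 2))
outliers = np.tile([20.0, -20.0], (5, 1))
X = np.vstack([inliers, outliers])
mu, V = robust_pca_subspace(X, k=1)
```

On this toy data the quadratic criterion of standard PCA is dominated by the five outliers and picks a direction nearly orthogonal to the true one, while the trimmed estimate recovers the inlier subspace.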
In this paper we develop the theory of spring balance weighing designs with non-positively correlated errors for which the lower bound of the variance of the estimated total weight is attained.
This study deals with uncertainty, stability and robustness in reservoir operation. These properties of control currently represent a new aspect of the utilization of water resources and their systems under changing conditions. The study is based on the modern control theory of dynamic systems; it also explains the aims and the exacting nature of the methodical approaches. Instead of analytical methods, simulation models were used to study the stability of runoff during different flood situations. The flood-protection effect of the reservoir was analysed on the basis of sets of generated synthetic flood waves, and the stability of the runoff was then investigated in different hydrological situations. The study concludes that the stability of the runoff from the reservoir can be achieved only within certain limits, and that in a catastrophic flood situation it is unrealistic. It also concludes that a combination of different flood-protection measures is purposeful, e.g. a larger flood-control storage, pre-discharge, improvement of the hydrometeorological forecast, stream-channel regulation, etc. Finally, the study suggests themes for further investigation in this field.
The second part of the study presents the results of assessing the control stability of the Lipno reservoir in flood situations. Sets of generated synthetic flood waves were used for this purpose. This part continues the first part, devoted to the methodology, which was published in issue 1/2008 of the Journal of Hydrology and Hydromechanics (JHH).