Point estimators based on minimizing an information-theoretic divergence between the empirical and a hypothetical distribution run into a problem when the hypothetical family is continuous and hence measure-theoretically orthogonal to the family of empirical distributions. In this case the $\phi$-divergence always attains its upper bound, and the minimum $\phi$-divergence estimates are trivial. Broniatowski and Vajda \cite{IV09} proposed several modifications of the minimum divergence rule that resolve this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency in an extended simulation study. We focus on the well-known family of power divergences parametrized by $\alpha \in \mathbb{R}$ in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different $\phi$-divergence parameters.
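For orientation, one common parametrization of the power divergence between distributions with densities $p$ and $q$ (the normalization varies across authors, so this is an illustrative convention rather than necessarily the one adopted in \cite{IV09}) is
\[
D_\alpha(P,Q) \;=\; \frac{1}{\alpha(\alpha-1)}\left(\int p^{\alpha} q^{1-\alpha}\,\mathrm{d}\lambda \;-\; 1\right),
\qquad \alpha \in \mathbb{R}\setminus\{0,1\},
\]
with the limiting cases $\alpha \to 1$ and $\alpha \to 0$ yielding the Kullback--Leibler divergences $D(P\,\|\,Q)$ and $D(Q\,\|\,P)$, respectively, and $\alpha = 1/2$ corresponding, up to a constant factor, to the squared Hellinger distance.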