In this paper, a new adjustment to the damping parameter of the Levenberg-Marquardt algorithm is proposed to reduce training time and error oscillations. The damping parameter controls how the Levenberg-Marquardt algorithm interpolates between gradient descent and the Gauss-Newton method; with a fixed decay rate, it also limits training speed and induces error oscillations. Our damping strategy therefore decreases the damping parameter using the inner product between weight vectors, making the algorithm behave more like the Gauss-Newton method, and increases it using a diagonally dominant matrix, making the algorithm behave more like gradient descent. We evaluated the method on two simple classification problems and a handwritten digit recognition task. Simulations show that our method improves training speed and produces fewer error oscillations than comparable algorithms.
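For context, the role of the damping parameter can be sketched with the classical Levenberg-Marquardt update on a small least-squares fit. This is a minimal illustration of the standard multiply/divide damping heuristic, not the adjustment proposed in this paper; the model `y = a * exp(b * x)`, the factor of 10, and all variable names are illustrative assumptions.

```python
import numpy as np

# Classical Levenberg-Marquardt on a toy least-squares problem:
# fit y = a * exp(b * x). The damping parameter lam blends the two
# regimes the text describes: large lam -> step resembles gradient
# descent; small lam -> step resembles the Gauss-Newton method.
# The multiply/divide-by-10 rule below is the textbook heuristic,
# NOT the paper's proposed adjustment.

def residuals(p, x, y):
    a, b = p
    return y - a * np.exp(b * x)

def jacobian(p, x):
    a, b = p
    e = np.exp(b * x)
    # dr/da = -exp(b*x), dr/db = -a*x*exp(b*x)
    return np.column_stack([-e, -a * x * e])

def levenberg_marquardt(x, y, p0, lam=1e-2, iters=50):
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residuals(p, x, y)
        J = jacobian(p, x)
        # Solve the damped normal equations (J^T J + lam*I) step = -J^T r.
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), -(J.T @ r))
        p_new = p + step
        if np.sum(residuals(p_new, x, y) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam / 10.0  # success: move toward Gauss-Newton
        else:
            lam *= 10.0                 # failure: fall back toward gradient descent
    return p

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x) + 0.01 * rng.standard_normal(50)
p = levenberg_marquardt(x, y, [1.0, 1.0])  # recovers roughly (2.0, 1.5)
```

The fixed factor of 10 here is exactly the kind of fixed decay rate whose drawbacks motivate the adaptive strategy described above.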