One difficulty for quaternion neural networks (QNNs) is that quaternion nonlinear activation functions are usually non-analytic, so quaternion derivatives cannot be used. In this paper, we derive the quaternion gradient descent, approximate quaternion Gauss-Newton, and quaternion Levenberg-Marquardt algorithms for feedforward QNNs based on the GHR calculus, which applies to both analytic and non-analytic quaternion functions. In the derivation of the quaternion Gauss-Newton algorithm, we also solve a widely linear quaternion least squares problem, which is more general than the usual least squares problem. A rigorous convergence analysis of the proposed algorithms is provided, and simulations on the prediction of benchmark signals support the approach.