In this paper we present a quasi-Newton method for unconstrained multiobjective optimization of strongly convex objective functions. Hence, we can approximate the Hessian matrices using the well-known BFGS method. The approximation of the Hessian matrices is usually faster than their exact evaluation, as used in, e.g., the recently proposed Newton's method for multiobjective optimization. We propose and analyze a new algorithm and prove that its convergence is superlinear.
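The BFGS idea in the abstract above can be sketched in one dimension (a hedged toy, not the paper's multiobjective algorithm; the objective f(x) = exp(x) + x**2 is invented for illustration). In 1-D the BFGS secant condition B*(x1 - x0) = g1 - g0 determines the Hessian approximation completely, so the step uses a difference quotient of gradients instead of the exact second derivative:

```python
import math

def fprime(x):
    # gradient of the strongly convex example f(x) = exp(x) + x**2
    return math.exp(x) + 2.0 * x

def quasi_newton_1d(x0, x1, tol=1e-10, max_iter=50):
    g0, g1 = fprime(x0), fprime(x1)
    for _ in range(max_iter):
        if abs(g1) < tol or x1 == x0:
            break
        b = (g1 - g0) / (x1 - x0)   # secant approximation of f''
        x0, g0 = x1, g1
        x1 = x1 - g1 / b            # quasi-Newton step: no exact Hessian needed
        g1 = fprime(x1)
    return x1

x_star = quasi_newton_1d(0.0, 1.0)  # stationary point near -0.3517
```

The point of the abstract is exactly this trade: each step is cheaper than evaluating the true Hessian, yet convergence remains superlinear.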
Network Newton Distributed Optimization Methods
Mokhtari, Aryan; Qing Ling; Ribeiro, Alejandro
IEEE Transactions on Signal Processing, Volume 65, Issue 1, 1 January 2017
Journal Article
Peer reviewed
Open access
We study the problem of minimizing a sum of convex objective functions, where the components of the objective are available at different nodes of a network and nodes are allowed to only communicate with their neighbors. The use of distributed gradient methods is a common approach to solve this problem. Their popularity notwithstanding, these methods exhibit slow convergence and a consequent large number of communications between nodes to approach the optimal argument because they rely on first-order information only. This paper proposes the network Newton (NN) method as a distributed algorithm that incorporates second-order information. This is done via distributed implementation of approximations of a suitably chosen Newton step. The approximations are obtained by truncation of the Newton step's Taylor expansion. This leads to a family of methods defined by the number K of Taylor series terms kept in the approximation. When keeping K terms of the Taylor series, the method is called NN-K and can be implemented through the aggregation of information in K-hop neighborhoods. Convergence to a point close to the optimal argument at a rate that is at least linear is proven and the existence of a tradeoff between convergence time and the distance to the optimal argument is shown. The numerical experiments corroborate reductions in the number of iterations and the communication cost that are necessary to achieve convergence relative to first-order alternatives.
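The truncated-Taylor idea above can be sketched on a tiny dense system (a hedged toy: the matrices D, B and vector g below are invented, and the real NN-K uses a distributed block splitting over a network with K-hop aggregation). Writing the Hessian as H = D - B, the Newton direction H^{-1} g equals the series sum over k >= 0 of (D^{-1} B)^k D^{-1} g, and truncating keeps the terms k = 0..K:

```python
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

D = [[4.0, 0.0], [0.0, 5.0]]   # diagonal part (invertible locally)
B = [[0.0, 1.0], [1.0, 0.0]]   # coupling part (neighbor information)
g = [1.0, 2.0]                 # gradient

def nn_k_direction(K):
    # accumulate sum_{k=0}^{K} (D^{-1} B)^k D^{-1} g term by term
    term = [g[i] / D[i][i] for i in range(len(g))]
    total = term[:]
    for _ in range(K):
        Bt = matvec(B, term)
        term = [Bt[i] / D[i][i] for i in range(len(g))]
        total = [total[i] + term[i] for i in range(len(g))]
    return total

# exact Newton direction for H = D - B = [[4, -1], [-1, 5]], solved by hand:
exact = [7.0 / 19.0, 9.0 / 19.0]
```

Larger K trades extra communication (deeper neighborhood aggregation) for a direction closer to the exact Newton step, which is the tradeoff the abstract describes.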
An alternative strategy for solving systems of nonlinear equations when the classical Newton's method fails is presented. The proposed strategy is an extension to systems of the idea presented in the article (Ramos and Vigo-Aguiar, 2015). It relies on obtaining approximate solutions of a given system of equations by solving an associated system derived from the theory underlying Newton's method. In this way, the solutions of the associated system that are not 2-cycles of the Newton iteration function provide solutions of the original system. As usual, in most cases the associated system cannot be solved exactly, and some iterative procedure must be used. For particular starting values, solving the associated system with Newton's method turns out to be more efficient than applying Newton's method to the original system. Some examples are given to illustrate the performance of the proposed strategy. Performance profiles are evaluated in terms of number of iterations, error, and CPU time.
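The role of 2-cycles of the Newton iteration function can be seen in a classic scalar example (a hedged toy, not the paper's associated system): for f(x) = x**3 - 2x + 2 the Newton map has the 2-cycle {0, 1}, so a start on the cycle never converges, while other starts reach the real root near x = -1.7693:

```python
def f(x):
    return x**3 - 2.0 * x + 2.0

def newton_map(x):
    # Newton iteration function N(x) = x - f(x)/f'(x)
    return x - f(x) / (3.0 * x**2 - 2.0)

orbit = [0.0]                   # start exactly on the 2-cycle
for _ in range(4):
    orbit.append(newton_map(orbit[-1]))
# the orbit alternates 0, 1, 0, 1, ...

x = -2.0                        # start outside the cycle
for _ in range(50):
    x = newton_map(x)           # ordinary convergence to the real root
```

Filtering out such cycles is precisely why the strategy keeps only solutions of the associated system that are not 2-cycles of the Newton iteration function.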
Building on previous research of Chi and Chi, this article revisits estimation in robust structured regression under the criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharp majorization achieves faster convergence than the previous alternating proximal gradient descent algorithm by Chi and Chi. In addition, we reparameterize the model by substituting precision for scale and estimate precision via a modified Newton's method. This simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to enable constrained estimation under nonconvex constraint sets. This tactic also improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of our improved tactics through a rich set of simulation examples and a real data application.
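The MM principle invoked above can be illustrated with a generic hedged sketch (invented example, not the authors' sharp majorization): for the least-absolute-deviations location problem min_m sum_i |y_i - m|, each |r| is majorized at the current residual r_k by r**2/(2|r_k|) + |r_k|/2; minimizing the surrogate gives a weighted mean, and iterating drives m toward the sample median, with the MM descent property guaranteeing the objective never increases:

```python
def mm_median(y, iters=200, eps=1e-12):
    m = sum(y) / len(y)                                # start from the mean
    for _ in range(iters):
        # weights from the quadratic majorization of |y_i - m|
        w = [1.0 / max(abs(yi - m), eps) for yi in y]
        m = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    return m

m_hat = mm_median([1.0, 2.0, 3.0, 10.0, 100.0])        # sample median is 3.0
```

Each surrogate minimization is a trivial weighted average, which is the appeal of MM: a hard nonsmooth problem becomes a sequence of easy ones.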
We present a new stabilized 3-D control volume finite element method (FEM) for massively parallel simulation of drift-diffusion transport in semiconductor devices. This new solver employs unstructured hexahedral elements, leading to a very efficient scheme; both the Poisson and current continuity equations are discretized with control volume FEM, which improves accuracy and stability. Furthermore, a fully coupled Newton's method is applied to solve these nonlinear equations, remarkably increasing numerical stability. We then apply this new solver to simulate diode, MOSFET, and multifinger MOSFET devices. Numerical results indicate that the hexahedron-based method requires fewer iterations, and its computing speed is 3.44-4.61 times faster than the tetrahedron-based counterpart. Moreover, our fully coupled Newton's method maintains a constant iteration count, showing strong numerical stability even when the drain voltage is 160 V.
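What "fully coupled" means can be sketched on an invented 2-unknown system (a hedged toy, nothing like a real device simulator): both unknowns are updated simultaneously from a single Jacobian solve per iteration, instead of sweeping the equations one at a time:

```python
def F(u, v):
    # invented coupled nonlinear system with solution (1, 1)
    return (u * u + v - 2.0, u + v * v - 2.0)

def J(u, v):
    # analytic Jacobian of F
    return ((2.0 * u, 1.0), (1.0, 2.0 * v))

def coupled_newton(u, v, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f1, f2 = F(u, v)
        if abs(f1) + abs(f2) < tol:
            break
        (a, b), (c, d) = J(u, v)
        det = a * d - b * c
        du = (d * f1 - b * f2) / det   # solve J @ [du, dv] = F via Cramer's rule
        dv = (a * f2 - c * f1) / det
        u, v = u - du, v - dv          # simultaneous (fully coupled) update
    return u, v

u_star, v_star = coupled_newton(2.0, 2.0)   # converges to (1, 1)
```

Solving all equations in one Newton system is what lets the coupled approach keep its iteration count stable when the coupling between equations is strong.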
A proper initial guess is critical for implementing Newton's iteration to approximate an exact solution of a nonlinear differential equation such as the Navier-Stokes or magnetohydrodynamic equations. In this article, we provide a theoretical criterion for choosing an initial guess for Newton's iteration when approximating an exact solution of a nonlinear differential equation.
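Why the initial guess matters can be seen already in a scalar example (a hedged illustration; the article concerns differential equations, while here plain Newton on f(x) = atan(x) makes the point): iterates converge only when |x0| is below roughly 1.3917 and diverge otherwise:

```python
import math

def newton_atan(x, iters=8):
    for _ in range(iters):
        # Newton step for f(x) = atan(x), using f'(x) = 1/(1 + x**2)
        x = x - math.atan(x) * (1.0 + x * x)
    return x

good = newton_atan(1.0)   # inside the basin: tends to the root x = 0
bad = newton_atan(2.0)    # outside the basin: iterate magnitudes blow up
```

A criterion that certifies the starting point lies inside the basin of attraction is exactly what rules out the second behavior.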
An optimal family of eighth-order multiple-zero finders and the dynamics behind their basins of attraction are proposed by considering modified Newton-type methods with multivariate weight functions. Extensive investigation of purely imaginary extraneous fixed points of the proposed iterative methods is carried out for the study of the dynamics associated with corresponding basins of attraction. Numerical experiments strongly support the underlying theory pursued in this paper. An exploration of the relevant dynamics of the proposed methods is presented along with illustrative basins of attraction for various polynomials.
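A hedged second-order baseline puts the abstract in context (this is the classical modified Newton method for multiple zeros, not the paper's optimal eighth-order family; the polynomial is invented): for a zero of multiplicity m, the step x - m*f(x)/f'(x) restores fast convergence, while plain Newton slows to a linear rate:

```python
def f(x):
    # invented example: triple zero at x = 1, simple zero at x = -2
    return (x - 1.0)**3 * (x + 2.0)

def fprime(x):
    return 3.0 * (x - 1.0)**2 * (x + 2.0) + (x - 1.0)**3

def iterate(x, m, steps):
    for _ in range(steps):
        fx = f(x)
        if fx == 0.0:                 # landed exactly on the zero
            break
        x = x - m * fx / fprime(x)    # modified Newton with multiplicity m
    return x

plain = iterate(1.5, 1, 8)      # multiplicity ignored: error shrinks only ~2/3 per step
modified = iterate(1.5, 3, 8)   # multiplicity-aware: essentially exact within a few steps
```

The eighth-order methods in the paper push this idea much further, but the multiplicity-aware correction is the common starting point.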