An Algorithm for Unconstrained Quadratically Penalized Convex Optimization
Statistics – Computation
Scientific paper
2008-11-18
Submitted to the Electronic Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of Mathematical Statistics
A descent algorithm, "Quasi-Quadratic Minimization with Memory" (QQMM), is proposed for unconstrained minimization of the sum, $F$, of a non-negative convex function, $V$, and a quadratic form. Such problems arise in regularized estimation in machine learning and statistics. In addition to values of $F$, QQMM requires the (sub)gradient of $V$. Two features of QQMM help keep the number of objective-function evaluations low. First, QQMM provides good control over stopping the iterative search. This makes QQMM well adapted to statistical problems, where the objective function is based on random data and stopping early is therefore sensible. Second, QQMM uses a complex method for determining trial minimizers of $F$. After a description of the problem and the algorithm, a simulation study comparing QQMM to the popular BFGS optimization algorithm is described. The simulation study and other experiments suggest that QQMM is generally substantially faster than BFGS in the problem domain for which it was designed. A QQMM-BFGS hybrid is also generally substantially faster than BFGS and does better than QQMM when QQMM is very slow.
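The problem class the abstract describes can be illustrated with a small sketch: minimize $F(w) = V(w) + \tfrac{\lambda}{2}\lVert w\rVert^2$, where $V$ is a non-negative convex loss, using the BFGS baseline the paper compares against. The data, the choice of logistic loss for $V$, and the penalty weight below are illustrative assumptions, not taken from the paper; QQMM itself is not reproduced here.

```python
# Minimal sketch of a quadratically penalized convex objective, minimized
# with scipy's BFGS (the paper's comparison baseline). All problem details
# here (data, loss, lambda) are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 40, 5
X = rng.standard_normal((n, p))
y = np.where(X @ rng.standard_normal(p) > 0, 1.0, -1.0)
lam = 0.5  # weight of the quadratic penalty (assumed)

def F(w):
    # F(w) = V(w) + (lam/2)||w||^2, with V a non-negative convex
    # logistic loss: V(w) = sum_i log(1 + exp(-y_i x_i'w))
    margins = y * (X @ w)
    V = np.sum(np.logaddexp(0.0, -margins))
    return V + 0.5 * lam * (w @ w)

def gradF(w):
    # gradient of V plus gradient of the quadratic form
    margins = y * (X @ w)
    s = -y / (1.0 + np.exp(margins))  # dV/d(margin_i)
    return X.T @ s + lam * w

w0 = np.zeros(p)
res = minimize(F, w0, jac=gradF, method="BFGS")
print(res.success, res.nfev)
```

`res.nfev` reports the number of objective evaluations BFGS used; this is the cost measure on which the paper argues QQMM typically improves.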