Accelerating Nesterov's Method for Strongly Convex Functions with Lipschitz Gradient
Mathematics – Optimization and Control
Scientific paper
2011-09-27
We modify Nesterov's constant step gradient method for strongly convex functions with Lipschitz continuous gradient, as described in Nesterov's book. Nesterov shows that $f(x_k) - f^* \leq L \prod_{i=1}^k (1 - \alpha_i) \| x_0 - x^* \|_2^2$ with $\alpha_i = \sqrt{\rho}$ for all $i$, where $L$ is the Lipschitz constant of the gradient and $\rho$ is the reciprocal condition number of $f(x)$; hence the convergence rate is $1-\sqrt{\rho}$. In this work, we accelerate Nesterov's method by adaptively searching for an $\alpha_k > \sqrt{\rho}$ at each iteration. The proposed method evaluates the gradient at most twice per iteration and adds a few Level-1 BLAS operations. In the worst case it takes the same number of iterations as Nesterov's method but doubles the number of gradient calls; in practice, however, it effectively accelerates convergence on many problems, including a smoothed basis pursuit denoising problem.
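The abstract does not spell out the adaptive search, but the baseline it builds on is Nesterov's constant step scheme, in which $\alpha_k = \sqrt{\rho}$ is fixed and yields the quoted $1 - \sqrt{\rho}$ rate. Below is a minimal Python sketch of that baseline; the function names and the quadratic test problem are illustrative, not from the paper.

```python
import numpy as np

def nesterov_constant_step(grad, x0, L, mu, iters=500):
    """Nesterov's constant-step scheme for a mu-strongly convex f
    with L-Lipschitz gradient: the baseline with alpha_k = sqrt(rho)
    fixed, which gives the 1 - sqrt(rho) convergence rate."""
    rho = mu / L                                        # reciprocal condition number
    beta = (1.0 - np.sqrt(rho)) / (1.0 + np.sqrt(rho))  # constant momentum weight
    x = np.asarray(x0, dtype=float)
    y = x.copy()                                        # extrapolated point
    for _ in range(iters):
        x_new = y - grad(y) / L                         # gradient step (one grad call)
        y = x_new + beta * (x_new - x)                  # momentum extrapolation
        x = x_new
    return x

# Illustrative test: a strongly convex quadratic with known mu and L.
if __name__ == "__main__":
    d = np.array([1.0, 10.0, 100.0])    # eigenvalues, so mu = 1 and L = 100
    grad = lambda x: d * x              # gradient of f(x) = 0.5 * sum(d * x**2)
    x_star = nesterov_constant_step(grad, np.ones(3), L=100.0, mu=1.0)
    print(np.linalg.norm(x_star))       # should be near 0, the minimizer
```

Per the abstract, the paper's contribution is to replace the fixed $\sqrt{\rho}$ with a per-iteration search for a larger $\alpha_k$, at the cost of at most one extra gradient evaluation per iteration; the search procedure itself is not given in the abstract, so it is not reproduced here.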
Chen Hao
Meng Xiangrui