Mathematics – Optimization and Control
Scientific paper
2005-12-14
Title changed from "Smooth Optimization for Sparse Semidefinite Programs". New figures and tests. Final version.
We show that the optimal complexity of Nesterov's smooth first-order optimization algorithm is preserved when the gradient is only computed up to a small, uniformly bounded error. In applications of this method to semidefinite programs, this means in some instances computing only a few leading eigenvalues of the current iterate instead of a full matrix exponential, which significantly reduces the method's computational cost. This also allows sparse problems to be solved efficiently using sparse maximum eigenvalue packages.
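The core observation above can be illustrated numerically. The sketch below is a hypothetical example (not the paper's algorithm): for the standard smooth approximation of the maximum eigenvalue, f_mu(X) = mu·log tr(exp(X/mu)), the gradient exp(X/mu)/tr(exp(X/mu)) is dominated by the leading eigenvalues of X when mu is small. It can therefore be approximated, with a small uniformly bounded error, from only k leading eigenpairs computed by a sparse eigenvalue routine (here SciPy's Lanczos-based `eigsh`), instead of a full matrix exponential. All function names are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def full_gradient(X, mu):
    """Exact gradient of mu*log(tr(exp(X/mu))) via a full eigendecomposition."""
    w, V = np.linalg.eigh(X)
    e = np.exp((w - w.max()) / mu)   # shift by w.max() for numerical stability
    e /= e.sum()
    return (V * e) @ V.T             # V diag(e) V^T

def approx_gradient(X, mu, k):
    """Approximate gradient using only the k largest eigenpairs (Lanczos)."""
    w, V = eigsh(X, k=k, which='LA')  # 'LA' = largest algebraic eigenvalues
    e = np.exp((w - w.max()) / mu)
    e /= e.sum()                      # neglected tail mass is tiny for small mu
    return (V * e) @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
X = (A + A.T) / 2                     # random symmetric test matrix
mu = 0.05
err = np.linalg.norm(approx_gradient(X, mu, k=10) - full_gradient(X, mu))
print(err)  # small: the k-eigenpair gradient is close to the exact one
```

Because the exponential weights decay geometrically away from the top of the spectrum, the approximation error is controlled uniformly in the iterate, which is exactly the regime in which the abstract's complexity result applies.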