Incremental Stochastic Subgradient Algorithms for Convex Optimization

Mathematics – Optimization and Control

Scientific paper

In this paper we study the effect of stochastic errors on two constrained incremental subgradient algorithms. We view the incremental subgradient algorithms as decentralized network optimization algorithms applied to minimize a sum of functions, where each component function is known only to a particular agent of a distributed network. We first study the standard cyclic incremental subgradient algorithm, in which the agents form a ring structure and pass the iterate in a cycle. We consider the method with stochastic errors in the subgradient evaluations and provide sufficient conditions on the moments of the stochastic errors that guarantee almost sure convergence when a diminishing step-size is used. We also obtain almost sure bounds on the algorithm's performance when a constant step-size is used. We then consider the Markov randomized incremental subgradient method, a non-cyclic version of the incremental algorithm in which the sequence of computing agents is modeled as a time non-homogeneous Markov chain. Such a model is appropriate for mobile networks, whose topology changes over time. We establish convergence results and error bounds for the Markov randomized method in the presence of stochastic errors for diminishing and constant step-sizes, respectively.
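To make the two schemes concrete, here is a minimal Python sketch of the cyclic incremental subgradient iteration with noisy subgradient evaluations, plus one step of the Markov randomized variant. The function names, the Gaussian noise model, and the fixed transition matrix are illustrative assumptions, not the paper's setup; the paper's analysis allows general stochastic errors satisfying moment conditions and a time non-homogeneous Markov chain.

```python
import numpy as np

def cyclic_incremental_subgradient(subgrads, project, x0, steps,
                                   noise_std=0.0, seed=0):
    """Cyclic incremental subgradient method with stochastic errors (sketch).

    subgrads  : list of callables; subgrads[i](x) returns a subgradient of f_i at x
    project   : Euclidean projection onto the constraint set X
    x0        : initial iterate
    steps     : iterable of step-sizes, e.g. diminishing alpha_k = a / (k + 1)
    noise_std : scale of the additive stochastic error in each evaluation
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for alpha in steps:                  # one ring cycle per step-size
        for g_i in subgrads:             # agents update the iterate in fixed order
            noisy_subgrad = g_i(x) + noise_std * rng.standard_normal(x.shape)
            x = project(x - alpha * noisy_subgrad)
    return x

def markov_randomized_step(x, subgrads, P, agent, alpha, project, rng,
                           noise_std=0.0):
    """One update of the Markov randomized variant: the next computing agent is
    drawn from row `agent` of a transition matrix P (held fixed here; the paper
    allows the chain to be time non-homogeneous)."""
    agent = rng.choice(len(subgrads), p=P[agent])
    noisy_subgrad = subgrads[agent](x) + noise_std * rng.standard_normal(np.shape(x))
    return project(x - alpha * noisy_subgrad), agent

# Example (hypothetical): minimize sum_i |x - a_i| over the box [-1, 1];
# the minimizer is the median of the a_i, here 0.5.
a = [0.2, 0.5, 0.9]
subgrads = [lambda x, ai=ai: np.sign(x - ai) for ai in a]
project = lambda x: np.clip(x, -1.0, 1.0)
steps = (1.0 / (k + 1) for k in range(2000))   # diminishing step-size
x_star = cyclic_incremental_subgradient(subgrads, project, np.array([0.0]),
                                        steps, noise_std=0.1)
```

Consistent with the abstract, a diminishing step-size paired with suitable moment conditions on the errors yields almost sure convergence, while a constant step-size yields almost sure error bounds rather than exact convergence.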


Profile ID: LFWR-SCP-O-172192
