Probabilistic Robustness Analysis -- Risks, Complexity and Algorithms
Statistics – Applications
Scientific paper
2007-07-05
Published in SIAM Journal on Control and Optimization, vol. 47, pp. 2693--2723, 2008
28 pages, 5 figures
It is becoming increasingly apparent that probabilistic approaches can overcome the conservatism and computational complexity of the classical worst-case deterministic framework and may lead to designs that are actually safer. In this paper we argue that a comprehensive probabilistic robustness analysis requires a detailed evaluation of the robustness function, and we show that such an evaluation can be performed with essentially any desired accuracy and confidence using algorithms whose complexity is linear in the dimension of the uncertainty space. Moreover, we show that the average memory requirements of such algorithms are bounded by an absolute constant and are well within the capabilities of today's computers. In addition to efficiency, our approach permits control over both the statistical sampling error and the error due to discretization of the uncertainty radius. For a given tolerance on the discretization error, our techniques provide an efficiency improvement over conventional methods that is inversely proportional to the accuracy level; i.e., our algorithms get better as the demands for accuracy increase.
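The abstract does not describe the sampling algorithms themselves; the following Python snippet is only a rough illustrative sketch of the general idea of Monte Carlo robustness estimation at a fixed uncertainty radius, with the sample size chosen from the standard Hoeffding/Chernoff bound so that the estimate meets a prescribed accuracy and confidence. The predicate is_robust, the uniform sampling over a hyper-box, and all parameter names are assumptions made for illustration, not the paper's actual construction.

import math
import random

def sample_size(eps, delta):
    # Hoeffding/Chernoff bound for a Bernoulli proportion: with
    # N >= ln(2/delta) / (2 * eps^2) samples, the empirical proportion
    # lies within eps of the true probability with confidence 1 - delta.
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def estimate_robustness(is_robust, dim, radius, eps=0.01, delta=0.001, rng=random):
    # Monte Carlo estimate of the probability that the uncertain system
    # is robust at the given uncertainty radius.  Each sample is drawn
    # uniformly from the hyper-box [-radius, radius]^dim (an assumed
    # uncertainty set); the per-sample cost is linear in dim, and memory
    # use is constant because samples are generated and discarded one at a time.
    n = sample_size(eps, delta)
    successes = 0
    for _ in range(n):
        q = [rng.uniform(-radius, radius) for _ in range(dim)]
        if is_robust(q):
            successes += 1
    return successes / n

# Purely illustrative usage with a toy robustness predicate.
if __name__ == "__main__":
    p_hat = estimate_robustness(lambda q: sum(x * x for x in q) < 1.0,
                                dim=10, radius=0.5, eps=0.02, delta=0.01)
    print("estimated robustness probability: %.3f" % p_hat)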
Xinjia Chen
Kemin Zhou
Jorge L. Aravena