Robust artificial neural networks and outlier detection. Technical report
Mathematics – Optimization and Control
Scientific paper
2011-10-02
Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares criterion, in which half of the data are treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods have become well established in linear regression, but have only recently been applied to nonlinear regression. In this work, we examine the problem of fitting artificial neural networks to contaminated data using the least trimmed squares criterion. We introduce a penalized least trimmed squares criterion, which prevents unnecessary removal of valid data. Training ANNs under this criterion leads to a challenging non-smooth global optimization problem. We compare the efficiency of several derivative-free optimization methods in solving it, and show that our approach identifies the outliers correctly when ANNs are used for nonlinear regression.
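The abstract does not spell out the criterion, but the standard least trimmed squares objective it refers to minimizes the sum of the h smallest squared residuals, with h roughly half the sample size. The sketch below only illustrates that idea and is not the authors' implementation: the one-hidden-layer network, the mlp_predict and lts_loss helpers, the synthetic data, and the choice of scipy.optimize.differential_evolution as a stand-in derivative-free global optimizer are all assumptions, and the penalized variant introduced in the paper is not shown.

```python
import numpy as np
from scipy.optimize import differential_evolution

def mlp_predict(w, X, n_hidden=5):
    """One-hidden-layer tanh network; w packs all weights and biases (illustrative)."""
    n_in = X.shape[1]
    W1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
    W2 = w[n_in * n_hidden + n_hidden : -1]
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def lts_loss(w, X, y, h):
    """Least trimmed squares: sum of the h smallest squared residuals (non-smooth in w)."""
    r2 = (y - mlp_predict(w, X)) ** 2
    return np.sort(r2)[:h].sum()

# Synthetic data with gross outliers planted in 30% of the responses (assumed example).
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)
outliers = rng.choice(200, size=60, replace=False)
y[outliers] += rng.uniform(5, 10, size=60)

n_hidden = 5
dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1   # number of network parameters
h = int(0.5 * len(y))                                   # trim half of the data

# Derivative-free global search over the non-smooth trimmed objective.
result = differential_evolution(
    lts_loss, bounds=[(-5, 5)] * dim, args=(X, y, h),
    maxiter=300, seed=0, polish=False)

# Points with the largest residuals under the fitted network are the candidate outliers.
residuals = (y - mlp_predict(result.x, X, n_hidden)) ** 2
flagged = np.argsort(residuals)[h:]
```

Because the trimmed sum ignores the largest residuals, a search of this kind can track the underlying curve despite heavily corrupted responses; the observations excluded from the trimmed sum are then reported as outliers.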
Beliakov Gleb
Kelarev Andrei
Yearwood John