Computer Science – Learning
Scientific paper
2010-07-08
Proc. 2010 Eleventh Brazilian Symposium on Neural Networks (São Bernardo do Campo, SP, Brazil, 23-28 October 2010), IEEE Computer Society
6 pages, LaTeX in IEEE conference proceedings format
We show that the sample complexity of learning a sigmoidal neural network constructed by Sontag (1992), that is, the number of labelled examples required to achieve a given misclassification error under a fixed, purely atomic input distribution, can grow arbitrarily fast: for every prescribed rate of growth there is an input distribution whose sample complexity grows at that rate, and the bound is asymptotically tight. The rate can be superexponential, a non-recursive function, and so on. We further observe that Sontag's ANN is not Glivenko-Cantelli under any input distribution having a non-atomic part.
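For orientation, the two notions in the abstract can be written out in standard learning-theoretic terms. The sketch below paraphrases textbook definitions rather than the paper's own statement, and the notation (the class C, the distribution mu, the function s) is ours:

% A sketch of the standard definitions, assuming consistent
% (empirical-risk-minimizing) learners; notation is ours, not the paper's.
% Let \mathcal{C} be the class of binary functions computed by the network,
% \mu the fixed input distribution, and c \in \mathcal{C} the target concept.

% Sample complexity under \mu: the least sample size n after which every
% hypothesis h_n \in \mathcal{C} consistent with n i.i.d. labelled examples
% has misclassification error at most \varepsilon, except with probability
% at most \delta over the draw of the sample.
s(\varepsilon, \delta; \mu)
  = \min\bigl\{ n :
      \Pr_{X_1,\dots,X_n \sim \mu^n}
        \bigl[\, \mu\{x : h_n(x) \neq c(x)\} > \varepsilon \,\bigr] \le \delta
      \ \text{for every } c \in \mathcal{C} \text{ and every consistent } h_n
    \bigr\}.

% Glivenko-Cantelli property of \mathcal{C} under \mu: the empirical measures
% of the sets A_h = \{x : h(x) = 1\} converge to their true \mu-measures
% uniformly over the class, where \mu_n = \frac{1}{n}\sum_{i=1}^{n}\delta_{X_i}
% is the empirical measure of the sample.
\sup_{h \in \mathcal{C}} \bigl|\, \mu_n(A_h) - \mu(A_h) \,\bigr|
  \xrightarrow{\ n \to \infty\ } 0 \quad \text{almost surely}.

In these terms, the paper's main point is that for Sontag's network s(\varepsilon, \delta; \mu) admits no uniform bound over input distributions: it can be made to grow at any prescribed rate by choosing a suitable purely atomic \mu, and the uniform convergence above fails entirely once \mu has a non-atomic part.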