Mathematics – Probability
A probabilistic study of neural complexity
Scientific paper
2009-08-07
Comment: minor edits
G. Edelman, O. Sporns, and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely exchangeability and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and failure of uniqueness.
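As a reading aid, the display below is a minimal sketch, not taken from the paper itself, of the general shape of such a functional: a weighted sum, over nonempty proper subfamilies S of the index set, of the mutual information between the subfamily and its complement. The notation (the weights c^n_{|S|}, the index set {1,...,n}) is assumed here for illustration; the specific weights defining neural complexity, and the precise formulations of exchangeability and additivity, are those given in the paper.

\[
  I_c(X_1,\dots,X_n) \;=\; \sum_{\emptyset \neq S \subsetneq \{1,\dots,n\}} c^{\,n}_{|S|}\,
  \mathrm{MI}\bigl(X_S \,;\, X_{S^c}\bigr),
  \qquad
  \mathrm{MI}\bigl(X_S \,;\, X_{S^c}\bigr) \;=\; H(X_S) + H(X_{S^c}) - H(X_{\{1,\dots,n\}}),
\]

where H denotes Shannon entropy. Roughly, exchangeability asks that the functional be invariant under permutations of the variables, so the weights depend on S only through its cardinality, and additivity asks that it split into a sum over independent subsystems.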
Jérôme Buzzi
Lorenzo Zambotti