Computer Science – Information Theory
Scientific paper
2010-02-22
8 pages, no figures
Roughly speaking, information theory deals with data transmitted over a channel such as the Internet. Modern information theory is generally considered to have been founded in 1948 by Shannon in his seminal paper, "A mathematical theory of communication," and his formulation was an immediate success with communications engineers. Shannon defined mathematically the amount of information transmitted over a channel; this amount is not the number of symbols in the data but depends on the occurrence probabilities of those symbols. Psychophysics, meanwhile, is the study of quantitative relations between psychological events and physical events or, more specifically, between sensations and the stimuli that produce them. Shannon's information theory appears to bear no relation to the psychophysics established by the German scientist and philosopher Fechner. Here I show that, surprisingly, it is possible to combine the two fields, so that we become capable of measuring mathematically the perception of physical stimuli governed by the Weber-Fechner law. I define a new concept of entropy, and as a consequence a new field of study begins.
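The two quantities the abstract proposes to connect can each be stated in a few lines. The following sketch (not from the paper; the function names and constants are illustrative assumptions) computes Shannon entropy from symbol probabilities and perceived sensation from the standard Weber-Fechner form S = k ln(I / I0):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)).

    Depends only on the occurrence probabilities of the symbols,
    not on how many symbols are transmitted.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def weber_fechner(intensity, threshold, k=1.0):
    """Perceived sensation per the Weber-Fechner law: S = k * ln(I / I0),
    where I0 is the detection threshold and k is an empirical constant."""
    return k * math.log(intensity / threshold)

# A fair coin carries more information per toss than a biased one.
h_fair = shannon_entropy([0.5, 0.5])     # 1.0 bit
h_biased = shannon_entropy([0.9, 0.1])   # about 0.469 bits

# Sensation grows logarithmically: a tenfold increase in stimulus
# intensity adds the same fixed increment of sensation each time.
s1 = weber_fechner(10, 1)    # k * ln(10)
s2 = weber_fechner(100, 1)   # k * ln(100) = 2 * s1
```

Both formulas are logarithmic in their arguments, which is the structural resemblance the paper builds on.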