Proposal new area of study by connecting between information theory and Weber-Fechner law (1002.3909v7)

Published 22 Feb 2010 in cs.IT and math.IT

Abstract: Roughly speaking, information theory deals with data transmitted over a channel such as the internet. Modern information theory is generally considered to have been founded in 1948 by Shannon in his seminal paper, "A mathematical theory of communication." Shannon's formulation was an immediate success with communications engineers. Shannon defined mathematically the amount of information transmitted over a channel; this amount is not the number of symbols in the data but depends on the occurrence probabilities of those symbols. Psychophysics, meanwhile, is the study of quantitative relations between psychological and physical events, or more specifically between sensations and the stimuli that produce them. At first glance, Shannon's information theory bears no relation to the psychophysics established by the German scientist and philosopher Fechner. Here I show that it is, surprisingly, possible to combine the two fields, so that perceptions of physical stimuli governed by the Weber-Fechner law can be measured mathematically. I define a new concept of entropy, and as a consequence a new field of study emerges.
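The abstract leans on two standard formulas: Shannon entropy, which weights symbols by their occurrence probabilities rather than counting them, and the Weber-Fechner law, which relates perceived sensation logarithmically to stimulus intensity. The sketch below only illustrates these two textbook quantities; the function names, the constant k, and the example probabilities are illustrative assumptions, not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbols with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def weber_fechner_sensation(intensity, threshold, k=1.0):
    """Weber-Fechner law: perceived sensation S = k * ln(I / I0),
    where I0 is the detection threshold and k is a constant."""
    return k * math.log(intensity / threshold)

# Entropy depends on the probabilities, not on the number of symbols:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (uniform)
print(shannon_entropy([0.9, 0.05, 0.03, 0.02]))   # ~0.62 bits (skewed)

# A stimulus 100x above threshold is perceived as only ~4.6 units stronger:
print(weber_fechner_sensation(100.0, 1.0))
```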

Citations (1)
