Weber-Fechner Law

A principle in psychophysics stating that the perceived intensity of a stimulus is proportional to the logarithm of the actual intensity.

The Weber-Fechner law, formulated by Ernst Heinrich Weber and Gustav Theodor Fechner in the 19th century, describes a fundamental property of human perception: we sense differences in stimuli logarithmically, not linearly.

The basic formula is S = k × ln(I), where S is perceived intensity, I is actual stimulus intensity, and k is a constant that depends on the sense being measured. Because perception tracks the logarithm, what matters is relative change, not absolute change:

  • The difference between 1 and 2 feels significant (a 100% increase)
  • The difference between 10 and 11 feels smaller (a 10% increase)
  • The difference between 1,000 and 1,001 is imperceptible (a 0.1% increase)
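The pattern in the list above can be checked numerically. This is a minimal sketch (the function name `perceived_intensity` and the choice k = 1 are illustrative, not part of the original formulation):

```python
import math

def perceived_intensity(i, k=1.0):
    """Weber-Fechner: perceived intensity is k times the log of actual intensity."""
    return k * math.log(i)

# Equal *ratios* produce equal perceptual steps: doubling always feels the same.
step_1_to_2 = perceived_intensity(2) - perceived_intensity(1)
step_10_to_20 = perceived_intensity(20) - perceived_intensity(10)
print(math.isclose(step_1_to_2, step_10_to_20))  # True: both equal ln(2)

# Equal *absolute differences* do not: +1 feels smaller at higher intensities.
print(perceived_intensity(2) - perceived_intensity(1))    # ln(2)   ≈ 0.693
print(perceived_intensity(11) - perceived_intensity(10))  # ln(1.1) ≈ 0.095
```

The same +1 step shrinks perceptually as the baseline grows, which is exactly what the bullet list describes.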

This law applies to brightness, loudness, weight, and, critically, numerical perception. It's why a billion doesn't "feel" a thousand times larger than a million. On your brain's internal logarithmic scale, a billion (10^9) is only 3 perceptual steps away from a million (10^6), because log(10^9) − log(10^6) = 9 − 6 = 3 (using base-10 logarithms).
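The million-versus-billion gap can be computed directly, contrasting the linear view with the logarithmic (perceptual) one:

```python
import math

million, billion = 10**6, 10**9

# Linear view: a billion really is 1,000 times a million.
print(billion / million)  # 1000.0

# Logarithmic (perceptual) view: only 3 steps apart on a base-10 scale.
print(math.log10(billion) - math.log10(million))  # 3.0
```

A factor of 1,000 collapses to a gap of 3 once you take logarithms, which is the compression the law describes.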

The Weber-Fechner law explains why visualization tools are necessary: your brain's built-in number sense runs on a logarithmic scale, so large quantities get compressed. Visual representations bypass this limitation by converting numerical magnitude into spatial or temporal magnitude, which your brain processes more linearly.

See it in action

Definitions tell you what a number is. Visualization shows you what it means.

