entropy

(noun)

A measure of the expected amount of information contained in a message, or equivalently of the uncertainty in a random variable's outcome.
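For a discrete random variable $X$ with outcome probabilities $p(x)$, the Shannon entropy is $H(X) = -\sum_x p(x)\log_2 p(x)$, measured in bits when the logarithm is base 2. A minimal sketch of this computation (the function name `shannon_entropy` is illustrative, not from any particular library):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    `probs` is a sequence of outcome probabilities summing to 1;
    zero-probability outcomes contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of information per toss ...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ... while a heavily biased coin carries much less.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```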

Related Terms

  • cumulant
  • empirical rule

Examples of entropy in the following topics:

  • The Uniform Distribution

    • It is the maximum entropy probability distribution for a random variate $X$ under no constraint other than that $X$ is contained in the distribution's support (checked numerically in the sketch after this list).
  • Probability Histograms and the Normal Curve

    • More generally, the velocities of the particles in any system in thermodynamic equilibrium follow a normal distribution, a consequence of the maximum entropy principle.
  • The Normal Distribution

    • It is also the continuous distribution with the maximum entropy for a given mean and variance.
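Both maximum-entropy claims above can be checked numerically. The sketch below (helper names are illustrative) compares a discrete uniform distribution with a skewed one on the same support, and the differential entropy of a standard normal with that of a Laplace distribution of equal variance, using their closed-form expressions; the uniform and the normal come out highest, as the entries state.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution (probs sum to 1)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Discrete case: on a 4-point support, the uniform distribution
# has higher entropy than any skewed alternative.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))   # 2.0 bits -- the maximum on 4 outcomes
print(shannon_entropy(skewed))    # ~1.357 bits

# Continuous case: for a fixed variance, the normal distribution has
# the largest differential entropy (values here are in nats).
var = 1.0
h_normal = 0.5 * math.log(2 * math.pi * math.e * var)   # ~1.419 nats
b = math.sqrt(var / 2)        # Laplace scale giving the same variance
h_laplace = 1 + math.log(2 * b)                         # ~1.347 nats
print(h_normal > h_laplace)   # True
```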
