entropy

(noun)

A measure of the amount of information, or unpredictability (noise), present in a signal.
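
One standard way to make this precise is the Shannon entropy (an assumed formalization; the entry itself does not name a formula). For a signal whose symbols occur with probabilities p_i:

    H = -\sum_i p_i \log_2 p_i

A perfectly predictable signal has H = 0, while a maximally noisy (uniformly random) signal maximizes H.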

Examples of entropy in the following topics:

  • Defining and Measuring Income Inequality

    • Theil Index: The Theil Index takes a slightly different approach from the other measures, identifying entropy within the system.
    • Entropy in this context differs from the thermodynamic kind; here it refers to the amount of noise, or deviation from par, in the income distribution.
    • When there is perfect equality, maximum entropy occurs because earners cannot be distinguished by their incomes.
    • The gap between the two entropies (maximum and observed) is called redundancy, which acts as a negative entropy measure in the system.
    • Comparing these gaps and inequality levels (high entropy versus high redundancy) is the basic premise behind the Theil Index; a computational sketch follows this list.
  • Measurement Problems

    • This criticism applies across most poverty measurement systems (Theil entropy, the 20:20 ratio, and the Palma ratio, to name a few), and ultimately implies that much of what is measured as inequality does not take into account absolute gains.
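
A minimal computational sketch of the entropy-based reading of the Theil Index, assuming Python with NumPy (the function name theil_index and the example incomes are illustrative, not from the source): the index is computed as the redundancy, i.e. the gap between the maximum entropy ln(N) reached under perfect equality and the observed entropy of income shares.

```python
import numpy as np

def theil_index(incomes):
    """Theil index computed as redundancy: the gap between the maximum
    possible entropy ln(N) (perfect equality) and the observed Shannon
    entropy of income shares."""
    x = np.asarray(incomes, dtype=float)
    shares = x / x.sum()
    nonzero = shares[shares > 0]           # 0 * ln(0) treated as 0
    observed_entropy = -np.sum(nonzero * np.log(nonzero))
    max_entropy = np.log(len(x))           # entropy under perfect equality
    return max_entropy - observed_entropy  # redundancy = Theil index

# Perfect equality yields 0; full concentration approaches ln(N).
print(theil_index([10, 10, 10, 10]))   # 0.0
print(theil_index([0, 0, 0, 100]))     # ln(4) ≈ 1.386
```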
