Entropy is a noun denoting a measure of disorder or randomness in a system, used chiefly in thermodynamics and information theory. It quantifies the number of microscopic configurations consistent with a given macroscopic state and tends to increase as a system evolves toward more probable states. In everyday use, it signals a drift toward less order and predictability.
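For illustration, the two standard formulas behind these senses (general textbook relations, not specific to this entry) are Boltzmann's relation in statistical mechanics,

S = k_B \ln W,

which gives the entropy of a macrostate compatible with W equally likely microstates (k_B being Boltzmann's constant), and Shannon's entropy in information theory,

H = -\sum_i p_i \log_2 p_i,

which measures, in bits, the average unpredictability of a source whose outcomes occur with probabilities p_i.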