Entropy
1. Entropy
Have you ever heard the word "entropy" used? What does it mean?
The English word "entropy" comes from a Greek word that had an ancient meaning but was given a modern scientific meaning as a measure of the disorder of a system.
Let us cover the modern scientific use of the word and then look at what it meant in ancient Greek terms.
2. Example
We live in a world of disorder. Unless energy is appropriately and intelligently applied, that disorder does not get more ordered by random chance.
3. Order
Here is a more ordered world - the puzzle pieces appear much less randomly placed.
Note: There are only 24 possible puzzle pieces, ignoring those that can be achieved by rotating the pieces. All 24 are used in this puzzle. Once a picture is placed on the pieces, each piece then has a "correct" orientation.
Which is more likely to happen over time without intelligent intervention? How about with appropriate intelligent intervention?
The puzzle pieces apart going to the puzzle pieces together.
The puzzle pieces together going to the puzzle pieces apart.
This same phenomenon appears at the microscopic and atomic levels.
4. Clausius
Clausius had used the German term "Verwandlungsinhalt" but needed a better word. In 1865 he coined the word "entropy" from the Greek word "εντροπή", meaning a transformation, a "turning inward".
The German word "die Verwandlung" ≈ "transformation, conversion".
Thomas Young had used the modern Greek word "ενεργεί" (eh-nehr-EE) ≈ "energy", and Clausius appeared to want a similar term for what he was studying (both then started with the prefix "en-").
5. Physical entropy summary
Entropy is a measure of the disorder of a system.
The second law of thermodynamics states that in a closed system, the entropy of that system increases over time.
The name comes from the Greek meaning a "turning inward". The word was coined in 1865 by German physicist Rudolph Clausius as he needed a word to describe the "transformational content" of what he was observing/studying.
6. Thermodynamics
There are two fundamental physical laws of thermodynamics that govern the physics of the world in which we live.
The 1st law of thermodynamics says that mass-energy cannot be created or destroyed, but can change its form.
The 2nd law of thermodynamics says that the entropy of a closed system increases over time.
7. Systems
Systems that, if they existed, would violate the 2nd law:
perpetual motion machine (you cannot get a patent on such a design)
Physical entropy is the primary way in which the elusive concept of physical "time" is defined.
It turns out to be very difficult to precisely define digital "time" in a computer. That is a subject for another "time".
How does entropy relate to software development?
8. Time
Einstein had insight into the true nature of time.
People like us, who believe in physics, know that the distinction between past, present, and future is only a stubbornly persistent illusion. Albert Einstein (physicist)
That is, the laws of physics are reversible.
9. Software entropy
A software system over time can be thought of in terms of entropy. Over time, the disorder of your software, as it fits into a system, increases. It is far more likely that any change to the system will make your software stop working than work better.
Does your software ever randomly evolve into better software without a lot of intelligent effort on your part?
When your program does not work, does making random changes improve the chances of it working, or is it more likely that your program will work less well after making the changes?
The statistics are against random changes improving a program or a software system.
10. Boltzmann
Ludwig Boltzmann, an Austrian physicist, started the field of statistical physics by taking Clausius's concept of "entropy" and applying it to atomic or molecular systems involving states and state changes, using probability distributions and concepts.
11. Entropy formula
Boltzmann's equation of entropy is still used today. The logarithmic law that models this behavior has the constant kB, the Boltzmann constant, named in his honor.
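For reference, the standard form of Boltzmann's entropy formula, with W taken here (a name not defined on this page) as the number of microstates of the system:

S = k_B \ln W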
12. Claude Shannon and entropy
In the 1940s, Claude Shannon (1916-2001) needed a name for distributions of discrete data systems that formed the basis of what he called "information" (in a statistical sense). Shannon's data had discrete values while Boltzmann's formulas were continuous, but the equations had a similar form. He used the term "entropy".
13. Comparison
Boltzmann's entropy formula:
Shannon's information entropy formula:
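Written in a common notation, with W as the number of microstates and P_i as the probability of symbol i (names assumed here for illustration), the two formulas can be compared as:

S = k_B \ln W  (Boltzmann)

S = -\sum_i P_i \log_2 P_i  (Shannon)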
14. Von Neumann
In an often-told story, the famous genius John Von Neumann suggested to Shannon that he name his concept information entropy, which then formed the basis of the theory of information founded by Shannon.
Von Neumann first noticed that code could be data and data could be code in a self-referential reflexive way. For this and other reasons, modern electronic digital computers are called Von Neumann machines.
15. Entropy Formula
The following Shannon entropy formula expresses how entropy for a set of symbols is calculated.
S is the entropy
Pi is the probability of a given symbol i appearing in a sequence of symbols
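Using the names above, and the base-2 logarithm used in the calculations later on this page:

S = -\sum_i P_i \log_2 P_i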
16. Entropy of yes-no-maybe answers
Let us look at the entropy of giving a definitive "yes" or "no" answer to a given question. There are actually three possible answer values.
"yes" is a definitive answer.
"no" is a definitive answer.
"maybe" is the "bottom" or "⊥" or "?" (missing) value that may have a probability associated with it.
Note that "maybe" and "bottom" or "⊥" or "?" (missing) could be separated, but that would be a different analysis.
17. Entropy
The formula for information entropy for a single question is as follows.
In this case, the symbol p is the probability of a definitive answer, not the probability of a correct answer.
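Consistent with the table of values later on this page, the quantity computed for a single question with probability p of a definitive answer is:

S(p) = -p \log_2 p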
Note that probability ranges from 0.0 to 1.0, so that the logarithm of a probability will range from minus infinity to zero.
The negative sign on the entire quantity is there so that maximum entropy (no information, maximum uncertainty) is toward infinity and complete information (no uncertainty) is at the minimum entropy of 0.0. That is, the minus sign in the formula for information entropy makes complete information correspond to 0.0 while no information corresponds to infinity. Thus, entropy is at a maximum with no information and at a minimum, 0.0, with complete information.
18. Yes and no
Here is the entropy calculation for a definitive "yes" or "no" answer. The value of p is 100.0% or 1.0. In this case, the probability of a "maybe" is 0.0% or 0.0.
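Worked out with the formula above for p = 1.0:

S(1.0) = -1.0 \cdot \log_2(1.0) = -1.0 \cdot 0.0 = 0.0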
If a definitive answer is not given, the entropy is as follows.
Note: For convenience, this value is taken as zero.
In this case, the probability of a "maybe" is 100.0% or 1.0, so the probability of a definitive answer is 0.0% or 0.0.
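Worked out for p = 0.0, where the product 0.0 \cdot \log_2(0.0) is indeterminate (the limit as p approaches 0 is 0) and, as noted above, is taken as zero:

S(0.0) = -0.0 \cdot \log_2(0.0), taken as 0.0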
Note that by giving only a "yes" or "no" answer, the entropy of the response is minimized.
Here are the entropy calculations for a probability of a definitive "yes" or "no" answer for values from 0.0 to 1.0 in increments of 0.1, or 0% to 100% in increments of 10%.
-0.0*log2(0.0) = (zero)
-0.1*log2(0.1) = 0.332
-0.2*log2(0.2) = 0.464
-0.3*log2(0.3) = 0.521
-0.4*log2(0.4) = 0.529
-0.5*log2(0.5) = 0.500
-0.6*log2(0.6) = 0.442
-0.7*log2(0.7) = 0.360
-0.8*log2(0.8) = 0.258
-0.9*log2(0.9) = 0.137
-1.0*log2(1.0) = 0.000
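A minimal Python sketch (standard library only) that reproduces the table above; the function name entropy is chosen here for illustration:

```python
import math

# Entropy of a definitive answer with probability p: S(p) = -p * log2(p).
# The p = 0.0 case is indeterminate (0 times minus infinity) and is taken as zero.
def entropy(p):
    return 0.0 if p == 0.0 else -p * math.log2(p)

# Print the table for p = 0.0 to 1.0 in increments of 0.1.
for i in range(11):
    p = i / 10
    print(f"-{p:.1f}*log2({p:.1f}) = {entropy(p):.3f}")
```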
19. Entropy chart
Here is a chart of the above information entropy function.
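A short sketch of how such a chart could be generated, assuming the matplotlib library is available:

```python
import math
import matplotlib.pyplot as plt  # assumed to be installed

# Entropy of a definitive answer with probability p, with S(0.0) taken as 0.0.
def entropy(p):
    return 0.0 if p == 0.0 else -p * math.log2(p)

ps = [i / 100 for i in range(101)]   # probabilities 0.00 .. 1.00
ss = [entropy(p) for p in ps]        # corresponding entropy values

plt.plot(ps, ss)
plt.xlabel("probability p of a definitive answer")
plt.ylabel("entropy -p*log2(p)")
plt.title("Information entropy of a yes-no answer")
plt.show()
```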
20. Entropy of yes no answers
The entropy of giving a definitive "yes" or "no" answer to a given question can be plotted as a function of the probability of the result of a choice.
"yes" is a definitive answer.
"no" is a definitive answer.
"maybe" is the "bottom" or "⊥" or "?" (missing) value that may have a probability associated with it.
Note that when the probability is 0.5, as in a flip of a fair coin, the entropy is 0.5.
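Worked out with the formula above:

S(0.5) = -0.5 \cdot \log_2(0.5) = -0.5 \cdot (-1.0) = 0.5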
21. End of page