Claude Shannon
by RS (admin@creationpie.com)


1. Claude Shannon
Book: The Mathematical Theory of Communication
Claude Shannon (Information scientist)

Claude Shannon's master's thesis showed how Boolean algebra could be used to design relay switching circuits, the building blocks of programmable digital computers. He later founded the modern field of information and communication theory while working at Bell Labs, the research arm of the telephone company AT&T, and he worked on encryption technologies during World War II. For Shannon, "information" meant the statistical properties of data transmitted from one place to another, not the meaning of that data.

2. Claude Shannon and entropy
In the 1940s, Claude Shannon (1916-2001) needed a name for a measure of the distributions of discrete data that formed the basis of what he called "information" (in a statistical sense).

Shannon's data had discrete values while Ludwig Boltzmann's formulas in statistical mechanics were continuous, but the equations had a similar form, so Shannon adopted Boltzmann's term "entropy".
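In symbols, the entropy of a discrete distribution with probabilities p1, ..., pn is H = -(p1*log2(p1) + ... + pn*log2(pn)), measured in bits, while the Boltzmann/Gibbs form is S = -k times the sum of p*ln(p): the same shape up to a constant and the base of the logarithm. A minimal sketch in Python (the coin distributions are invented examples, not from the text):

    import math

    def shannon_entropy(probabilities):
        # Entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # a fair coin carries 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))  # a biased coin carries only about 0.47 bits

Entropy is largest when all outcomes are equally likely; any bias in the distribution lowers it.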

More: Do not be shy and embarrassed to turn away from entropy

3. Claude Shannon and information
Claude Shannon wrote his master's thesis in 1937. It showed how to design the relay switching circuits of a working digital computer; no such computer had ever been built before.
In 1948, Shannon used the term "information" to describe the statistical properties of data transmitted from one place to another. He was working for the telephone company AT&T at Bell Labs on improving the quality of phone calls. He founded the modern field of (statistical) information theory; he was not overly concerned with the meaning of the "data" that he called "information".

Previously, the term "information" was used in the sense of the verb "to inform". That is, one was "informed" that such and such was true; the content of that informing is what we now call "information".

More: The full assurance of information

4. Claude Shannon and encryption
Claude Shannon (of the noisy-channel coding theorem) founded the field of information theory and was interested in the statistical properties of preserving message integrity. He worked for Bell Labs, and the phone company wanted ways to send messages with minimal distortion or loss of the original signal.
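The noisy-channel coding theorem says that, as long as the transmission rate stays below the channel's capacity, added redundancy can make the error rate as small as desired. A minimal sketch of the idea in Python, using a 3-fold repetition code with majority voting (chosen for simplicity; it is not Shannon's construction):

    import random

    def encode(bits):
        # Repeat each bit 3 times: [1, 0] -> [1, 1, 1, 0, 0, 0].
        return [b for b in bits for _ in range(3)]

    def noisy_channel(bits, flip_prob=0.1):
        # Flip each bit independently with probability flip_prob.
        return [b ^ (random.random() < flip_prob) for b in bits]

    def decode(bits):
        # Majority vote over each group of 3 received bits.
        return [int(sum(bits[i:i+3]) >= 2) for i in range(0, len(bits), 3)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    received = noisy_channel(encode(message))
    print(decode(received))  # usually recovers the original message

The decoder fails on a group only when 2 or 3 of its bits flip, which is much less likely than a single flip.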

His 1937 master's thesis showed how to design the circuits of a working digital computer; none had been built before. Alan Turing did related theoretical work in 1936.

During World War II, Claude Shannon worked on message encryption and helped develop a secure voice system (using OTP (One Time Pad) style encryption) that allowed Franklin Roosevelt and Winston Churchill to talk to each other. One requirement was that they not sound like Mickey Mouse (or Donald Duck, etc.).
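A one-time pad combines each symbol of the message with a truly random key of the same length that is never reused; in modern byte terms this is typically an XOR, and applying the same pad a second time recovers the message. A minimal sketch in Python (the message is an invented example, not from the wartime system):

    import secrets

    def otp_xor(data, pad):
        # XOR each message byte with the matching pad byte;
        # the same call with the same pad decrypts.
        assert len(pad) >= len(data), "the pad must be at least as long as the message"
        return bytes(m ^ k for m, k in zip(data, pad))

    message = b"attack at dawn"
    pad = secrets.token_bytes(len(message))  # use once, then destroy
    ciphertext = otp_xor(message, pad)
    print(otp_xor(ciphertext, pad))  # b'attack at dawn'

With a truly random, never-reused pad, the ciphertext gives an eavesdropper no statistical information about the message.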

Now that the field of information theory, and the information age, is maturing, it has become clear that Aristotle and John the Apostle both had ways of thinking that are in line with the thinking and problem solving of (many) modern computer scientists.

5. Matthew 5: Claude Shannon
Book: The Information
Digital communication uses bits, where 1 and 0 can stand for true and false or, for example, "yes" and "no".
Matthew 5:37 But let your communication be, Yea, yea; Nay, nay: for whatsoever is more than these cometh of evil. [kjv]

The verse Matthew 5:37 was used on presentation slides by the founder of modern information theory, Claude Shannon, as he presented his new theory of information to interested groups.

Source: Gleick's book The Information: A History, a Theory, a Flood.

More: James Gleick

6. Bit as binary digit
Classical bit
John Tukey (American mathematician and statistician), working with John von Neumann, coined the term "bit" as a contraction of "binary digit". The classical "bit" has one of two values, which can be represented as 0 (usually taken as false) and 1 (usually taken as true).

The term "bit" was first used in an article by Claude Shannon in 1948.

More: John Tukey
