Claude Shannon


1. Claude Shannon
Book: The Mathematical Theory of Communication
Claude Shannon (Information scientist)

Claude Shannon's 1937 master's thesis showed how Boolean algebra could be implemented with electrical relay and switching circuits - the theoretical foundation on which digital computers were later designed. He went on to found the modern field of information and communication theory, and he worked on encryption technologies during World War II.

2. Claude Shannon and entropy
In the 1940's, Claude Shannon (1916-2001) needed a name for the quantity he had defined on discrete probability distributions - the measure that formed the basis of what he called "information" (in a statistical sense).

Shannon's data took discrete values while Boltzmann's formulas were continuous, but the equations had a similar form, so he adopted the term "entropy".
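As a minimal sketch (the formula is the standard textbook form, not taken from this page, and the function name is my own), Shannon's entropy for a discrete distribution, H = -sum of p*log2(p), can be computed directly:

    # Illustrative sketch: Shannon entropy, in bits, of a discrete
    # probability distribution: H = -sum(p * log2(p)).
    import math

    def shannon_entropy(probabilities):
        """Return the entropy, in bits, of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0: a fair coin is maximally uncertain
    print(shannon_entropy([0.9, 0.1]))  # about 0.469: a biased coin carries less

The Gibbs-Boltzmann entropy of statistical mechanics, S = -k * sum of p*ln(p), has the same shape, which is why the name carried over.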
More: Do not be shy and embarrassed to turn away from entropy

3. Claude Shannon and information
Claude Shannon wrote his master's thesis in 1937 (published in 1938). It showed how relay switching circuits could implement Boolean logic - the basis for a working digital computer, none of which had ever been built before.
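As a minimal sketch of that idea (my own illustration, not Shannon's notation): switches in series behave as a logical AND, and switches in parallel behave as a logical OR.

    # Illustrative sketch: relay circuits implement Boolean algebra.
    # A closed switch is True, an open switch is False.
    def series(a, b):
        """Current flows only if both switches are closed: logical AND."""
        return a and b

    def parallel(a, b):
        """Current flows if either switch is closed: logical OR."""
        return a or b

    print(series(True, False))    # False
    print(parallel(True, False))  # True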

Claude Shannon, in his 1948 paper A Mathematical Theory of Communication, used the term "information" to describe the statistical properties of data transmitted from one place to another. He was working at Bell Labs, the research arm of the telephone company AT&T, to improve the quality of phone calls. He founded the modern field of (statistical) information theory - he was not overly concerned with the meaning of the "data" that he called "information".
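As a minimal sketch of that statistical, meaning-free sense of "information" (the formula is standard; the function name is my own), the information carried by a message depends only on its probability, not on what it says:

    # Illustrative sketch: Shannon's measure of the information in a single
    # message of probability p is -log2(p) bits - meaning plays no role.
    import math

    def surprisal_bits(probability):
        """Bits of information conveyed by an event of the given probability."""
        return -math.log2(probability)

    print(surprisal_bits(0.5))     # 1.0 bit: a fair coin flip
    print(surprisal_bits(1 / 26))  # about 4.7 bits: one uniformly random letter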

Previously, the term "information" was used in the sense of the verb "to inform": that is, one is "informed" that such and such is true - what we now call "information".
More: The full assurance of information

4. Claude Shannon
Book: The Mathematical Theory of Communication
The Information
Claude Shannon, the founder of modern information theory, used the verse Matthew 5:37 on presentation slides as he presented his new theory of information to interested groups.
Matthew 5:37 But let your communication be, Yea, yea; Nay, nay: for whatsoever is more than these cometh of evil. [kjv]
λογος ναι ου πονηρου … [gnt] (word, yea, nay, evil)

Source: Gleick's book The Information: A History, a Theory, a Flood.
More: James Gleick

5. Matthew 5:37
KJV: But let your communication be, Yea, yea; Nay, nay: for whatsoever is more than these cometh of evil.
Greek: εστω δε ο λογος υμων ναι ναι ου ου το δε περισσον τουτων εκ του πονηρου εστιν

6. Bit as binary digit
Classical bit: John Tukey (American mathematician and statistician), working with John von Neumann, coined the term "bit" as a contraction of "binary digit". The classical "bit" has one of two values, which can be represented as 0 (usually taken as false) and 1 (usually taken as true).
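As a minimal sketch (my own illustration; the function name is hypothetical), any non-negative integer can be written as a string of such binary digits:

    # Illustrative sketch: a classical bit is a binary digit, 0 or 1;
    # eight of them, most significant first, represent one byte.
    def to_bits(n, width=8):
        """Return the bits of a non-negative integer, most significant first."""
        return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

    print(to_bits(5))    # [0, 0, 0, 0, 0, 1, 0, 1]
    print(to_bits(255))  # [1, 1, 1, 1, 1, 1, 1, 1]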

The term "bit" was first used in an article by Claude Shannon in 1948.
More: John Tukey

7. End of page

by RS admin@creationpie.com