Computer/Information science can be defined as the search for finite representations of (potentially) infinite objects.
Is this possible? Does it make sense?
2. Dots
How many dots are red?
What is the ratio of red dots to blue dots?
3. Ratio
The ratio of red dots to blue dots is one third or 1/3.
How is this represented in decimal notation?
4. Decimal notation
The decimal notation for 1/3 is infinite: 0.3333..., with the digit 3 repeating forever.
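A minimal sketch in Python (the language and the function name are illustrative; the document itself shows no code) of why the expansion never ends: long division of 1 by 3 always leaves a remainder of 1, so the digit 3 is produced again and again.

def decimal_digits(numerator, denominator, count):
    """Return the first `count` digits of numerator/denominator after the decimal point."""
    digits = []
    remainder = numerator % denominator
    for _ in range(count):
        remainder *= 10                      # bring down the next "zero" in long division
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits

print(decimal_digits(1, 3, 10))   # [3, 3, 3, 3, 3, 3, 3, 3, 3, 3]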
How can we finitely represent this infinity?
5. Finite representation
Mathematical repetend notation, a bar drawn over the repeating digits (the repetend), can be used to show finitely which part of the expansion is "repeated" forever.
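The repetend can be found mechanically: during long division, the digits start repeating as soon as some remainder is seen a second time. Tracking the remainders therefore yields a finite representation of the infinite expansion. A sketch, assuming a simple fraction in lowest terms (the function name is illustrative):

def repetend(numerator, denominator):
    """Return (non-repeating prefix, repeating block) of the decimal expansion."""
    digits, seen = [], {}
    remainder = numerator % denominator
    while remainder and remainder not in seen:
        seen[remainder] = len(digits)          # position where this remainder first occurred
        remainder *= 10
        digits.append(str(remainder // denominator))
        remainder %= denominator
    if remainder == 0:                         # expansion terminates, nothing repeats
        return "".join(digits), ""
    start = seen[remainder]                    # the cycle begins here
    return "".join(digits[:start]), "".join(digits[start:])

print(repetend(1, 3))   # ('', '3')   -> 0.(3)
print(repetend(1, 6))   # ('1', '6')  -> 0.1(6)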
How can we show this in graph notation as might be used in computer science?
6. Graph notation
From this finite representation we could print the decimal expansion to any desired precision, as long as we eventually stop. Lazy evaluation generates digits only as they are needed.
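One way to picture the graph idea in code: the repeating decimal is stored as a finite structure with a "back edge" (the loop over the repetend), and a lazy generator walks that cycle to produce as many digits as the caller asks for. A sketch with illustrative names:

from itertools import islice

def lazy_digits(prefix, repetend):
    """Yield the decimal digits of a repeating decimal, forever."""
    yield from prefix
    while True:                        # the back edge in the graph: loop over the repetend
        yield from repetend

one_third = lazy_digits("", "3")             # 0.(3)
print("".join(islice(one_third, 10)))        # 3333333333 -- stop whenever we choose
print("".join(islice(lazy_digits("1", "6"), 10)))   # 1666666666 -- 1/6 = 0.1(6)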
Do real numbers exist? Or are they just in the human imagination as a way of approximating results?
7. A third alternative
Many people think of two choices.
1. Look and do it.
2. Look but do not do it.
A third choice is at the heart of computer science and statistics.
3. Do not look. There is a cost to looking. And it may not matter.
It may already be known what is there, or it may have been determined that it is not necessary.
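A sketch of the third choice in code (the names and the threshold are purely illustrative): the expensive "look" is performed only when the answer is not already known.

def needs_restock(known_count, count_items):
    # Only look (call the expensive count) when we do not already know the answer.
    if known_count is not None:
        return known_count < 10
    return count_items() < 10          # the costly look, performed only when necessary

def expensive_count():
    print("counting every item...")    # stands in for a slow or costly operation
    return 42

print(needs_restock(3, expensive_count))      # True  -- never looks
print(needs_restock(None, expensive_count))   # counts, then False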
8. Short-circuit conditional evaluation
This idea is used all the time in computer science and programming. A simple example is short-circuit evaluation of conditional expressions (assuming the expressions have no side effects during evaluation).
In the expression A and B, if A is false, then one need not look at B.
In the expression A or B, if A is true, then one need not look at B.
Evaluating B might even cause an error.
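A sketch of that situation in Python (the function name is illustrative): in A and B, when A is false, B is never evaluated, so the division by zero never happens.

def safe_ratio(numerator, denominator):
    # B (the division) is only looked at when A (denominator != 0) is true.
    if denominator != 0 and numerator / denominator > 1:
        return "ratio greater than one"
    return "ratio not greater than one (or undefined)"

print(safe_ratio(3, 1))   # ratio greater than one
print(safe_ratio(1, 0))   # ratio not greater than one (or undefined) -- no ZeroDivisionError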
In discussions, B may be a "red herring" that does not matter; the only thing that matters is A. Here are some tautologies often used in programming (as algebraic transformations).