Home Events Prophecies Seeking Bible Index | (upd: 1/14/09)
The Probabilities of Accidental Creation -vs- Complexity
We are all familiar with the idea that "if we keep on trying," we may eventually "get it right." But what are the mathematical probabilities of random, accidental creation versus complexity? What would it take to create something complex from random accidents? What are the odds (chances) that something useful and complex ever comes into being as the result of random accidents?
Complexity .. Measuring Complexity .. Quantifying Complexity .. the Mathematics that describes Complexity
Typical statements involving concepts and degrees of complexity:

Simple:  Something that is very simple (not complex) should happen easily or frequently.
Complex: Something that is very complex will probably never happen randomly or accidentally.

Simple:  The odds are high that it could happen.
Complex: The odds are very low that it will ever happen; the chances of it ever happening are very low.

Simple:  It happens every few "tries." For example, flipping 3 pennies all at once comes up (from left to right) "heads, heads, tails" on the average of once every 8 times (once every 8 tries), because 8 tries = 2 x 2 x 2 = 2^3 = 2^(3 pennies) = 2^(3 bits of info).
Complex: The odds / chances of it happening are so low that it won't happen in a "billion years" -- so low that I would not bet any money on it, nor my future, nor my eternity.
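The 3-penny claim above is easy to check with a short simulation (a sketch; the pattern "heads, heads, tails", the seed, and the trial count are just illustrative choices):

```python
import random

def average_tries(pattern, trials=20_000, seed=1):
    """Estimate the average number of 3-penny flips needed before the
    pennies come up exactly as `pattern` (left to right)."""
    rng = random.Random(seed)
    total_flips = 0
    for _ in range(trials):
        while True:
            total_flips += 1
            flips = tuple(rng.choice("ht") for _ in range(len(pattern)))
            if flips == pattern:
                break
    return total_flips / trials

avg = average_tries(("h", "h", "t"))
print(f"average tries: {avg:.2f} (theory: 2^3 = 8)")
```

The estimated average comes out very close to the theoretical 2^3 = 8 tries.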
From a "theory of evolution" perspective, everything that happens must be a truly random event or accident (with no plan, rhyme, reason, help, or intelligence from God). So let's rely on a purely mathematical analysis of what happens. (i.e., let's rely on pure science; there is nothing religious about it.)
An easy place to start is to measure the complexity of something very complex, like the Windows XP operating system.
By right-clicking on C:\WINDOWS and selecting "Properties", my PC tells me that my Windows XP operating system has a size of about 2.5 GB. Since each byte contains 8 "bits of information", we can say that about 20 billion "bits of information" are required to describe the complexity of this Windows XP operating system. ( 2.5 GB = 2.5 GigaBytes =~ 2.5 billion bytes; 2.5 billion bytes times 8 bits per byte = 20 billion "bits of information." )
We can conclude, with as close to absolute certainty as we can imagine, that this 20 billion "bits of information" is not going to be randomly or accidentally duplicated (or come into being that way).
( Question to ponder: Why is it that we can believe there is no way Windows XP could be created by random accidents, while some of the same individuals believe the human brain "evolved" via a series of random accidents? )
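The byte-to-bit arithmetic above is easy to reproduce; a sketch (the 2.5 GB figure is the size reported in the text):

```python
import math

size_bytes = 2.5e9          # ~2.5 GB reported for C:\WINDOWS in the text
bits = size_bytes * 8       # 8 "bits of information" per byte
print(f"{bits:.0f} bits")   # 20,000,000,000 (20 billion)

# The number of random tries needed would be 2^bits -- a number with
# about bits * log10(2) digits:
zeros = bits * math.log10(2)
print(f"2^(20 billion) is a 1 followed by about {zeros:,.0f} zeros")
```

That last figure, roughly 6 billion zeros, is why the text calls random duplication of Windows XP as close to impossible as we can imagine.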
Now, using a very simple example, let's explore how difficult it would be to accidentally, or randomly, create something that has any significant degree of complexity. Suppose we try to create the 7-letter word "example" by flipping 56 pennies all at once, one penny per bit, where tails represents a 0 and heads represents a 1.
The table below describes this structure, as well as the value of each of the 56 "bits of information" needed to represent our 7-letter word "example".
7-letter word to create:  "example"

letters / bytes:                     e         | x         | a         | m         | p         | l         | e          = 7 letters/bytes
value of bits (8 bits per byte):     0110 0101 | 0111 1000 | 0110 0001 | 0110 1101 | 0111 0000 | 0110 1100 | 0110 0101  = 56 "bits of information"
heads/tails (h/t):                   thht thth | thhh httt | thht ttth | thht hhth | thhh tttt | thht hhtt | thht thth  = 56 "bits of information"
Each "bit of information" has a required value of 0 or 1, shown in the table above.
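The bit patterns in the table are just the standard ASCII codes of each letter; a few lines of Python (a sketch) reproduce both rows:

```python
def penny_encoding(word):
    """Return (binary, heads/tails) rows for `word`, using 8 bits (pennies)
    per letter, grouped in nibbles; tails = 0, heads = 1."""
    nibbles = [f"{ord(c):08b}" for c in word]
    binary = " | ".join(b[:4] + " " + b[4:] for b in nibbles)
    ht = binary.replace("0", "t").replace("1", "h")
    return binary, ht

binary, ht = penny_encoding("example")
print(binary)  # 0110 0101 | 0111 1000 | ...
print(ht)      # thht thth | thhh httt | ...
```

The output matches the table above, letter for letter.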
Our next question is likely to be: on the average, how many times do we have to flip all 56 pennies to get a match? The answer is 2^(56 bits) = 72,057,594,037,927,900 times (to 15-digit accuracy). Another way of expressing this is: "the odds of getting a match on the next try are 1 in 72,057,594,037,927,900."
Let's ask still another question. If the 56 pennies were all flipped once every second, how long, on the average, would elapse between matches? (i.e., how long would it take, on the average, to get a match?)
The answer is: 72,057,594,037,927,900 flips, at one per second = 72,057,594,037,927,900 seconds. Divide by 86,400 seconds per day, and then by 365.25 days per year, to get 2,283,367,368 years.
( That's 2.28 billion years. ) (2.28 E+9 years)
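The arithmetic above can be checked directly; the exact value of 2^56 is shown below (the text rounds it to 15 significant digits):

```python
tries = 2 ** 56                       # one try per possible pattern of 56 pennies
print(f"{tries:,}")                   # 72,057,594,037,927,936 exactly

seconds_per_year = 86_400 * 365.25    # 86,400 s/day * 365.25 days/year
years = tries / seconds_per_year      # at one 56-penny flip per second
print(f"{years:,.0f} years")          # about 2.28 billion years
```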
What if we want to increase the complexity of what is going to be randomly (accidentally) created?
A simple rule of thumb is: for each 10 "bits of information" we start with or add to the complexity of something, we must add 3 more zeros to the number of times all the pennies have to be flipped to get a match.
( Note: to be exact, instead of adding 3 more zeros (which is the same as multiplying by 1,000), we must multiply by 1,024, because 2^10 is exactly 1,024. )
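The rule of thumb above is just the fact that 2^10 = 1,024 =~ 1,000; a quick check:

```python
# Every extra 10 bits multiplies the number of tries by 2^10 = 1,024,
# i.e. adds roughly 3 more zeros (a factor of about 1,000).
for bits in (10, 20, 30, 40):
    print(f"{bits:2d} bits -> {2 ** bits:>16,} tries")
assert 2 ** 40 == (2 ** 30) * 1024
```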
What are the chances that human DNA came into being by random accidents?
According to Wikipedia, the human genome is the genome of Homo sapiens, which is stored on 23 chromosome pairs. The haploid human genome occupies a total of just over 3 billion DNA base pairs, and has a data size of approximately 750,000,000 bytes.
The resulting odds are 1 in "1 followed by 27,092,700 zeros" ( = 1 E+27,092,700 ).
( Note: Above, we have been focusing on "bits of information" that represent the genetic code for all human beings. However, in DNA detective work, we would focus on the differences in the human genetic code that would give us a unique DNA match or signature for each individual.
By taking our original result of 1 in "1 followed by 27,092,700 zeros" and subtracting the exponent of "1 followed by 9 zeros", we wind up with 1 in "1 followed by 27,092,691 zeros" (i.e., with 9 fewer zeros).
Consequently, we can easily see that the DNA differences between humans are very, very small when compared to the DNA similarities between humans. Therefore, our original conclusions are unaffected by these DNA differences. )
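The exponent arithmetic in the note above (dividing powers of ten subtracts exponents) looks like this; both exponents are taken from the text:

```python
total_zeros = 27_092_700   # odds exponent for the full genetic code (from the text)
diff_zeros = 9             # exponent attributed to person-to-person DNA differences
# (1 E+total_zeros) / (1 E+diff_zeros) = 1 E+(total_zeros - diff_zeros)
print(f"1 E+{total_zeros - diff_zeros:,}")  # 1 E+27,092,691
```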
As shown in the various analyses above, mathematically speaking, the probability of something very complex being randomly (accidentally) created is as improbable as we can imagine or fathom.
Stated another way: It is mathematically totally improbable that anything very complex ever came into being by accident or by any other random happening.
" I would not bet any money on it, nor my future, nor my eternity " ?
The table below enables us to quickly convert the complexity of something (measured in "bits of information") to the average number of tries / attempts needed to randomly (accidentally) create something of that complexity. In this table, binary (computer) terminology is used, where: 1 KiB = 1,024 bytes (2^10 bytes), 1 MiB = 1,024 KiB (2^20 bytes), 1 GiB = 1,024 MiB (2^30 bytes), 1 TiB = 1,024 GiB (2^40 bytes), and each byte is 8 bits.
Average number of tries / attempts needed to randomly (accidentally) create something of a certain complexity.

Complexity (in "bits of information", as measured by the value in one of the columns below) | Average number of tries / attempts needed to randomly (accidentally) create something of the complexity shown in the columns to the left = 2^(# bits) |

# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) |
1 | 2 E+0 | 2 | |||||
2 | 4 E+0 | 4 | |||||
3 | 8 E+0 | 8 | |||||
4 | 1.6 E+1 | 16 | |||||
5 | 3.2 E+1 | 32 | |||||
6 | 6.4 E+1 | 64 | |||||
7 | 1.28 E+2 | 128 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
1 | 8 | 2.56 E+2 | 256 | ||||
9 | 5.12 E+2 | 512 | |||||
10 | 1.024 E+3 | 1,024 | |||||
2 | 16 | 6.554 E+4 | 65,536 | ||||
20 | 1.049 E+6 | 1,048,576 | |||||
3 | 24 | 1.678 E+7 | 16,777,216 | ||||
30 | 1.074 E+9 | 1,073,741,824 | |||||
4 | 32 | 4.295 E+9 | 4,294,967,296 | ||||
5 | 40 | 1.100 E+12 | 1,099,511,627,776 | ||||
6 | 48 | 2.815 E+14 | 281,474,976,710,656 | ||||
50 | 1.126 E+15 | 1,125,899,906,842,620 | |||||
7 | 56 | 7.206 E+16 | 72,057,594,037,927,900 | ||||
60 | 1.153 E+18 | 1,152,921,504,606,850,000 | |||||
8 | 64 | 1.845 E+19 | 18,446,744,073,709,600,000 | ||||
70 | 1.181 E+21 | ( Note: Results above are only accurate to 15 digits; hence, the trailing zeros. ) |
9 | 72 | 4.722 E+21 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
10 | 80 | 1.209 E+24 | |||||
11 | 88 | 3.095 E+26 | |||||
90 | 1.238 E+27 | ||||||
12 | 96 | 7.923 E+28 | |||||
100 | 1.268 E+30 | ||||||
20 | 160 | 1.462 E+48 | |||||
30 | 240 | 1.767 E+72 | |||||
40 | 320 | 2.136 E+96 | |||||
50 | 400 | 2.582 E+120 | |||||
60 | 480 | 3.122 E+144 | |||||
70 | 560 | 3.774 E+168 | |||||
80 | 640 | 4.562 E+192 | |||||
90 | 720 | 5.516 E+216 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
100 | 800 | 6.668 E+240 | |||||
200 | 1,600 | 4.446 E+481 | |||||
300 | 2,400 | 2.965 E+722 | |||||
400 | 3,200 | 1.977 E+963 | |||||
500 | 4,000 | 1.318 E+1,204 | |||||
600 | 4,800 | 8.790 E+1,444 | |||||
700 | 5,600 | 5.861 E+1,685 | |||||
800 | 6,400 | 3.908 E+1,926 | |||||
900 | 7,200 | 2.606 E+2,167 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
1,000 | 8,000 | 1.738 E+2,408 | |||||
1 | 1,024 | 8,192 | 1.091 E+2,466 | ||||
2 | 2,048 | 1.190 E+4,932 | |||||
3 | 3,072 | 1.298 E+7,398 | |||||
4 | 4,096 | 1.415 E+9,864 | |||||
5 | 5,120 | 1.544 E+12,330 | |||||
6 | 6,144 | 1.684 E+14,796 | |||||
7 | 7,168 | 1.837 E+17,262 | |||||
8 | 8,192 | 2.004 E+19,728 | |||||
9 | 9,216 | 2.185 E+22,194 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
10 | 2.384 E+24,660 | ||||||
20 | 5.682 E+49,320 | ||||||
30 | 1.354 E+73,981 | ||||||
40 | 3.228 E+98,641 | ||||||
50 | 7.695 E+123,301 | ||||||
60 | 1.834 E+147,962 | ||||||
70 | 4.372 E+172,622 | ||||||
80 | 1.042 E+197,283 | ||||||
90 | 2.484 E+221,943 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
100 | 5.922 E+246,603 | ||||||
200 | 3.507 E+493,207 | ||||||
300 | 2.077 E+739,811 | ||||||
400 | 1.230 E+986,415 | ||||||
500 | 7.282 E+1,233,018 | ||||||
600 | 4.312 E+1,479,622 | ||||||
700 | 2.553 E+1,726,226 | ||||||
800 | 1.512 E+1,972,830 | ||||||
900 | 8.954 E+2,219,433 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
1,000 | 5.302 E+2,466,037 | ||||||
1 | 1,024 | 4.264 E+2,525,222 | |||||
2 | 2,048 | 1.819 E+5,050,445 | |||||
3 | 3,072 | 7.755 E+7,575,667 | |||||
4 | 4,096 | 3.307 E+10,100,890 | |||||
5 | 5,120 | 1.410 E+12,626,113 | |||||
6 | 6,144 | 6.015 E+15,151,335 | |||||
7 | 7,168 | 2.565 E+17,676,558 | |||||
8 | 8,192 | 1.094 E+20,201,781 | |||||
9 | 9,216 | 4.664 E+22,727,003 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
10 | 1.989 E+25,252,226 | ||||||
20 | 3.957 E+50,504,452 | ||||||
30 | 7.871 E+75,756,678 | ||||||
40 | 1.566 E+101,008,905 | ||||||
50 | 3.114 E+126,261,131 | ||||||
60 | 6.195 E+151,513,357 | ||||||
70 | 1.232 E+176,765,584 | ||||||
80 | 2.451 E+202,017,810 | ||||||
90 | 4.875 E+227,270,036 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
100 | 9.698 E+252,522,262 | ||||||
200 | 9.405 E+505,044,525 | ||||||
300 | 9.121 E+757,566,788 | ||||||
400 | 8.846 E+1,010,089,051 | ||||||
500 | 8.579 E+1,262,611,314 | ||||||
600 | 8.320 E+1,515,133,577 | ||||||
700 | 8.068 E+1,767,655,840 | ||||||
800 | 7.825 E+2,020,178,103 | ||||||
900 | 7.588 E+2,272,700,366 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
1,000 | 7.359 E+2,525,222,629 | ||||||
1 | 1,024 | 9.630 E+2,585,827,972 | |||||
2 | 2,048 | 9.274 E+5,171,655,945 | |||||
3 | 3,072 | 8.932 E+7,757,483,918 | |||||
4 | 4,096 | 8.601 E+10,343,311,891 | |||||
5 | 5,120 | 8.283 E+12,929,139,864 | |||||
6 | 6,144 | 7.977 E+15,514,967,837 | |||||
7 | 7,168 | 7.682 E+18,100,795,810 | |||||
8 | 8,192 | 7.398 E+20,686,623,783 | |||||
9 | 9,216 | 7.125 E+23,272,451,756 | |||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
10 | 6.862 E+25,858,279,729 | ||||||
20 | 4.708 E+51,716,559,459 | ||||||
30 | 3.230 E+77,574,839,189 | ||||||
40 | 2.217 E+103,433,118,919 | ||||||
50 | 1.521 E+129,291,398,649 | ||||||
60 | 1.044 E+155,149,678,379 | ||||||
70 | 7.161 E+181,007,958,108 | ||||||
80 | 4.913 E+206,866,237,838 | ||||||
90 | 3.371 E+232,724,517,568 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
100 | 2.313 E+258,582,797,298 | ||||||
200 | 5.351 E+517,165,594,596 | ||||||
300 | 1.238 E+775,748,391,895 | ||||||
400 | 2.863 E+1,034,331,189,193 | ||||||
500 | 6.623 E+1,292,913,986,491 | ||||||
600 | 1.532 E+1,551,496,783,790 | ||||||
700 | 3.544 E+1,810,079,581,088 | ||||||
800 | 8.197 E+2,068,662,378,386 | ||||||
900 | 1.896 E+2,327,245,175,685 | ||||||
# TiB | # GiB | # MiB | # KiB | bytes | # bits | Average number of tries / attempts = 2^(# bits) | |
1,000 | 4.386 E+2,585,827,972,983 | ||||||
1 | 1,024 | 1.776 E+2,647,887,844,335 | |||||
2 | 2,048 | 3.155 E+5,295,775,688,670 |
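Any row of the table above can be reproduced by writing 2^(# bits) in scientific notation; a sketch:

```python
import math

def tries_sci(bits):
    """Express 2**bits as (mantissa, base-10 exponent), as in the table."""
    exp10 = bits * math.log10(2)
    exponent = math.floor(exp10)
    mantissa = 10 ** (exp10 - exponent)
    return mantissa, exponent

# A few rows from the table: 56 bits, 90 bits, and 1 KiB (8,192 bits).
for bits in (56, 90, 8_192):
    m, e = tries_sci(bits)
    print(f"{bits:>6} bits -> {m:.3f} E+{e:,}")
```

This prints 7.206 E+16, 1.238 E+27, and 1.091 E+2,466, matching the corresponding table rows.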
For those of you who are familiar with using an 8-bit byte to represent 256 (2^8 = 256) printable symbols, you may be thinking that we should eliminate (compress out) the many symbols that are not letters of the English alphabet.
Since the English alphabet has only 26 letters, each letter really needs only log2(26) = 4.700439718 =~ 4.7 "bits of information" ( because 26 = 2^4.700439718 =~ 2^4.7 ).
log10( 2^(8x7) ) =~ 16 zeros (for 8x7 = 56 uncompressed bits in our 7-letter word called "example")
If we define a compression factor for 26 symbols, CF = 4.700439718 / 8 = 0.587 554 964 767 637, it fits the following equations:
log10( 2^(8x7) ) x CF = log10( 2^(4.7x7) ) =~ 9 zeros (for 4.7x7 = 32.9 compressed bits in our 7-letter "example")
i.e., 16 zeros x CF =~ 9 zeros (where CF = 0.587 554 964 767 637)
So even after compressing out the unused symbols, randomly creating our 7-letter word still requires a number of tries with about 9 zeros (billions of tries).
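The compression-factor equations above can be verified numerically; a sketch:

```python
import math

bits_per_letter = math.log2(26)        # 26 letters -> ~4.700439718 bits each
cf = bits_per_letter / 8               # compression factor ~0.587554964767637

uncompressed_zeros = math.log10(2 ** 56)   # 56 uncompressed bits -> ~16.9 zeros
compressed_zeros = math.log10(26 ** 7)     # 7 letters of 26 -> ~9.9 zeros
print(f"CF = {cf:.15f}")
print(f"{uncompressed_zeros:.2f} zeros x CF = {uncompressed_zeros * cf:.2f} "
      f"=~ {compressed_zeros:.2f} zeros")
```

The two sides agree exactly, because 56 x log10(2) x (log2(26)/8) = 7 x log10(26).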