Información

[[File:HOT word in sand.jpg|thumb|200px|"HOT" written in the sand. Someone left this information in the sand for a purpose.]]
'''Information''' is a term derived from the [[latín|Latin]] verb {{Nombre Latino2|informare}}, meaning "to give form to the mind", "to discipline", "to instruct", "to teach". Information is generally understood as knowledge or facts that one has acquired. However, in some areas of science, information is defined differently and often ambiguously.<ref>{{cita web|url=http://www.ime.usp.br/~is/ddt/mac333/aulas/tema-11-24mai99.html|título=MAC 333 - A Revolução Digital e a Sociedade do Conhecimento  - Tema 11 - O que é Informação? Como ela age?|autor=Simon, Imre|trans_título=MAC 333 - La revolución digital y la Sociedad del Conocimiento - Tema 11 - ¿Qué es la información? ¿Cómo funciona?|fechaacceso=31 de julio 2013}}</ref><ref group=nota>[[:en:Wikipedia:Imre Simon|Dr. Imre Simon]], who made this reference, was a well-known Hungarian-born Brazilian mathematician and computer scientist.</ref>
  
For [[ciencia creacionista|creation science]], it is information (the Word of God) that underlies the [[cosmología creacionista#ajuste fino cósmico|fine-tuning of the universe]]. Moreover, the existence of biological information within every [[célula|cell]] (DNA and RNA) offers what is perhaps the most powerful argument for [[diseño inteligente|intelligent design]]. [[William Dembski]] claims that DNA exhibits specified complexity (that is, it is both complex and specified), and must therefore have been produced by an intelligent cause (that is, it was designed) rather than being the result of natural processes.<ref name=ARN>{{cita web|autor=Dembski, William A.|url=http://www.arn.org/docs/dembski/wd_idtheory.htm|título=Intelligent Design as a Theory of Information|editorial=Access Research Network|fecha=15 de noviembre 1998.|fechaacceso=28 de octubre |añoacceso=2013}}</ref>

One of the main objections to evolution is the origin of the enormous amounts of genetic information content that is needed for an organism to evolve from microbes to humans.<ref>{{cita libro|autor=Sarfati, Jonathan|enlaceautor=Jonathan Sarfati|título=[[The Greatest Hoax on Earth?]]:Refuting Dawkins on Evolution|ubicación=Atlanta, Georgia|editorial=Creation Book Publishers|año=2010|página=43|isbn=978-0-949906-73-1}}</ref> Not only has no credible source been identified by which information could be produced by natural processes; on the contrary, the adaptation of living organisms involves a reduction of the information in the genome through natural selection.<ref>{{cita libro|autor=Spetner, Lee M|enlaceautor=Lee Spetner|título=[[Not by Chance!]] |editorial=The Judaica Press|año=1998|ubicación=Brooklyn, New York|página=127-160|capítulo=5-Can Random Variation Build Information?|isbn=1-880582-24-4}}</ref>
  
 
==Definitions or characterizations==
 
Royal Truman, in his analysis published in the ''[[Journal of Creation]]'', discusses two families of approaches: the first derived from the work of Shannon and the second derived from the work of [[Werner Gitt|Gitt]].<ref name=truman1>{{cita publicación|título=Information Theory-Part 1:Overview of Key Ideas|autor=Truman, Royal|año=2012|publicación=[[Journal of Creation]]|volumen=26|número=3|página=101-106|issn=1036-2916}}</ref> Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from [[Gregory Chaitin]], but it is not discussed in his article.<ref name=truman1 /> According to [[Stephen Meyer]], scientists usually distinguish two basic types of information: meaningful or functional information and so-called Shannon information<ref name=doubt>{{cita libro|autor=Meyer, Stephen C|enlaceautor=Stephen Meyer|título=Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design|editorial=HarperOne/HarperCollins Publishers|ubicación=Seattle, WA|año=2013|página=164-168|isbn=978-0-06-207147-7}}</ref> (named after Claude Shannon, who developed statistical information theory). Shannon information is not really the same thing as meaningful information. Meaningful information, encoded in a language, can be measured statistically, in Shannon's sense, but what is measured is the redundancy of the symbols, the so-called Shannon entropy;<ref group=nota>Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content.</ref> it is not a measure of "information content" in the sense of meaning. Shannon's measure, for example, can be used to measure the "information content" of a set of random symbols that have no meaning at all.
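
This distinction can be made concrete with a short sketch (illustrative only, not drawn from Truman or Meyer): Shannon's measure depends only on symbol frequencies, so a meaningful sentence and a meaningless rearrangement of the same symbols receive exactly the same score.
<code>
import random
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Average bits per symbol, computed from symbol frequencies alone."""
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in Counter(text).values())

meaningful = "in the beginning was information"
# The same symbols in random order: meaningless, but with identical frequencies.
scrambled = "".join(random.sample(meaningful, len(meaningful)))

print(shannon_entropy(meaningful))  # about 3.6 bits per symbol
print(shannon_entropy(scrambled))   # essentially the same value; meaning plays no role
</code>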
  
Information is often not defined.<ref>{{cita web|url=http://www2.sims.berkeley.edu/courses/is101/s97/infoword.html|título="Information" and Other Words|editorial=School of Information Management & Systems|año=Spring 1997|autor=M. Buckland.|fechaacceso=29 de octubre |añoacceso=2013}}</ref> Some definitions relate the concept of information to meaning. In turn, the meaning of "meaning" has never been properly explained when applied to human thought processes.<ref name=feigenbaum>{{cita libro|editor=Feigenbaum, Edward A.; Feldman, Julian|autor=Lindsay, Robert K|capítulo=Inferential Memory as the Basis of Machines Which Understand Natural Language|título=Computers & Thought|editorial=AAAI Press/The MIT Press|ubicación=Menlo Park, Cambridge, London|año=1995|página=218|isbn=0-262-56092-5}}</ref> Some definitions, characterizations, or notions about information are found in the literature or on the web, but without consensus. Some of these are:
  
 
Robert M. Losee;
{{cquote|Information may be understood in a domain-independent way as the values within the outcome of any process.<ref name=Losee>{{cita publicación|título=A Discipline Independent Definition of Information|autor=Losee, Robert M.|publicación=Journal of the American Society for Information Science|volume=48|número=3|página=254–269|año=1997|url=http://www.idt.mdh.se/~gdc/work/ARTICLES/09-VDM-BOOK/pdf/DisciplineIndependentDefinitionOfInformation.pdf|ISSN=1532-2890}}</ref>}}
 
Winfried Nöth;
{{cquote|Information in its everyday sense is a qualitative concept associated with meaning and news.<ref name=semiotics>{{cita libro|autor=Nöth, Winfried|título=Handbook of Semiotics|año=1995|editorial=Indiana University Press|ubicación=Indiana|isbn=0-253-20959-5}}</ref>}}
 
Ray Kurzweil;
{{cquote|Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program.<ref name=Kurzweil>{{cita libro|autor=Kurzweil, Ray|título=The Age of Spiritual Machines: When Computers Exceed Human Intelligence|editorial=Penguin Books|ubicación=New York|año=2000|página=30|isbn=0-14-028202-5}}</ref>}}
 
Gregory Bateson;
{{cquote|Information is a difference which makes a difference.<ref>{{cita web|url=http://www.wired.com/wired/archive/2.03/economy.ideas_pr.html|título=The Economy of Ideas: A framework for patents and copyrights in the Digital Age. (Everything you know about intellectual property is wrong.)|autor=Barlow, John Perry|fechaacceso=29 de octubre |añoacceso=2013}}</ref>}}
Valdemar W. Setzer<ref group=nota>Dr. Valdemar W. Setzer is a well-known Brazilian computer scientist. He is a signatory to the list named "[[A Scientific Dissent From Darwinism]]". Found in: {{cite web|url=http://www.discovery.org/scripts/viewDB/filesDB-download.php?command=download&id=660|título=A Scientific Dissent from Darwinism (List)|accessdate=July 31, 2013}}</ref>
{{cquote|Information is an informal abstraction (that is, it cannot be formalized through a logical or mathematical theory) which is in the mind of some person in the form of thoughts, representing something of significance to that person. Note that this is not a definition, it is a characterization, because "mind", "thought", "something", "significance" and "person" cannot be well defined. I assume here an intuitive (naïve) understanding of these terms.<ref name=setzer>{{cite web|url=http://www.ime.usp.br/~vwsetzer/data-info.html|autor=Setzer, Valdemar W.|título=Data, Information, Knowledge and Competence|fechaacceso=31 de julio de 2013}}</ref>}}
 
Wikipedia;
{{cquote|Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message. Information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system that can interpret the information.<ref>{{cite web|url=http://en.wikipedia.org/wiki/Information|editorial=Wikipedia|título=Information|accessdate=July 31, 2013}}</ref>}}
The answer of some of those present at Stephen Talbott's lectures to a large audience of librarians;
{{cquote|That's the stuff we work with.<ref name=setzer />}}
  
 
==Views==
 
|-
|5
|Apobetics<ref group=nota>The term "apobetics" was introduced by Dr. Werner Gitt in 1981 to express the teleological aspect of information, the question of purpose. The word is derived from the {{Greek Name|αποβαίνων|apobeinon}} that means result, success, conclusion, consequence. Found in {{cita libro|autor=Gitt, Werner|título= In the Beginning was Information: A Scientist Explains the Incredible Design in Nature|year=2005|editorial=Master Books|location=Green Forest, AR|page=77|isbn=978-0-89051-461-0}}</ref>
|Intended purpose
|Achieved result
|}
  
According to Dr. Gitt, there is no known law through which matter can give rise to information.<ref name=Gitt /> In his article on scientific laws of information, published in the [[Journal of Creation]], Dr. Gitt states that information is not a property of matter; it is a non-material entity, so its origin likewise cannot be explained by material processes.<ref name=laws1>{{cita publicación|título=Scientific Laws of Information and Their Implications-Part 1|autor=Gitt, Werner|year=2009|journal=[[Journal of Creation]]|volume=23|issue=2|pages=96-102|issn=1036-2916}}</ref> Dr. Gitt also points out that the most important prerequisite for the production of information is the sender's own will, so that information can arise only through a will encompassing intention and purpose.<ref name=laws1 />
 
Gitt also points out that, since information is neither formed of matter (although it can be carried on matter) nor of energy, it constitutes a third fundamental quantity of the universe.
  
 
===Biology===
It is generally accepted that the meaning of information given by Claude Shannon in his mathematical theory of information is relevant and legitimate in many areas of biology, but in recent decades, and even before, many biologists have applied informational concepts in a broader sense. They see most of the basic processes characteristic of living organisms as being understood in terms of the expression of information, the execution of programs, and the interpretation of codes.<ref name=stanford>{{cite web|url=http://plato.stanford.edu/entries/information-biological/|título=Biological Information|date=Oct 4, 2007|editorial=Stanford Encyclopedia of Philosophy|accessdate=August 2, 2013}}</ref> John von Neumann stated that the genes themselves are clearly parts of a digital system of components.<ref>{{cita libro|autor=von Neumann, John|título=The Computer and the Brain|edition=2nd|editorial=Yale University Press|location=New Haven and London|year=2000|page=69|isbn=0-300-08473-0}}</ref> Many biologists, especially materialists, see this trend as having foundational problems.<ref name=stanford />
  
Either way, many scientists in various fields of science consider living organisms as having biological information. [[Gregory Chaitin]], a renowned Argentine-American mathematician and computer scientist, sees [[DNA]] as a computer program for calculating the organism, and the relationship between male and female as a means of transmitting biological information from the former to the latter.<ref>{{cita libro|autor=Chaitin, Gregory|enlaceautor=Gregory Chaitin|título=Meta Math!: The Quest for Omega|editorial=Vintage Books|location=New York|year=2005|page=66-74|url=http://arxiv.org/pdf/math/0404335.pdf|isbn=978-1-4000-7797-7}}</ref> David Baltimore, an American biologist and Nobel laureate, stated that "Modern Biology is a science of information".<ref name=uncensored>{{cita libro|título=[[Intelligent Design Uncensored: An Easy-to-Understand Guide to the Controversy]]|autor=[[William Dembski|Dembski, William A.]]; Witt, Jonathan|page=72-73|editorial=InterVarsity Press|location=Downers Grove, Illinois|isbn=978-0-8308-3742-7|year=2010}}</ref> Edmund Jack Ambrose, quoted by Davis and Kenyon, said that "There is a message if the order of bases in DNA can be translated by the cell into some vital activity necessary for survival or reproduction".<ref name=pandas>{{cita libro|título=Of Pandas and People: The Central Question of Biological Origins|autor=Davis, Percival; [[Dean H. Kenyon|Kenyon, Dean H]]|editorial=Haughton Publishing Company|location=Dallas, Texas|edition=2nd|page=64|isbn=0-914513-40-0}}</ref> [[Richard Dawkins]], a British ethologist and evolutionary biologist, has written that life itself is the flow of a river of DNA, which he also calls a river of information.<ref>{{cita libro|autor=Dawkins, Richard|enlaceautor=Richard Dawkins|título=River Out of Eden|year=1995|editorial=Basic Books|location=New York|page=4|isbn=978-0-465-06990-3}}</ref> [[Stephen Meyer]] points out that producing organismal form requires the generation of information in Shannon's sense. But he goes further and observes that "like meaningful sentences or lines of computer code, genes and proteins are also specified with respect to function."<ref>{{cita libro| autor=Meyer, Stephen C|enlaceautor=Stephen Meyer|editor=Dembski, William A|título=Darwin's Nemesis: Philip Johnson and the Intelligent Design Movement|editorial=Inter-Varsity Press|location=Downers Grove, IL|year=2006|page=178-179|chapter=12-The Origin of Biological Information and the Higher Taxonomic Categories|isbn=978-0-8308-2836-4}}</ref> Meyer points out that the information contained in DNA has a high degree of specificity. [[David Berlinski]], an American philosopher, educator, and author, also draws a parallel between biology and information theory. In his book ''The Deniable Darwin & Other Essays'' he states:
  
{{cquote|Whatever else a living creature may be...[it] is ''also'' a combinatorial system, its organization controlled by a strange, a hidden and obscure text, one written in a biochemical code. It is an algorithm that lies at the humming heart of life, ferrying information from one set of symbols (the nucleic acids) to another (the proteins)<ref>{{cita libro|título=[[The Deniable Darwin and Other Essays]]|autor=Berlinski, David|enlaceautor=David Berlinski|year=2010|editorial=Discovery Institute Press|location=Seattle|page=153|isbn=978-0-979014-12-3}} </ref>}}
  
 
The [[intelligent design]] concept that DNA exhibits [[specified complexity]] was developed by mathematician and philosopher [[William Dembski]]. Dembski claims that when something exhibits specified complexity (i.e., is both complex and specified, simultaneously), one can infer that it was produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes (''see [[naturalism]]'').<ref name=ARN /> He provides the following examples: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified."<ref>Dembski, William A. [http://www.leaderu.com/offices/dembski/docs/bd-specified.html Explaining Specified Complexity] Appeared as Metaviews 139 (www.meta-list.org). September 13 1999.</ref> He states that details of living things can be similarly characterized, especially the "patterns" of molecular sequences in functional biological molecules such as [[DNA]].<ref name=ARN />
  
 
==Quantifying information==
David Salomon states: "Information seems to be one of those entities that cannot be precisely defined, cannot be quantified, and cannot be dealt rigorously".<ref name=Coding>{{cita libro|autor=Salomon, David|título=Coding for Data and Computer Communications|editorial=Springer|location=New York|year=2005|page=59|isbn=0-387-21245-0}}</ref> Salomon went on to say, however, that in the field of information theory, information can be treated quantitatively.<ref name=Coding />
  
 
===Shannon entropy===
In ''A Mathematical Theory of Communication'', Shannon endows the term ''information'' not only with a technical meaning but also with a measure.<ref name=solomon>{{cita libro|autor=Salomon, David|título=Data Compression: The Complete Reference|editorial=Springer|location=New York|page=279|year=2000|edition=2nd|isbn=0-387-95045-1}}</ref> Shannon theorized the idea of a quantitative measure of information and defined a quantity called self-information.<ref name=Sayood>{{cita libro|autor=Sayood, Khalid|título=Introduction to Data Compression|editorial=Morgan Kaufmann Publishers|location=San Francisco|year=2000|edition=2nd|page=13-14|isbn=1-55860-558-4}}</ref> The self-information, denoted by ''i'', associated with an event A is given by:
  
 
:<big>i(A) = -log<sub>''b''</sub>''P''(A)</big>

where ''P''(A) is the probability that the event A will occur and ''b'' is the chosen base of the logarithm. If the unit of information is bits, we use ''b'' = 2, and so on.<ref name=Sayood />
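
A minimal numerical sketch of this formula (illustrative only, not taken from the cited source): with ''b'' = 2, an improbable event carries many bits of self-information, while a nearly certain one carries almost none.
<code>
from math import log

def self_information(p, b=2):
    """i(A) = -log_b P(A); with b = 2 the result is in bits."""
    return -log(p, b)

print(self_information(1/8))   # 3.0 bits: a rare event is very informative
print(self_information(0.99))  # about 0.014 bits: an almost certain event says little
</code>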
  
The measurement of information, in mathematical terms, has to consider the number of signals, their probability, and combinatorial restrictions.<ref name=semiotics /> The rarer a signal is, the more information it transmits; the more frequent it is, the less information it transmits.<ref name=semiotics /> It is worth noting that while we can quantify the probability of any given symbol, we can assign no absolute number to the information content of a given message.<ref>{{cita libro|autor=Nelson, Mark; Gailly, Jean-Loup|título=The Data Compression Book|edition=2nd|editorial=M&T Books|location=New York|year=1996|page=14|isbn=1-55851-434-1}}</ref>
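
The last point can be illustrated with a small sketch (the skewed letter frequencies below are hypothetical, chosen only for the example): the number of bits assigned to the very same message depends entirely on the probability model assumed for its symbols.
<code>
from math import log2

message = "HOT"

# Model 1: all 26 uppercase letters equally likely.
uniform = {ch: 1 / 26 for ch in "ABCDEFGHIJKLMNOPQRSTUVWXYZ"}

# Model 2: a hypothetical model in which H, O and T are common (20% each),
# with the remaining 40% spread evenly over the other 23 letters.
skewed = {ch: 0.4 / 23 for ch in uniform if ch not in "HOT"}
skewed.update({"H": 0.2, "O": 0.2, "T": 0.2})

def message_information(msg, model):
    """Total self-information of a message under an assumed symbol model."""
    return sum(-log2(model[ch]) for ch in msg)

print(message_information(message, uniform))  # about 14.1 bits
print(message_information(message, skewed))   # about 7.0 bits for the same message
</code>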
  
 
===Chaitin-Kolmogoroff theory===
Another way of measuring information content is the Kolmogorov complexity (also known as Chaitin information). The Kolmogorov complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (such as a Turing machine). Let ''x'' be a binary string and let d(''x'') be the shortest string <M, ''i''> obtained by concatenating a Turing machine M and an input ''i'' on which the Turing machine halts, leaving the string ''x'' on the tape.<ref name=Sipser>{{cita libro|autor=Sipser, Michael|título=Introduction to the Theory of Computation|editorial=PWS Publishing Company|location=Boston|year=1997|page=214-215|isbn=0-534-94728-X}}</ref> The Kolmogorov complexity K(''x'') is:
  
 
:<big>K(''x'') = |d(''x'')|</big>
  
That is, the Kolmogorov complexity is the length of the minimal description of ''x''.<ref name=Sipser /> The complexity can be viewed as a measure of the "patternlessness" of the sequence, and can be equated with the idea of randomness.<ref>{{cita libro|autor=Knuth, Donald|título=The Art of Computer Programming: Seminumerical Algorithms|volume=2|edition=2nd|page=163-166|editorial=Addison-Wesley Publishing Company|location=Reading, Massachusetts|year=1981|isbn=0-201-03822-6}}</ref> The length of the shortest description will depend on the choice of description language. By way of illustration we compare two strings, one repetitive and one patternless:
  
 
  "CREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKI"  
 
  "CREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKI"  
Such a program can be written in 34 ASCII characters (counting the blanks and the new line) plus 1 character for the parameter (here 7, the value of the variable ''m'' in this particular case). For other values of ''m'' the program length in characters will be 34 + log ''m''. One way to measure the randomness of the repetitive sequence is to form the ratio of program length to string length.<ref>{{cita libro|autor=Dewdney, A. K|título=The New Turing Omnibus: 66 Excursions in Computer Science|chapter=8-Random Numbers: The Chaitin-Kolmogoroff Theory|page=49-55|editorial=A. W. H. Freeman/Owl Books|location=New York|year=1993|isbn=978-0-8050-7166-5}}</ref> This results in a measure of randomness of:
  
 
:''r'' ≤ (34 + log ''m'')/(12''m'')
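
As a rough numerical check (an illustrative sketch, not taken from the cited chapter; reading log ''m'' as the number of decimal digits of ''m'' is an assumption here), the bound for ''m'' = 7 comes out to about 0.42, and a general-purpose compressor, used only as a crude stand-in for "shortest description", already separates the patterned string from a patternless one of the same length:
<code>
import random, string, zlib

m = 7
patterned = "CREATIONWIKI" * m   # the 84-character string shown above
patternless = "".join(random.choices(string.ascii_uppercase, k=len(patterned)))

# Dewdney-style bound: program length over string length, with log m read as
# the number of decimal digits needed to write the parameter m.
r_bound = (34 + len(str(m))) / (12 * m)
print(round(r_bound, 2))  # 0.42 for m = 7

# Kolmogorov complexity itself is not computable; a general-purpose compressor
# only gives an upper bound on description length, but the contrast is visible:
print(len(zlib.compress(patterned.encode())))    # much shorter than the original 84 bytes
print(len(zlib.compress(patternless.encode())))  # stays close to the original 84 bytes
</code>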
 
==Spontaneous appearance==
[[File:Ultimate Sand Castle.jpg|thumb|200px|A very impressive sand castle on the beach near St Helier in Jersey.]]
Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith, and many other biologists have stated that the origin of information is biology's central problem.<ref name=uncensored /> Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible.<ref name=wilder>{{cita libro|autor=Wilder-Smith, A. E.|enlaceautor=A. E. Wilder-Smith|título=The Scientific Alternative to Neo-Darwinian Evolutionary Theory|editorial=The Word For Today Publishers|location=Costa Mesa, California|year=1987|page=23-51|isbn=0-936728-1B}}</ref> In his book ''Steps Towards Life'', Eigen states what he regards as the central problem faced in origins of life research: "Our task is to find an algorithm, a natural law that leads to the origin of information".<ref name=dembski1>{{cita libro|autor=Dembski, William A|enlaceautor=William Dembski|título=Intelligent Design:The Bridge Between Science & Theology|editorial=IVP Academic|location=Downers Grove, Illinois|year=1999|page=153-183|isbn=0-8308-2314-X}}</ref><ref group=nota>The Eigen book quoted by Dembski is {{cita libro|autor=Eigen, Manfred|título=Steps Towards Life: A Perspective on Evolution|location=Oxford|editorial=Oxford University Press|year=1992|page=12|isbn=0-19854751-X}}</ref> [[A. E. Wilder-Smith]], in contrast, states that
  
 
{{cquote|If information, like entropy were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.<ref name=wilder />}}
Wilder-Smith establishes a distinction between actual and potential information. The former can never be synthesized by stochastic processes, while the latter might be. He draws a comparison between actual information and negentropy<ref group=nota>Negentropy, also called negative entropy or syntropy, of a living system is the entropy that it exports to keep its own entropy low.</ref> and, on the other side, a correspondence between potential information and entropy.<ref name=wilder /> Wilder-Smith proposes a simple example that clarifies the distinction between potential and actual information. The potential to make pictures out of a large number of randomly distributed dots is infinite, yet a set of randomly distributed dots will not, in reality, show an image that looks like something (e.g., a bicycle). Randomly distributed points possess the capacity for endless amounts of information, but by themselves they communicate none; there is, in fact, no actual information.<ref name=wilder />
  
Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.<ref>{{cita libro|autor=Lester, Lane P; [[Raymond Bohlin|Bohlin, Raymond G]]|título=[[The Natural Limits to Biological Change]]|location=Dallas|editorial=Probe Books|edition=2nd|year=1989|page=157|isbn=0-945241-06-2}}</ref>
  
In his book ''[[A Case Against Accident and Self-Organization]]'', [[Dean L. Overman]] builds a compelling case that life is no accident. It is not possible to reproduce the entire argument of the book here. Overman posits that a central distinction between living and non-living matter is the existence of a genome, or a composite of genetic messages, which carries enough information content to replicate and maintain the organism.<ref name=overman>{{cita libro |autor=Overman, Dean L|enlaceautor=Dean L. Overman|título=[[A Case Against Accident and Self-Organization]]|editorial=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref> The information contained in the genetic code, like any information or message, is not made of matter. The meaning of the genetic code cannot be reduced to a physical or chemical property.<ref name=overman /> Information content is the minimum number of instructions necessary to specify the structure and, in living systems, information content requires an enormous number of specified instructions.<ref name=overman /> According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins, or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.<ref name=overman /><ref group=nota>Sir Fred Hoyle and Chandra Wickramasinghe calculated the probability of the different enzymes forming in one place at one time to produce a single bacterium at 1 in 10<sup>40,000</sup>. Hubert Yockey calculated the probability of the appearance of iso-1-cytochrome c at random as 2 x 10<sup>-44</sup>. Walter L. Bradley and Charles B. Thaxton calculated the probability of a random formation of amino acids into a protein as 4.9 x 10<sup>-191</sup>. Harold Morrison obtained in his calculations the impressive figure of 1 in 10<sup>100,000,000,000</sup> for a single-celled bacterium to develop from accidental or chance processes. As quoted by Overman in: {{cita libro|last=Overman|first=Dean L|título=A Case Against Accident and Self-Organization|editorial=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref>
  
 
Evolutionist [[Michael Denton]] wrote the controversial book ''[[Evolution: A Theory in Crisis]]''. In this book, writing about the origin of life, Denton states:
  
{{cquote|The failure to give a plausible evolutionary explanation for the origin of life casts a number of shadows over the whole field of evolutionary speculation.<ref>{{cita libro|autor=Denton, Michael|enlaceautor=Michael Denton|título=[[Evolution: A Theory in Crisis]]|page=271|editorial=Adler & Adler|location=Chevy Chase, MD|year=1985|isbn=0-917561-52-X}}</ref>}}
  
Due to the enormous odds against [[abiogenesis]] on [[earth]], some scientists have turned to the [[panspermia]] hypothesis, the belief that life originated off this planet. Among these scientists are Francis Crick, [[Fred Hoyle]], Svante Arrhenius, Leslie Orgel and Thomas Gold.<ref name=overman /><ref>{{cita libro|autor=Shapiro, Robert|título=Origins: A Skeptic's Guide to the Creation of Life on Earth|editorial=Bantam Books|location=Toronto|year=1987|page=226-227|isbn=0-553-34355-6}}</ref>
  
 
==Problem for evolution==
According to [[Jonathan Sarfati]], the main scientific objection to evolution is not whether changes, whatever their extent, occur through time. The key issue is the origin of the enormous amount of genetic information that is needed in order for a microbe to evolve, ultimately reaching the complexity of humans.<ref>{{cita libro|autor=Sarfati, Jonathan|enlaceautor=Jonathan Sarfati|título=[[The Greatest Hoax on Earth?]]:Refuting Dawkins on Evolution|location=Atlanta, Georgia|editorial=Creation Book Publishers|year=2010|page=43|isbn=978-0-949906-73-1}}</ref> [[Dr. Lee Spetner]] points out that in living organisms adaptation often takes place by reducing the information in the genome, and notes that the vertebrate eye or its immune system could never have evolved by loss of information alone.<ref>{{cita libro|autor=Spetner, Lee M|enlaceautor=Lee Spetner|título=[[Not by Chance!]] |editorial=The Judaica Press|year=1998|location=Brooklyn, New York|page=127-160|chapter=5-Can Random Variation Build Information?|isbn=1-880582-24-4}}</ref>
  
 
== Notas ==
 
== Notas ==
Línea 155: Línea 156:
  
 
==Enlaces externos==
 
==Enlaces externos==
* {{cite journal|author=Lin, Shu-Kun|year=2008|title=Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship|journal=Entropy|volume=10|issue=1|pages=1-5|url=http://www.mdpi.com/1099-4300/10/1/1}}
+
* {{cita publicación|autor=Lin, Shu-Kun|year=2008|título=Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship|journal=Entropy|volume=10|issue=1|pages=1-5|url=http://www.mdpi.com/1099-4300/10/1/1}}
* {{cite journal|author=Floridi, Luciano|year=2005|title=Is Information Meaningful Data?|journal=Philosophy and Phenomenological Research|volume=70|issue=2|pages=351–370|url=http://philsci-archive.pitt.edu/archive/00002536/01/iimd.pdf}}
+
* {{cita publicación|autor=Floridi, Luciano|year=2005|título=Is Information Meaningful Data?|journal=Philosophy and Phenomenological Research|volume=70|issue=2|pages=351–370|url=http://philsci-archive.pitt.edu/archive/00002536/01/iimd.pdf}}
 
* Luciano Floridi, (2005). 'Semantic Conceptions of Information', ''The Stanford Encyclopedia of Philosophy'' (Winter 2005 Edition), Edward N. Zalta (ed.). Available online at [http://plato.stanford.edu/entries/information-semantic/ Stanford University]
 
* Luciano Floridi, (2005). 'Semantic Conceptions of Information', ''The Stanford Encyclopedia of Philosophy'' (Winter 2005 Edition), Edward N. Zalta (ed.). Available online at [http://plato.stanford.edu/entries/information-semantic/ Stanford University]
 
* [http://www.lehigh.edu/~dac511/literature/casagrande1999.pdf Information as a Verb: Re-conceptualizing Information for Cognitive and Ecological Models] by David G. Casagrande.  
 
* [http://www.lehigh.edu/~dac511/literature/casagrande1999.pdf Information as a Verb: Re-conceptualizing Information for Cognitive and Ecological Models] by David G. Casagrande.  

Current revision as of 14:48, 17 October 2019

"HOT" escrito en la arena. Alguien dejó esta información en la arena con un propósito.

Information is a term derived from the Latin verb informare, meaning "to give form to the mind", "to discipline", "to instruct", "to teach". Information is generally understood as knowledge or facts that one has acquired. However, in some areas of science, information is defined differently and often ambiguously.[1][note 1]

For creation science, it is information (the Word of God) that underlies the fine-tuning of the universe. Moreover, the existence of biological information within every cell (DNA and RNA) offers what is perhaps the most powerful argument for intelligent design. William Dembski claims that DNA exhibits specified complexity (i.e., it is both complex and specified at the same time), and therefore must have been produced by an intelligent cause (i.e., it was designed) rather than being the result of natural processes.[2] One of the main objections to evolution is the origin of the enormous amount of genetic information content needed for an organism to evolve from microbes to humans.[3] Not only has no credible source been identified by which information could be produced by natural processes; on the contrary, the adaptation of living organisms involves a reduction of the information in the genome through natural selection.[4]

Definitions or characterizations

The word "information" is used in many ways. The layman's sense was mentioned above, but the term is also used for a sequence of symbols that conveys meaning, such as the letters of a language (see the photo at right), the dots and dashes of Morse code, or the arrangement of the bumps of Braille. Another way the term is used is in communication theory and message compression. It is mainly the latter two senses that are discussed in this article.

Royal Truman, in his analysis published in the Journal of Creation, discusses two families of approaches: the first derived from the work of Shannon and the second derived from the work of Gitt.[5] Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from Gregory Chaitin, which is not discussed in his article.[5] According to Stephen Meyer, scientists usually distinguish two basic types of information: meaningful or functional information and so-called Shannon information[6] (named after Claude Shannon, who developed statistical information theory). Shannon information is not really the same thing as meaningful information. Meaningful information, encoded in a language, can be measured statistically in Shannon's sense, but what is measured is the redundancy of the symbols, the so-called Shannon entropy,[note 2] not the "information content" or meaning. Shannon's measure can, for example, be used to assign an "information content" to a set of random symbols that has no meaning at all.
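
To make that contrast concrete, the short Python sketch below computes the per-symbol Shannon entropy of an English sentence and of a random string of the same length. The helper function and sample strings are illustrative assumptions, not taken from the sources cited above; the point is only that the two figures come out comparable even though just one of the strings means anything.

import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    # Empirical Shannon entropy H = -sum p(s) * log2 p(s), in bits per symbol.
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG"
random.seed(0)
gibberish = "".join(random.choice(string.ascii_uppercase + " ") for _ in meaningful)

print(round(shannon_entropy(meaningful), 2))  # about 4 bits per symbol
print(round(shannon_entropy(gibberish), 2))   # a comparable figure, despite carrying no meaning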

Information is often left undefined.[7] Some definitions relate the concept of information to meaning; in turn, the meaning of "meaning" has never been adequately explained as applied to human thought processes.[8] Various definitions, characterizations or notions of information can be found in the literature and on the web, but without consensus. Some of these are:

Robert M. Losee:

Information can be understood in a domain-independent way as the values within the outcome of any process.[9]

Winfried Nöth:

Information in its everyday sense is a qualitative concept associated with meaning and news.[10]

Ray Kurzweil:

Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism or the bits in a computer program.[11]

Gregory Bateson:

Information is a difference which makes a difference.[12]

Valdemar W. Setzer[note 3]:

Information is an informal abstraction (that is, it cannot be formalized through a logical or mathematical theory) which is in the mind of some person in the form of thoughts, representing something of significance to that person. Note that this is not a definition, it is a characterization, because "mind", "thought", "something", "significance" and "person" cannot be well defined. I assume here an intuitive (naïve) understanding of these terms.[13]

Wikipedia:

Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message. Information can be recorded as signs or transmitted as signals. Information is any kind of event that affects the state of a dynamic system that can interpret the information.[14]

The response of some of those attending Stephen Talbott's lectures to a large audience of librarians:

That is the stuff we work with.[13]


Views

Information theory

A clear definition of the concept of "information" cannot be found in information theory textbooks.[15] Gibbs proposes, in this context, a simple definition: "information (I) is the amount of the data after data compression".[15] For Shannon, the semantic aspects of communication are irrelevant to the engineering problem; the significant aspect is that the actual message is one selected from a set of possible messages.[16] According to J. Z. Young, the concept of information in a system, following Shannon, can be defined as that feature of it which remains invariant under re-coding.[17]

Semiotics

Both information theory and semiotics study information, but because of its strictly quantitative approach, information theory has a more limited scope.[10] In semiotics the concept of information is related to signs. A sign is something that can be interpreted as having a meaning distinct from itself, and it is therefore a vehicle of information for anyone capable of decoding it. Signs can be considered in terms of interdependent levels: pragmatics, semantics, syntax, and the empirical level.

Dr. Werner Gitt proposes five conceptually different levels of information[18]:

Level 1 (Statistics). Transmitter: transmitted signal. Receiver: received signal. At the statistical level, meaning is completely ignored; information here concerns exclusively the statistical properties of sequences of symbols.[18]
Level 2 (Syntax). Transmitter: code used. Receiver: comprehended code. At the syntactical level, lexical and syntactic aspects are taken into account; information concerns all structural properties of the process of information creation, the actual sets of symbols and the rules of syntax.[18]
Level 3 (Semantics). Transmitter: communicated ideas. Receiver: comprehended meaning. At the semantic level, meaning is taken into account; information concerns the meaning of the piece of information being transmitted. Every piece of information leads back to a mental source, the mind of the sender.[18]
Level 4 (Pragmatics). Transmitter: expected action. Receiver: implemented action. At the pragmatic level, how the transmitted information is used in practice is taken into account. Information always entails a pragmatic aspect and is able to cause the recipient to take some action.[18] Pragmatics also encompasses the contribution of context to meaning.
Level 5 (Apobetics).[note 4] Transmitter: intended purpose. Receiver: achieved result. At the apobetic level, the purpose the sender has for the transmitted information is taken into account. Every bit of information is intentional.[18] At this level we consider what the transmitted information indicates and implies.

According to Dr. Gitt, there is no known law through which matter can give rise to information.[18] In his article on scientific laws of information, published in the Journal of Creation, Dr. Gitt states that information is not a property of matter: it is a non-material entity, so its origin is likewise not explicable by material processes.[19] Dr. Gitt also points out that the most important prerequisite for the production of information is the sender's own will, so that information can arise only through will, encompassing intention and purpose.[19] Gitt further notes that since information is neither formed of matter (although it can be carried on matter) nor of energy, it constitutes a third fundamental quantity of the universe.

Biology

It is generally accepted that the meaning of information given by Claude Shannon in his mathematical theory of information is relevant and legitimate in many areas of biology, but in recent decades, and even before, many biologists have applied informational concepts in a broader sense. They see the most basic processes characteristic of living organisms as being understood in terms of the expression of information, the execution of programs, and the interpretation of codes.[20] John von Neumann stated that the genes themselves are clearly parts of a digital system of components.[21] Many biologists, especially materialists, see this trend as having foundational problems.[20]

Either way, many scientists in various fields consider living organisms to possess biological information. Gregory Chaitin, a renowned Argentine-American mathematician and computer scientist, sees DNA as a computer program for computing the organism, and the relationship between male and female as a way of transmitting biological information from the former to the latter.[22] David Baltimore, an American biologist and Nobel laureate, stated that "Modern Biology is a science of information".[23] Edmund Jack Ambrose, quoted by Davis and Kenyon, said that "There is a message if the order of bases in DNA can be translated by the cell into some vital activity necessary for survival or reproduction".[24] Richard Dawkins, a British ethologist and evolutionary biologist, has written that life itself is the flow of a river of DNA, which he also calls a river of information.[25] Stephen Meyer points out that producing organismal form requires the generation of information in Shannon's sense, but he goes further to observe that "like meaningful sentences or lines of computer code, genes and proteins are also specified with respect to function."[26] Meyer points out that the information contained in DNA has a high degree of specificity. David Berlinski, an American philosopher, educator, and author, also draws a parallel between biology and information theory. In his book "The Deniable Darwin & Other Essays" he stated that:

Whatever else a living creature may be...[it] is also a combinatorial system, its organization controlled by a strange, a hidden and obscure text, one written in a biochemical code. It is an algorithm that lies at the humming heart of life, ferrying information from one set of symbols (the nucleic acids) to another (the proteins)[27]


The intelligent design concept that DNA exhibits specified complexity was developed by mathematician and philosopher William Dembski. Dembski claims that when something exhibits specified complexity (i.e., is both complex and specified simultaneously), one can infer that it was produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes (see naturalism).[2] He provides the following examples: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified."[28] He states that details of living things can be similarly characterized, especially the "patterns" of molecular sequences in functional biological molecules such as DNA.[2]

Dembski defines a probability of 1 in 10^150 as the "universal probability bound". Its value corresponds to the inverse of the upper limit of "the total number of possible specified events throughout cosmic history," as calculated by Dembski. He defines complex specified information (CSI) as specified information with a probability less than this limit. (The terms "specified complexity" and "complex specified information" are used interchangeably.) He argues that CSI cannot be generated by the only known natural mechanisms of physical law and chance, or by their combination. He argues that this is so because laws can only shift around or lose information, but do not produce it, and chance can produce complex unspecified information, or non-complex specified information, but not CSI; he provides a mathematical analysis that he asserts demonstrates that law and chance working together cannot generate CSI, either. Dembski and other proponents of ID contend that CSI is best explained as being due to an intelligent cause and is therefore a reliable indicator of design.[29]
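
As a rough arithmetic check on where the 10^150 figure comes from, Dembski's bound is usually presented as the product of about 10^80 elementary particles in the observable universe, about 10^45 possible state transitions per second (roughly the inverse of the Planck time), and about 10^25 seconds of available time. The Python sketch below simply multiplies these commonly cited estimates; the figures are illustrative and are not drawn from the references above.

# Commonly cited inputs to the universal probability bound (illustrative figures).
particles = 10**80                 # elementary particles in the observable universe
transitions_per_second = 10**45    # roughly the inverse of the Planck time
seconds = 10**25                   # generous upper bound on available time

events = particles * transitions_per_second * seconds
print(len(str(events)) - 1)        # 150, i.e. at most about 10**150 specified events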

Quantifying information

David Salomon states: "Information seems to be one of those entities that cannot be precisely defined, cannot be quantified, and cannot be dealt with rigorously".[30] Salomon went on to say, however, that in the field of information theory, information can be treated quantitatively.[30]

Shannon entropy

In A Mathematical Theory of Communication, Shannon endows the term information not only with a technical significance but also with a measure.[31] Shannon developed the idea of a quantitative measure of information and defined a quantity called self-information.[32] The self-information, denoted by i, associated with an event A is given by:

i(A) = -log_b P(A)

where P(A) is the probability that the event A will occur and b is the chosen base of the logarithm. If the unit of information is the bit, we use b = 2, and so on.[32]

The measurement of information, in mathematical terms, has to consider the number of signals, their probability, and combinatorial restrictions.[10] The amount of information transmitted by a signal increases with its rarity: the more frequent a signal is, the less information it transmits.[10] It is worth noting that while we can quantify the probability of any given symbol, we can give no absolute number for the information content of a given message.[33]
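
A minimal Python sketch of the self-information formula above; the event probabilities used here are illustrative assumptions, not values from the cited sources.

import math

def self_information(p, base=2):
    # i(A) = -log_b P(A); with base 2 the unit is the bit.
    return -math.log(p, base)

print(self_information(0.5))             # a fair coin flip: 1.0 bit
print(round(self_information(1/26), 2))  # a letter drawn uniformly from A-Z: about 4.7 bits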

Chaitin-Kolmogoroff theory

Another way of measuring information content is the Kolmogorov complexity (also known as Chaitin information). The Kolmogorov complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (such as a Turing machine). Let x be a binary string and let d(x) be the shortest string <M, i> obtained by concatenating a Turing machine M and an input i on which the Turing machine halts, leaving the string x on the tape.[34] The Kolmogorov complexity K(x) is:

K(x) = |d(x)|

That is, the Kolmogorov complexity is the length of the minimal description of x.[34] The complexity can be viewed as a measure of the "patternlessness" of the sequence, and can be equated with the idea of randomness.[35] The length of the shortest description will depend on the choice of description language. By way of illustration we compare two strings:

"CREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKI" 

and

"7W7JAHAGLKJHGBNMZCXVSFQP92725FFADSHALKJNMZAQWSXPLÇKJHGTRFOUMSVAXZXCTEÇALSKDJFHGBEOQI" 

Both strings have the same number of characters, but the former can be represented in a more compact way: 7 × "CREATIONWIKI". Another way to represent the first sequence is with a short program in a Pascal-like language:

for i:=1 to m

write('CREATIONWIKI');

This program contains 34 ASCII characters (counting the blanks and the newline) plus 1 character for the parameter (7, the value of the variable m in this particular case). For other values of m the program length in characters will be 34 + log m. One way to measure the randomness of the former sequence is to form the ratio of program length to string length.[36] This results in a randomness measure of:

r ≤ (34 + log m)/(12m)

As in the previous case (Shannon), the Kolmogorov complexity cannot measure the meaning of information. In effect, it measures the compressibility of a given sequence.
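
Kolmogorov complexity is not computable in general, but a general-purpose compressor gives a rough practical stand-in for "length of a short description". The Python sketch below relies only on the standard zlib module and uses a shuffled copy of the repetitive string in place of the random-looking example above; it illustrates the compressibility point rather than measuring K(x).

import random
import zlib

repetitive = "CREATIONWIKI" * 7          # the first string from the example above
chars = list(repetitive)
random.seed(0)
random.shuffle(chars)
scrambled = "".join(chars)               # same letters, but the repeating pattern is destroyed

for label, s in (("repetitive", repetitive), ("scrambled", scrambled)):
    compressed = zlib.compress(s.encode("ascii"), 9)
    print(label, len(s), "characters ->", len(compressed), "bytes compressed")
# The repetitive string typically compresses to a small fraction of its length,
# the scrambled one much less so; neither figure says anything about meaning.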

Spontaneous appearance

A very impressive sand castle on the beach near St Helier in Jersey.

Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith and many other biologists have stated that the origin of information is biology's central problem.[23] Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible.[37] In his book Steps Towards Life, Eigen states what he regards as the central problem faced in origins-of-life research: "Our task is to find an algorithm, a natural law that leads to the origin of information".[38][note 5] A. E. Wilder-Smith, in contrast, states that

If information, like entropy were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.[37]


Wilder-Smith draws a distinction between actual and potential information. The former can never be synthesized by stochastic processes, while the latter might be. He compares actual information with negentropy[note 6] and, conversely, potential information with entropy.[37] Wilder-Smith offers a simple example that clarifies the distinction between potential and actual information. The potential to make pictures out of a large number of randomly distributed dots is infinite, yet a given set of randomly distributed dots will not, in reality, display an image that looks like something (e.g., a bicycle). Randomly distributed points possess the capacity for endless amounts of information, but by themselves they communicate none, so there is, indeed, no actual information.[37]

Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.[39]

In his book A Case Against Accident and Self-Organization, Dean L. Overman builds a compelling case that life is no accident. It is not possible to reproduce the entire argument of the book here. Overman argues that a central distinction between living and non-living matter is the existence of a genome, a composite of genetic messages carrying enough information content to replicate and maintain the organism.[40] The information contained in the genetic code, like any information or message, is not made of matter; the meaning of the genetic code cannot be reduced to a physical or chemical property.[40] Information content is the minimum number of instructions necessary to specify the structure and, in living systems, information content requires an enormous number of specified instructions.[40] According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.[40][note 7]

Evolutionist Michael Denton wrote the controversial book "Evolution: A Theory in Crisis". In his book, writing about the origin of life, Denton states:

The failure to give a plausible evolutionary explanation for the origin of life casts a number of shadows over the whole field of evolutionary speculation.[41]


Due to the enormous odds against abiogenesis on earth, some scientists have turned to the panspermia hypothesis, the belief that life originated off this planet. Among these scientists are Francis Crick, Fred Hoyle, Svante Arrhenius, Leslie Orgel and Thomas Gold.[40][42]

Problem for evolution

According to Jonathan Sarfati, the main scientific objection to evolution is not whether changes, whatever their extent, occur through time. The key issue is the origin of the enormous amount of genetic information that is needed for a microbe to evolve, ultimately reaching the complexity of humans.[43] Dr. Lee Spetner points out that in living organisms adaptation often takes place by reducing the information in the genome, and notes that the vertebrate eye or its immune system could never have evolved by loss of information alone.[44]

Notes

  1. Dr. Imre Simon, who made this reference, was a well-known Hungarian-born Brazilian mathematician and computer scientist.
  2. Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.
  3. Dr. Valdemar W. Setzer is a well-known Brazilian computer scientist. He is a signatory of the list named "A Scientific Dissent From Darwinism". Found in: Plantilla:Cite web
  4. The term "apobetics" was introduced by Dr. Werner Gitt in 1981 to express the teleological aspect of information, the question of purpose. The word is derived from a Greek word meaning result, success, conclusion, consequence. Found in: Gitt, Werner (2005). In the Beginning was Information: A Scientist Explains the Incredible Design in Nature. Green Forest, AR: Master Books. p. 77. ISBN 978-0-89051-461-0.
  5. Eigen's book quoted by Dembski is: Eigen, Manfred (1992). Steps Towards Life: A Perspective on Evolution. Oxford: Oxford University Press. p. 12. ISBN 0-19854751-X.
  6. Negentropy (also negative entropy or syntropy) of a living system is the entropy that it exports in order to keep its own entropy low.
  7. Sir Fred Hoyle and Chandra Wickramasinghe calculated the probability of the different enzymes forming in one place at one time to produce a single bacterium as 1 in 10^40,000. Hubert Yockey calculated the probability of the appearance of iso-1-cytochrome c at random as 2 x 10^-44. Walter L. Bradley and Charles B. Thaxton calculated the probability of a random formation of amino acids into a protein as 4.9 x 10^-191. Harold Morrison obtained in his calculations the impressive number of 1 in 10^100,000,000,000 for a single-celled bacterium to develop from accidental or chance processes. As quoted by Overman in: A Case Against Accident and Self-Organization. Lanham: Rowman & Littlefield Publishers. 1997. p. 33-102. ISBN 0-8476-8966-2.

References

  1. Simon, Imre. «MAC 333 - A Revolução Digital e a Sociedade do Conhecimento - Tema 11 - O que é Informação? Como ela age?». Retrieved 31 July 2013.
  2. Dembski, William A. (15 November 1998). «Intelligent Design as a Theory of Information». Access Research Network. Retrieved 28 October 2013.
  3. Sarfati, Jonathan (2010). The Greatest Hoax on Earth?: Refuting Dawkins on Evolution. Atlanta, Georgia: Creation Book Publishers. p. 43. ISBN 978-0-949906-73-1.
  4. Spetner, Lee M (1998). «5 - Can Random Variation Build Information?». Not by Chance!. Brooklyn, New York: The Judaica Press. p. 127-160. ISBN 1-880582-24-4.
  5. Truman, Royal (2012). «Information Theory - Part 1: Overview of Key Ideas». Journal of Creation 26 (3): p. 101-106. ISSN 1036-2916.
  6. Meyer, Stephen C (2013). Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design. Seattle, WA: HarperOne/HarperCollins Publishers. p. 164-168. ISBN 978-0-06-207147-7.
  7. Buckland, M. (Spring 1997). «"Information" and Other Words». School of Information Management & Systems. Retrieved 29 October 2013.
  8. Lindsay, Robert K (1995). «Inferential Memory as the Basis of Machines Which Understand Natural Language». In Feigenbaum, Edward A.; Feldman, Julian. Computers & Thought. Menlo Park, Cambridge, London: AAAI Press/The MIT Press. p. 218. ISBN 0-262-56092-5.
  9. Losee, Robert M. (1997). «A Discipline Independent Definition of Information». Journal of the American Society for Information Science 48 (3): p. 254-269. ISSN 1532-2890. http://www.idt.mdh.se/~gdc/work/ARTICLES/09-VDM-BOOK/pdf/DisciplineIndependentDefinitionOfInformation.pdf.
  10. Nöth, Winfried (1995). Handbook of Semiotics. Indiana: Indiana University Press. ISBN 0-253-20959-5.
  11. Kurzweil, Ray (2000). The Age of Spiritual Machines: When Computers Exceed Human Intelligence. New York: Penguin Books. p. 30. ISBN 0-14-028202-5.
  12. Barlow, John Perry. «The Economy of Ideas: A framework for patents and copyrights in the Digital Age. (Everything you know about intellectual property is wrong.)». Retrieved 29 October 2013.
  13. Plantilla:Cite web
  14. Plantilla:Cite web
  15. Lin, Shu-Kun (2008). «Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship». Entropy 10 (1): p. 1-5. http://www.mdpi.com/1099-4300/10/1/1.
  16. Shannon, Claude E.; Weaver, Warren (1949). The Mathematical Theory of Communication. Illinois: Illini Books. p. 1. Library of Congress Catalog Card No. 49-11922. http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf.
  17. Young, J. Z (1963). «Memory, Heredity and Information». In Huxley, Julian; Hardy, A. C.; Ford, E. B. Evolution as a Process (2nd edition). New York, N. Y.: Collier Books. p. 326-346.
  18. Gitt, Werner (2005). In the Beginning was Information: A Scientist Explains the Incredible Design in Nature. Green Forest, AR: Master Books. p. 49-87. ISBN 978-0-89051-461-0.
  19. Gitt, Werner (2009). «Scientific Laws of Information and Their Implications - Part 1». Journal of Creation 23 (2): p. 96-102. ISSN 1036-2916.
  20. Plantilla:Cite web
  21. von Neumann, John (2000). The Computer and the Brain (2nd edition). New Haven and London: Yale University Press. p. 69. ISBN 0-300-08473-0.
  22. Chaitin, Gregory (2005). Meta Math!: The Quest for Omega. New York: Vintage Books. p. 66-74. ISBN 978-1-4000-7797-7. http://arxiv.org/pdf/math/0404335.pdf.
  23. Dembski, William A.; Witt, Jonathan (2010). Intelligent Design Uncensored: An Easy-to-Understand Guide to the Controversy. Downers Grove, Illinois: InterVarsity Press. p. 72-73. ISBN 978-0-8308-3742-7.
  24. Davis, Percival; Kenyon, Dean H. Of Pandas and People: The Central Question of Biological Origins (2nd edition). Dallas, Texas: Haughton Publishing Company. p. 64. ISBN 0-914513-40-0.
  25. Dawkins, Richard (1995). River Out of Eden. New York: Basic Books. p. 4. ISBN 978-0-465-06990-3.
  26. Meyer, Stephen C (2006). «12 - The Origin of Biological Information and the Higher Taxonomic Categories». In Dembski, William A. Darwin's Nemesis: Philip Johnson and the Intelligent Design Movement. Downers Grove, IL: Inter-Varsity Press. p. 178-179. ISBN 978-0-8308-2836-4.
  27. Berlinski, David (2010). The Deniable Darwin and Other Essays. Seattle: Discovery Institute Press. p. 153. ISBN 978-0-979014-12-3.
  28. Dembski, William A. Explaining Specified Complexity. Appeared as Metaviews 139 (www.meta-list.org), September 13, 1999.
  29. Dembski, William A. No Free Lunch: Why Specified Complexity Cannot Be Purchased without Intelligence. 2001, Rowman & Littlefield Publishers Inc. ISBN 0742512975.
  30. Salomon, David (2005). Coding for Data and Computer Communications. New York: Springer. p. 59. ISBN 0-387-21245-0.
  31. Salomon, David (2000). Data Compression: The Complete Reference (2nd edition). New York: Springer. p. 279. ISBN 0-387-95045-1.
  32. Sayood, Khalid (2000). Introduction to Data Compression (2nd edition). San Francisco: Morgan Kaufmann Publishers. p. 13-14. ISBN 1-55860-558-4.
  33. Nelson, Mark; Gailly, Jean-Loup (1996). The Data Compression Book (2nd edition). New York: M&T Books. p. 14. ISBN 1-55851-434-1.
  34. Sipser, Michael (1997). Introduction to the Theory of Computation. Boston: PWS Publishing Company. p. 214-215. ISBN 0-534-94728-X.
  35. Knuth, Donald (1981). The Art of Computer Programming: Seminumerical Algorithms. 2 (2nd edition). Reading, Massachusetts: Addison-Wesley Publishing Company. p. 163-166. ISBN 0-201-03822-6.
  36. Dewdney, A. K (1993). «8 - Random Numbers: The Chaitin-Kolmogoroff Theory». The New Turing Omnibus: 66 Excursions in Computer Science. New York: A. W. H. Freeman/Owl Books. p. 49-55. ISBN 978-0-8050-7166-5.
  37. Wilder-Smith, A. E. (1987). The Scientific Alternative to Neo-Darwinian Evolutionary Theory. Costa Mesa, California: The Word For Today Publishers. p. 23-51. ISBN 0-936728-1B.
  38. Dembski, William A (1999). Intelligent Design: The Bridge Between Science & Theology. Downers Grove, Illinois: IVP Academic. p. 153-183. ISBN 0-8308-2314-X.
  39. Lester, Lane P; Bohlin, Raymond G (1989). The Natural Limits to Biological Change (2nd edition). Dallas: Probe Books. p. 157. ISBN 0-945241-06-2.
  40. Overman, Dean L (1997). A Case Against Accident and Self-Organization. Lanham: Rowman & Littlefield Publishers. p. 33-102. ISBN 0-8476-8966-2.
  41. Denton, Michael (1985). Evolution: A Theory in Crisis. Chevy Chase, MD: Adler & Adler. p. 271. ISBN 0-917561-52-X.
  42. Shapiro, Robert (1987). Origins: A Skeptic's Guide to the Creation of Life on Earth. Toronto: Bantam Books. p. 226-227. ISBN 0-553-34355-6.
  43. Sarfati, Jonathan (2010). The Greatest Hoax on Earth?: Refuting Dawkins on Evolution. Atlanta, Georgia: Creation Book Publishers. p. 43. ISBN 978-0-949906-73-1.
  44. Spetner, Lee M (1998). «5 - Can Random Variation Build Information?». Not by Chance!. Brooklyn, New York: The Judaica Press. p. 127-160. ISBN 1-880582-24-4.

External links