
Information

For [[creation science]], it is information (God's word) that underlies the [[creation cosmology#Cosmic fine tuning|fine-tuning of the universe]]. Furthermore, the existence of biological information (DNA and RNA) provides what is perhaps the most powerful argument for [[intelligent design]]. [[William Dembski]] asserts that DNA possesses specified complexity (i.e., is both complex and specified, simultaneously) and therefore must have been produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes.<ref name=ARN>Dembski, William A. [http://www.arn.org/docs/dembski/wd_idtheory.htm Intelligent Design as a Theory of Information] ''Access Research Network'', November 15, 1998.</ref>
One of the main objections to evolution concerns the origin of the enormous amount of genetic information that would be needed for an organism to evolve from microbes to humans.<ref>{{cite book|author=Sarfati, Jonathan|authorlink=Jonathan Sarfati|title=[[The Greatest Hoax on Earth?]]: Refuting Dawkins on Evolution|location=Atlanta, Georgia|publisher=Creation Book Publishers|year=2010|page=43|isbn=978-0-949906-73-1}}</ref> Not only has no credible natural source of such information been identified, but the adaptation of living organisms typically involves a reduction of the information in the genome through natural selection.<ref>{{cite book|author=Spetner, Lee M|authorlink=Lee Spetner|title=[[Not by Chance!]]|publisher=The Judaica Press|year=1998|location=Brooklyn, New York|page=127-160|chapter=5-Can Random Variation Build Information?|isbn=1-880582-24-4}}</ref>
==Definitions or characterizations==
The word "information" is used in many ways. The layman's sense has been mentioned above, but the term is also used for a sequence of symbols, such as the letters of a language (see the picture at right), the dots and dashes of Morse code, or the arrangement of Braille dots, that conveys meaning. Another way the term is used is in communication theory and the compression of messages. It is mainly the latter two senses that are discussed in this article.
Royal Truman, in his analysis published in the ''[[Journal of Creation]]'', discusses two families of approaches: the first derived from the work of Shannon and the second derived from the work of [[Werner Gitt|Gitt]].<ref name=truman1>{{cite journal|title=Information Theory-Part 1: Overview of Key Ideas|author=Truman, Royal|year=2012|journal=[[Journal of Creation]]|volume=26|issue=3|page=101-106|issn=1036-2916}}</ref> Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from [[Gregory Chaitin]], but it is not discussed in his article.<ref name=truman1 /> According to [[Stephen Meyer]], scientists usually distinguish two basic types of information: meaningful or functional information and so-called Shannon information<ref name=doubt>{{cite book|author=Meyer, Stephen C|authorlink=Stephen Meyer|title=Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design|publisher=HarperOne/HarperCollins Publishers|location=Seattle, WA|year=2013|page=164-168|isbn=978-0-06-207147-7}}</ref> (named after Claude Shannon, who developed statistical information theory). Shannon information is not really the same as meaningful information. Meaningful information, encoded in a language, can be measured statistically, in Shannon's sense, but what is measured is the redundancy of the symbols, the so-called Shannon entropy;<ref group=nota>Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content.</ref> it is not a measure of "information content" or meaning. Shannon's measure, for example, can be used to measure the "information content" of a set of random symbols that have no meaning at all.
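As a simple illustration of this last point (a minimal sketch, not drawn from Meyer or Truman), the following Python snippet computes the Shannon entropy per symbol of a short string. A string of random letters typically scores as high as, or higher than, a meaningful English sentence, even though it means nothing.

<syntaxhighlight lang="python">
# Minimal sketch (illustrative only): Shannon entropy per symbol of a string.
# H = -sum(p_i * log2(p_i)) over the relative frequencies p_i of each symbol.
import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    """Average unpredictability (bits per symbol) of the symbols in `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "in the beginning was information"
meaningless = "".join(random.choice(string.ascii_lowercase + " ")
                      for _ in range(len(meaningful)))

# Both strings get a Shannon measure, even though only the first one means anything.
print(round(shannon_entropy(meaningful), 2))   # roughly 3.6 bits/symbol
print(round(shannon_entropy(meaningless), 2))  # typically similar or higher
</syntaxhighlight>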
Information is often not defined.<ref>{{cite web|url=http://www2.sims.berkeley.edu/courses/is101/s97/infoword.html|title="Information" and Other Words|publisher=School of Information Management & Systems|year=Spring 1997|author=M. Buckland.|accessdate=July 31, 2013}}</ref> Some definitions relate the concept of information to meaning; in turn, the meaning of "meaning" has never been properly explained when applied to human thought processes.<ref name=feigenbaum>{{cite book|editor=Feigenbaum, Edward A.; Feldman, Julian|author=Lindsay, Robert K|chapter=Inferential Memory as the Basis of Machines Which Understand Natural Language|title=Computers & Thought|publisher=AAAI Press/The MIT Press|location=Menlo Park, Cambridge, London|year=1995|page=218|isbn=0-262-56092-5}}</ref> Various definitions, characterizations, and notions of information can be found in the literature and on the web, but without any consensus among them.
Both information theory and semiotics study information, but because of its strictly quantitative approach, information theory has a more limited scope.<ref name=semiotics /> In semiotics, the concept of information is related to signs. A sign is something that can be interpreted as having a meaning distinct from itself, and is therefore a vehicle of information to anyone capable of decoding it. Signs can be considered in terms of interdependent levels: the pragmatic, semantic, syntactic, and empirical levels.
[[Dr. Werner Gitt]] conceptually proposes five different levels of information<ref name=Gitt>{{cite book|author=Gitt, Werner|authorlink=Werner Gitt|title=In the Beginning was Information: A Scientist Explains the Incredible Design in Nature|year=2005|publisher=Master Books|location=Green Forest, AR|page=49-87|isbn=978-0-89051-461-0}}</ref>:
{| class="wikitable"
! Level !! Aspect of information
|-
| 1. Statistics || The signal or symbol sequence itself (the level addressed by Shannon's theory)
|-
| 2. Syntax || The code and the rules by which symbols are combined
|-
| 3. Semantics || The meaning conveyed by the message
|-
| 4. Pragmatics || The action the message is intended to produce
|-
| 5. Apobetics || The purpose or goal intended by the sender
|}
It is generally accepted that the meaning of information given by Claude Shannon in his mathematical theory of information is relevant and legitimate in many areas of biology, but in recent decades, and even before, many biologists have applied informational concepts in a broader sense. They see most of the basic processes characteristic of living organisms as being understood in terms of the expression of information, the execution of programs, and the interpretation of codes.<ref name=stanford>{{cite web|url=http://plato.stanford.edu/entries/information-biological/|title=Biological Information|date=Oct 4, 2007|publisher=Stanford Encyclopedia of Philosophy|accessdate=August 2, 2013}}</ref> John von Neumann stated that the genes themselves are clearly parts of a digital system of components.<ref>{{cite book|author=von Neumann, John|title=The Computer and the Brain|edition=2nd|publisher=Yale University Press|location=New Haven and London|year=2000|page=69|isbn=0-300-08473-0}}</ref> Many biologists, especially materialists, see this trend as having foundational problems.<ref name=stanford />
Either way, many scientists in various fields of science consider living organisms to possess biological information. [[Gregory Chaitin]], a renowned Argentine-American mathematician and computer scientist, sees [[DNA]] as a computer program for calculating the organism, and the relationship between male and female as a means of transmitting biological information from the former to the latter.<ref>{{cite book|author=Chaitin, Gregory|authorlink=Gregory Chaitin|title=Meta Math!: The Quest for Omega|publisher=Vintage Books|location=New York|year=2005|page=66-74|url=http://arxiv.org/pdf/math/0404335.pdf|isbn=978-1-4000-7797-7}}</ref> David Baltimore, an American biologist and Nobel laureate, stated that "Modern Biology is a science of information".<ref name=uncensored>{{cite book|title=[[Intelligent Design Uncensored: An Easy-to-Understand Guide to the Controversy]]|author=[[William Dembski|Dembski, William A.]]; Witt, Jonathan|page=72-73|publisher=InterVarsity Press|location=Downers Grove, Illinois|isbn=978-0-8308-3742-7|year=2010}}</ref> Edmund Jack Ambrose, quoted by Davis and Kenyon, said that "There is a message if the order of bases in DNA can be translated by the cell into some vital activity necessary for survival or reproduction".<ref name=pandas>{{cite book|title=Of Pandas and People: The Central Question of Biological Origins|author=Davis, Percival; [[Dean H. Kenyon|Kenyon, Dean H]]|publisher=Haughton Publishing Company|location=Dallas, Texas|edition=2nd|page=64|isbn=0-914513-40-0}}</ref> [[Richard Dawkins]], a British ethologist and evolutionary biologist, has written that life itself is the flow of a river of DNA, which he also calls a river of information.<ref>{{cite book|author=Dawkins, Richard|authorlink=Richard Dawkins|title=River Out of Eden|year=1995|publisher=Basic Books|location=New York|page=4|isbn=978-0-465-06990-3}}</ref> [[Stephen Meyer]] points out that producing organismal form requires the generation of information in Shannon's sense, but he goes further to observe that "like meaningful sentences or lines of computer code, genes and proteins are also specified with respect to function."<ref>{{cite book| author=Meyer, Stephen C|authorlink=Stephen Meyer|editor=Dembski, William A|title=Darwin's Nemesis: Philip Johnson and the Intelligent Design Movement|publisher=Inter-Varsity Press|location=Downers Grove, IL|year=2006|page=178-179|chapter=12-The Origin of Biological Information and the Higher Taxonomic Categories|isbn=978-0-8308-2836-4}}</ref> Meyer points out that the information contained in DNA has a high degree of specificity. [[David Berlinski]], an American philosopher, educator, and author, also draws a parallel between biology and information theory. In his book ''The Deniable Darwin & Other Essays'' he stated:
{{cquote|Whatever else a living creature may be...[it] is ''also'' a combinatorial system, its organization controlled by a strange, a hidden and obscure text, one written in a biochemical code. It is an algorithm that lies at the humming heart of life, ferrying information from one set of symbols (the nucleic acids) to another (the proteins).<ref>{{cite book|title=[[The Deniable Darwin and Other Essays]]|author=Berlinski, David|authorlink=David Berlinski|year=2010|publisher=Discovery Institute Press|location=Seattle|page=153|isbn=978-0-979014-12-3}}</ref>}}
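The algorithmic notion of information mentioned earlier (Solomonoff, Kolmogorov, Chaitin) measures a sequence by the length of the shortest program that produces it. A rough, hedged illustration of the idea, not drawn from any of the works cited above, is to use a general-purpose compressor as a crude stand-in for that shortest description: a highly repetitive symbol sequence compresses to far fewer bytes than a random sequence of the same length.

<syntaxhighlight lang="python">
# Rough illustration of algorithmic (Kolmogorov-style) information, using zlib
# compression length as a crude, computable stand-in for "shortest description".
import random
import zlib

repetitive = b"ACGT" * 250                                        # 1000 bases, simple pattern
random_seq = bytes(random.choice(b"ACGT") for _ in range(1000))   # 1000 random bases

# The repetitive sequence has a short description (a tiny program, or a good
# compression); the random one has no simple pattern to exploit.
print(len(zlib.compress(repetitive)))   # small: a few dozen bytes
print(len(zlib.compress(random_seq)))   # much larger, close to the raw length
</syntaxhighlight>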
The [[intelligent design]] concept that DNA exhibited [[specified complexity]] was developed by mathematician and philosopher [[William Dembski]]. Dembski claims that when something exhibits specified complexity (i.e., is both complex and specified, simultaneously), one can infer that it was produced by an intelligent cause (i.e., that it was designed), rather than being the result of natural processes (''see [[naturalism]]'').<ref name=ARN/> He provides the following examples: "A single letter of the alphabet is specified without being complex. A long sentence of random letters is complex without being specified. A Shakespearean sonnet is both complex and specified."<ref>Dembski, William A. [http://www.leaderu.com/offices/dembski/docs/bd-specified.html Explaining Specified Complexity] Appeared as Metaviews 139 (www.meta-list.org). September 13, 1999.</ref> He states that details of living things can be similarly characterized, especially the "patterns" of molecular sequences in functional biological molecules such as [[DNA]].<ref name=ARN/>
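One simplified way to put numbers on the "complexity" component of Dembski's examples (not his actual formalism, and purely illustrative) is to take complexity as improbability under a uniform model over the symbol alphabet, i.e. -log<sub>2</sub>&nbsp;P. The sketch below does only that; specification, the match to an independently given pattern such as meaningful English, is a separate judgment that the bit count does not capture.

<syntaxhighlight lang="python">
# Illustrative only: "complexity" taken as improbability under a uniform model
# over 27 symbols (a-z plus space), i.e. -log2(P) = n * log2(27) bits.
# Specification is a separate, non-statistical judgment.
import math
import random
import string

def complexity_bits(text, alphabet_size=27):
    return len(text) * math.log2(alphabet_size)

single_letter = "a"                                     # specified but not complex
random_line = "".join(random.choice(string.ascii_lowercase + " ")
                      for _ in range(60))               # complex but not specified
sonnet_line = "shall i compare thee to a summers day"   # complex and specified (Dembski's example)

for s in (single_letter, random_line, sonnet_line):
    print(round(complexity_bits(s), 1), "bits")
</syntaxhighlight>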
==Spontaneous appearance==
[[File:Ultimate Sand Castle.jpg|thumb|200px|A very impressive sand castle on the beach near St Helier in Jersey.]]
Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith and many other biologists have stated that the origin of information is biology's central problem.<ref name=uncensored /> Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible.<ref name=wilder>{{cite book|author=Wilder-Smith, A. E.|authorlink=A. E. Wilder-Smith|title=The Scientific Alternative to Neo-Darwinian Evolutionary Theory|publisher=The Word For Today Publishers|location=Costa Mesa, California|year=1987|page=23-51|isbn=0-936728-1B}}</ref> In his book ''Steps Towards Life'', Eigen states what he regards as the central problem faced in origins-of-life research: "Our task is to find an algorithm, a natural law that leads to the origin of information".<ref name=dembski1>{{cite book|author=Dembski, William A|authorlink=William Dembski|title=Intelligent Design: The Bridge Between Science & Theology|publisher=IVP Academic|location=Downers Grove, Illinois|year=1999|page=153-183|isbn=0-8308-2314-X}}</ref><ref group=nota>The Eigen book quoted by Dembski is {{cite book|author=Eigen, Manfred|title=Steps Towards Life: A Perspective on Evolution|location=Oxford|publisher=Oxford University Press|year=1992|page=12|isbn=0-19854751-X}}</ref> [[A. E. Wilder-Smith]], in contrast, states that
{{cquote|If information, like entropy were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.<ref name=wilder />}}
Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.<ref>{{cite book|author=Lester, Lane P; [[Raymond Bohlin|Bohlin, Raymond G]]|title=[[The Natural Limits to Biological Change]]|location=Dallas|publisher=Probe Books|edition=2nd|year=1989|page=157|isbn=0-945241-06-2}}</ref>
In his book ''[[A Case Against Accident and Self-Organization]]'', [[Dean L. Overman]] builds a compelling case that life is no accident. The book's entire argument cannot be reproduced here, but Overman posits that a central distinction between living and non-living matter is the existence of a genome, a composite of genetic messages that carries enough information content to replicate and maintain the organism.<ref name=overman>{{cite book|author=Overman, Dean L|authorlink=Dean L. Overman|title=[[A Case Against Accident and Self-Organization]]|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref> The information contained in the genetic code, like any information or message, is not made of matter, and the meaning of the genetic code cannot be reduced to a physical or chemical property.<ref name=overman /> Information content is the minimum number of instructions necessary to specify the structure, and in living systems the information content requires an enormous number of specified instructions.<ref name=overman /> According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins, or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.<ref name=overman /><ref group=nota>Sir Fred Hoyle and Chandra Wickramasinghe calculated the probability of all the different enzymes forming in one place at one time to produce a single bacterium as 1 in 10<sup>40,000</sup>. Hubert Yockey calculated the probability of the appearance of iso-1-cytochrome c at random as 2 x 10<sup>-44</sup>. Walter L. Bradley and Charles B. Thaxton calculated the probability of a random formation of amino acids into a protein as 4.9 x 10<sup>-191</sup>. Harold Morrison obtained in his calculations the impressive figure of 1 in 10<sup>100,000,000,000</sup> for a single-celled bacterium to develop from accidental or chance processes. As quoted by Overman in: {{cite book|last=Overman|first=Dean L|title=A Case Against Accident and Self-Organization|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref>
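The figures quoted in the note above come from models whose details vary by author, but the general form of a chance-assembly estimate can be shown with a deliberately naive calculation: if each of the 20 amino acids were equally likely at every position, the probability of obtaining one specific 100-residue protein sequence in a single random trial would be 20<sup>-100</sup>, i.e. about 10<sup>-130</sup>. The sketch below performs only this toy arithmetic; it is not the calculation used by Hoyle, Yockey, Bradley and Thaxton, or Morrison.

<syntaxhighlight lang="python">
# Deliberately naive chance-assembly arithmetic (illustrative only):
# probability of one specific protein sequence under uniform random assembly.
from math import log10

alphabet = 20    # number of amino acid types
length = 100     # residues in the hypothetical target protein

# P = (1/20)^100; work in log10 to avoid numerical underflow.
log10_p = -length * log10(alphabet)
print(f"P = 10^{log10_p:.1f}")   # P = 10^-130.1
</syntaxhighlight>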
Evolutionist [[Michael Denton]] wrote the controversial book ''[[Evolution: A Theory in Crisis]]''. In his book, writing about the origin of life, Denton states:
{{cquote|The failure to give a plausible evolutionary explanation for the origin of life casts a number of shadows over the whole field of evolutionary speculation.<ref>{{cite book|author=Denton, Michael|authorlinkenlaceautor=Michael Denton|title=[[Evolution: A Theory in Crisis]]|page=271|publisher=Adler & Adler|location=Chevy Chase, MD|year=1985|isbn=0-917561-52-X}}</ref>}}
Due to the enormous odds against [[abiogenesis]] on [[earth]], some scientists have turned to the [[panspermia]] hypothesis, the belief that life originated off this planet. Among these scientists are Francis Crick, [[Fred Hoyle]], Svante Arrhenius, Leslie Orgel and Thomas Gold.<ref name=overman /><ref>{{cite book|author=Shapiro, Robert|title=Origins: A Skeptic's Guide to the Creation of Life on Earth|publisher=Bantam Books|location=Toronto|year=1987|page=226-227|isbn=0-553-34355-6}}</ref>
==Problem for evolution==
According to [[Jonathan Sarfati]], the main scientific objection to evolution is not whether changes, whatever their extent, occur through time. The key issue is the origin of the enormous amount of genetic information that is needed in order for a microbe to evolve, ultimately reaching the complexity of humans.<ref>{{cite book|author=Sarfati, Jonathan|authorlink=Jonathan Sarfati|title=[[The Greatest Hoax on Earth?]]: Refuting Dawkins on Evolution|location=Atlanta, Georgia|publisher=Creation Book Publishers|year=2010|page=43|isbn=978-0-949906-73-1}}</ref> [[Dr. Lee Spetner]] points out that in living organisms adaptation often takes place by reducing the information in the genome, and notes that the vertebrate eye or the immune system could never have evolved by loss of information alone.<ref>{{cite book|author=Spetner, Lee M|authorlink=Lee Spetner|title=[[Not by Chance!]]|publisher=The Judaica Press|year=1998|location=Brooklyn, New York|page=127-160|chapter=5-Can Random Variation Build Information?|isbn=1-880582-24-4}}</ref>
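For a rough sense of the scale involved (a back-of-the-envelope estimate, not a figure taken from Sarfati or Spetner): each DNA base pair can store at most two bits, so a genome of roughly three billion base pairs has a raw storage capacity on the order of 750 megabytes; how much of that capacity corresponds to functional information is a separate question. A minimal sketch of the arithmetic:

<syntaxhighlight lang="python">
# Back-of-the-envelope raw information capacity of a ~3-billion-base-pair genome.
base_pairs = 3_000_000_000   # approximate size of the human genome
bits_per_bp = 2              # 4 possible bases -> log2(4) = 2 bits each

total_bits = base_pairs * bits_per_bp
total_megabytes = total_bits / 8 / 1_000_000
print(total_megabytes)       # 750.0 (raw capacity, not functional information)
</syntaxhighlight>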
==Notes==