[[File:HOT word in sand.jpg|thumb|200px|"HOT" written in the sand. Someone left this information in the sand for some purpose.]]
'''Information''' is a term probably derived from the stem {{Latin Name|information-}}, which in turn is derived from the [[Latin]] verb {{Latin Name2|informare}}, meaning "to give form to the mind", "to discipline", "to instruct", "to teach".
Information is generally understood as knowledge or facts that one has acquired. In some areas of science, however, the term carries somewhat different definitions, and there is still no precise definition of '''information''', of what is or is not information.<ref>{{cite web|url=http://www.ime.usp.br/~is/ddt/mac333/aulas/tema-11-24mai99.html|title=MAC 333 - A Revolução Digital e a Sociedade do Conhecimento - Tema 11 - O que é Informação? Como ela age?|author=Simon, Imre|trans_title=MAC 333 - The Digital Revolution and the Knowledge Society - Track 11 - What is Information? How it works?|accessdate=July 31, 2013}}</ref><ref group=note>[[Wikipedia:Imre Simon|Dr. Imre Simon]], who made this reference, was a well-known Hungarian-born Brazilian mathematician and computer scientist.</ref> What we have instead is a vague and intuitive notion.

The word "information" is used in many ways. We mentioned the lay person's sense above, but it is also used of a sequence of symbols (such as letters of a language (see picture at right), dots and dashes of Morse code, or the arrangement of the bumps of Braille) that convey meaning.
Another way the term is used is in communications theory, and compression of messages.
It is mainly the last two of those senses that are discussed in this article.

Royal Truman, in his analysis published in the ''[[Journal of Creation]]'', discusses two families of approaches: one derived from Shannon's work and the other derived from [[Werner Gitt|Gitt]]'s work.<ref name=truman1>{{cite journal|title=Information Theory-Part 1:Overview of Key Ideas|author=Truman, Royal|year=2012|journal=[[Journal of Creation]]|volume=26|issue=3|pages=101-106|issn=1036-2916}}</ref> Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from [[Gregory Chaitin]], but does not discuss it in his article.<ref name=truman1 /> According to [[Stephen Meyer]], scientists usually distinguish two basic types of information: meaningful, or functional, information and so-called Shannon information<ref name=doubt>{{cite book|author=Meyer, Stephen C|authorlink=Stephen Meyer|title=Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design|publisher=HarperOne/HarperCollins Publishers|location=Seattle, WA|year=2013|page=164-168|isbn=978-0-06-207147-7}}</ref> (named after Claude Shannon, who developed statistical information theory). Shannon information is not the same thing as meaningful information. Meaningful information, encoded into a language, can be measured statistically, per Shannon, but the measure is of the redundancy of the symbols, the so-called Shannon entropy,<ref group=note>Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.</ref> not a measure of the "information content", or meaning. Shannon's measure, for example, can be used to quantify the "information content" of a set of random symbols that have no meaning.
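
To illustrate this distinction, the sketch below (a minimal illustration of our own; the strings and the function name are invented for the example) computes the Shannon entropy of a sequence of symbols. Shuffling the letters of a meaningful sentence destroys its meaning but leaves its Shannon entropy unchanged, since only the statistics of the symbols enter the measure:

<syntaxhighlight lang="python">
import random
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Average information per symbol, in bits (Shannon entropy)."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((n / total) * log2(n / total) for n in counts.values())

meaningful = "in the beginning was information"
# The same symbols in random order: meaningless, yet statistically identical.
meaningless = "".join(random.sample(meaningful, len(meaningful)))

print(shannon_entropy(meaningful))   # ~3.59 bits per symbol
print(shannon_entropy(meaningless))  # exactly the same value
</syntaxhighlight>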

==Definitions or characterizations==
Information is often not defined.<ref>{{cite web|url=http://www2.sims.berkeley.edu/courses/is101/s97/infoword.html|title="Information" and Other Words|publisher=School of Information Management & Systems|date=Spring 1997|author=Buckland, M.|accessdate=July 31, 2013}}</ref> Some definitions relate the concept of information to meaning; in turn, the meaning of "meaning" has never been properly explained when applied to human thought processes.<ref name=feigenbaum>{{cite book|editor=Feigenbaum, Edward A.; Feldman, Julian|author=Lindsay, Robert K|chapter=Inferential Memory as the Basis of Machines Which Understand Natural Language|title=Computers & Thought|publisher=AAAI Press/The MIT Press|location=Menlo Park, Cambridge, London|year=1995|page=218|isbn=0-262-56092-5}}</ref> Various definitions, characterizations, and notions of information are found in the literature and on the web, but without consensus. Among them:

Robert M. Losee:
{{cquote|Information may be understood in a domain-independent way as the values within the outcome of any process.<ref name=Losee>{{cite journal|title=A Discipline Independent Definition of Information|author=Losee, Robert M.|journal=Journal of the American Society for Information Science|volume=48|issue=3|pages=254–269|year=1997|url=http://www.idt.mdh.se/~gdc/work/ARTICLES/09-VDM-BOOK/pdf/DisciplineIndependentDefinitionOfInformation.pdf|ISSN=1532-2890}}</ref>}}
Winfried Nöth:
{{cquote|Information in its everyday sense is a qualitative concept associated with meaning and news.<ref name=semiotics>{{cite book|author=Nöth, Winfried|title=Handbook of Semiotics|year=1995|publisher=Indiana University Press|location=Indiana|isbn=0-253-20959-5}}</ref>}}
Ray Kurzweil:
{{cquote|Information is a sequence of data that is meaningful in a process, such as the DNA code of an organism, or the bits in a computer program.<ref name=Kurzweil>{{cite book|author=Kurzweil, Ray|title=The Age of Spiritual Machines: When Computers Exceed Human Intelligence|publisher=Penguin Books|location=New York|year=2000|page=30|isbn=0-14-028202-5}}</ref>}}
Gregory Bateson:
{{cquote|Information is a difference which makes a difference.<ref>{{cite web|url=http://www.wired.com/wired/archive/2.03/economy.ideas_pr.html|title=The Economy of Ideas: A framework for patents and copyrights in the Digital Age. (Everything you know about intellectual property is wrong.)|author=Barlow, John Perry|accessdate=July 31, 2013}}</ref>}}
Valdemar W. Setzer:<ref group=note>Dr. Valdemar W. Setzer is a well-known Brazilian computer scientist. He is a signatory to the list named "[[A Scientific Dissent From Darwinism]]". Found in: {{cite web|url=http://www.discovery.org/scripts/viewDB/filesDB-download.php?command=download&id=660|title=A Scientific Dissent from Darwinism (List)|accessdate=July 31, 2013}}</ref>
{{cquote|Information is an informal abstraction (that is, it cannot be formalized through a logical or mathematical theory) which is in the mind of some person in the form of thoughts, representing something of significance to that person. Note that this is not a definition, it is a characterization, because "mind", "thought", "something", "significance" and "person" cannot be well defined. I assume here an intuitive (naïve) understanding of these terms.<ref name=setzer>{{cite web|url=http://www.ime.usp.br/~vwsetzer/data-info.html|author=Setzer, Valdemar W.|title=Data, Information, Knowledge and Competence|accessdate=July 31, 2013}}</ref>}}
Wikipedia:
{{cquote|Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message. Information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system that can interpret the information.<ref>{{cite web|url=http://en.wikipedia.org/wiki/Information|publisher=Wikipedia|title=Information|accessdate=July 31, 2013}}</ref>}}
The answer given by some of those present at Stephen Talbott's lectures to a large audience of librarians:
{{cquote|That's the stuff we work with.<ref name=setzer />}}

==Views==
===Information theory===
A clear definition of the concept of “information” cannot be found in information theory textbooks.<ref name=gibbs>{{cite journal|author=Lin, Shu-Kun|year=2008|title=Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship|journal=Entropy|volume=10|issue=1|pages=1-5|url=http://www.mdpi.com/1099-4300/10/1/1}}</ref> Lin proposes in this context a simple definition: “Information (I) is the amount of the data after data compression”.<ref name=gibbs /> To Shannon, the semantic aspects of communication are irrelevant to the engineering problem; the significant aspect is that the actual message is one selected from a set of possible messages.<ref name=shannon>{{cite book|author=Shannon, Claude E.; Weaver, Warren|title=The Mathematical Theory of Communication|location=Illinois|publisher=Illini Books|year=1949|page=1|id=Library of Congress Catalog Card nº 49-11922|url=http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf}}</ref> According to J. Z. Young, the concept of information in a system, for Shannon, may be defined as that feature of it which remains invariant under re-coding.<ref>{{cite book|editor=Huxley, Julian; Hardy, A. C.; Ford, E. B|title=Evolution as a Process|author=Young, J. Z|chapter=Memory, Heredity and Information|publisher=Collier Books|location=New York, N. Y.|edition=2nd|year=1963|page=326-346}}</ref>
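
Lin's compression-based definition can be illustrated concretely. The following sketch (our own minimal illustration, using Python's standard zlib module) shows that highly redundant data compresses to almost nothing, while random data barely compresses at all, and so, by this measure, contains more "information" regardless of meaning:

<syntaxhighlight lang="python">
import os
import zlib

redundant = b"ab" * 5000          # 10,000 highly redundant bytes
random_bytes = os.urandom(10000)  # 10,000 random, incompressible bytes

# "Information (I) is the amount of the data after data compression":
print(len(zlib.compress(redundant)))     # a few dozen bytes
print(len(zlib.compress(random_bytes)))  # close to 10,000 bytes
</syntaxhighlight>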

===Semiotics===
Both information theory and semiotics study information, but because of its strictly quantitative approach, information theory has a more limited scope.<ref name=semiotics /> In semiotics, the concept of information is related to signs. A sign is something that can be interpreted as having a meaning other than itself, and is therefore a vehicle of information to anyone able to decode it. Signs can be regarded in terms of interdependent levels: pragmatics, semantics, syntax, and empirics.

[[Werner Gitt|Dr. Werner Gitt]] proposes five conceptually different levels of information:<ref name=Gitt>{{cite book|author=Gitt, Werner|authorlink=Werner Gitt|title=In the Beginning was Information: A Scientist Explains the Incredible Design in Nature|year=2005|publisher=Master Books|location=Green Forest, AR|page=49-87|isbn=978-0-89051-461-0}}</ref>

{| class="wikitable"
!style="background:#FF8888;"|Level
!style="background:#AADFEA;"|Name
!style="background:#F0E68C;"|Sender
!style="background:#F088F0;"|Receiver
!style="background:#AAEADF;"|Description
|-
|1
|Statistics
|Transmitted signal
|Received signal
|At the statistical level, the meaning is completely ignored. At this level, information exclusively concerns the statistical properties of sequences of symbols.<ref name=Gitt />
|-
|2
|Syntax
|Code used
|Comprehended code
|At the syntactic level, lexical and syntactic aspects are taken into account. Information concerns all structural properties of the process of information creation, the actual sets of symbols, and the rules of syntax.<ref name=Gitt />
|-
|3
|Semantics
|Communicated ideas
|Comprehended meaning
|At the semantic level, meaning is taken into account. Information concerns the meaning of the piece of information being transmitted. Every piece of information leads back to a mental source, the mind of the sender.<ref name=Gitt />
|-
|4
|Pragmatics
|Expected action
|Implemented action
|At the pragmatic level, how transmitted information is used in practice is taken into account. Information always entails a pragmatic aspect. Information is able to cause the recipient to take some action.<ref name=Gitt /> Pragmatics also encompasses the contribution of the context to meaning.
|-
|5
|Apobetics<ref group=note>The term "apobetics" was introduced by Dr. Werner Gitt in 1981 to express the teleological aspect of information, the question of purpose. The word is derived from the {{Greek Name|αποβαίνων|apobeinon}} that means result, success, conclusion, consequence. Found in {{cite book|author=Gitt, Werner|title= In the Beginning was Information: A Scientist Explains the Incredible Design in Nature|year=2005|publisher=Master Books|location=Green Forest, AR|page=77|isbn=978-0-89051-461-0}}</ref>
|Intended purpose
|Achieved result
|At the apobetic level, the sender's purpose in transmitting the information is taken into account. Every bit of information is intentional.<ref name=Gitt /> At this level we take into account what the transmitted information indicates and implies.
|}

According to Dr. Gitt, there is no known law through which matter can give rise to information.<ref name=Gitt /> In his article on scientific laws of information, published in the [[Journal of Creation]], Dr. Gitt states that information is not a property of matter; it is a non-material entity, so its origin is likewise not explicable by material processes.<ref name=laws1>{{cite journal|title=Scientific Laws of Information and Their Implications-Part 1|author=Gitt, Werner|year=2009|journal=[[Journal of Creation]]|volume=23|issue=2|pages=96-102|issn=1036-2916}}</ref> Dr. Gitt also points out that the most important prerequisite for the production of information is the sender's own will, so that information can arise only through will, encompassing intention and purpose.<ref name=laws1 />
Gitt further points out that since information is neither matter (although it can be carried by matter) nor energy, it constitutes a third fundamental quantity of the universe.

===Biology===
It is generally accepted that the meaning of information given by Claude Shannon in his mathematical theory of information is relevant and legitimate in many areas of biology, but in recent decades, and even before, many biologists have applied informational concepts in a broader sense. They see the most basic processes characteristic of living organisms as being understood in terms of the expression of information, the execution of programs, and the interpretation of codes.<ref name=stanford>{{cite web|url=http://plato.stanford.edu/entries/information-biological/|title=Biological Information|date=Oct 4, 2007|publisher=Stanford Encyclopedia of Philosophy|accessdate=August 2, 2013}}</ref> John von Neumann stated that the genes themselves are clearly parts of a digital system of components.<ref>{{cite book|author=von Neumann, John|title=The Computer and the Brain|edition=2nd|publisher=Yale University Press|location=New Haven and London|year=2000|page=69|isbn=0-300-08473-0}}</ref> Many biologists, especially materialists, see this trend as having foundational problems.<ref name=stanford />

In any case, many scientists in various fields consider living organisms to carry biological information. [[Gregory Chaitin]], a renowned Argentine-American mathematician and computer scientist, sees [[DNA]] as a computer program for calculating the organism, and the relationship between male and female as a means of transmitting biological information from the former to the latter.<ref>{{cite book|author=Chaitin, Gregory|authorlink=Gregory Chaitin|title=Meta Math!: The Quest for Omega|publisher=Vintage Books|location=New York|year=2005|page=66-74|url=http://arxiv.org/pdf/math/0404335.pdf|isbn=978-1-4000-7797-7}}</ref> David Baltimore, an American biologist and Nobel laureate, stated that "Modern Biology is a science of information".<ref name=uncensored>{{cite book|title=[[Intelligent Design Uncensored: An Easy-to-Understand Guide to the Controversy]]|author=[[William Dembski|Dembski, William A.]]; Witt, Jonathan|page=72-73|publisher=InterVarsity Press|location=Downers Grove, Illinois|isbn=978-0-8308-3742-7|year=2010}}</ref> Edmund Jack Ambrose, quoted by Davis and Kenyon, said that "There is a message if the order of bases in DNA can be translated by the cell into some vital activity necessary for survival or reproduction".<ref name=pandas>{{cite book|title=Of Pandas and People: The Central Question of Biological Origins|author=Davis, Percival; [[Dean H. Kenyon|Kenyon, Dean H]]|publisher=Haughton Publishing Company|location=Dallas, Texas|edition=2nd|page=64|isbn=0-914513-40-0}}</ref> [[Richard Dawkins]], a British ethologist and evolutionary biologist, has written that life itself is the flow of a river of DNA, which he also calls a river of information.<ref>{{cite book|author=Dawkins, Richard|authorlink=Richard Dawkins|title=River Out of Eden|year=1995|publisher=Basic Books|location=New York|page=4|isbn=978-0-465-06990-3}}</ref> [[Stephen Meyer]] points out that producing organismal form requires the generation of information in Shannon's sense, but he goes further to observe that "like meaningful sentences or lines of computer code, genes and proteins are also specified with respect to function."<ref>{{cite book| author=Meyer, Stephen C|authorlink=Stephen Meyer|editor=Dembski, William A|title=Darwin's Nemesis: Philip Johnson and the Intelligent Design Movement|publisher=Inter-Varsity Press|location=Downers Grove, IL|year=2006|page=178-179|chapter=12-The Origin of Biological Information and the Higher Taxonomic Categories|isbn=978-0-8308-2836-4}}</ref> Meyer points out that the information contained in DNA has a high degree of specificity. [[David Berlinski]], an American philosopher, educator, and author, also draws a parallel between biology and information theory. In his book ''The Deniable Darwin & Other Essays'' he states:

{{cquote|Whatever else a living creature may be...[it] is ''also'' a combinatorial system, its organization controlled by a strange, a hidden and obscure text, one written in a biochemical code. It is an algorithm that lies at the humming heart of life, ferrying information from one set of symbols (the nucleic acids) to another (the proteins)<ref>{{cite book|title=[[The Deniable Darwin and Other Essays]]|author=Berlinski, David|authorlink=David Berlinski|year=2010|publisher=Discovery Institute Press|location=Seattle|page=153|isbn=978-0-979014-12-3}} </ref>}}

==Quantifying information==
David Salomon states: "Information seems to be one of those entities that cannot be precisely defined, cannot be quantified, and cannot be dealt rigorously".<ref name=Coding>{{cite book|author=Salomon, David|title=Coding for Data and Computer Communications|publisher=Springer|location=New York|year=2005|page=59|isbn=0-387-21245-0}}</ref> Salomon went on to say, however, that in the field of information theory, information can be treated quantitatively.<ref name=Coding /> In ''A Mathematical Theory of Communication'' Shannon endows the term ''information'' not only with technical significance but also with a measure.<ref name=solomon>{{Cite book|author=Salomon, David|title=Data Compression: The Complete Reference|publisher=Springer|location=New York|page=279|year=2000|edition=2nd|isbn=0-387-95045-1}}</ref> Shannon developed the idea of a quantitative measure of information and defined a quantity called self-information.<ref name=Sayood>{{cite book|author=Sayood, Khalid|title=Introduction to Data Compression|publisher=Morgan Kaufmann Publishers|location=San Francisco|year=2000|edition=2nd|page=13-14|isbn=1-55860-558-4}}</ref> The self-information, denoted by ''i'', associated with an event A is given by:

<big>i(A) = -log<sub>''b''</sub>''P''(A)</big>

where ''P''(A) is the probability that the event A will occur and ''b'' is the chosen base of the logarithm. If the unit of information is bits, ''b'' = 2 is used, and so on for other units.<ref name=Sayood />

The measurement of information, in mathematical terms, has to consider the number of signals, their probability, and combinatorial restrictions.<ref name=semiotics /> The rarer a signal is, the more information it transmits; the more frequent it is, the less information it transmits.<ref name=semiotics /> It is worth noting that while we can quantify the probability of any given symbol, we can use no absolute number for the information content of a given message.<ref>{{cite book|author=Nelson, Mark; Gailly, Jean-Loup|title=The Data Compression Book|edition=2nd|publisher=M&T Books|location=New York|year=1996|page=14|isbn=1-55851-434-1}}</ref>
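
As a minimal worked example of the formula above (the probabilities are invented for illustration), a rare event carries more self-information than a frequent one:

<syntaxhighlight lang="python">
from math import log

def self_information(p, b=2):
    """Self-information i(A) = -log_b P(A); with b = 2 the unit is bits."""
    return -log(p, b)

print(self_information(0.5))    # 1.0 bit   (a fair coin toss)
print(self_information(0.125))  # 3.0 bits  (a 1-in-8 event: rarer, more information)
print(self_information(0.99))   # ~0.014 bits (a near-certain event: almost none)
</syntaxhighlight>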

==Spontaneous appearance ==
[[File:Ultimate Sand Castle.jpg|thumb|200px|A very impressive sand castle on the beach near St Helier in Jersey.]]
Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith, and many other biologists have stated that the origin of information is biology's central problem.<ref name=uncensored /> Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible.<ref name=wilder>{{cite book|author=Wilder-Smith, A. E.|authorlink=A. E. Wilder-Smith|title=The Scientific Alternative to Neo-Darwinian Evolutionary Theory|publisher=The Word For Today Publishers|location=Costa Mesa, California|year=1987|page=23-51|isbn=0-936728-1B}}</ref> In his book ''Steps Towards Life'' Eigen states what he regards as the central problem faced in origins-of-life research: "Our task is to find an algorithm, a natural law that leads to the origin of information".<ref name=dembski1>{{cite book|author=Dembski, William A|authorlink=William Dembski|title=Intelligent Design:The Bridge Between Science & Theology|publisher=IVP Academic|location=Downers Grove, Illinois|year=1999|page=153-183|isbn=0-8308-2314-X}}</ref><ref group=note>The Eigen book quoted by Dembski is {{cite book|author=Eigen, Manfred|title=Steps Towards Life: A Perspective on Evolution|location=Oxford|publisher=Oxford University Press|year=1992|page=12|isbn=0-19854751-X}}</ref> [[A. E. Wilder-Smith]], in contrast, states that

{{cquote|If information, like entropy were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.<ref name=wilder />}}

Wilder-Smith establishes a distinction between actual and potential information. The former can never be synthesized by stochastic processes, while the latter might be. He draws a comparison between actual information and negentropy<ref group=note>Negentropy, also negative entropy or syntropy, of a living system is the entropy that it exports to keep its own entropy low.</ref> and, on the other side, a correspondence between potential information and entropy.<ref name=wilder /> Wilder-Smith offers a simple example that clarifies the distinction between potential and actual information. The potential to make pictures out of a large number of randomly distributed dots is infinite, yet a given set of randomly distributed dots will not, in reality, show an image of anything in particular (e.g., a bicycle). The randomly distributed dots possess the capacity for endless amounts of information, but by themselves they communicate none; there is thus no actual information.<ref name=wilder />

Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.<ref>{{cite book|author=Lester, Lane P; [[Raymond Bohlin|Bohlin, Raymond G]]|title=[[The Natural Limits to Biological Change]]|location=Dallas|publisher=Probe Books|edition=2nd|year=1989|page=157|isbn=0-945241-06-2}}</ref>

In his book ''[[A Case Against Accident and Self-Organization]]'', [[Dean L. Overman]] builds a compelling case that life is no accident. It is not possible to reproduce the book's entire argument here. Overman posits that a central distinction between living and non-living matter is the existence of a genome, or a composite of genetic messages, which carries enough information content to replicate and maintain the organism.<ref name=overman>{{cite book|author=Overman, Dean L|authorlink=Dean L. Overman|title=[[A Case Against Accident and Self-Organization]]|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref> The information contained in the genetic code, like any information or message, is not made of matter. The meaning of the genetic code cannot be reduced to a physical or chemical property.<ref name=overman /> Information content is the minimum number of instructions necessary to specify the structure, and, in living systems, the information content requires an enormous number of specified instructions.<ref name=overman /> According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins, or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.<ref name=overman /><ref group=note>Sir Fred Hoyle and Chandra Wickramasinghe calculated the probability of the different enzymes forming in one place at one time to produce a single bacterium as 1 in 10<sup>40,000</sup>. Hubert Yockey calculated the probability of the appearance of iso-1-cytochrome c at random as 2 x 10<sup>-44</sup>. Walter L. Bradley and Charles B. Thaxton calculated the probability of a random formation of amino acids into a protein as 4.9 x 10<sup>-191</sup>. Harold Morrison obtained in his calculations the impressive number of 1 in 10<sup>100,000,000,000</sup> for a single-celled bacterium to develop from accidental or chance processes. As quoted by Overman in the book: {{cite book |last=Overman|first=Dean L|title=A Case Against Accident and Self-Organization|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|page=33-102|isbn=0-8476-8966-2}}</ref>

Evolutionist [[Michael Denton]] wrote the controversial book ''[[Evolution: A Theory in Crisis]]''. In it, writing about the origin of life, Denton states:

{{cquote|The failure to give a plausible evolutionary explanation for the origin of life casts a number of shadows over the whole field of evolutionary speculation.<ref>{{cite book|author=Denton, Michael|authorlink=Michael Denton|title=[[Evolution: A Theory in Crisis]]|page=271|publisher=Adler & Adler|location=Chevy Chase, MD|year=1985|isbn=0-917561-52-X}}</ref>}}

Due to the enormous odds against [[abiogenesis]] on [[earth]], some scientists have turned to the [[panspermia]] hypothesis, the belief that life originated off this planet. Among these scientists are Francis Crick, [[Fred Hoyle]], Svante Arrhenius, Leslie Orgel and Thomas Gold.<ref name=overman /><ref>{{cite book|author=Shapiro, Robert|title=Origins: A Skeptic's Guide to the Creation of Life on Earth|publisher=Bantam Books|location=Toronto|year=1987|page=226-227|isbn=0-553-34355-6}}</ref>

==Problem for evolution==
According to [[Jonathan Sarfati]], the main scientific objection to evolution is not whether changes, whatever their extent, occur through time. The key issue is the huge increase in genetic information content needed to go from the DNA of microbes to the complexity of human DNA.<ref>{{cite book|author=Sarfati, Jonathan|authorlink=Jonathan Sarfati|title=[[The Greatest Hoax on Earth?]]:Refuting Dawkins on Evolution|location=Atlanta, Georgia|publisher=Creation Book Publishers|year=2010|page=43|isbn=978-0-949906-73-1}}</ref> [[Lee Spetner|Dr. Lee Spetner]] points out that in living organisms adaptation often takes place by reducing the information in the genome, and notes that the vertebrate eye or immune system could never have evolved by loss of information alone.<ref>{{cite book|author=Spetner, Lee M|authorlink=Lee Spetner|title=[[Not by Chance!]] |publisher=The Judaica Press|year=1998|location=Brooklyn, New York|page=127-160|chapter=5-Can Random Variation Build Information?|isbn=1-880582-24-4}}</ref>

==Notes==
<references group="note"/>

==References==
{{reflist|2}}

==External links==
* {{cite journal|author=Lin, Shu-Kun|year=2008|title=Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship|journal=Entropy|volume=10|issue=1|pages=1-5|url=http://www.mdpi.com/1099-4300/10/1/1}}
* {{cite journal|author=Floridi, Luciano|year=2005|title=Is Information Meaningful Data?|journal=Philosophy and Phenomenological Research|volume=70|issue=2|pages=351–370|url=http://philsci-archive.pitt.edu/archive/00002536/01/iimd.pdf}}
* Floridi, Luciano (2005). 'Semantic Conceptions of Information', ''The Stanford Encyclopedia of Philosophy'' (Winter 2005 Edition), Edward N. Zalta (ed.). Available online at [http://plato.stanford.edu/entries/information-semantic/ Stanford University]
* [http://www.lehigh.edu/~dac511/literature/casagrande1999.pdf Information as a Verb: Re-conceptualizing Information for Cognitive and Ecological Models] by David G. Casagrande.
* [http://plato.stanford.edu/entries/information-biological/ Biological Information] at Stanford Encyclopedia of Philosophy
* [http://www.astorehouseofknowledge.info/w/Information Information] at ''A Storehouse of Knowledge''.

[[Category:Information theory]]

[[en:Information]]
[[pt:Informação]]