Information
{{traducción}}
[[File:HOT word in sand.jpg|thumb|200px|"HOT" written in the sand. Someone left this information in the sand with a purpose.]]
'''Information''' is a term probably derived from the Latin stem {{Nombre Latino|information-}}, which in turn derives from the [[Latin]] verb {{Latin Name2|informare}}, meaning "to give form to the mind", "to discipline", "to instruct", "to teach". Information is generally understood as knowledge or facts that one has acquired, but in some areas of science it carries slightly different definitions. Even so, we do not have a precise definition of '''information''', of what is or what is not information.<ref>{{cite web|url=http://www.ime.usp.br/~is/ddt/mac333/aulas/tema-11-24mai99.html|title=MAC 333 - A Revolução Digital e a Sociedade do Conhecimento - Tema 11 - O que é Informação? Como ela age?|author=Simon, Imre|trans_title=MAC 333 - The Digital Revolution and the Knowledge Society - Track 11 - What is Information? How it works?|accessdate=July 31, 2013}}</ref><ref group=nota>[[Wikipedia:Imre Simon|Dr. Imre Simon]], the author of this reference, was a well-known Hungarian-born Brazilian mathematician and computer scientist.</ref> What we have is rather a vague and intuitive notion.
The word "information" is used in many ways. We mentioned the lay person's sense above, but the word is also used for a sequence of symbols that conveys meaning, such as the letters of a language (see the picture at right), the dots and dashes of Morse code, or the arrangement of the bumps of Braille.
It is mainly these latter senses that are discussed in this article.
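A short sketch can make the symbol-sequence sense concrete (an illustration added here in Python, not part of the original article; the Morse table below covers only the three letters needed):

<syntaxhighlight lang="python">
# The same message carried by two different symbol systems: the meaning is
# identical, only the carrier symbols change.
MORSE = {"H": "....", "O": "---", "T": "-"}  # tiny subset of the Morse code table

message = "HOT"  # the word from the picture above
encoded = " ".join(MORSE[ch] for ch in message)
print(message)  # HOT          (letters of a language)
print(encoded)  # .... --- -   (dots and dashes of Morse code)
</syntaxhighlight>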
Royal Truman, in his analysis published in the ''[[Journal of Creation]]'', discusses two families of approaches: one derived from Shannon's work and the other derived from [[Werner Gitt|Gitt]]'s work.<ref name=truman1>{{cite journal|title=Information Theory-Part 1: Overview of Key Ideas|author=Truman, Royal|year=2012|journal=[[Journal of Creation]]|volume=26|issue=3|pages=101-106|issn=1036-2916}}</ref> Truman also mentions the algorithmic definition of information, developed by Solomonoff and Kolmogorov with contributions from [[Gregory Chaitin]], but it is not discussed in his article.<ref name=truman1 /> According to [[Stephen Meyer]], scientists usually distinguish two basic types of information: meaningful or functional information and so-called Shannon information<ref name=doubt>{{cite book|author=Meyer, Stephen C|authorlink=Stephen Meyer|title=Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design|publisher=HarperOne/HarperCollins Publishers|location=Seattle, WA|year=2013|pages=164-168|isbn=978-0-06-207147-7}}</ref> (named after Claude Shannon, who developed statistical information theory). Shannon information is not the same thing as meaningful information. Meaningful information, encoded into a language, can be measured statistically, per Shannon, but the measure is of the redundancy of the symbols, the so-called Shannon entropy,<ref group=nota>Shannon entropy is the average unpredictability in a random variable, which is equivalent to its information content.</ref> not a measure of the "information content", or meaning. Shannon's measure, for example, can be used to quantify the "information content" of a set of random symbols that have no meaning at all.
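This distinction can be made concrete with a small Python sketch (an illustration added here, not taken from Truman or Meyer): it computes the Shannon entropy of a meaningful English phrase and of a random string of the same length. The two values are of the same order, even though only one of the strings means anything.

<syntaxhighlight lang="python">
import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    """Average information per symbol in bits: H = -sum(p_i * log2(p_i))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

meaningful = "in the beginning was information"
gibberish = "".join(random.choice(string.ascii_lowercase + " ")
                    for _ in range(len(meaningful)))

# Both values land in the same few-bits-per-symbol range: Shannon's measure
# quantifies symbol statistics, not meaning.
print(f"meaningful: {shannon_entropy(meaningful):.2f} bits/symbol")
print(f"random:     {shannon_entropy(gibberish):.2f} bits/symbol")
</syntaxhighlight>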
==Definitions or Characterizations==
Gregory Bateson:
{{cquote|Information is a difference which makes a difference.<ref>{{cite web|url=http://www.wired.com/wired/archive/2.03/economy.ideas_pr.html|title=The Economy of Ideas: A framework for patents and copyrights in the Digital Age. (Everything you know about intellectual property is wrong.)|author=Barlow, John Perry|accessdate=July 31, 2013}}</ref>}}
Valdemar W. Setzer:<ref group=nota>Dr. Valdemar W. Setzer is a well-known Brazilian computer scientist. He is a signatory of the list named "[[A Scientific Dissent From Darwinism]]". Found in: {{cite web|url=http://www.discovery.org/scripts/viewDB/filesDB-download.php?command=download&id=660|title=A Scientific Dissent from Darwinism (List)|accessdate=July 31, 2013}}</ref>
{{cquote|Information is an informal abstraction (that is, it cannot be formalized through a logical or mathematical theory) which is in the mind of some person in the form of thoughts, representing something of significance to that person. Note that this is not a definition, it is a characterization, because "mind", "thought", "something", "significance" and "person" cannot be well defined. I assume here an intuitive (naïve) understanding of these terms.<ref name=setzer>{{cite web|url=http://www.ime.usp.br/~vwsetzer/data-info.html|author=Setzer, Valdemar W.|title=Data, Information, Knowledge and Competence|accessdate=July 31, 2013}}</ref>}}
Wikipedia:
{| class="wikitable"
! Level !! Designation !! Transmitter side !! Receiver side
|-
| 5 || Apobetics<ref group=nota>The term "apobetics" was introduced by Dr. Werner Gitt in 1981 to express the teleological aspect of information, the question of purpose. The word is derived from the {{Greek Name|αποβαίνων|apobeinon}}, meaning result, success, conclusion, consequence. Found in {{cite book|author=Gitt, Werner|title=In the Beginning was Information: A Scientist Explains the Incredible Design in Nature|year=2005|publisher=Master Books|location=Green Forest, AR|page=77|isbn=978-0-89051-461-0}}</ref> || Intended purpose || Achieved result
|}
==Spontaneous appearance==
[[File:Ultimate Sand Castle.jpg|thumb|200px|A very impressive sand castle on the beach near St Helier in Jersey.]]
Manfred Eigen, Bernd-Olaf Küppers, John Maynard Smith, and many other biologists have stated that the origin of information is biology's central problem.<ref name=uncensored /> Some, like Manfred Eigen, argue that the spontaneous, stochastic emergence of information out of chaos is possible.<ref name=wilder>{{cite book|author=Wilder-Smith, A. E.|authorlink=A. E. Wilder-Smith|title=The Scientific Alternative to Neo-Darwinian Evolutionary Theory|publisher=The Word For Today Publishers|location=Costa Mesa, California|year=1987|pages=23-51|isbn=0-936728-1B}}</ref> In his book ''Steps Towards Life'', Eigen states what he regards as the central problem faced in origins-of-life research: "Our task is to find an algorithm, a natural law that leads to the origin of information".<ref name=dembski1>{{cite book|author=Dembski, William A|authorlink=William Dembski|title=Intelligent Design: The Bridge Between Science & Theology|publisher=IVP Academic|location=Downers Grove, Illinois|year=1999|pages=153-183|isbn=0-8308-2314-X}}</ref><ref group=nota>The Eigen book quoted by Dembski is {{cite book|author=Eigen, Manfred|title=Steps Towards Life: A Perspective on Evolution|location=Oxford|publisher=Oxford University Press|year=1992|page=12|isbn=0-19854751-X}}</ref> [[A. E. Wilder-Smith]], in contrast, states that
{{cquote|If information, like entropy were to arise stochastically, then the basis of Shannon and Wiener's definition of information would be fundamentally and thoroughly destroyed.<ref name=wilder />}}
Wilder-Smith draws a distinction between actual and potential information. The former can never be synthesized by stochastic processes, while the latter might be. He equates actual information with negentropy<ref group=nota>Negentropy, also negative entropy or syntropy, of a living system is the entropy that it exports to keep its own entropy low.</ref> and, on the other side, potential information with entropy.<ref name=wilder /> Wilder-Smith offers a simple example that clarifies the distinction. The potential to make pictures out of a large number of randomly distributed dots is unlimited, yet a given set of randomly distributed dots will not, in reality, show an image that looks like something (e.g., a bicycle). Randomly distributed points possess the capacity for endless amounts of information, but by themselves they communicate none; there is, indeed, no actual information.<ref name=wilder />
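Wilder-Smith's dot picture can be put into rough numbers (a sketch under stated assumptions, added here and not his own calculation: a small binary grid stands in for the field of dots, and one fixed grid stands in for the "bicycle"):

<syntaxhighlight lang="python">
import math
import random

SIZE = 16  # assumed grid size: 16 x 16 = 256 dots

# A random field of dots: each cell filled or empty with probability 1/2.
random_dots = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

# Potential information: at 1 bit per dot the grid can carry 256 bits,
# the maximum for this many binary symbols.
print(f"capacity: {SIZE * SIZE} bits")

# Actual information: the chance that this random field happens to form one
# particular target picture is 2^-256, i.e. about 10^-77.
p_target = 2.0 ** -(SIZE * SIZE)
print(f"P(random dots match the target) = 10^{math.log10(p_target):.0f}")
</syntaxhighlight>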
Lester and Bohlin also agree with Wilder-Smith. They point out that several authors in recent years have established a connection between the genetic code present in DNA and information theory. The overall conclusion of their studies is that information cannot arise spontaneously by mechanistic processes.<ref>{{cite book|author=Lester, Lane P; [[Raymond Bohlin|Bohlin, Raymond G]]|title=[[The Natural Limits to Biological Change]]|location=Dallas|publisher=Probe Books|edition=2nd|year=1989|page=157|isbn=0-945241-06-2}}</ref>
In his book ''[[A Case Against Accident and Self-Organization]]'', [[Dean L. Overman]] builds a case that life is no accident; the book's entire argument cannot be reproduced here. Overman posits that a central distinction between living and non-living matter is the existence of a genome, a composite of genetic messages, which carries enough information content to replicate and maintain the organism.<ref name=overman>{{cite book|author=Overman, Dean L|authorlink=Dean L. Overman|title=[[A Case Against Accident and Self-Organization]]|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|pages=33-102|isbn=0-8476-8966-2}}</ref> The information contained in the genetic code, like any information or message, is not made of matter, and the meaning of the genetic code cannot be reduced to a physical or chemical property.<ref name=overman /> Information content is the minimum number of instructions necessary to specify the structure, and in living systems the information content requires an enormous number of specified instructions.<ref name=overman /> According to Overman, many have proposed calculations of the probability that complex organic compounds such as enzymes, proteins, or DNA molecules could emerge by chance, and many have concluded that this probability is extremely low, virtually an impossibility.<ref name=overman /><ref group=nota>Sir Fred Hoyle and Chandra Wickramasinghe calculated the probability of the different enzymes forming in one place at one time to produce a single bacterium as 1 in 10<sup>40,000</sup>. Hubert Yockey calculated the probability of the appearance of iso-1-cytochrome c at random as 2 x 10<sup>-44</sup>. Walter L. Bradley and Charles B. Thaxton calculated the probability of a random formation of amino acids into a protein as 4.9 x 10<sup>-191</sup>. Harold Morrison obtained in his calculations the impressive number of 1 in 10<sup>100,000,000,000</sup> for a single-celled bacterium to develop from accidental or chance processes. As quoted by Overman in: {{cite book|last=Overman|first=Dean L|title=A Case Against Accident and Self-Organization|publisher=Rowman & Littlefield Publishers|location=Lanham|year=1997|pages=33-102|isbn=0-8476-8966-2}}</ref>
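The flavor of such probability estimates can be reproduced with a back-of-the-envelope sketch (an illustration added here, not one of the published calculations cited above; it assumes a 100-residue protein whose residues are drawn independently and equiprobably from the 20 standard amino acids, ignoring all chemistry):

<syntaxhighlight lang="python">
import math

ALPHABET = 20  # standard amino acids (assumed equiprobable)
LENGTH = 100   # residues in a modest protein (illustrative choice)

# P(one specific sequence) = (1/20)^100; work in log10 to avoid underflow.
log10_p = LENGTH * math.log10(1 / ALPHABET)
print(f"P = 10^{log10_p:.1f}")  # P = 10^-130.1

# Even granting 10^37 random trials, the expected number of successes
# stays around 10^-93, which is negligible.
print(f"expected successes after 1e37 trials: 10^{log10_p + 37:.0f}")
</syntaxhighlight>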
Evolutionist [[Michael Denton]] wrote the controversial book ''[[Evolution: A Theory in Crisis]]''. In it, writing about the origin of life, Denton states:
==Notes==
<references group="nota"/>
==References==
{{reflist|2}}
==External links==