[[File:HOT word in sand.jpg|thumb|200px|"HOT" written in the sand. Someone left this information in the sand with some purpose.]]'''Information''' is a term probably derived from the stem {{Latin Name|information-}}, which in turn is derived from the [[Latin]] verb {{Latin Name2|informare}}, meaning "to give form to the mind", "to discipline", "instruct", "teach".
Information is generally understood as knowledge or facts that one has acquired, but in some areas of science information has slightly different definitions. There is no precise definition of '''information''', of what is or is not information.<ref>{{cite web|url=http://www.ime.usp.br/~is/ddt/mac333/aulas/tema-11-24mai99.html|title=MAC 333 - A Revolução Digital e a Sociedade do Conhecimento - Tema 11 - O que é Informação? Como ela age?|author=Simon, Imre|trans_title=MAC 333 - The Digital Revolution and the Knowledge Society - Track 11 - What is Information? How it works?|accessdate=July 31, 2013}}</ref><ref group=note>[[Wikipedia:Imre Simon|Dr. Imre Simon]], who made this reference, was a well-known Hungarian-born Brazilian mathematician and computer scientist.</ref> What we have instead is a vague and intuitive notion.
==Quantifying information==
David Salomon states: "Information seems to be one of those entities that cannot be precisely defined, cannot be quantified, and cannot be dealt with rigorously".<ref name=Coding>{{cite book|author=Salomon, David|title=Coding for Data and Computer Communications|publisher=Springer|location=New York|year=2005|page=59|isbn=0-387-21245-0}}</ref> Salomon went on to say, however, that in the field of information theory, information can be treated quantitatively.<ref name=Coding />
===Shannon entropy===
In ''A Mathematical Theory of Communication'' Shannon endows the term ''information'' not only with a technical meaning, but also with a measure.<ref name=solomon>{{Cite book|author=Salomon, David|title=Data Compression: The Complete Reference|publisher=Springer|location=New York|page=279|year=2000|edition=2nd|isbn=0-387-95045-1}}</ref> Shannon theorized the idea of a quantitative measure of information and defined a quantity called self-information.<ref name=Sayood>{{cite book|author=Sayood, Khalid|title=Introduction to Data Compression|publisher=Morgan Kaufmann Publishers|location=San Francisco|year=2000|edition=2nd|page=13-14|isbn=1-55860-558-4}}</ref> The self-information, denoted by ''i'', associated with an event A is given by:

:<big>i(A) = -log<sub>''b''</sub>''P''(A)</big>
where ''P''(A) is the probability that the event A will occur and ''b'' is the chosen base of the log. If the unit of information is bits, we use ''b'' = 2; other choices of base give other units.<ref name=Sayood />
The measurement of information, in mathematical terms, has to consider the number of signals, their probability, and combinatorial restrictions.<ref name=semiotics /> The amount of information transmitted by a signal increases with its rarity: the more frequent a signal is, the less information it transmits.<ref name=semiotics /> It is worth noting that while we can quantify the probability of any given symbol, we can give no absolute number for the information content of a given message.<ref>{{cite book|author=Nelson, Mark; Gailly, Jean-Loup|title=The Data Compression Book|edition=2nd|publisher=M&T Books|location=New York|year=1996|page=14|isbn=1-55851-434-1}}</ref>
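By way of illustration (a sketch of our own, not drawn from the cited sources), self-information can be computed directly from a probability; the function name and example probabilities below are assumptions made for the demonstration:

<pre>
import math

def self_information(p, base=2):
    # i(A) = -log_b P(A); with base 2 the result is in bits.
    return -math.log(p, base)

print(self_information(0.5))    # 1.0 bit: a fair coin toss
print(self_information(0.125))  # 3.0 bits: a rarer, 1-in-8 event
</pre>

As expected, the rarer event carries more bits of self-information than the common one.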
 
===Chaitin-Kolmogoroff theory===
Another way of measuring information content is the Kolmogorov complexity (also known as Chaitin information). The Kolmogorov complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (such as Turing machines). Let ''x'' be a binary string and let d(''x'') be the shortest string <M, ''i''> obtained by concatenating a Turing machine M and an input ''i'' on which M halts, leaving the string ''x'' on the tape.<ref name=Sipser>{{cite book|author=Sipser, Michael|title=Introduction to the Theory of Computation|publisher=PWS Publishing Company|location=Boston|year=1997|page=214-215|isbn=0-534-94728-X}}</ref> The Kolmogorov complexity K(''x'') is:
 
:<big>K(''x'') = |d(''x'')|</big>
 
that is, the Kolmogorov complexity is the length of the minimal description of ''x''.<ref name=Sipser /> The complexity can be viewed as a measure of the "patternlessness" of the sequence, and can be equated with the idea of randomness.<ref>{{cite book|author=Knuth, Donald|title=The Art of Computer Programming: Seminumerical Algorithms|volume=2|edition=2nd|page=163-166|publisher=Addison-Wesley Publishing Company|location=Reading, Massachusetts|year=1981|isbn=0-201-03822-6}}</ref> The length of the shortest description will depend on the choice of description language. By way of illustration we compare two strings:
 
"CREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKICREATIONWIKI"
 
and
 
"7W7JAHAGLKJHGBNMZCXVSFQP92725FFADSHALKJNMZAQWSXPLÇKJHGTRFOUMSVAXZXCTEÇALSKDJFHGBEOQI"
 
Both strings have the same number of letters, but the former can be represented in a more compact way: 7 × "CREATIONWIKI". Another way to represent the first sequence is with a program in a Pascal-like language:
 
<code>
for i:=1 to m<br />
:write('CREATIONWIKI');
</code>
 
This program contains 34 ASCII characters (counting the blanks and the new line) plus 1 character for the parameter (the value 7 of the variable ''m'' in this particular case). For other values of ''m'' the program length in characters will be 34 + log ''m''. One way to measure the randomness of the first sequence is to form the ratio of program length to string length.<ref>{{cite book|author=Dewdney, A. K|title=The New Turing Omnibus: 66 Excursions in Computer Science|chapter=8-Random Numbers: The Chaitin-Kolmogoroff Theory|page=49-55|publisher=A. W. H. Freeman/Owl Books|location=New York|year=1993|isbn=978-0-8050-7166-5}}</ref> This yields a randomness measure of:
 
:''r'' ≤ (34 + log ''m'')/(12''m'')
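To see how this ratio behaves as the string grows, here is a small sketch of our own in Python (assuming, for illustration, that the logarithm counts the decimal digits of the parameter, i.e. base 10):

<pre>
import math

def randomness_ratio(m):
    # Program length (34 fixed characters plus log m for the parameter)
    # divided by the string length 12*m ("CREATIONWIKI" has 12 letters).
    return (34 + math.log10(m)) / (12 * m)

for m in (7, 100, 10000):
    print(m, round(randomness_ratio(m), 4))
</pre>

The ratio falls toward 0 as ''m'' grows: the longer the repetitive string, the less random it is under this measure.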
 
As in the previous case (Shannon), the Kolmogorov complexity cannot measure the meaning of information. What it measures, in effect, is the compressibility of a given sequence.
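Although Kolmogorov complexity is uncomputable in general, a general-purpose compressor gives a practical upper bound on it, which makes the compressibility point concrete. A minimal Python sketch of our own (the compressed length includes some fixed zlib overhead, so it is only an approximation):

<pre>
import zlib

regular = "CREATIONWIKI" * 7
irregular = "7W7JAHAGLKJHGBNMZCXVSFQP92725FFADSHALKJNMZAQWSXPLÇKJHGTRFOUMSVAXZXCTEÇALSKDJFHGBEOQI"

for label, s in (("regular", regular), ("irregular", irregular)):
    data = s.encode("utf-8")
    packed = zlib.compress(data, 9)
    # A shorter compressed form is a shorter description, hence a
    # lower (approximate) Kolmogorov complexity.
    print(label, len(data), "->", len(packed), "bytes")
</pre>

The regular string compresses to far fewer bytes than the patternless one, mirroring the difference in their Kolmogorov complexities.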
==Spontaneous appearance==