_The Information: A History, a Theory, a Flood_ by James Gleick
NY: Pantheon Books, 2011
ISBN 978-0-375-42372-7
(22) He [John Carrington] finally published his discoveries about drums in 1949, in a slim volume titled The Talking Drums of Africa.
(25) Songe, the moon, is rendered as songe li tange la manga - "the moon looks down at the earth." Koko, the fowl, is rendered koko olongo la bokiokio - "the fowl, the little one that says kiokio." The extra drumbeats, far from being extraneous, provide context. Every ambiguous word begins in a cloud of possible alternative interpretations; then the unwanted possibilities evaporate. This takes place below the level of consciousness. Listeners are hearing only staccato drum tones, low and high, but in effect they "hear" the missing consonants and vowels, too. For that matter, they hear whole phrases, not individual words. "Among peoples who know nothing of writing or grammar, a word per se, cut out of its sound group, seems almost to cease to be an intelligible articulation," Captain Rattray reported.
The stereotyped long tails flap along, their redundancy overcoming ambiguity. The drum language is creative, freely generating neologisms for innovations from the north: steamboats, cigarettes, and the Christian god being three that Carrington particularly noted. But drummers begin by learning the traditional fixed formulas. Indeed, the formulas of the African drummers sometimes preserve archaic words that have been forgotten in the everyday language. For the Yaunde, the elephant is always "the great awkward one." The resemblance to Homeric formulas - not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea - is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.
NB: Albert Lord, The Singer of Tales
(39) Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures: for example, for geometrical shapes. Shown drawings of circles and squares, they named them as "plate, sieve, bucket, watch, or moon" and "mirror, door, house, apricot drying board." They could not, or would not, accept logical syllogisms. A typical question:
In the Far North, where there is snow, all bears are white. Novaya Zemlya is in the Far North and there is always snow there. What color are the bears?
Typical response: "I don't know. I've seen a black bear. I've never seen any others... Each locality has its own animals."
By contrast, a man who has just learned to read and write responds, "To go by your words, they should all be white." To go by your words - in that phrase, a level is crossed.
(166) "A word is a tool for thinking, before the thinker uses it as a signal for communicating his thought," asserted in Harper's New Monthly Magazine in 1873.
(229) Looking at correlations extending over eight letters, [Claude] Shannon estimated that English has a built-in redundancy of about 50%: that each new character of a message conveys not 5 bits but only about 2.3. Considering longer-range statistical effects, at the level of sentences and paragraphs, he raised that estimate to 75% - warning, however, that such estimates become "more erratic and uncertain, and they depend more critically on the type of text involved." One way to measure redundancy was crudely empirical: carry out a psychology test with a human subject. This method "exploits the fact that anyone speaking a language possesses, implicitly, an enormous knowledge of the statistics of the language."
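A rough way to see the numbers Shannon is working with - the sketch below is my own illustration, not Shannon's procedure - is to estimate per-character entropy from single-letter frequencies of a sample text; it is Shannon's longer-range correlations that pull the figure down toward 2.3 bits.

```python
# Single-letter entropy estimate of a text sample. Longer-range statistics
# (digrams, words, sentence structure) are what lower the figure further.
import math
from collections import Counter

def unigram_entropy(text):
    chars = [c for c in text.lower() if c.isalpha() or c == " "]
    counts = Counter(chars)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "information is surprise and redundancy is what makes language forgiving"
print(f"{unigram_entropy(sample):.2f} bits/char, vs log2(27) = {math.log2(27):.2f}")
```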
(247) Information is surprise.
NB: Surprise is an attitude. It does not have to be novelty.
(266) Later, he [Claude Shannon] wrote thousands of words on scientific aspects of juggling - with theorems and corollaries - and included from memory a quotation from E. E. Cummings: "Some son-of-a-bitch will invent a machine to measure Spring with."
(280) To the physicist, entropy is a measure of uncertainty about the state of a physical system: one state among all the possible states it can be in. These microstates may not be equally likely, so the physicist writes S = -∑ pᵢ log pᵢ.
To the information theorist, entropy is a measure of uncertainty about a message: one message among all the possible messages that a communications source can produce. The possible messages may not be equally likely, so Shannon wrote H = -∑ pᵢ log pᵢ.
It is not just a coincidence of formalism: nature providing similar answers to similar problems. It is all one problem. To reduce entropy in a box of gas, to perform useful work, one pays a price in information. Likewise, a particular message reduces the entropy in the ensemble of possible messages - in terms of dynamical systems, a phase space.
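For concreteness, a minimal sketch of the shared formula, using an invented four-symbol source (my numbers, not the book's):

```python
# Shannon entropy H = -sum(p_i * log2(p_i)) of a message source.
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # equally likely symbols: 2.0 bits
print(entropy([0.70, 0.15, 0.10, 0.05]))  # skewed source: ~1.32 bits
```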
(298) Crick's Central Dogma: "Once 'information' has passed into protein it cannot get out again. In more detail, the transfer of information from nucleic acid to nucleic acid, or from nucleic acid to protein may be possible, but transfer from protein to protein, or from protein to nucleic acid is impossible. Information means here the precise determination of sequence."
(337) The Kolmogorov complexity of an object is the size, in bits, of the shortest algorithm needed to generate it. This is also the amount of information. And it is also the degree of randomness - Kolmogorov declared "a new conception of the notion 'random' corresponding to the natural assumption that randomness is the absence of regularity." The three are fundamentally equivalent: information, randomness, and complexity - three powerful abstractions, bound all along like secret lovers.
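Kolmogorov complexity itself is uncomputable, but a compressor gives a crude upper bound. The sketch below (my illustration, using zlib as a stand-in for the shortest algorithm) shows regularity collapsing while randomness resists:

```python
import os
import zlib

regular = b"01" * 5000            # 10,000 bytes of pure pattern
random_ = os.urandom(10_000)      # 10,000 bytes with no regularity to exploit

print(len(zlib.compress(regular, 9)))   # a few dozen bytes
print(len(zlib.compress(random_, 9)))   # ~10,000 bytes: incompressible
```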
(343) As [Gregory] Chaitin put it, "God not only plays dice in quantum mechanics and nonlinear dynamics, but even in elementary number theory."
Among its lessons were these:
Most numbers are random. Yet very few of them can be _proved_ random.
A chaotic stream of information may yet hide a simple algorithm. Working backward from the chaos to the algorithm may be impossible.
Kolmogorov-Chaitin (KC) complexity is to mathematics what entropy is to thermodynamics: the antidote to perfection. Just as we can have no perpetual-motion machines, there can be no complete formal axiomatic systems.
Some mathematical facts are true for no reason. They are accidental, lacking a cause or deeper meaning.
(361) On the contrary, it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he [Rolf Landauer] argued, increases entropy.
(362) In every case, Bennett found, heat dissipation occurs only when information is _erased_. Erasure is the irreversible logical operation. When the head on a Turing machine erases one square of the tape, or when an electronic computer clears a capacitor, a bit is lost, and _then_ heat must be dissipated. In Szilard's thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next.
Forgetting takes work.
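The price of erasure has a definite floor, usually called Landauer's limit; a quick back-of-the-envelope calculation (my arithmetic, not the book's):

```python
import math

k = 1.380649e-23            # Boltzmann constant, joules per kelvin
T = 300.0                   # roughly room temperature, kelvin
print(k * T * math.log(2))  # ~2.9e-21 joules dissipated per erased bit
```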
(365) It [qubit] is not just either-or. Its 0 and 1 values are represented by quantum states that can be reliably distinguished - for example, horizontal and vertical polarizations - but coexisting with these are the whole continuum of intermediate states, such as diagonal polarizations, that lean toward 0 or 1 with different probabilities. So a physicist says that a qubit is a _superposition_ of states; a combination of probability amplitudes. It is a determinate thing with a cloud of indeterminacy living inside. But the qubit is not a muddle; a superposition is not a hodgepodge but a combining of probabilistic elements according to clear and elegant mathematical rules.
NB: Buddhist logic: yes, no, not yes, not no, neither yes nor no, both yes and no. I have discovered that when you use Buddhist logic as the responses in a poll, readers like "don't understand the question?" and "none of the above."
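In standard textbook notation (not the book's), the qubit of the excerpt above is just two complex amplitudes whose squared magnitudes give the measurement probabilities; a minimal sketch:

```python
import math

# An equal ("diagonal") superposition alpha|0> + beta|1>.
alpha = 1 / math.sqrt(2)
beta = 1j / math.sqrt(2)     # a relative phase; still 50/50 on measurement

assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < 1e-12  # amplitudes normalize
print(abs(alpha) ** 2, abs(beta) ** 2)                      # 0.5 0.5
```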
(369) In quantum computing, multiple qubits are entangled. Putting qubits at work together does not merely multiply their power; the power increases exponentially. In classical computing, where a bit is either-or, n bits can encode any one of 2ⁿ values. Qubits can encode these Boolean values along with all their possible superpositions. This gives a quantum computer a potential for parallel processing that has no classical equivalent.
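The exponential growth is easy to see: describing an n-qubit state classically takes 2ⁿ complex amplitudes (a standard result, not a quotation from the book):

```python
for n in (1, 2, 10, 30):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
# 30 qubits already require over a billion amplitudes to write down classically.
```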
(395) A gigabyte also encompasses the entire human genome. A thousand of those would fill a terabyte.
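The arithmetic behind the genome claim, roughly (my figures: about 3.1 billion base pairs at 2 bits each):

```python
base_pairs = 3.1e9                   # approximate length of the human genome
bytes_needed = base_pairs * 2 / 8    # A/C/G/T = 2 bits per base
print(bytes_needed / 1e9, "GB")      # ~0.78 GB, comfortably inside a gigabyte
```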
(399) Elizabeth Eisenstein, The Printing Press as an Agent of Change
(477) Aunger, Robert, ed. Darwinizing Culture: The Status of Memetics as a Science. Oxford: Oxford University Press, 2000.
(479) Bikhchandani, Sushil, David Hirshleifer, and Ivo Welch. "A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades." Journal of Political Economy 100, no. 5 (1992): 992-1026.
(487) Goody, Jack. The Interface Between the Written and the Oral. Cambridge: Cambridge University Press, 1987.
(492) Lynch, Aaron. Thought Contagion: How Belief Spreads Through Society. New York: Basic Books, 1996.
(495) Ong, Walter. Interfaces of the Word. Ithaca, NY: Cornell University Press, 1977.
Ong, Walter. Orality and Literacy: The Technologizing of the Word. London: Methuen, 1982.