
If I read a book, I can see with my inner eye (or ear) the plot developing. With a random string of letters, with random spacings, this is not the case. Why, then, does the former contain more information than the latter?

  • Related: http://physics.stackexchange.com/q/263197/44126 – rob Jan 26 '17 at 08:55
  • Mathematically, you have to look at the distribution of all possible books (entropy is a function defined on statistical distributions, not statistical samples). Physically/practically, try compressing it in a .zip file! A written word text document will be compressed a lot. The random character file will barely be compressed at all. Hence the latter has more information, in the sense that it takes more bits to represent. –  Jan 26 '17 at 08:57 (see the compression sketch after these comments)
  • Google "Kolmogorov complexity" (or "algorithmic complexity", and note that G.Chaitin has written lots about this) for much additional info along the lines described here. And note that @NeuroFuzzy seems to have, very logically, misread your question which explicitly asks why the non-random string contains more info. And actually, you're wrong, the random string contains more, like everybody said. But the combination (maybe tensor product) of the string's info with info already in your mind is what counts here. And the non-random string, acting as "input", generates much more "output". –  Jan 26 '17 at 10:37
  • 1. Your title is the exact opposite of the question in the body: "[...]less information in a written book" vs "[...]the former [book] more information than the latter [random string]". Please fix this. 2. How is this a physics question? It appears purely information-theoretic in nature, with no direct physical connection. 3. You need to carefully distinguish between the colloquial meaning of "information" and the technical meaning of information given by measures like Shannon entropy or Kolmogorov complexity. Please give the definition of "information" you are using here. – ACuriousMind Jan 26 '17 at 14:15
  • @ACuriousMind Hmm ... understanding that entropy is a feature of the (possibly constrained) ensemble of possible microstates representing a macrostate rather than of the specific microstate in front of you at this moment is as important in statistical physics as in information theory. Admittedly the OP hadn't framed it in that fashion, but if I were to answer this question it is the approach I would take. – dmckee --- ex-moderator kitten Jan 26 '17 at 17:50 (a short entropy sketch follows these comments)
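
The compression test suggested in the second comment can be tried directly. Below is a minimal sketch in Python (standard library only); the English-like sample text and the random alphabet are illustrative assumptions, not material from the original question.

```python
import random
import string
import zlib

# English-like text: a short sentence repeated to get a comparable length.
english = ("if i read a book i can see the plot developing with my inner eye " * 40).encode()

# Random string over the same alphabet (lowercase letters plus space), same length.
random.seed(0)
alphabet = string.ascii_lowercase + " "
noise = "".join(random.choice(alphabet) for _ in range(len(english))).encode()

for label, data in [("english-like", english), ("random", noise)]:
    compressed = zlib.compress(data, 9)  # maximum compression level
    print(f"{label:13s} raw={len(data):5d} bytes  compressed={len(compressed):5d} bytes")
```

The typical outcome matches the comment: the repetitive, structured text shrinks to a small fraction of its raw size, while the random string stays close to its original length, i.e. it needs more bits per character to represent.
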
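The point about entropy being a property of the distribution (or ensemble), not of one particular string, can also be illustrated numerically. The sketch below estimates the empirical per-character Shannon entropy $H = -\sum_i p_i \log_2 p_i$ from character frequencies; the sample strings are again assumptions for illustration.

```python
import random
import string
from collections import Counter
from math import log2

def char_entropy(text: str) -> float:
    """Empirical per-character Shannon entropy, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

english = "if i read a book i can see the plot developing with my inner eye " * 40

random.seed(0)
alphabet = string.ascii_lowercase + " "
noise = "".join(random.choice(alphabet) for _ in range(len(english)))

print(f"english-like: {char_entropy(english):.2f} bits/char")
print(f"random      : {char_entropy(noise):.2f} bits/char")  # close to log2(27) ≈ 4.75
```

The uniform random string approaches the maximum of $\log_2 27 \approx 4.75$ bits per character, while English text sits well below it; this single-character estimate also ignores correlations between letters, so the true entropy rate of English is lower still.
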