"Kolmogorov complexity and algorithmic information theory" by Alexander Shen
It is usual to measure the amount of information in a message in bits. However, the raw bit count is not a good measure: a message in an 8-bit-per-character encoding and the same message in a 16-bit-per-character encoding carry essentially the same information, even though the second one has twice as many bits. A more natural approach is to measure the ``compressed size'' of a message, but this depends on the compression technique: there are many compression algorithms, and none of them is ``the right one''. Andrei Kolmogorov and others (R. Solomonoff, and later G. Chaitin) found that one can still give a reasonably invariant definition of information content (= algorithmic complexity), and there is a rich theory around this notion. It also allows us to address a philosophical question: what is randomness? Why do we reject the fair coin hypothesis if we see 0101...0101 (1000 alternating bits), while some other sequence of 1000 bits may look plausible as the outcome of coin tossing? (Note that any two bit strings of length 1000 have the same probability if the coin is symmetric and the tosses are independent.)
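To make the ``compressed size'' intuition concrete, here is a small Python sketch (an illustration, not part of the lecture itself) that compresses the alternating string 0101...0101 and a pseudo-random string of the same length using the general-purpose compressor zlib. Note that any real compressor only gives an upper bound on algorithmic complexity, which is itself uncomputable; still, the contrast is striking: the alternating string shrinks to a handful of bytes, while the random-looking one barely compresses at all.

    import random
    import zlib

    # 1000 alternating bits, written out as the characters '0' and '1'
    alternating = ("01" * 500).encode()

    # 1000 pseudo-random bits of the same length (seeded for reproducibility)
    random.seed(0)
    random_bits = "".join(random.choice("01") for _ in range(1000)).encode()

    # The compressed size upper-bounds the description length of each string.
    print("alternating:", len(zlib.compress(alternating)), "bytes")
    print("random-ish: ", len(zlib.compress(random_bits)), "bytes")

Both inputs are 1000 characters long and, under the fair-coin model, equally probable; the compressor merely detects that one of them has a short description (``repeat 01 five hundred times'') and the other apparently does not.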
We will try to discuss both the mathematical properties of algorithmic complexity and its use in the foundations of probability theory. Some common sense will be a useful prerequisite for the second part.