Given by Professor Yuri Manin, Professor Emeritus, Max Planck Institute for Mathematics, Bonn, Germany; Professor Emeritus, Northwestern University, Evanston, USA; Principal Researcher, Steklov Mathematical Institute, Academy of Sciences, Moscow, Russia.

In the 1930s, George Kingsley Zipf discovered an empirical statistical law that later proved to be remarkably universal. Consider a corpus of texts in a given language, compile the list of all words that occur in them together with their numbers of occurrences, and rank the words in order of decreasing frequency. Define the Zipf rank of a word as its position in this ordering. Then Zipf's Law says: "Frequency is inversely proportional to rank." Zipf himself suggested that this law must follow from a principle of 'minimisation of effort' by the brain; however, the nature of this effort and its measure remained mysterious. In my lecture, I will argue that Zipf's effort needed to produce a word (say, the name of a number) must be measured by the celebrated Kolmogorov complexity: the length of the shortest Turing program (input) needed to produce this word/name/combinatorial object/etc. as its output. I will describe basic properties of this complexity (some of them rather counterintuitive) and one more situation, from the theory of error-correcting codes, where Kolmogorov complexity again plays the role of 'energy in the world of ideas'.
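The ranking procedure behind Zipf's Law can be sketched in a few lines of Python (a toy illustration with a made-up corpus, not part of the lecture; the corpus frequencies are chosen so that rank times frequency is exactly constant):

```python
from collections import Counter

def zipf_table(text):
    """Return (word, rank, frequency) tuples, most frequent first."""
    counts = Counter(text.lower().split())
    ranked = counts.most_common()  # sorted by decreasing frequency
    return [(word, rank, freq) for rank, (word, freq) in enumerate(ranked, start=1)]

# Toy corpus obeying Zipf's law exactly: frequency(rank) = 12 / rank.
corpus = " ".join(["the"] * 12 + ["of"] * 6 + ["and"] * 4 + ["to"] * 3)

for word, rank, freq in zipf_table(corpus):
    # Under Zipf's law the product rank * frequency is roughly constant.
    print(word, rank, freq, rank * freq)
```

On a real corpus the products `rank * freq` are only approximately constant, and the law is usually checked by plotting log-frequency against log-rank and looking for a straight line of slope close to -1.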