Measuring Information

Chapman, David (2016). Measuring Information. In: Images of Europe Past, Present and Future. ISSEI 2014 – Conference Proceedings (Espina, Yolanda ed.), Universidade Católica Editora, Porto, Portugal, pp. 83–93.



It is easy to look at developments arising from digital technology, such as social media, smartphones and tablets, and digital photography, and accept the general perception that there is vastly more information around than ever before. Claiming that there is more information around, however, assumes that it is possible to quantify information, and that in turn seems to assume that we know what information is.
Engineering disciplines have measures for what they call information. Claude Shannon used one in his celebrated work on a mathematical theory of communication which led to the discipline of information theory, and a variation known as algorithmic information theory (AIT) was developed by Andrey Kolmogorov, Ray Solomonoff and Gregory Chaitin.
It has long been debated whether these information theories have relevance to semantic information, with some authors dismissing them on the grounds that they are not about the content of information. Using the example of the information contained in a school report, however, this paper shows how information theory can quantify semantic information. Based on the modelling developed by Shannon, it is shown that school reports generated with the assistance of dedicated report-writing computer programmes contain far less information than might appear at first sight, and may contain less than a hand-written report from the days before digital technologies.
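The intuition behind that claim can be sketched with Shannon's measure: the information conveyed by a message is determined by the number of alternatives it was selected from, not by its length on the page. The numbers below (a menu of 32 stock comments, an 80-character sentence) are illustrative assumptions, not figures taken from the paper.

```python
import math

def selection_information_bits(num_alternatives: int) -> float:
    """Shannon information, in bits, of choosing one message from
    `num_alternatives` equally likely alternatives: log2(N)."""
    return math.log2(num_alternatives)

# Assumed scenario: a report-writing programme offers the teacher a
# menu of 32 stock comments for a subject.
choice_bits = selection_information_bits(32)

# The chosen sentence might be 80 characters long. Naively encoded at
# 8 bits per character, it *appears* to carry 640 bits...
sentence_chars = 80
apparent_bits = sentence_chars * 8

# ...but in Shannon's model the information is only that of the
# selection itself: 5 bits, however long the stock sentence is.
print(f"selection: {choice_bits} bits; apparent: {apparent_bits} bits")
```

A hand-written report, by contrast, is selected from a far larger space of possible sentences, which is why the paper's modelling can rank it above a menu-driven one.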