Information
Adriaans (Pieter)
Source: Stanford Encyclopedia of Philosophy, 2012
Paper - Abstract



Author’s Introduction

  1. Philosophy of Information deals with the philosophical analysis of the notion of information from both a historical and a systematic perspective. With the emergence of the empiricist theory of knowledge in early modern philosophy, the development of various mathematical theories of information in the 20th century and the rise of information technology, the concept of ‘information’ has come to occupy a central place in the sciences and in society. This interest also led to the emergence of a separate branch of philosophy that analyzes information in all its guises. Information has become a central category in both the sciences and the humanities, and reflection on information influences a broad range of philosophical disciplines, from logic and ethics to aesthetics and ontology.
  2. In colloquial speech the term ‘information’ is currently used predominantly as an abstract mass-noun denoting any amount of data, code or text that is stored, sent, received or manipulated in any medium. The detailed history of both the term ‘information’ and the various concepts that come with it is complex and for the most part still has to be written. The exact meaning of the term varies across philosophical traditions, and its colloquial use varies geographically and over different pragmatic contexts. Although analysis of the notion of information has been a theme in Western philosophy from its early inception, the explicit analysis of information as a philosophical concept is recent, dating back to the second half of the 20th century. Historically, the study of the concept of information can be understood as an effort to make the extensive properties of human knowledge measurable.
  3. In the 20th century various proposals for the formalization of concepts of information were made (the four quantitative proposals are sketched formally after this list):
    1. Fisher information (1925): the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends.
    2. Shannon information (1948): the entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X.
    3. Kolmogorov complexity (1965): the information in a binary string x is the length of the shortest program p that produces x on a reference universal Turing machine U.
    4. Quantum Information: The qubit is a generalization of the classical bit and is described by a quantum state in a two-state quantum-mechanical system, which is formally equivalent to a two-dimensional vector space over the complex numbers (Von Neumann 1955, etc.).
    5. Information as a state of an agent: the formal logical treatment of notions like knowledge and belief … questions and answers, or … general messaging. Dunn also seems to have this notion in mind when he defines information as “what is left of knowledge when one takes away belief, justification and truth”.
    6. Semantic Information: defined as well-formed, meaningful and truthful data. Formal entropy-based definitions of information do not imply well-formedness or truthfulness.
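
  As a minimal formal sketch of the four quantitative proposals above, using standard textbook notation (the likelihood f(X; θ), distribution p(x), program length |p| and amplitudes α, β are illustrative symbols supplied here, not taken from the abstract itself):

    I(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right]            % Fisher information about parameter θ
    H(X) = -\sum_{x} p(x)\,\log_{2} p(x)                                                                            % Shannon entropy, in bits
    K_{U}(x) = \min\{\,|p| : U(p) = x\,\}                                                                           % Kolmogorov complexity on reference machine U
    |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \quad \alpha,\beta \in \mathbb{C},\ |\alpha|^{2}+|\beta|^{2}=1 % qubit state
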
  4. The first four concepts are quantitative, the last two qualitative. These proposals can roughly be classified in terms of the nature of the definiens:
    1. Probability in the case of Fisher and Shannon Information,
    2. Computation in the case of Kolmogorov complexity,
    3. Quantum mechanics in the case of quantum information,
    4. True beliefs as the core concept of Semantic Information, whereas
    5. Information states of agents seem to correlate with the formal notion of propositions, which do not necessarily have to be true.
  5. The philosophical interpretation of the definiendum ‘information’ naturally depends on the views one holds about the definiens. Until recently the possibility of a unification of these theories was generally doubted, but in the past decade conversions and reductions between the various formal models have been studied. The situation that seems to emerge is not unlike that of the concept of energy: there are various formal sub-theories about energy (kinetic, potential, electrical, chemical, nuclear) with well-defined transformations between them, and apart from that the term ‘energy’ is used loosely in colloquial speech. There is no consensus about the exact nature of the field of philosophy of information. Some authors present ‘Philosophy of Information’ as a completely new development with the capacity to revolutionize philosophy per se; others see it more as a technical discipline with deep roots in the history of philosophy and consequences for various disciplines such as methodology, epistemology and ethics.

Contents
  1. Information in colloquial speech
  2. History of the term and the concept of information
    → 2.1 Classical philosophy
    → 2.2 Medieval philosophy
    → 2.3 Modern philosophy
    → 2.4 Historical development of the meaning of the term ‘information’
  3. Building blocks of modern theories of information
    → 3.1 Languages
    → 3.2 Optimal codes
    → 3.3 Numbers
    → 3.4 Physics
    → 3.5 Logic
  4. Developments in philosophy of Information
    → 4.1 Popper: Information as degree of falsifiability
    → 4.2 Shannon: Information defined in terms of probability
    → 4.3 Solomonoff, Kolmogorov, Chaitin: Information as the length of a program
    → 4.4 Applications
  5. Conclusion

Comment: First published Fri Oct 26, 2012





