## Defining “Information”

I’m working on the first draft of *Key Terms in Philosophy of Mind*, a book under contract with Continuum Books. From time to time I’ll be posting draft entries on *Brain Hammer*, especially for terms whose definitions are controversial or especially difficult to arrive at. Here’s “information”:

**information**, a property of a state or event, X (a signal), enabling one to infer truths about some state or event Y (where X and Y are usually distinct). Alternatively, “information” may refer to the truths about Y that X enables one to infer. The mathematical theory of information (Shannon and Weaver’s “Mathematical Theory of Communication”) provides means for defining amounts of information (such as “bits”) in terms of the number and probability of possible events. Philosophical theories of information strive to define the semantic CONTENT of information, that is, they strive to define not how much information a signal carries but instead what information a signal carries. Various philosophical conceptions of information define signal content in terms of what events are causally, nomologically, or probabilistically correlated with the occurrence of a signal. The notion of information may be used to characterize various mental states, such as states of PERCEPTION and MEMORY, as information-bearing states: states by which a creature respectively acquires and retains information about its environment. The notion of information has also been used by some philosophers as a basis for understanding INTENTIONALITY and CONTENT (see INFORMATIONAL THEORY OF CONTENT). A further use of information of significance for the philosophy of mind is in characterizations of COMPUTATION as “information processing”.

This entry was posted on Wednesday, May 28th, 2008 at 11:59 am and is filed under Key Terms in Philosophy of Mind.

Hi Pete,

What about the widely cited (minimal) definition from Gregory Bateson:

“A ‘bit’ of information is definable as a difference which makes a difference.”

[Bateson, Gregory (1972): Steps to an Ecology of Mind. Chicago: University of Chicago Press 2000, p. 315.]

Would it be too minimal?

Examples would help. This is a tough set of concepts.

> The mathematical theory of information (Shannon and Weaver’s “Mathematical Theory of Communication”) provides means for defining amounts of information (such as “bits”) in terms of the number and probability of possible events.

This might be touched up a bit. They define the mutual information (a.k.a. transmitted information, a.k.a. transinformation) between X and Y as a measure of statistical dependence between the two signals: a sort of general nonlinear correlation measure that detects any deviation from independence, not just linear correlation.

Surprisingly, transinformation is also equivalent to the reduction in the number of bits required to specify X (channel input) given Y (channel output).

The usual gloss, which you should probably include despite its mushiness, is that Shannon information transmission is a reduction in uncertainty of one signal given another signal. Uncertainty is entropy (H).

For instance, if X=Y, then maximum information is transmitted about X, since Y completely eliminates uncertainty about X (i.e., H(X|Y)=0). If they are only weakly dependent, you’d get something between 0 and H(X) bits transmitted.
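The relations described here, that I(X;Y) = H(X) - H(X|Y), that X=Y transmits the full H(X), and that weak dependence yields something between 0 and H(X), can be checked numerically. A minimal Python sketch, with made-up channel probabilities purely for illustration:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a sequence of probabilities."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Noiseless channel: Y = X over two equiprobable states.
# H(X|Y) = 0, so the full H(X) = 1 bit is transmitted.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(noiseless))  # 1.0

# Noisy channel: Y flips X with probability 0.25.
# Weak dependence: strictly between 0 and H(X) = 1 bit.
noisy = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
print(mutual_information(noisy))  # ~0.19
```

The identity I(X;Y) = H(X) + H(Y) - H(X,Y) used here is equivalent to H(X) - H(X|Y), since H(X|Y) = H(X,Y) - H(Y).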

Anyway, there’s my shotgun at Shannon.

Dretske defines it generally as ‘X carries info about Y if you can learn something about Y from X’. This is close to your ‘infer’ formulation. I’d prefer a more neutral description in terms of ideal observers (e.g., ‘A machine, given X, could predict the value of Y within such-and-such degree of accuracy’), but that is likely way beyond the scope of a dictionary def.

I look forward to your paper from the conference, and really wish I could have gone. It looks like a really interesting paper, and that’s saying a lot in the saturated field of qualia studies with its high shit to goodness ratio.

Incidentally, did you see the recent paper on ego-to-object centered coordinate transforms in monkey parietal? It’s here.

Information, in its technical sense, is not a semantic notion; it has nothing to do with inference or truth or aboutness or reference. It’s a measure of how finely a signal divides the number of alternative states of its source. A signal that carries 5 bits of information divides the source into 32 alternatives. For the receiver, this entails a corresponding reduction of uncertainty about the source. Note carefully that information is a quantity, an amount. It’s like size. And thus it is measured by a number, the number of bits, which are binary alternatives.

To be precise: information is a property of a channel; it’s a channel that carries 5 bits of information; thus to talk about information is to talk about the size of the phase space of a channel - the number of possible alternative states of the channel.
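The bits-to-alternatives relation is easy to verify: n equiprobable alternatives correspond to log2(n) bits, which is also the entropy of the uniform distribution over those n states. A quick Python check:

```python
import math

n = 32  # number of alternative source states
bits = math.log2(n)  # a 5-bit signal divides the source into 32 alternatives

# Equivalently: the entropy of a uniform distribution over the 32 states.
uniform_entropy = -sum((1 / n) * math.log2(1 / n) for _ in range(n))
print(bits, uniform_entropy)  # 5.0 5.0
```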

When used semantically, that is, when information is confused with content, it’s never a matter of truth or inference. It’s always a matter of probability - namely, conditional probability. The probability that someone is at the door given that the doorbell rings may be very high, but is never 1. To set it to 1 is Dretske’s famous error - a channel for which the conditional probability is 1 is a channel with no noise. It is also a channel in which the relations between source and receiver are logical necessities. But to say the relation between the doorbell ringing and a person being at the door is a matter of logic is obviously false. The relation is contingent. The mathematical (i.e., engineering) analysis of signals makes all this plain.
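Bayes’ rule makes the doorbell point concrete. The numbers below are invented purely for illustration; the point is only that any noise at all keeps the conditional probability short of 1:

```python
# Hypothetical numbers: the bell occasionally rings with no one there
# (wind, faulty wiring), so the channel is noisy.
p_person = 0.1               # prior probability someone is at the door
p_ring_given_person = 0.95   # the bell usually works
p_ring_given_no_one = 0.01   # occasional false ring

# Bayes' rule: P(person | ring) = P(ring | person) P(person) / P(ring)
p_ring = (p_ring_given_person * p_person
          + p_ring_given_no_one * (1 - p_person))
p_person_given_ring = p_ring_given_person * p_person / p_ring
print(p_person_given_ring)  # ~0.913: high, but short of 1
```

Setting `p_ring_given_no_one` to exactly 0 is the only way to drive the result to 1, and that is precisely the noiseless-channel assumption at issue.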

But probably what I’m writing is irrelevant - my experience is that philosophers have taken this very clear scientific - engineering concept and made a mess out of it. Oh well. You’ll have to do the best you can to make it presentable.

Pat, Eric & Eric,

thanks for your help!

Eric S: entropy has to do not just with the number of states but with the probability distribution defined over those states (the number of states does put an upper bound on entropy, of course).
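A quick sketch of this point: two distributions over the same four states, one uniform (hitting the upper bound of log2(4) = 2 bits) and one heavily skewed:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Same number of states, different entropies: only the uniform
# distribution reaches the upper bound log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24
```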

We’ve been discussing the point you make about semantics here.

‘Information’ is not univocal. It has a Shannon sense and other senses that Pete is getting at. For instance, in ordinary language, ‘misinformation’ exists.

Eric T makes a very good point - there are many different meanings of “information”.

It might be useful to break out your entries according to some taxonomy.

- Information, in communications theory;
- Information, as signal or indicator content;
- Information, etc.