The Contents of Consciousness

Oxford Companion to Consciousness

Susanna Siegel * June 2006


Of all the mental states that humans have, only some are conscious states. Likewise, of all the information that humans process, only some of it is processed consciously. What are the contents of consciousness? This entry provides an overview of two approaches to this question. The first approach asks what kinds of states can be conscious. The second approach asks what information conscious mental states convey to the subject who has them. Both questions are central to the philosophical and scientific study of consciousness.


Kinds of conscious mental states

When there is something it is like to be in a mental state, that mental state has a phenomenal aspect or phenomenal character to it. Pain, for instance, has a distinctive phenomenal aspect or character: it is painful.  For the purposes of this entry, conscious states will be those states with phenomenal character.

For each kind of conscious state, it is an open question whether it is essentially conscious, or whether it can exist without phenomenal character. In some cases it seems plain that it can. For instance, it is clearly possible to feel frightened, as when one is struck by a palpable bout of fear. But there are also undoubtedly unconscious fears. Other cases are less clear. For instance, one might think that while there could be an unfelt fear, there could be no such thing as an unfelt pain. Yet some people who have taken anesthetics report having pains that do not hurt, which suggests that at least some conscious aspects of pain are not essential to a state’s being a state of pain.

What kinds of mental states can be conscious? Some clear cases of conscious mental states include:


·            perceptual experiences in the modalities of vision, audition, gustation, olfaction and tactile perception

·            bodily sensations, including pain, nausea, and awareness of one’s own body’s position and movement


There are plenty of other examples of conscious states, but the ones listed above share the feature of each having a distinctive phenomenal character - one that is not derivative from any other kind of mental state. Consider seeing. There is something it is like to see things, as opposed to hearing them or feeling them. Visual experience seems to have its own distinctively visual phenomenal character, which does not derive from, say, bodily sensations, or moods, or mental imagery.

            Which other conscious mental states are like visual experience in having distinctive phenomenal character, and which of them have a merely derivative phenomenal character? This is a matter of controversy. Consider, for example:


·            emotions

·            imagery [see Imagery, Philosophical]

·            thoughts: cognitive (as opposed to perceptual) states, such as believing, deciding, or willing


The eighteenth-century empiricist philosopher David Hume held a systematic composition thesis: he thought that all mental states, conscious and otherwise, were ultimately built up out of primitive sensory ingredients, which he called ‘impressions’.  This view denies that the fundamental contents of consciousness include members of the second list above, since fundamentally, on this view, the only contents of consciousness are impressions.

Even if Hume’s systematic view turns out to be false, there might be more local cases of composition. Are any or all of the emotions just bodily sensations? Are any or all emotions partly composed of beliefs?  For example, one might fear going to see the dentist only if one believes that something bad will or might happen. One position says that there is nothing to emotion besides its perceptual or bodily aspects (Prinz 2004). A more liberal position allows non-bodily, non-perceptual emotional phenomenology.

In the case of visual imagery, it is controversial whether imagistic phenomenology is parasitic on the phenomenology of seeing (analogous controversies apply to the other sensory modalities). In the case of conscious thoughts, it is controversial whether their phenomenal character is distinctive, as opposed to parasitic on the phenomenal character of other states. Suppose you suddenly think to yourself, “Oh no! I’ve left my keys in the movie theater! I must go and get them.” According to one position, the phenomenal character of this thought is parasitic on acoustic or visual imagery (for example, you picture your keys on a seat in the theater, or you hear yourself saying the sentence ‘oh no... etc.’), or on emotion (for example, you may feel a pang of panic), or on some combination of these. More generally, this position says that the phenomenal character of conscious thought is always exhausted by other kinds of phenomenal character (see Robinson 2006). The opposed position says that conscious thoughts have their own distinctive kind of phenomenal character. Analogous positions could be formulated for conscious desires.

A related controversy about conscious thought concerns the relation between a thought’s phenomenal character (or equivalently, how it feels from the inside to have that conscious thought) and the conscious thought itself. Return to the case where you suddenly think to yourself, “Oh no! I’ve left my keys in the movie theater! I must go and get them.” Would anyone who felt from the inside exactly like you do when you think this thought likewise be thinking that very thought – as opposed to thinking some other thought, or thinking nothing at all? Conversely, if someone were thinking that exact thought, would there have to be anything in common between the way that they feel and the way that you feel? One position on these questions is that conscious thoughts co-vary with their phenomenal character, so that someone is consciously thinking that p just in case they have that phenomenal character. An opposed position allows that when you consciously think a thought, your thought has a phenomenal character that could in principle be had by a mental state that wasn’t a thought. This latter position goes naturally with the view that the phenomenal character of this conscious thought is parasitic on panic, since panic is a feeling you can have even without thinking that specific thought. (For further discussion, see Horgan and Tienson 2002, Robinson 2006, Siewert 1998, Strawson 1996.)

So far, we’ve been focusing on mental states that clearly can be conscious, and asking which of those have a distinctive phenomenal character. We can also step back and ask which mental states can be conscious in the first place. Consider background states, such as:


·            moods, such as depression

·            alertness

·            intoxication

·            hypnotic state


Moods, such as depression, seem to remain in the background of attention, unless you attend to them. It is not obvious whether background states are distinctive sorts of conscious states. We associate alertness, for example, with a disposition to acquire information from the environment. Depression - a mood - is associated with a disposition to expect the worst, and to feel sad. But should alertness or depression be identified with their associated dispositions? If so, then perhaps they are not themselves conscious states after all: perhaps the phenomenal character in the vicinity just attaches to specific instances of fulfilling the disposition, e.g., to specific occurrences of pessimism or sadness in the case of depression, or to specific acquisitions of information in the case of alertness. Or perhaps the dispositions are implemented by imagery or other phenomenal mechanisms, without being identical with them. One could ask the same questions about intoxication and the disposition to act more boldly than usual. 

The state of being hypnotized raises a slightly different controversy. Does hypnosis consist in the operation of ordinary psychological processes? If so, then it can be wholly accounted for by a theory of how those processes interact with the hypnotized subject’s expectations. Alternatively, hypnosis may involve the operation of non-standard psychological processes, leading to a truly altered state of consciousness [see Hypnosis]. If hypnosis involves an altered state of consciousness, then it would be something over and above the mental states that are uncontroversially conscious.

So far, we’ve discussed which kinds of mental states can be conscious. We can also ask how simultaneous contents of consciousness are related to one another. For instance, you might hear the sounds of a truck groaning by, at the same time as you feel the rain coming down on your umbrella. These auditory and tactile sensations can both be within the same focus of attention. But are all conscious states that a subject has at a time necessarily unified? It might seem impossible to enjoy two conscious states at the same time, without experiencing them together, but people with ‘split brains’ challenge this assumption. [See Split brains].

A potentially different sort of unity is temporal unity among experiences. The early-twentieth-century philosopher Edmund Husserl discussed the way that the sounds of a series of musical notes can be heard as unified into a single melody, as opposed to being heard simply as a motley succession of sounds. His notion of retention was meant to capture a special way in which a succession of inputs may nonetheless seem to be present all at once, as a whole. (See Temporality, philosophical). The same phenomenon might apply to a succession of experiences, such as the different parts of a conversation, which may feel as if it were a single experience, present all at once.


Kinds of information conveyed by conscious states

             One idea guiding investigations into the nature of consciousness is that the contents of consciousness are analogous to the contents of a newspaper story. Perhaps the most influential version of this idea is that the contents of an experience are given by the conditions under which it is accurate. What an experience conveys to the subject, according to this conception, is that those conditions are satisfied. (See Peacocke 1992, Searle 1983, Chalmers 2004, Siegel 2005.)

            We can say that the accuracy conditions of experience are its representational contents. On this conception, there is a broad analogy between the contents of experience and the representational contents of thoughts and utterances: both contents are assessable for accuracy. Suppose I utter the sentence “Fish can swim” and thereby express my belief that fish can swim. The representational content of my utterance is what I assert, and the representational content of my belief is what I believe – in both cases, that fish can swim. 
            The conception of the contents of experience as given by their accuracy conditions is motivated by the idea that it often seems to make sense to ask, ‘How would things have to be, in order for what my experience is conveying to me to be accurate?’ The conditions under which the experience would be accurate determine the contents of the experience. This conception can also easily account for how experiences may mislead. Suppose you see a fish while unwittingly looking in a mirror. It may look as if there is a red fish in front of you, when in fact the red fish you see is behind you. Similarly, in auditory or olfactory hallucinations, one may seem to hear voices when in fact no one is speaking, or to smell an odor when in fact nothing is emitting that smell. In phantom limb pain, one feels pain as located where one’s limb used to be but is no longer. These are cases of being misled by one’s senses, and it is natural to say that in these cases things are not as they appear to be. Given an experience – either one we actually have, or a hypothetical one – we sometimes have intuitions about whether the experience is accurate (“veridical”) or inaccurate (“falsidical”). To this extent, we seem to be able to assess experiences for accuracy. Experiences that have contents represent the world as being a certain way, so they are said to have representational contents.
            If experiences have representational contents, which properties do experiences portray things as having? Suppose you see a bowl of wax peaches. Does your visual experience represent that there is fruit in the bowl, or, more minimally, that there are variously colored roundish volumes in the bowl (or perhaps, yet more minimally, that there are items with peach-colored convex surfaces in the bowl)? On the first option, the experience will be inaccurate, since the bowl contains wax fruit, not peaches. On the other options, which convey less committal information, the experience will be accurate, since there really are roundish volumes with peach-colored convex surfaces in the bowl. This dispute about visual experiences concerns whether they represent only colors, shapes, and illumination properties, or also higher-level properties such as being a piece of fruit (Siegel 2006).
            Which experiences, if any, have representational contents? It is natural to think of standard cases of visual consciousness, such as seeing a fish tank, as presenting the environment as being a certain way: e.g., as containing a fish tank full of fish. A fish in the tank might appear to be in a location slightly different from the one it actually occupies, due to the distortion of seeing things in water. For the experience to be accurate, the fish would have to be floating exactly where it appears to be, which is different from the place where it is. It is also possible, however, for experiences to represent occurrences inside the body. For instance, one proposal is that pains represent that there is damage in the part of the body that hurts, so that if the painful experience is accurate then there really is damage in that part of the body.
            While it may seem natural to hold that standard visual experiences of seeing have representational contents, for other kinds of conscious experience it is harder to say what the contents would be. If you gently press your eyeball with your finger, you will seem to see a ‘phosphene’, which is a colored patch of light. Perhaps rather than conveying that there is a luminous occurrence behind the eyelid, the phosphene experience is not conveying anything at all, either about the space outside the body or the space inside the body. To take another example, when you hear the notes of a melody, what would have to happen in the world for the experience to be accurate, besides the sounds succeeding one another? It is difficult to say. Switching modalities, does the experience of smelling odors convey to you any way the world has to be in order to be accurate, or is olfactory experience nothing more than a sensory affliction? Some philosophers have argued that conscious experiences in the modalities of taste and smell do not convey anything to the subject (Smith 2002, Lycan?). Other philosophers deny that even standard visual experiences are assessable for accuracy (Travis 2004, Brewer forthcoming).

            We’ve been discussing conscious experiences in the sensory modalities. When we consider such experiences to be accurate or inaccurate, we are taking them to have something in common with belief: just as the belief that fish swim is true in virtue of the truth of its content fish swim, so too do sensory experiences inherit their status as accurate or inaccurate from the status of their contents as true or false. Other kinds of conscious states, such as some emotions, might not themselves be assessable for accuracy. For example, fear is not by itself correct or incorrect, but it might nonetheless have parts that are so assessable. A fear that your dentist will hurt you could have as a component the belief that dentists are apt to cause harm. Relatedly, the relational component of your fear might have the structure of a belief: when you fear that your dentist will hurt you, this might (unfortunately) be true, or it might be false. Either way, your fear will have a content that is assessable for truth, even though the fear itself is not assessable for truth. Alternatively, fear might be a mere sensation that is not in any way directed at the environment. Finally, there might be two kinds of fear, or more generally two kinds of emotion: the kind that is directed at the world, and the kind that is a bodily sensation. [See Emotion, philosophical issues].

One of the central debates concerning the representational contents of experience is whether they determine its phenomenal character. Representationalists say that the phenomenal character of experiences derives wholly from their representational content. The most far-reaching version of representationalism applies across the board to all conscious states, whereas more modest versions are directed only at specific kinds of conscious states, such as visual experiences or pains.

Whereas representationalists say that representational content determines phenomenal character, others say that phenomenal features of conscious states determine representational features. If two experiences are phenomenally the same, then will there necessarily be some contents that they share? Or, more strongly, will the experiences share all their contents? Some philosophers hold both representationalism and the latter (and stronger) claim, on the grounds that phenomenal properties are identical to representational properties (see Dretske 1995, Tye 1995, Carruthers 2000). Other philosophers embrace the former (and weaker) claim by holding that there is a kind of content that experiences have in virtue of their phenomenology, so that their phenomenology has explanatory priority over the content itself (Horgan and Tienson 2002, Siewert 1998, Kriegel 2002).

The contents of experience raise methodological questions for the scientific study of consciousness, particularly in the search for neural correlates of conscious experiences. According to a standard definition (Chalmers 1997), a neural representational system counts as a correlate of an experience just in case the neural system’s having content C suffices for the experience to have content C. If there are neural representational systems whose contents match the contents of experience in this way, then knowing what the contents of an experience are helps us identify its neural correlate. Conversely, if one already knew what the neural correlate of an experience was, its role in information-processing could tell us what contents the experience has. It is controversial, however, whether there are any such neural representational systems. Some philosophers think that neural systems cannot by themselves encode the kind of information that is present in the content of conscious experience: rather, aspects of the body and the environment are also required (Noë and Thompson 2004). If so, then there will be no purely neural correlates of experiences of the sort defined above: either experience has no neural correlates at all, or else neural correlates will not be defined in terms of matching contents. [See Correlates of consciousness].

            There are many other debates in philosophy about the representational contents of consciousness: Do they always or ever involve concepts? What sorts of differences in contents do we find between humans, infants, and animals? Do different sensory modalities, such as vision and touch, share the same sorts of contents? Is the content of experiences determined wholly by what is in the subject’s head, or is it also determined by the subject’s environment? These issues are discussed in other entries (see Non-conceptual content; Animal consciousness, parrot; and Metacognition).