Consciousness as internal monitoring


The theory

  • X is conscious when X uses attention to monitor lower-level mental states. Lycan:

@Consciousness is the functioning of internal attention mechanisms directed upon lower-order psychological states and events. ... Attention mechanisms are devices that have the job of relaying and/or coordinating information about ongoing psychological events and processes.@

  • Attention is necessary for consciousness. So organisms that lack attentional mechanisms are unconscious.
  • Consciousness is to be explained in computational / representational terms.

Clarification: creature-consciousness vs. state-consciousness

  • The first theory (the c-theory, about creature consciousness) does not entail the second (the s-theory, about state consciousness).
    • c-theory: A creature X is conscious when X uses attention to monitor lower-level mental states.
    • s-theory: A mental state m is conscious when m is a lower-level state being monitored by an attentional mechanism.

Some arguments for the s-theory

  • Pain is not necessarily a conscious state. It becomes conscious only when one attends to it.
  • Wine-tasting: attending carefully to one's taste experiences seems to make one conscious of aspects of the flavour that would otherwise go unnoticed.
  • Many psychologists argue that visual attention is necessary for conscious visual experience. See Consciousness and attention

Objections to the c-theory

Self-monitoring is not sufficient for consciousness.

  • Example #1: Webcam + motion detector
  • Example #2: Sewage system - Such a system would have many monitoring processes, perhaps even an indicator light that shows whether the whole system is functioning normally. Similarly, all modern computers have processes that monitor the programs and threads of computation going on. Would all such systems be conscious?
  • What would be the ethical consequences?
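The sewage-system objection trades on how easy it is to build a system with the structure the theory describes. A minimal, hypothetical sketch (the class names and threshold are illustrative assumptions, not from Lycan) of a second-order monitor relaying information about a lower-order state:

```python
# Illustrative sketch: a "first-order" process holds a state, and a
# second-order "monitor" relays information about it - structurally
# analogous to the sewage system's indicator light.

class FirstOrderProcess:
    """Holds a lower-order state (e.g. a tank's fill level)."""
    def __init__(self):
        self.level = 0

    def update(self, reading):
        self.level = reading

class Monitor:
    """Second-order device: reports on the first-order state."""
    def __init__(self, target):
        self.target = target

    def report(self):
        # 90 is an arbitrary threshold chosen for the example.
        status = "normal" if self.target.level < 90 else "overflow risk"
        return f"level={self.target.level}, status={status}"

tank = FirstOrderProcess()
watchdog = Monitor(tank)
tank.update(95)
print(watchdog.report())  # level=95, status=overflow risk
```

If such a trivial program realises internal monitoring of lower-order states, the objection asks why it should not thereby count as (somewhat) conscious on the theory.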


Possible replies:

  1. Consciousness requires attention to real mental states. The sewage system does not have mental states. But what is the difference between genuine mental states and mere internal states, and is it really important?
  2. Stephen White - Having a concept of the self is not a simple matter.
  3. Lycan - Consciousness comes in degrees. The sewage system is barely conscious.
  • It is not clear what the variation in the degree of consciousness is supposed to correlate with. Mere complexity in the connections among the processes, or the sheer number of processes being monitored, does not seem to matter that much.

Is self-monitoring necessary for consciousness?

  • What exactly is required? Monitoring, attention, or a concept of the self?
  • The theory runs the risk of being too strong or too weak.
    • Too strong, if a concept of the self is required. See self-recognition.
    • Too weak, if mere monitoring is sufficient.

Questions to think about

  • Consider this passage from Lycan. How would you explain what "monitoring for me" means?

@To count in the analysis of my consciousness, the monitor must do its monitoring for me. A monitor might have been implanted in me somewhere that sends its outputs straight to Reuters and to CNN, so that the whole world may learn of my first-order psychological states as soon as humanly possible. Such a device would be teleologically a monitor, but the wire services' monitor rather than mine.@

  • What might Lycan say about the objection that the theory is either too strong or too weak?