!Consciousness as internal monitoring

media:control-station.jpg

!!Readings

* [Required] William G. Lycan (1997). [[http://www.nd.edu/~lstubenb/Lycan.html|Consciousness as Internal Monitoring]]. In Ned Block, Owen Flanagan, and Güven Güzeldere (Eds.). The Nature of Consciousness: Philosophical Debates. MIT Press, 755-771.

!!The theory

* ''X'' is conscious when ''X'' uses attention to monitor lower-level mental states. Lycan:

@@@Consciousness is the functioning of internal attention mechanisms directed upon lower-order psychological states and events. ... Attention mechanisms are devices that have the job of relaying and/or coordinating information about ongoing psychological events and processes.@@@

* Attention is necessary for consciousness. So organisms that lack attentional mechanisms are not conscious.
* Consciousness is to be explained in computational / representational terms (a toy sketch of one such reading appears just below).
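
One way to make the computational reading concrete, without any pretense of capturing Lycan's actual proposal: the toy Python sketch below invents a FirstOrderState class, an AttentionMonitor class, and a crude "attend to the most intense state" rule (all of these are illustrative assumptions, not Lycan's), and it counts a state as conscious just when the monitor is currently attending to it.

[@
from dataclasses import dataclass, field

@dataclass
class FirstOrderState:
    """A lower-order psychological state, e.g. a sensory registration."""
    kind: str          # e.g. "pain" or "red patch in the left visual field"
    intensity: float   # some magnitude the monitor can report on

@dataclass
class AttentionMonitor:
    """Relays and coordinates information about ongoing first-order states."""
    attended: list = field(default_factory=list)

    def scan(self, states):
        # Crude selection rule: attend only to the single most intense state.
        self.attended = sorted(states, key=lambda s: s.intensity, reverse=True)[:1]
        return [f"monitoring {s.kind} (intensity {s.intensity})" for s in self.attended]

    def is_conscious_state(self, state):
        # The s-theory reading below: a state is conscious iff it is being monitored.
        return state in self.attended

states = [FirstOrderState("pain", 0.9), FirstOrderState("itch", 0.2)]
monitor = AttentionMonitor()
print(monitor.scan(states))                   # ['monitoring pain (intensity 0.9)']
print(monitor.is_conscious_state(states[1]))  # False: the unattended itch is not "conscious"
@]

Nothing in the sketch distinguishes it, in kind, from the self-monitoring gadgets discussed under the objections below.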

!!!Clarification: creature-consciousness vs. state-consciousness

* The first theory below (the c-theory) does not entail the second (the s-theory); a rough schematic rendering follows this list.
** A ''creature'' ''X'' is conscious when ''X'' uses attention to monitor lower-level mental states.
** A ''mental state'' ''m'' is conscious when ''m'' is a lower-level state being monitored by an attentional mechanism.
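
One rough way to display the non-entailment, with predicate letters invented purely for exposition (this is not Lycan's own formulation):

[@
% c-theory: a creature x is conscious iff some mental state of x
% is being monitored by x's attention mechanisms.
\mathrm{ConsCreature}(x) \;\leftrightarrow\; \exists m\,[\mathrm{MentalState}(m,x) \wedge \mathrm{Monitors}(x,m)]

% s-theory: a particular state m is conscious iff some creature's
% attention mechanisms are monitoring it.
\mathrm{ConsState}(m) \;\leftrightarrow\; \exists x\,[\mathrm{MentalState}(m,x) \wedge \mathrm{Monitors}(x,m)]
@]

The first schema settles which creatures are conscious while staying silent about which of their particular states are, so accepting it does not by itself commit one to the second.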

!!!Some arguments for the s-theory

* Pain is not necessarily a conscious state. It becomes conscious only when one attends to it.
* Wine-tasting: attending to the taste seems to make one conscious of features of the experience that would otherwise go unnoticed.
* Many psychologists argue that visual attention is necessary for conscious visual experience. See [[Consciousness and attention]].

!!Objections to the c-theory

!!!Self-monitoring is not sufficient for consciousness.

* Example #1: [[http://www.cityguide.gov.mo/webcam2/webcam1_p.htm|Webcam + motion detector]]
* Example #2: Sewage system - Such a system would have lots of monitoring processes, perhaps even an indicator light that shows whether the whole system is functioning normally. Similarly, all modern computers have processes that monitor the programs and threads of computation going on. Would all such systems be conscious? (A toy version is sketched below.)
* What would be the ethical consequences?

media:sewer-system.jpg
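
To see the force of Example #2, here is a deliberately trivial Python sketch of a self-monitoring plant; the class names, threshold, and flow numbers are all invented for illustration. By the letter of the c-theory it monitors its own lower-level processes and even drives an indicator light, yet it is exactly the kind of system few would call conscious.

[@
class Subsystem:
    def __init__(self, name, flow_rate):
        self.name = name
        self.flow_rate = flow_rate   # an internal quantity for the monitor to track

class Supervisor:
    """Plays the role of the 'attention mechanism': it relays and coordinates
    information about the ongoing states of the lower-level subsystems."""
    def __init__(self, subsystems, threshold=10.0):
        self.subsystems = subsystems
        self.threshold = threshold
        self.indicator_light = "green"

    def monitor(self):
        faults = [s.name for s in self.subsystems if s.flow_rate > self.threshold]
        self.indicator_light = "red" if faults else "green"
        return faults

plant = Supervisor([Subsystem("pump-1", 4.2), Subsystem("valve-7", 13.5)])
print(plant.monitor())        # ['valve-7'] -- the system detects a fault in itself
print(plant.indicator_light)  # 'red'       -- and reports on its own overall state
@]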

Replies

# Consciousness requires attention to real ''mental states''. The sewage system does not have mental states. But what exactly makes a state mental rather than not, and is the difference really important?
# Stephen White - Having a concept of the self is not a simple matter.
# Lycan - Consciousness comes in degrees. The sewage system is barely conscious.
* It is not clear what the variation in the degree of consciousness is supposed to correlate with. Mere complexity in the connections of the processes and the number of processes being monitored do not seem to matter that much.

!!!Is self-monitoring necessary for consciousness?

* What exactly is required? Monitoring, attention, or a concept of the self?
* The theory runs the risk of being either too strong or too weak.
** Too strong, if a concept of the self is required. See [[self recognition]].
** Too weak, if mere monitoring is sufficient.

!!Questions to think about

* Consider this passage from Lycan. How would you explain what "monitoring for me" means?

@@@To count in the analysis of my consciousness, the monitor must do its monitoring for me. A monitor might have been implanted in me somewhere that sends its outputs straight to Reuters and to CNN, so that the whole world may learn of my first-order psychological states as soon as humanly possible. Such a device would be teleologically a monitor, but the wire services' monitor rather than mine.@@@

* What might Lycan say about the objection that the theory is either too strong or too weak?

[[Category.Mind]]