
Taming the Mongrel

Ned Block [Block, 1995] has recently pointed out that the concept of consciousness is a ``mongrel" one: the term `consciousness' connotes different things to different people -- sometimes radically different things. Accordingly, Block distinguishes between four notions: phenomenal consciousness (P-consciousness), access consciousness (A-consciousness), self-consciousness (S-consciousness), and monitoring consciousness (M-consciousness).

This isn't the place to carefully disentangle these four breeds. It will suffice for our purposes if we manage to get a rough-and-ready characterization of Block's quartet on the table, with help from Block himself, and some others.

Block describes the first of these phenomena in Nagelian fashion as follows:

So how should we point to P-consciousness? Well, one way is via rough synonyms. As I said, P-consciousness is experience. P-conscious properties are experiential properties. P-conscious states are experiential states, that is, a state is P-conscious if it has experiential properties. The totality of the experiential properties of a state are ``what it is like" to have it. Moving from synonyms to examples, we have P-conscious states when we see, hear, smell, taste and have pains. P-conscious properties include the experiential properties of sensations, feelings and perceptions, but I would also include thoughts, wants and emotions. ([Block, 1995], p. 230)

According to this explanation, the list with which we began the paper corresponds to a list of P-conscious states.

A-consciousness admits of more precise treatment; Block writes:

A state is access-conscious (A-conscious) if, in virtue of one's having the state, a representation of its content is (1) inferentially promiscuous, i.e., poised to be used as a premise in reasoning, and (2) poised for [rational] control of action and (3) poised for rational control of speech. ([Block, 1995], p. 231)

A-consciousness seems to be a property bound up with information-processing. Indeed, as one of us has explained elsewhere [Bringsjord, minga], it's plausible to regard certain extant, mundane computational artifacts as bearers of A-consciousness. For example, theorem provers with natural language generation capability would seem to qualify with flying colors. In recent conversation, Block has gladly confessed that computational systems, by his lights, are A-conscious.5
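Since A-consciousness is a matter of information-processing, Block's three-clause definition can be rendered as a toy program. The sketch below is purely illustrative -- it is not Block's or Bringsjord's formalism, and the class and predicate names are our own inventions; it simply makes explicit that, on the definition quoted above, A-consciousness is the conjunction of three functional conditions on a state's content-representation.

```python
# Toy illustration of Block's definition of A-consciousness (names are ours):
# a state counts as A-conscious iff a representation of its content is
# (1) poised for use as a premise in reasoning,
# (2) poised for rational control of action, and
# (3) poised for rational control of speech.
from dataclasses import dataclass


@dataclass
class State:
    content: str
    poised_for_reasoning: bool = False
    poised_for_action: bool = False
    poised_for_speech: bool = False


def a_conscious(state: State) -> bool:
    """All three of Block's clauses must hold jointly."""
    return (state.poised_for_reasoning
            and state.poised_for_action
            and state.poised_for_speech)


# A theorem prover with natural language generation plausibly satisfies all
# three clauses for a proved lemma; a raw, unprocessed input buffer does not.
proved_lemma = State("lemma_1 holds", True, True, True)
raw_buffer = State("unparsed input")
print(a_conscious(proved_lemma))  # True
print(a_conscious(raw_buffer))    # False
```

The point of the sketch is merely that nothing in Block's definition requires phenomenal experience: a purely computational system can satisfy all three clauses, which is why Block grants that such systems are A-conscious.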

S-consciousness is said by Block to mean ``the possession of the concept of the self and the ability to use this concept in thinking about oneself" ([Block, 1995], p. 235). There is a famous family of cases (cf. [Perry, 1979]) which seems to capture S-consciousness in gem-like fashion: Suppose that you are sitting in the Collar City Diner looking out the window at the passing traffic, when you notice the reflection of a man sitting alone in the diner -- a man in a rumpled tweed jacket who is looking out the window with a blank, doleful expression. On the basis of what you see, you affirm, to put it a bit stiffly, this proposition: ``The man with the tweed blazer is looking blankly out a window of the Collar City Diner." But suppose you then suddenly realize that the man in question is you. At this point you affirm a different proposition, viz., ``I am looking blankly out a window of the Collar City Diner." In this case we say that the indexical is essential; and, following Pollock, we say that beliefs in propositions of the second sort are de se beliefs. We can then say that an agent having de se beliefs, as well as the capacity to reason over them (after your epiphany in the diner you may conclude that you need to stop philosophizing and go home and sleep), enjoys S-consciousness.6

Block tells us that M-consciousness corresponds to at least three notions in the literature: inner perception, internal scanning, and so-called ``higher order" thought. The third of these has been explicated and defended through the years by David Rosenthal ([Rosenthal, ming], [Rosenthal, 1986], [Rosenthal, 1989], [Rosenthal, 1990b], [Rosenthal, 1990a]). According to Rosenthal, a state is conscious (in some for-now generic sense of `conscious') just in case it is the target of a higher-order thought. Courtesy of Rosenthal's [Rosenthal, ming], the view can be put in declarative form:

Def 1
s is a conscious mental state at time t for agent a =df s is accompanied at t by a higher-order, noninferential, occurrent, assertoric thought s' for a that a is in s, where s' is conscious or unconscious.4

Def 1, as the higher-order theory of consciousness, is often abbreviated as simply `HOT.' What sorts of examples conform to HOT? Consider the state wanting to be fed. On Rosenthal's view, this state is a conscious state -- and the reason is that it is the target of a higher-order thought, viz., the thought that I want to be fed. Rosenthal's Def 1, of course, leaves open the possibility that the higher-order thought can itself be unconscious.
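The quantificational structure of Def 1 can be made explicit in a first-order sketch. The predicate names below are our glosses, not Rosenthal's own notation, and the corner quotes merely mark the propositional content of the higher-order thought:

```latex
\mathrm{Conscious}(s,t,a) \;\leftrightarrow\; \exists s'\,
  [\, \mathrm{Thought}(a,s') \,\land\, \mathrm{At}(s',t)
  \,\land\, \mathrm{Occurrent}(s') \,\land\, \mathrm{Assertoric}(s')
  \,\land\, \lnot\mathrm{Inferential}(s')
  \,\land\, \mathrm{Content}(s') = \ulcorner a \text{ is in } s \urcorner \,]
```

Note that no conjunct requires that s' itself be conscious, which is exactly the possibility Def 1 leaves open.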

With Block's quartet characterized, it's time to return to Q1, the question with which we began.

Selmer Bringsjord