On the (too) many faces of consciousness

Harnad, S. (2021). On the (Too) Many Faces of Consciousness. Journal of Consciousness Studies, 28(7-8), 61-66.

Abstract:  Pereira, like some other authors, both likens and links consciousness to homeostasis. He accordingly recommends that some measures of homeostasis be taken into account as biomarkers for sentience in patients who are in a chronic vegetative state. He rightly defines “sentience” as the capacity to feel (anything). But the message is scrambled in an incoherent welter of weasel-words for consciousness and the notion (in which he is also not alone) that there are two consciousnesses (sentience and “cognitive consciousness”). I suggest that one “hard problem” of consciousness is already more than enough.

Homeostasis. A thermostat is a homeostat. It measures temperature and controls a furnace. It “defends” a set temperature by turning on the furnace when the temperature falls below the set point and turning the furnace off when it goes above the set point. This process of keeping variables within set ranges is called homeostasis. A higher-order form of homeostasis (“allostasis”) would be an integrative control system that received the temperatures from a fleet of different thermostats, furnaces and climates, doing computations on them all based on the available fuel for the furnaces and the pattern of changes in the temperatures, dynamically modifying their set-points so as to defend an overall optimum for the whole fleet.
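The thermostat described above can be sketched as a minimal control loop. This is only an illustration of the on/off set-point logic in the text; the names (`thermostat_step`, `band`) and the crude room dynamics are my own assumptions, not anything from the article.

```python
# Minimal sketch of a thermostat-as-homeostat: defend a set point
# by switching a furnace on below it and off above it.
# (Illustrative only; names and dynamics are assumptions.)

def thermostat_step(temperature, set_point, furnace_on, band=0.5):
    """Return the furnace's next state, defending set_point within +/- band."""
    if temperature < set_point - band:
        return True    # too cold: turn the furnace on
    if temperature > set_point + band:
        return False   # too warm: turn the furnace off
    return furnace_on  # within the band: leave the furnace as it is

# Simulate a few steps: the switching keeps temperature near the set point.
temp, on = 18.0, False
for _ in range(20):
    on = thermostat_step(temp, set_point=20.0, furnace_on=on)
    temp += 0.4 if on else -0.3  # crude stand-in for room dynamics
```

An allostatic controller, in the article's sense, would sit above many such loops and adjust their `set_point` values jointly rather than leaving each fixed.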

Biological organisms’ bodies have homeostatic and allostatic mechanisms of this kind, ranging over functions like temperature, heart-rate, blood-sugar, immune responses, breathing and balance – functions we would call “vegetative” – as well as functions we consider “cognitive,” such as attention and memory.

Sentience. Pereira (2021) rightly distinguishes between sentience itself – any state that it feels like something to be in – and cognition, which is also sentient, but involves more complicated thought processes, especially verbal ones. “Cognitive,” however, is often a weasel-word – one of many weasel-words spawned by our unsuccessful efforts to get a handle on “consciousness,” which is itself a weasel-word for sentience, which simply means feeling (feeling anything at all, from warm or tired or hungry, to angry or joyful or jealous, including what it feels like to see, hear, touch, taste or smell something, and what it feels like to understand (or think you understand) the meaning of a sentence or the proof of a theorem).

When Pereira speaks of sentience, however, he usually means it literally: a state is sentient if it is felt (e.g., pain); and an organism is sentient if it is able to feel. The main point of Pereira’s paper is that the tests for “consciousness” in human patients who are in a chronic vegetative state are insufficient. Such patients cannot make voluntary movements, nor can they understand or respond to language, but they still have sleeping and waking states, as well as reflexes, including eye-opening, chewing and some vocalizations; and their homeostatic vegetative functions persist. 

Vegetative states. Pereira insists, rightly, that if patients in a chronic vegetative state can still feel (e.g., pain) then they are still sentient. With Laureys (2019) and others, he holds that there are two networks for “awareness” (another weasel-word), one related to wakefulness and the other to “cognitive representations of the environment” (more weasel-words). Pereira accordingly recommends homeostasis-related measures such as lactate concentrations in the cerebrospinal fluid and astrocyte transport “waves” as biomarkers for sentience where behavioral tests and cerebral imagery draw a blank.

This seems reasonable enough. The “precautionary principle” (Birch 2017) dictates that the patients should be given every benefit of the doubt about whether they can feel. But what about these two networks of “awareness/consciousness/subjectivity” and their many other variants (“qualia” – “noetic” and “anoetic,” “internal” and “external”) and the very notion of two kinds of “consciousnesses”: “cognitive” and “noncognitive”?

Weasel-Words. Weasel-words are words used (deliberately or inadvertently) to mask redundancy or incoherence. They often take the form of partial synonyms that give the impression that there are more distinct entities or variables at issue than there really are. Such is the case with the Protean notion of “consciousness,” for which there are countless Mustelidian memes besides the ones I’ve already mentioned, including: subjective states, conscious states, mental states, phenomenal states, qualitative states, intentional states, intentionality, subjectivity, mentality, private states, 1st-person view, 3rd-person view, contentful states, reflexive states, representational states, sentient states, experiential states, reflexivity, self-awareness, self-consciousness, sentience, raw feels, feeling, experience, soul, spirit, mind.

I think I know where the confusion resides, and also, if not when the confusion started, at least when it was compounded and widely propagated by Block’s (1995) target article in Behavioral and Brain Sciences, “On a confusion about a function of consciousness.” It was there that Block baptized an explicit distinction between (at least) two “consciousnesses”: “phenomenal consciousness” and “access consciousness”:

“Consciousness is a mongrel concept: there are a number of very different ‘consciousnesses.’ Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action.”

Feeling. What Block meant by “phenomenal consciousness” is obviously sentience; and what he meant to say (in a needlessly Nagelian way) is that there is something it feels like (not the needlessly vague and equivocal “is like” of Nagel 1974) to be in a sentient state. In a word, a sentient state is a felt state. (There is something it “is like” for water to be in a state of boiling: that something is what happens to the state of water above the temperature of 212 degrees Fahrenheit; but it does not feel like anything to the water to be in that state – only to a sentient organism that makes the mistake of reaching into the water.)

Block’s “access consciousness,” in contrast, is no kind of “consciousness” at all, although it is indeed also a sentient state — unless there are not only “a number of very different ‘consciousnesses’,” but an infinite number, one for every possible feeling that can be felt, from what it feels like for a human to hear an oboe play an A at 440 Hz, to what it feels like to hear an oboe play an A at 444 Hz, to what it feels like to know that Trump is finally out of the White House.

Information. No, those are not different consciousnesses; they are differences in the content of “consciousness,” which, once de-weaseled, just means differences in what different feelings feel like. As to the “access” in the notion of an “access consciousness,” it just pertains to access to that felt content, along with the information (data) that the feelings accompany. Information, in turn, is anything that resolves uncertainty about what to do, among a finite number of options (Cole 1993).
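The Shannon-style definition cited above (Cole 1993), that information resolves uncertainty among a finite number of options, can be made concrete: selecting one of n equally likely options resolves log2(n) bits. A minimal sketch (the function name is my own illustration, not from the cited work):

```python
import math

def bits_resolved(n_options):
    """Uncertainty resolved by choosing one of n equally likely options, in bits."""
    return math.log2(n_options)

# A yes/no answer resolves 1 bit; one choice among 8 options resolves 3 bits.
```

On this reading, “access” concerns such content and data, whereas what a feeling feels like is not one more option among n.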

Access. There is no such thing as an unfelt feeling (even though Pereira invokes some incoherent Freudian notions that became enshrined in the myth of an unconscious “mind,” which would have amounted to an unconscious consciousness): 

“Sentience can also be conceived as co-extensive to the (Freudian) ‘unconscious’… In Freudian psychotherapy, the ways of sentience are classically understood as unconscious processes of the Id and Ego that influence emotional states and behavior”

The contents we access are the contents that it feels like something to know (or believe you know, Burton 2008). If I am trying to remember someone’s name, then unless I can retrieve it I do not have access to it. If and when I retrieve it, not only do I have access to the name, so I can tell someone what it is, and phone the person, etc., but, as with everything else one knows, it feels like something to know that name, a feeling to which I had no “access” when I couldn’t remember the name. Both knowing the name and not knowing the name were sentient states; what changed was not a kind of consciousness, but access to information (data): to the content that one was conscious of, along with what it felt like to have that access. Computers have access to information, but because they are insentient, it does not feel like anything to “have” that information. And for sentient organisms it not only feels like something to see and hear, to be happy or sad, and to want or seek something, but also to reason about, believe, understand or know something.  

Problems: Easy, Hard and Other. Pereira unfortunately gets Chalmers’s “easy” and “hard” problem very wrong:

“the study of sentience is within the ‘Easy Problems’ conceived by Chalmers (1995), while explaining full consciousness is the ‘Hard Problem’.”

The “easy problem” of cognitive science is to explain causally how and why organisms can do all the (observable) things that organisms are able to do (from seeing and hearing and moving to learning and reasoning and communicating), including what their brains and bodies can do internally (i.e., their neuroanatomy, neurophysiology and neurochemistry, including homeostasis and allostasis). To explain these capacities is to “reverse-engineer them” so as to identify, demonstrate and describe the underlying causal mechanisms that produce the capacities (Turing 1950/2009). 

The “hard problem” is to explain how and why (sentient) organisms can feel. Feelings are not observable (to anyone other than the feeler); only the doings that are correlated with them are observable. This is the “other-minds problem” (Harnad 2016), which is neither the easy problem nor the hard problem. But Pereira writes:

“On the one hand, [1] sentience, as the capacity of controlling homeostasis with the generation of adaptive feelings, can be studied by means of biological structures and functions, as empirical registers of ionic waves and the lactate biomarker. On the other hand, [2] conscious first-person experiences in episodes containing mental representations and with attached qualia cannot be reduced to their biological correlates, neuron firings and patterns of connectivity, as famously claimed by Chalmers (1995).”

Integration. Regarding [1]: it is true that some thinkers (notably Damasio 1999) have tried to link consciousness to homeostasis, perhaps because of the hunch that many thinkers (e.g., Baars 1997) have had that consciousness, too, may have something to do with monitoring and integrating many distributed activities, just as homeostasis does. But I’m not sure others would go so far as to say that sentience (feeling) is the capacity to control homeostasis; in fact, it’s more likely to be the other way round. And the ionic waves and lactate biomarker sound like Pereira’s own conjecture.

Regarding [2], the Mustelidian mist is so thick that it is difficult to find one’s way: “conscious, first-person experiences” (i.e., “felt, feeler’s feelings”) is just a string of redundant synonyms: Unfelt states are not conscious states; can feelings be other than 1st-personal? How is an unfelt experience an experience? “Mental” is another weasel-word for felt. “Representation” is the most widely used weasel-word in cognitive science and refers to whatever state or process the theorist thinks is going on in a head (or a computer). The underlying intuition seems to be something like an internal pictorial or verbal “representation” or internal model of something else. So, at bottom “internally represented” just means internally coded, somehow. “Qualia” are again just feelings. And, yes, correlation is not causation; nor is it causal explanation. That’s why the hard problem is hard.

None of this is clarified by statements like the following (which I leave it to the reader to try to de-weasel):

“sentience can be understood as potential consciousness: the capacity to feel (proprioceptive and exteroceptive sensations, emotional feelings) and to have qualitative experiences, while cognitive consciousness refers to the actual experience of thinking with mental representations.”

Ethical Priority of Sentience. But, not to end on a negative note: not only is Pereira right to stress sentience and its biomarkers in the assessment of chronic vegetative states in human patients, but, inasmuch as he (rightly) classifies the capacity to feel pain as sentience (rather than as “cognitive consciousness”), Pereira also accords sentience the ethical priority that it merits wherever it occurs, whether in our own or any other sentient species (Mikhalevich & Powell 2020). 

References

Baars B. (1997). In the Theater of Consciousness: The Workspace of the Mind. New York: Oxford University Press. 

Birch, J. (2017). Animal sentience and the precautionary principle. Animal Sentience, 16(1).

Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences, 18(2), 227-247.

Burton, R. A. (2008). On Being Certain: Believing You Are Right Even When You’re Not. New York: Macmillan Publishers/St. Martin’s Press.

Cole, C. (1993). Shannon revisited: Information in terms of uncertainty. Journal of the American Society for Information Science, 44(4), 204-211.

Damasio, A. (1999) The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt. 

Harnad, S. (2016). Animal sentience: The other-minds problem. Animal Sentience, 1(1).

Mikhalevich, I. and Powell, R. (2020). Minds without spines: Evolutionarily inclusive animal ethics. Animal Sentience, 29(1).

Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435-450.

Pereira, O. (2021). The Role of Sentience in the Theory of Consciousness and Medical Practice. Journal of Consciousness Studies (Special Issue on “Sentience and Consciousness”).

Turing, A. M. (2009). Computing machinery and intelligence. In Parsing the Turing Test (pp. 23-65). Springer, Dordrecht.
