Harnad, S. (2021). On the (Too) Many Faces of Consciousness. Journal of Consciousness Studies, 28(7-8), 61-66.
Abstract: Pereira, like some other authors, both likens and links consciousness to homeostasis. He accordingly recommends that some measures of homeostasis be taken into account as biomarkers for sentience in patients who are in a chronic vegetative state. He rightly defines “sentience” as the capacity to feel (anything). But the message is scrambled in an incoherent welter of weasel-words for consciousness and the notion (in which he is also not alone) that there are two consciousnesses (sentience and “cognitive consciousness”). I suggest that one “hard problem” of consciousness is already more than enough.
Homeostasis. A thermostat is a homeostat. It measures temperature and controls a furnace. It “defends” a set temperature by turning the furnace on when the temperature falls below the set point and turning it off when the temperature rises above the set point. This process of keeping variables within set ranges is called homeostasis. A higher-order form of homeostasis (“allostasis”) would be an integrative control system that received the temperatures from a fleet of different thermostats, furnaces and climates, did computations on them all based on the available fuel for the furnaces and the pattern of changes in the temperatures, and dynamically modified their set-points so as to defend an overall optimum for the whole fleet.
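To make the two levels of control concrete, here is a minimal sketch in Python (mine, not Pereira’s: the class names, the numbers and the fuel-budget rule are all illustrative assumptions). Each Thermostat defends its own set-point with simple on/off control; the Allostat monitors the whole fleet and re-tunes those set-points against a shared fuel budget.

    class Thermostat:
        """A homeostat: defends one set-point with on/off control."""
        def __init__(self, set_point, band=0.5):
            self.set_point = set_point
            self.band = band          # hysteresis band around the set-point
            self.furnace_on = False

        def step(self, temperature):
            # Turn the furnace on below the band, off above it.
            if temperature < self.set_point - self.band:
                self.furnace_on = True
            elif temperature > self.set_point + self.band:
                self.furnace_on = False
            return self.furnace_on


    class Allostat:
        """A higher-order controller: re-tunes each thermostat's set-point
        so the whole fleet stays within a shared fuel budget."""
        def __init__(self, thermostats, fuel_budget):
            self.thermostats = thermostats
            self.fuel_budget = fuel_budget  # max furnaces allowed on at once

        def step(self, temperatures):
            demand = sum(t.step(temp)
                         for t, temp in zip(self.thermostats, temperatures))
            if demand > self.fuel_budget:
                # Too much demand: defend the fleet-wide optimum by
                # lowering every set-point a little.
                for t in self.thermostats:
                    t.set_point -= 0.1
            else:
                # Slack in the budget: let set-points drift back up.
                for t in self.thermostats:
                    t.set_point = min(t.set_point + 0.05, 20.0)
            return demand


    # Example: three rooms in different "climates", fuel for two furnaces.
    fleet = [Thermostat(20.0) for _ in range(3)]
    controller = Allostat(fleet, fuel_budget=2)
    print(controller.step([18.0, 19.5, 21.0]))  # 1 furnace on; set-points re-tuned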
Biological organisms’ bodies have homeostatic and allostatic mechanisms of this kind, ranging over functions like temperature, heart-rate, blood-sugar, immune responses, breathing and balance — functions we would call “vegetative” — as well as functions we consider “cognitive,” such as attention and memory.
Sentience. Pereira (2021) rightly distinguishes between sentience itself — any state that it feels like something to be in — and cognition, which is also sentient but involves more complicated thought processes, especially verbal ones. “Cognitive,” however, is often a weasel-word — one of many weasel-words spawned by our unsuccessful efforts to get a handle on “consciousness,” which is itself a weasel-word for sentience, which simply means feeling (feeling anything at all, from warm or tired or hungry, to angry or joyful or jealous, including what it feels like to see, hear, touch, taste or smell something, and what it feels like to understand (or think you understand) the meaning of a sentence or the proof of a theorem).
When Pereira speaks of sentience, however, he usually means it literally: a state is sentient if it is felt (e.g., pain); and an organism is sentient if it is able to feel. The main point of Pereira’s paper is that the tests for “consciousness” in human patients who are in a chronic vegetative state are insufficient. Such patients cannot make voluntary movements, nor can they understand or respond to language, but they still have sleeping and waking states, as well as reflexes, including eye-opening, chewing and some vocalizations; and their homeostatic vegetative functions persist.
Vegetative states. Pereira insists, rightly, that if patients in a chronic vegetative state can still feel (e.g., pain) then they are still sentient. With Laureys (2019) and others, he holds that there are two networks for “awareness” (another weasel-word), one related to wakefulness and the other to “cognitive representations of the environment” (more weasel-words). Pereira accordingly recommends homeostasis-related measures such as lactate concentrations in the cerebrospinal fluid and astrocyte transport “waves” as biomarkers for sentience where behavioral tests and cerebral imagery draw a blank.
This seems reasonable enough. The “precautionary principle” (Birch 2017) dictates that the patients should be given every benefit of the doubt about whether they can feel. But what about these two networks of “awareness/consciousness/subjectivity” and their many other variants (“qualia” — “noetic” and “anoetic,” “internal” and “external”) and the very notion of two kinds of “consciousnesses”: “cognitive” and “noncognitive”?
Weasel-Words. Weasel-words are words used (deliberately or inadvertently) to mask redundancy or incoherence. They often take the form of partial synonyms that give the impression that there are more distinct entities or variables at issue than there really are. Such is the case with the Protean notion of “consciousness,” for which there are countless Mustelidian memes besides the ones I’ve already mentioned, including: subjective states, conscious states, mental states, phenomenal states, qualitative states, intentional states, intentionality, subjectivity, mentality, private states, 1st-person view, 3rd-person view, contentful states, reflexive states, representational states, sentient states, experiential states, reflexivity, self-awareness, self-consciousness, sentience, raw feels, feeling, experience, soul, spirit, mind…
I think I know where the confusion resides, and also, if not when the confusion started, at least when it was compounded and widely propagated: by Block’s (1995) target article in Behavioral and Brain Sciences, “On a confusion about a function of consciousness.” It was there that Block baptized an explicit distinction between (at least) two “consciousnesses”: “phenomenal consciousness” and “access consciousness”:
“Consciousness is a mongrel concept: there are a number of very different ‘consciousnesses.’ Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action.”
Feeling. What Block meant by “phenomenal consciousness” is obviously sentience; and what he meant to say (in a needlessly Nagelian way) is that there is something it feels like (not the needlessly vague and equivocal “is like” of Nagel 1974) to be in a sentient state. In a word, a sentient state is a felt state. (There is something it “is like” for water to be in a state of boiling: that something is what happens to the state of water above the temperature of 212 degrees Fahrenheit; but it does not feel like anything to the water to be in that state — only to a sentient organism that makes the mistake of reaching into the water.)
Block’s “access consciousness,” in contrast, is no kind of “consciousness” at all, although it is indeed also a sentient state — unless there are not only “a number of very different ‘consciousnesses’” but an infinite number, one for every possible feeling that can be felt, from what it feels like for a human to hear an oboe play an A at 440 Hz, to what it feels like to hear an oboe play an A at 444 Hz, to what it feels like to know that Trump is finally out of the White House.
Information. No, those are not different consciousnesses; they are differences in the content of “consciousness,” which, once de-weaseled, just means differences in what different feelings feel like. As to the “access” in the notion of an “access consciousness,” it just pertains to access to that felt content, along with the information (data) that the feelings accompany. Information, in turn, is anything that resolves uncertainty about what to do, among a finite number of options (Cole 1993).
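On this Shannon-style account (the one Cole revisits), information can be quantified as the uncertainty it resolves: over n equally likely options, the uncertainty is log2(n) bits. A toy illustration in Python (my own, not from the commentary or from Cole):

    import math

    def uncertainty(n_options: int) -> float:
        """Shannon uncertainty, in bits, over n equally likely options."""
        return math.log2(n_options)

    # A signal that narrows 8 equally likely options down to 2 resolves
    # log2(8) - log2(2) = 3 - 1 = 2 bits of uncertainty.
    print(f"information gained: {uncertainty(8) - uncertainty(2)} bits")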
Access. There is no such thing as an unfelt feeling (even though Pereira invokes some incoherent Freudian notions that became enshrined in the myth of an unconscious “mind,” which would have amounted to an unconscious consciousness):
“Sentience can also be conceived as co-extensive to the (Freudian) ‘unconscious’… In Freudian psychotherapy, the ways of sentience are classically understood as unconscious processes of the Id and Ego that influence emotional states and behavior.”
The contents we access are the contents that it feels like something to know (or believe you know, Burton 2008). If I am trying to remember someone’s name, then unless I can retrieve it I do not have access to it. If and when I retrieve it, not only do I have access to the name, so I can tell someone what it is, and phone the person, etc., but, as with everything else one knows, it feels like something to know that name, a feeling to which I had no “access” when I couldn’t remember the name. Both knowing the name and not knowing the name were sentient states; what changed was not a kind of consciousness, but access to information (data): to the content that one was conscious of, along with what it felt like to have that access. Computers have access to information, but because they are insentient, it does not feel like anything to “have” that information. And for sentient organisms it not only feels like something to see and hear, to be happy or sad, and to want or seek something, but also to reason about, believe, understand or know something.
Problems: Easy, Hard and Other. Pereira unfortunately gets Chalmers’s “easy” and “hard” problems very wrong:
“the study of sentience is within the ‘Easy Problems’ conceived by Chalmers (1995), while explaining full consciousness is the ‘Hard Problem’.”
The “easy problem” of cognitive science is to explain causally how and why organisms can do all the (observable) things that organisms are able to do (from seeing and hearing and moving to learning and reasoning and communicating), including what their brains and bodies can do internally (i.e., their neuroanatomy, neurophysiology and neurochemistry, including homeostasis and allostasis). To explain these capacities is to “reverse-engineer” them so as to identify, demonstrate and describe the underlying causal mechanisms that produce the capacities (Turing 1950/2009).
The “hard problem” is to explain how and why (sentient) organisms can feel. Feelings are not observable (to anyone other than the feeler); only the doings that are correlated with them are observable. This is the “other-minds problem” (Harnad 2016), which is neither the easy problem nor the hard problem. But Pereira writes:
“On the one hand, [1] sentience, as the capacity of controlling homeostasis with the generation of adaptive feelings, can be studied by means of biological structures and functions, as empirical registers of ionic waves and the lactate biomarker. On the other hand, [2] conscious first-person experiences in episodes containing mental representations and with attached qualia cannot be reduced to their biological correlates, neuron firings and patterns of connectivity, as famously claimed by Chalmers (1995).”
Integration. Regarding [1]: it is true that some thinkers (notably Damasio 1999) have tried to link consciousness to homeostasis, perhaps because of the hunch that many thinkers (e.g., Baars 1997) have had that consciousness, too, may have something to do with monitoring and integrating many distributed activities, just as homeostasis does. But I’m not sure others would go so far as to say that sentience (feeling) is the capacity to control homeostasis; in fact, it’s more likely to be the other way round. And the ionic waves and the lactate biomarker sound like Pereira’s own conjecture.
Regarding [2], the Mustelidian mist is so thick that it is difficult to find one’s way: “conscious first-person experiences” (i.e., “felt, feeler’s feelings”) is just a string of redundant synonyms: unfelt states are not conscious states; can feelings be other than 1st-personal? How is an unfelt experience an experience? “Mental” is another weasel-word for felt. “Representation” is the most widely used weasel-word in cognitive science and refers to whatever state or process the theorist thinks is going on in a head (or a computer). The underlying intuition seems to be something like an internal pictorial or verbal “representation” or internal model of something else. So, at bottom, “internally represented” just means internally coded, somehow. “Qualia” are again just feelings. And, yes, correlation is not causation; nor is it causal explanation. That’s why the hard problem is hard.
None of this is clarified by statements like the following (which I leave to the reader to try to de-weasel):
“sentience can be understood as potential consciousness: the capacity to feel (proprioceptive and exteroceptive sensations, emotional feelings) and to have qualitative experiences, while cognitive consciousness refers to the actual experience of thinking with mental representations.”
Ethical Priority of Sentience. But, not to end on a negative note: not only is Pereira right to stress sentience and its biomarkers in the assessment of chronic vegetative states in human patients, but, inasmuch as he (rightly) classifies the capacity to feel pain as sentience (rather than as “cognitive consciousness”), Pereira also accords sentience the ethical priority that it merits wherever it occurs, whether in our own or any other sentient species (Mikhalevich & Powell 2020).
References
Baars, B. (1997). In the Theater of Consciousness: The Workspace of the Mind. New York: Oxford University Press.
Birch, J. (2017). Animal sentience and the precautionary principle. Animal Sentience, 16(1).
Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences, 18(2), 227-247.
Burton, R. A. (2008). On Being Certain: Believing You Are Right Even When You’re Not. New York: Macmillan/St. Martin’s Press.
Cole, C. (1993). Shannon revisited: Information in terms of uncertainty. Journal of the American Society for Information Science, 44(4), 204-211.
Damasio, A. (1999). The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt.
Harnad, S. (2016). Animal sentience: The other-minds problem. Animal Sentience, 1(1).
Mikhalevich, I., & Powell, R. (2020). Minds without spines: Evolutionarily inclusive animal ethics. Animal Sentience, 29(1).
Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435-450.
Pereira Jr., A. (2021). The role of sentience in the theory of consciousness and medical practice. Journal of Consciousness Studies, 28(7-8) (Special Issue on “Sentience and Consciousness”).
Turing, A. M. (2009). Computing machinery and intelligence. In Parsing the Turing Test (pp. 23-65). Dordrecht: Springer. (Original work published 1950.)