{"id":2050,"date":"2023-12-03T14:33:41","date_gmt":"2023-12-03T14:33:41","guid":{"rendered":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/?p=2050"},"modified":"2023-12-03T14:33:41","modified_gmt":"2023-12-03T14:33:41","slug":"on-the-too-many-faces-of-consciousness","status":"publish","type":"post","link":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/12\/03\/on-the-too-many-faces-of-consciousness\/","title":{"rendered":"On the (too) many faces of consciousness"},"content":{"rendered":"\n<p class=\"has-small-font-size\">Harnad, S. (2021). <a href=\"https:\/\/www.ingentaconnect.com\/content\/imp\/jcs\/2021\/00000028\/f0020007\/art00005\">On the (Too) Many Faces of Consciousness<\/a>.&nbsp;<em>Journal of Consciousness Studies<\/em>,&nbsp;<em>28<\/em>(7-8), 61-66.<\/p>\n\n\n\n<p><strong>Abstract:&nbsp;&nbsp;<\/strong>Pereira, like some other authors, both likens and links consciousness to homeostasis. He accordingly recommends that some measures of homeostasis be taken into account as biomarkers for sentience in patients who are in a chronic vegetative state. He rightly defines \u201csentience\u201d as the capacity to feel (anything). But the message is scrambled in an incoherent welter of weasel-words for consciousness and the notion (in which he is also not alone) that there are&nbsp;<em>two<\/em>&nbsp;consciousnesses (sentience and \u201ccognitive consciousness\u201d). I suggest that one \u201chard problem\u201d of consciousness is already more than enough.<strong><\/strong><\/p>\n\n\n\n<p><strong>Homeostasis.<\/strong>&nbsp;A thermostat is a homeostat. It measures temperature and controls a furnace. It \u201cdefends\u201d a set temperature by turning on the furnace when the temperature falls below the set point and turning the furnace off when it goes above the set point. This process of keeping variables within set ranges is called homeostasis. 
A higher order of homeostasis (\u201callostasis\u201d) would be an integrative control system that received the temperature from a fleet of different thermostats, furnaces and climates, doing computations on them all based on the available fuel for the furnaces and the pattern of changes in the temperatures, dynamically modifying their set-points so as to defend an overall optimum for the whole fleet.<\/p>\n\n\n\n<p>Biological organisms\u2019 bodies have homeostatic and allostatic mechanisms of this kind, ranging over functions like temperature, heart-rate, blood-sugar, immune responses, breathing and balance \u2013 functions we would call \u201cvegetative\u201d \u2013 as well as functions we consider \u201ccognitive,\u201d such as attention and memory.<\/p>\n\n\n\n<p><strong>Sentience.<\/strong>&nbsp;Pereira (2021) rightly distinguishes between&nbsp;<em>sentience<\/em>&nbsp;itself \u2013 any state that it feels like something to be in \u2013 and&nbsp;<em>cognition<\/em>, which is also sentient, but involves more complicated thought processes, especially verbal ones. \u201cCognitive,\u201d however, is often a weasel-word \u2013 one of many weasel-words spawned by our unsuccessful efforts to get a handle on \u201cconsciousness,\u201d which is itself a weasel-word for&nbsp;<em>sentience<\/em>, which simply means&nbsp;<em>feeling<\/em>&nbsp;(feeling anything at all, from warm or tired or hungry, to angry or joyful or jealous, including what it feels like to see, hear, touch, taste or smell something, and what it feels like to understand (or think you understand) the meaning of a sentence or the proof of a theorem).<\/p>\n\n\n\n<p>When Pereira speaks of sentience, however, he usually means it literally: a state is sentient if it is felt (e.g., pain); and an organism is sentient if it is able to feel. The main point of Pereira\u2019s paper is that the tests for \u201cconsciousness\u201d in human patients who are in a chronic vegetative state are insufficient. 
Such patients cannot make voluntary movements, nor can they understand or respond to language, but they still have sleeping and waking states, as well as reflexes, including eye-opening, chewing and some vocalizations; and their homeostatic vegetative functions persist.&nbsp;<\/p>\n\n\n\n<p><strong>Vegetative states.<\/strong>&nbsp;Pereira insists, rightly, that if patients in a chronic vegetative state can still feel (e.g., pain) then they are still sentient. With Laureys (2019) and others, he holds that there are two networks for \u201cawareness\u201d (another weasel-word), one related to wakefulness and the other to \u201ccognitive representations of the environment\u201d (more weasel-words). Pereira accordingly recommends homeostasis-related measures such as lactate concentrations in the cerebrospinal fluid and astrocyte transport \u201cwaves\u201d as biomarkers for sentience where behavioral tests and cerebral imagery draw a blank.<\/p>\n\n\n\n<p>This seems reasonable enough. The \u201cprecautionary principle\u201d (Birch 2017) dictates that the patients should be given every benefit of the doubt about whether they can feel. But what about these two networks of \u201cawareness\/consciousness\/subjectivity\u201d and their many other variants (\u201cqualia\u201d \u2013 \u201cnoetic\u201d and \u201canoetic,\u201d \u201cinternal\u201d and \u201cexternal\u201d) and the very notion of two kinds of \u201cconsciousnesses\u201d: \u201ccognitive\u201d and \u201cnoncognitive\u201d?<\/p>\n\n\n\n<p><strong>Weasel-Words.<\/strong>&nbsp;Weasel-words are words used (deliberately or inadvertently) to mask redundancy or incoherence. They often take the form of partial synonyms that give the impression that there are more distinct entities or variables at issue than there really are. 
Such is the case with the Protean notion of \u201cconsciousness,\u201d for which there are countless Mustelidian memes besides the ones I\u2019ve already mentioned, including: subjective states, conscious states, mental states, phenomenal states, qualitative states, intentional states, intentionality, subjectivity, mentality, private states, 1st-person view, 3rd person view, contentful states, reflexive states, representational states, sentient states,&nbsp;experiential&nbsp;states, reflexivity, self-awareness, self-consciousness, sentience, raw feels, feeling, experience, soul, spirit, mind\u2026.<\/p>\n\n\n\n<p>I think I know where the confusion resides, and also, if not when the confusion started, at least when it was compounded and widely propagated by Block\u2019s (1995) target article in&nbsp;<em>Behavioral and Brain Sciences<\/em>&nbsp;\u201cOn a confusion about a function of consciousness.\u201d It was there that Block baptized an explicit distinction between (at least) two \u201cconsciousnesses\u201d: \u201cphenomenal consciousness\u201d and \u201caccess consciousness\u201d:<\/p>\n\n\n\n<p>\u201cConsciousness is a mongrel concept: there are a number of very different \u2018consciousnesses.\u2019 Phenomenal consciousness is experience; the phenomenally conscious aspect of a state is what it is like to be in that state. The mark of access-consciousness, by contrast, is availability for use in reasoning and rationally guiding speech and action.\u201d<\/p>\n\n\n\n<p><strong>Feeling.<\/strong>&nbsp;What Block meant by \u201cphenomenal consciousness\u201d is obviously sentience; and what he meant to say (in a needlessly Nagelian way) is that there is something it&nbsp;<em>feels<\/em>&nbsp;like (not the needlessly vague and equivocal \u201c<em>is<\/em>&nbsp;like\u201d of Nagel 1974) to be in a sentient state. In a word, a sentient state is a&nbsp;<em>felt<\/em>&nbsp;state. 
(There is something it \u201cis like\u201d for water to be in a state of boiling: that something is what happens to the state of water above the temperature of 212 degrees Fahrenheit; but it does not feel like anything&nbsp;<em>to the water<\/em>&nbsp;to be in that state \u2013 only to a sentient organism that makes the mistake of reaching into the water.)<\/p>\n\n\n\n<p>Block\u2019s \u201caccess consciousness,\u201d in contrast, is no kind of&nbsp;\u201cconsciousness\u201d at all, although it is indeed also a sentient state &#8212; unless there are not only \u201ca number of very different \u2018consciousnesses\u2019,\u201d but an&nbsp;<em>infinite<\/em>&nbsp;number, one for every possible feeling that can be felt, from what it feels like for a human to hear an oboe play an A at 440 Hz, to what it feels like to hear an oboe play an A at 444 Hz, to what it feels like to know that Trump is finally out of the White House.&nbsp;<\/p>\n\n\n\n<p><strong>Information<\/strong>. No, those are not different consciousnesses; they are differences in the&nbsp;<em>content<\/em>&nbsp;of \u201cconsciousness,\u201d which, once de-weaseled, just means differences in what different feelings feel like. As to the \u201caccess\u201d in the notion of an \u201caccess consciousness,\u201d it just pertains to access to that felt content, along with the information (data) that the feelings accompany. 
Information, in turn, is anything that resolves&nbsp;<em>uncertainty about what to do<\/em>, among a finite number of options (Cole 1993).<\/p>\n\n\n\n<p><strong>Access.<\/strong>&nbsp;There is no such thing as an unfelt feeling (even though Pereira invokes some incoherent Freudian notions that became enshrined in the myth of an unconscious \u201cmind,\u201d which would have amounted to an unconscious consciousness):&nbsp;<\/p>\n\n\n\n<p>&#8220;Sentience can also be conceived as co-extensive to the (Freudian) \u2018unconscious\u2019&#8230; In Freudian psychotherapy, the ways of sentience are classically understood as unconscious processes of the Id and Ego that influence emotional states and behavior&#8221;<\/p>\n\n\n\n<p>The contents we access are the contents that it feels like something to know (or believe you know, Burton 2008). If I am trying to remember someone\u2019s name, then unless I can retrieve it I do not have access to it. If and when I retrieve it, not only do I have access to the name, so I can tell someone what it is, and phone the person, etc., but, as with everything else one knows, it&nbsp;<em>feels like something<\/em>&nbsp;to know that name, a feeling to which I had no \u201caccess\u201d when I couldn\u2019t remember the name. Both knowing the name and not knowing the name were sentient states; what changed was not a kind of consciousness, but access to information (data): to the content that one was conscious of, along with what it felt like to have that access. Computers have access to information, but because they are insentient, it does not feel like anything to \u201chave\u201d that information. 
And for sentient organisms it not only feels like something to see and hear, to be happy or sad, and to want or seek something, but also to reason about, believe, understand or know something.&nbsp;<\/p>\n\n\n\n<p><strong>Problems: Easy, Hard and Other.<\/strong>&nbsp;Pereira unfortunately gets Chalmers\u2019s \u201ceasy\u201d and \u201chard\u201d problems very wrong:<\/p>\n\n\n\n<p>&#8220;the study of sentience is within the \u2018Easy Problems\u2019 conceived by Chalmers (1995), while explaining full consciousness is the \u2018Hard Problem\u2019.&#8221;<\/p>\n\n\n\n<p>The \u201ceasy problem\u201d of cognitive science is to explain causally how and why organisms can&nbsp;<em>do<\/em>&nbsp;all the (observable) things that organisms are able to do (from seeing and hearing and moving to learning and reasoning and communicating), including what their brains and bodies can do internally (i.e., their neuroanatomy, neurophysiology and neurochemistry, including homeostasis and allostasis). To explain these capacities is to \u201creverse-engineer\u201d them so as to identify, demonstrate and describe the underlying causal mechanisms that produce the capacities (Turing 1950\/2009).&nbsp;<\/p>\n\n\n\n<p>The \u201chard problem\u201d is to explain how and why (sentient) organisms can feel. Feelings are not observable (to anyone other than the feeler); only the doings that are correlated with them are observable. This is the \u201cother-minds problem\u201d (Harnad 2016), which is neither the easy problem nor the hard problem. But Pereira writes:<\/p>\n\n\n\n<p>&#8220;On the one hand, [1] sentience, as the capacity of controlling homeostasis with the generation of adaptive feelings, can be studied by means of biological structures and functions, as empirical registers of ionic waves and the lactate biomarker. 
On the other hand, [2] conscious first-person experiences in episodes containing mental representations and with attached qualia cannot be reduced to their biological correlates, neuron firings and patterns of connectivity, as famously claimed by Chalmers (1995).&#8221;<\/p>\n\n\n\n<p><strong>Integration.<\/strong>&nbsp;Regarding [1], it is true that some thinkers (notably Damasio 1999) have tried to link consciousness to homeostasis, perhaps because of the hunch that many thinkers (e.g., Baars 1997) have had that consciousness, too, may have something to do with monitoring and integrating many distributed activities, just as homeostasis does. But I\u2019m not sure others would go so far as to say that sentience (feeling) is the capacity to control homeostasis; in fact, it\u2019s more likely to be the other way round. And the ionic waves and lactate biomarker sound like Pereira\u2019s own conjecture.<\/p>\n\n\n\n<p>Regarding [2], the Mustelidian mist is so thick that it is difficult to find one\u2019s way: \u201cconscious, first-person experiences\u201d (i.e., \u201cfelt, feeler\u2019s feelings\u201d) is just a string of redundant synonyms: Unfelt states are not conscious states; can feelings be other than 1st-personal? How is an unfelt experience an experience? \u201cMental\u201d is another weasel-word for&nbsp;<em>felt<\/em>. \u201cRepresentation\u201d is the most widely used weasel-word in cognitive science and refers to whatever state or process the theorist thinks is going on in a head (or a computer). The underlying intuition seems to be something like an internal pictorial or verbal \u201crepresentation\u201d or internal model of something else. So, at bottom \u201cinternally represented\u201d just means internally coded, somehow. \u201cQualia\u201d are again just feelings. And, yes, correlation is not causation; nor is it causal explanation. 
That\u2019s why the hard problem is hard.<\/p>\n\n\n\n<p>None of this is clarified by statements like the following (which I leave it to the reader to try to de-weasel):<\/p>\n\n\n\n<p>\u201csentience can be understood as potential consciousness: the capacity to feel (proprioceptive and exteroceptive sensations, emotional feelings) and to have qualitative experiences, while cognitive consciousness refers to the actual experience of thinking with mental representations.\u201d<\/p>\n\n\n\n<p><strong>Ethical Priority of Sentience.<\/strong>&nbsp;But, not to end on a negative note: not only is Pereira right to stress sentience and its biomarkers in the assessment of chronic vegetative states in human patients, but, inasmuch as he (rightly) classifies the capacity to feel pain as sentience (rather than as \u201ccognitive consciousness\u201d), Pereira also accords sentience the ethical priority that it merits wherever it occurs, whether in our own or any other sentient species (Mikhalevich &amp; Powell 2020).&nbsp;<\/p>\n\n\n\n<p><strong>References<\/strong><\/p>\n\n\n\n<p>Baars B. (1997).&nbsp;<em>In the Theater of Consciousness: The Workspace of the Mind<\/em>. New York: Oxford University Press.&nbsp;<\/p>\n\n\n\n<p>Birch, J. (2017)&nbsp;<a href=\"https:\/\/www.wellbeingintlstudiesrepository.org\/cgi\/viewcontent.cgi?article=1200&amp;context=animsent\">Animal sentience and the precautionary principle<\/a>.&nbsp;<em>Animal Sentience<\/em>&nbsp;16(1)<\/p>\n\n\n\n<p>Block, N. (1995).&nbsp;<a href=\"http:\/\/cogprints.org\/231\/1\/199712004.html\">On a confusion about a function of consciousness<\/a>.&nbsp;<em>Behavioral and Brain Sciences<\/em>, 18(2), 227-247.&nbsp;<\/p>\n\n\n\n<p>Burton, RA. (2008)&nbsp;<em>On Being Certain: Believing You Are Right Even When You&#8217;re Not.<\/em>&nbsp;&nbsp;New York City: Macmillan Publishers\/St. Martin&#8217;s Press.<\/p>\n\n\n\n<p>Cole, C. (1993). 
Shannon revisited: Information in terms of uncertainty.&nbsp;<em>Journal of the American Society for Information Science<\/em>,&nbsp;<em>44<\/em>(4), 204-211.<\/p>\n\n\n\n<p>Damasio, A. (1999).&nbsp;<em>The Feeling of What Happens: Body and Emotion in the Making of Consciousness<\/em>. New York: Harcourt.&nbsp;<\/p>\n\n\n\n<p>Harnad, S. (2016).&nbsp;<a href=\"https:\/\/www.wellbeingintlstudiesrepository.org\/animsent\/vol1\/iss1\/1\">Animal sentience: The other-minds problem<\/a>.&nbsp;<em>Animal Sentience<\/em>&nbsp;<em>1<\/em>(1).&nbsp;<\/p>\n\n\n\n<p>Mikhalevich, I. and Powell, R. (2020).&nbsp;<a href=\"https:\/\/www.wellbeingintlstudiesrepository.org\/animsent\/vol5\/iss29\/1\">Minds without spines: Evolutionarily inclusive animal ethics<\/a>.&nbsp;<em>Animal Sentience<\/em>&nbsp;29(1).<\/p>\n\n\n\n<p>Nagel, T. (1974).&nbsp;<a href=\"https:\/\/www.jstor.org\/stable\/2183914\">What is it like to be a bat?<\/a>&nbsp;<em>The Philosophical Review<\/em>, 83(4), 435-450.<\/p>\n\n\n\n<p>Pereira Jr., A. (2021).&nbsp;The Role of Sentience in the Theory of Consciousness and Medical Practice.&nbsp;<em>Journal of Consciousness Studies<\/em>&nbsp;(Special Issue on &#8220;Sentience and Consciousness&#8221;).&nbsp;<\/p>\n\n\n\n<p>Turing, A. M. 
(2009).&nbsp;<a href=\"https:\/\/watermark.silverchair.com\/lix-236-433.pdf?token=AQECAHi208BE49Ooan9kkhW_Ercy7Dm3ZL_9Cf3qfKAc485ysgAAAq0wggKpBgkqhkiG9w0BBwagggKaMIIClgIBADCCAo8GCSqGSIb3DQEHATAeBglghkgBZQMEAS4wEQQM2y9uRC9_jzLa9WJjAgEQgIICYJHZkI2Q9QnlMkCsTYptutvfgZxAY8bfpgueV37c4zLYvTT-MLxQdbXGI4ZAFogvpRIpRek-qhM3jZOeI3ZAVB7PZLwQ0e5ZNofMoWLGgJofuAGoD9CqFS1QxnezYc0yMLOwzFEJEumrV3U2jRewisTMsTQtEKHMbJzS-SjypgHXT8yS4YGQw8kwIITqprKkpqrXG9Bk9sD0w559uqONR4Qp4hkeP5OaeCEXeSzz6r9dBm0ZYbX5LT7sD7NR0eBuhUCNnBfCzU0ZGR21yrZXh426vtgsm4fRTYVhApVzUw-XlR7304lLUF0DX5OAkcRUwmnJ6I5BaIm6D9o--Gfo4mZyq9HyD3ICCqQB-wY91BJaRKJVUVIAlzmrNpdbi92Pju4qcL04MVDQB0fiovbSb9LobB5UhgUMbGalqFb5A5jMqovs2ZlZL_m6BnzLl-OmUSAebJjAncNFYIfbz_l_ZXHdAv4wkBDAdcSykKHfYJZVnVLiXepFMaiyryOish08kn4Lys9WMbqyS1XGjSAddvAYTFBRXrFm0_pEQ8XWJFoib_bVivHW2EIWehSLKmvrun7MvzY-NW0HaVzA3-z-LaAQijHjnjMqf8fqPbjwyf47aRJKe91MRieULXJJWwXed8otCx7Ai5K8r_oXRBl8SWoVGl93gRQwSFwY3LxTJT7MEALtgmDyX9ZxQWTn2VTcSKZAFx2SGlQ5ba2iHiVboj3Thu9IoIF_G079ags0-Yed2thgo3tCQdKALCoV7Z79ImO7TFixChOAiCjGgMCOSJ_973xNBUeJMsVuRqGuzJsK\">Computing machinery and intelligence<\/a>. In&nbsp;<em>Parsing the Turing Test<\/em>&nbsp;(pp. 23-65). Springer, Dordrecht.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Harnad, S. (2021). On the (Too) Many Faces of Consciousness.&nbsp;Journal of Consciousness Studies,&nbsp;28(7-8), 61-66. Abstract:&nbsp;&nbsp;Pereira, like some other authors, both likens and links consciousness to homeostasis. He accordingly recommends that some measures of homeostasis be taken into account as biomarkers for sentience in patients who are in a chronic vegetative state. 
He rightly defines \u201csentience\u201d &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/12\/03\/on-the-too-many-faces-of-consciousness\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;On the (too) many faces of consciousness&#8221;<\/span><\/a><\/p>\n","protected":false},"author":3074,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-2050","post","type-post","status-publish","format-standard","hentry","category-hard-problem-consciousness"],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/2050","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/users\/3074"}],"replies":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/comments?post=2050"}],"version-history":[{"count":2,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/2050\/revisions"}],"predecessor-version":[{"id":2052,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/2050\/revisions\/2052"}],"wp:attachment":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/media?parent=2050"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/categories?post=2050"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/tags?post=2050"}
],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}