{"id":1853,"date":"2023-05-16T20:54:33","date_gmt":"2023-05-16T19:54:33","guid":{"rendered":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/?p=1853"},"modified":"2023-05-19T22:06:07","modified_gmt":"2023-05-19T21:06:07","slug":"semantic-ghosts-in-syntax","status":"publish","type":"post","link":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/05\/16\/semantic-ghosts-in-syntax\/","title":{"rendered":"Semantic Ghosts in Syntax"},"content":{"rendered":"\n<p><strong>&#8220;<a href=\"https:\/\/www.nytimes.com\/2023\/05\/16\/technology\/microsoft-ai-human-reasoning.html?smid=nytcore-ios-share&amp;referringSource=articleShare\">Microsoft Says New A.I. Shows Signs of Human Reasoning<\/a>&#8220;<\/strong><\/p>\n\n\n\n<p>GPT definitely does not understand. It\u2019s just a computer program plus an unimaginably huge database of the words that countless real thinking people have written, in books, articles, and online media. The software does increasingly sophisticated \u201cfilling in the blanks\u201d when it answers questions, using that enormous data-base of things that other people have written (and spoken) about all kinds of other things. It is amazing how much can be gotten out of such an enormous database by software (which is what GPT is).<\/p>\n\n\n\n<p>GPT neither understands nor means anything with what it says. It is (as\u00a0<a href=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/03\/12\/1827\/\">Emily Bender<\/a>\u00a0has aptly dubbed it) a \u201cstatistical parrot.\u201d It doesn\u2019t parrot verbatim, echolalicly, what words are said to it, but it draws on the combinations of the words said to it, along with all those other words it has ingested, recombining them to fill in the blanks. 
\u201cWhat is the opposite of down?\u201d No one is surprised that GPT says \u201cup.\u201d Well, all those other verbal interactions work the same way; you just need very powerful software and an enormous database to fill in the blanks there too.<\/p>\n\n\n\n<p>One other thing (much more important than fussing about whether GPT thinks, understands or means anything &#8212; it doesn\u2019t): The success of GPT teaches us something about&nbsp;<em>the nature of language itself<\/em>: Language can encode and transmit the thoughts of real, thinking people. The words (in all languages) are arbitrary in shape. (\u201cRound\u201d does not look round.) Single words could all just as well have been strings of dots and dashes in Morse code (in English or any other language; &#8220;round&#8221; = <strong>.-. --- ..- -. -..<\/strong>). The strings of words, though (rather than just sounds or letters), are not quite that arbitrary. Not just because the patterns follow grammatical rules (those rules are arbitrary too, though systematic), but because, once people agree on a vocabulary and grammar, the things they say to one another preserve (in linguistic code) some of the <em>structure of what speakers and hearers are thinking<\/em>, and transmitting, in words. Their form (partly) resembles their meaning. They are the syntactic shadows of semantics.<\/p>\n\n\n\n<p>Language is not video (where it\u2019s form that preserves form). And video is not language. If I want you to know that the cat is on the mat (and the cat really is on the mat, and we are zooming), I can aim the camera at the cat, and if you have functioning eyes, and have seen cats and mats before, you will see that the cat is on the mat. And if you speak English, you can caption what you are seeing with the sentence \u201cThe cat is on the mat.\u201d Now that sentence does not look anything like a cat on a mat. 
But if you speak English, it preserves some of its structure, which is not the same structure as \u201cthe mat is on the cat\u201d or \u201cthe chicken crossed the road.\u201d<\/p>\n\n\n\n<p>But if there were nothing in the world but mats and cats, with one on the other or vice versa, and chickens and roads, with the chicken crossing or not crossing the road, then any trivial scene-describing software could answer questions like: \u201cIs there anything on the mat?\u201d \u201cWhat?\u201d \u201cIs anything crossing the road?\u201d Trivial. It could also easily be done with just words, describing what\u2019s where, in a toy conversational program.<\/p>\n\n\n\n<p>Now scale that up to what GPT can do with words, given an enormous sample of whatever people have said about whatever there is to say, as GPT has. \u201cWho killed Julius Caesar?\u201d \u201cWhat is the proof of Fermat\u2019s last theorem?\u201d That\u2019s all in GPT\u2019s database, in words. And if you ask about more nonsensical things (\u201cIs there a Supreme Being?\u201d \u201cIs there life after death?\u201d \u201cWill Trump stage a second coming?\u201d) it\u2019ll parrot back a gentrified synthesis of the same kinds of nonsense we parrot to one another about that stuff &#8212; and that\u2019s all in the database too.&nbsp;<\/p>\n\n\n\n<p>Every input to GPT is saying or asking something, in the same way. It just sometimes takes more work to fill in the blanks from the database in a way that makes sense. And of course GPT errs or invents where it cannot (yet) fill the blanks. But one of the reasons OpenAI is allowing people to use (a less advanced version of) GPT for free is that another thing GPT can do is <em>learn.<\/em> <span style=\"text-decoration: underline\">Your<\/span> input to GPT can become part of its database. 
And that\u2019s helpful to OpenAI.<\/p>\n\n\n\n<p>It&#8217;s nevertheless remarkable that GPT can do what it does, guided by the formal structure inherent in the words in its database and exchanges. The stochastic parrot is using this property of spoken words to say things with them that make sense to us. A lot of that, GPT can do without help, just filling in the blanks with the ghosts of meaning haunting the structure of the words. And what\u2019s missing, we supply to GPT with our own words as feedback. But it\u2019s doing it all with parrotry, computations, a huge database, and some deep learning.&nbsp;<\/p>\n\n\n\n<p>So what do we have in our heads that GPT <em>doesn\u2019t<\/em> have? Both GPT and we have words. But our words are connected, by our eyes, hands, doings, learnings and brains, to the things those words stand for in the world, their <em>referents<\/em>. \u201cCat\u201d is connected to cats. We have learned to recognize a cat when we see one. And we can tell it apart from a chicken. And we know what to do with cats (hug them) and chickens (also hug them, but more gently). How we can do that is what I\u2019ve dubbed the \u201csymbol grounding problem.\u201d And GPT can\u2019t do it (because it has no body, no sense or movement organs, and no brain with which to learn what to do with what \u2013 including what to call it).<\/p>\n\n\n\n<p>And we know where to go to get the cat, if someone tells us it&#8217;s on the mat. But the sentence \u201cthe cat is on the mat\u201d is connected to the cat\u2019s being on the mat in a way that is different from the way the word \u201ccat\u201d is connected to cats. It\u2019s based on the structure of language, which includes Subject\/Predicate propositions, with truth values (T &amp; F), and negation.<\/p>\n\n\n\n<p>GPT has only the words, and can only fill in the blanks. But the words have more than we give them credit for. 
They are the pale Platonic contours of their meanings: semantic epiphenomena in syntax.<\/p>\n\n\n\n<p>And now, thanks to GPT, I can\u2019t give any more take-home essay exams.<\/p>\n\n\n\n<p>Here are some of my<strong>&nbsp;<a href=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/02\/04\/chats-with-gpt-on-symbol-grounding-and-turing-testing\/\">chats with GPT<\/a><\/strong>.<\/p>\n\n\n\n<p class=\"has-text-align-left\"><small>P.S. If anyone is worrying that I have fallen into an early-Wittgensteinian trap of thinking only about a toy world of referents that are concrete, palpable objects and actions like cats, mats, roads, and crossing, not abstractions like &#8220;justice,&#8221; &#8220;truth&#8221; and &#8220;beauty,&#8221; relax. There&#8217;s already more than that in sentences expressing propositions (the predicative IS-ness of &#8220;the cat is on the mat&#8221; already spawns a more abstract category &#8212; &#8220;being-on-mat&#8221;-ness &#8212; whose name we could, if we cared to, add to our lexicon, as yet another referent). But there&#8217;s also the &#8220;<a href=\"https:\/\/www.southampton.ac.uk\/~harnad\/Papers\/Harnad\/harnad87.uncomp.htm\">peekaboo unicorn<\/a>&#8221; &#8212; a horse with a single horn that vanishes without a trace the instant anyone trains eyes or measuring instruments on it. And the property of being such a horse is also a referent, now available to fill in the blank for a predicate in a subject-predicate proposition. (The exercise of scaling this up to justice, truth and beauty is left to the reader. 
There&#8217;s a lot of structure left in them there formal shadows.)<\/small><\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"1024\" src=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave.png\" alt=\"\" class=\"wp-image-1867\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave.png 1024w, https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave-300x300.png 300w, https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave-150x150.png 150w, https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave-768x768.png 768w, https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-content\/uploads\/sites\/287\/2023\/05\/PlatoGptCave-100x100.png 100w\" sizes=\"auto, (max-width: 767px) 89vw, (max-width: 1000px) 54vw, (max-width: 1071px) 543px, 580px\" \/><figcaption><strong>Platonic Shades<\/strong> <strong>Meaning<\/strong> &#8212; by <em>Dall-e<\/em><\/figcaption><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>&#8220;Microsoft Says New A.I. Shows Signs of Human Reasoning&#8220; GPT definitely does not understand. It\u2019s just a computer program plus an unimaginably huge database of the words that countless real thinking people have written, in books, articles, and online media. 
The software does increasingly sophisticated \u201cfilling in the blanks\u201d when it answers questions, using that &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/2023\/05\/16\/semantic-ghosts-in-syntax\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Semantic Ghosts in Syntax&#8221;<\/span><\/a><\/p>\n","protected":false},"author":3074,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146,106],"tags":[],"class_list":["post-1853","post","type-post","status-publish","format-standard","hentry","category-chatgpt","category-language"],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/1853","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/users\/3074"}],"replies":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/comments?post=1853"}],"version-history":[{"count":8,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/1853\/revisions"}],"predecessor-version":[{"id":1883,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/posts\/1853\/revisions\/1883"}],"wp:attachment":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/media?parent=1853"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/skywritings\/wp-json\/wp\/v2\/categories?post=1853"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/generic.wordpress.soto
n.ac.uk\/skywritings\/wp-json\/wp\/v2\/tags?post=1853"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}