The University of Southampton

‘Watch’ this space: how our favourite devices could detect Alzheimer’s disease.

What is Alzheimer’s Disease?

Having close personal experience with dementia, I often think about the subtle changes in behaviour and function that I noticed, but initially overlooked, in family members who were (much) later diagnosed. Alzheimer’s disease (AD) is the most common form of dementia, accounting for 60-80% of dementia cases(1). It is a degenerative brain disease in which cell damage leads to complex brain changes that gradually worsen over time. Crucially, the slow progression of AD means that behavioural and physiological signs, or ‘biomarkers’, may present long before a diagnosis is even considered. I imagine many people who care for or love someone with dementia have had similar experiences, particularly as AD currently has no effective treatment and current methods of clinical diagnosis are criticised for cultural bias and inaccuracy. Early diagnosis is essential, as the chance of reversing anatomical and physiological changes decreases dramatically as the disease advances(2). Personally, I believe one of the most important benefits of early diagnosis is that it allows people with dementia to open a dialogue about their ideal care plan and express autonomy over their future.

The inspirational Wendy Mitchell

I highly recommend reading “What I Wish People Knew About Dementia” by Wendy Mitchell, which provides great insight into how early diagnosis and acceptance of dementia can help individuals maintain independence and autonomy.

Towards Digital Detection

A Labrador retriever trained to sniff out COVID-19.

The race to improve diagnosis and treatment for AD is on. In 2023, a $200,000 reward was even advertised for anyone who could prove that dogs can sniff out Alzheimer’s disease. Whilst MRI and PET molecular imaging of beta-amyloid and tau proteins are perhaps more promising, the cost and invasive nature of such methods preclude practical clinical application(3).

Interestingly, digital consumer technology can overcome these limitations, with the added benefit of pre-symptomatic detection. What’s amazing is that cognitive, behavioural, sensory, and motor biomarkers can aid detection of AD 10 to 15 years before a clinical diagnosis would typically be made(4). This means we can leverage existing technologies and sensors, such as smartphone microphones and GPS systems.
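To make that a little more concrete, here is a minimal sketch of how one simple motor biomarker, step-to-step gait variability, might be pulled from smartphone accelerometer data. The file name, column names, sample rate, and peak-detection settings are all illustrative assumptions on my part, not a published pipeline.

```python
# Toy sketch of one "digital biomarker": gait variability estimated from
# smartphone accelerometer data. The CSV file and its columns (ax, ay, az)
# are hypothetical; real studies use far richer pipelines.
import numpy as np
from scipy.signal import find_peaks

FS = 50  # assumed sampling rate of the accelerometer, in Hz

# Assumed file: one accelerometer reading per row, with a header row.
data = np.genfromtxt("walk_session.csv", delimiter=",", names=True)
magnitude = np.sqrt(data["ax"]**2 + data["ay"]**2 + data["az"]**2)

# Each peak in acceleration magnitude roughly corresponds to one step;
# distance=FS//2 enforces at least half a second between detected steps.
peaks, _ = find_peaks(magnitude, height=magnitude.mean(), distance=FS // 2)

# Step-to-step interval variability: one simple candidate gait biomarker.
step_intervals = np.diff(peaks) / FS              # seconds between steps
variability = np.std(step_intervals) / np.mean(step_intervals)
print(f"Mean step interval: {np.mean(step_intervals):.2f} s")
print(f"Gait variability (coefficient of variation): {variability:.3f}")
```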

Watch this quick video I put together to find out more…

Original video, inspired by Kourtis et al. (2019).

Innovation or Invasion?

Whilst I do own both a smartphone and an (albeit old) smartwatch, the concept of these devices continuously monitoring everything from my speech to my walk, and even the way I view an Instagram post, makes me slightly uneasy. However, I can’t deny the advantages of personal continuous monitoring for public health. The question is: where do we draw the line? Could we be heading down a path where continuous passive monitoring involves cameras wired up in our homes, even in our toilets?! (Apparently yes! Find out more here.)

I think the answer is purely personal: for some people continuous passive monitoring may be the difference between life and death, for others it might feel a little too 1984. For people experiencing cognitive decline, informed consent may complicate matters further. Under the Mental Capacity Act (2005), many people with dementia may be considered ‘incompetent adults’ (I’m not a fan of that term) if they cannot understand the device, or cannot retain or communicate the reasoning for its use. This means they would be legally unable to consent. Thankfully, the growing ubiquity of wearable technologies may mean that, in the future, many people will own these devices decades before they are at high risk of AD, and can therefore choose (while they have legal capacity) whether or not to install bio-monitoring software. Of course, if these devices remain solely commercial, their financial accessibility may be limited, which is a whole other debate in itself.

Original image: the pros and cons of digital biomarkers.

Summary

Given our increasingly technological cultural landscape, access to devices capable of passive monitoring is growing. Considering the UK’s Alzheimer’s epidemic and our ageing population, it seems a waste not to make use of the enormous potential health benefits of these devices (especially as many already monitor us for consumer metrics anyway)! Although the degree of monitoring may seem invasive, I think Alzheimer’s is a far bigger threat to our personal privacy and autonomy. Such developments could help people communicate with their families and manage symptoms before it’s too late.

References

  1. Alzheimer’s Association. Dementia vs. Alzheimer’s Disease: What Is the Difference? | alz.org
  2. Kourtis, L.C. et al. 2019. npj Digital Medicine. 2, 9. https://doi.org/10.1038/s41746-019-0084-2
  3. Bao, W. et al. 2021. Front. Aging Neurosci. https://doi.org/10.3389/fnagi.2021.624330
  4. Vrahatis, A.G. 2023. Sensors. 23, 9. https://doi.org/10.3390/s23094184
  5. Stringer, G. et al. 2018. Int J Geriatr Psychiatry. 33, 7. https://doi.org/10.1002/gps.4863
  6. Sun, J. et al. 2022. Front. Hum. Neurosci. 16. https://doi.org/10.3389/fnhum.2022.972773

Where do cochlear implants fit in Deaf culture? 

After watching the film ‘Sound of Metal’, I realised that my previous perceptions of hearing loss didn’t consider the personal nuances and complexities that are integral to the Deaf community. The film follows a drummer who suddenly loses hearing in both ears, and offers a highly personal portrayal of the different perspectives on hearing loss and the difficulties of adjusting to cochlear implants (CIs). What struck me most (spoiler alert!) was the main character’s initial disappointment on being fitted with a CI, and the reaction of the Deaf community he lived with to his decision. Following a fascinating lecture from Nicci Campbell, I decided to explore the perceptions of cochlear implants within Deaf culture further.

What Are Cochlear Implants? 

Diagram of an in-situ cochlear implant. (NIDCD, Cochlear Implants).

Cochlear implants are small electronic devices that provide a sense of sound to profoundly deaf or severely hard-of-hearing individuals (NIDCD). The device picks up sound through a microphone; a speech processor then organises the sound and transmits it as an electrical signal to an electrode array, which delivers the electrical impulses to different regions of the auditory nerve (NIDCD).
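As a rough illustration of that pipeline, here is a minimal sketch of the band-splitting and envelope extraction a speech processor performs before mapping stimulation levels onto electrodes. The channel count, band edges, and test signal are illustrative assumptions, not clinical values; real processors use many more channels and proprietary strategies.

```python
# Toy sketch of the sound-to-electrode pipeline: split audio into frequency
# bands and extract each band's envelope, which a real speech processor
# would map onto electrode stimulation levels.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 16_000                                    # assumed sample rate (Hz)
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)            # stand-in for microphone input

# Illustrative 4-channel filterbank spanning part of the speech range.
band_edges = [300, 700, 1500, 3000, 6000]      # Hz, assumed values
for i, (low, high) in enumerate(zip(band_edges[:-1], band_edges[1:])):
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    band = sosfiltfilt(sos, audio)             # isolate this frequency band
    envelope = np.abs(hilbert(band))           # envelope ~ stimulation level
    print(f"Channel {i + 1} ({low}-{high} Hz): mean envelope {envelope.mean():.4f}")
```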

Concerningly, socioeconomic status can influence the outcomes of cochlear implant surgery, particularly in children (Sharma et al., 2020). I was unpleasantly surprised to learn that adults are only entitled to one CI on the NHS. Whilst a single CI may provide sufficient access to auditory stimulation, this could intensify the socioeconomic divide in treatment for hearing loss and limit the improvement in quality of life for individuals who can’t afford a second implant. One reality star, Daisy Kent, spoke about her hearing loss and stated that since she had the implant she doesn’t have ringing in her left ear, but “in my right ear, I have a ton of ringing”. I think this illustrates how having only one CI can hold back a much more desirable outcome for those who can’t afford two.

Deaf Culture 

Prior to watching ‘Sound of Metal’, I perhaps wouldn’t have considered that cochlear implants could be such a controversial topic. However, it is clear that individual perspectives, particularly within the Deaf community, vary quite dramatically (Li et al., 2024). In the film, the main character joins a deaf school, and is told to leave once he secretly pays for cochlear implant surgery.

Some members of the Deaf community see CIs as a threat to Deaf culture. I think this highlights the rich history of communication and adaptation among people with hearing loss. To understand this further, I have included a brilliant TED Talk by Glenna Cooper.

I particularly enjoyed her statement that deaf people tend to have a much greater appreciation for the exchange of information, and I think this reinforces her point that deaf people should not be considered disabled, but rather that they “have a different language”.

You can read more about Deaf culture here.  

A Middle Ground

Sign language is perhaps the most obvious facet of Deaf culture. However, I was horrified to learn that, not too long ago, many doctors told parents to discourage their deaf children from signing. This is just one of the reasons why I can appreciate the sensitivity of assuming all deaf people may benefit from auditory aids, which may lead to a decline in the use of sign language.

That said, it is also important to appreciate experiences where cochlear implants have created a unique path between both ways of life. Heather Artinian, a lawyer who was born deaf to deaf parents, decided to have cochlear implant surgery at age 10, against her parents’ initial wishes. She describes how she operates in the ‘Heather world’, where her upbringing amongst a Deaf community, together with her implant, allows her to enjoy aspects of both the hearing and the deaf world. I would highly encourage listening to her engaging and positive perspective on being in ‘not the hearing or deaf world’: https://youtu.be/jhm5OaXJVMQ?si=TU_DSGcD-m-fEeqq

In an ideal world, we would all be more accommodating of Deaf culture, and more people would aim to learn sign language.  

You can follow this link to find out how to start learning sign language. You can also learn how to sign your own name, and other words here.

 

Celebrating diversity and appreciating different ways of experiencing the world opens up new perspectives and solutions in healthcare. Utilitarian approaches that aim to serve the majority can threaten minority cultures, such as Deaf culture, which risk being excluded by attempts to ‘fix’ what many people consider a significant part of their identity. I believe that whilst the development of CIs has provided many people with access to a better quality of life, reduced social isolation and greater comfort, we shouldn’t immediately assume that anatomical differences need to be universally ‘fixed’ rather than accommodated and respected, whether that be through learning BSL or providing equitable access to assisted hearing technology.

Links

NIDCD https://www.nidcd.nih.gov/health/cochlear-implants

Sharma et al (2020) https://doi.org/10.1016/j.ijporl.2020.109984

Li et al (2024) https://doi.org/10.1038/s41598-024-55006-8

Dawn of the RoboDogs

Global Veterinary Orthotics-Prosthetics Industry Projecting US$ 164.2 Million Valuation by 2033 with a 9.5% CAGR | FMI – FMIBlog

An article from Future Market Insights (FMI) reports a surge in the US veterinary orthotics-prosthetics market, which surpassed a valuation of $66.5 million in 2023. According to FMI, the market share could rocket to $164.2 million by 2033.
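As a quick sanity check on those figures, compounding $66.5 million at 9.5% over the ten years from 2023 to 2033 lands close to the reported $164.2 million, which suggests the quoted CAGR is simply rounded:

```python
# Sanity check on the FMI projection: does a 9.5% CAGR take $66.5M (2023)
# to roughly $164.2M by 2033? Ten compounding years assumed.
start_value, cagr, years = 66.5, 0.095, 10
projected = start_value * (1 + cagr) ** years
print(f"Projected 2033 valuation: ${projected:.1f} million")
# Prints ~$164.8 million; close to the reported $164.2M, so the quoted
# 9.5% CAGR is presumably rounded from a slightly smaller figure.
```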

This is perhaps unsurprising considering the strong bond between humans and their pets; however, the article suggests the driving force behind the market growth is a rising number of animal injuries.