An article in the Guardian suggests the use of artificial intelligence in caregiving is growing at a rapid pace. Computers are guiding decisions about elder care, driven by a shortage of caregivers, an ageing population and families wanting their ageing parents to stay in their own homes longer. A plethora of so-called "age tech" companies have sprung up over the last few years, including those designed to keep tabs on older adults, particularly those with cognitive decline. Their solutions are now beginning to permeate home care, assisted living and nursing facilities.
The technology can free up human caregivers so they can be "as efficient as potentially possible", says Majd Alwan, the executive director of the Center for Aging Services Technologies at LeadingAge, an organization representing non-profit ageing services providers.
But while there are potential benefits of the technology in terms of safety for older people and a reprieve for caregivers, some also worry about its potential harms. They raise questions around the accuracy of the systems, as well as about privacy, consent and the kind of world we want for our elders. “We’re introducing these products based on this enthusiasm that they’re better than what we have – and I think that’s an assumption,” says Alisa Grigorovich, a gerontologist who has also been studying the technology at the KITE-Toronto Rehabilitation Institute, University Health Network, Canada.
Computers are guiding decisions about older people’s care
Computers are increasingly guiding decisions about elder care – and tracking everything from toilet visits to whether someone has bathed.
Technology to help keep seniors safe has been in use for a long time – think life alert pendants and so-called "nanny cams" set up by families fearful their loved ones could be mistreated. But incorporating systems that use data to make decisions – what we now call AI – is new.
Increasingly cheap sensors collect vast amounts of data, which are then analyzed by computer programs known as algorithms to infer actions or patterns in activities of daily living and to detect when things might be off.
A fall, "wandering behaviour", or a change in the number or duration of bathroom visits that might signal a health condition such as a urinary tract infection or dehydration are just some of the things that trigger alerts to carers. The systems use everything from motion sensors to cameras and even lidar, a type of laser scanning used by self-driving cars, to monitor spaces. Others monitor individuals using wearables.
Detection of falls
In late 2019, AI-based fall detection technology from SafelyYou, a Bay Area startup, was installed at the Trousdale to monitor its 23 apartments (it is turned on in all but one apartment, where the family didn't consent). A single camera, positioned unobtrusively high on each bedroom wall, continuously monitors the scene.
If the system, which has been trained on SafelyYou's ever-expanding library of falls, detects a fall, staff are alerted. The footage, which is kept only if an event triggers the system, can then be viewed in the Trousdale's control room by paramedics to help decide whether someone needs to go to hospital – did they hit their head? – and by designated staff to analyze what changes could prevent the person falling again.
Implications of digital technologies
For Clara Berridge, who studies the implications of digital technologies used in elder care at the University of Washington, privacy intrusion on older adults is one of the most worrying risks. She also fears the technology could reduce human interaction and hands-on care – already lacking in many places – further still, worsening social isolation for older people.
In 2014, Berridge interviewed 20 non-cognitively-impaired elder residents in a low-income independent living apartment building that used an AI-based monitoring system called QuietCare, based on motion detection. It triggered an operator call to residents – escalating to family members if necessary – in cases such as a possible bathroom fall, not leaving the bedroom, a significant drop in overall activity or a significant change in nighttime bathroom use.
What she found was damning. The expectation of routine built into the system disrupted the elders' activities and caused them to change their behaviour to try to avoid unnecessary alerts that might bother family members. One woman stopped sleeping in her recliner because she was afraid the system would register inactivity and trigger an alert. Others rushed in the bathroom, fearful of the consequences if they stayed too long.
Some residents begged for the sensors to be removed – though others were so lonely they tried to game the system so they could chat with the operator.
Social care will never have sufficient resources to meet the demands placed upon it, and some providers may well see a role for artificial intelligence in caregiving going forward. In the studies covered in this article, AI was primarily used to monitor behaviour and activity within residential apartments in the USA, with the aim of preventing risk. But who is to say it will not encroach on care provision within the UK in the future?
Clara Berridge's research identified an alarming downside to the use of AI: some residents begged to have the sensors removed. I have no doubt that enthusiasts of AI will drive the agenda, citing staff shortages and risk reduction as adequate reason for its introduction. But surely one thing we have all learned during the pandemic is how much human beings value the company of others. Purveyors of AI would do well to bear this in mind.
Albert Cook BA, MA & Fellow Chartered Quality Institute, Managing Director, Bettal Quality Consultancy