Older adults are often left out of decisions surrounding technology related to their care, such as location trackers and companion robots, new research finds.

Clara Berridge, associate professor of social work at the University of Washington, studies issues facing older adults, in particular technology that can support care or a person’s ability to live independently.

She recently published two articles on older adults and technology. In the Journal of Elder Policy and Frontiers in Psychology, Berridge explores older adults’ opinions of companion robots, finding that such devices may not provide the blanket comfort or utility their creators presume, and that older adults have an interest in data protections.

“Older adults have been learning about, adapting, and integrating technology solutions into their lives for longer than anyone,” Berridge says. “Older adults’ feelings about technologies on offer to them for care and living at home, and their creative use, resistance, and other interactions with these technologies should be taken seriously.

“So much research, time, and money has been focused on pushing acceptance of technologies that could be better spent enabling control by older adults over direction, purpose, and design.”

Here, Berridge explains the importance of involving older adults in the design and use of technology:

What do you think is important for people outside the field to understand?

One of the themes in my research is that older adults are rarely empowered to refuse or negotiate how technology is used in their care. That doesn’t prevent many from refusing and negotiating anyway; older adults are not passive users. But this kind of engagement is often discouraged in design and implementation.

My research on long-term care in people’s homes and in residential facilities has found that people are not meaningfully engaged in decisions about how and what data should be collected about them. There’s a misperception that most don’t want to be involved or consulted. Older adults are often configured as passive data points.

This matters because embedded in these technologies are certain values (e.g., that safety justifies invasion of privacy) or limitations on the older adult (e.g., they can’t deviate from routine, as younger adults can, without triggering an intervention).

With the algorithmic management of care, the person being targeted by technology may not share the priorities embedded in the devices themselves. When the technology practice enables older adults to be controlled, rather than enabling them to have control, this intensifies unbalanced power dynamics in care. It can mean restrictions on, or the exercise of control over, their lives.

There’s a lot of ageism at play in how technologies used in care are developed, hyped, and implemented. And ableism, particularly when it comes to dementia.

Read the full article about technology and eldercare by Kim Eckart at Futurity.