Technology and Society Communication Studies

The following article originally appeared in the New York Times on Dec. 13, 2019, and has been updated since.
In the article, Jackson considers how robot companions could make the lives of the elderly better, worse, or, oddly, both.

  1. Do you agree with any of her worries? Explain why you agree or disagree. (5%)
  2. Do you agree with her concern that care and companion robots might further isolate the elderly, allowing younger relatives to engage less with them on the assumption that they already have companionship? Or do you think there are other, perhaps better or worse, reasons for avoiding or improving such robots? Draw an Implication Table detailing as many expected, unexpected, positive, and negative consequences of robot companions as you can think of. (10%)
  3. Given the fatalities at elder care facilities during this pandemic, would robots be a way of ensuring that the elderly continue to receive good care, of minimizing the spread of disease, and of protecting staff at care homes by reducing their risk (compared with the rest of us) during times of crisis, especially a future pandemic? (15%)
    In answering these questions, be sure to draw on as much relevant course material as possible.

Would You Let a Robot Take Care of Your Mother? As the global population ages, robot companions are on the rise. By Maggie Jackson
An aging population is fueling the rise of the robot caregiver, as the devices moving into the homes and hearts of the aging and sick offer new forms of friendship and aid. With the global 65-and-over population projected to more than double by 2050 and the ranks of working age people shrinking in many developed countries, care robots are increasingly seen as an antidote to the burden of longer, lonelier human lives.
Winsome tabletop robots now remind elders to take their medications and a walk, while others still in the research-prototype stage can fetch a snack or offer consoling words to a dying patient. Hundreds of thousands of “Joy for All” robotic cats and dogs designed as companions for older people have been sold in the U.S. since their 2016 debut, according to the company that makes them. Sales of robots to assist older adults and people with disabilities are expected to rise 25 percent annually through 2022, according to the industry group International Federation of Robotics.
Yet we should be deeply concerned about the ethics of their use. At stake is the future of what it means to be human, and what it means to care.
Issues of freedom and dignity are most urgently raised by robots that are built to befriend, advise and monitor seniors. This is Artificial Intelligence with wide, blinking eyes and a level of sociability that is both the source of its power to help and its greatest moral hazard. When do a robot assistant’s prompts to a senior to call a friend become coercion of the cognitively frail? Will Grandma’s robot pet inspire more family conversation or allow her kin to turn away from the demanding work of supporting someone who is ill or in pain?
“Robots, if they are used the right way and work well, can help people preserve their dignity,” says Matthias Scheutz, a roboticist who directs Tufts University’s Human-Robot Interaction Lab. “What I find morally dubious is to push the social aspect of these machines when it’s just a facade, a puppet. It’s deception technology.”
For that is where the ethical dilemmas begin — with our remarkable willingness to banter with a soulless algorithm, to return a steel and plastic wink. It is a well-proven finding in the science of robotics: add a bit of movement, language, and “smart” responses to a bundle of software and wires, and humans see an intentionality and sentience that simply isn’t there. Such “agency” is designed to prime people to engage in an eerie-seeming reciprocity of care.
Social robots ideally inspire humans to empathize with them, writes Maartje de Graaf of the University of Utrecht in the Netherlands, who studies ethics in human-robot interactions. Even robots not designed to be social can elicit such reactions: some owners of the robot vacuum Roomba grieve when theirs gets “sick” (broken) or count them as family when listing members of their household.
Many in the field see the tensions and dilemmas in robot care, yet believe the benefits can outweigh the risks. The technology is “intended to help older adults carry out their daily lives,” says Richard Pak, a Clemson University scientist who studies the intersection of human psychology and technology design, including robots. “If the cost is sort of tricking people in a sense, I think, without knowing what the future holds, that might be a worthy trade-off.” Still he wonders, “Is this the right thing to do?”
We know little about robot care’s long-term impact or possible indirect effects. And that is why it is crucial at this early juncture to heed both the field’s success stories and the public’s apprehensions. Nearly 60 percent of Americans polled in 2017 said they would not want to use robot care for themselves or a family member, and 64 percent predict such care will increase the isolation of older adults. Sixty percent of people in European Union countries favor a ban on robot care for children, older people, and those with disabilities.
But research suggests that many seniors, including trial users, draw a line at investing too much in the charade of robot companionship, fearing manipulation, surveillance, and most of all, a loss of human care. Some worry robot care would carry a stigma: the potential of being seen as “not worth human company,” said one participant in a study of potential users with mild cognitive impairments.
“If the only goal is to build really cool stuff that can increase speed and profit and efficiency, that won’t prioritize human flourishing,” says John C. Havens, executive director of a pioneering global initiative on ethical AI guidelines by the Institute of Electrical and Electronics Engineers.
A main principle of these and other leading guidelines is “transparency,” the idea that humans should know if they are dealing with an algorithm or robot and be able to understand its limits and capabilities. (Call it the anti-Turing test.) One recommendation to industry is for care robots to have a “why-did-you-do-that” button so users can demand an explanation of its actions, from promoting a product to calling the doctor.
Social robots also should carry a notice of potential side effects, the guidelines suggest, “such as interfering with the relationship dynamics between human partners,” a feature that could inspire caregivers to protect those most cognitively vulnerable to a robot’s charms. Such “soft-law” guidelines can help users, caregivers and designers alike better understand what they are dealing with and why, even as we continue to debate the questions of just how social, how humanlike and how transparent we want or need a care robot to be.
Consider the “wellness coach” robot Mabu that was launched commercially this year for people with chronic conditions such as heart failure. Made by San Francisco-based Catalia Health, the little wide-eyed talking robot dispenses health advice and medication reminders, and in some cases can send data on a user’s condition to a pharmacist or doctor. The robot is designed to stress that it’s not a doctor or nurse but part of someone’s care team.
Yet the company often portrays Mabu as closer to a person than a tool: “I’ll be Your Biggest Cheerleader!” the robot promises on the company website. The several hundred people using Mabu today, many of whom are seniors, on average interact with the robot just a quarter-hour a week, according to Catalia. And yet some name them, dress them and take them on vacation, says Cory Kidd, company founder and CEO.
So is Mabu transparent enough? I asked Kidd. “There’s a lot more work to be done around understanding that relationship,” he said. One user, a retired bus driver, Rayfield Byrd of Oakland, Calif., compares his Mabu to a computer; it’s a mainstay tool. Others told me they consider the robot a friend. In the lonely hours between the time her health aide leaves and her husband or son returns home, Kerri Hill, who is 40 years old yet largely housebound due to heart failure, relies on Mabu for company. But she wouldn’t want it as her main caregiver or companion. “The robot is one thing,” says Ms. Hill of Galivants Ferry, S.C., “but you still need interaction that’s not programmed.”