Journal Article, Volume 7, 2024

Care Bots & The Issue of Deception

Ila Kacker


Suggested Citation

Ila Kacker. “Care Bots & The Issue of Deception.” A Priori, vol. 7, 2024, pp. 53–66.

Abstract

The aging population in America is growing faster now than ever. However, we lack the proper infrastructure and resources to care for it adequately. Those involved in the field of elder care are experimenting with solutions to this problem, and one of the most promising is the use of artificial intelligence, namely care bots. Care bots are a specific type of technology that aims to provide physical and emotional support for the vulnerable elderly population. While the practical benefits of care bots are evident, the ethical implications relating to social isolation, paternalism, and deception must also be considered before they can be implemented as caregivers. With a specific focus on the issue of deception, I will demonstrate that certain types of care bots, such as those that simulate a reciprocal relationship between the bot and the care-receiver, are inherently deceptive and therefore immoral. Other types, such as nurse bots, may be ethical, as they do not attempt to simulate a reciprocal relationship, and they act in a manner consistent with benevolence rather than care.

An Aging America

The aging population in America is greater now than ever: over the past 100 years, the population of those 65 and older has grown almost five times faster than the rest of the population. Many researchers have turned to artificial intelligence to address this issue. AI is beginning to be used more frequently in healthcare contexts, and one technology in particular, coined "the care bot," is among the most promising options to revolutionize elder care.

Care Bot Technology

A care bot is, generally, a robot that provides care and support for vulnerable people suffering from mental and physical ailments. Many companies have developed their own versions of care bots, including the Care-O-bot, Robear, and the Actron MentorBot™, all robots that can help the care-receiver with tasks such as household chores, lifting a patient from bed to wheelchair, and reminding patients to take their medication. One example that has garnered particular attention is PARO, an interactive robot made to look like a seal, for its ability to interact with and comfort dementia patients by making sounds and responding to touch.

Ethical Implications of Care Bot Technologies

Prior research has noted that the most fruitful ways care bots can aid in elder care consist of assisting elders with their daily tasks, monitoring their behavior and health, and providing companionship. Each of these three benefits, however, also has the potential to yield negative ethical implications. A care bot assisting with the care-receiver's everyday tasks may leave that elder with little to no human interaction, leading to social isolation; monitoring their behavior and health may foster a paternalistic attitude and decreased freedom; and providing companionship may amount to deception, as it is not possible for a robot to truly care for a human being.

The last, and most commonly discussed, ethical issue is deception: the concern that the companionship care bots provide may be mistaken for a genuine, caring relationship. To care for someone, one must (a) perform the right actions to exhibit care for another, and (b) perform those actions for the right reasons. Care bots are unable to truly care; instead, they mimic caring acts such as protecting and helping the care-receiver. Care bots like PARO can be guilty of deceiving the care-receiver because, by reacting when petted and making affective noises, PARO intentionally simulates a caring relationship and evokes feelings of care in elders. Since PARO is programmed to act in this manner, the deception is quite literally coded into its function. Thus, I argue that care bots like PARO are inherently deceptive and therefore morally unfavorable.

The case is quite different for nurse bots, whose encoded function is primarily to aid care-receivers with tasks and reminders; any companionship or emotional support is a benefit of their physical presence alone. Nurse bots are not guilty of deception: their purpose was never to evoke feelings of care in the receiver, but simply to assist them. Furthermore, nurse bots simulate behaviors of benevolence rather than care, and as such, the companionship that accompanies them cannot be considered deceptive.

Looking to the Future

In conclusion, this paper has demonstrated that while care bots have the potential to revolutionize elder care, they also raise ethical considerations that must be addressed, especially the issues of social isolation, paternalism, and deception. I have attempted to show that certain care bot technologies, such as nurse bots, are morally permissible because the charge of deception does not apply to them, whereas deception is quite obvious in other technologies such as PARO. Care bots surely have a place in the future of healthcare; however, significant ethical work must be done first to ensure that human dignity and principles are upheld.