Ethical considerations for using digital systems in nursing care
There is already a shortage of skilled workers in the care sector – and the number of people in need of care is growing. Could robots help alleviate the strain on the care system in the future? Theologian Christof Mandry considers some of the ethical implications of using robots in nursing care.

A robotic arm whirs toward the bed and helps to lift a 90-year-old man onto a commode chair. A device measures his blood pressure automatically and reminds him to take his medication. If he were to fall, the sensors in the rug in front of his bed would alert the emergency services. Is this the future of nursing care? Perhaps – but it might also turn out quite differently.
“There are many different applications for robots in nursing care,” says Christof Mandry, Professor of Moral Theology and Social Ethics at Goethe University Frankfurt. “But they always involve ethical questions.” Security, privacy and human dignity play a role, not to mention what is possible, what we really want and the impact on the nursing profession.
Growing crisis in the care sector
It is clear that the nursing profession will have to change: With an ageing population, increasing numbers of elderly people in Germany are in need of care, and there is already a shortage of carers. Most of us presumably want to keep our independence for as long as possible in old age, and, when we can no longer care for ourselves, we expect to be cared for in a loving and dignified way.
However, “Deutsches Pflegehilfswerk”, a non-profit umbrella organization of care providers in Germany, reports a shortage of more than 150,000 qualified staff needed to ensure adequate provision in geriatric care alone. Already today, outpatient care services often have to turn away patients in need of homecare – or limit their services due to a lack of staff.
The future is far from rosy. Figures from various institutes and foundations all confirm the same trend: The need for care services will continue to grow. According to the Federal Statistical Office, the number of people in Germany who require care will increase by 37 percent, from around 5 million at the end of 2021 to around 6.8 million in 2055. Around half a million carers will retire in the next ten years, exacerbating an already fraught situation. There is also a shortage of young professionals – and dropout rates during training are high.
Could robots help close this gap a little, for example by relieving carers of certain tasks? “Robots might make the profession more attractive by taking on heavy or routine tasks,” says Professor Mandry. One of the most physically demanding tasks in nursing care is lifting someone out of bed, onto the toilet and back into bed again, or repositioning them in bed if they are unable to move by themselves. According to a survey by the Federal Institute for Occupational Safety and Health, the physical strain in nursing care is above average – for example, around 85 percent of respondents report musculoskeletal complaints. Robots could assist with these physically demanding tasks.
Three types of robots for geriatric care

Robots rather than carers could give injections, but this raises ethical issues. Photo: Miriam Doerr/Martin Frommherz/Shutterstock
Robots are not just there for heavy lifting, but can also assist with other tasks, such as providing companionship or helping people with their personal hygiene. Yet nursing care has always been associated with close human contact: If robots come into play, what would that mean? “Introducing robots would have consequences, of course, not only for those in need of care but also for carers, their self-image and job description,” says Mandry. The German Ethics Council (GEC) has addressed this issue and in 2020 published the position paper “Robotics for Good Care”. In this statement, the GEC distinguishes between three types of robots in nursing care: assistive robots, monitoring robots and robot companions.
Assistive robots help care recipients and carers with their everyday tasks. According to the GEC, they are simple systems that often fulfil only one purpose. For example, they might help patients to drink, assist with personal hygiene or prepare medication. Lifting aids also fall into this category. Such robots can either ease activities that are strenuous for human staff or enable people in need of care to live at home for longer.
Monitoring robots can measure certain body functions such as pulse or blood pressure or remind people with memory problems to drink something or take their medication. “They can also help people live independently in their own homes for longer,” says Mandry.
Finally, there is a third type of robot that is designed to offer social support. Some designs resemble animals, such as “Paro” the seal or “Aibo” the robot dog. They respond to touch and speech and may relieve stress or feelings of loneliness.
“Ultimately, the question facing all systems is how we can use them to counteract the shortage of carers in an ageing society,” says Mandry. Importantly, robots should not replace carers but support them, a view that is also shared by the GEC.
Enthusiasm has given way to skepticism
A few years ago, the response to robots in the care sector was extremely enthusiastic. It was hoped that robotics could help people live independently and self-sufficiently for longer, and some supporters were even confident that complex systems could replace carers more or less completely. It was the time when humanoid robots such as “Pepper” were being developed and introduced into care facilities.
Pepper was jointly developed in France and Japan as a kind of companion robot and programmed to recognize human gestures and facial expressions. With large child-like eyes and a round head, Pepper was intended for salesrooms and reception desks as well as care homes. “Disenchantment has set in,” says Mandry. “It is clear that robots should be seen more as an aid and cannot replace carers – nor should they.” Incidentally, production of the “Pepper” robot was paused in 2021 due to insufficient demand.
“We need to remember that robots not only save work but also create it,” says Mandry. They need maintenance, for example – and may not always work or at least not always as planned. The question is therefore not only to what extent the use of robots in nursing care is ethically justifiable and what points need considering here but also what is possible, expedient and affordable.
Robots for individual tasks

Incidentally, robots were not originally designed for the care sector but for industry, where they have long been used for routine tasks such as automotive manufacturing. Industrial robots often work separately from human staff and always perform the same repetitive tasks. That makes them neither social companions nor suitable for use in people’s homes, particularly as many activities in nursing care do not follow exactly the same procedure every time.
“The goal today is more for robots to work together with humans,” says Mandry. They could relieve the burden on carers or help those in need of care when a carer is not available. In this scenario, the robot assumes the role of an assistant, while caring for humans is left to humans. “Robots can certainly make tasks easier for carers,” says Mandry.
Advanced smart home systems that report when a person has fallen or has not left their bed all day could also be useful in nursing care. “This is similar to emergency call buttons, which are often used today, but would include a sensor system,” says Mandry. Such systems raise many ethical concerns. Although they offer security and allow people to live in their own homes for longer, they could also lay those homes open to constant observation. How does it affect people to be monitored throughout their everyday life because cameras film their behavior or sensors record their every move? “Is my home then still my own private domain?” asks Mandry, posing a key question that raises further concerns: “Would we choose not to do things that we might otherwise be doing? Or do we forget that we are being watched and lay ourselves bare?”
Privacy and security are always an issue with monitoring systems. “Such monitoring systems are by nature very vulnerable and that makes us more vulnerable, too,” says Mandry. What happens, for example, if a network that connects a care recipient with their carer fails? Or if sensitive data is leaked because it is not as well protected as it should be? “These are not showstopper arguments against such systems, but important considerations around their use,” says Professor Mandry.
The use of robots also raises the question of how far they are allowed to intervene. What happens if a person is unwell and needs medical attention? Their blood pressure or sugar levels could be too high – or perhaps they have not taken important medications. “It’s unthinkable that a robot would then come and give the patient an injection,” says Mandry. “That would be irresponsible, and a human must always make the final decision.” This view is also shared by the German Ethics Council. But what is the point of being able to live at home for longer if you are potentially isolated?
Robotic systems might help against loneliness. However, are we deceiving people when robots simulate human emotions and reactions? That aside, there are also economic issues: Who will pay for such systems? And what makes financial sense? “Robotic systems require a lot of resources, raw materials and energy,” says Mandry. In most cases, people’s homes have to be retrofitted. “It is clear that we can’t afford everything.” But what should be given priority?
Using robots would also significantly change professional training and everyday life. “Care would become much more of a technical profession than it is to date,” says Mandry. And are carers prepared to accept robots as their assistants? “Carers might fear being replaced by a robot at some point,” says Mandry. The care profession might become specialized and differentiate between human-to-human care and a more technical approach with corresponding devices, which may attract more people to the profession than is currently the case. “But so far, this is all hypothetical.”
It is often said that other countries, such as Japan, are much more advanced in this area, but this impression is misleading. A 2020 study by the Institute for Health and Social Research (IGES) in Berlin, for example, shows that so far hardly any robotic systems have been used in direct nursing activities. Obstacles include financing problems, a lack of mature technical solutions and the extensive time required to introduce new technology.
Saving time with artificial intelligence
Nevertheless, there are already some promising practical examples. Goethe University Frankfurt offers the Master’s degree program “Social Ethics in Healthcare”. A student on this course recently completed a practical semester on robotics in nursing, reports Mandry. “Her work focused on an assistive system that helps with documentation. It is a very interesting option because it frees up time for actual patient care.” Care documentation is very time-consuming, but it is important because it protects and legally safeguards carers. “The capability of AI language models has increased enormously,” says Mandry. “I find such systems very promising, as they have the potential to make everyday care much easier.”
Acceptance of such systems will change over time – depending on which generation is providing care and which one is receiving it. “Many people today are growing up with AI applications as a matter of course,” says Mandry. “Sometimes we don’t even know whether we’re chatting with a human or a bot.” The situation in nursing care, however, is certainly different: Mandry is convinced that social communication with a robot can hardly replace having another person in the room: “Human-to-human contact will not lose its importance in nursing care in the future.”

About / Christof Mandry, born in 1968, is Professor of Moral Theology and Social Ethics at Goethe University Frankfurt. After studying Catholic Theology in Tübingen and Paris, he completed his doctoral degree with a dissertation on the relationship between theological and philosophical ethics. He was a postdoctoral fellow at the Max Weber Center for Advanced Cultural and Social Studies at the University of Erfurt and from 2004 to 2006 Visiting Professor for Christian Social Ethics at the Catholic University of Applied Social Sciences (KHSB) in Berlin. His postdoctoral thesis (Habilitation) dealt with Europe as a community of values. Mandry’s main research interests lie in theology and culture, political ethics (in particular Europe and the European Union), ethical questions of medicine and health, as well as fundamental questions and applications of theological ethics and Christian social ethics.
mandry@em.uni-frankfurt.de

The author / Maria Berentzen, born in 1983, studied German, political science and communication studies and works as a freelance journalist. She enjoys writing about artificial intelligence and the opportunities, possibilities and risks associated with developments in AI.
post@mariaberentzen.de