Our ageing population has prompted a move towards replacing carers with robots – but not everyone is convinced of the merits of such machines.
A range of robots are already being used in the disability, aged care, education, and health sectors, and countries such as Japan see robots playing a key role in filling their workforce gaps.
A number of Australian residential aged care facilities are also using Paro, a therapeutic robot that looks and sounds like a baby harp seal.

Paro interacts by moving its head, heavily lashed wide eyes and flippers, making sounds, and responding to particular forms of touch on its furry coat.
Paro has been used extensively in aged care in the United States, Europe and parts of Asia, typically among people living with dementia.
While such robots have proved useful, concerns have been expressed about reduced privacy, exposure to data hacking, and the risk of physical harm.
There is also limited evidence as to the potential long-term implications of human-machine interactions.
A research report published late last year by the Australia and New Zealand School of Government assessed the role of government as a steward in shaping the framework under which these technologies are introduced.
It found that the sector, to date, has been largely driven by the interests of technology suppliers.
It also found that providers are not always engaging in critical analysis of these technologies.
Robots can help draw in potential clients, but there could be issues with addiction and reliance on the robot in the long term.
As artificial intelligence develops, robots will acquire different levels of capability for “knowing” the human they are caring for. This raises concerns about potential hacking and security breaches. It also raises questions of inequity if different levels of care are available at different price points.
Participants were also concerned about the unintended consequences of robot relationships for human relationships. Families may feel that a robot proxy provides sufficient companionship, for instance, and leave their aged relative socially isolated.
The study suggested a responsive regulatory approach which relies on the sector to self- and peer-regulate, and to escalate issues as they arise.
Many respondents called for the establishment of industry standards to protect against data and privacy threats, as well as the loss of jobs.
The study authors said governments have a responsibility to ensure vulnerable people aren’t exploited or harmed by technologies.
They must also ensure robots don't replace human care or lead to greater social isolation.