AI in elder care: ten recommendations

dutchhealthhub
November 16, 2023
6 min

The eldercare industry increasingly sees AI as an opportunity to address the challenges surrounding an aging society. Ten recommendations from practitioners.

Vilans and ActiZ have published a white paper that highlights these opportunities in elder care. Think of automating administrative processes, supporting decision-making and promoting clients' self-reliance. This can improve the quality, efficiency and affordability of care. Alongside practical examples, the white paper contains ten recommendations from practice. We summarize them here.

1. Start from vision and strategy

As an organization, you may be considering getting started with AI, whether by using an existing AI application within your own organization or by (co-)developing a new one. In practice, this requires a clear picture of where your organization wants to go with the deployment of data and AI. How does this fit within the broader goals of the organization, and within the vision of care and the primary care process? What is needed to achieve valuable applications of AI? A vision and strategy form the basis of a successful implementation of AI applications.

2. Immerse yourself in practice

What do caregivers, clients and their loved ones encounter in daily care practice, and how can AI offer a solution? Spend time on the work floor and talk regularly with employees. That way you can identify where the bottlenecks are and start developing a specific AI application together from there.

3. Position AI as a support, not a replacement

Some workers worry that healthcare technology will take over their jobs in the future. But AI is, for example, far less capable of being caring, and human caregivers, with their knowledge, skill, experience and common sense, will simply remain at the helm in providing, and deciding on, care and support.

4. Commit to a new way of working

Employees need time to learn how to work with the new application and to embrace the new process. It is important that they are adequately facilitated and trained in working with an AI application. This may also become a new task for "digicoaches," professionals within healthcare providers tasked with supporting colleagues in improving their digital skills. Enthusiastic champions, sometimes called 'super users,' also play a major role in embedding innovation: you need them both to get started with AI and to take its use in practice to the next level.

5. Pay attention to both opportunities and risks

In healthcare, as in society and most businesses, we have the best intentions when it comes to deploying technology. But under the pressures of the day, people are often tempted to postpone thinking about the ethical and social implications of technology, or to leave it to other experts. However, it is important for everyone to be aware that the deployment of AI carries risks in addition to opportunities. These include dehumanization of care, problematization and stigmatization of old age, and invasion of privacy and autonomy.

It is also important that there is room for clients and caregivers to make their own conscious choices about what is appropriate in an individual's context. There are many examples of AI systems whose outcomes are not representative, or are even unfair to certain individuals or groups of people. The cause is often that the AI models were trained on flawed data sets. Since it is very difficult to completely eliminate bias from datasets and decision rules for AI, it is all the more important that users do not treat outcomes and advice from AI as decisive. They must be able to critically assess whether and to what extent these fit the individual and/or situation.
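Purely as an illustration of this last point (not taken from the white paper), the sketch below shows one simple way to check whether an AI application's advice is skewed across client groups before acting on it; the column names, sample data and alert threshold are all hypothetical.

    # Illustrative sketch only: compare an AI model's advice across client groups.
    # Column names, data and the 0.3 threshold are hypothetical examples.
    import pandas as pd

    # Hypothetical sample: model advice joined with a client attribute.
    df = pd.DataFrame({
        "age_group": ["65-75", "65-75", "75-85", "75-85", "85+", "85+"],
        "ai_recommends_extra_care": [1, 0, 1, 1, 0, 0],
    })

    # Share of clients per group for whom the model recommends extra care.
    rates = df.groupby("age_group")["ai_recommends_extra_care"].mean()
    print(rates)

    # A large gap is not proof of unfairness, but it is a signal that caregivers
    # should review the advice extra critically for the groups involved.
    if rates.max() - rates.min() > 0.3:
        print("Large difference between groups - review advice before acting on it.")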

6. Work from a multidisciplinary approach

The deployment of AI requires looking at the broader organizational context and at the social and cultural changes involved. It is important that those involved understand both healthcare and technology, and thus know what works at the technical level and in everyday healthcare practice. This requires mutual understanding between developers and healthcare experts. Roles such as that of the Chief Nursing Information Officer (CNIO) can help bridge IT and daily care practice.

7. Get the (technical) basics in order

Getting started with AI requires healthcare organizations to have good information management and IT architecture and to have their data and data management in order. To (co-)develop and/or implement useful and reliable AI applications in the organization, data of sufficient variety, volume and quality is often required. This is often where ground still needs to be gained before data and AI can be properly applied in healthcare.

With regard to data availability, it is important to distinguish between care-related data and other data. Care-related data refers to data on the content of care and health that care professionals record in an electronic client dossier (ECD). Other data is (mostly) recorded in the ECD as well, or in other linked business systems such as a planning system.
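As a minimal sketch of what this distinction can look like in practice, the example below separates care-related data from planning data in a small data model; all class and field names are invented for illustration and do not come from the white paper.

    # Illustrative sketch: keep care-related data (recorded in the ECD) apart
    # from other operational data (e.g. from a planning system). All names here
    # are hypothetical; real systems will differ.
    from dataclasses import dataclass
    from datetime import date, datetime

    @dataclass
    class CareRecord:            # care-related data: content of care and health
        client_id: str
        reported_by: str         # care professional who recorded it in the ECD
        observation: str
        recorded_at: datetime

    @dataclass
    class PlanningRecord:        # other data: operational, from a linked system
        client_id: str
        visit_date: date
        scheduled_minutes: int

    # Keeping the two kinds of data apart makes it easier to decide, per AI
    # application, which data may be used and under which conditions.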

8. Respond to laws and regulations in a timely manner

Good visibility of, and compliance with, relevant laws and regulations appears to be an important prerequisite. Strict rules apply in healthcare to the use of data and to the development and marketing of products. Among other things, products must be tested for safety, effectiveness and accuracy before they can be applied in practice. This testing is often an extensive and costly process for the developer. To help safeguard the ethical and legal aspects of deploying AI, the Care Transformation Model includes, among other things, a "Guide to Applications and Algorithms in Healthcare," which refers to guidelines and principles for the use of AI in healthcare.

For AI systems deployed in the delivery of healthcare, sector-specific European legislation such as the Medical Device Regulation (MDR) also comes into play. In addition, specific legislation for AI will come into effect in the near future: the Artificial Intelligence Act (AI Act). This regulates the legal framework for AI systems and, like the MDR, takes a risk-based approach.

9. Guarantee usability and transparency

The use of an AI application must be well integrated into practice, in such a way that the user has to take no, or only a minimal number of, additional steps to perform tasks with the AI application. Suppose a caregiver uses voice-based reporting in the ECD: then you need dictation software that is integrated into the ECD in a user-friendly way.
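Purely as an illustration of that "no extra steps" principle, a hypothetical integration could look like the sketch below, where transcribing and saving the report happen in a single action; the interfaces shown are invented and do not refer to an existing ECD or dictation product.

    # Hypothetical sketch: voice reporting integrated into the ECD so that the
    # caregiver performs one action instead of dictating, copying and pasting.
    from typing import Protocol

    class SpeechToText(Protocol):
        def transcribe(self, audio: bytes) -> str: ...

    class ClientDossier(Protocol):
        def add_report(self, client_id: str, text: str) -> None: ...

    def voice_report(audio: bytes, client_id: str,
                     stt: SpeechToText, ecd: ClientDossier) -> str:
        """Transcribe the recording and store it directly in the client's dossier."""
        text = stt.transcribe(audio)
        ecd.add_report(client_id, text)  # a single step from the caregiver's point of view
        return text                      # returned so the caregiver can quickly check and edit it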

10. Collaborate with the industry

Valuable applications of AI can start small, from ideas or problems for which solutions are developed and tested iteratively. At the same time, it can be (too) challenging for individual healthcare providers to develop, test and implement AI applications on their own. For AI applications to succeed in practice, scale is generally important, for example in terms of innovation capacity and the data available as input for AI. Collaboration can be the answer.

Read more? Then download the white paper AI in elder care here.


Check out this article and more at dutchhealthhub.nl
