Challenges of Recent Innovations

As part of a recent DHC Summit, we invited medical ethics expert Sara Gerke (currently a research fellow at Harvard) to share her perspective on the impact and challenges of recent innovations. Because she has studied the implications of ingestible devices and artificial intelligence with regard to patient privacy, she was able to offer several points of consideration for marketers charged with understanding the opportunity for their companies.

A summary of her discussion of the challenges of recent innovations is provided below.

Digital Health Innovation

In discussions of the cutting edge of medicine, few topics have drawn as much attention as digital health. Despite its vast potential, digital health also raises ethical challenges. Stakeholders are using digital health in their efforts to improve access to healthcare, increase the quality of products while reducing cost and inefficiency, and enable precision medicine, with its goal of tailoring treatment more closely to patient needs. The use of smartphones, mobile health apps, and wearables provides new ways to manage health and wellness.

Key IES Challenges

An IES, or ingestible electronic sensor, is usually combined with a medicine or taken as an embedded part of the drug. Once ingested into the human body, it can communicate with a wearable sensor capable of detecting and recording data, such as the time of medication intake, as well as behavioral and physiological metrics. The wearable sensor then forwards the collected information to a compatible computing device, which processes and displays the data. The system can also be connected to a cloud database that enables data sharing, for example with a family member or a physician.
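To make the data path concrete, the flow above can be sketched as a simple pipeline. This is a minimal illustration only; the class names and fields below (IngestionEvent, WearableSensor, CompanionApp, and so on) are hypothetical and do not refer to any particular IES product or vendor API.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Hypothetical illustration of the IES data flow described above:
# sensor in the pill -> wearable sensor -> companion app -> optional cloud sharing.

@dataclass
class IngestionEvent:
    medication: str
    taken_at: datetime        # time of medication intake
    heart_rate: int           # example physiological metric
    activity_level: str       # example behavioral metric

@dataclass
class WearableSensor:
    """Detects and records events transmitted by the sensor embedded in the pill."""
    events: List[IngestionEvent] = field(default_factory=list)

    def detect(self, event: IngestionEvent) -> None:
        self.events.append(event)

class CompanionApp:
    """Receives data from the wearable sensor, then processes and displays it."""

    def __init__(self, cloud_recipients: Optional[List[str]] = None):
        # Recipients (e.g. a family member or physician) who have consented to sharing.
        self.cloud_recipients = cloud_recipients or []

    def sync(self, sensor: WearableSensor) -> None:
        for event in sensor.events:
            print(f"{event.taken_at:%Y-%m-%d %H:%M} took {event.medication} "
                  f"(HR {event.heart_rate}, activity: {event.activity_level})")
            for recipient in self.cloud_recipients:
                self.share_to_cloud(recipient, event)

    def share_to_cloud(self, recipient: str, event: IngestionEvent) -> None:
        # Placeholder for an upload to a cloud database shared with the recipient.
        print(f"  shared with {recipient}")

# Example usage with made-up values.
sensor = WearableSensor()
sensor.detect(IngestionEvent("medication_x", datetime(2019, 6, 4, 8, 30), 72, "resting"))
app = CompanionApp(cloud_recipients=["physician@example.org"])
app.sync(sensor)
```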

IES hold great potential to transform healthcare for the better. However, they also raise ethical challenges that stakeholders need to be aware of and address as early as possible in the development of these products. It is helpful to group the issues into three categories: patient, provider, and social issues. IES raise a number of patient issues, especially the ethical issues of autonomy and informed consent.

1. Informed Consent

Because IES will involve therapies with elements that are unfamiliar to patients, it is particularly important to ensure fully voluntary and informed consent.

2. Frequent Updates

Because IES products include a software component, such as a companion app, they involve user agreements that many patients have trouble understanding and that most users routinely ignore. The need for frequent software updates, each of which may change those terms, therefore raises its own challenges.

3. Data Privacy

The data collected by IES products also raise a multitude of issues, including questions about the doctor-patient relationship and the related issue of medical confidentiality. The availability of this data in the hands of third parties might have implications for life insurance premiums, employment opportunities, and personal relationships.

4. Data Usage Transparency

Finally, IES makers must be transparent about the future use of the collected data and the terms surrounding that use.

Key AI Challenges

One of the most exciting developments is the use of artificial intelligence in medicine. AI can assist in examining vast amounts of data, such as the results of diagnostic tests, to make predictions and recommendations tailored to the characteristics of an individual patient. In the context of health and research, it is expected that AI will be increasingly applied in four areas.

  • First, clinical applications, such as imaging, diagnostics, and autonomous or assisted robotic surgery.
  • Second, patient and consumer use, such as AI health apps and chatbots.
  • Third, the R&D process, such as accelerating drug discovery and making clinical trials more efficient.
  • Fourth, workflow optimization, such as reducing costs and supporting the training of clinicians.

AI products are already in clinical use in the US. In particular, AI shows great promise in the areas of diagnostics and imaging, e.g., the Viz.ai stroke device, OsteoDetect, and Arterys oncology imaging interpretation. AI has huge potential to transform healthcare for the better but, at the same time, also raises ethical challenges. In particular, there are four key challenges that need to be addressed.

1. Informed Consent

Informed consent will be one of the most immediate challenges in integrating AI into clinical practice. To what extent do clinicians have a responsibility to educate the patient about the complexities of AI, including the forms of machine learning used by the system, the kinds of data inputs, and the possibility of biases or other shortcomings in the data being used? These questions are especially challenging to answer in cases where the AI operates using black-box algorithms.

2. Safety and Transparency

Safety represents one of the most serious challenges for AI in healthcare. For example, IBM Watson for Oncology uses machine learning algorithms to assess information from patients’ medical records and help clinicians explore cancer treatment options. The system has recently come under criticism for reportedly giving unsafe and incorrect recommendations for cancer treatments.

3. Algorithm Fairness and Biases

Data needs to be used in a way that minimizes the potential for bias. Experts are concerned that AI could simply automate human biases, such as gender and racial biases, rather than remove them.
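One concrete way to surface such biases is to compare a model's outcomes across demographic groups. The sketch below is a hypothetical audit using made-up data and a simple positive-rate comparison (often called a demographic parity check); it is not a method from the talk, just an illustration of what probing for automated bias can look like.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Hypothetical audit: compare the rate of positive predictions (e.g. "recommend
# further testing") across demographic groups to flag possible bias.

def positive_rates(predictions: List[Tuple[str, int]]) -> Dict[str, float]:
    """predictions: list of (group, prediction) pairs, with prediction in {0, 1}."""
    totals: Dict[str, int] = defaultdict(int)
    positives: Dict[str, int] = defaultdict(int)
    for group, pred in predictions:
        totals[group] += 1
        positives[group] += pred
    return {group: positives[group] / totals[group] for group in totals}

def parity_gap(rates: Dict[str, float]) -> float:
    """Largest difference in positive rates between any two groups."""
    return max(rates.values()) - min(rates.values())

# Toy example data (entirely made up for illustration).
preds = [("group_a", 1)] * 80 + [("group_a", 0)] * 20 \
      + [("group_b", 1)] * 55 + [("group_b", 0)] * 45

rates = positive_rates(preds)
print(rates)                                   # {'group_a': 0.8, 'group_b': 0.55}
print(f"parity gap: {parity_gap(rates):.2f}")  # 0.25 -- a gap worth investigating
```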

4. Data Privacy

It is essential to diversify data so that it reflects different populations adequately and ensures fairness. A lack of patient data across geographies, races, and socioeconomic groups creates biases and may prevent the most vulnerable patient groups from benefiting from new AI-based technologies. In the service of safety and patient confidence, some amount of transparency must also be ensured. Transparency creates trust among stakeholders, particularly clinicians and patients, which is key to the successful implementation of AI in clinical practice.

Summary

Digital health, smart pills, wearables, and AI all have huge potential to transform healthcare for the better, but at the same time they create ethical challenges that need to be addressed at the earliest stages of the product development process. The goal should be ethics by design, rather than ethics considered only after a product has been designed and tested.