Prediction in Fear: AI in Healthcare

Disruptive technologies such as AI and machine learning can be powerful tools to drive better health outcomes. But in a fear-based economy, AI-enabled predictions need to be fully understood within the context of each individual person and their circumstances. MedicalDirector's CEO, Matthew Bardsley, discusses.

Universally, we transact in two things: fear or pleasure. Every decision we make comes back to one of these two things, and within each of those two utilities there's a moral compass.

But equity in pleasure isn't valued as highly as equity in fear.

Let's take this example. You go on holiday to Fiji and stay at a five-star resort on a beautiful island. Your colleague also goes on holiday to Fiji, but stays on the mainland in cheaper accommodation. Back at work, you share your very different experiences with each other, happily swapping holiday snaps and stories.

Now let's take an example in healthcare. You are diagnosed with a rare cancer; it's treatable, but the drugs aren't available on the PBS. You meet with a support group to discuss the impact this is having on your life, only to find that many people in the group have been able to afford the drugs and are in recovery. Your experiences are inequitable, and it just doesn't feel right.

The moral question arises: why should someone be disadvantaged, in their most vulnerable state, because of their economic position? Equity is a real issue when it comes to the moral compass in healthcare in a fear-based economy.

This takes us to the issue of prediction and fear. There is a real moral issue with prediction in fear, compared to prediction in pleasure.

Prediction in pleasure is easy to digest. You go on Amazon, search for outdoor activity ideas, and the machine learning behind the customer experience suggests rollerblades. You say 'these look great!' and buy them, but later find you didn't actually use them that much. It's not a big issue; you find something else you like.

Prediction in a fear-based economy, however, is a very different experience. Imagine if an AI-enabled health prediction tool told you, 'you're going to die when you're 70.' You then make all your life decisions around that prediction, but come your 70th birthday, you're still around — but alone and broke, with no future plans in place.

The impact prediction has in a fear-based economy is very real, with far greater implications for our individual decisions. If the fear doesn't manifest, or turns out to be greater than what was initially predicted, the outcome is much harder for the human mind to cope with and take on board. So it has to be managed a lot more carefully.

And it is the effective management of predictions in a fear-based economy that we need to bear in mind as we innovate in healthcare.

AI and machine learning can drive better health outcomes, but only if each prediction is understood within the context of the individual person and their circumstances. The message then needs to be delivered with the right empathy and sensitivity.

One example is a simple birth control alert or reminder, which would seem quite straightforward to communicate to a patient. But if that patient is in an abusive relationship, the message needs to be conveyed in a very different way.

For AI to truly be a powerful tool in healthcare and an enabler of better health outcomes, we need to make sure it is applied respectfully, with a level of empathy in the system and a layer of emotional artificial intelligence, before it can be unleashed into the healthcare ecosystem. Otherwise, just one health prediction delivered incorrectly can create a life-or-death situation for a patient, and stifle innovation.

Moving forward, innovators in healthcare need to understand and respect this required layer of emotional AI, lean into the problem and understand its repercussions, so that healthcare can enjoy the same capabilities as other parts of the market.

See Matthew Bardsley discuss this topic in more detail at Informa's AI, Machine Learning & Robotics in Health event in November 2018.

This article was originally published on LinkedIn.