Artificial intelligence set to transform the patient experience

ORLANDO – From Watson to Siri, Alexa to Cortana, consumers and patients have become much more familiar with artificial intelligence and natural language processing in recent years. Pick your terminology: machine learning, cognitive computing, neural networks/deep learning. All are becoming more commonplace – in our smartphones, in our kitchens – and as they continue to evolve at a rapid pace, expectations are high for how they’ll impact healthcare.

Skepticism is, too. And even fear.

As it sparks equal parts doubt and hope (and not a little hype) from patients, physicians and technologists, a panel of IT experts at HIMSS17 discussed the future of AI in healthcare on Sunday afternoon.

Kenneth Kleinberg, managing director at The Advisory Board Company, spoke with execs from two medical AI startups: Cory Kidd, CEO of Catalia Health, and Jay Parkinson, MD, founder and CMO of Sherpaa.

Catalia developed a small robot, the Mabu Personal Healthcare Companion, aimed at assisting with “long-term patient engagement.” It’s able to have tailored conversations with patients that can evolve over time as the platform – developed using principles of behavioral psychology – gains daily data about treatment plans, health challenges and outcomes.

Sherpaa is billed as an “on-demand doctor practice” that connects subscribers with physicians, via its app, who can make diagnoses, order lab tests and imaging and prescribe medications at locations near the patient. “Seventy percent of the time, the doctors have a diagnosis,” said Parkinson. “Most cases can be solved virtually.” Rather than just a virtual care platform, it enables “care coordination with local clinicians in the community,” he said.

In this fast-changing environment, there are many questions to ask: “We’re starting to see these AI systems appear in other parts of our lives,” said Kleinberg. “How valuable are they? How capable are they? What kind of authority will these systems attain?”

And also: “What does it mean to be a physician and patient in this new age?”

Kidd said he’s a “big believer – when it’s used right.”

Parkinson agreed: “It has to be targeted to be successful.”

Another important question: For all the hype and enthusiasm about AI, “where on the inflection curve are we?” asked Kleinberg. “Is it going to take off and get a lot better? And does it offer more benefits at the patient engagement level? Or as an assistant to clinicians?”

For Kidd, it’s clearly the former, as Catalia’s technology deploys AI to help patients manage their own chronic conditions.

“The kinds of algorithms we’re developing, we’re building up psychological models of patients with every encounter,” he explained. “We start with two types of psychologies: The psychology of relationships – how people develop relationships over time – as well as the psychology of behavior change: How do we choose the right technique to use with this person right now?”

The platform also gets “smarter” as it becomes more attuned to “what we call our biographical model, which is kind of a catch-all for everything else we learn in conversation,” he said. “This man has a couple cats, this woman’s son calls her every Sunday afternoon, whatever it might be that we’ll use later in conversations.”

Consumer applications driving clinical innovations
AI is fast advancing in healthcare in large part because it’s evolving so quickly in the consumer space. Take Apple’s Siri, for instance: “The more you talk to it, the better it makes our product,” said Kidd. “Literally. We’re licensing the same voice recognition and voice output technology that’s running on your iPhone right now.”

For his part, Parkinson sees problems with simply adding AI technology onto the doctor-patient relationship as it currently exists. Most healthcare encounters involve “an oral conversation between doctor and patient,” he said, where “retention is 15 percent or less.”

For AI to truly be an effective augmentation of clinical practices, that conversation “needs to be less oral and more text-driven,” he said. “I’m worried about layering AI on a broken delivery process.”

But machine learning is starting to change the game in areas large and small throughout healthcare. Kleinberg pointed to the area of image recognition. IBM, for instance, made headlines when it acquired Merge Healthcare for $1 billion in 2015, allowing Watson to “see” medical images – the largest data source in healthcare.

Then there are the various iPhone apps that say they can help diagnose skin cancer with photos users take of their own moles. Kleinberg said he mentioned the apps to a dermatologist friend of his.

“I want to quote him very carefully: He said, ‘Naaaaahhhhhh.’”

But Parkinson took a different view: “About 25 percent of our cases have photos attached,” he said. “Right now, if it’s a weird mole we’re sending people out to see a dermatologist. But I would totally love to replace that (doctor) with a robot. And I don’t think that’s too far off.”

In the near term, however, “you would be amazed at the image quality that people taking photographs think are good photographs,” he said. “So there’s a lot of education for the patient about how to take a picture.”

The patient’s view
If artificial intelligence is having a promising if controversial impact so far on the clinical side, one of the most important aspects of this evolution still has some questions to answer. Most notably: What do patients think?

On one hand, Kleinberg pointed to AI pilots where patients paired with humanoid robots “felt a sense of loss” after the test ended. “One woman followed the robot out and waved goodbye to it.”

On the other, “some people are horrified that we would be letting machines play a part in a role that should be played by humans,” he said.

The big question, then: “Do we have a place now for society and a system such as this?” he asked.

“The first time I put something like this in a patient’s home was 10 years ago now,” said Kidd. “We’ve seen, with the various versions of AI and robots, that people can develop an attachment to them. At the same time, a typical conversation is two or three minutes. It’s not like people spend all day talking with these.”

It’s essential, he argued, to be up front with patients about just what the technology can and should do.

“How you introduce this, and how you couch the terminology around this technology and what it can and can’t do is actually very important in making it effective for patients,” said Kidd. “We don’t try to convince anyone that this is a doctor or a nurse. As long as we set up the relationship in the right way so people understand how it works and what it can do, it can be very effective.

“There is this cultural conception that AI and robotics can be scary,” he conceded. “But what I’ve seen, putting this in front of patients, is that this is a tool that can do something and be very effective, and people like it a lot.”
