
Patients want the use of AI in healthcare to be transparent

BOSTON – Transforming the way healthcare providers communicate with patients using artificial intelligence is not only a matter of accuracy, transparency, fairness and careful stewardship of data models – it is also about finding ways to handle deeply personal concerns.

What patients want to know adds a further level of complexity – one that challenges the healthcare AI industry to consider both expected and unexpected patient data, according to panelists Thursday at the HIMSS AI in Healthcare Forum.

With the power of personalization – and by freeing doctors from data work so they can be more human in their interactions – artificial intelligence could transform patient-doctor interactions.

“To some extent, these fascinating tools – which are evolving faster than the health system can imagine how to deploy them – set the stage, or provide the best opportunity, to create that conversation about what’s important to that person, and to advise and support them in making relevant decisions,” said Anne Snowdon, chief research officer at HIMSS, the parent company of Healthcare IT News.

While the use of AI technologies is an important part of the conversation, trust – encompassing transparency, choice, autonomy and decision-making – is of the utmost importance to patients.

“From that point of view, it’s starting to redefine and rethink care,” said Snowdon, the panel’s moderator, who holds a doctorate in nursing.

Improving patient communication

Snowdon was joined by Alexandra Wright, patient advocate and director of research at HIMSS; Dr. Chethan Sarabu, director of clinical operations at Cornell Tech’s Health Tech Hub; Mark Polyak, president of analytics at Ipsos; and Dr. Lukasz Kowalczyk, a physician at Peak Gastroenterology Associates, for an in-depth discussion of what patients want from an AI-enabled healthcare experience.

“While healthcare is still grappling with the challenges of artificial intelligence, AI can elevate the conversation and build more trust,” said Sarabu, who is also a board member of the Light Collective, a nonprofit organization that seeks to advance the collective rights, interests and voices of patient communities in health technology.

In his work with the group, Sarabu said he heard during a patient insight session about a patient who thought she had been talking with a very helpful nurse named Jessica from her inpatient clinic – and who was dismayed when she asked for the nurse in person at her doctor’s office.

“She just said she wished they had told her it was a chatbot first,” he said.

“You shouldn’t deceive your patients,” said Kowalczyk, a GI specialist and advisor at Denver-based Cliexa, a digital health platform.

But as long as patients know that health chatbots like Jessica are not real people, AI’s ability to communicate with empathy can improve difficult situations for them.

“Compassion fatigue is a real thing in health care,” Kowalczyk said. “Sometimes it’s very difficult, especially as you go through the day, when one or two difficult patient encounters make it hard to prepare for the next one.”

He said that large language models are effective at transforming and translating information and summarizing patient concerns for clinicians, giving them a “second to breathe” and re-empathize.

“I think those are opportunities where patients feel that AI is acting as their agent, and it helps me understand who they are as a whole person.”

The dynamics of personalization

AI may not always improve the patient’s experience of care. In some cases – predictive analytics, for example – it can surface information that patients do not want.

“Perhaps some patients want more information, others may want less, or someone may want less information during an individual visit but more information to review afterward,” Sarabu said.

From the physician’s point of view, he added, “it is difficult to tailor all of that information to each patient’s conditions and issues.”

According to Polyak, attention has three aspects – getting attention, getting the right information and getting it at the right speed.

He noted that, among a group of patients using ChatGPT, 16% were asking healthcare questions with the goal of reducing their healthcare costs.

“[They] asked ChatGPT to give them a different view of how their doctors should approach their care based on what they have – in order to reduce costs.”

“That wasn’t something I really expected, but it was the kind of thing they would emphasize and bring” to those appointments, he said.

The perception of control also varies among patients.

For patients and their families facing a health crisis, “information is very powerful,” Wright said.

“A lot of times, when you’re in these situations, it can feel out of control,” she said.

And if you don’t really understand your situation or what’s going on, it can feel like you have no control over what’s happening to you.

When the doctor is no longer in the room and patients have questions, they turn to search engines and ChatGPT for information, she said.

Context also plays an important role in how much information patients want.

“When I first went to the hospital, would I have wanted them to tell me my risk of dying? Probably not, because I don’t think it would have helped the situation,” Wright said.

“But thinking about it now, if someone were to tell me about my risk of, let’s say, a future cancer, would I want to know if there’s anything I can do to prevent it? Maybe.”

What such granular communication suggests, said Snowdon, is turning the healthcare application of AI on its head: “How can we help people help themselves make those decisions, informed with confidence and trust in themselves, [to discover] what is most important to them?”

Andrea Fox is the senior editor of Healthcare IT News.
Email: afox@himss.org

Healthcare IT News is a publication of HIMSS Media.

