How to identify, prevent bias in EHRs

They then used machine learning to parse through more than 40,000 clinical records from 33,000 patient encounters from the University of Chicago Academic Medical Center between January 2019 and October 2020. Of the 18,500 patients included in the study, approximately 61% were Black, 30% were white, 6% were Hispanic or Latino and 3.5% were categorized as “other.” In total, 8.2% of the patients had one or more negative descriptors in their medical history.

Fifteen common patient descriptors were used to pinpoint stigmatizing language, including non-adherent, aggressive, agitated, angry, challenging, combative, non-compliant, confront, non-cooperative, defensive, exaggerate, hysterical, unpleasant, refuse and resist. Sun said the commonality of these terms across EHRs reflects a cultural norm of not properly articulating a patient’s barriers to health.
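The study used machine learning to detect these terms, but the core idea of scanning notes for the fifteen descriptors can be illustrated with a simple keyword sketch. The code below is a hypothetical illustration, not the authors' method: it uses naive pattern matching and, unlike the study's approach, does not account for negated mentions such as "not aggressive."

```python
import re

# The fifteen descriptors reported in the study; stems cover variants
# like "non-compliant"/"noncompliant" and "refused"/"refusing".
DESCRIPTORS = [
    "non-?adherent", "aggressive", "agitated", "angry", "challenging",
    "combative", "non-?compliant", "confront", "non-?cooperative",
    "defensive", "exaggerat", "hysterical", "unpleasant", "refus", "resist",
]
PATTERN = re.compile(r"\b(" + "|".join(DESCRIPTORS) + r")\w*", re.IGNORECASE)

def flag_descriptors(note: str) -> list[str]:
    """Return stigmatizing descriptors found in a clinical note, in order."""
    return [m.group(0).lower() for m in PATTERN.finditer(note)]

note = "Patient was agitated and refused medication; noncompliant with plan."
print(flag_descriptors(note))  # ['agitated', 'refused', 'noncompliant']
```

A real pipeline would need negation handling and context awareness, which is why the researchers turned to machine learning rather than keyword lists.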

“There is a pattern of words that we’re using that are shortcuts and we are doing a disservice to our patients by not affording them the full context, their full story,” he said.

For example, a physician might label a patient "noncompliant" when the patient actually lacks health literacy and misunderstood what they were supposed to do. Understanding that difference can help put a patient back on a treatment plan that works for them and improves outcomes.

The next step in the research will be to explore the link between negative descriptors in a patient's electronic health record and clinical outcomes, Sun said. The report does not directly tie poor medical outcomes to implicit bias, but it notes other research, including a study that found doctors with high measures of implicit bias were more verbally dominant with Black patients and a report indicating that bias in healthcare is associated with lower levels of patient adherence.


The report also explains how electronic health records can perpetuate bias and stigma among clinicians. The authors cited a 2018 study that found medical providers were more likely to have a negative perception of a patient's pain when presented with a chart containing stigmatizing language, like "frequent flier."

“It would not be hard to imagine the different types of interactions they might be having,” he said. “This will certainly be a follow up area of study for us, but we anticipate that these descriptors are having some effect as far as the doctor-patient relationship, and also the many healthcare provider-to-patient relationships that will happen during a patient’s hospital stay.”

Researchers found the use of stigmatizing language lessened in 2020. Sun said that after the COVID-19 pandemic began, and amid a national reckoning following the murder of George Floyd, clinicians were less likely to use a negative descriptor in an EHR. He said the findings illustrate clinicians' ability to check their biases and hesitate before using negative descriptors in their charts, especially when describing a patient of color or marginalized identity. It could also reflect a growing interest among providers in addressing cultural incompetencies within their operations.

“It surprised us at first because we thought that the pandemic as a whole, as a stressful environment, would cause people to use more cognitive shortcuts or stereotypes, relying on bias or using bias a little bit more. I think it’s really encouraging to find that it actually decreased during the pandemic,” he said. “I hope that people think about this as an opportunity to tell a patient’s full story, and provide them more compassionate and empathetic care. It is certainly within our grasp, it just takes a little bit more intention.”
