Articles of Interest: Best Heart Research Paper in 2020 and Underdiagnosis Bias in AI Decision-Making Models
This week, we look at the Heart Best Research Paper Award 2021 for the best research paper published in the journal Heart in 2020. We also see how AI-based decision models can underdiagnose under-served groups, an important issue if we are to achieve both efficiency and fairness from our decision support tools.
Heart Best Research Paper 2021: Risk of severe COVID-19 disease with ACE inhibitors and angiotensin receptor blockers
Patients with chronic conditions, like cardiovascular disease, are often on medications for long periods of time. When a new disease like COVID-19 arises, it is uncertain how the disease will interact with current medications and understanding these interactions is critical to both preventing adverse outcomes and providing insights into disease mechanisms.
Since SARS-CoV-2 interacts with ACE2 receptors, there was significant concern at the start of the pandemic that ACE inhibitors and angiotensin receptor blockers (ARBs) might be associated with increased susceptibility to, and progression of, severe COVID-19 disease. In 2020, Hippisley-Cox et al. published a large, open cohort study in Heart that looked at all patients aged 20-99 years across 1,205 general practices in England. This population-based study, which received the "Heart Best Research Paper Award 2021", showed that both ACE inhibitor and ARB prescriptions were associated with a decreased risk of COVID-19 disease, but had no association with increased risk of COVID-19 disease severe enough to result in ICU admission. Importantly, the authors also observed significant variation in risk between ethnic groups, with lessened protective effects and higher risk of severe COVID-19 disease associated with ARB use in Black, Asian, and minority ethnic (BAME) groups that were not explained by other factors, such as age, sex, or comorbidities.
Beyond its important results, this study also demonstrated the power of having established, high-quality databases that are representative of the population of interest. In this case, the ability to link general practitioner and ICU data to COVID-19 test records allowed the research group to quickly assess novel drug-disease interactions and identify variable effects between ethnic groups.
How does underdiagnosis bias in AI affect under-served groups?
As artificial intelligence (AI) becomes more pervasive in healthcare, particularly in radiology, we need to understand how algorithmic bias could negatively affect subpopulations and determine methods to mitigate these biases. Automatic screening tools for supporting triage have the potential to relieve burden on the healthcare system, but also present the risk of underdiagnosis, which is when an unhealthy patient is incorrectly categorized as healthy, resulting in lower triage priority.
This month in Nature Medicine, Seyyed-Kalantari et al. published a study assessing underdiagnosis bias in AI-based chest X-ray (CXR) prediction models. They found consistent underdiagnosis biases in under-served subpopulations, such as patients with lower socioeconomic status, female patients, and Black and Hispanic patients. These biases could stem from (1) automatic label extraction by natural language processing (NLP) or (2) existing bias within the clinical records themselves (bias amplification), but are likely not a result of the choice of algorithm.
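To make the metric concrete: underdiagnosis, as described above, is a false negative, so one natural way to quantify it is the false-negative rate computed separately for each subgroup. The sketch below is purely illustrative (the group names and data are synthetic, not from the study) and assumes binary labels where 1 means "unhealthy":

```python
# Illustrative sketch only: measuring a per-group underdiagnosis rate
# (false-negative rate) from synthetic labels and predictions.
# Group names and data here are hypothetical, not from the study.

def underdiagnosis_rate(y_true, y_pred):
    """False-negative rate: the fraction of truly unhealthy patients
    (y_true == 1) that the model labels as healthy (y_pred == 0)."""
    false_negatives = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    positives = sum(y_true)
    return false_negatives / positives if positives else 0.0

# Synthetic example: two subgroups with different error profiles.
groups = {
    "group_a": ([1, 1, 1, 0, 1, 0], [1, 1, 0, 0, 1, 0]),
    "group_b": ([1, 1, 1, 0, 1, 0], [1, 0, 0, 0, 0, 0]),
}

for name, (y_true, y_pred) in groups.items():
    print(f"{name}: underdiagnosis rate = {underdiagnosis_rate(y_true, y_pred):.2f}")
```

A gap between the two printed rates is exactly the kind of subgroup disparity the study reports: a model with acceptable aggregate accuracy can still assign low triage priority to unhealthy patients far more often in one group than another.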
The authors also point out that attempting to impose fairness after the fact (e.g., by selecting different classification thresholds for different groups) may not produce ethically desirable outcomes and may, for example, induce overdiagnosis disparities and increase uncertainty. This study is illustrative of the unintended consequences we as an industry must address as we move AI-based decision models from "paper to practice". In our opinion, a Decision Intelligence approach which considers the decision-making context and the qualities of the underlying data before embarking on a technical solution could help proactively identify, and potentially mitigate, these problems.
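The overdiagnosis trade-off mentioned above can be shown in a few lines. In this synthetic sketch (not the study's method; scores and labels are invented), lowering a group's decision threshold does reduce its underdiagnosis (false-negative) rate, but at the cost of a higher overdiagnosis (false-positive) rate:

```python
# Illustrative sketch, not the study's method: lowering the decision
# threshold for a group trades underdiagnosis (false negatives) for
# overdiagnosis (false positives). Scores and labels are synthetic.

def error_rates(scores, labels, threshold):
    """Return (false-negative rate, false-positive rate) when
    scores >= threshold are classified as diseased (label 1)."""
    preds = [1 if s >= threshold else 0 for s in scores]
    fn = sum(1 for p, t in zip(preds, labels) if t == 1 and p == 0)
    fp = sum(1 for p, t in zip(preds, labels) if t == 0 and p == 1)
    positives = sum(labels)
    negatives = len(labels) - positives
    return fn / positives, fp / negatives

# Synthetic risk scores for one subgroup.
scores = [0.2, 0.35, 0.45, 0.55, 0.7, 0.9]
labels = [0,   0,    1,    0,    1,   1]

for threshold in (0.6, 0.4):
    fnr, fpr = error_rates(scores, labels, threshold)
    print(f"threshold={threshold}: FNR={fnr:.2f}, FPR={fpr:.2f}")
```

Picking per-group thresholds to equalize one of these rates will generally unbalance the other, which is one reason the authors caution against post-hoc threshold adjustment as a complete fix.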