How Fatal Can Medical Device Biases Be?
Some pandemics last a few years, while others keep evolving into more lethal forms, causing ever-greater damage. Medical device bias is one of those everlasting pandemics, and we can thank COVID-19 for giving the issue the limelight it needed. According to a recent estimate, racial disparities in healthcare account for roughly $93 billion in excess medical costs and approximately $42 billion in lost productivity per year. Beyond other healthcare services, the data suggest that several medical devices are reliably accurate only for Caucasian patients and produce anomalous readings when used on people of color. Research from the Centers for Disease Control and Prevention (CDC) has also revealed large disparities in mortality rates, life expectancy, pregnancy-related deaths, the prevalence of chronic conditions, and mental and physical health between people of color and their white counterparts.
For example, high inaccuracy has been documented in bilirubin measurement devices for newborns. Newborns often show high bilirubin levels due to the breakdown of red blood cells, and if left unchecked, this can lead to permanent brain damage. Bilirubin levels are estimated with light-based tests that examine the infant's skin and eye color. These tests were found to be less accurate in Black newborns, making proper diagnosis and treatment difficult for a considerable part of the population. Since high bilirubin levels are quite common among Black newborns, a biased, inaccurate instrument puts lives at risk every day.
In the sections below, we discuss why this matter is significant, examples that deserve attention and correction without delay, plausible solutions, and the bottom line.
Medical Device Biases: What’s The Accuracy Level Of Health Monitors?
Inaccuracies in hospital monitoring devices are nothing new. Nearly every healthcare professional has had to verify a false alarm from even the most accurate machines. But such cases are rare, because high-stakes medical devices are extensively vetted and tested by their makers and by regulatory bodies, leaving little room for error. Things are different when it comes to home-based health monitoring devices.
Most personal health monitoring devices do not undergo extensive testing or vetting by their makers or by regulatory bodies. One study in this area found that while the accuracy of heart rate monitors in phone apps was close to 95%, the same could not be said for energy expenditure calculators. It also concluded that accuracy varied with the race of the person being monitored: the precision of many instruments depended on the user's skin color. The most prominent devices on the list were:
- Pulse Oximeters: Numerous tests have verified that pulse oximeters are less accurate for people of color. The device estimates blood oxygen by shining light through the skin and detecting the color of the blood, which varies with oxygen level. Skin pigment strongly influences how light penetrates the skin to reach the blood vessels, which can lead to inaccurate results. A study at the University of Michigan Hospital found that blood oxygen levels were overestimated in patients with darker skin, delaying required treatment and placing lives at risk.
- Respirator Masks: Respirator masks act as a barrier between health workers and contagious infections such as COVID-19. Protection depends on a good fit of the mask's Filtering Face Piece (FFP). However, studies published in 2020 and 2021 found that the masks did not fit well on health professionals from Asian backgrounds: fit rates ranged from 40 to 90 percent, with particularly poor results among Asian women. The 2021 study also found that mask designs were fitted primarily to Caucasian faces, leaving people of other races aside and endangering their lives.
- Spirometers: Racial bias has also been reported in the interpretation of data collected by spirometers, instruments used to measure lung capacity. Experts have questioned why most spirometers report lower lung capacity for Asian and Black people than for white individuals.
- Remote Plethysmography: This medical technology estimates pulse rate by tracking subtle variations in skin color. Achuta Kadambi, a computer scientist and electrical engineer at the University of California, notes that the visual cues used in this process are highly susceptible to bias caused by variations in subsurface melanin.
- Artificial Intelligence Systems: AI applications in medical devices and diagnostics are expanding rapidly worldwide. However, researchers have raised concerns that most image databases used to develop AI for skin cancer detection lack valid ethnicity or skin tone labels. Several institutes developing AI projects for different diseases have acknowledged this flaw and are now working to build “inclusive and diverse” datasets for their AI systems.
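The pulse oximeter mechanism described above can be illustrated numerically. A standard pulse oximeter computes a "ratio of ratios" R from the pulsatile (AC) and baseline (DC) components of red and infrared light absorption, then maps R to a blood oxygen estimate via an empirical calibration curve; a linear approximation often cited in textbooks is SpO2 ≈ 110 − 25R. The sketch below is purely illustrative: all signal values and the 10% pigment-attenuation factor are hypothetical, not taken from any real device or study, and serve only to show how a small pigment-dependent shift in one measured signal can inflate the reported reading.

```python
def ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """Pulse oximetry 'ratio of ratios': pulsatile/baseline absorbance, red vs. infrared."""
    return (ac_red / dc_red) / (ac_ir / dc_ir)

def spo2_from_ratio(r):
    """Empirical linear calibration commonly cited in textbooks: SpO2 ~= 110 - 25*R."""
    return 110.0 - 25.0 * r

# Illustrative signal values chosen so the true reading is 90% (R = 0.8).
ac_red, dc_red = 0.8, 100.0   # pulsatile / baseline red absorption (hypothetical)
ac_ir, dc_ir = 1.0, 100.0     # pulsatile / baseline infrared absorption (hypothetical)

r_true = ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir)
print(spo2_from_ratio(r_true))    # 90.0

# Hypothetical pigment effect: melanin attenuates the red pulsatile signal
# slightly more than the infrared one, lowering measured R and so inflating
# the reported SpO2 relative to the patient's true oxygen level.
r_biased = ratio_of_ratios(ac_red * 0.9, dc_red, ac_ir, dc_ir)
print(spo2_from_ratio(r_biased))  # 92.0 -- reads higher than the true 90%
```

A device calibrated mostly on lighter-skinned subjects would bake such an offset into its calibration curve, which is consistent with the overestimation pattern reported in the Michigan study.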
Why Is It Important To Acknowledge Medical Device Biases?
Usually, minute measurement errors have only a small or distant effect on a subject's health. But as technology dives deeper into diagnostics and health monitoring, high standards are needed to build an inclusive, unbiased healthcare system. In the case of pulse oximeters, for example, the reading determines whether a person needs intensive care. A false reading can easily be fatal simply because of an individual's darker skin color, and incidents like this worsen the very disparities humanity has been trying so hard to leave behind.
Ways To Fix Medical Device Biases
The answer begins with accepting that racial bias still sits at the core of healthcare education in the Western world. Though medical device bias came to light during the pandemic, its foundation was laid years earlier. Achuta Kadambi stresses the importance of evaluating new technology without bias and of making a conscious effort to remove physical bias from the mechanics of medical devices. Once that is dealt with, teams need to address interpretation bias, which lies more with the users than with the machines: clinicians must apply equal, non-racial standards to the data transmitted by medical devices and tests.
Final Word
Computational biases penetrate medical technologies when research and testing are performed on a homogeneous group of individuals. Deliberate efforts must be made to recruit people from different ethnic backgrounds to design and test medical technologies and devices. Universities and other higher-education institutions also need to build diversity into their research teams, and mandatory training on the fundamental effects of racism on health and the healthcare system should be standard. Together, such efforts may change the picture in the coming years by addressing the root causes of bias in both human judgment and device design.