Hospitals and health systems are adopting more data analysis and processing tools to try to improve patient care, which raises questions about when and how to integrate race and ethnicity data.
Racial data has become more complex as the US becomes more diverse, with a growing number of Americans identifying with more than one race or ethnicity.
According to the most recent decennial US census, the number of Americans who identify with at least two races has doubled in the past decade. According to The New York Times, it is now the fastest-growing racial and ethnic category.
This is a demographic shift that leaders need to keep in mind as the healthcare industry becomes increasingly data-driven. If an analytics or artificial intelligence tool factors into its predictions whether a patient is Black, white, or another race, it could, for example, produce misleading results for a patient who is both Black and white.
Multiracial patients are a growing population that needs to be accounted for in AI and other data-driven tools, according to Tina Hernandez-Boussard, a professor of medicine (biomedical informatics), biomedical data science and surgery at Stanford University.
If healthcare systems and software developers are not looking at how multiracial patients are handled by race-based algorithms or protocols, those models may not be reliable for this patient population, she said. That can undermine patients' trust in the healthcare system.
“It’s very difficult,” said Hernandez-Boussard. “By developing algorithms that are not specifically designed for this growing population, we are losing the trust of this community.”
In recent years, healthcare organizations have invested in data-driven scoring tools to flag patients who need additional care, are at risk of poor outcomes or may have other needs. More than three-quarters of acute and ambulatory care organizations use advanced analytics to assess population health, according to a survey from the College of Healthcare Information Management Executives.
Some of these tools – everything from simple risk equations to advanced AI – involve race, but not always in a way that accounts for the growing multiracial population of the United States.
How best to care for multiracial patients with risk calculators and predictive models that include race and ethnicity data is “an open question and a very active area of research,” said Dr. Michael Simonov, director of clinical informatics at health data company Truveta.
Several risk-prediction algorithms that have been used in medicine for years ask clinicians to indicate whether the patient is Black or white as part of their calculations.
A tool that estimates a patient’s 10-year risk of atherosclerotic cardiovascular disease requires the user to select the patient’s race as “white,” “African American” or “other,” which can leave ambiguity for a patient who is both Black and white, especially if the patient selected only one race on intake forms, or if the physician assumed a race based on the patient’s appearance.
This year the National Kidney Foundation and the American Society of Nephrology released an equation for estimating kidney function that does not include race, replacing an existing version that asked whether the patient was Black. A calculator used to predict a patient’s risk in attempting a vaginal delivery after a cesarean section in a previous pregnancy also dropped race this year.
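The kidney-function equation referenced here is the 2021 CKD-EPI creatinine refit, which estimates GFR from creatinine, age and sex alone. A minimal sketch in Python, using the published 2021 coefficients (the function name is illustrative):

```python
def egfr_ckd_epi_2021(serum_creatinine_mg_dl: float, age_years: int,
                      is_female: bool) -> float:
    """Estimate GFR (mL/min/1.73 m^2) with the 2021 CKD-EPI creatinine
    equation, which has no race term."""
    kappa = 0.7 if is_female else 0.9      # sex-specific creatinine threshold
    alpha = -0.241 if is_female else -0.302
    ratio = serum_creatinine_mg_dl / kappa
    egfr = (142
            * min(ratio, 1.0) ** alpha      # applies below the threshold
            * max(ratio, 1.0) ** -1.200     # applies above the threshold
            * 0.9938 ** age_years)          # age decay factor
    if is_female:
        egfr *= 1.012
    return egfr
```

Compared with the earlier version of the equation, the “if Black” multiplier is simply gone: the same inputs now produce the same estimate for every patient.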
“If a doctor has been trained to view race as a risk factor and is confronted with a patient who does not fit into a single racial category, it is very difficult to make the assessment they were trained in,” said Dr. Megan Mahoney, chief of staff at Stanford Health Care and a professor in Stanford University’s Department of Medicine.
“I don’t fall into any clean category for using their calculator,” added Mahoney, who is Black and white.
Mahoney said she would like more data-driven tools and calculators to follow the kidney-function equation’s lead and move away from race altogether.
Next generation medicine
Artificial intelligence, which has been touted for years as the future of healthcare, could provide an opportunity to incorporate multiracial and multi-ethnic data – if developers have the data to work with.
Unlike other approaches to analytics or modeling, which tend to hard-code certain types of data to calculate a result, advanced AI is more flexible: it can accept more variables, as well as complex and layered data it was not explicitly programmed to process, said Dr. Russ Cucina, chief medical information officer at UCSF Health.
But good algorithms start with good data.
For an artificial intelligence tool to generate generalizable insights, it needs to be trained on a large amount of data that reflects the population the tool will be used with.
To create an AI system, developers feed it training data that it uses to learn to identify features and detect patterns. If that dataset is small or omits some subpopulations, the system’s predictions and recommendations may be less accurate for those patient populations.
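The underrepresentation problem can be made concrete with a quick audit before training: count how each subgroup appears in the dataset and flag the ones too small to learn from reliably. A minimal sketch, where the category labels and the 5% cutoff are illustrative assumptions, not a standard:

```python
from collections import Counter

# Toy race/ethnicity field from patient records; in practice this
# would come from an EHR extract.
patients = (
    ["White"] * 700 + ["Black"] * 200 + ["Asian"] * 80 +
    ["Two or more races"] * 15 + ["Other"] * 5
)

counts = Counter(patients)
total = len(patients)
MIN_SHARE = 0.05  # illustrative cutoff: flag groups under 5% of the data

for group, n in counts.most_common():
    share = n / total
    flag = "  <-- underrepresented" if share < MIN_SHARE else ""
    print(f"{group:18s} {n:5d} ({share:5.1%}){flag}")
```

In this toy extract the “Two or more races” group falls below the cutoff, which is exactly the situation the article describes: a model trained on this data has little signal for multiracial patients.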
Healthcare providers and advocacy groups have increasingly pushed back against including race data in algorithms, arguing that race has been used inappropriately as a proxy for other variables associated with disease risk, such as ancestry, genetics, socioeconomic status or a patient’s environment. It would be more appropriate, they say, to use those variables rather than race.
But even if race is not included as a variable in an algorithm, a diverse dataset is still important for validating AI tools, so organizations can test a product on specific subpopulations and make sure it performs well across demographic groups.
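A common way to run that check is to compute the same performance metric separately for each demographic subgroup of a held-out test set, rather than a single overall score. A minimal sketch on synthetic data (the metric, labels and group names are illustrative):

```python
def accuracy_by_group(y_true, y_pred, groups):
    """Return overall accuracy plus accuracy within each subgroup."""
    by_group = {}
    for yt, yp, g in zip(y_true, y_pred, groups):
        correct, seen = by_group.get(g, (0, 0))
        by_group[g] = (correct + (yt == yp), seen + 1)
    overall = sum(c for c, _ in by_group.values()) / len(y_true)
    return overall, {g: c / n for g, (c, n) in by_group.items()}

# Synthetic example: the model looks passable overall but fails
# entirely for the small multiracial subgroup.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["White", "White", "White", "Black", "Black",
          "Multiracial", "Multiracial", "Multiracial"]

overall, per_group = accuracy_by_group(y_true, y_pred, groups)
# overall accuracy hides the per-group breakdown in per_group
```

An aggregate score of 62.5% here masks a subgroup where the model gets nothing right, which is why stratified evaluation matters even for race-free models.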
“We see many examples of problems that can arise when we do not have good, representative samples of data in developing these algorithms,” said Dr. Peter Embi, president and CEO of the Regenstrief Institute. Embi joins Vanderbilt University Medical Center as chair of the Department of Biomedical Informatics in January.
In dermatology, for example, researchers have reported that artificial intelligence tools for detecting skin cancer, trained primarily on images of light-skinned patients, may be less accurate for dark-skinned patients.
More research is needed to determine where accounting for multiple races or ethnicities will improve a predictive tool’s accuracy, said Suchi Saria, professor and director of the Machine Learning and Healthcare Lab at Johns Hopkins University and CEO of Bayesian Health, a company that develops AI for clinical decision support.
Getting the data right
But even collecting enough data from multiracial patients to train or test an artificial intelligence system is difficult.
Only about 10% of Americans are multiracial, and that is a diverse label in and of itself, encompassing people who may be white and Black, Black and Asian, or Asian and Native American, to name a few examples, not to mention patients who would choose more than two races.
Patient data is often not recorded in medical records in enough detail to identify multiracial patients.
Based on Bayesian Health’s experience with EHR data from hospital clients, Saria said she suspects multiracial patients are underreported in medical records.
Only about 1% of patients in the data the company has worked with were recorded as having multiple races, she said.
That may be because multiracial patients are often grouped into an “other” category, or because they select only one of the races they identify with.
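That kind of collapsing can happen silently in ordinary record-keeping code whenever a multi-select race field is forced into a single value. A sketch of the difference (both function names and the category labels are hypothetical, not from any specific EHR):

```python
def collapse_to_single(race_selections):
    """Lossy: a common pattern that forces multi-select race data
    into one category, hiding multiracial patients."""
    if len(race_selections) == 1:
        return race_selections[0]
    return "Other"  # everyone with 2+ selections becomes "Other"

def keep_multi_select(race_selections):
    """Lossless: preserve every selection so multiracial patients
    remain identifiable in downstream analyses."""
    return sorted(race_selections)

patient = ["Black", "White"]
# collapse_to_single(patient) returns "Other": the fact that this
# patient is multiracial is no longer recoverable from the record.
# keep_multi_select(patient) returns ["Black", "White"].
```

Once records pass through the lossy path, no amount of downstream analysis can recover the multiracial population the article is concerned with.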
Collecting enough data to research, develop and validate analytics, artificial intelligence and other data-driven tools will be key to making them work effectively for multiracial patients.
“If we had the data, then yes, the algorithm could properly address these problems,” said Hernandez-Boussard. “But the problem is that we don’t have the data to train [algorithms] accordingly.”