
Researchers Discover Racial Bias in AI Diagnosis

2022.06.09
Is artificial intelligence (AI) racist? Assistant Professor Kuo Po-Chih of the Department of Computer Science, together with colleagues at MIT and Emory University, has recently found that AI deep learning algorithms can determine a patient's race from X-rays and computed tomography (CT) scans, and that this can affect the accuracy of a diagnosis. It is still unclear how AI is able to make this distinction.
 
Kuo said that in recent years the medical field has begun to make extensive use of AI to increase the speed and accuracy of diagnosis and treatment. Deep learning allows computers to train recognition models directly from data, and the US Food and Drug Administration (FDA) has approved AI devices for interpreting X-rays and CT scans. Yet the use of AI has also given rise to some challenging ethical concerns relating to racial bias.
 
The research team, which included members from MIT, Stanford University, Emory University, and the University of Toronto, analyzed the chest, cervical spine, and hand X-rays, as well as the chest CT scans, of more than 200,000 patients, and discovered that AI could actually be contributing to racial discrimination. The team's research has been published in a recent issue of the leading journal The Lancet Digital Health and has attracted much media attention.
 
One of the team members, a professor at Emory, was so amazed to find that AI could determine race from X-rays with 90% accuracy that she could hardly believe it, suspecting that something must have gone wrong. Some of the other team members also found the implications of the results deeply troubling.
 
Kuo said that the team initially guessed that AI was determining race from bone density, since Black patients tend to have higher bone density than white patients, but this turned out not to be the case. Pointing to an X-ray of a hand, Kuo said that the team discovered that AI determines race from the third knuckle of the middle and index fingers, something that even radiologists with 20 or 30 years of experience are unable to do.
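The article does not describe how the team localized the informative regions; an occlusion-sensitivity probe is one common way such a finding can be reached. The sketch below is a hypothetical illustration under that assumption, not the team's published method: it slides a blank patch across a grayscale image and records how much a model's predicted probability drops at each position, so that regions causing large drops are the ones the model relies on. The `toy_model` stand-in is invented purely for the example.

```python
# Hypothetical occlusion-sensitivity probe (illustrative only, not the team's
# published method): slide a blank patch across the image and record how much
# the model's predicted probability drops; large drops mark the regions the
# model depends on.
import numpy as np

def occlusion_map(model, image, patch=16, stride=16, fill=0.0):
    """Return a heatmap of probability drops for an HxW grayscale image.

    `model` is any callable mapping an HxW array to a score in [0, 1].
    """
    h, w = image.shape
    base = model(image)
    heat = np.zeros(((h - patch) // stride + 1, (w - patch) // stride + 1))
    for i, y in enumerate(range(0, h - patch + 1, stride)):
        for j, x in enumerate(range(0, w - patch + 1, stride)):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = fill
            heat[i, j] = base - model(occluded)   # drop in confidence
    return heat

# Toy stand-in model: its "confidence" is just the mean intensity of the
# top-left corner, so the heatmap should light up only over that corner.
toy_model = lambda img: img[:32, :32].mean()
image = np.random.rand(128, 128)
print(occlusion_map(toy_model, image).round(2))
```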
 
Even more concerning for the research team was that the racial identification made by AI affects the accuracy of medical image interpretation. Kuo said that the false negative rate for the medical images of Black patients was 28%, but only 17% for white patients. Such misinterpretation of medical images in turn affects the allocation of medical resources, such as emergency treatment and medical benefits.
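A false negative rate gap of this kind is found by auditing a classifier's errors separately for each patient group. The following minimal sketch shows that calculation on synthetic data; the group labels, arrays, and numbers here are invented for illustration and are not the study's data.

```python
# Illustrative subgroup audit (synthetic data, not the study's): compute the
# false negative rate of a diagnostic classifier separately per patient group,
# the kind of comparison behind the 28% vs. 17% gap described above.
import numpy as np

def false_negative_rate(y_true, y_pred):
    """FNR = missed positive findings / all positive findings."""
    positives = (y_true == 1)
    if positives.sum() == 0:
        return float("nan")
    missed = (y_pred == 0) & positives
    return missed.sum() / positives.sum()

# Synthetic example: 1 = abnormal finding present, 0 = no finding.
y_true = np.array([1, 1, 1, 0, 1, 1, 0, 1, 1, 0])
y_pred = np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0])
groups = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

for g in np.unique(groups):
    mask = (groups == g)
    fnr = false_negative_rate(y_true[mask], y_pred[mask])
    print(f"group {g}: false negative rate = {fnr:.2f}")
```

A model with equal overall accuracy can still show very different false negative rates across groups, which is why per-group auditing, rather than a single aggregate score, is needed to surface this kind of bias.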
 
The research team also included two seniors from the Department of Computer Science, Wang Ryan and Chen Li-Ching. Wang, who was in charge of analyzing the AI models, said that it’s not often that undergraduates have a chance to participate in such a large-scale multinational research project, and that he gained a wealth of valuable experience.
 
Assistant Professor Kuo Po-Chih (center) of the Department of Computer Science and his students Chen Li-Ching (left) and Wang Ryan (right).
