RACIST AGEIST HEIGHTIST SOFTWARE
The pedestrian detection systems in self-driving cars are less likely to detect children and people of color, study suggests
Katie Hawkinson
Sat, August 26, 2023
Self-driving cars, like the Cruise model from General Motors pictured above, are already on streets throughout the United States.
Heather Somerville/Reuters
Pedestrian detectors in self-driving cars are less likely to detect kids and people of color, study shows.
This is due to bias in open-source AI, on which self-driving cars rely, researchers say.
Researchers are calling on lawmakers to enact policies to regulate these detectors.
As the artificial intelligence revolution ramps up, one trend is clear: Bias in the training of AI systems is resulting in real-world discriminatory practices.
AI recruitment tools have been shown to discriminate against women. ChatGPT has demonstrated racist and discriminatory biases. In every reported case of police misidentifying a suspect because of facial recognition technology, that person has been Black.
And now, new research suggests even the pedestrian detection software in self-driving cars may be less effective in detecting people of color — as well as children, generally — as a result of AI bias, putting them at greater safety risk as more carmakers use the technology.
A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person's race, gender, and age. While gender presented only a small discrepancy in accuracy, the researchers found the detection systems were less accurate at detecting pedestrians with darker skin tones.
"Before, minority individuals may have been denied vital services. Now they might face severe injury," Jie Zhang, a computer scientist at King's College London and a member of the research team, said in a statement.
The detection systems were 19.67% more likely to detect adults than children, and 7.52% more likely to detect people with lighter skin tones than people with darker skin tones, according to the study.
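The article doesn't reproduce the paper's exact evaluation protocol, but the arithmetic behind figures like these is simple to sketch. The hypothetical Python snippet below assumes per-pedestrian records noting a demographic group and whether the detector found that person, computes each group's detection rate, and derives a relative gap. The data, field names, and the exact definition of "more likely to detect" are all illustrative assumptions, not the study's own code or numbers.

```python
from collections import defaultdict

# Hypothetical per-pedestrian evaluation records: each entry says which
# demographic group an annotated pedestrian belongs to and whether the
# detector found them. All values here are invented for illustration.
records = [
    {"group": "adult", "detected": True},
    {"group": "adult", "detected": True},
    {"group": "child", "detected": True},
    {"group": "child", "detected": False},
    # ... many more annotated pedestrians ...
]

def detection_rates(records):
    """Fraction of annotated pedestrians in each group that were detected."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        hits[r["group"]] += r["detected"]  # True counts as 1, False as 0
    return {g: hits[g] / totals[g] for g in totals}

rates = detection_rates(records)
# One way a "X% more likely to detect" figure could be derived: the
# relative difference between two groups' detection rates (an assumption;
# the paper may define its gap differently).
gap = (rates["adult"] - rates["child"]) / rates["child"] * 100
print(f"adult rate={rates['adult']:.2%}, child rate={rates['child']:.2%}, gap={gap:.2f}%")
```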
"Overall, this study sheds light on the fairness issues faced by existing pedestrian detectors, emphasizing the importance of addressing bias related to age and skin tone," the study reads. "The insights gained can pave the way for more fair and unbiased autonomous driving systems in the future."
This trend stems from biases already present in the open-source AI systems that many companies use to build their detectors. The study did not test the exact software that companies like Tesla use to power self-driving cars, because that software is confidential, but the systems it did test are built on the same open-source AI those companies rely on, according to Zhang.
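For readers unfamiliar with what "open-source AI" means here: freely available, pretrained object detectors can be downloaded and run in a few lines of code. The sketch below is a minimal illustration using torchvision's pretrained Faster R-CNN, a widely used open-source detector (not necessarily one of the eight the study tested); the image path is a placeholder.

```python
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

# Load a freely available detector pretrained on COCO.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights)
model.eval()

# "street.jpg" is a placeholder path; any RGB street scene works.
img = read_image("street.jpg")
batch = [weights.transforms()(img)]

with torch.no_grad():
    out = model(batch)[0]

# Keep confident "person" detections (class 1 in COCO).
person = (out["labels"] == 1) & (out["scores"] > 0.5)
print(out["boxes"][person])
```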
The research team called on lawmakers to regulate self-driving car software to prevent bias in pedestrian detection systems.
"It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately," the study reads.