One Georgia man recently discovered the dark side of facial recognition technology when he was arrested on a warrant from Louisiana. Randall Reid, 28, was picked up in DeKalb County, Georgia, last November. Authorities had connected him to a string of purse thefts in Jefferson Parish and Baton Rouge, Louisiana.
Randall insisted he’d never been to Louisiana in his life, and didn’t even know what “Jefferson Parish” was. He couldn’t have done it. The problem is, the computer said he did.
Facial recognition software matched surveillance images to Reid’s Georgia identification records, and Baton Rouge authorities issued an arrest warrant. Georgia authorities executed the warrant and jailed him.
Reid was later released after authorities noticed significant discrepancies between the two men: Reid had a mole on his face, while the suspect did not, and there was at least a forty-pound weight difference between them. The one feature they had in common was that both men are black.
That fact has renewed fears about the dangers of relying on this technology in law enforcement. Facial recognition software is known to misidentify black people and other minority groups at a much higher rate than white people.
“They told me I had a warrant out of Jefferson Parish. I said, ‘What is Jefferson Parish?’” Reid said. “I have never been to Louisiana a day in my life. Then they told me it was for theft. So not only have I not been to Louisiana, I also don’t steal.”
An MIT study of three commercial gender-recognition systems found they had error rates of up to 34% for dark-skinned women — a rate nearly 49 times that for white men.
A Commerce Department study late last year showed similar findings. Examining false positives — instances in which an algorithm wrongly identified two different people as the same person — the study found that error rates for African men and women were two orders of magnitude higher than for Eastern Europeans, who showed the lowest rates.
how about an investigation before arresting someone ... has the suspect ever BEEN to the State / County / Parish