This 23 May 2018 ACLU video from the USA says about itself:
From Associated Press:
So, it looks like this product of Amazon, owned by Jeff Bezos, the richest man in the world, is a big flop. I would not be surprised at all if this Amazon software not only massively misidentified women as men, but also massively misidentified innocent people as criminals. After all, in England, London police facial recognition software ‘recognizes’ 100% of innocent people as criminals. Err … maybe it is not as bad as 100%. British daily The Independent says it is ‘only’ 98% misidentifications. And a BBC report looks at this even more through rose-coloured glasses: ‘only’ 92% of innocent people ‘recognized’ as criminals.

Researchers say Amazon face-detection technology shows bias
By TALI ARBEL
January 25, 2019
NEW YORK (AP) — Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.
Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities. Some Amazon investors have also asked the company to stop, fearing that it makes Amazon vulnerable to lawsuits.
The researchers said that in their tests, Amazon’s technology labeled darker-skinned women as men 31 percent of the time.
Facebook abuses facial recognition technology as well. Probably, it is as dodgy as Amazon’s or the London police’s, or even worse.
Lighter-skinned women were misidentified 7 percent of the time. Darker-skinned men had a 1 percent error rate, while lighter-skinned men had none.
Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life. The new study, released late Thursday, warns of the potential for abuse and threats to privacy and civil liberties from facial-detection technology. …
In a Friday post on the Medium website, MIT Media Lab researcher Joy Buolamwini responded that companies should check all systems that analyze human faces for bias.
“If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free,” she wrote.
Amazon’s reaction shows that it isn’t taking the “really grave concerns revealed by this study seriously,” said Jacob Snow, an attorney with the American Civil Liberties Union.
Buolamwini and Inioluwa Deborah Raji of the University of Toronto said they studied Amazon’s technology because the company has marketed it to law enforcement. …
Amazon’s website credits Rekognition with helping the Washington County Sheriff’s Office in Oregon speed up identifying suspects from hundreds of thousands of photo records.
Thanks to: https://dearkitty1.wordpress.com