A recent study by the National Institute of Standards and Technology (NIST) revealed that facial recognition software delivers flawed results when assessing minority populations. The study has sparked yet another debate about the controversial technology, given its frequent use in the apprehension of suspects.

The NIST study tested nearly 200 facial recognition algorithms. The results showed higher rates of misidentification and other errors among subjects who identified as people of color. “With mug shot images, the highest false positives are in American Indians, with elevated rates in African American and Asian populations,” according to the study.

False positives refer to instances in which the software identified an incorrect match; false negatives refer to situations in which the software failed to recognize a match. The report also showed greater instances of false positives among African American women.

The study has given advocacy groups and politicians more ammunition...
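To make the false positive / false negative distinction concrete, here is a minimal illustrative sketch (not NIST's actual methodology; the function name and sample data are hypothetical) that computes both error rates from a list of labeled match decisions:

```python
def match_error_rates(results):
    """Compute (false positive rate, false negative rate).

    results: list of (predicted_match, actual_match) boolean pairs.
    A false positive is a predicted match where none exists;
    a false negative is a true match the software failed to recognize.
    """
    false_pos = sum(1 for pred, actual in results if pred and not actual)
    false_neg = sum(1 for pred, actual in results if not pred and actual)
    non_matches = sum(1 for _, actual in results if not actual)
    matches = sum(1 for _, actual in results if actual)
    fpr = false_pos / non_matches if non_matches else 0.0
    fnr = false_neg / matches if matches else 0.0
    return fpr, fnr

# Hypothetical sample: 2 wrong matches among 4 true non-matches (FPR 0.5),
# 1 missed match among 2 true matches (FNR 0.5).
sample = [(True, True), (False, True), (True, False), (True, False),
          (False, False), (False, False)]
print(match_error_rates(sample))  # → (0.5, 0.5)
```

The disparities the study reports mean these rates, when broken out by demographic group, differ substantially between groups for many of the algorithms tested.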