The danger does not come only from government abuses

The early uses of facial recognition for background checks have already yielded damaging results. In the UK, Uber's use of facial recognition has caused many misidentified drivers to lose their jobs. And more than 20 states trust a background check service that uses facial recognition to perform fraud checks on unemployment benefit recipients; this system has falsely labeled individuals as fraudsters, leading to massive delays in the delivery of survival funds.

Finally, misidentification is likely to become a danger when consumer facial recognition is adopted by online detectives seeking to identify criminals. As a quick and easy-to-use tool that is nonetheless unreliable in its results, facial recognition could be the perfect storm for web-sleuth disasters. This space, which is rife with vigilantism, has already seen problems that led to dangerous identification errors. A 2019 case shows how posting facial recognition mismatches online can get out of hand: after the Easter attacks in Sri Lanka that year, authorities included an American student on their public list of suspects based on a facial recognition error, subjecting him to a wave of death threats.

These risks are all magnified by the fact that facial recognition has been repeatedly shown to misidentify women and people of color at higher rates, which means that people who already face the barriers of systemic sexism and racism in institutions ranging from housing to medical care may soon face even more unfair and overwhelming obstacles in everyday life.

Public de-anonymization and doxxing

Another serious risk of a mainstream facial recognition system is how it might amplify efforts to de-anonymize and dox people (an internet-age practice of posting someone's personal information in order to generate harassment) who are engaged in potentially sensitive public activities.

We have seen this play out in the area of sex work before. FindFace was used to de-anonymize and stalk sex workers and adult film actresses. In 2019, a Chinese programmer claimed to have developed a custom facial recognition program to publicly identify and catalog 100,000 women on adult websites, explicitly so that men could find out whether their girlfriends had appeared in adult films. After a public outcry, the programmer reportedly shut down the program.

