Researchers have been pursuing facial recognition technologies since the 1960s, but only in the last few years have these systems become so remarkably capable. Neural networks that can match faces across countless features with better than 98 percent accuracy, along with an explosion of readily available training datasets, have fueled the field's current wave of development. Yet even though these algorithms can identify a person from low-quality security camera footage, they remain notoriously poor at distinguishing between people with darker complexions.
Facial recognition systems aren't all bad. They offer people convenience, as any Apple Face ID user can attest, along with fast and seamless secure access. Unless you're stuck in a Nick Cage-John Travolta action thriller, you'll rarely need to fret about somebody gaining unauthorized access to your devices. The technology has also proven a boon to police, enabling officers to find suspects faster, as the NYPD did with an armed rapist last August, and to identify lost children and disoriented seniors. Heck, even pop star Taylor Swift uses the technology at her shows to foil stalkers.
But the same power and flexibility that makes facial recognition so useful is what makes it so dangerous. China's authoritarian government has long used mass surveillance and facial recognition to keep tabs on its people. London's Met just last week followed suit, announcing that it will formalize its use of the controversial ClearView system, which can track people in real time. A number of American police agencies are similarly eager to install, or in many cases expand, their surveillance capabilities at the expense of residents' privacy and civil liberties. And yet, despite the technology's shortcomings in accuracy and its massive potential for misuse, only three states have sought to halt its adoption. For better or worse, facial recognition is not going away; it's now a question of how much damage it will do before our elected leaders move to limit it.