Facial Recognition Technology – A Slippery Slope

2023-05-11
SteelToad

Facial Recognition Technology (FRT) is a $9.6 billion industry (TechHQ, 2020) that was first developed in the 1960s as a computer program to detect human characteristics and features. Since its adoption by federal agencies, the military, police departments, and private companies, it has gotten wildly out of hand. Suddenly, protestors are being abducted by federal agents based on photos taken from street cameras. Facebook and Instagram can find user accounts based on a single photo. Employees who work remotely from home are monitored via cameras to ensure productivity, and even have their facial expressions tracked to “detect boredom”. Something as simple as your own face can now be tracked and used against you. Quietly, the U.S. has become a surveillance state, and your privacy is no longer private.

FRT was implemented by the U.S. government as early as 1993. Nicknamed “FERET” and sponsored by the Department of Defense (DoD) Counterdrug Technology Development Program, the software included algorithms that “automated mug book search using surveillance images, limited access to restricted equipment or facilities, examined the background and security clearances of employees, monitored specific persons at airports, border crossings, and secure manufacturing locations, identified and recorded people’s repeated appearances in surveillance footage over time, identified oneself at an ATM, and examined picture ID data to look for fraud” (Rauss, 1997). The software was then rolled out and tested on subjects at George Mason University and the Army Research Laboratory in Maryland. It is unclear whether invasive algorithms such as these were deployed with the consent of those being watched.

From there, FRT was purchased by state DMVs to stop people from obtaining multiple driver’s licenses under different names. With facial recognition, DMV clerks could compare the face of an applicant against existing records to catch the same person registered under different names, or to flag cases where someone’s identity was being stolen or used by another person. Ironically, around the time this was implemented, the number of privately run prisons across the country had increased dramatically. Shortly thereafter, in 1999, Minnesota began using FRT to match and recognize mugshots. Police, judges, and court officers could now track virtually anyone with a criminal record, no matter how minor the offense (Gates, 2011). Since police forces adopted FRT, there have been reports of algorithmic bias: the systems misidentify people of color at higher rates than white people.
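For the curious, here is a rough sketch of how that kind of duplicate check tends to work under the hood. This is not the DMV’s actual system; it is a minimal illustration that assumes each enrolled license photo has already been converted into a numeric “embedding” vector by some facial recognition model, and that two photos of the same person produce similar vectors. Every name here (find_duplicates, the license IDs, the 0.8 threshold, the toy vectors) is hypothetical.

```python
import numpy as np

# Hypothetical sketch: embeddings stand in for the output of a real
# facial recognition model (which would produce vectors with hundreds
# of dimensions, not three).

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_duplicates(new_face: np.ndarray,
                    enrolled: dict[str, np.ndarray],
                    threshold: float = 0.8) -> list[str]:
    """Return license IDs whose stored face closely matches the new applicant's."""
    return [
        license_id
        for license_id, stored_face in enrolled.items()
        if cosine_similarity(new_face, stored_face) >= threshold
    ]

# Toy data standing in for real embeddings.
enrolled_licenses = {
    "DL-1001": np.array([0.9, 0.1, 0.3]),
    "DL-1002": np.array([0.2, 0.8, 0.5]),
}
applicant = np.array([0.88, 0.12, 0.31])  # suspiciously similar to DL-1001

print(find_duplicates(applicant, enrolled_licenses))  # -> ['DL-1001']
```

The same basic idea, comparing a probe face against a gallery and flagging anything above a similarity threshold, underlies the mugshot matching and surveillance uses described above; the policy questions are about who builds the gallery and who gets flagged.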

FRT was quickly bought and developed by private companies such as Ring, Apple, Facebook, and Instagram. Even ExxonMobil and Amazon are getting in on the technology: soon you could be paying for gasoline or Amazon packages with a quick scan of your face. Walmart will be able to analyze your mood as you walk through the store. In Japanese McDonald’s outlets, FRT monitors whether employees are smiling while at work (TechHQ, 2020). With FRT surveillance so normalized, where do we draw the line between private and public information? How can we be sure that our private information, accessed without our knowledge, is being used ethically and fairly?

In Europe, at least, the danger of FRT being abused is a hot topic of debate. The European Union’s draft Artificial Intelligence Act would restrict FRT’s use in the public domain, and the European Parliament has proposed banning it altogether (Lively, 2021). FRT deployments in Australia, New Zealand, and even Russia are also receiving pushback from citizens. Amid the ongoing Russian invasion, Ukraine has been using FRT developed by Clearview AI to identify dead Russian soldiers so that their families and loved ones can be informed of their fate: officials match the soldiers’ faces to their social media accounts (if they have any) and then notify the proper authorities, who can inform the relatives. Clearview AI itself, however, is currently facing a lawsuit “in U.S. federal court in Chicago filed by consumers under the Illinois Biometric Information Privacy Act. The ongoing case concerns whether the company’s gathering of images from the internet violated privacy law” (Dave, 2022).

Facial recognition technology is indeed a slippery slope: invasive by nature and easily abused. Hopefully, more regulations and laws will be established in the coming years as the U.S. focuses more on cybersecurity. Until then, be mindful of where your information, and your face, is uploaded.


Sources

Dave, Paresh. Ukraine uses facial recognition to identify dead Russian soldiers, minister says. Reuters, 24 March 2022. https://www.reuters.com/technology/ukraine-uses-facial-recognition-identify-dead-russian-soldiers-minister-says-2022-03-23/ Accessed June 24, 2022.

Jones, Nigel. 10 reasons to be concerned about facial recognition technology. The Privacy Compliance Hub Limited, August 2021. https://www.privacycompliancehub.com/gdpr-resources/10-reasons-to-be-concerned-about-facial-recognition-technology/ Accessed June 24, 2022.

Lively, Taylor Kay. Facial Recognition in the United States: Privacy Concerns and Legal Developments. ASIS International, 1 December 2021. https://www.asisonline.org/security-management-magazine/monthly-issues/security-technology/archive/2021/december/facial-recognition-in-the-us-privacy-concerns-and-legal-developments/ Accessed June 24, 2022.

Rauss, Patrick; Phillips, Jonathon; Hamilton, Mark; DePersia, Trent. FERET (Face Recognition Technology) program. 25th AIPR Workshop: Emerging Applications of Computer Vision, Vol. 2962, 26 February 1997, pp. 253–263.

Gates, Kelly (2011). Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance. NYU Press. p. 54.

TechHQ. Nothing personal? How private companies are using facial recognition tech. TechHQ, 8 June 2020. https://techhq.com/2020/06/nothing-personal-how-private-companies-are-using-facial-recognition-tech/ Accessed June 24, 2022.
