Solanki Barua

Right To Privacy Violation By Facial Recognition Devices.

Updated: Nov 19, 2023

Using FRT to identify people without their knowledge or consent raises serious privacy concerns.


Facial recognition devices are systems that recognize faces and store biometric information as distinctive mathematical patterns. This makes them among the safest and most reliable identification techniques in biometric technology. To reduce the risk of unauthorized access, facial data can be anonymized and kept confidential.

Sadly, however, facial recognition technology (FRT) is becoming ever more integrated into daily life, from unlocking an iPhone and automatically tagging Facebook photos to companies monitoring productivity and police forces surveilling rallies. FRT compares captured photographs against other available facial images, such as those held in databases or on government watchlists. It is a highly invasive form of surveillance that can substantially erode individual freedoms and, ultimately, society.


Lack of consent: A fundamental tenet of all data privacy regulations is that businesses must disclose what biometric data they collect and obtain users' consent before collecting it. Of FRT's several privacy implications, the most important is the use of the technology to identify people without their permission, whether through real-time public surveillance or through databases compiled unlawfully.

Unencrypted faces: Gathering and storing facial images from a distance is becoming easier and cheaper. Unlike many other types of data, a face cannot be kept secret: it is constantly exposed in public. And because faces, unlike passwords and credit card numbers, cannot easily be changed, data breaches involving facial recognition data heighten the risk of identity theft, stalking, and harassment.

Lack of transparency: Because biometrics are unique to each person, using FRT to identify individuals without their knowledge or consent raises privacy issues. Facial scans raise additional concerns because, unlike other biometrics (such as fingerprints), they can be captured easily, remotely, and covertly.

Technical flaws: Using photographs, or three-dimensional (3D) masks made from images of a victim, an attacker may be able to spoof an FRT system and impersonate that victim. Presentation attacks that deploy physical or digital spoofs, such as masks or deepfakes, are further risk factors for FRT.

Accuracy: Another frequent criticism of FRT is its accuracy. The consequences of a captured facial scan that misidentifies someone can be long-lasting. Accuracy also varies by population: women and people of colour experience the highest rates of false positives, which can lead to wrongful arrests in criminal contexts.


Several laws have been enacted around the world to oversee and regulate the use of FRT. Recent legislation focuses on controlling government bodies rather than the private sector; while some initiatives govern the public sector broadly, others are concerned solely with law enforcement. Prior legislative approval is currently required in Pittsburgh, Philadelphia, and the US state of Virginia. In Massachusetts and Utah, law enforcement must submit a formal request to the state agency that oversees the relevant database before using facial recognition technology to search for individuals. South Australia passed the Surveillance Devices Act 2016, which makes it illegal to knowingly install, maintain, or use optical surveillance equipment on premises to visually record or observe a private activity without the express or implied consent of all relevant parties. The Act also prohibits the knowing use, communication, or publication of information or material obtained through the use of optical surveillance equipment.


Consent: Enterprises should obtain express, affirmative consent when enrolling a person in a program that uses FRT for verification or identification purposes, and/or when identifying a person to third parties who would not otherwise know that person's identity.

Utilization: Businesses should commit to collecting, using, and disclosing facial recognition data only in ways consistent with reasonable consumer expectations, given the context in which the data was collected.

Transparency: Businesses should inform customers in a meaningful way about how facial recognition templates are created, as well as how such data is used, stored, shared, maintained, and destroyed.

Data security: Enterprises should maintain a comprehensive data security program, reasonably designed to guard against risks such as unauthorized access or use and unintended or inappropriate disclosure, using administrative, technical, and physical safeguards appropriate to the sensitivity of the information.

Privacy: In addition to legislative, legal, and administrative measures, businesses should work to create technical controls that support or enforce adherence to these objectives.

Integrity and access: Businesses should put adequate safeguards in place to keep facial recognition data accurate, and should give individuals a reasonable opportunity to review mistaken identity labelling, request its correction, and request deletion of their facial recognition data.

Accountability: Enterprises are responsible for ensuring that their use of FRT and facial recognition data, including use in collaboration with third-party service providers or business partners, complies with the guidelines above.

Co-Authored with: Prof. Jharna Jagtiani

