Facial Recognition Technology [FRT] has become increasingly prevalent in recent years. It appears everywhere from airports to shopping centres and is used for many purposes, including law enforcement.
While the technology may prevent some crimes and assist in solving others, it poses serious privacy risks, says the Jersey Office of the Information Commissioner [JOIC].
FRT compares faces captured in security photos and videos against a database of known photos. The key identifying feature is the geometry of a face, particularly the distance between a person’s eyes and the distance from forehead to chin. These measurements form a ‘facial signature’, a mathematical representation of the face that is stored in a database of known faces for identification purposes.
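The matching idea described above can be sketched in a few lines of Python. This is purely illustrative, not a real FRT algorithm: production systems use many more measurements and machine-learned embeddings, and the function names, measurements, and threshold here are invented for the example.

```python
import math

def signature(eye_distance_mm, forehead_chin_mm):
    """Return a toy 'facial signature' as a tuple of normalised measurements."""
    # Dividing by face height makes the signature independent of image scale.
    return (eye_distance_mm / forehead_chin_mm,)

def closest_match(probe, database, threshold=0.02):
    """Find the known face whose signature is nearest to the probe signature."""
    best_name, best_dist = None, float("inf")
    for name, sig in database.items():
        dist = math.dist(probe, sig)  # Euclidean distance between signatures
        if dist < best_dist:
            best_name, best_dist = name, dist
    # Only report a match within tolerance; otherwise return None ('unknown')
    # rather than force a potential false positive.
    return best_name if best_dist <= threshold else None

# Hypothetical database of known faces.
known_faces = {
    "alice": signature(62.0, 180.0),
    "bob": signature(58.0, 172.0),
}

print(closest_match(signature(61.8, 179.5), known_faces))  # → alice
```

Even this toy version shows where false positives come from: two different people whose measurements fall within the matching threshold are indistinguishable to the system.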
Facial recognition equipment collects and uses personal data including biometric data, which under the Data Protection (Jersey) Law 2018 [DPJL] constitutes special category data. The law treats special category data as more sensitive and requires that it receive greater protection.
Information Commissioner Dr Jay Fedorak says: “FRT processing involves a high risk to the rights and freedoms of natural persons. Studies have demonstrated that it is prone to error, including false positives. And these errors are greater for individuals of different ethnic backgrounds.”
FRT is subject to all the requirements of the DPJL for high-risk processing of special category data, which include:
- The need to incorporate ‘Data Protection by Design and Default’[1]
- The need to carry out ‘Data Protection Impact Assessments’. Where processing is likely to result in a high risk to the rights and freedoms of natural persons, a controller must assess in advance the impact of the envisaged processing operations on the protection of personal data, taking into account the use of new technologies and the nature, scope, context and purposes of the processing.
Given these risks and legal requirements, data controllers should consult the JOIC before embarking on the processing of high-volume special category data.
FRT, like all forms of video surveillance, should be an avenue of last resort, used to address serious problems where less intrusive alternatives have failed and the benefits of processing clearly outweigh the risk of harm to data subjects, states the JOIC.