A British civil liberties campaign organization has warned that citizens are unwittingly being added to police watch lists by facial recognition technology.
The alert from Big Brother Watch comes after a woman was mistakenly accused of being a thief after her image was captured by FaceWatch, a facial-recognition system deployed by several British retailers.
As reported by the BBC, Sara (who wished to remain anonymous) entered a shop to buy chocolate but was immediately approached and told, “You’re a thief, you need to leave the store.”
Sara was led away, had her bag searched, and was told she was banned from all outlets that use the FaceWatch technology. It later transpired that a mistake had been made, and she received a letter acknowledging the error.
Campaign group strives to increase opposition to facial recognition
Silkie Carlo, director of Big Brother Watch, has observed police at several live facial-recognition deployments, but she stresses there is a lack of public awareness of how the surveillance works.
“My experience, observing live facial recognition for many years, [is that] most members of the public don’t really know what live facial recognition is,” stated Carlo.
“If they trigger a match alert, then the police will come in, possibly detain them and question them and ask them to prove their innocence,” she continued, adding that any person whose face is scanned is effectively part of a digital police line-up.
Big Brother Watch wants to stop mass surveillance, arguing that society must act now to prevent facial recognition from becoming normalised.
The Metropolitan Police’s use of the evolving technology in London is increasing. Between 2020 and 2022, the Met used live facial recognition nine times; that rose to 23 deployments the following year, and so far in 2024 there have been 67, a clear sign of proliferation.
Supporters of this security tool insist mistakes are rare: the Met Police says around one in every 33,000 people captured by the cameras is misidentified. The BBC, however, indicates this statistic is misleading.
The report outlines that the picture changes once a person is actually flagged by facial recognition: of those alerts, one in 40 so far this year has been a false positive. The two figures describe the same errors against different denominators, everyone scanned versus alerts raised.
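To see how both rates can hold at once, here is a minimal Python sketch of the arithmetic. The scan and alert counts below are hypothetical, chosen only so the two published ratios fall out; they are not figures reported by the Met or the BBC.

```python
# Minimal sketch: the same false alerts measured against two denominators.
# All counts are hypothetical, picked to reproduce the published ratios.

faces_scanned = 660_000  # hypothetical: everyone who passed the cameras
alerts_raised = 800      # hypothetical: matches flagged to officers
false_alerts = 20        # hypothetical: alerts later shown to be wrong

# The Met's framing: errors measured against everyone scanned.
per_scan = false_alerts / faces_scanned
print(f"1 in {round(1 / per_scan):,} people scanned is misidentified")

# The BBC's framing: errors measured against alerts only.
per_alert = false_alerts / alerts_raised
print(f"1 in {round(1 / per_alert)} alerts is a false positive")
```

Run as written, this prints “1 in 33,000 people scanned is misidentified” and “1 in 40 alerts is a false positive”: both statements can be arithmetically true of the same deployment, which is why the headline figure can understate how often a flagged person turns out to be innocent.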
Image credit: Ideogram