Published: Thursday 20 November 2025 | The English Chronicle Desk | The English Chronicle Online
A 66-year-old man has spoken out after being falsely accused of shoplifting because of a facial recognition error, warning that such mistakes could happen to anyone and highlighting the risks of over-reliance on surveillance technology.
Byron Long, from Cardiff, was accused of stealing £75 worth of items during a visit to a B&M store at Cardiff Bay Retail Park after being flagged on a facial recognition watchlist. The incident occurred on 29 April, when staff confronted Mr Long and asked him to leave the store. He was initially told he was implicated in a prior alleged theft, but a review of CCTV footage later confirmed that he had in fact paid for all his purchases, clearing him of any wrongdoing.
B&M has issued an apology to Mr Long and offered a £25 store voucher as a gesture of goodwill, which he has rejected, describing it as insufficient given the distress the incident caused. “They don’t recognise the dangers and harm they can cause with the power they have,” Mr Long said. “My interaction has gone out the window. I suffer with my mental health, so it’s important for me to get out. It was a horrible experience, and I haven’t been back there since. I’m worried it will happen again.”
The incident has drawn attention to the potential pitfalls of facial recognition technology in retail settings. Big Brother Watch, a civil liberties advocacy group, has lodged a formal complaint with the Information Commissioner’s Office (ICO) on behalf of Mr Long. The organisation emphasised the risks of false positives and the serious impact such errors can have on individuals’ lives.
Facewatch, the company that supplied the facial recognition system used by B&M, stated that the incident was due to human error and not a failure of the technology. However, the case underscores the ongoing debate about the accuracy, ethics, and accountability of surveillance systems in public spaces. Critics argue that even minor mistakes in identification can have disproportionate consequences, particularly for vulnerable individuals or those with pre-existing mental health conditions.
Mr Long explained how the experience had aggravated his mental health difficulties, leaving him anxious about visiting public spaces and wary of interactions with retail staff. “They branded me a thief in front of everyone, and the embarrassment and fear have stayed with me,” he said. “Facial recognition might sound like a useful tool, but it can have a devastating impact when it’s wrong. It could happen to anyone.”
The case also raises questions about corporate responsibility and the steps retailers should take to prevent such incidents. While B&M has apologised, Mr Long insists that words alone are inadequate. “A £25 voucher doesn’t fix the trauma of being accused of something I didn’t do. There needs to be accountability, not just empty apologies,” he added.
Civil liberties campaigners have called for stricter regulation of facial recognition in commercial settings. Big Brother Watch warned that without proper oversight, individuals could face unjust treatment, reputational damage, and psychological harm. The group urged the ICO to investigate and to consider the wider implications for privacy, consent, and safeguards against misidentification.
Facial recognition technology has become increasingly widespread in the UK, with retailers adopting it as a tool for loss prevention. Proponents argue that it can reduce theft and protect businesses. However, opponents point to a growing number of incidents in which innocent individuals are wrongly identified, leading to stress, public humiliation, and erosion of trust in both businesses and technology.
Experts in technology ethics caution that even the most advanced systems are not infallible. Algorithms may produce false matches due to poor lighting, camera angles, or limitations in the data used to train them. Human oversight, while intended to correct errors, can sometimes introduce further mistakes, as appears to have occurred in Mr Long’s case.
For Mr Long, the emotional and psychological impact has been profound. He described feeling humiliated, anxious, and reluctant to engage in routine activities, including shopping and social interactions. “I’ve lost confidence in going to places where I know surveillance is being used. It’s not just the embarrassment; it’s the fear of being misjudged again,” he said.
The incident has sparked discussion among policymakers, advocacy groups, and retail operators about how to balance the benefits of technology with the rights and dignity of individuals. There is a growing consensus that companies must implement rigorous verification processes, provide training to staff, and ensure that affected individuals have access to meaningful redress when mistakes occur.
While B&M has maintained that the error was unintentional and that corrective action was taken once the mistake was identified, the case illustrates the broader societal implications of automated surveillance. False accusations can disproportionately affect those with vulnerabilities, such as pre-existing mental health conditions, and highlight the need for ethical and accountable technology deployment.
In his public statements, Mr Long emphasised that his experience should serve as a cautionary tale. “This could happen to anyone, at any time. Technology is not perfect, and the consequences of errors are real and lasting,” he said. He also urged retailers to consider the human cost of surveillance and to prioritise fairness and verification over automated judgments.
The case is ongoing, with Big Brother Watch's complaint to the ICO expected to influence future discussions about the oversight and regulation of facial recognition in retail environments. It underscores the need for clear rules, transparency, and accountability when personal data and technology intersect with public safety and business operations.
Mr Long’s experience serves as a reminder of the potential human impact behind technological tools. While facial recognition may offer efficiency and security benefits, the cost of errors—misidentification, stress, and reputational harm—cannot be ignored. As retailers and technology providers continue to adopt these systems, careful attention to safeguards and ethical use will be essential to prevent similar incidents from recurring.
The B&M incident, and the broader discussion it has sparked, highlights the intersection of technology, privacy, and human rights. It illustrates that even well-intentioned innovations can have unintended consequences and that robust oversight, accountability, and ethical deployment are necessary to protect the public from harm.
For Mr Long, the personal impact of the error is ongoing, as he navigates the emotional aftermath of being falsely accused. While he welcomes corrective measures and the acknowledgement of the mistake, he continues to warn others about the potential risks of automated surveillance, advocating for greater awareness, fairness, and safeguards to ensure that no one else experiences the distress he endured.