
Tech News : Warning Over Lessening Of AI Facial Recognition Supervision

Written by: Paul | November 23rd, 2023

Computer Weekly recently reported that, in an interview, the outgoing Biometrics and Surveillance Camera Commissioner for England and Wales, Professor Fraser Sampson, warned of the declining state of oversight of UK police deployments of AI facial recognition surveillance.

Resignation 

Professor Fraser Sampson emailed his resignation letter to (then) Home Secretary Suella Braverman in August, stating his intention to resign by October 31. The reason given was that the Data Protection and Digital Information Bill would essentially make his role redundant by removing the responsibilities of the Biometrics Commissioner’s position and transferring those powers to the Investigatory Powers Commissioner.

Professor Sampson, who was only appointed to the role in March 2021, said: “Having explored a number of alternatives with officials, I am unable to find a practical way in which I can continue to discharge the functions of these two roles beyond 1st November.” 

Professor Sampson’s responsibilities in the role had included overseeing how the police collect, retain and use biometric material (such as digital facial images), and encouraging their compliance with the surveillance camera code of practice. 

Past Concerns and Criticisms 

In addition to acknowledging the many benefits of AI facial recognition’s deployment in the UK, e.g. catching known criminals (including those involved in child sexual abuse material), finding missing or vulnerable people, locating terror suspects, and helping to prevent citizens from suffering inhuman or degrading treatment, Professor Sampson has also previously criticised and raised concerns about aspects of its deployment. For example, in February, he noted: 

– The absence of a clear set of legal rules or a framework to regulate the police’s use of AI and biometric material. 

– A lack of clarity about the scale and extent of public space surveillance, particularly in relation to the proliferation of Chinese surveillance technology across the public sector. 

Professor Sampson has also been vocal about a number of other related issues and concerns, such as: 

– Issues related to the questionable legality of using public cloud infrastructure to store and process law enforcement data and the police’s general culture of retaining biometric data. 

– Concerns about the unlawful retention of millions of custody images of people who were never convicted of a crime. Despite Professor Sampson raising the issue, and the High Court ruling in 2012 that they should be deleted, it’s been reported that the Home Office, which owns UK police biometric databases, hasn’t done so because it has no bulk deletion capability. 

– The dangers of the UK slipping into becoming an “all-encompassing” surveillance state if concerns about these technologies (such as facial recognition) are not addressed. He has expressed his surprise at the disconnected approach of the UK government and his shock at how little the police and local authorities know about the capabilities and implications of the surveillance equipment they are using. 

– Concerns about the possible misuse of facial recognition and AI technologies in controversial settings, i.e. that the approach taken by UK police and their deployment methods in such settings could negate any benefits of the technologies. Controversial settings could include mass surveillance at public events, targeting of specific communities, routine public surveillance, and use in schools, other educational institutions, and workplaces, all of which raise concerns about privacy, discrimination, and infringement of individuals’ rights. 

– Rejection of the “nothing to worry about” defence, i.e. he challenged the common justification for surveillance that people who have done nothing wrong have nothing to worry about, stating this misses the point entirely. 

– The government’s data reform proposals. For example, he criticised the government’s Data Protection and Digital Information (DPDI) Bill, arguing that it would lead to weaker oversight by subsuming biometric oversight under the Investigatory Powers Commissioner and removing the obligation to publish a Surveillance Camera Code of Practice. 

– Efficacy and ethical concerns. Professor Sampson questioned the effectiveness of facial recognition in preventing serious crimes and highlighted the risk of pervasive facial-recognition surveillance. He also noted the chilling effect of such surveillance, where people might alter their behaviour due to the knowledge of being watched and warned against the abuse of these powers. 

– Advocacy for a robust, clear, and intuitive oversight and accountability framework for facial-recognition and biometric technologies, coupled with concern about the fragmentation of the existing regulatory landscape. 

– The government’s lack of understanding and direction. For example, Professor Sampson commented on the lack of understanding and rationale in the government’s approach to police technology oversight and emphasised that public trust and confidence must be a prerequisite, not just a desired outcome, of the rollout of new technologies. 

– Predictive policing concerns. He warned against the dangers of using algorithms or AI for predictive policing, arguing that such approaches rely heavily on assumptions and create a baseline level of suspicion around the public. 

Wider Concerns About Police Surveillance Using Facial Recognition 

Professor Sampson’s concerns about the police using Live Facial Recognition (LFR) surveillance at specific operations and high-profile events echo many of those expressed by others over the last few years. For example: 

– Back in 2018, Elizabeth Denham, the then UK Information Commissioner, launched a formal investigation into how police forces used facial recognition technology (FRT) following high failure rates, misidentifications, and worries about legality, bias, and privacy. In the same year, a letter written by privacy campaign group Big Brother Watch, and signed by more than 18 politicians, 25 campaign groups, and numerous academics and barristers, highlighted concerns that facial recognition was being adopted in the UK before it had been properly scrutinised. 

– In the EU, in January 2020, the European Commission considered a ban on the use of facial recognition in public spaces for up to five years while new regulations for its use were put in place. In June this year, the European Parliament voted to back a ban on the use of real-time AI-powered facial recognition in public spaces as part of its position on the EU’s AI Act. 

What Does This Mean For Your Business? 

The evolving landscape of the Data Protection and Digital Information Bill, particularly in the context of Professor Fraser Sampson’s resignation, could hold significant implications for UK businesses. This shift indicates a potential realignment of regulatory focus from physical biometric surveillance to digital data protection. For businesses, this underscores the need to adapt to a framework that prioritises digital data security and privacy. 

The possible consolidation of regulatory bodies, such as merging the responsibilities of the Biometrics Commissioner into the Investigatory Powers Commissioner, may not necessarily mean a decline in oversight, as Professor Sampson suggests, but could instead produce a more streamlined oversight process. On the upside, this could mean simpler compliance procedures for businesses; on the downside, it may demand a broader understanding of a wider set of regulations, and companies (especially those handling biometric data) will need to track these changes closely to ensure they remain compliant. 

As the bill is likely to address the complexities of digital data, businesses will need to be proactive in understanding how that data is regulated. This is crucial for those handling large volumes of customer data or relying heavily on digital platforms. Staying abreast of both evolving technologies and the rules that govern them will, therefore, be key. 

All in all, in light of the changes (and the possible decline in oversight) highlighted by Professor Sampson, businesses will now need to be mindful of shifting political and public sentiment around privacy and surveillance, as these can influence consumer behaviour. While the changing regulatory landscape presents challenges, it also offers opportunities for businesses to align with contemporary data protection standards. Staying informed and adaptable may therefore be essential for navigating these changes successfully going forward.