The UK’s Biometrics and Surveillance Camera Commissioner has written to Michael Gove, Secretary of State for Levelling Up, Housing and Communities, to express his concern about the human rights and security issues associated with public procurement of surveillance technologies, including security cameras and real-time facial recognition systems.

Professor Fraser Sampson wrote to Gove on 31 May, in a follow-up letter to one he wrote on 22 April, to which the Minister apparently did not reply.

In both letters, Sampson focused on the state adoption of surveillance technologies that, in other parts of the world, have been associated with atrocities and human rights abuses.

Equipment from some of the same manufacturers has been adopted in the UK, he noted, making specific reference to systems used in the oppression of Uyghur Muslims in Xinjiang.

In his April letter Sampson wrote, “In terms of security, public space surveillance is increasingly intrusive and modern surveillance cameras are built with the maximum functionality inside at the point of manufacture.

“This means they come with capabilities that can be switched on remotely in the future as and when they are needed, for example, the ability to pick up sound or read vehicle number plates.

“The more that surveillance camera systems can do, the more important it will be to reassure people about what those systems are not doing, whether that is in our streets, our sports grounds, or our schools. This is increasingly difficult to detect technically and requires transparency and due diligence by all concerned in public space surveillance activity.”

He reminded the Secretary of State of the UK’s obligations under UN agreements. “I will shortly be publishing advice under the Home Secretary’s Surveillance Camera Code of Practice (SC Code) to assist relevant authorities to meet their human rights and ethical obligations in the use of public space surveillance.

“This approach is consistent with the government’s incremental, principles-based approach to regulation of the use of biometric surveillance technologies generally.

“It is also consistent with the specific provisions of the SC Code which state that it is a legitimate public expectation of relevant authorities that they are able to demonstrate how they have had regard to it, and which remind those authorities that their duty to have regard to the SC Code also applies where they enter into partnership arrangements.”

He added, “Transparency and governance are part of the ‘golden thread’ for human rights running through the UK government’s guide to implementation of the UN Guiding Principles.

“If we are to harness the significant benefits of emerging technology in this area in a lawful, ethical, and accountable way, we need to build trusted surveillance partnerships. To do that, we must be able to trust our surveillance partners in respect of both the human rights and security considerations.”

Public trust and ethical development and deployment are core principles in the UK’s recent National AI Strategy; artificial intelligence is frequently deployed to analyse images from surveillance cameras.

But how images are gathered to train AI systems is another key issue.

Sampson’s follow-up letter came in the wake of the UK’s data protection watchdog, the Information Commissioner’s Office (ICO), fining US facial recognition technology provider Clearview AI for illegally scraping images of people from social platforms to populate its database.

Clearview AI was fined £7.5 million last month for including UK residents in a database of 20 billion images without their permission. The images of citizens’ faces were scraped en masse from Facebook and other platforms.

The ICO has ordered the company to delete the photos of UK citizens.

Information Commissioner John Edwards said, “The company not only enables identification of those people, but effectively monitors their behaviour and offers it as a commercial service. That is unacceptable. That is why we have acted to protect people in the UK by both fining the company and issuing an enforcement notice.”

The implications of any company simply scraping images off the Web and accepting them as legitimate or authenticated are troubling: deepfakes are a rising problem, while fake accounts, catfishing, and online impersonation are also rife.

This means that, through no fault or action of their own, innocent people’s images could become associated with illegal activity or false information about them. If profiles are set up fraudulently or maliciously, this could in turn lead to red flags against those people’s names.

On taking up his position in January this year, Edwards said, “Privacy is a right, not a privilege. In a world where our personal data can drive everything from the healthcare we receive to the job opportunities we see, we all deserve to have our data treated with respect.”

In the US, several technology companies, including Microsoft, have called for real-time facial recognition systems to be regulated. Their use is banned in some US cities, including tech hotspot San Francisco.

Despite this, and despite fears over the technology’s poor accuracy in identifying many groups, including people from ethnic minorities, Ireland’s police force, the Gardaí, adopted it last month.

In the UK, the Metropolitan Police have also experimented with its use.

• UPDATE: On 10 June, Sampson wrote to Sir Iain Duncan Smith, MP, after both had spoken at an event on surveillance systems.

In the letter he noted, “My correspondence with the relevant companies – and also with government departments – on this matter is all in the public domain but has yet to produce any discernible action.

“While the arguments deployed in this debate embrace a wide spectrum of issues, from my perspective as Biometrics and Surveillance Camera Commissioner the matter can be simplified to one of trust.

“The people we trust – the police, fire and rescue, local authorities, and the government itself – must be able to trust their technology partners, both in terms of security and of our shared ethical and professional values. And the publicly available evidence tells me that some of these companies […] simply cannot be trusted.”