Facial recognition scanners are in use in London and have been deployed at shopping centres, museums and conference centres around the UK, according to an investigation.

Argent, the company behind a 67-acre site at King's Cross, is already using the technology, with developers at the financial hub Canary Wharf set to follow suit.

Besides two major railway stations – King's Cross and St Pancras – the Argent plot is home to Google's London headquarters and Central Saint Martins art college, as well as shops and restaurants.

Civil liberties group Big Brother Watch labelled use of the technology an "epidemic" and said its use on privately owned sites was "deeply disturbing".

'In the interest of public safety'

But the developer said it was using facial recognition "in the interest of public safety".

A spokesperson for Argent said: "These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public."

On Thursday (August 15), the Information Commissioner's Office announced it would launch its own investigation into the use of facial recognition cameras.

The UK's data and privacy watchdog said it was "deeply concerned about the growing use of facial recognition technology in public spaces" and is seeking "detailed information" about how it is used.

Mayor of London Sadiq Khan said he had written to the chief executive of the King's Cross development to raise his concerns.

In his letter to chief executive Robert Evans, the Mayor called for more information on the company’s surveillance activities.

Facial recognition cameras are being used in the King's Cross area. Photo: Pixabay

He said: “London’s incredible public spaces are a real asset to our city.

“They should be places that all Londoners, regardless of their age, ability, gender identity, religion, race, sexual orientation or social class, can enjoy and use confidently and independently, avoiding separation or segregation.”

The controversial technology matches live footage of people against a database of existing images in order to identify them.

Software maps faces as a series of data points – measuring details like width of nose and depth of eye sockets – and compares that information with a bank of existing images to find a match.
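The matching step described above can be sketched in a few lines. This is a simplified illustration, not how any deployed system actually works: real facial recognition uses learned embeddings with hundreds of dimensions, and the names, measurements and threshold below are invented for the example.

```python
import math

# Hypothetical watchlist: each face reduced to a short vector of
# measurements (e.g. nose width, eye-socket depth). Values are
# illustrative only.
watchlist = {
    "person_a": [4.1, 2.3, 5.0],
    "person_b": [3.2, 2.9, 4.4],
}

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=0.5):
    """Return the closest watchlist entry, or None if nothing falls
    within the match threshold."""
    name, dist = min(
        ((n, euclidean(probe, v)) for n, v in database.items()),
        key=lambda item: item[1],
    )
    return name if dist <= threshold else None

# A capture close to person_a's stored measurements matches;
# one far from every entry returns no match.
print(best_match([4.0, 2.4, 5.1], watchlist))  # person_a
print(best_match([9.0, 9.0, 9.0], watchlist))  # None
```

The threshold is the crux in practice: set it too loose and innocent passers-by are flagged, too tight and genuine matches are missed – the accuracy-and-bias trade-off the MPs' report raised.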

Facial recognition has been trialled by UK police forces – including the Metropolitan Police – as a way to identify suspects on crime watch lists, but this has been subject to legal challenges.

'A privacy emergency'

Big Brother Watch chief executive Silkie Carlo said increasing use of the technology was placing personal privacy at risk.

"There is an epidemic of facial recognition in the UK," she said.

"The collusion between police and private companies in building these surveillance nets around popular spaces is deeply disturbing. Facial recognition is the perfect tool of oppression and the widespread use we've found indicates we're facing a privacy emergency.

"We now know that many millions of innocent people will have had their faces scanned with this surveillance without knowing about it, whether by police or by private companies."

Technology used elsewhere

The group said the Meadowhall shopping centre in Sheffield had carried out trials of the technology last year, while the World Museum in Liverpool and Millennium Point conference centre in Birmingham were also named in its investigation as locations where scans had taken place.

British Land, which owns Meadowhall in Sheffield, also has sites in parts of London including Paddington, Broadgate, Canada Water and Ealing Broadway.

A British Land spokeswoman said: "We do not operate facial recognition at any of our assets.

"However, over a year ago we conducted a short trial at Meadowhall, in conjunction with the police, and all data was deleted immediately after the trial."

Last month, the House of Commons Science and Technology Committee said authorities should cease trials of facial recognition technology until a legal framework was established.

MPs said the lack of legislation called into question the legal basis of the trials.

In a report on the Government's approach to biometrics and forensics, the MPs referred to automatic facial recognition testing by the Metropolitan Police and South Wales Police, noting an evaluation of both trials by the Biometrics and Forensics Ethics Group raised questions about accuracy and bias.