Police admit they DID share images with King’s Cross Estate for its facial recognition cameras to use – despite previously denying involvement with ‘Orwellian’ surveillance project
- Metropolitan and British Transport Police did share images with developers
- King’s Cross Estate in London had two facial recognition cameras in place
- Developers said they stopped using two cameras at King’s Boulevard in March 2018
- On Wednesday police chief Cressida Dick warned of the dangers of technology
The Metropolitan Police and British Transport Police did share images with the King’s Cross Estate in London for its facial recognition technology.
Previously both had denied involvement in the surveillance project which developers said was to help ‘prevent and detect crime in the neighbourhood and ultimately to help ensure public safety’.
The area involved is home to King’s Cross and St Pancras International train stations, as well as restaurants, shops and cafes.
On Wednesday Metropolitan Police chief Cressida Dick said the use of this and similar technology could risk turning Britain into a ‘ghastly Orwellian’ police state.
The Metropolitan Police and British Transport Police have admitted that they did share images with the King’s Cross Estate for its facial recognition technology, despite initially denying involvement. Pictured is a CCTV camera in Pancras Square near King’s Cross Station
Ms Dick was speaking at the Lowy Institute think-tank in Sydney, Australia, and added that live facial recognition was a ‘hot potato’.
The British Transport Police originally said that it had ‘not contributed, nor has benefited’ from the facial recognition technology, but is now ‘correcting’ its position.
It claims that local teams based at King’s Cross worked with partners between 2016 and 2018 ‘to share images of a small number of convicted offenders, who routinely offended or committed anti-social behaviour in the area’.
‘This was legitimate action in order to prevent crime and keep people safe,’ a spokesman said.
‘Understandably, the public are interested in police use of such technologies, which is why we are correcting our position.’
Mayor of London Sadiq Khan said the original information provided by the Metropolitan Police was also ‘incorrect’ and that ‘they have in fact shared images related to facial recognition with King’s Cross Central Limited Partnership’.
BTP said they shared ‘images of a small number of convicted offenders, who routinely offended or committed anti-social behaviour in the area’ (Pictured is CCTV in Pancras Square)
‘I am informed that this ceased in 2018,’ Mr Khan said.
‘As a matter of urgency, I have asked for a report from the MPS on this concerning development and on their wider data-sharing arrangements, including what information has been shared and with whom.
‘I apologise to the Assembly Member that the previous information provided was inaccurate.
‘A fuller update will be provided to London Assembly Members as soon as I am able.’
Developers of the area said that it had two facial recognition cameras in operation at King’s Boulevard, which it stopped using in March 2018.
Usage of the technology has been under the spotlight after UK data and privacy watchdog the Information Commissioner’s Office (ICO) said it had launched an investigation last month.
The developer behind the King’s Cross Estate (pictured) in the capital admitted it had installed the technology, which can track tens of thousands of people every day. They said two cameras were in operation at King’s Boulevard but they stopped using them in March 2018
The use of the technology has been in the spotlight after the Information Commissioner’s Office said it had launched an investigation. Pictured is an Avigilon CCTV camera in King’s Cross
Information Commissioner Elizabeth Denham said the watchdog is ‘deeply concerned about the growing use of facial recognition technology in public spaces’ and is seeking ‘detailed information’ about how it is used.
Last month it was revealed that Canary Wharf was in talks to install facial recognition across its 97-acre estate, which is home to major banks like Barclays and HSBC.
Big Brother Watch said the use of facial recognition on such a scale is the ‘worst case scenario for privacy’.
Liberty called it ‘a disturbing expansion of mass surveillance’ that threatens ‘freedom of expression as we go about our everyday lives.’
Activist Ed Bridges, from Cardiff, lost the world’s first legal challenge over police use of facial recognition technology this week.
The 36-year-old told the High Court that his face was scanned while Christmas shopping in 2017, and at a peaceful anti-arms protest in 2018.
An artist’s impression of the King’s Cross development which used facial recognition cameras
His lawyers argued the use of automatic facial recognition by South Wales Police caused him ‘distress’ and violated his privacy and data protection rights by processing an image taken of him in public.
But Mr Bridges’s case was dismissed on Wednesday by two leading judges, who said the use of the technology was not unlawful, though he vowed to appeal against the ruling.
After the ruling group Big Brother Watch said: ‘Several cities in the US have banned live facial recognition surveillance and it is long overdue that our parliament does the same.
‘The British public do not want to be walking ID cards subjected to a constant police line up.
‘We’re a nation that cherishes civil liberties, not a Chinese-style police state. Live facial recognition doesn’t fit in a democracy and we will fight until it is banned.’
HOW DOES FACIAL RECOGNITION TECHNOLOGY WORK?
Facial recognition software works by matching real-time images to a previous photograph of a person.
Each face has approximately 80 unique nodal points across the eyes, nose, cheeks and mouth which distinguish one person from another.
A digital video camera measures the distance between various points on the human face, such as the width of the nose, depth of the eye sockets, distance between the eyes and shape of the jawline.
A different smart surveillance system (pictured) that can scan two billion faces within seconds has been revealed in China. The system connects to millions of CCTV cameras and uses artificial intelligence to pick out targets. The military is working on applying a similar version of this with AI to track people across the country
This produces a unique numerical code that can then be linked with a matching code gleaned from a previous photograph.
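The idea of measuring distances between nodal points and reducing them to a comparable numerical code can be sketched in a few lines. This is a simplified illustration, not any vendor’s actual algorithm: the landmark names, the six-point face, and the matching threshold are all assumptions made for the example, and real systems use around 80 nodal points and far more robust statistical models.

```python
import math

# Hypothetical nodal points for the sketch; real systems use ~80 of them.
LANDMARKS = ["left_eye", "right_eye", "nose_tip", "chin",
             "left_jaw", "right_jaw"]

def face_code(points):
    """Turn landmark (x, y) coordinates into a numerical code.

    Measures the distance between every pair of nodal points, then
    normalises by the inter-eye distance so the code does not depend
    on how close the face is to the camera.
    """
    ref = math.dist(points["left_eye"], points["right_eye"])
    code = []
    for i, a in enumerate(LANDMARKS):
        for b in LANDMARKS[i + 1:]:
            code.append(math.dist(points[a], points[b]) / ref)
    return code

def is_match(code_a, code_b, threshold=0.05):
    """Declare a match if the mean difference between codes is small.

    The threshold is an illustrative value, not a real-world tuning.
    """
    diff = sum(abs(x - y) for x, y in zip(code_a, code_b)) / len(code_a)
    return diff < threshold

# Example: the same (made-up) face captured at two camera distances.
face_near = {"left_eye": (100, 100), "right_eye": (160, 100),
             "nose_tip": (130, 140), "chin": (130, 190),
             "left_jaw": (95, 160), "right_jaw": (165, 160)}
face_far = {k: (x / 2, y / 2) for k, (x, y) in face_near.items()}

print(is_match(face_code(face_near), face_code(face_far)))  # True
```

Because every pairwise distance is divided by the inter-eye distance, scaling the whole face (moving nearer to or farther from the camera) leaves the code unchanged, which is why the scaled copy above still matches.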
A facial recognition system used by officials in China connects to millions of CCTV cameras and uses artificial intelligence to pick out targets.
Experts believe that facial recognition technology will soon overtake fingerprint technology as the most effective way to identify people.