
04 July 2019 - Police should end use of facial recognition due to inaccuracy

Published by Nishtha Sharma on 05/07/2019

Police face calls to end use of facial recognition software

Robert Booth, Social affairs correspondent (The Guardian, 03/07/2019)

Police are facing calls to halt the use of facial recognition software to search for suspected criminals in public after independent analysis found matches were only correct in a fifth of cases and the system was likely to break human rights laws.

Academics from the University of Essex were granted access to six live trials by the Metropolitan police in Soho, Romford and at the Westfield shopping centre in Stratford, east London.

They found the system regularly misidentified people who were then wrongly stopped. They also warned of “surveillance creep”, with the technology being used to find people who were not wanted by the courts. 

Read on...

 

Police urged to axe facial recognition after research finds four of five ‘suspects’ are innocent

Adam Forrest (The Independent, 04/07/2019)

Scotland Yard has been urged to stop using facial recognition technology after independent research found that four out of five people identified as possible suspects were innocent.

Researchers at the University of Essex found that the Metropolitan Police’s live facial recognition (LFR) system was inaccurate in the vast majority of cases: the technology correctly made only eight of 42 matches across the six trials evaluated.
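The headline figures quoted across these reports all follow from the same trial numbers. As a quick arithmetic check (a minimal sketch using only the figures reported above):

```python
# Arithmetic behind the headline figures: the Essex researchers report
# 8 correct matches out of 42 flagged across the six evaluated trials.
correct_matches = 8
total_matches = 42

accuracy = correct_matches / total_matches
error_rate = 1 - accuracy

print(f"Accuracy:   {accuracy:.1%}")    # 19.0%, i.e. roughly one in five
print(f"Error rate: {error_rate:.1%}")  # 81.0%
```

The "four out of five 'suspects' are innocent" and "wrong 80% of the time" phrasings in the headlines are both roundings of this 81 per cent error rate.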

The report, commissioned by the Met, raised “significant concerns” that use of the controversial technology breaks human rights laws.

Read on...

 

Police face recognition software ‘wrong 80% of time’

Fariha Karim, Mark Bridge (The Times, 04/07/2019)

Facial recognition technology used by Scotland Yard is wrong in the vast majority of cases and probably illegal, according to the first independent analysis of the system.

Scotland Yard has been trialling Live Facial Recognition technology, in which cameras scan the faces of members of the public to compare them with faces on a list of wanted individuals.

However, researchers from the University of Essex, who were given access to six of ten trials in Soho, Romford and at the Westfield shopping centre in east London, found that the technology picked out faces that were not on a wanted list in 80 per cent of cases.

Read on...

 

Sky Views: Facial recognition trials show government at its worst

Rowland Manthorpe, Technology Correspondent (Sky News, 04/07/2019)

This is not a philosophical thought experiment, an ultra-boring version of the tree falling in the forest with no one around to hear it.

No: this is a real question with a real answer, one which reveals a great deal about the way power works in modern Britain.

To understand why, let me take you back to yesterday, when the Home Office issued its response to the publication of the first ever independent evaluation of the Metropolitan Police's use of facial recognition technology.

The Met has been experimenting with facial recognition ever since August 2016, when it used the technology at Notting Hill Carnival.

Read on...

 

How do we stop facial recognition from becoming the next Facebook: ubiquitous and useful yet dangerous, impervious and misunderstood?

Kieren McCarthy (The Register, 03/07/2019)

Facial recognition is having a rough time of it lately. Just six months ago, people were excited about Apple allowing you to open your phone just by looking at it. A year ago, Facebook users joyfully tagged their friends in photos. But then the tech got better, and so did the concerns.

In May, San Francisco became the first major city in the world to effectively ban facial recognition. A week later, Congress heard how it also needed to ban the tech until new rules could be drawn up to cover its safe use.

That same week, Amazon shot down a stakeholder proposal to ban the sale of its facial recognition technology to law enforcement.

Read on...