London Metropolitan Police to “consider” Live Facial Recognition ethics report as rights groups decry technology as illegal

A photo used by the London Metropolitan Police to publicize its trial of Live Facial Recognition technology in the British capital. HANDOUT

London — London’s Metropolitan Police force has said it will “carefully consider the contents” of an ethics panel’s report before deciding on any continued use of Live Facial Recognition (LFR) technology in public places. The controversial technology was trialled across London for several years, and the London Policing Ethics Panel — a group commissioned by the London Mayor’s office, independent of the police — issued its report on those trials on Wednesday.

The Ethics Panel’s conclusions, based on consultations with members of the public, were that most people don’t have a big problem with their faces being scanned in public places and cross-checked against lists of suspected criminals — but there are caveats.

According to the report, “57% of respondents thought that in general terms, police use of LFR was acceptable. We will see, however, that the purposes for which it is used makes a significant difference to people’s support for its deployment.”


So, too, do the race and age of the person being asked. The report found that while more than 60% of white and mixed-race individuals said it was acceptable for police to use LFR in public places, only about 40% of black and Asian respondents agreed.


The Met stressed that in all trial uses of LFR in London, the technology was “used overtly. Information leaflets were handed to members of the public, posters were placed in the area and officers engaged with members of the public to explain the process and technology.”

“In the final LFR trials, the watch-list only contained images of individuals wanted by the Met and the courts for violent related offences,” the police said. It was not clear what crimes people flagged in the software in earlier trials were accused of, but rights activists have expressed concern that facial recognition has been used in Britain to find people accused of petty offenses including shoplifting.

According to the police, the recent trials “led to a number of arrests based on positive identifications.”

“No legal basis”

“There is no legal basis for them to be using facial recognition in the first place,” argues Silkie Carlo, director of the privacy advocacy group Big Brother Watch, which filed a legal challenge to block the police’s use of the technology last year.

According to the organization, scanning the face of every passing member of the public is “a very clear violation” of the U.K. Human Rights Act, which became law in 1998.

Article 8 of that law states: “Everyone has the right to respect for his private and family life, his home and his correspondence. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”


“When you’re talking about a really significant expansion of police powers — and one that affects people’s rights… in those circumstances it absolutely would be the convention to have at least a legal consideration, and in our view a legal basis for them to be able to do that,” Carlo told CBS News.

Big Brother Watch said in an earlier statement that the Met’s use of LFR wrongly identified individuals “over 90% of the time, with many of those being wrongly stopped and subject to further police intrusion.”

It was not clear when or whether the Met would deploy LFR technology again, given its vow to consider suggestions put forth by the Policing Ethics Panel, and the pending legal challenges over its use.
