Report Confirms Deep Flaws Of Automated Facial Recognition Software In The UK, Warns Its Use In The US Is Spreading
from the mind-the-step-change dept
Techdirt has written many stories about facial recognition systems. But there’s a step-change taking place in this area at the moment. The authorities are moving from comparing single images with database holdings, to completely automated scanning of crowds to obtain and analyze huge numbers of facial images in real time. Recently, Tim Cushing described the ridiculously high level of false positives South Wales Police had encountered during its use of automated facial recognition software. Before that, a post noted a similarly unacceptable failure rate of automated systems used by the Metropolitan Police in London last year.
Now Big Brother Watch has produced a report bringing together everything we know about the use by UK police of automated facial recognition software (pdf), and its deep flaws. The report supplements that information with analyses of the legal and human rights framework for such systems, and points out that facial recognition algorithms often disproportionately misidentify minority ethnic groups and women.
The UK situation is fairly well known. There’s been less coverage of automated facial recognition systems in the US, but the Big Brother Watch report offers some comments from experts about what is happening there. For example, Clare Garvie from the Georgetown Law Center on Privacy and Technology writes:
Face recognition surveillance — identifying people in real-time from live video feeds — risks being an imminent reality for many Americans. Are we comfortable with a society where face recognition allows police to identify anyone with a driver’s license, without suspicion or consent? Are we comfortable with a society where the government can find anyone, at any time, by continuously scanning the faces of people on the sidewalk? Face recognition fundamentally changes the nature of privacy in public spaces. As government agencies themselves have cautioned, face recognition surveillance ‘has the potential to make people feel extremely uncomfortable, cause people to alter their behaviour, and lead to self-censorship and inhibition,’ chilling the exercise of the rights protected under the First Amendment and calling into question the scope of protections offered by the Fourth Amendment.
Alongside its report, Big Brother Watch has launched the “Face Off” campaign, which calls for UK public authorities to stop using automated facial recognition software with surveillance cameras, and to remove the thousands of images of unconvicted individuals from the UK’s Police National Database. Given the UK authorities’ world-famous love of CCTV and surveillance, it’s unlikely they will take much notice.