New Facial Recognition Technology Identifies 96% of Travelers Wearing Masks

Facial recognition also works when people are wearing masks

A recent report from the US Department of Homeland Security Science and Technology Directorate states that new facial recognition technology can perform passenger identification at airports with 96% accuracy even while passengers wear masks. For passengers not wearing masks, the technology hits nearly 100% accuracy. The information was presented at the 2020 Biometric Rally, held at a DHS-affiliated testing lab.

There’s no question that facial recognition technology has improved since the start of the pandemic. Pre-COVID-19, most facial recognition systems had failure rates between 20% and 50%. The new, more accurate system was developed out of the need to identify passengers without asking them to remove their masks, since unmasking could put airline and security workers at risk.

With facial recognition technology that provides accurate results 96% of the time, airline passengers will not have to remove their masks in public.

Facial recognition algorithms already reach accuracy as high as 96%

Is Facial Recognition Technology the Best Option?

While fingerprint biometrics are another viable option, facial recognition is ultimately the better choice in the age of COVID-19. Thousands of people pressing their fingers onto the same scanner could greatly increase disease transmission. Iris recognition is also a viable biometric identifier, but it is difficult to deploy at scale. That makes facial recognition technology the easiest, safest option during COVID-19.

Even once COVID-19 vaccines roll out on a large scale, fear of a new pandemic will likely keep facial recognition technology the most popular option.

Biases of Facial Recognition Technology

Facial recognition tech is not without flaws, however. In the summer of 2020, Black Lives Matter activists criticized the technology’s racial bias. In December, a Black man sued the New Jersey police after facial recognition software falsely identified him as a suspect in a crime.

A 2019 report from the National Institute of Standards and Technology (NIST) showed higher rates of false positives not only for Black faces but also for Asian and Indigenous faces. Because of these biases, several states have limited or banned the use of facial recognition technology by police.
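
As a rough illustration of the metric behind these findings, the sketch below shows how a false positive (false match) rate might be computed separately for each demographic group from labeled comparison results. The function, data layout, and numbers are hypothetical examples, not drawn from the NIST or DHS studies.

from collections import defaultdict

def false_match_rate_by_group(comparisons):
    """Return the false match rate (false positives / impostor comparisons)
    for each demographic group.

    comparisons is an iterable of (group, is_same_person, matched) tuples,
    where matched is the algorithm's accept/reject decision.
    """
    impostor_trials = defaultdict(int)  # comparisons of two different people
    false_matches = defaultdict(int)    # impostor pairs the algorithm accepted

    for group, is_same_person, matched in comparisons:
        if not is_same_person:  # only impostor pairs can produce false positives
            impostor_trials[group] += 1
            if matched:
                false_matches[group] += 1

    return {g: false_matches[g] / impostor_trials[g] for g in impostor_trials}

# Hypothetical data: a noticeably higher rate for one group signals demographic bias.
sample = [
    ("group_a", False, True), ("group_a", False, False), ("group_a", False, False),
    ("group_b", False, True), ("group_b", False, True), ("group_b", False, False),
]
print(false_match_rate_by_group(sample))  # {'group_a': 0.333..., 'group_b': 0.666...}

A large gap between groups on this metric is exactly the kind of disparity the NIST report documented.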

The recent study presented at the Biometric Rally used 582 participants from 60 countries to ensure that the results would not be racially or ethnically biased.