When MacKenzie Fegan was boarding her morning flight to Mexico City last Wednesday, she noticed something odd at her gate at JFK airport in New York City.
Instead of a JetBlue employee scanning her boarding pass or taking a look at her passport, she – and other passengers at the gate – was directed to look into a camera before being allowed down the jet bridge.
“As people were lining up I noticed they were channeling people into different lanes… and people were being directed to look at a camera, which took their photo and then the gate would open,” Fegan told Threatpost. “I thought that was unusual. There also wasn’t any sign posted that I saw, it wasn’t explained to me what was happening.”
Unbeknownst to Fegan, who asked JetBlue about the incident in a now-viral Twitter exchange, the camera is part of a facial-recognition program rolled out at JFK in November. The program, dubbed “Biometric Exit,” was first introduced by U.S. Customs and Border Protection (CBP) in 2015; it scans passengers’ faces and matches them with photos that the government has on file. The program is currently operational in 17 locations.
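CBP has not published the details of its matching algorithm, but the general shape of a face match is well understood: a photo taken at the gate is converted into a numerical embedding and compared against embeddings derived from photos already on file. The sketch below is a minimal, hypothetical illustration of that 1:N comparison using cosine similarity; the threshold, identifiers and stand-in embeddings are assumptions, not CBP’s actual system.

```python
# A minimal, hypothetical sketch of a 1:N face match, assuming face photos have
# already been converted to fixed-length embedding vectors by some model.
# CBP has not published its algorithm; the threshold, identifiers and random
# stand-in embeddings below are illustrative only.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_probe(probe, gallery, threshold=0.6):
    """Compare a gate-camera embedding against a gallery of stored embeddings.

    Returns (identity, score) for the best match, or (None, score) if the best
    score falls below the decision threshold.
    """
    best_id, best_score = None, -1.0
    for identity, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)


# Random stand-in embeddings; a real system would derive these from a trained
# face-embedding model applied to passport, visa or prior-encounter photos.
rng = np.random.default_rng(0)
gallery = {
    "passport_photo_on_file": rng.normal(size=128),
    "visa_photo_on_file": rng.normal(size=128),
}
probe = gallery["passport_photo_on_file"] + rng.normal(scale=0.1, size=128)
print(match_probe(probe, gallery))
```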
Facial recognition is also already actively used by police forces and even at the White House. And it’s not just the U.S.; biometrics are spreading worldwide. The EU last week approved a massive biometrics database that combines data from law enforcement, border patrol and more for both EU and non-EU citizens.
While facial recognition has its advantages – including more efficient, faster identification – the explosion of real-world biometrics applications has Fegan and others scratching their heads over deep-rooted privacy and security concerns, and asking important questions.
Those questions include: Where is this personal data being collected, and how is it stored? Are users given a proper heads-up that this is happening? Can users opt out – and what happens to biometrics data that has already been collected and stored?
‘You Can’t Change Your Face’
Consent is a special type of privacy concern when it comes to facial recognition, especially because in many cases people’s biometric information has already been collected. So consent really comes down to how that information may be used.
For Fegan, that was her biggest issue with facial-recognition technology at the airport: “[When it comes to biometrics], the train has left the station – the tech is already there and has been implemented,” she told Threatpost. “To me, it’s more an issue of informed consent.”
While Fegan said she didn’t see any notification about facial-recognition technology (though she acknowledged that one may have been posted), a CBP spokesperson told Threatpost that when it comes to the programs being used in airports, U.S. citizens should be able to opt out.
“U.S. citizens who do not wish to participate in this biometric collection should notify a CBP officer, or an airline or airport representative, in order to seek an alternative means of verifying their identity and documents,” said the spokesperson. “CBP discards all photos of U.S. citizens once their identities have been verified.”
But Adam Schwartz, senior staff attorney with the Electronic Frontier Foundation’s civil liberties team, told Threatpost that there’s an important distinction to be made between the ability to opt out and active consent.
“Biometrics raise special privacy concerns,” said Schwartz. “They’re easy to capture, and once biometrics are taken, there’s nothing you can do about it. You can’t change your face. A very weak form of protection is if the government or a business [that uses biometrics for surveillance] notifies people. We think this is not consent – real consent is where they don’t aim a camera at you.”
Consent came to the forefront in December when the Department of Homeland Security unveiled a facial-recognition pilot program for monitoring public areas surrounding the White House. The department said that the public cannot opt out of the pilot, except by avoiding the areas that will be filmed as part of the program. That raised concerns in some corners.
“While this pilot program seems to be a relatively narrowly defined test that does not in itself pose a significant threat to privacy, it crosses an important line by opening the door to the mass, suspicion-less scrutiny of Americans on public sidewalks,” said Jay Stanley, senior policy analyst at the American Civil Liberties Union, in a post at the time.
Biometric Databases Are A ‘Wild West’
Another commonly discussed issue is how that highly personal biometric data is handled and maintained. The collection, storage and sharing of biometrics is a Wild West, as Schwartz puts it: each process is different, whether the data is being used by private companies (like grocery stores) or governments. And there’s little oversight.
Also, biometric information used for identification can be collected in the simplest, most mundane-seeming ways – from scraping public Facebook profile pictures to the mugshots taken by police departments or the license photos captured by the Department of Motor Vehicles.
Biometric data is so prevalent, in fact, that the European Parliament last week voted to approve the “Common Identity Repository,” which will connect the systems used by border control, migration and law-enforcement agencies.
In the case of the Biometric Exit program used by the CBP, traveler photos taken at the gate are compared with existing images that have been stored “in a secure environment” – including photographs taken during the entry inspection, photographs from U.S. passports and visas, and images “from previous DHS encounters,” the CBP spokesperson told Threatpost.
In terms of storage, after a photo has been taken, the CBP retains the photos of travelers for up to 12 hours after their identities have been verified, “for continuity of operations purposes,” the spokesperson said. Meanwhile, all other traveler photos are retained for up to 14 days “in secure CBP systems to support system audits, to evaluate the … facial-recognition technology and to ensure accuracy of the facial-recognition process.”
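As a rough illustration of those stated windows – up to 12 hours for travelers whose identities have been verified, up to 14 days otherwise – the short sketch below computes a purge deadline for a gate photo. The function and field names are hypothetical and do not reflect any actual CBP system.

```python
# Hedged illustration of the retention windows described above: photos of
# travelers whose identities have been verified are kept up to 12 hours, all
# other traveler photos up to 14 days. Names are hypothetical, not a CBP schema.
from datetime import datetime, timedelta, timezone


def purge_deadline(captured_at: datetime, identity_verified: bool) -> datetime:
    """Return the latest time a gate photo may be retained under the stated policy."""
    window = timedelta(hours=12) if identity_verified else timedelta(days=14)
    return captured_at + window


captured = datetime.now(timezone.utc)
print(purge_deadline(captured, identity_verified=True))   # within 12 hours
print(purge_deadline(captured, identity_verified=False))  # within 14 days
```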
“CBP takes its privacy obligations very seriously,” the CBP spokesperson told Threatpost. “CBP is fully committed to compliance with privacy laws and regulations, and to protecting travelers’ information and privacy… CBP does not allow its approved partners to retain the photos they collect under this process for their own business purposes.”
But concerns remain about the security of stored data – particularly after incidents like the 2015 Office of Personnel Management data breach, which resulted in the theft of fingerprint data belonging to 5.6 million people.
“More and more of our biometric information is being shared among various private and government actors and ending up in databases,” Schwartz told Threatpost. “Those involve tremendous risk because for one, thieves can steal the data; and two, employees can misuse the data.”
Curbing Facial Recognition
As more real-life applications for facial recognition take hold, calls for privacy regulation and other pushback have grown louder.
In May 2018, officials from the ACLU, EFF, Freedom of the Press Foundation, Human Rights Watch and others wrote a joint letter to Amazon CEO Jeff Bezos, expressing concerns that the company’s Rekognition biometrics platform could be used as a dangerous government surveillance tool.
There are also existing laws looking to rein in biometrics, such as an Illinois law that regulates the collection of biometric information (including for facial recognition). Meanwhile, a new bill introduced in the Senate in March, the “Commercial Facial Recognition Privacy Act,” would bar businesses that use facial recognition from harvesting and sharing user data without consent.
While biometrics certainly has advantages – such as making identification more efficient – Schwartz said there’s a fine line between these pros and more insidious use cases, such as taking photos of protesters in a crowd to build a case file of their political activities.
“We are concerned we’re heading into this Orwellian society where, in order to get around to the store or the workplace, there are cameras everywhere, capturing your face; and the government could know where everyone [is] going,” said Schwartz. “The human face could essentially be transformed into a tracking device. The only thing holding us back from that kind of future is regulation.”