Everybody’s favourite big tech giant, Amazon, is facing yet another class-action lawsuit, this time for allegedly deploying biometric recognition technologies to monitor Amazon Go customers in its New York City outlets without their knowledge. According to the lawsuit, Amazon violated a 2021 NYC law which mandates that all business establishments that track their customers’ biometric information, including retail stores, must at least inform their customers that they are doing so. Amazon apparently didn’t.
“The lawsuit was filed in the U.S. District Court for the Southern District of New York on behalf of Brooklyn resident Alfredo Rodriguez Perez and a proposed class of tens of thousands of Amazon Go customers,” says the privacy advocacy group Stop Surveillance Technology Oversight Project. “The complaint claims that from January 2022 to March 13, 2023 Amazon failed to post any sign stating that Amazon Go stores collect biometric data, including for over a month after Mr. Perez told Amazon it violated New York City law by failing to do so.”
Amazon opened its first Go stores in New York in 2019 and now has ten stores in the city, all in Manhattan. The stores operate on the premise that customers can walk in, take whatever products they want off the shelves and leave without checking out. The company monitors visitors’ actions and charges their accounts when they leave the store. It is the epitome of tech-enabled convenience, but it seems that not all New Yorkers are willing to pay the price by trading in their most personal data.
Amazon only recently erected signs informing New York customers of its use of biometric recognition technology, more than a year after the disclosure law went into effect, claims the lawsuit. The company has also allegedly begun posting signs claiming that Amazon only harvests biometric data from customers who opt into the company’s palm scanner program. However, the plaintiffs in the lawsuit claim that the company was collecting biometric data on all customers, such as their body shape and size, including those who refuse to use the palm scanner.
“New Yorkers shouldn’t have to worry that we’ll have our biometric data secretly tracked anytime we want to buy a bag of chips,” said Surveillance Technology Oversight Project Executive Director Albert Fox Cahn. “Taking our data without notice isn’t convenient, it’s creepy. We have a right to know when our biometric data is used, and it’s appalling that one of the world’s largest companies could so flagrantly disregard the law. It’s stunning to think just how many New Yorkers’ data has already been compromised, and without them ever knowing it.”
New York is one of a small but growing handful of US cities that have passed biometric laws over the past couple of years. So far, three states — Texas, Washington and Illinois — have passed standalone biometrics laws, though many more are expected to follow suit this year. In Illinois alone, more than 1,000 class action lawsuits have been filed under the state’s Biometric Information Privacy Act (BIPA). The public is increasingly attuned to biometric privacy risks and, as a result, litigation costs are growing for companies, notes the Cybersecurity Law Report:
BIPA applies to companies that collect, capture, purchase, obtain, disclose, or disseminate “biometric identifiers,” defined as “a retina or iris scan, fingerprint, voiceprint, or a scan of hand or face geometry,” or “biometric information,” defined as any “information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual.”
Companies subject to the law must:
• have a publicly available written biometrics policy;
• obtain an individual’s written consent prior to collection; and
• otherwise comply with the statutory restrictions on biometric use, sale and disclosure.
The risks of non-compliance are steep: BIPA permits actual damages or liquidated damages of $1,000 for each negligent violation and $5,000 for each reckless or intentional violation, plus attorneys’ fees and costs and injunctive relief.
The pace of BIPA litigation and settlements has been relentless. Last year, Facebook settled a BIPA class action over its photo-tagging feature for $650 million, and TikTok settled for $92 million over face detection in videos. Microsoft, Google, IBM and others have not escaped BIPA suits either.
Meanwhile, in Europe…
On the other side of the Atlantic, the pushback against the growing use of biometric surveillance systems, by companies and governments alike, is also growing. In the UK, the unmanned store experience offered by Amazon has been such a flop that the company has had to begin opening stores with actual flesh-and-blood human beings serving customers. In 2021, Amazon reportedly had its sights set on opening 260 cashierless supermarkets across the UK by 2025. So far, it has opened just 20 Amazon Fresh locations in the country, all except one of them in London.
Also in the UK, the Information Commissioner’s Office (ICO) last month reprimanded the North Ayrshire Council for using facial recognition technology in secondary schools “in a manner that is likely to have infringed data protection law.” Why it took a year and a half for the ICO to reach this conclusion, despite a sustained public backlash against the move, is anyone’s guess.
In October 2021, nine schools in the Scottish region of North Ayrshire started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US), until a public outcry put paid to the pilot scheme. Yet as I reported months later, rather than shelving the idea, the Tory government actually doubled down:
According to a new report in the Daily Mail, almost 70 schools have signed up for a system that scans children’s faces to take contactless payments for canteen lunches while others are reportedly planning to use the controversial technology to monitor children in exam rooms. This time round, however, the government didn’t even bother informing the UK Biometrics and Surveillance Camera Commissioner Fraser Sampson of the plans being drafted by the Department for Education (DfE).
In Belgium, a petition filed with the country’s Parliament and supported by organizations such as the Belgium Human Rights League warns that “Facial recognition threatens our freedoms.” The petition calls for a ban on facial recognition in public places as well as its use by authorities in identifying people.
The use of this technology on our streets would make us permanently identifiable and monitored. This amounts to giving the authorities the power to identify the entirety of the population in the public space, which constitutes an invasion of privacy and of citizens’ right to anonymity.
The petition warns that facial recognition will also harm marginalized groups by facilitating yet more systemic discrimination and bias. At the same time, data breaches and leaks risk exposing citizens’ most private data.
EU Society “Not Ready for Facial Recognition”
The EU may have set the global standard for data protection, but it too is pushing the ethical boundaries when it comes to collecting and storing citizens’ biometric data. It is in the process of building one of the largest facial recognition systems on planet Earth as part of plans to modernize policing across the 27-member bloc. This data could end up being shared with the US. As I reported last July, the US is planning to trade its citizens’ biometric data — some of it collected without consent — for the biometric data harvested by its “partner” governments in Europe and beyond.
Privacy advocates have called for an outright ban on biometric surveillance technologies due to the threat they pose to civil liberties. They include Wojtek Wiewiorowski, who leads the EU’s in-house data protection agency, the EDPS, which is supposed to ensure the EU is complying with its own strict privacy rules. In November 2021, Wiewiorowski warned that European society is not ready for facial recognition technology: the use of the technology, he said, would “turn society, turn our citizens, turn the places we live, into places where we are permanently recognizable … I’m not sure if we are really as a society ready for that.”
In a more recent interview, with EUobserver, the EU’s data protection supervisor voiced concerns that the EU is trampling on its own values and people’s rights to privacy and data protection as it expands its data dragnet in areas such as migration and law enforcement.
In France, Emmanuel Macron’s broadly detested government is pushing for the introduction of AI-empowered surveillance systems for the 2024 Paris Olympics. During the bill’s passage through the Senate’s plenary session in January, an amendment that would have added facial recognition was rejected by the Senate’s law committee. That is the good news. However, Amnesty International warned this week that if the proposed bill is approved, it will legalize the use of a pervasive AI-powered mass video surveillance system for the first time in the history of France — and the European Union:
Read the rest here: