Multiple civil rights groups banded together this week to demand an end to the use of facial recognition tools by large retailers.
According to advocacy group Fight for the Future, companies like Apple, Macy's, Albertsons, Lowe's and Ace Hardware use facial recognition software in their stores to identify shoplifters. The group maintains a scorecard of retailers, updated according to whether each company currently uses facial recognition, might use it in the future, or has committed to never using it.
Stores like Walmart, Kroger, Home Depot, Target, Costco, CVS, Dollar Tree and Verizon have all committed to never using facial recognition in their stores in statements to Fight for the Future.
Walgreens, McDonald's, 7-Eleven, Best Buy, Publix, Aldi, Dollar General, Kohl's, Starbucks, Shoprite and Ross are just a few of the companies that Fight for the Future believes may use facial recognition software in the future.
But it isn't just major retailers deploying facial recognition software. Backlash to private use of facial recognition came to a head on Wednesday when a skating rink in Livonia, Michigan, was accused of banning a Black teenager after its facial recognition software mistakenly implicated her in a brawl.
Lamya Robinson told Fox 2 that after her mom dropped her off at the skating rink last Saturday, security guards refused to let her inside, claiming her face had been scanned and the system indicated she was banned after starting a fight in March.
"I was so confused because I've never been there," Lamya told the local news outlet. "I was like, that is not me. who is that?"
Lamya's mother Juliea Robinson called it "basically racial profiling."
"You're just saying every young Black, brown girl with glasses fits the profile and that's not right," Robinson added.
The skating rink refused to back down in a statement to the local news outlet, claiming its software had returned a "97 percent match."
"This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of. If there was a mistake, we apologize for that," the statement said.
Caitlin Seeley George, campaign director at Fight for the Future, told ZDNet that Lamya's situation was "exactly why we think facial recognition should be banned in public places."
"This girl should not have been singled out, excluded from hanging out with her friends, and kicked out of a public place. It's also not hard to imagine what could have happened if police were called to the scene and how they might have acted on this false information," Seeley George said.
"We've seen time and again how this technology is being used in ways that discriminate against Black and brown people, and it needs to stop. Local lawmakers in Portland enacted an ordinance that bans use of facial recognition in places of public accommodation like restaurants, retail stores, and yes, skating rinks. We're calling for Congress to enact such a ban at the federal level as well."
The situation occurred after Robert Williams, another Black Michigan resident arrested based on a mistake by facial recognition software, testified before Congress this week.
Williams came forward in June 2020 as one of the first people to confirm having been arrested based on faulty facial recognition software used by police. He filed a lawsuit against the Detroit Police Department with the ACLU after he was arrested in the front yard of his home as his children watched, all based on a facial recognition match that implicated him in a robbery.
Police had run a security camera photo through their database, and Williams' driver's license photo was returned as a match. After 16 hours in holding, he was shown the photo that led to the match and held it up to his face, prompting one officer to say "the computer must have gotten it wrong."
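For readers unfamiliar with how these one-to-many searches typically work, the sketch below is a minimal, hypothetical illustration of the general technique: a probe image's embedding is compared against a gallery of enrolled faces, and the closest candidate above a similarity threshold is reported as a "match." The embeddings, names and threshold here are invented for illustration and do not describe the actual systems used by Detroit police, retailers or the skating rink; the point is only that a high similarity score surfaces the nearest candidate in the database, not a confirmed identity.

```python
# Illustrative sketch (not any vendor's actual system): a one-to-many face
# search that compares a probe image's embedding against a gallery of
# enrolled faces and reports the closest match above a threshold. A high
# similarity score is not proof of identity; whoever happens to be nearest
# in embedding space gets returned as the "match".
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 128-dimensional embeddings standing in for the output of a
# face-encoding model; a real system would compute these from images.
gallery = {
    "license_photo_A": rng.normal(size=128),
    "license_photo_B": rng.normal(size=128),
    "license_photo_C": rng.normal(size=128),
}
probe = rng.normal(size=128)  # embedding of the security-camera frame

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

MATCH_THRESHOLD = 0.6  # arbitrary example value, not a vendor's setting

# Score every enrolled face and pick the single best candidate.
scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
best_name, best_score = max(scores.items(), key=lambda kv: kv[1])

if best_score >= MATCH_THRESHOLD:
    print(f"Reported match: {best_name} (similarity {best_score:.2f})")
else:
    print("No candidate above threshold")
```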
"Detroiters know what it feels like to be watched, to be followed around by surveillance cameras using facial recognition," said Tawana Petty, national organizing director at Data for Black Lives.
"In Detroit, we suffer under Project Green Light, a mass surveillance program that utilizes more than 2000 flashing green surveillance cameras at over 700 businesses, including medical facilities, public housing and eating establishments," Petty added, noting that the cameras using facial recognition are monitored at real-time crime centers, police precincts and on officers' mobile devices 24/7.
She said in a statement that it is difficult to explain the psychological toll it takes on a community to know that every move is being monitored "by a racially-biased algorithm with the power to yank your freedom away from you."
"We must ban facial recognition from stores and get this invasive technology out of every aspect of our lives," Petty said.
EFF senior staff attorney Adam Schwartz told ZDNet that facial recognition use is growing among retailers and that the racial implications of stores keeping databases of "potential" shoplifters are particularly fraught given the privacy concerns involved.
But he disagreed with Fight for the Future's stance, arguing that instead of banning its use by private organizations, there should be opt-in consent requirements that would stop stores from indiscriminately scanning every face that walks in.
He noted the need for innovation and pointed to some positive uses of facial recognition across society, including the iPhone feature that lets you unlock your phone with your face.
Ahmer Inam, chief AI officer at Pactera EDGE, said much of the backlash toward retail use of facial recognition is because companies have not been transparent about how they're using it.
"Using a mindful AI approach, a powerful tool like facial recognition can yield tremendous benefits for the consumer -- as well as the retailer. But values such as privacy, transparency, and ethical-use have to be top-of-mind during the build. It's something we've seen work effectively for our facial recognition and other AI projects," Inam said.
"The biggest challenge facial recognition 'faces' right now is model bias that results in false positives. For retailer's, it isn't just about building a facial recognition-based system -- but to what purpose and intention."
Inam listed multiple examples of facial recognition being used to improve the retail experience, such as CaliBurger, which rolled out kiosks that use facial recognition to connect orders to customers.
But Seeley George said companies are adopting facial recognition in the name of "convenience" and "personalization" while ignoring how the technology abuses people's rights and puts them in danger.
"The stores that are using or are considering using facial recognition should pay attention to this call from dozens of leading civil rights and racial justice organizations who represent millions of people," Seeley George said.
"Retailers should commit to not using facial recognition in their stores so we can champion their decision, or be prepared for an onslaught of opposition."