Rite Aid has been hit with a five-year ban on the use of AI facial recognition technologies after the Federal Trade Commission (FTC) said the retail pharmacy chain “failed to implement reasonable procedures and prevent harm to consumers” across hundreds of stores from 2012 to 2020.
The company allegedly used the technology to identify customers who may have been shoplifting or otherwise misbehaving, according to the FTC’s late Tuesday announcement and a complaint filed by the regulator in federal court.
Rite Aid used the technology to build a database of “persons of interest,” pairing images with other identifying information. The system, however, “generated thousands of false-positive matches” that led store employees to “erroneously” follow and accuse customers — disproportionately women and people of color — of misbehavior.
“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said in the announcement. “Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices.”
In a statement, Rite Aid said it is “pleased” with the arrangement, but “fundamentally” disagrees with the allegations in the complaint.
“The allegations relate to a facial recognition technology pilot program the Company deployed in a limited number of stores,” the company said. “Rite Aid stopped using the technology in this small group of stores more than three years ago before the FTC’s investigation regarding the Company’s use of the technology began.”
The proposed court order prohibits Rite Aid from deploying or using any facial recognition or analysis system for five years, either directly or through an intermediary. It also requires the company to delete all of the photos, videos, data, models and algorithms created as part of the effort, and to instruct third-party partners to do the same.
Further, Rite Aid will be required to implement more comprehensive safeguards to protect consumers when using similar technologies in the future. To settle charges that it violated a 2010 data security order previously handed down by the FTC, the pharmacy chain will also be required to implement “a robust information security program, which must be overseen by the company’s top executives,” the commission said.
The agency alleged that the company failed to confirm the accuracy of the facial recognition technology before deployment, to regularly test its accuracy after deployment, or to adequately train employees in its use, and that it relied on “tens of thousands” of low-quality images pulled from security cameras, phone cameras and news stories.
Additionally, the FTC said that Rite Aid did not inform consumers that it was using the technology and discouraged employees from disclosing it. The proposed order requires the company to notify consumers of any such use.
Because Rite Aid is currently going through bankruptcy proceedings, the proposed order will require approval from the bankruptcy court as well as the federal district court in which it was filed.
“Looking ahead, we are focused on the important actions underway to strengthen our financial position as we continue providing leading healthcare products and services to the nearly one million customers that we serve daily,” the company said.
Earlier this year the FTC issued a broad policy statement outlining its stricter position on the use of consumers’ biometric information. The practice, sometimes paired with machine learning, “raises significant consumer privacy and data security concerns and the potential for bias and discrimination,” the FTC said in May.