US pharmacy Rite Aid banned from operating facial recognition systems
2023-12-22 | www.malwarebytes.com

Pharmacy chain Rite Aid has been banned from running facial recognition systems in its stores for five years by a Federal Trade Commission (FTC) ruling. The regulator found so many flaws in the retailer’s surveillance program that it concluded Rite Aid had failed to implement reasonable procedures and prevent harm to consumers in its use of facial recognition technology in hundreds of stores.

In May 2023, the FTC issued a warning that the increasing use of consumers’ biometric information and related technologies, including those powered by machine learning, raises significant consumer privacy and data security concerns, and the potential for bias and discrimination.

In a policy statement, the commission said:

“The agency is committed to combatting unfair or deceptive acts and practices related to the collection and use of consumers’ biometric information and the marketing and use of biometric information technologies.”

According to the FTC, Rite Aid deployed artificial intelligence-based facial recognition technology from 2012 to 2020 to identify customers who may have engaged in shoplifting or other problematic behavior.

The FTC found that Rite Aid deployed a massive, error-riddled surveillance program, provided by vendors that could not properly safeguard the personal data the chain hoarded. The company also failed to inform consumers that it was using the technology in its stores.

According to the complaint, Rite Aid contracted with two companies to help create a database of images of individuals considered persons of interest for engaging in, or attempting to engage in, criminal activity at one of its retail locations. The images came from CCTV cameras in the stores and from smartphone pictures taken by employees, and were stored in a database along with the individuals’ names and other information, such as any criminal background data.

Even though the system relied on low-quality images to identify these so-called persons of interest, the chain instructed staff to act on its matches and ask flagged customers to leave its stores. Employees, acting on false-positive alerts generated by the flawed system, followed consumers around the stores, searched them, ordered them to leave, called the police to confront or remove them, and publicly accused them of shoplifting or other wrongdoing.
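
For readers unfamiliar with how such watchlist systems typically work, here is a minimal sketch in Python. It is not Rite Aid’s actual system (the complaint does not describe the vendors’ internals), and every name, the embedding size, and the 0.6 threshold are illustrative assumptions. Each enrolled image is reduced to an embedding vector, and a frame from a store camera triggers an alert when its embedding is similar enough to an enrolled one, which is exactly where poor-quality enrollment photos and a loose threshold translate into false matches.

```python
import numpy as np

# Hypothetical enrollment database: one embedding per "person of interest",
# built from whatever image was on file (often a low-quality CCTV still or a
# phone snapshot). The vectors here are random placeholders for illustration.
enrollment_db = {
    "entry_0001": np.random.rand(128),
    "entry_0002": np.random.rand(128),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_alert(probe: np.ndarray, threshold: float = 0.6):
    """Return (entry_id, score) for the best match if it clears the threshold.

    Noisy embeddings from blurry enrollment photos push scores around, so a
    fixed threshold produces more false-positive alerts on innocent shoppers.
    """
    best_id, best_score = None, -1.0
    for entry_id, enrolled in enrollment_db.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = entry_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)
```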

According to the complaint, Rite Aid’s system falsely flagged numerous customers, including an 11-year-old girl whom employees searched based on a false-positive result. The FTC says that Rite Aid did nothing to prevent its customers from being falsely accused. In addition, the FTC says Rite Aid’s actions disproportionately impacted people of color.

But even if the system had been completely accurate, there were plenty of problems with the way it was deployed:

  • Without consent or warning, the system scanned everyone who came into certain stores and matched them against an internal list. Store patrons in plurality-Black, plurality-Asian, and plurality-Latino areas were more likely to be surveilled by Rite Aid’s facial recognition technology.
  • Rite Aid failed to test, assess, measure, document, or inquire about the accuracy of its facial recognition technology before deploying it (a basic check sketched in the code example after this list).
  • The use of low-quality images in connection with its facial recognition technology increased the likelihood of false-positive match alerts.
  • It failed to monitor or test the accuracy of the technology after deployment.
  • Employees were not trained to understand the possibility of false positives, and no process was put in place for them to report the false positives they encountered.
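
The kind of pre-deployment testing the FTC faulted Rite Aid for skipping can be sketched in a few lines: run the matcher over a labelled set of probe images and measure how often people who are not enrolled still trigger an alert. Again, this is an illustrative sketch, not anything from the complaint; match_alert refers to the hypothetical function shown earlier.

```python
from typing import Callable, Iterable, Tuple
import numpy as np

def false_positive_rate(
    matcher: Callable[[np.ndarray], Tuple[object, float]],
    probes: Iterable[Tuple[np.ndarray, bool]],  # (embedding, is_actually_enrolled)
) -> float:
    """Fraction of non-enrolled probes that nevertheless trigger an alert."""
    false_alerts = 0
    non_enrolled = 0
    for embedding, is_enrolled in probes:
        if is_enrolled:
            continue  # only non-enrolled people can generate a false positive
        non_enrolled += 1
        entry_id, _score = matcher(embedding)
        if entry_id is not None:
            false_alerts += 1
    return false_alerts / non_enrolled if non_enrolled else 0.0
```

Tracking the same metric on alerts that staff confirm or reject after rollout would have addressed the post-deployment monitoring failure as well.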

This is not the first time Rite Aid and the FTC have clashed. In 2010, Rite Aid settled FTC charges that it had failed to protect the sensitive financial and medical information of its customers and employees, in violation of federal law.

Rite Aid also violated that 2010 data security order. In addition to the ban and the required safeguards for any automated biometric security or surveillance systems, the FTC requires the company to:

  • Delete, and direct third parties to delete, any images or photos they collected.
  • Notify consumers when their biometric information is used.
  • Investigate and respond in writing to consumer complaints about actions taken against consumers related to an automated biometric security or surveillance system.
  • Provide clear and conspicuous notice to consumers about the use of facial recognition or other biometric surveillance technology in its stores.
  • Delete any biometric information it collects within five years.
  • Implement a data security program to protect and secure personal information.
  • Obtain independent third-party assessments of its information security program.
  • Provide the Commission with an annual certification from its CEO documenting Rite Aid’s adherence to the order’s provisions.

While the FTC ruling highlights Rite Aid’s wrongdoings, it also acknowledges that there are many problems with facial recognition in general. Because of the privacy implications, some tech giants have backed away from the technology or halted development.

People should at least be informed about when and why facial recognition technology is used, so they can decide for themselves whether they want to participate.




Source: https://www.malwarebytes.com/blog/news/2023/12/us-pharmacy-rite-aid-banned-from-operating-facial-recognition-systems