The US Federal Trade Commission (FTC) has been ramping up scrutiny of Big Tech companies ever since Chair Lina Khan – an avowed critic of Big Tech – arrived in June 2021. To date, the FTC has struggled to find ways to combat the rising influence of the technology industry using its limited set of enforcement tools. However, a recent settlement with WW International, formerly known as Weight Watchers (quoted on the Nasdaq under the ticker WW), signals that the FTC has found a new tool to target what it views as deceptive digital data practices – and it is willing to use it.
On March 4th, the FTC announced it had reached a settlement with WW International and its subsidiary Kurbo – a weight-loss app geared toward children and families that collects personal information such as weight, physical activity and daily eating habits. The FTC alleged the company violated US children’s privacy laws in connection with WW’s advertising and marketing practices. As a penalty, the FTC required Kurbo and WW International to pay a $1.5 million fine and, importantly, to destroy any algorithms derived from data collected from users aged thirteen and under without verifiable parental consent. The WW settlement marks the third time in less than three years that the FTC has required a company to destroy algorithms built from data collected through methods the FTC viewed as deceptive – the first was in 2019 against Cambridge Analytica, which famously gathered Facebook user data without consumers’ permission; the second was in 2021 against Everalbum, a photo-sharing app that used customer data to develop facial recognition algorithms.
The case against WW International is quite different from the prior two and marks an important shift in the FTC’s strategy. In its settlements with Cambridge Analytica and Everalbum, the FTC charged the companies with violating the FTC Act. In the case against Kurbo, the FTC alleged that WW International violated both the FTC Act and the Children’s Online Privacy Protection Act (COPPA). Here, the FTC went beyond addressing the alleged COPPA violation and sought to penalise WW International’s secondary use of the data. This could mean that any organisation collecting data in a manner ruled illegal under COPPA risks penalties against the algorithms built from the data in question. The WW International case also suggests that algorithm destruction could be used by the FTC as a penalty in any other areas where the agency has enforcement authority – including those areas given FTC purview in a future federal privacy law.
This new penalty, known as algorithmic disgorgement, could create headaches for companies that rely on algorithms to make money and differentiate their business models. Previously, commissioners at the FTC have allowed companies that violate data protection laws to keep the algorithms and models built from the data subject to the settlement. But privacy advocates have held the FTC’s feet to the fire, arguing the regulator should force tech companies to consider their data collection methods more carefully.
When the FTC settled with Cambridge Analytica in 2019, it was not clear if the agency would use algorithmic disgorgement as a regular tool to penalise companies. However, the direction of travel now points clearly toward this practice. In 2020, the FTC issued guidance outlining best practices around the use of AI technology in creating algorithmic models. In June 2021, Khan was confirmed as Chair of the FTC and has acted quickly to advance the administration’s antitrust priorities, with a particular focus on promoting economic and social justice through consumer protection and competition policy. At the end of last year, the FTC issued a notice that it was considering a new rulemaking process to “ensure that algorithmic decision-making does not result in unlawful discrimination.” These actions suggest that algorithmic disgorgement may become a more common enforcement mechanism within the agency.
While the FTC has indicated it is serious about targeting Big Tech, ensuring that companies follow through with deleting their data and algorithms can be tricky. The FTC’s order against WW International provides little detail on how the company should take an algorithm out of production in practice, and which sets of data need to go. Many companies do not even have processes in place to track all the downstream applications that may have used improperly gained data. As a result, enforcing compliance with such orders could be difficult. Even acknowledging these challenges, as an old saying goes, two is a coincidence, but three is a trend. The recent WW International settlement signals a new focus by the FTC on targeting Big Tech’s lucrative practices through algorithm destruction.