The world’s largest video-sharing site agreed to pay a record fine for a children’s privacy case: $136 million to the U.S. Federal Trade Commission and $34 million to New York State for collecting data on kids under the age of 13 without parental consent, the FTC said.
Starting in four months, Google also will limit data collection and turn off commenting on videos aimed at kids, YouTube announced at the same time, moves that will hamstring its ability to sell advertisements against a massive portion of its media library.
The settlement under the 1998 Children’s Online Privacy Protection Act, or COPPA, represents the most significant U.S. enforcement action against a big technology company in at least five years over its practices involving minors. Washington is stepping up privacy and antitrust scrutiny of the big internet platforms that have largely operated with few regulatory constraints.
“The $170 million total monetary judgment is almost 30 times higher than the largest civil penalty previously imposed under COPPA,” FTC Chairman Joe Simons said in a joint statement with fellow Republican Commissioner Christine Wilson. “This significant judgment will get the attention of platforms, content providers, and the public.”
The commission’s two Democrats broke from its three Republicans, however, saying the settlement did not go far enough to fix the problems. Consumer groups and lawmakers from both sides of the aisle on Wednesday slammed the fine as an insufficient deterrent, given the size of the company.
“It’s extremely disappointing that the FTC isn’t requiring more substantive changes or doing more to hold Google accountable for harming children through years of illegal data collection,” said Josh Golin, executive director of Campaign for a Commercial-Free Childhood, which helped organize the complaints that prompted the settlement.
In a statement, Golin did praise a likely decrease in targeted ads aimed at kids.
Google’s shares rose 1.1% in New York.
YouTube said it will rely on both machine learning and video creators themselves to identify what content is aimed at children. The algorithms will look at cues such as kids’ characters and toys, although the identification of youth content can be tricky. Content creators are being given four months to adjust before changes take effect, the company said.
The company will also spend more to promote its kids app and establish a $100 million fund, disbursed over three years, “dedicated to the creation of thoughtful, original children’s content,” Chief Executive Officer Susan Wojcicki wrote in a blog post.
“Today’s changes will allow us to better protect kids and families on YouTube,” Wojcicki wrote in the post, which acknowledged the growing likelihood that children are watching the site unsupervised. “In the coming months, we’ll share details on how we’re rethinking our overall approach to kids and families, including a dedicated kids experience on YouTube,” she said.
YouTube has already begun plans to strip videos aimed at kids of “targeted” ads, which rely on information such as web-browsing cookies, Bloomberg has reported.
The FTC alleged that the company violated COPPA by collecting data to serve these ads. Some consumer advocates, including Golin and the Center for Digital Democracy, say the move away from targeted ads would do little to stop the tracking of kids when they watch content aimed at general audiences, and that relying on video creators to make the changes could undermine compliance.