Google said it wouldn’t target ads to kids anymore, but multiple reports suggest it’s still happening.
Google might be facing significant fines for violating children’s privacy through YouTube ads — again.
Two recent reports suggest that the company is collecting data from and targeting ads to children, a violation of both the Children’s Online Privacy Protection Act (COPPA) and Google’s consent decree with the Federal Trade Commission. The reports also come as Google, which owns YouTube, prepares to defend itself in a major antitrust lawsuit over its search engine and faces scrutiny from Democrats and Republicans alike, and as Congress considers child online safety bills. Simply put, this is not the best time for Google to face more accusations of wrongdoing, especially when the alleged victims are children.
The situation also shows how difficult child-specific online privacy and safety laws are for even the biggest companies in the world to comply with. Congress has a long track record of regulating the internet in the name of kids’ safety, but these laws require tech companies to take extra measures to determine which content and which users have to be treated differently to preserve kids’ privacy or meet the laws’ definition of safety. That may ultimately result in an internet where people of all ages have to confirm their identity and age just to use it. Online privacy and safety laws that apply to everyone, on the other hand, would avoid many of these problems entirely. But few lawmakers (though there are exceptions) are crusading to pass those.
The latest problem for Google is the “made for kids” designation it created as part of its settlement with the FTC in 2019, which was meant to prevent children from being targeted with ads on the platform or having their data collected for ad targeting purposes. Creators must designate their entire channel or individual videos on it as “made for kids,” in which case Google won’t collect any data on viewers for personalized advertising purposes, nor will it target ads to them. If they choose not to use the “made for kids” designation, the data collection and ad targeting happen as normal. But the creators have to choose something, one way or the other. (This is on regular YouTube, not YouTube Kids, which is a separate platform that doesn’t allow personalized advertising at all and has had its own issues.)
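For readers who want the mechanics spelled out, here is a minimal, purely hypothetical Python sketch of how a flag like that could gate data collection and ad personalization. Every name, ad, and piece of logic below is invented for illustration; it is not Google’s actual system.

```python
# A purely hypothetical sketch of how a "made for kids" flag could gate data
# collection and ad personalization. All names and ads are invented for
# illustration; this is not Google's actual implementation.
from dataclasses import dataclass, field

@dataclass
class Video:
    title: str
    topic: str
    made_for_kids: bool

@dataclass
class Viewer:
    interests: list = field(default_factory=list)  # built from tracked activity

CONTEXTUAL_ADS = {"toys": "ad for building blocks", "cooking": "ad for cookware"}
PERSONALIZED_ADS = {"hiking": "ad for trail boots", "gaming": "ad for a new console"}

def serve_ad(video: Video, viewer: Viewer) -> str:
    if video.made_for_kids:
        # No viewer data is read or stored; any ad can only be matched to the
        # video's own content ("contextual" advertising).
        return CONTEXTUAL_ADS.get(video.topic, "generic, untargeted ad")
    # Otherwise, viewing activity may be logged and used to personalize ads.
    viewer.interests.append(video.topic)
    for interest in viewer.interests:
        if interest in PERSONALIZED_ADS:
            return PERSONALIZED_ADS[interest]
    return "generic, untargeted ad"

viewer = Viewer()
print(serve_ad(Video("Learn shapes!", "toys", made_for_kids=True), viewer))   # contextual only
print(serve_ad(Video("Trail review", "hiking", made_for_kids=False), viewer)) # data logged, ad personalized
```

The dispute, in essence, is over whether the first branch of that flag behaves the way Google says it does.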
But a report released last week from Adalytics, a digital ad quality platform, indicates that Google isn’t keeping that promise. The firm found trackers that Google uses specifically for advertising purposes and what appear to be targeted ads on “made for kids” videos. Clicking on those ads often took viewers to outside websites that definitely did collect data on them, even if Google didn’t. The report is careful to say that the advertising cookies might not be used for personalized advertising — only Google knows that — and so may still be compliant with the law. And Adalytics says the report is not definitively saying that Google violated COPPA: “The study is meant to be viewed as a highly preliminary observational analysis of publicly available information and empirical data.”
Fairplay, a children’s online safety group, followed up on Wednesday with some research of its own, which it said confirmed some of Adalytics’ observations. The nonprofit placed an ad campaign on “made for kids” channels and asked to target the ads to various audience segments. It then received a report back from Google that appeared to show its ads had been served to viewers who fit those segments, indicating that targeted ads were shown on “made for kids” videos. That same day, Adalytics released a follow-up with additional evidence.
Google is denying any wrongdoing. But if these reports are right, Google may be looking at tens of billions of dollars in penalties. Unlike the $170 million fine it paid in 2019, this one could make a real dent.
Privacy groups, child online safety advocates, and lawmakers are demanding that the FTC investigate the allegations. Last week, Sens. Marsha Blackburn (R-TN) and Ed Markey (D-MA), for whom children’s privacy and online safety are major issues, sent a letter to the FTC over the Adalytics report.
“YouTube and Google cannot continue treating young people’s data as an unprotected commodity from which to profit with abandon,” the letter said.
Fairplay, along with the Center for Digital Democracy, Common Sense Media, and the Electronic Privacy Information Center, on Wednesday sent the FTC a similar letter. Fairplay included its own research that seemed to confirm Adalytics’ report.
“The ads on ‘made for kids’ videos are either behaviorally targeted, or Google is misleading advertisers about the efficacy of its ads in the Audience Segment Report,” the letter said.
In a blog post, Google claimed that Adalytics’ first report was “deeply flawed and uninformed,” and insisted to the New York Times that it doesn’t run personalized ads on “made for kids” videos. Any cookies or trackers that Adalytics found were there for allowed purposes and were not collecting data for advertising, the company said. But the post didn’t address some of the specific claims in the 200-page report.
“We write these really long reports because we want people to engage, replicate the research, and partake in honest dialogue,” Krzysztof Franaszek, founder of Adalytics, told Vox. “We would like to get to a point where these companies provide honest feedback about what we observe in their platforms supported by evidence, instead of just claiming the research is misleading or flawed.”
Google reiterated that denial in response to the Fairplay report, saying in a statement to Vox that it was a “fundamental misunderstanding of how advertising works on made for kids content.” The company added that YouTube disables ad matching based on user data on all “made for kids” videos, so the ads that do appear on those videos are only contextual. That is, they’re based on the content of the video itself rather than the inferred interests of the person watching it.
“We do not allow ads personalization on made for kids content and we do not allow advertisers to target children with ads across any of our products,” Google said. “We’ve reached out to Fairplay to clarify what they saw and share how our protections work.”
Fairplay’s executive director, Josh Golin, didn’t seem satisfied with Google’s response.
“It’s disappointing that Google continues to play shoot-the-messenger rather [than] engage with the substance of our findings,” Golin told Vox in an email. “If Google has the time and inclination to explain to Fairplay our ‘fundamental misunderstanding of how advertising works on made for kids content’ in private, why not publicly explain how we were able to target ads to user’s interests on ‘made for kids’ channels?”
The FTC wouldn’t comment on whether it has opened an investigation into the matter, and any investigation could take months or years to conclude. So it will be a while — if ever — before we know who’s telling the truth here. But Google has a lot to lose if it’s found to have violated COPPA again. That $170 million fine was laughably low considering the hundreds of billions of dollars Google pulls in every year. But the FTC doesn’t look kindly on repeat offenders. Just look at the $5 billion fine it gave Meta for its second privacy offense. Plus, COPPA calls for a fine of up to $50,120 per violation. Lots of kids watch YouTube, which means Google could be looking at a fine in the tens of billions.
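To see how the math could get that large, here is a rough back-of-the-envelope sketch. How violations are actually counted is up to the FTC and the courts, and the audience figure below is a placeholder assumption, not a reported number.

```python
# Rough back-of-the-envelope math on COPPA exposure. The statutory maximum is
# the figure cited above; the audience size is a placeholder assumption, and
# how violations are actually counted is up to the FTC and the courts.
max_fine_per_violation = 50_120        # dollars, COPPA's current per-violation cap
assumed_affected_children = 1_000_000  # placeholder: if each affected child counted as one violation

potential_exposure = max_fine_per_violation * assumed_affected_children
print(f"${potential_exposure:,}")      # $50,120,000,000, i.e., tens of billions
```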
If you’re wondering why Google seems to have so much trouble complying with one of the only consumer online privacy laws this country has, a major factor is just how many kids use Google’s services, especially compared to the rest of Big Tech. You need a credit card to buy something from Apple TV or Amazon Prime (and Amazon has had its own kids’ privacy issues). Meta’s platforms aren’t supposed to be used by people under 13, except for Quest, which is now open to kids as young as 10, and Meta’s long-running “Instagram for Kids” plans are currently on hold. Kids do play Microsoft’s Xbox, whose COPPA violations recently cost the company $20 million, but most of Microsoft’s products are designed for and used by adults.
Google, on the other hand, has YouTube, which is free to watch. YouTube is hugely popular with kids, including the very youngest of them. Cocomelon, with 164 million subscribers and a core audience that isn’t old enough to go to school yet, is one of the most popular channels on YouTube, period. In a survey, about a third of US and UK kids said they want to be YouTubers when they grow up — more than any other profession.
And between its Chromebooks and apps, Google also has a big chunk of the education market, thanks to a concerted effort by the company to capture that segment. Google swears that it doesn’t track kids through its educational apps and tools, but it was sued by New Mexico for collecting student data in 2020. Google denied any wrongdoing, and the case was settled the next year with Google agreeing to pay $3.8 million toward a privacy and safety initiative. But even if Google doesn’t collect any data about students, the education apps, including versions of Gmail and Docs, still make kids users of Google’s services, and that makes them more likely to continue to use those products once they become old enough for Google to legally track and monetize them.
If the reports’ findings are accurate — and, again, Google says they aren’t — then either Google won’t comply with COPPA or it can’t. There is another possibility: that Google’s reports are telling advertisers it’s targeting ads that it actually isn’t. That would be another can of worms, but not a children’s privacy violation.
Google’s ad tracking tech might just be too big, opaque, and persistent to preserve kids’ privacy as COPPA dictates. That would be an especially bad look for the company right now, as Big Tech is being scrutinized more than ever and lawmakers at the federal and state levels are trying to regulate the internet that children see and use. In March, Utah passed two laws that require social media platforms to verify user age and place restrictions on what people under 18 can do and see. Arkansas has its own social media age verification law, California has the Age Appropriate Design Code, and several states now have age verification laws for porn sites.
In the federal government, President Biden has said in more than one State of the Union address that Congress has to address the harm that social media has done to children. The Kids Online Safety Act and COPPA 2.0, both of which are bipartisan, are working their way through Congress.
In order to comply with these laws, websites and services have to know which users they apply to. That means things like age verification, which privacy advocates really don’t like and most internet users probably won’t be thrilled about, either. A privacy law that protects children but applies to everyone would avoid this entirely. After all, there’s no need to verify ages if age isn’t a factor.
But we haven’t had much luck with online privacy laws that apply to adults because Republicans and Democrats can’t agree on what those laws should look like. Protecting children, on the other hand, is a winning cause that both parties are happy to get behind — even if it means the rest of us are left behind, and kids are stuck with a law that even the biggest companies in the world don’t seem to be able to follow.