A coalition of children's advocacy groups is calling on the Federal Trade Commission to investigate Google's use of personalized ads on YouTube, saying the ads are being illegally served to children.
The Children’s Online Privacy Protection Act requires sites appealing to children to gain consent from parents before collecting the personal information of under-13s. But in a Request for Investigation, Fairplay and the Center for Digital Democracy suggest that, despite this, Google is serving personalized ads on "made for kids" YouTube videos and is tracking viewers of these videos.
The accusation follows initial research from Adalytics, which appears to indicate that Google is surreptitiously using cookies and identifiers on these videos for tracking purposes, serving personalized ads, and transmitting data about viewers to data brokers and ad tech companies.
And separate research by both Fairplay and ad buyers appears to back this up. The team ran test ad campaigns on YouTube in which they targeted users on the basis of various attributes and instructed Google to run the ads only on "made for kids" channels, a request that should, in theory, have resulted in zero placements.
Instead, the result was thousands of ad impressions. And, they say, the reporting Google provided to Fairplay and the ad buyers to demonstrate the efficacy of the ad buys would not have been possible if the ads were, as Google claims, contextual.
"If Google’s representations to its advertisers are accurate, it is violating COPPA," says Josh Golin, executive director of Fairplay.
"The FTC must launch an immediate and comprehensive investigation and use its subpoena authority to better understand Google’s black box child-directed ad targeting."
It's not the first time the company has come under fire over similar issues, with parent company Alphabet forking out $170 million in 2019 to settle allegations that it had illegally collected personal data on children without their parents’ consent. Last week, the company released a statement on the Adalytics report, describing it as "deeply flawed and uninformed."
"Since January 2020, YouTube has treated personal information from anyone watching ‘made for kids’ content on the platform as coming from a child, regardless of the age of the viewer. This means we prohibit ads personalization," says Dan Taylor, vice president for global ads.
"Additionally, we do not allow the use of third-party trackers in advertisements served on made for kids content on YouTube."
The new call follows a similar letter last week from senators Edward J. Markey and Marsha Blackburn, suggesting that "potentially millions" of children could have been impacted.
It's worth noting that the advocacy groups do not directly accuse Google of breaking the terms of COPPA, instead describing the situation as "unclear" and asking the FTC to investigate.
But, says Golin, "If Google and YouTube are violating COPPA and flouting their settlement agreement with the Commission, the FTC should seek the maximum fine for every single violation of COPPA and injunctive relief befitting a repeat offender."
Google has been approached for comment on the advocacy groups' letter.
A new report accuses Google of serving targeted ads to children and harvesting their data, potentially violating federal privacy laws.
The allegations cast doubt on Google’s previous promises to better protect children online.
The report, published by advertising analytics firm Adalytics, claims that YouTube continues to serve personalized ads and employ ad trackers on videos and channels labeled “made for kids.”
This would break Google’s 2019 settlement with the Federal Trade Commission over alleged Children’s Online Privacy Protection Act (COPPA) violations.
COPPA, enacted in 1998, restricts websites and apps from collecting personal data on users under 13 without explicit parental consent.
In its 2019 settlement, Google paid $170 million and pledged to stop collecting data on children’s content and turn off personalized ads.
According to Adalytics, Google enables advertisers to target children based on their interests and demographics.
The report includes purported examples of major brands like Disney, Verizon, and Hyundai running behavioral ads on nursery rhyme channels and cartoons.
When clicked, these ads allegedly lead to advertiser websites that immediately set tracking cookies and share children’s data with third parties like Facebook and TikTok without obtaining the required consent.
The report states:
“Many advertisers were observed setting ad targeting cookies, persistent identifiers, and engaging in meta-data sharing with data brokers.”
Some ads were even for Google’s products, including YouTube TV and Chrome, which don’t allow users under 18.
The report claims if a toddler clicked these ads, Google would install cookies for ad personalization.
In response, Google denied flouting any regulations, saying it prohibits personalized ads and third-party trackers on children’s videos.
In a blog post, Google says:
“We do not link cookies to the viewing of made for kids content for advertising purposes, and a viewer’s activity on made for kids content can’t be used for ads personalization.”
Google adds:
“The cookies identified in this report are encrypted and not usable by another tech company, advertiser, publisher or a data broker.”
However, the Adalytics report argues that Google makes it overly difficult for advertisers to opt out of having their ads run on kids’ videos.
“No – it is NOT easy to avoid ‘made for kids’ channels,” media buyers surveyed in the report responded.
The report’s authors claim that tools like Google’s Performance Max algorithm may optimize ad delivery for kids’ clicks.
While Google denied any wrongdoing, the report calls for greater transparency and accountability around ads on children’s content. The findings merit closer regulatory scrutiny.
“We welcome responsible research around our products,” said Google’s statement responding to the report.
Privacy advocates argue Google’s model facilitates hidden data collection from children. While Google insists its tools prevent this, critics say it needs to do more to close loopholes.
One advertising executive interviewed in the report states:
“Google has failed advertisers, again. There is no reasonable excuse for ads running on content intended primarily for kids.”
Whether the report proves that YouTube breached its FTC agreement remains to be seen.
Further independent audits of YouTube’s algorithms and ad systems may be needed to ensure proper safeguards for children are in place.
If the allegations are confirmed, harvesting children’s data for profit, years after Google paid $170 million for similar violations, would likely provoke stern action.
Google maintains that “protecting kids and teens is a top priority” across its platforms.
Google Ads will offer enhanced customer service for small businesses as part of a new paid pilot.
A select number of agencies and advertisers have been invited to test the new program, which offers one-on-one support tailored to specific customer needs.
Why we care. Small business owners in the program can get specialized advice faster through this premium service, which has historically been accessible only to Google Ads’ biggest clients.
What’s next? Google will collect feedback from participants. In time, Google hopes to roll the service out to more small- and medium-sized businesses.
Why now? Google launched the pilot in response to a “common complaint” from small businesses that they want more specialized advice from the search engine’s experts, along with ideas for how they can improve their ad campaigns and optimize their budgets – a level of service not provided by Google’s automated self-service system.
Google, the parent company of YouTube, responded to a report that suggested YouTube advertisers are sourcing data from children viewing videos on the platform.
On Aug. 18, a day after the report surfaced, Google posted a blog reiterating its “strict privacy standards around made for kids content,” meaning content on YouTube marked as created for children.
The Big Tech giant said it has focused on creating kid-specific products like YouTube Kids and supervised accounts.
“We’ve invested a great deal of time and resources to protect kids on our platforms, especially when it comes to the ads they see…”
It said it launched a worldwide restriction on personalized ads and age-sensitive ad categories for users under 18. Additionally, the post clarified that it does not allow third-party trackers on ads that appear on kids’ content.
Nonetheless, on Aug. 17, data analysis and transparency platform Adalytics published a 206-page report alleging that advertisers on YouTube could be “inadvertently harvesting data from millions of children.”
Claims made in the report include cookies that indicate a “breakdown” of privacy and an “undisclosed persistent, immutable unique identifier” created by YouTube that is transmitted to servers even on made-for-kids videos, with no clarity on why it is collected.
An article from The New York Times also reported on the research from Adalytics, specifically highlighting an instance where an adult-targeted ad from a Canadian bank was shown to a viewer on a video labeled for kids.
Adalytics reported that after that viewer clicked on the ad, tracking software from Google, Meta, Microsoft and other companies was tagged on the user’s browser.
Concerns around Google’s privacy and data collection standards have been raised in recent months, as the company has been releasing more products with artificial intelligence (AI) incorporated.
On July 11, Google was hit with a lawsuit over its new AI data-scraping privacy policy updates, with the plaintiffs’ attorneys saying they represent millions of users whose privacy and property rights were violated by the changes.
Less than a month later, a report analyzing AI-powered extensions for Google’s Chrome browser was published, finding that two-thirds of them could endanger user security.
Most recently, on Aug. 15, Google introduced a series of enhancements for its search engine incorporating advanced generative AI features.
Google will stop collecting anonymous reviews on August 25, 2023. Google posted this notice in a help document, which reads, "Important: We've stopped collecting anonymous reviews and will stop showing anonymous reviews on August 25, 2023." Also, reviews are no longer required for Local Services Ads (LSAs) to show; they are optional.
Anthony Higman spotted this and posted about this on X/Twitter saying, "Getting reviews in the LSA platform is now optional. It was a really good idea, but I think they couldn't get anyone to actually do it in the LSA platform." "Also anonymous reviews are done across the board on August 25th," he added.
As an FYI, Google Maps in general had anonymous reviews for over a decade.
Forum discussion at X/Twitter.
Google spends so much time in the crosshairs, it basically lives there by this point.
On Wednesday – less than two months after going to town on the Google Video Partners program – Adalytics published what it says is direct evidence of personalized ads being served against kids’ content on YouTube in a follow-up to its bombshell report last week alleging the same.
Also on Wednesday, a coalition of nonprofit organizations sent a strongly worded letter to the Federal Trade Commission in response to these findings, calling for an investigation into whether YouTube and Google are, yet again, in (alleged) violation of the Children’s Online Privacy Protection Act (COPPA).
The coalition is being led by the Center for Digital Democracy and Fairplay (previously known as the Campaign for a Commercial-Free Childhood), both of which were behind a previous successful complaint against YouTube over alleged COPPA violations that led to a $170 million settlement between Google, the FTC and New York state in September 2019.
But this group is not the first to contact the FTC about this issue. Sens. Ed Markey (D-Mass.) and Marsha Blackburn (R-Tenn.) sent a joint letter last week after the first report dropped demanding an FTC investigation into YouTube and Google for suspected violations of children’s privacy.
Obligations
But to rewind, what’s in that 2019 settlement?
August is beach season, and if you need some beach reading, you can access the full 2019 consent order here.
The TL;DR is that Google and YouTube are required to get verifiable parental consent before collecting personal information from children.
On the same day the settlement was filed, YouTube’s then CEO, Susan Wojcicki, published a blog post to say that, going forward, YouTube will assume that anyone watching children’s content on the platform is a child and treat them accordingly – regardless of their age – and that YouTube will also “stop serving personalized ads on this content entirely.”
The Adalytics report alleges that Google does neither effectively. And now it says it’s got the receipts to prove it.
The experiment
In Google’s vehement rebuttal of the original Adalytics report last week, it states that YouTube has prohibited ads personalization on “made for kids” content since January 2020.
Yet, according to Adalytics, it’s not only possible but relatively easy to set up YouTube campaigns in Google Ads that allow ads for adult-focused products like insurance, cars and SaaS technology to run against content designed for children.
Adalytics ran an experiment with multiple media buyers demonstrating that, with a few clicks, anyone with a credit card and a self-serve Google Ads account can configure campaigns so that 100% of the impressions are targeted at an adult segment, such as business travelers or avid investors.
The experiment involved setting parental status to “unknown” or “not a parent,” excluding placements on the Google Video Partner network and uploading an inclusion list of “made for kids” YouTube channels to use as the campaign target.
In theory, not a single one of these personalized ads should have shown up on a “made for kids” video.
But the reporting pulled from the Google Ads dashboard and shared with Adalytics appears to show otherwise.
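To make that verification step concrete, here is a minimal sketch, in Python, of cross-referencing a placement report exported from the Google Ads dashboard against an inclusion list of “made for kids” channels. The file paths and column names ("channel_id", "impressions") are assumptions for illustration; the actual export schema may differ.

```python
# A minimal sketch of the verification step described above: checking whether a
# campaign aimed at an adult audience segment nevertheless recorded impressions
# on "made for kids" channels. File names and column names are assumptions.
import csv

PLACEMENT_REPORT = "placement_report.csv"          # assumed CSV export from the dashboard
KIDS_CHANNEL_LIST = "made_for_kids_channels.txt"   # assumed list, one channel ID per line


def impressions_on_kids_channels(report_path: str, channels_path: str) -> int:
    with open(channels_path) as f:
        kids_channels = {line.strip() for line in f if line.strip()}
    total = 0
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["channel_id"] in kids_channels:
                total += int(row["impressions"])
    return total


if __name__ == "__main__":
    total = impressions_on_kids_channels(PLACEMENT_REPORT, KIDS_CHANNEL_LIST)
    # If ads personalization were truly disabled on "made for kids" content, a
    # campaign targeted at an adult audience segment should report zero here.
    print(f"Impressions attributed to 'made for kids' channels: {total}")
```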
Toddlers don’t have credit cards
There’s an even easier experiment to run, however.
All one has to do in order to find adult-focused ads running against kid-focused content is to click on a kid-focused YouTube video.
To be fair, there are non-personalized ads that run on kid channels. I was served an ad for Barbie (the doll, not the movie) on Vlad & Niki and an ad for a kids’ math game on Sheriff Labrador – and that’s completely fine. Contextual advertising is fair game, even according to the FTC.
But over the course of roughly 10 minutes on Tuesday evening, I was also served an ad for Optimum on the Ryan’s World channel, for Shopify on JJ’s Animal Adventure (made by the same folks as Cocomelon) and another ad for Optimum on Kids Diana Show.
This would appear to be a violation of Google’s own stated policies for what type of ads can run against “made for kids” content on YouTube.
And AdExchanger spoke with four media buyers who have also personally observed behaviorally targeted ads served on "made for kids" channels.
In a statement, a Google spokesperson said that the conclusions in the Adalytics report “point to a fundamental misunderstanding of how advertising works on made for kids content.”
“We do not allow ads personalization on made for kids content and we do not allow advertisers to target children with ads across any of our products,” the spokesperson said. “We also do not offer advertisers the option to directly target made for kids content as a whole.”
The findings were not shared with Google in advance, and the spokesperson said that Google reached out to Adalytics directly to “clarify what they saw and share how our protections work.”
Google also noted that advertisers have the ability to fully opt out of their ads running against “made for kids” content on YouTube.
Harmful, wasteful or both?
But putting aside concerns related to collecting data from children under 13 or the moral question of whether children should be shielded from certain commercial messages, it arguably doesn’t make much sense from a marketing perspective to target ads for high-speed fiber internet or ecommerce platform technology at kids that are probably in the first grade or still in diapers.
“I feel for advertisers here, because this isn’t just a legal risk – it’s a waste of advertising dollars,” said Laura Edelson, an incoming assistant professor of computer science at Northeastern University and former CTO for the antitrust division of the Department of Justice.
One possible explanation for why adult-targeted ads continue to run against “made for kids” YouTube content is because parents are watching with their kids.
But as Adalytics points out in its follow-up report, co-viewing doesn’t negate the fact that Google has said it treats data from anyone watching children’s content on YouTube as coming from a child, regardless of how old the viewer is.
A more cynical, and perhaps logical, explanation is that kid-focused content generates a lot of impressions (as anyone who’s been subjected to Baby Shark is well aware), and YouTube needs to fulfill demand. (P.S. An ad for Verizon ran before the Baby Shark video when I clicked on it.)
“Human attention is finite, right? And there are only so many people out there,” said a senior media executive, who asked to remain anonymous so as to speak freely. “You’ve gotta get those ad slots filled somehow.”
So, what’s an advertiser like Verizon, Optimum or Shopify to think of all this?
Either brands can’t trust the efficacy of the targeting settings on Google’s platforms, or Google is filling behavioral segments with random inventory – including kids’ content – to meet demand, or Google is violating its COPPA commitments, said Ruben Schreurs, chief strategy officer at Ebiquity.
None of the options are awesome.
“And in the first case, it doesn’t just lead to wasted ad investments, but also inadvertent exposure to serious compliance and brand reputation risks,” Schreurs said. “Google will have to do better than trying to disregard the Adalytics allegations without proper counterevidence.”
Story updated at 9:22 a.m. to include comments from Google.
This year, BMO, a Canadian bank, was looking for Canadian adults to apply for a credit card. So the bank’s advertising agency ran a YouTube campaign using an ad-targeting system from Google that employs artificial intelligence to pinpoint ideal customers.
But Google, which owns YouTube, also showed the ad to a viewer in the United States on a Barbie-themed children’s video on the “Kids Diana Show,” a YouTube channel for preschoolers whose videos have been watched more than 94 billion times.
When that viewer clicked on the ad, it led to BMO’s website, which tagged the user’s browser with tracking software from Google, Meta, Microsoft and other companies, according to new research from Adalytics, which analyzes ad campaigns for brands.
As a result, leading tech companies could have tracked children across the internet, raising concerns about whether they were undercutting a federal privacy law, the report said. The Children’s Online Privacy Protection Act, or COPPA, requires children’s online services to obtain parental consent before collecting personal data from users under age 13 for purposes like ad targeting.
The report’s findings raise new concerns about YouTube’s advertising on children’s content. In 2019, YouTube and Google agreed to pay a record $170 million fine to settle accusations from the Federal Trade Commission and the State of New York that the company had illegally collected personal information from children watching kids’ channels. Regulators said the company had profited from using children’s data to target them with ads.
YouTube then said it would limit the collection of viewers’ data and stop serving personalized ads on children’s videos.
On Thursday, two United States senators sent a letter to the F.T.C., urging it to investigate whether Google and YouTube had violated COPPA, citing Adalytics and reporting by The New York Times. Senator Edward J. Markey, Democrat of Massachusetts, and Senator Marsha Blackburn, Republican of Tennessee, said they were concerned that the company may have tracked children and served them targeted ads without parental consent, facilitating “the vast collection and distribution” of children’s data.
“This behavior by YouTube and Google is estimated to have impacted hundreds of thousands, to potentially millions, of children across the United States,” the senators wrote.
Adalytics identified more than 300 brands’ ads for adult products, like cars, on nearly 100 YouTube videos designated as “made for kids” that were shown to a user who was not signed in, and that linked to advertisers’ websites. It also found several YouTube ads with violent content, including explosions, sniper rifles and car accidents, on children’s channels.
An analysis by The Times this month found that when a viewer who was not signed into YouTube clicked the ads on some of the children’s channels on the site, they were taken to brand websites that placed trackers (bits of code used for purposes like security, ad tracking or user profiling) from Amazon, Meta’s Facebook, Google, Microsoft and others on users’ browsers.
As with children’s television, it is legal, and commonplace, to run ads, including for adult consumer products like cars or credit cards, on children’s videos. There is no evidence that Google and YouTube violated their 2019 agreement with the F.T.C.
The Times shared some of Adalytics’ research with Google ahead of its publication. Michael Aciman, a Google spokesman, called the report’s findings “deeply flawed and misleading.” Google has also challenged a previous Adalytics report on the company’s ad practices, first reported on by The Wall Street Journal.
Google told The Times it was useful to run ads for adults on children’s videos because parents who were watching could become customers. It also noted that running violent ads on children’s videos violated company policy and that YouTube had “changed the classification” of the violent ads cited by Adalytics to prevent them from running on kids’ content “moving forward.”
Google said that it did not run personalized ads on children’s videos and that its ad practices fully complied with COPPA. When ads appear on children’s videos, the company said, they are based on webpage content, not targeted to user profiles. Google said that it did not notify advertisers or tracking services whether a viewer coming from YouTube had watched a children’s video — only that the user had watched YouTube and clicked on the ad.
The company added that it did not have the ability to control data collection on a brand’s website after a YouTube viewer clicked on an ad. Such data-gathering, Google said, could happen when clicking on an ad on any website.
Even so, ad industry veterans said they had found it difficult to prevent their clients’ YouTube ads from appearing on children’s videos, according to recent Times interviews with 10 senior employees at ad agencies and related companies. And they argued that YouTube’s ad placement had put prominent consumer brands at risk of compromising children’s privacy.
“I’m incredibly concerned about it,” said Arielle Garcia, the chief privacy officer of UM Worldwide, the ad agency that ran the BMO campaign.
Ms. Garcia said she was speaking generally and could not comment specifically on the BMO campaign. “It should not be this difficult to make sure that children’s data isn’t inappropriately collected and used,” she said.
Google said it gave brands a one-click option to exclude their ads from appearing on YouTube videos made for children.
The BMO campaign had targeted the ads using Performance Max, a specialized Google A.I. tool that does not tell companies the specific videos on which their ads ran. Google said that the ads had not initially excluded children’s videos, and that the company recently helped the campaign update its settings.
In August, an ad for a different BMO credit card popped up on a video on the Moolt Kids Toons Happy Bear channel, which has more than 600 million views on its cartoon videos. Google said the second ad campaign did not appear to have excluded children’s videos.
Jeff Roman, a spokesman for BMO, said, “BMO does not seek to nor does it knowingly target minors with its online advertising and takes steps to prevent its ads from being served to minors.”
Several industry veterans reported problems with more conventional Google ad services. They described how they had received reports of their ads running on children’s videos, made long lists to exclude those videos, only to later see their ads run on other kids’ videos.
“It’s a constant game of Whac-a-Mole,” said Lou Paskalis, the former head of global media for Bank of America, who now runs a marketing consulting firm.
Adalytics also said that Google had set persistent cookies — the types of files that could track the ads a user clicks on and the websites they visit — on YouTube children’s videos.
The Times observed persistent Google cookies on children’s videos, including an advertising cookie called IDE. When a viewer clicked on an ad, the same cookie also appeared on the ad page they landed on.
Google said it used such cookies on children’s videos only for business purposes permitted under COPPA, such as fraud detection or measuring how many times a viewer sees an ad. Google said the cookie contents “were encrypted and not readable by third parties.”
“Under COPPA, the presence of cookies is permissible for internal operations including fraud detection,” said Paul Lekas, head of global public policy at the SIIA, a software industry group whose members include Google and BMO, “so long as cookies and other persistent identifiers are not used to contact an individual, amass a profile or engage in behavioral advertising.”
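As a rough illustration of how an outside observer could check for such cookies, the sketch below, assuming Playwright is installed, loads a video page and prints any cookies set on YouTube or DoubleClick domains along with their expiry. The URL is a placeholder and the domain filter is a simplification; this shows the general inspection technique, not the report’s actual methodology.

```python
# A minimal sketch, assuming Playwright is installed (`pip install playwright`,
# then `playwright install chromium`). It loads a video page and lists cookies
# set on YouTube or DoubleClick domains during the load. VIDEO_URL is a
# placeholder, not a specific video from the report.
from playwright.sync_api import sync_playwright

VIDEO_URL = "https://www.youtube.com/watch?v=PLACEHOLDER"  # placeholder URL

with sync_playwright() as p:
    browser = p.chromium.launch()
    context = browser.new_context()
    page = context.new_page()
    page.goto(VIDEO_URL, wait_until="load")
    page.wait_for_timeout(5000)  # allow ad and measurement calls to complete
    for cookie in context.cookies():
        # Print the cookie's domain, name and expiry to see whether it
        # persists beyond the session (i.e., acts as a persistent identifier).
        if "youtube" in cookie["domain"] or "doubleclick" in cookie["domain"]:
            print(cookie["domain"], cookie["name"], "expires:", cookie.get("expires"))
    browser.close()
```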
The Times found an ad for Kohl’s clothing that ran on “Wheels on the Bus,” a nursery rhyme video that has been viewed 2.4 billion times. A viewer who clicked on the ad was taken to a Kohl’s web page containing more than 300 tracking requests from about 80 third-party services. These included a cross-site tracking code from Meta that could enable it to follow viewers of children’s videos across the web.
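For context on how counts like these are produced, here is a minimal sketch, again assuming Playwright is available, that loads a landing-page URL and tallies the third-party hosts contacted while the page loads. The URL is a placeholder rather than the actual Kohl’s page, and a real audit would also classify each host (analytics, ad tech, social) rather than simply counting requests.

```python
# A minimal sketch that tallies third-party network requests triggered by a
# landing page, roughly how a figure like "300 tracking requests from about 80
# third-party services" would be reproduced. LANDING_PAGE is a placeholder.
from collections import Counter
from urllib.parse import urlparse

from playwright.sync_api import sync_playwright

LANDING_PAGE = "https://example.com/ad-landing-page"  # placeholder URL


def third_party_requests(url: str) -> Counter:
    first_party = urlparse(url).hostname or ""
    hosts = Counter()
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        # Record the host of every network request the page triggers.
        page.on("request", lambda req: hosts.update([urlparse(req.url).hostname]))
        page.goto(url, wait_until="load")
        page.wait_for_timeout(5000)  # give trackers a few seconds to fire
        browser.close()
    # Crude first-party filter: drop requests to the page's own domain.
    return Counter({h: n for h, n in hosts.items() if h and not h.endswith(first_party)})


if __name__ == "__main__":
    counts = third_party_requests(LANDING_PAGE)
    print(f"{sum(counts.values())} requests to {len(counts)} third-party hosts")
    for host, n in counts.most_common(10):
        print(f"  {host}: {n}")
```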
Kohl’s did not respond to several requests for comment.
A Microsoft spokesman said: “Our commitment to privacy shapes the way we build all our products and services. We are getting more information so that we can conduct any further investigation needed.” Amazon said it prohibited advertisers from collecting children’s data with its tools. Meta declined to comment.
Children’s privacy experts said they were concerned that the setup of Google’s interlocking ecosystem — including the most popular internet browser, video platform and largest digital ad business — had facilitated the online tracking of children by tech giants, advertisers and data brokers.
“They have created a conveyor belt that is scooping up the data of children,” said Jeff Chester, the executive director of the Center for Digital Democracy, a nonprofit focused on digital privacy.