This year, BMO, a Canadian bank, was looking for Canadian adults to apply for a credit card. So the bank’s advertising agency ran a YouTube campaign using an ad-targeting system from Google that uses artificial intelligence to pinpoint ideal customers.
But Google, which owns YouTube, also showed the ad to a viewer in the United States who was watching a Barbie-themed baby video on "Kids Diana Show," a YouTube channel for preschoolers whose videos have been viewed more than 94 billion times.
When that viewer clicked on the ad, it led to BMO's website, which tagged the user's browser with tracking software from Google, Meta, Microsoft and other companies, according to new research from Adalytics, a firm that analyzes advertising campaigns for brands.
The report said that, as a result, major tech companies could track children across the internet, raising questions about whether they were violating the Children's Online Privacy Protection Act, or COPPA. Under that federal law, children's online services must obtain parental consent before collecting personal data from users under 13 for purposes such as ad targeting.
The report's findings raise new concerns about advertising on YouTube's children's content. In 2019, YouTube and Google agreed to pay a record $170 million fine to settle allegations by the Federal Trade Commission and the State of New York that the company had illegally collected personal information from children watching children's channels. Regulators said the company had profited from using children's data to target them with ads.
YouTube then said it would limit its collection of audience data and stop serving personalized ads on children's videos.
Adalytics identified ads from more than 300 brands for adult products, such as cars, on nearly 100 YouTube videos designated as "made for kids." The ads were shown to users who were not signed in and linked to the advertisers' websites. The firm also found YouTube ads on children's channels with violent content, including explosions, sniper rifles and car crashes.
An analysis by The New York Times this month found that when a viewer who was not signed in to YouTube clicked on ads on some children's channels on the site, they were taken to brand websites that placed trackers (bits of code used for purposes such as security, ad tracking or user profiling) from Amazon, Meta's Facebook, Google, Microsoft and others on users' browsers.
As with children's television, it is legal and common to run ads on children's videos, including ads for adult consumer products such as cars or credit cards. There is no evidence that Google and YouTube violated their 2019 settlement with the FTC.
The Times shared some of Adalytics' research with Google before its publication. Michael Aciman, a Google spokesman, called the report's findings "grossly flawed and misleading." Google also disputed a previous Adalytics report on the company's advertising practices, which was first reported by The Wall Street Journal.
Google told The Times that running ads for adult products on children's videos was useful because parents watching could become customers. It also said that violent ads on children's videos violate company policy, and that YouTube had changed the classification of the violent ads cited by Adalytics to prevent them from running on children's content.
Google said it does not run personalized ads on children's videos and that its advertising practices fully comply with COPPA. When ads appear on children's videos, the company said, they are based on the webpage's content, not on user profiles. Google said it did not tell advertisers or tracking services that a viewer had watched a children's video, only that a user had viewed YouTube and clicked on an ad.
The company said it had no ability to control data collection on a brand's website after a YouTube viewer clicked on an ad. Google said this type of data gathering can happen when someone clicks on an ad on any website.
Still, ad industry veterans said they found it difficult to prevent their clients' YouTube ads from appearing on children's videos, according to recent interviews by The Times with 10 senior employees of advertising agencies and related companies. They argued that YouTube's ad placements put major consumer brands at risk of compromising children's privacy.
“I’m incredibly concerned about this,” said Ariel Garcia, chief privacy officer for UM Worldwide, the advertising agency that ran the BMO campaign.
Ms. Garcia said she was speaking generally and could not comment specifically on the BMO campaign. "It shouldn't be that difficult to ensure that children's data is not collected and used inappropriately," she added.
Google said it has given brands a one-click option to prevent their ads from appearing on YouTube videos made for children.
The BMO campaign targeted ads using Performance Max, a specialized Google A.I. tool that does not reveal to companies the specific videos their ads run on. Google said the campaign had not initially excluded children's videos and that the company recently helped it update its settings.
In August, an ad for a different BMO credit card appeared on the Mult Kids Toons Happy Bear channel, which has more than 600 million views on its cartoon videos. Google said children's videos had not been excluded from that second ad campaign.
“BMO does not seek and does not knowingly target minors with its online advertising and takes steps to prevent its ads from being shown to minors,” said Jeff Roman, a spokesman for BMO.
Several industry veterans reported problems with Google's more traditional advertising services as well. They described receiving reports that their ads had run on children's videos, compiling long lists to exclude those videos, and later finding their ads running on other children's videos.
“It’s a constant game of whack-a-mole,” said Lou Pascalis, former head of global media for Bank of America who now runs a marketing consulting firm.
Adalytics also said that Google had set persistent cookies, files that can track the ads users click on and the websites they visit, on children's videos on YouTube.
The Times found persistent Google cookies on children's videos, including an advertising cookie called IDE. When a viewer clicked on an ad, the same cookie also appeared on the page the ad led to.
Google said it used such cookies on children’s videos only for commercial purposes permitted under COPPA, such as fraud detection or measuring how often a viewer sees an ad. Google stated that the cookie content was “encrypted and not readable by third parties.”
"Under COPPA, the presence of cookies is permitted for internal operations, including fraud detection," said Paul Lekas, head of global public policy at SIIA, a software industry group whose members include Google and BMO, "unless cookies and other persistent identifiers are used to contact an individual, build a profile or engage in behavioral advertising."
The Times found a Kohl's clothing ad running on "Wheels on the Bus," a nursery rhyme video that has been viewed 2.4 billion times. A viewer who clicked on the ad was taken to a Kohl's webpage that made over 300 tracking requests to roughly 80 third-party services. Those included a cross-site tracker from Meta that could enable it to follow viewers of children's videos across the web.
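A tally like that can be reproduced by anyone from a browser's network capture (a HAR export), by counting request hostnames that fall outside the first-party domain. A minimal sketch follows; the sample entries are hypothetical, and this is not Adalytics' actual methodology.

```python
from urllib.parse import urlparse

def third_party_hosts(har: dict, first_party: str) -> set[str]:
    """Collect hostnames of requests that leave the first-party domain."""
    hosts = set()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        # Keep anything that is neither the bare domain nor a subdomain of it.
        if host != first_party and not host.endswith("." + first_party):
            hosts.add(host)
    return hosts

# Hypothetical, trimmed-down capture of a single page load.
har = {"log": {"entries": [
    {"request": {"url": "https://www.kohls.com/"}},
    {"request": {"url": "https://connect.facebook.net/en_US/fbevents.js"}},
    {"request": {"url": "https://www.google-analytics.com/collect?v=1"}},
    {"request": {"url": "https://www.kohls.com/media/logo.png"}},
]}}

print(sorted(third_party_hosts(har, "kohls.com")))
# ['connect.facebook.net', 'www.google-analytics.com']
```

Each distinct hostname returned corresponds to one outside service contacted during the page load, which is the unit the report counts.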
Kohl’s did not respond to multiple requests for comment.
A Microsoft spokesperson said: “Our commitment to privacy shapes the way we build all of our products and services. We are obtaining more information so that we can conduct any further necessary investigations.” Amazon said it has blocked advertisers from collecting children’s data with its tools. Meta declined to comment.
Children's privacy experts said they were concerned that Google had built an interlocking ecosystem, including the most popular internet browser, the biggest video platform and the largest digital advertising business, that facilitated the online tracking of children by tech giants, advertisers and data brokers.
“They’ve created a conveyor belt that’s collecting children’s data,” said Jeff Chester, executive director of the Center for Digital Democracy, a nonprofit focused on digital privacy.