How Do We Stop Social Media Ads
That Target Teens and Minors?
Imagine you have a 15-year-old son or daughter who uses Facebook daily. And let's say that your child, like many children in this age group, has a natural curiosity about alcohol. With its sophisticated algorithms, Facebook can discern that curiosity. Meanwhile, an adult beverage company wants to buy social media ads targeting your child, and Facebook allows it. Would you be fine with this? Probably not, yet that is exactly what happened recently when two watchdog groups, the Tech Transparency Project (TTP) and Reset Australia, separately created paid social media ads for products that are illegal for minors, such as alcohol and tobacco, targeting teenage Facebook users aged 13 to 17. According to both organizations, Facebook approved all the ads.
Most people, especially parents, probably don't need to be convinced that alcohol, tobacco, and gambling ads that target teens and minors are problematic. One could argue this kind of targeted advertising isn't so different from, say, alcohol ads in magazines and on TV that minors can easily see. But social media ads are powered by sophisticated, data-driven algorithms that enable them to "hack" our psychology, exploiting the natural desire to be popular and liked that is so pronounced among adolescents and that motivates their use of social media in the first place. The average TV ad cannot directly address adolescent viewers and speak to their deepest longings and desires. Targeted social media ads can. It is finely tuned "narrowcasting" compared to the generalized broadcasting older generations have traditionally known.
This inevitably raises the question of what should be done. Should social media companies be regulated? Letting them regulate themselves may once have been a defensible answer, but many now believe that approach has failed. Technically, most major social media platforms do have internal policies, but incidents such as the recent one involving Facebook, or New Mexico's lawsuit against Google last year for allegedly mining the data of schoolchildren, support the counterargument that tech companies cannot be left to regulate themselves.
The next question, and a more difficult one, is how to regulate. Part of the answer depends on which controversial aspect of social media we are talking about: targeted advertising to minors is clearly a different problem from politically motivated misinformation, for example, and will require different solutions. Currently, the only federal regulation that protects minors on social media is the Children's Online Privacy Protection Act (COPPA), but it covers only children under the age of 13. It does not protect minors aged 13 to 17, the very group targeted by the test ads for alcohol and tobacco that TTP and Reset Australia created and Facebook approved. COPPA also leaves advertisers and social media companies too many loopholes, a weakness that becomes especially apparent when individual states introduce stronger legislation such as the California Consumer Privacy Act (CCPA). Given that purchasing alcohol is illegal for anyone under 21, and that many states have raised the minimum age for buying tobacco products as well, it seems only fitting that rules on targeted advertising of products illegal for minors should be at least as restrictive. An update to COPPA may be overdue.
Finally, this raises the question of enforcement. The FTC has in fact enforced COPPA, as when it fined Google $170 million for violating the law, but critics such as Senator Edward Markey, the original author of the COPPA bill, argue that such fines amount to little more than slaps on the wrist for powerful tech giants. Under current law, however, for the FTC to have imposed a stricter penalty on Google, it would have needed to take the company to court, a process that would almost surely have dragged on and in which Google's vast resources would have given it the advantage. What is needed, then, may be not just regulation that raises the protected age for minors but laws that specifically empower the FTC to penalize violations more effectively.
Very recently, Senators Markey and Bill Cassidy introduced a new bill, the Children and Teens' Online Privacy Protection Act, which would extend COPPA protections to minors aged 13 to 15 and hold companies to stricter standards. But even if the bill passes, will it be enough? Would companies, for example, simply shift their targeted ads to 16- and 17-year-olds, who are still legally minors? And would the bill effectively solve the matter of enforcement? Only time will tell. In the meantime, with the health and safety of our children at stake, we should keep asking the difficult questions and attempting to answer them.
Jui Ramaprasad is a Professor in the Decision, Operations and Information Technologies department at the University of Maryland’s Robert H. Smith School of Business.