Washington is pushing Silicon Valley on children’s privacy, and Silicon Valley is pushing back.
Apple, Facebook, Google, Microsoft and Twitter have all objected to portions of a federal effort to strengthen online privacy protections for children. In addition, media giants like Viacom and Disney, cable operators, marketing associations, technology groups and a trade group representing toy makers are arguing that the Federal Trade Commission’s proposed rule changes seem so onerous that, rather than enhance online protections for children, they threaten to deter companies from offering children’s websites and services altogether.
“If adopted, the effect of these new rules would be to slow the deployment of applications that provide tremendous benefits to children, and to slow the economic growth and job creation generated by the app economy,” Catherine A. Novelli, vice president of worldwide government affairs at Apple, wrote in comments to the agency.
But the underlying concern, for both the industry and regulators, is not so much about online products for children themselves. It is about the data collection and data mining mechanisms that facilitate digital marketing on apps and websites for children – and a debate over whether these practices could put children at greater risk.
In 1998, Congress passed the Children’s Online Privacy Protection Act in an effort to give parents control over the collection and dissemination of private information about their children online. The regulation, known as COPPA, requires website operators to obtain a parent’s consent before collecting personal details, like home addresses or email addresses, from children younger than 13.
Now, federal regulators are preparing to update that rule, arguing that it has not kept pace with advances like online behavioral advertising, a practice that uses data mining to tailor ads to people’s online behavior. The FTC wants to expand the types of data whose collection requires prior parental permission to include persistent ID systems, like unique device codes or customer code numbers stored in cookies, if those codes are used to track children online for advertising purposes.
The idea is to preclude companies from compiling dossiers on the online activities – and by extension the health, socioeconomic status, race or romantic concerns – of individual children across the Web over time.
“What children post online or search as part of their homework should not haunt them as they apply to colleges or for jobs,” Rep. Edward J. Markey, D-Mass., co-chairman of the Bipartisan Congressional Privacy Caucus, said in a recent phone interview. “YouTube should not be turned into YouTracked.”
The agency’s proposals have provoked an intense reaction from some major online operators, television networks, social networks, app platforms and advertising trade groups. Some argue that the FTC has overstepped its mandate in proposing to greatly expand the rule’s scope.
Others say that using ID systems like customer code numbers to track children “anonymously” online is benign – and that collecting information about children’s online activities is necessary to deliver the ads that finance free content and services for children.
“What is the harm we are trying to prevent here?” said Alan L. Friel, chairman of the media and technology practice at the law firm Edwards Wildman Palmer. “We risk losing a lot of the really good educational and entertaining content if we make things too difficult for people to operate the sites or generate revenue from the sites.”
Are adults next?
The economic stakes extend well beyond the relatively small children’s audience. If the FTC were to include customer code numbers among the information that requires a parent’s consent, industry analysts say, it might someday require companies to get similar consent for a practice that represents the backbone of digital marketing and advertising – using such code numbers to track the online activities of adults.
“Once you’ve said it’s personal information for children that requires consent, you’ve set the framework for a requirement of consent to be applied to another population,” Friel said. “If it is personal information for someone that’s 12, it doesn’t cease being personal information when they are 13.”

An agency proposal to hold sites and apps liable for the data collection practices of their third-party analytics or advertising partners has also met with fierce opposition.
The Association for Competitive Technology, a trade group representing more than 5,000 app developers, has estimated that complying with such requirements could cost educational app developers $250 million in legal fees. It could also dissuade some mom-and-pop developers, who rely on free third-party software for features like animation and social networking, from designing products for children, said Morgan Reed, the trade group’s executive director.
“Children under 13 aren’t enough of a market, aren’t worthwhile to spend the money on compliance and tolerate the risk of getting it wrong,” Reed said.
Tracking vs. safety
Facebook, which does not allow children who say they are younger than 13 to register as members, and Twitter, which says its service is not intended for those younger than 13, have criticized another FTC proposal: to hold third parties liable if they know or have reason to know that they are collecting personal data on children’s sites.
The social networks say they cannot keep track of the many sites that download their software plug-ins, and therefore cannot know whether they are inadvertently collecting data on children’s sites. Google and Apple made a similar argument, telling regulators that app platforms like Android and the iTunes store should not be held liable for the data collection practices of the children’s apps they sell.
Many children’s advocates, however, have urged the agency to impose proposed revisions in full, arguing that a stricter rule is needed to give parents greater control over the many entities that track or show ads to their children online.
“Until there are some rules, marketers will continue to use what they have to penetrate children’s lives,” said Kathryn Montgomery, a professor of communications studies at American University who helped lead the effort to get Congress to pass the original children’s online privacy law. “Without constraints, it could easily get out of hand.”