Kids love mobile apps. Kids also like to look at and click on pretty pictures, a.k.a. “advertising.” Kids, like many -- if not most -- of their parents, don’t read privacy disclaimers on apps or websites and, to be honest, would not understand them if they did. In short, kids need protection.
The need to protect the personal information of our youngest, most impressionable and most vulnerable individuals, our children, is not something that just popped up as a problem. In fact, going all the way back to 1998, the U.S. Congress acknowledged that the online world was going to present perils to young people and passed the “Children’s Online Privacy Protection Act of 1998,” known as COPPA for short.
For those not familiar, COPPA falls under the regulatory jurisdiction of the U.S. Federal Trade Commission (FTC). It requires that operators of websites or online services that are either directed to children under 13, or that have actual knowledge they are collecting personal information from children under 13, give notice to parents and obtain their verifiable consent before collecting, using, or disclosing such personal information, and that they keep the information they collect from children secure. COPPA also prohibits such entities from conditioning children’s participation in activities on the collection of more personal information than is reasonably necessary for them to participate. The COPPA Rule, which is how the FTC exercises its jurisdiction under the Act, contains a “safe harbor” provision that allows industry groups or others to seek FTC approval of self-regulatory guidelines.
So where’s the beef?
The FTC, as part of its industry oversight, periodically conducts studies on the effectiveness of COPPA and on whether the Rule needs to be updated. In the past few months, in case you are not aware, the FTC released a very disturbing study that looked specifically at the mobile apps business, and it has decided that there is definitely cause for both concern and action.
Let’s start with the report. Issued in December 2012 and entitled Mobile Apps for Kids: Disclosures Still Not Making the Grade, there is no way to sugar-coat its findings. It is a follow-up to a February 2012 FTC staff report, Mobile Apps for Kids: Current Privacy Disclosures are Disappointing, which found at that time that little or no information was available to parents about the privacy practices and interactive features of the mobile apps surveyed, which were obtained from the Apple App Store and Google Play. Let’s just say that, despite admonitions to the industry, the picture remains bleak.
Lowlights of the staff report include:
- Nearly 60 percent of the mobile apps the FTC reviewed from the Google Play and Apple App stores transmitted the device ID.
- They also often shared that ID with an advertising network, analytics company or another third party.
- Of the 235 mobile apps reviewed, 14 also transmitted the location of the device and the phone number, the FTC found.
- More than half of the apps also contained interactive features such as in-app purchases and advertising that were not disclosed to parents.
What the staff also found is that because so many of the apps send information to a small number of third parties, those third parties have the potential to develop detailed profiles of children based on tracking their behavior across the different apps.
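To make concrete what “transmitting the device ID” to a third party typically involves, here is a hypothetical sketch. The field names, the payload structure, and the use of a hashed identifier are illustrative assumptions, not details taken from the FTC report; the point is simply that a stable device identifier, sent by many apps to the same analytics or ad network, is the hook that makes cross-app profiling possible.

```python
import hashlib
import json

def build_analytics_payload(device_id, app_name, phone_number=None, location=None):
    """Assemble the kind of JSON payload an app might send to a
    third-party analytics or advertising service (hypothetical schema).
    Even a hashed device ID is a persistent identifier that lets the
    third party link one child's activity across different apps."""
    payload = {
        "app": app_name,
        # The same device yields the same value from every app that
        # reports to this network -- which is what enables profiling.
        "device_id_hash": hashlib.sha256(device_id.encode()).hexdigest(),
    }
    if phone_number is not None:   # the report found 14 of 235 apps sent this
        payload["phone"] = phone_number
    if location is not None:       # and the device's location
        payload["lat"], payload["lon"] = location
    return json.dumps(payload)

# Two different "apps" on the same device produce payloads whose
# device_id_hash values match, allowing a cross-app profile.
p1 = json.loads(build_analytics_payload("ABC-123", "puzzle_game"))
p2 = json.loads(build_analytics_payload("ABC-123", "coloring_book"))
```

Note that under the amended Rule, persistent identifiers like this one count as “personal information” when they can be used to recognize a user over time and across sites or services.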
Here is the sad part. Only 20 percent of the apps disclosed any information about their privacy practices. "Parents still are not given basic information about the privacy practices and interactive features of mobile apps aimed at kids," the study said. It also turned up the pressure on mobile app developers to give parents "easy to understand" choices about data collection and sharing on kids’ apps.
In a written statement at the time FTC Chairman Jon Leibowitz said:
"While we think most companies have the best intentions when it comes to protecting kids’ privacy, we haven’t seen any progress when it comes to making sure parents have the information they need to make informed choices about apps for their kids. In fact, our study shows that kids' apps siphon an alarming amount of information from mobile devices without disclosing this fact to parents…All of the companies in the mobile app space, especially the gatekeepers of the app stores, need to do a better job. We'll do another survey in the future and we will expect to see improvement."
The FTC in September proposed updating COPPA, part of its broader push to strengthen online privacy protections. Despite industry resistance, the FTC adopted final amendments to the COPPA Rule on what was a 3-1-1 vote. Commissioner J. Thomas Rosch abstained, and Commissioner Maureen Ohlhausen voted no and issued a dissenting statement saying she thought the new rules were a case of regulatory overreach and would stifle innovation. At the time, the FTC also said it would be launching investigations to determine whether certain mobile app developers have violated COPPA or engaged in unfair or deceptive trade practices.
Without going into detail, areas covered in the amendments, which take effect July 1, 2013, include:
- Modified definitions that better delineate who an operator is and what constitutes a website or online service (so that things like plug-ins are included), along with updates to what counts as personal information and what constitutes the collection of information.
- Revised parental notice provisions to help ensure that operators’ privacy policies, and the direct notices they must give parents before collecting children’s personal information, are concise and timely.
- Several new mechanisms that operators can use to obtain verifiable parental consent.
- A requirement that operators take reasonable steps to ensure that children’s personal information is released only to service providers and third parties that are capable of maintaining the confidentiality, security, and integrity of such information, and who assure that they will do so.
- A requirement that operators retain children’s personal information for only as long as is reasonably necessary, and protect against unauthorized access or use while the information is being disposed of.
- Strengthened oversight of the approved self-regulatory “safe harbor” programs, requiring them to audit their members and report the aggregated results of those audits annually to the Commission.
All of this sounded complicated and interesting. Needing guidance on the subject, I turned to one of the country’s leading attorneys covering such issues, who happens to live and work in the town next to TMC’s headquarters. Chris Librandi, at the Westport, Connecticut-based firm of Levett Rockwood, is a specialist on social media matters.
I was hoping for elucidation and some good news. What I got was a great but sobering education. Librandi walked me through the facts, pointing out that the COPPA Rule is actually quite clear. “Simply put, the law says that solicitations aimed at children 13 and under must disclose what is being collected, and parental consent must be obtained,” he stated. He continued, “What the FTC report shows is basically that there is blatant disregard by the industry. Self-regulation is not working.”
Librandi also pointed out an enforcement issue that is a real challenge. “Most app developers are small operations. This means that if they do have disclaimers, they typically are cut-and-paste jobs from some other site, and careful investigation has shown that most of the terms and conditions they have copied have no bearing on the actual use,” he explained. “Clearly the industry needs not just a high-level standard disclosure that the industry can use for self-regulation purposes, but also needs to find a way to better police the cut-and-paste group.”
I asked the obvious question of how much can be done legally. Librandi said that, as with the disclosure of adults’ personal information, this is a “consumer-driven problem.” Unfortunately or fortunately, people like to have free Internet services and apps that are inexpensive or free, which means those who run those services and create those apps need to find a way to “monetize” them. That is another way of saying they need to generate ad dollars, and the only way to do that is to give advertisers a means to do optimized lead generation. And consumers, for the most part, will either willingly or begrudgingly give their information up to Apple, Google, Twitter and the like.
The real issue here is that kids really are different, and controls are needed. Whether this means that social media companies should be turned into a type of online in loco parentis is an interesting question. Having said that, however, one would hope that given the excesses that have been revealed, Apple and Google would like to be perceived as being more family-friendly.
Librandi said that the law, particularly state law, is woefully behind in this area. He noted that, “California, for example, is the only state that has anything like COPPA.”
As I said, this was a sobering conversation. We concluded with the acknowledgement that regulation can only go so far, and that the onus is going to be on consumers to point out the most egregious abuses so the wrongdoers can be pursued -- as happened in a recent case where the FTC forced the removal of a SpongeBob app that was collecting information by going after Nickelodeon, the TV network that broadcasts SpongeBob, as well as its corporate parent Viacom.
The bottom line is that the FTC has limited resources and this really is an issue for parents pressuring companies. It is also an issue of “if you see something, say something.” It will be interesting to see if behavior changes once the final amendments become the law. It will also be interesting to see if the industry can really come up with some best practices in the face of what the FTC is promising will be an aggressive public awareness campaign.
Edited by Rich Steeves