Why would Facebook want to enroll children in a service built with little regard for adult safety/privacy/security? For the money.

June 7, 2012

A floundering Facebook is under increased pressure to shore up its revenue and its flat per-user minutes. So it was no surprise to see the Wall Street Journal report that Facebook is developing technology to allow children younger than 13 to use the social-networking service under parental supervision, in spite of the company’s abysmal track record in protecting the older consumers it already serves.

Yet Facebook is already in deep trouble: the company consistently encroaches on consumer privacy, the service is a hotbed for malware and scams, and its advertising is not suitable for younger users.

The issues around Facebook’s interest in onboarding children under 13 fall into three categories:

  • Facebook’s predatory privacy practices
  • Facebook’s financial woes
  • There is a need for a responsible social network where children, adults and commercial content can mix, but Facebook isn’t it

Facebook’s Predatory Privacy Practices

From its inception, Facebook has shown a deliberate disregard for consumer privacy, trust, and safety. This attitude was evidenced by founder Mark Zuckerberg’s early IM comments, and it has continued ever since through the company’s privacy policy choices and its blatant deception and exploitation of users’ information. Consider the following points:

  • Consumers’ feelings of betrayal run so high that 70% of Facebook users say they do not trust Facebook with their personal information.
  • The FTC found Facebook’s assault on consumer privacy so egregious that last fall (2011) it charged Facebook with deceiving consumers by failing to keep their privacy promises. The FTC complaint lists a number of instances in which Facebook allegedly made promises that it did not keep:
    • In December 2009, Facebook changed its website so certain information that users may have designated as private – such as their Friends List – was made public. They didn’t warn users that this change was coming, or get their approval in advance.
    • Facebook represented that third-party apps that users installed would have access only to the user information they needed to operate. In fact, the apps could access nearly all of users’ personal data – data the apps didn’t need.
    • Facebook told users they could restrict sharing of data to limited audiences – for example with “Friends Only.” In fact, selecting “Friends Only” did not prevent their information from being shared with third-party applications their friends used.
    • Facebook had a “Verified Apps” program and claimed it certified the security of participating apps. It didn’t.
    • Facebook promised users that it would not share their personal information with advertisers. It did.
    • Facebook claimed that when users deactivated or deleted their accounts, their photos and videos would be inaccessible. But Facebook allowed access to the content, even after users had deactivated or deleted their accounts.
    • Facebook claimed that it complied with the U.S.-EU Safe Harbor Framework that governs data transfer between the U.S. and the European Union. It didn’t.

The settlement bars Facebook from making any further deceptive privacy claims, requires that the company get consumers’ approval before it changes the way it shares their data, and requires that it obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years.

Facebook also failed from the beginning to build in strong security, monitoring, or abuse-tracking technologies. These are gaps the company has attempted to patch with varying degrees of seriousness and success, as evidenced by the constant circulation of malware on the service, breaches of consumers’ information, consumers’ inability to reach a human for help when abuse occurs, and so on.

Facebook’s financial woes

It’s always about the money. With over 900 million users, Facebook is still a force to be reckoned with, but its trajectory is looking a lot more like the peak before the cliff that MySpace faced.

Facebook’s financial problems have been building for some time, but the company’s IPO brought financial scrutiny to the forefront and highlighted its need to infuse new blood into the service – even if that means exposing children to a service that already poses clear risks to adults. Here’s a quick recap of Facebook’s financial failings:

  • Facebook stock is in free fall, closing at $26.90 on June 4th, when this article was written. That’s down more than $11 from the $38 offering price, or 29%, in the first 17 days of trading – and the stock continued to fall in after-hours trading.
  • The IPO valuation fiasco is far from over; Zuckerberg is now being sued for selling more than $1 billion worth of shares just before the stock price plummeted. The suit says Facebook “knew there was not enough advertising revenue to support a $38 stock valuation but hid that revenue information in order to push up the share price.”
  • In April, Facebook’s payments revenue went flat, according to Business Insider. After growing a consistent 20% quarter over quarter, in the first quarter of this year “revenue from payments and other fees [from games and partners] actually fell slightly, according to its latest filing with the Securities and Exchange Commission.”

  • A new Reuters/Ipsos poll shows that 4 out of 5 Facebook users have never bought a product or service as a result of ads or comments on the site, highlighting Facebook’s inability to successfully market to users.

  • The amount of time users spend on Facebook has also gone flat, according to ComScore – a fact also highlighted by the Reuters/Ipsos poll, which found that 34% of Facebook users surveyed were spending less time on the website than six months ago, whereas only 20% were spending more.

  • Advertisers are bailing. A nasty reality check came from General Motors just days before the company’s IPO, when GM pulled its ad campaigns, saying “their paid ads had little effect on customers,” according to the Wall Street Journal.

  • An article on HuffingtonPost.com reports that “more than 50 percent of Facebook users say they never click on Facebook’s sponsored ads, according to a recent Associated Press-CNBC poll. In addition, only 12% of respondents say they feel comfortable making purchases over Facebook, begging the question of how the social network can be effectively monetized.”

It’s easy to see why investors are angry, lawsuits have been filed and the government is investigating the debacle.

When 54% of consumers distrust the safety of making purchases through Facebook, 70% of consumers say they do not trust Facebook with their personal information, and reports are surfacing that consumer distrust in Facebook deepened as a result of issues around the IPO, Facebook is looking more tarnished than ever.

As Nicholas Thompson wrote in The New Yorker, “Facebook, more than most other companies, needs to worry deeply about its public perception. It needs to be seen as trustworthy and, above all, cool. Mismanaging an I.P.O. isn’t cool; neither is misleading shareholders. Government investigations of you aren’t cool either.” He added, “The reputation of Facebook’s management team has also been deeply tarnished, particularly by the accusations that it wasn’t entirely open to investors about declining growth in its advertising business.”

There is a need for a responsible social network where children, adults and commercial content can mix, but Facebook isn’t it

Facebook has identified a real gap. There is a legitimate need for a social networking platform where kids can interact with adult family members and other trusted older individuals as well as commercial entities.

This need is evidenced by the 5.6 million underage youth still using Facebook today, with or without their parents’ permission, for lack of a more appropriate alternative.

This 5.6 million underage user number is noteworthy for two reasons:

A) It shows a significant number of children are using the site.

B) More importantly, it represents a 25.3% reduction in underage users over the past year, from the 7.5 million underage users Consumer Reports found on the site (7.5 million − 5.6 million = 1.9 million fewer, and 1.9/7.5 ≈ 25.3%). One could reasonably assume that the dramatic drop in underage use of Facebook occurred precisely because Consumer Reports published its data and alarmed parents stepped in to block their children’s use.

That 25.3% reduction strikes at the very heart of two of Facebook and its advocates’ key tenets: 1) since so many underage kids are already using Facebook, it would be safer for them if Facebook opened the service to underage users and gave parents some access and controls to manage their use, and 2) parents want their children to be able to use Facebook.

To be clear, many of the 5.6 million children still on the service have parents who helped them get onto Facebook. According to Dr. Danah Boyd, one of the researchers studying parents who help their children get on the site, the reason parents gave for allowing their children on Facebook was that they “want their kids to have access to public life and, today, what public life means is participating even in commercial social media sites.”

Boyd added that the parents helping their kids with access “are not saying get on the sites and then walk away. These are parents who have their computers in the living room, are having conversations with their kids, often helping them create their accounts to talk to grandma.”

Note that Boyd’s findings don’t say parents want their children on Facebook. The findings say parents want their children to have access to public life. Given the dearth of alternative options, they allow their kids on Facebook with considerable supervision. Why with considerable supervision? Because the site has inherent safety issues for users of all ages, and those issues would be greater for kids.

To date, Facebook has chosen not to cater to children under 13 because doing so requires complying with the Children’s Online Privacy Protection Act (COPPA), which Facebook advocate Larry Magid suggests “can be difficult and expensive” – yet hundreds of companies that take children’s privacy seriously comply with those requirements today.

It is more than a little suspicious that Facebook made public its consideration of opening the service to children just after it doubled its lobbying budget and just before the upcoming review of COPPA requirements – where the company will have the opportunity to press for weaker COPPA regulations.

Would it be safer for kids using the site to have the additional protections Facebook suggests implementing? Yes. But that’s the wrong question. It is also safer to give kids cigarettes that have filters than to let kids sneak cigarettes without filters.

The real question is how do we fill the existing gap that compels kids to get onto a service that was never designed to protect them?

We need a service that has all the safety, privacy, and security protections of the best children’s sites while also allowing access to the broader public, news, events, and even appropriate advertising. Such a service could easily interface with aspects of Facebook, yet leverage reputations, content filtering and monitoring, and human moderators to provide an environment that Facebook does not have, nor has shown any interest in creating. A rough sketch of how those layers might fit together follows below.
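To make this concrete, here is a minimal sketch, in Python, of how such a layered pipeline might gate content before it reaches a child’s feed – an automated filter, a sender-reputation check, and a human-review queue. The filter terms, reputation scores, and thresholds are hypothetical placeholders for illustration, not a description of any existing product.

```python
# Hypothetical sketch of a layered safety pipeline for a kids' social feed:
# an automated content filter, a sender-reputation check, and a human-review
# queue. Filter lists and thresholds are illustrative placeholders only.
from dataclasses import dataclass, field
from typing import Dict, List

BLOCKED_TERMS = {"scam-link.example", "free-prize"}   # placeholder filter list
MIN_REPUTATION = 0.8                                  # placeholder threshold

@dataclass
class Post:
    author: str
    text: str

@dataclass
class SafetyPipeline:
    reputations: Dict[str, float]                 # author -> score in [0, 1]
    review_queue: List[Post] = field(default_factory=list)

    def admit(self, post: Post) -> bool:
        # 1. Automated filter: reject anything matching known-bad terms.
        if any(term in post.text.lower() for term in BLOCKED_TERMS):
            return False
        # 2. Reputation: authors below the threshold are held for a human
        #    moderator instead of being published automatically.
        if self.reputations.get(post.author, 0.0) < MIN_REPUTATION:
            self.review_queue.append(post)
            return False
        # 3. Trusted author, clean content: publish.
        return True

pipeline = SafetyPipeline(reputations={"grandma": 0.95, "stranger": 0.1})
print(pipeline.admit(Post("grandma", "Happy birthday!")))  # True: published
print(pipeline.admit(Post("stranger", "hi there")))        # False: held
print(len(pipeline.review_queue))                          # 1 post awaiting review
```

The point of the layering is that no single mechanism is trusted on its own: automated filtering catches the obvious abuse cheaply, while everything from unproven senders is routed to a human moderator before a child ever sees it.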

This service is eminently buildable with technologies available today, but Facebook’s track record shows it is not a company that can be entrusted with building an online service capable of, and committed to, protecting our children’s online safety, privacy, and security.

Facebook’s proposal is all about their financial needs, not the needs of children.

Linda


FTC: Trample Children’s Privacy, and You’ll Pay the Price

August 31, 2011

The FTC has levied a $50,000 fine against the developer of such children’s apps as Zombie Duck Hunt, Truth or Dare, Cootie Catcher, Emily’s Girl World, Emily’s Dress Up, and Emily’s Runway High Fashion for collecting information from children without first gaining parental consent.

W3 Innovations was charged with collecting and storing children’s email addresses and allowing children to post personal information on public message boards – a violation of the Children’s Online Privacy Protection Act (COPPA), which requires parental consent for data collection about or from a child under the age of 13.

The apps found in violation of COPPA were marketed toward children in the Apple App Store, where more than 50,000 downloads were made before the FTC discovered that the company was encouraging children to independently enter personal data through the games, according to an article on digitaltrends.com. Additionally, the Emily apps encouraged children to email comments to “Emily” on the Emily blog.

Jon Leibowitz, the chairman of the commission, said in a statement: “The F.T.C.’s COPPA Rule requires parental notice and consent before collecting children’s personal information online, whether through a Web site or a mobile app. Companies must give parents the opportunity to make smart choices when it comes to their children’s sharing of information on smart phones.”

This marks the first mobile COPPA case the FTC has reviewed, and it is unlikely to be the last. While I wholly support the protection of children’s privacy and regulations around obtaining parental consent, the antiquated methods currently employed to gain this consent need some rethinking.

In an age of instant access, companies and parents must be able to exchange an information request and verifiable consent within moments, or the experience for a child who simply wants to play a game is dismal. A rough sketch of what such an exchange could look like follows below.
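As an illustration only, here is a minimal sketch, in Python, of a near-instant exchange: the service issues a short-lived, signed consent request, the parent approves it through a verified channel, and the approval is checked before the child’s session proceeds. Every name, lifetime, and parameter here is a hypothetical placeholder, not an existing COPPA-compliance API.

```python
# Hypothetical sketch of a fast, verifiable parental-consent exchange.
# A real system would deliver the request to the parent over a verified
# channel (e.g. a card-verified parent account); all values are placeholders.
import hashlib
import hmac
import secrets
import time

SERVER_KEY = secrets.token_bytes(32)   # service-side signing key (placeholder)
TOKEN_TTL = 15 * 60                    # consent request valid for 15 minutes

def issue_consent_request(child_id: str) -> dict:
    """Create a signed, short-lived consent request to send to the parent."""
    issued_at = int(time.time())
    payload = f"{child_id}:{issued_at}".encode()
    sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return {"child_id": child_id, "issued_at": issued_at, "sig": sig}

def verify_consent_response(resp: dict) -> bool:
    """Check the parent's approval: untampered signature and not expired."""
    payload = f"{resp['child_id']}:{resp['issued_at']}".encode()
    expected = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, resp["sig"]):
        return False                   # forged or altered request
    return time.time() - resp["issued_at"] <= TOKEN_TTL

request = issue_consent_request(child_id="child-123")
# ...the parent reviews the data-collection notice and approves it...
print(verify_consent_response(request))   # True while the request is fresh
```

The mechanics are deliberately simple; the hard part is the verified channel to the parent, which is exactly where industry needs better authentication and approval methods.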

Industry, it’s time to step up to better methods for authentication and approval; those who don’t will find their apps go unused.

Linda