Who’s responsible for filtering the Web?

There has been a longstanding legal battle over what companies should be required to do to monitor harmful content and block it from reaching minors. The debate has recently flared up again, and it is worth understanding the issues at stake.

The controversy

On one side of the content filter debate is the Justice Department. It is seeking to reinvigorate the Child Online Protection Act (COPA), enacted in 1998 to protect minors from commercially distributed pornographic content on the Internet. COPA requires commercial Web sites to secure proof of identity and age before displaying content that could be harmful to minors.

On the other side of the debate are the ACLU and a broad array of Internet content providers. They argue that COPA is flawed and that content filters give parents adequate opportunity to protect their children. They also assert that about half of the sites offering sexually explicit content are hosted outside the United States, where the law would have no bearing anyway.

Introduced into the current court hearing was a new study by Philip B. Stark, a statistics professor at the University of California, Berkeley. Stark’s research on the effects of content filtering software found that one filter—AOL’s Mature Teen—blocked up to 91 percent of sexually explicit Web sites. The study showed that less restrictive filters blocked “at least 40 percent” of explicit content. (The report did not mention how many desirable sites were blocked in the process.)

Citing Stark’s research, ACLU attorney Chris Hansen claimed that because “filters are more than 90 percent effective,” “it’s up to the parents how to use it, whereas COPA requires a one-solution-fits-all [approach].”

If only one percent of Web sites are pornographic and filters are more than 90 percent effective, what’s the issue? If blocking “harmful” content is as easy as installing a filter tool, why do 82 percent of users feel the ease of stumbling across sexually explicit material is a problem? (Consumer Reports WebWatch, 2005) There are serious flaws in the arguments on both sides of this debate.

Flaws in COPA

COPA’s flaws include the following:

  • At eight years old, COPA is based on an antiquated view of the Internet: it does not account for newer revenue models (ad-funded content and the like), and it fails to address newer functionality for sharing and distributing content, which further reduces the effectiveness of the screening that COPA depends on. It takes a very simplistic view of how ‘bad content’ can be discovered and ignores user-generated material, RSS, P2P sharing, and other innovations that have developed since 1998.
  • The regulation would apply only to U.S. companies, which means that Web sites hosted internationally (more than half of all porn sites) would not be bound by the law.
  • COPA addresses only commercially distributed content, but a great deal of “free” content also falls into the category of “harmful to minors.”
  • Forcing consumers to register to view adult material raises serious privacy and freedom of speech concerns that COPA fails to address.

Flaws in arguments of those opposed to COPA

The arguments from the ACLU and others contain flaws as well, including:

  • Chris Hansen’s claim that “filters are more than 90 percent effective” is blatantly overstated and contradicts Professor Stark’s research, which found that just one filter, at its most restrictive setting, blocked 91 percent of sexually explicit content. Hansen also did not mention the rate of over-blocking at that filter setting. (Note: over-blocking means a filter falsely blocks a legitimate site, like a *** cancer site, because it contains the word ***.) If a content filter over-blocks legitimate content too frequently, the filter is so frustrating to use that consumers give up and turn it off; the sketch after this list shows why even a modest over-blocking rate matters.
  • The Stark study indicated that less restrictive filter settings “blocked at least 40 percent of sexually explicit sites,” a number that is more realistic in terms of filter accuracy without incurring significant over-blocking. That means, however, that less restrictive settings fail to block roughly 60 percent of content deemed harmful to minors. This may be a show-stopper for many parents, given that the average age of first exposure to unwanted sexually explicit material is eleven (research by Top Ten Reviews) and 25 percent of youth have unsolicited exposure to sexually explicit content (research from Online Victimization: A Report on the Nation’s Youth).
  • An Associated Press article (14 Nov 2006), “One percent of Web sites deemed pornographic,” gives the impression that pornographic content is relatively rare, but that statement is open to challenge:
    • First, the one percent data point refers to Web sites, not to how frequently porn is presented to minors.
    • Second, this statistic is the result of a single study. Other research (from Top Ten Reviews, for example) suggests that pornographic Web sites represent 12 percent of the total.
  • The ACLU asserts that parents can take charge: that they will know where to find filters, how to download and install them, and will proactively watch out for their children’s online safety, all while achieving over 90 percent accuracy in blocking sexually explicit images. This ignores the fact that the children potentially at greatest risk are those whose parents aren’t taking the appropriate steps to protect them in any facet of their lives. COPA’s intention was to default to safer settings for the protection of minors; the ACLU and like-minded companies want to assume an unfiltered default and require proactive steps to be taken to protect minors.
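
To see why the over-blocking rate matters as much as the headline “effectiveness” number, a rough back-of-the-envelope calculation helps. In the sketch below, the one percent base rate and the 91 percent block rate come from the figures cited above; the 5 percent over-blocking rate is a purely hypothetical assumption, since the over-blocking figures from Stark’s report aren’t quoted here:

```python
# Back-of-the-envelope arithmetic on over-blocking vs. a low base rate.
# Only the 1 percent base rate and the 91 percent block rate come from the
# sources cited above; the 5 percent over-blocking rate is a hypothetical
# assumption, since the Stark report's over-blocking figures aren't quoted here.

total_sites = 10_000
explicit_share = 0.01         # "one percent of Web sites deemed pornographic"
block_rate_explicit = 0.91    # most restrictive filter setting, per Stark's study
overblock_rate = 0.05         # hypothetical false-block rate on legitimate sites

explicit_sites = total_sites * explicit_share            # 100 explicit sites
legitimate_sites = total_sites - explicit_sites          # 9,900 legitimate sites

blocked_explicit = explicit_sites * block_rate_explicit  # 91 correctly blocked
blocked_legitimate = legitimate_sites * overblock_rate   # 495 wrongly blocked

share_wrongly_blocked = blocked_legitimate / (blocked_explicit + blocked_legitimate)
print(f"Wrongly blocked share of all blocked sites: {share_wrongly_blocked:.0%}")  # ~84%
```

Under those assumptions, roughly five of every six pages the filter blocks would be legitimate, exactly the kind of frustration that leads consumers to turn a filter off, and a reminder that a headline “effectiveness” percentage says little on its own.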

Follow the money

Companies are in the business of making money and minimizing costs. Building strong filters that allow consumers to set their own content experience or the content experience of their children is complex and expensive.

  • To provide highly effective content filters, a company would need to screen all content—text, images, video, and audio.
  • Also, it isn’t a build-the-filter-once-and-you’re-set proposition. Businesses and individuals who want to circumvent the filters are constantly working on ways to do so (just as they do with spam, phishing, spyware, and virus safeguards); the sketch after this list shows how easily a simplistic filter is fooled.
  • Keep in mind that each of these filters (and updates) has to be planned, built, tested, and translated into many languages. They must account for cultural sensitivities, respect differing state and national laws, and empower consumers with enough flexibility to set their own standards.
  • Filters have to work on a dizzying array of networks, operating systems, Internet browsers, and devices including PCs, Internet-enabled cell phones, gaming devices like Xbox, and so on. Each type of device and operating system has unique development and testing requirements.
  • It isn’t enough to simply filter content that can be browsed. To really provide consumers the content filtering choices and protections they should have, filters need to be applied to content on blogs and social networking sites (like MySpace, Friendster, and Facebook), on video hosting services (like YouTube, Google Video, and Windows Live Soapbox), and to content served up by the services themselves.
  • There is also the reality that no matter how good a filter is, it won’t catch everything. Companies struggle with the concern that trying to filter and failing will open them up to greater legal exposure than if they do nothing.
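
To make the difficulty concrete, here is a deliberately naive, text-only filter. It is a toy sketch, not how any commercial product works, and it shows the two failure modes described above: it over-blocks a legitimate page that merely mentions a flagged word, and it is trivially circumvented by obfuscated spellings. Real filters must also handle images, video, audio, multiple languages, and constant evasion:

```python
# A deliberately naive keyword filter, for illustration only.
# It demonstrates two failure modes:
#   1. Over-blocking: legitimate pages that merely mention a flagged word.
#   2. Easy circumvention: trivially obfuscated spellings slip through.

BLOCKED_WORDS = {"sex", "porn"}  # toy list; real filters use far richer signals


def is_blocked(page_text: str) -> bool:
    """Return True if any flagged word appears in the page text."""
    words = page_text.lower().split()
    return any(word in BLOCKED_WORDS for word in words)


# A sex-education resource page is wrongly blocked on a single keyword match...
print(is_blocked("Middle school sex education resources for parents"))  # True

# ...while a trivially obfuscated page sails through unblocked.
print(is_blocked("Totally free p0rn and s3x videos"))                   # False
```

Every improvement on this toy approach (image classifiers, human review, per-device support, localization) adds exactly the kind of cost the bullets above describe.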

These are huge challenges—not insurmountable, but certainly not appealing for businesses that face the costs and aren’t hearing a huge outcry from consumers demanding change.

What you can do

In spite of the complexity, empowering consumers to set content filters to match their values and protecting minors are goals worth shooting for. But it’s naïve to imagine the Internet industry taking on this challenge and expense without clear regulatory requirements, strong consumer demand, and some safeguards that protect companies from penalties for any gaps. I can’t think of a single industry that has managed to successfully regulate itself and put consumer interest and safety first.

For regulators and law enforcement: Focus your energy on these three areas:

  • In its current form, COPA won’t be successful for the reasons cited above. Rather than fighting to enforce COPA as it is written today, revise and modernize it so that it provides the intended benefits without compromising free speech and privacy.
  • Provide companies protection from legal exposure if harmful material slips through when they have demonstrated diligence in providing strong filters.
  • Work across state and national borders to standardize regulatory requirements to minimize the breadth of legal variables companies will face when building filters.

For consumers: Let companies and elected officials know that you demand that your safety and values be protected and respected; if you don’t let companies know your expectations, it will surely take longer to achieve them. (To fuel your demands, read my blog post, Your Internet Safety Bill of Rights.)

For Internet companies:

  • Increase your investments in researching and building robust filters that provide consumers the safety and flexibility they need.
  • Make safety a top priority in building consumer trust and loyalty.
  • Reach out across the industry to establish standards and best practices.

Linda
