Keeping Internet Sites Clean Wreaks Havoc on Screeners

The New York Times recently published an excellent article titled Policing the Web’s Lurid Precincts, detailing the issues content reviewers face as they try to ensure that inappropriate material does not land on their sites’ pages.

Having been a content screener in the very early years of consumer content uploads, I am all too familiar with the haunting, nauseating content – the text, images, and videos – filled with hate, rage, murder, and abuse of men, women, children, babies, and animals that these workers have to view by the thousands daily. These images never leave your mind; they become a sort of rolodex-of-horror that at times has to be actively suppressed.

I am a strong advocate for therapy for anyone who has to see or experience this horror, and I firmly supported the Online Safety and Technology Working Group’s recommendation to Congress to provide financial incentives for companies to “address the psychological impact on employees of exposure to these disturbing images.” The group’s recommendations have been submitted to the National Telecommunications and Information Administration, which advises the White House on digital policy.

The only thing worse than failing to provide therapy for screeners is providing no screeners at all, and thereby victimizing users

Some sites, like Facebook, not only fail to provide screeners with therapy; they fail to provide screeners at all – choosing to exploit their users instead of covering the cost of professional screeners. This common strategy requires users to flag questionable content, then hands material that needs further human review off to outsourcing companies that can do the work at low cost. (Note: Facebook has run some limited trials of professional content review.)

This practice leaves consumers – including the youngest consumers – exposed to horrific images, videos, content, and concepts. Normal users are utterly unprepared to deal with images of sadistic child abuse, animal torture, and murder, and many will never be able to erase explicit abusive content from their minds. Frankly, expecting users to view and report such images constitutes abuse in its own right.

Internet companies prefer not to discuss their content moderation, partly because they would rather not draw attention to the ugliness of the content that lands on their sites. But as users you have the right to know – and should demand to know – what the screening process is on any site you, or your child, uses.

You should lodge a clear protest against any site that uses you, or your child, as its primary screening resource, and stick to sites that respect your mental health by filtering out these images for you.

Linda
