Friday, July 13, 2007

KEEPING SOCIAL MEDIA SITES CLEAN

Social media sharing and publishing platforms have recently become ubiquitous. People use these sites to create and contribute content, to critique and comment on it, or simply to collect and consume it. Regardless of the use, people of all ages flock to sites in this realm (YouTube, Digg, Flickr, etc.).

So, how do you keep these user-generated-content sites clean for all ages and free of objectionable content? Do you remove all mature content? Do you display a warning before showing it? Or do you filter what is displayed depending on the age of the user? And what processes do you provide for the community to flag mature content? These are some of the policy challenges that social media sites must address, regardless of the medium (text, audio, or video).

Browser Preferences
As is well known, mature material on the Internet is not just pornographic content. It also includes obscene language and violent or hateful material. Such content can be posted on social media sites as text, audio, or video.

Users do have ways of filtering such content on the Web using settings in their browsers. For instance, in Internet Explorer you can use the Recreational Software Advisory Council rating service (based on the work of Dr. Donald F. Roberts of Stanford University) to filter mature content. Some search engines also provide Safe Search preferences, which can be set to filter adult material.
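Rating-based filtering of this kind can be modeled very simply: each page carries per-category rating levels, and the user sets the maximum level they will accept in each category. The sketch below is a simplified, hypothetical model of such a filter; the category names and thresholds are illustrative, not the actual RSACi vocabulary.

```python
# Illustrative per-user limits: the highest rating level the user accepts
# in each category (0 = none allowed). Categories here are assumptions.
USER_LIMITS = {"language": 1, "violence": 0, "nudity": 0}

def allowed(page_labels: dict, limits: dict = USER_LIMITS) -> bool:
    """Block the page if any rated category exceeds the user's limit.

    An unrated category is treated as level 0 (no mature content).
    """
    return all(page_labels.get(cat, 0) <= limit
               for cat, limit in limits.items())
```

Note that, as the article observes, this places the burden on the user: a page rated above any threshold is blocked outright rather than selectively filtered.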

These techniques place the onus on the user to set viewing preferences. Suspect sites and/or offending content get blocked entirely.

Selective blocking
Social media sites feature content that users generate. They do not provide editorial filters for approving content, and hence have no advance control over the material that gets published.

So, while they would like to attract users of all demographics, some controls are necessary to ensure users do not accidentally view objectionable content on the site. Besides, these sites have social responsibilities and moral obligations.


Best Practices
Social media sites need well-defined policies for flagging mature content. Getting users to declare mature content at publishing time is the first step. The site then needs to flag all such content prominently: a 'mature' icon may be displayed to warn users about the nature of the content and to prevent accidental viewing. Next, sites may have to incorporate date-of-birth-based controls to prevent underage users from viewing mature content.
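The date-of-birth gate described above can be sketched as follows. The function names and the age threshold of 18 are assumptions for illustration; an actual site's threshold would come from its own policy.

```python
from datetime import date
from typing import Optional

MATURE_CONTENT_MIN_AGE = 18  # assumption: the site's policy threshold

def age_on(dob: date, today: date) -> int:
    """A user's age in whole years as of `today`."""
    years = today.year - dob.year
    # One year less if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def can_view_mature(dob: date, today: Optional[date] = None) -> bool:
    """True if the user is old enough to see content flagged as mature."""
    today = today or date.today()
    return age_on(dob, today) >= MATURE_CONTENT_MIN_AGE
```

Computing the age from the stored date of birth at view time, rather than storing an age, avoids the gate silently going stale as users get older.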

A means for users (viewers of content) to report inappropriate content should also be provided. The site should allocate resources and define processes for reviewing complaints and taking appropriate steps, including, but not limited to, expeditious takedown of the inappropriate content. Termination of the violating user's account is another option that can be pursued.
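The report-review-takedown flow above can be sketched as a small state machine. Everything here is a hypothetical model: the post states, the flag threshold, and the function names are assumptions, not any particular site's implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    PUBLISHED = auto()
    UNDER_REVIEW = auto()
    REMOVED = auto()

@dataclass
class Post:
    post_id: str
    owner: str
    status: Status = Status.PUBLISHED
    flags: list = field(default_factory=list)

FLAG_REVIEW_THRESHOLD = 3  # assumption: flags needed to queue a human review

def flag(post: Post, reporter: str, reason: str) -> None:
    """Record a viewer's complaint; queue the post for review past a threshold."""
    post.flags.append((reporter, reason))
    if post.status is Status.PUBLISHED and len(post.flags) >= FLAG_REVIEW_THRESHOLD:
        post.status = Status.UNDER_REVIEW

def resolve(post: Post, violation_confirmed: bool) -> None:
    """A moderator's decision: take the post down or restore it."""
    post.status = Status.REMOVED if violation_confirmed else Status.PUBLISHED
```

Keeping the flag history on the post, rather than just a counter, is what lets the review team see who reported what and why, and supports the follow-on step of acting against repeat violators' accounts.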

Finally, social media site owners need to be vigilant about their sites being used to publish child pornography. Federal and state laws make it a crime to produce, possess, distribute, or sell pornographic materials that exploit or portray a minor. It must be remembered that social media sites, as service providers, are required to report child pornography incidents to law enforcement authorities.
