Web 2.0. The quasi-public sphere. Social media.
Over the past few years, social media platforms have become a force to be reckoned with. From the use of YouTube to show the world Iranian protests during the 2009 elections to the use of Twitter and blogs to spread word about candidates in the United States, the rise of social media has shown us that everyone can have a voice.
Imagine if, in 1989, witnesses to the Tiananmen Square Massacre had carried mobile phones with cameras, later uploading still images and videos to sites like Flickr and YouTube. Though the outcome may have been the same, the narrative would have been very different. Such platforms allow citizen journalists to offer the world perspectives on a news story that were never before possible.
But with the sharing of such content comes the issue of content moderation. While showing videos that expose human rights violations, for example, certainly serves a purpose, the companies that host such content have legal responsibilities and are often obliged to remove certain material. In other cases, content moderation is a response to the needs and concerns of the community.
Victoria Grand, Senior Manager for Communications at YouTube, will speak during the “open program” of the Summit on content moderation processes, providing a “behind-the-scenes” look at YouTube's content removal and deactivation policies. The panel will also include Rebecca MacKinnon, co-founder of Global Voices and fellow at Princeton University, Jillian York of the Berkman Center for Internet & Society, and Oiwan Lam, who will discuss content removal and deactivation across a number of platforms, as well as the importance of context and transparency in dealing with activist content on these platforms.