By Sophie Gilbert ’19, News Editor

Ever since Facebook Live was introduced for public use at the beginning of 2016, the problem of violent attacks being streamed over the popular social media platform has persisted. From the streamed torture of a special needs student in January to countless other videos of assaults, Facebook now faces a serious challenge: how to regulate a feature designed so that anybody who wants to can share live video with the world.
Now, catalyzed by the livestreamed murder of 74-year-old Robert Godwin in Cleveland in April, Facebook is taking bigger steps toward tightening the regulation of its livestreams. Mark Zuckerberg announced on Facebook: “Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.” This was not the only change Facebook planned to make; Zuckerberg added, “We’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us…easier for them to contact law enforcement if someone needs help.” Zuckerberg referenced a recent case in which a teen streamed her suicide attempt on Facebook, and viewers of the livestream were able to alert the authorities and save her life. His goal is that if another incident like this were to occur, there would always be somebody monitoring the livestream who could get the person the help they needed. These changes will hopefully reduce the amount of violent content and ensure that what does appear is found faster and taken down immediately.
Earlier this month, community activists in Chicago met with Facebook officials to discuss the problem and move toward a solution. One leader of the meeting, Rev. Jesse Jackson, stated that Facebook’s plan to add 3,000 more staff members to review content was a good first step, but that more needed to be done. He suggested that Live be shut down for 30 days, but Facebook does not plan to shut down the service, only to tighten its regulations. There is no single right solution to this issue, and even within the NA community, students have different ideas about the best way to regulate the content. Neha Maddali ’19 says, “There isn’t much they can do, especially if the person has no previous indication of criminal activity, but Facebook should block and remove the content as soon as possible.” Melisa Yaman ’19 agrees but adds, “Other people who see the video should report it right away.”
This is not to say that Facebook is completely overrun with brutal, violent images and videos. Facebook Live has also brought communities together around joyful moments, such as the recent long-awaited video of April the Giraffe giving birth. However, the use of Facebook Live as an easy way to reach a large audience while committing violent acts remains a pressing issue. Because Facebook is such a popular and widely used platform, anybody could come across these videos. For example, school Facebook pages like NA Announcements and NA Red Army have at least 200 members each, meaning that hundreds of people in the Newark Academy community could encounter these livestreamed attacks. With new regulations being implemented and Facebook becoming more aware of community needs, however, the onslaught of violent videos will hopefully soon decrease.
