Many social media platforms, including Facebook and YouTube, have been criticized for violent content posted on their sites, especially after the New Zealand mosque shootings that left 50 people dead.
The New Zealand mosque shootings were streamed on Facebook Live by the shooter himself. Facebook did not remove the video until nearly half an hour after the stream began. The video was later re-uploaded to Facebook and to other social media sites, such as YouTube and Reddit, and all of these platforms then worked to remove it.
Lee McKnight, associate professor at the Syracuse University School of Information Studies, said Facebook should have noticed the live stream sooner.
“Clearly in New Zealand there was nobody at Facebook paying attention for 17 minutes while this shooting was going on,” McKnight said.
Mark Bialczak, communications specialist for the Liverpool Public Library, said social media is great for communication, but it has its negative aspects.
“Social media has its pluses and obviously, as we found out in that terrible tragedy, its minuses,” said Bialczak. “It’s something that society evolves and deals with.”
McKnight said social media platforms need to work on two things to stop violent content from being uploaded.
“As Facebook has begun to do, they’ve hired, I think, over 10,000 human monitors of their site to start paying attention to what users are uploading and doing on their site,” said McKnight. “On the other side, they need to focus on artificial intelligence and automated tools.”
McKnight said social media platforms have been trying to eradicate violent content from their sites, but they have a long way to go to solve the problem.
“It’s not going to be over in a year or five years; it’ll go on for quite some time,” McKnight said.