How Social Media Should Block Violent Content

Violent Content on Social Media Causes Controversy

Many social media platforms, including Facebook and YouTube, have been criticized for violent content being posted on their sites, especially after the New Zealand mass shooting that left 50 people dead.

The New Zealand mosque shootings were streamed over Facebook Live by the shooter himself. Facebook did not remove the video until almost half an hour after it started. The video was later re-uploaded to Facebook and other social media sites, such as YouTube and Reddit. All these platforms then worked to remove the content.

Lee McKnight, associate professor at the Syracuse University School of Information Studies, said Facebook should have noticed the live stream sooner.

“Clearly in New Zealand there was nobody at Facebook paying attention for 17 minutes while this shooting was going on,” McKnight said.

Mark Bialzcak, the communications specialist for the Liverpool Public Library, said social media is great for communication, but it does have its negative aspects.

“Social media has its pluses and obviously, as we found out in that terrible tragedy, its minuses,” said Bialzcak. “It’s something that society evolves and deals with.”

McKnight said there are two things social media platforms need to work on in order to stop violent content from being uploaded.

“As Facebook has begun to do, they’ve hired, I think, over 10,000 human monitors of their site to start paying attention to what users are uploading and doing on their site,” said McKnight. “On the other side, they need to focus on artificial intelligence and automated tools.”

McKnight said social media platforms have been trying to eradicate violent content from their sites, but they have a long way to go to solve the problem.

“It’s not going to be over in a year or five years, it’ll go on for quite some time,” McKnight said.

(TRT=1:34) (ANC)

MANY SOCIAL MEDIA PLATFORMS HAVE RECENTLY BEEN CRITICIZED FOR ALLOWING VIOLENT CONTENT TO BE POSTED, ESPECIALLY AFTER THE MASS SHOOTING IN NEW ZEALAND.
N-C-C NEWS REPORTER SARA RIZZO TELLS US WHAT SOME SOCIAL MEDIA EXPERTS THINK SHOULD BE DONE ABOUT THE ISSUE.

[TAKE SOT]

“Clearly in New Zealand there was nobody at home, at Facebook paying attention for 17 minutes while this shooting was going on.”

(Track 1)
THE RECENT NEW ZEALAND MOSQUE SHOOTINGS THAT LEFT 50 PEOPLE DEAD WERE STREAMED OVER FACEBOOK LIVE…
FACEBOOK DIDN’T REMOVE THE VIDEO UNTIL ALMOST A HALF HOUR AFTER THE VIDEO STARTED…
THE VIDEO WAS THEN UPLOADED TO OTHER SITES SUCH AS YOUTUBE AND REDDIT, WHICH THEN WORKED TO REMOVE THE CONTENT…
COMMUNICATIONS SPECIALIST MARK BIALZCAK SAYS SOCIETY HAS TO COPE WITH THE UPS AND DOWNS OF SOCIAL MEDIA…

[TAKE SOT]

“Social media has its pluses and obviously, as we found out in that terrible tragedy, its minuses. It’s something that society evolves and deals with.”

(STANDUP)
JUST BY LOOKING AT THE HOMEPAGE OF YOUTUBE, THERE IS NO VIOLENT CONTENT VISIBLE. EVEN IF YOU GO INTO THE LIVE STREAMS, THERE’S LIVE STREAM GAMING, ANIMALS, SPORTS AND TECHNOLOGY. BUT JUST BECAUSE YOU CAN’T SEE THE VIOLENT CONTENT DOESN’T MEAN IT’S NOT THERE.
(Track 2)
PROFESSOR LEE MCKNIGHT SAYS THERE ARE TWO THINGS FACEBOOK NEEDS TO WORK ON IN ORDER TO STOP THIS CONTENT FROM BEING UPLOADED.

[TAKE SOT]

“They’ve hired I think over 10,000 human monitors of their site to start paying attention to what users are uploading and doing on their site. They need to focus on artificial intelligence and automated tools.”

(TRACK 3)
MCKNIGHT SAYS SOCIAL MEDIA PLATFORMS ARE TRYING TO ERADICATE THIS VIOLENT CONTENT, BUT THEY’RE A LONG WAY FROM SOLVING THE PROBLEM.

SARA RIZZO, N-C-C NEWS.
