Facebook reviewing reporting practices after killing posted online - Technology & Science

The posting of a video showing a 74-year-old man's shooting death on Facebook is raising questions about whether social media sites and their users have a role to play in preventing real-world violence from finding an audience online.   

On Sunday, a Cleveland, Ohio, man who claimed to be angry at a former girlfriend shot and killed Robert Godwin Sr.

The shooter, who appeared to choose his victim randomly, later posted footage of the killing on Facebook. On Tuesday, after a multi-day manhunt, police found the suspect, Steve Stephens, 37, in Pennsylvania. Stephens killed himself as officers approached his vehicle, police said.

The video of Godwin's slaying remained on Facebook for about three hours, prompting questions about why Facebook didn't act more quickly to pull it down.   

"This is essentially the kind of worst-case scenario of something that can occur," said Fuyuki Kurasawa, York University's research chair in global digital citizenship.   

Facebook issued a public statement Monday, saying it received a report about the video containing the shooting nearly two hours after it was posted.

"We received reports about the third video, containing the man's live confession, only after it had ended," wrote Justin Osofsky, Facebook's vice-president of global operations. 

"We disabled the suspect's account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better."

Osofsky said "as a result of this terrible series of events," Facebook was reviewing its reporting practices to ensure the quickest possible response to remove content like the videos allegedly made by Stephens.

"We prioritize reports with serious safety implications for our community, and are working on making that review process go even faster."

But some social media experts, including Kurasawa, say Facebook relies too heavily on users to police violent content. 

It amounts to "the downloading of responsibility onto users," he said.

That's problematic, Kurasawa said. Although this particular case was a "clear cut" example of a video that shouldn't be there, there are "grey zones" — such as live footage of violence during political uprisings — where some users might want the content removed because of their own viewpoints, while others would argue it should stay.

Don Heider, founder of the Center for Digital Ethics and Policy at Loyola University Chicago, said social media companies need to address ethical issues like this one proactively, rather than relying on user reporting.

"They are saying, 'You're responsible. You. Every citizen out there,'" Heider said. "[They're saying], 'we're putting it on the public to be responsible for policing our product.' That can't be correct."

Facebook needs to not only develop better algorithms to quickly detect violent content, he said, but also make sure there are enough staffers keeping a human eye on what's being posted.  

That's no easy task with livestreaming, says Aimée Morrison, an associate professor specializing in New Media studies at the University of Waterloo. 

[Image] Facebook said in a statement that it is reviewing its reporting practices to ensure the quickest possible response to remove violent content. (Associated Press)

Facebook has close to two billion registered users, and "there is literally not enough people to moderate that content," Morrison said.

Facebook Live is relatively new technology for the public, she said, and "on balance, people are more glad to have it than otherwise."

"But there are ways that this tool can be exploited." 

The Facebook murder raises the question of how much responsibility, if any, people should feel as virtual bystanders when the video they are watching shows criminal or violent activity.

"Should users not simply just be able to report these types of things, or should users be held responsible for, say, calling 911 or flagging the video right away if a crime is taking place and they are witnessing it?" Morrison asked.

She emphasized that there's a big difference between being offended by content and seeing an actual crime happening.

"There should be a button that you can press that says, 'There is a crime taking place in this video and it needs to be taken down immediately,'" Morrison said.

Read more: http://www.cbc.ca/news/technology/facebook-live-murder-broadcast-responsibilities-1.4073519?cmp=rss
