Facebook announces 3,000 new employees to review violent content - but is it enough?

Facebook has announced it will hire 3,000 extra staff over the next year to speed up the removal of videos showing murder, suicide and other violent acts, in its most dramatic move yet to contain the damage to the company's public image.

Social media sites are grappling with the surge in live-streamed violence, shared on platforms like Facebook. Source: AAP

The announcement follows a spike in the number of people using live-streaming services to post crimes in real time, including the recent murder of an 11-month-old girl in Thailand by her father.

In Cleveland last month, Steve Stephens posted a video of the fatal shooting of Robert Godwin Sr, 74. Stephens was later shot dead by police.

With some violent videos taking as long as 24 hours to be removed, Facebook has been widely criticised for being too slow to act. The move is seen as an acknowledgement that the company must address the issue.

The hiring spree

At this year's F8 Facebook Developer Conference, held in California in April, Facebook co-founder Mark Zuckerberg conceded "we [Facebook] have a lot more to do here."

Facebook took action on Wednesday, announcing plans to bring 3,000 new employees on board to assist in the social networking site's efforts to monitor and promptly remove violent content.

In a Facebook post, Zuckerberg said: "Over the last few weeks, we've seen people hurting themselves and others on Facebook - either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community."
The statement released by Facebook co-founder Mark Zuckerberg about the company's announcement that it will hire 3,000 new employees to review content. Source: Mark Zuckerberg, Facebook.
Facebook's 3,000 new hires will join the global community operations team over the next year - a team that currently has 4,500 members. The team is responsible for reviewing the millions of reports the site receives each week, including non-video reports related to hate speech.

Days earlier, Facebook had addressed criticism that the company was slow to remove graphic content. In a statement entitled Community Standards and Reporting, released on Monday night via blog post, the company's Vice President of Global Relations, Justin Osofsky, wrote that Facebook is working to let users report content "as easily and quickly as possible".
Facebook released a statement on Community Standards and Reporting, written by the company's Vice President of Global Relations, Justin Osofsky. Source: Facebook Newsroom
Facebook shares were down slightly on the day of the announcement, but Chief Investment Officer at Florida's A&G Capital, Hilary Kramer, says the move is about improving the company's relationship with users.

"The first impression when one hears that Facebook is hiring 3,000 employees to monitor posts and to try to find the fake news that's out there amongst the Facebook posters is that it's really about public relations and trying to improve their relationship with their users. This has been a major problem for Facebook", she said.

Underlying issues to address

The founder of California-based digital consultancy firm Creatv Media, Peter Csathy, says an awareness campaign that teaches users how to respond to confronting content must first be implemented.

"There's probably a majority of Facebook users who still don't really know how to flag objectionable content", he told SBS News from California.

"Facebook users aren't necessarily the most sophisticated when it comes to social media or technology. So I think an education campaign can help mitigate some of the risks and some of the terrible impact that comes to as a reult of this."

In response to criticism that it took as much as 24 hours for Facebook to remove the footage of the Cleveland shooting and the Thai murder, the company said this was primarily because it was not alerted to the content until a considerable amount of time had elapsed.

Mr Csathy says this is "almost unfathomable", and everyday users, as well as social media sites, have a big role to play in curbing live-streamed violence.

"It's a typical story when it comes to new technology," he said. "It's unintended consequences, and nobody is to blame for any of this. It's not Facebook's fault that this happened.

"No matter how big they are, they couldn't have anticipated these kinds of terrible things that have happened. I absolutely believe that."

'Social media sites are mainstream media'

Professor Toby Miller, from Murdoch University, says major social media sites need to acknowledge the significant role they hold within society.

"Ever since Facebook started, ever since Twitter started, they have been peddling a myth, and the myth is this: We are not like a TV network. We are not like a phone network. We are truly different. We are the creatures of the public," he told SBS News.

"This is simply nonsense, and it's irresponsible to continue operating with that governing myth." 

However, unlike other media organisations, social media sites are driven by user-generated content, meaning they spend essentially nothing on production.

Professor Miller says these sites need to use their "billions of dollars in profits" to take more action.

While Facebook altered its review processes in response to criticism over its handling of the US presidential election, and has gone through two revision processes in 24 months, Professor Miller says these changes don't address the new concerns.

With millions of videos uploaded daily, the use of internet bots and artificial intelligence has been suggested as a way to automatically flag inappropriate content.

However, with news organisations rapidly turning to social media platforms to publish their content, some experts say only human assessment can accurately differentiate between violence that is newsworthy, like a protest, and graphic content that needs to be removed. 
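
As a rough illustration of why that human layer matters, the hybrid pipeline these experts describe could be sketched as follows. This is a hypothetical toy example: the Video structure, the thresholds and the upstream scoring model are all assumptions made for illustration, not Facebook's actual system.

```python
# Toy sketch of a hybrid moderation pipeline (hypothetical, not Facebook's
# system): an automated score removes only near-certain violations, and
# everything ambiguous is queued for human reviewers.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    violence_score: float  # 0.0-1.0, from some upstream classifier (assumed)

REMOVE_THRESHOLD = 0.95  # near-certain violations: remove automatically
REVIEW_THRESHOLD = 0.60  # ambiguous band: route to human review

def triage(video: Video) -> str:
    """Return the action to take for a video based on its model score."""
    if video.violence_score >= REMOVE_THRESHOLD:
        return "remove"
    if video.violence_score >= REVIEW_THRESHOLD:
        return "human_review"  # only people can judge newsworthiness
    return "allow"

if __name__ == "__main__":
    for v in (Video("clip-a", 0.97), Video("clip-b", 0.72), Video("clip-c", 0.10)):
        print(v.video_id, "->", triage(v))
```

The middle band is the design choice at issue: automation acts only on the clear-cut extremes, while the hard judgement calls, such as distinguishing protest footage from gratuitous violence, stay with human reviewers.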

"Social media sites need to acknowledge that, given they pay nothing for their content, unlike other communication and media companies, what they should be piling their money into is people who are experts at trawling through all the material that comes through, getting algorithms in place that work effectively to identify material that is offensive or dangerous", Professor Miller said.

"Given that they are lying back with all this revenue, why don't they spend it responsibly?"

'Governments need to intervene'

"These entities need to be regulated in the same way as television news and a current affairs or talk show so there can be democratic accountability for the material that they present, from fake news through to violence,” Professor Miller said.

While the United States has a "very strong guarantee of free speech in the First Amendment of the Constitution" and numerous High Court rulings show Australia has an implicit constitutional right to free speech, Professor Miller says this should not stop authorities from regulating social media platforms.

"Nowhere in the world is free speech absolute," he said. "There are always limits put on it, in terms of public safety, for example.

"And so it seems to me that there should be no difficulty for the Australian Government in deciding that it will regulate social media in the same sense in which it regulates any other media service. It has that power. There is no restriction on it."

However, he says the Chinese government is likely to be one of the few willing to criticise social media sites.

"Governments are loath to criticise social media sites because they give them free advertising," Professor Miller said.

"Facebook is popular with many of its citizens and puts governments at risk of appearing to be illiberal and anti-democratic, so I think that's why you see governments hold back."

We have a 'morbid curiosity'

While technology publication Techly's video content editor, Riordan Lee, agrees social media sites need to better monitor their content, he says that, for such a global platform, it won't always be possible.

"With 1.86 billion users, there’s going to be times where it just slips through the cracks," he told SBS News. "I imagine it will take a bit of time for Facebook's monitoring and censorship to catch up."

Like Mr Csathy and Professor Miller, Mr Lee says the onus is also on the public to help eradicate violent content from social media sites.

Authorities have expressed concern that, while thousands of users reportedly tuned into the graphic live streams, only a small portion reported the content.

"There's definitely a sort of morbid curiosity I think a lot of people have," Mr Lee said. "You see that with ISIS beheadings and things like that.

"These things tend to get a lot of views, which is quite horrifying. And there's definitely an element of human curiosity that is drawn to this, and that's something that's perhaps unfortunate, but probably completely natural and understandable."

Where to from here?

Mr Csathy says it will be impossible to eliminate violent content from social media sites completely. 

"I don't think that anything is fool-proof, so no matter how sophisticated the bots or the artificial intelligence gets, there's also increased sophistication in terms of how to get around it," he said.

"I personally find it hard to believe that can be done with a 100 per cent accuracy. This is going to need not only AI [artificial intelligence] but human intelligence and human interaction to address the situation immediately."

Thai police have reportedly begun a review process of their own to determine why it took so long for footage of the 11-month-old's murder to be taken down.

While Facebook's expansion of its review teams is a significant move towards mitigating the unprecedented issue, Mr Csathy hopes users and platforms will now be mindful of the dangers of this new technology.

"Live-streaming is a wonderful development. It's democratising, all of us can get our voices out around the world - that's very exciting," he said.

"And these are the corner cases - the tragic corner cases that nobody anticipated. Now we know they exist and all we can do is try to mitigate them the best we can. But this technology is here to stay, you can't shut it down."
Published 4 May 2017 12:35pm
By Hashela Kumarawansa