Facebook to Hire 3,000 New Workers to Fix Violent Video Problem

Facebook Inc will hire 3,000 more people over the next year to speed up the removal of videos showing murder, suicide and other violent acts, in its most dramatic move yet to combat the biggest threat to its valuable public image.

The hiring spree, announced by Chief Executive Mark Zuckerberg on Wednesday, comes after users were shocked by two video posts in April showing killings in Thailand and the United States.

The move is an acknowledgement by Facebook that it needs more than its recent focus on automated software to identify and remove such material.

Artificial intelligence techniques would take "a period of years ... to really reach the quality level that we want," Zuckerberg told investors after the company's earnings late on Wednesday.

"Given the importance of this, how quickly live video is growing, we wanted to make sure that we double down on this and make sure that we provide as safe of an experience for the community as we can," he said.

The problem has become more pressing since last year's introduction of Facebook Live, a service that allows any of Facebook's 1.9 billion monthly users to broadcast video and that has been marred by violent scenes.

Some violence on Facebook is inevitable given its size, researchers say, but the company has been attacked for its slow response.

UK lawmakers this week accused social media companies including Facebook of doing a "shameful" job removing child abuse and other potentially illegal material.

In Germany, the company has been under pressure to be quicker and more accurate in removing illegal hate speech and to clamp down on so-called fake news.

German lawmakers have threatened fines if the company cannot remove at least 70 percent of offending posts within 24 hours.

So far, Facebook has avoided political fallout from U.S. lawmakers or any significant loss of the advertisers it depends on for revenue. Some in the ad industry have defended Facebook, citing the difficulty of policing material from its many users. Police agencies have said Facebook works well with them.

Facebook shares fell slightly on Wednesday and edged lower after the bell following the company's quarterly earnings report.

ARTIFICIAL INTELLIGENCE

Zuckerberg, the company's co-founder, said in a Facebook post the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service. Facebook has 17,000 employees overall, not including contractors.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video. A video of a man shooting and killing another in Cleveland last month also shocked viewers.

Zuckerberg said the company would do better: "We're working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."

The 3,000 hires will fill new positions and will monitor all Facebook content, not just live videos, the company said. It did not say where the jobs would be located, although Zuckerberg said the team operates around the world.

The world's largest social network has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.
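The article does not describe how such systems work internally. As a rough, hypothetical sketch of the general approach (a classifier scores content, and anything scoring above a threshold is routed to human review), where the function names, categories, and threshold are all assumptions for illustration:

```python
# Illustrative sketch only: the article does not describe Facebook's
# actual systems. `score_content` stands in for any trained classifier;
# the categories and threshold below are assumptions.
from dataclasses import dataclass

CATEGORIES = ("violence", "pornography", "self_harm")
FLAG_THRESHOLD = 0.85  # assumed cutoff for routing to human review

@dataclass
class Post:
    post_id: str
    text: str

def score_content(post: Post) -> dict:
    """Placeholder for a real ML model returning per-category scores."""
    return {category: 0.0 for category in CATEGORIES}

def triage(post: Post) -> str:
    """Flag high-scoring posts for human review; never auto-delete."""
    scores = score_content(post)
    worst = max(scores, key=scores.get)
    if scores[worst] >= FLAG_THRESHOLD:
        return f"queue_for_review:{worst}"
    return "publish"
```

The key design point in this sketch is that automation only flags content; the final decision stays with a human reviewer, which matches the hybrid approach the article describes.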

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.
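As a hypothetical illustration of how a reporting pipeline might prioritize millions of weekly reports for human monitors (the article gives no implementation details, so the severity ordering and report fields below are assumptions):

```python
# Illustrative sketch only: severity weights and report fields are
# assumptions; the article says only that millions of user reports
# arrive each week and are reviewed by human monitors.
import heapq
from dataclasses import dataclass, field

SEVERITY = {"self_harm": 0, "violence": 1, "hate_speech": 2, "spam": 3}

@dataclass(order=True)
class Report:
    priority: int
    post_id: str = field(compare=False)
    reason: str = field(compare=False)

class ReviewQueue:
    """Orders incoming reports so reviewers see the most urgent first."""

    def __init__(self):
        self._heap = []

    def submit(self, post_id, reason):
        # Lower number = more urgent; unknown reasons go to the back.
        priority = SEVERITY.get(reason, len(SEVERITY))
        heapq.heappush(self._heap, Report(priority, post_id, reason))

    def next_for_reviewer(self):
        return heapq.heappop(self._heap) if self._heap else None

queue = ReviewQueue()
queue.submit("post_123", "spam")
queue.submit("post_456", "self_harm")
assert queue.next_for_reviewer().reason == "self_harm"  # most urgent first
```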

"Despite industry claims to the contrary, I don't know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We're just not there yet technologically," said Sarah Roberts, a professor of information studies at UCLA who looks at content monitoring.

The workers who monitor material generally work on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.

In December, two people who monitored graphic material for Microsoft Corp's (MSFT.O) services such as Skype sued the company, saying it had failed to warn them about the risks to their mental health. They are seeking compensation for medical costs, wages lost from disability and other damages.

Microsoft has disputed their claims. The company said in a statement that it takes seriously both its responsibility to remove and report imagery of child sexual exploitation and abuse and the health and resilience of its employees.

Mental health assistance plans sometimes fall by the wayside for such workers, and there is a risk of that happening at Facebook if it tries to hire 3,000 new workers quickly, Roberts said. "To do it at this scale and this magnitude, I question that," she said.

PSYCHOLOGICAL SUPPORT

Facebook says that every person reviewing its content is offered psychological support and wellness resources, and that the company has a support program in place.

When Facebook launched its live service in April 2016, Zuckerberg spoke about it as a place for "raw and visceral" communication.

"Because it's live, there is no way it can be curated," Zuckerberg told BuzzFeed News in an interview then. "And because of that it frees people up to be themselves. It's live; it can’t possibly be perfectly planned out ahead of time."

Since then, at least 50 criminal or violent incidents have been broadcast over Facebook Live, including assault, murder and suicide, The Wall Street Journal reported in March.

In January, four African-Americans in Chicago were accused of attacking an 18-year-old disabled man on Facebook Live while making anti-white racial taunts. They have pleaded not guilty.

Last month, a man in Cleveland, Ohio, was accused of shooting another man on a sidewalk and then uploading a video of the murder to Facebook, where it remained for about two hours. The suspect later fatally shot himself.

Zuckerberg said the company would keep working with community groups and law enforcement, and that there have been instances when intervention has helped.

"Just last week, we got a report that someone on Live was considering suicide," he wrote in his post. "We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate."
