Facebook to Add Content Monitors -- WSJ
May 04, 2017 - 3:02AM
Dow Jones News
By Deepa Seetharaman and Joshua Jamerson
Facebook Inc. said it would hire 3,000 more staffers to review
content in an attempt to curb violent or sensitive videos on its
site without scaling back its live-streaming tools.
The planned hires, announced by Chief Executive Mark Zuckerberg
in a post Wednesday, come in response to violent videos posted on
Facebook, such as one last month showing a Cleveland man fatally
shooting another man. A week later, a man in Thailand killed his
11-month-old daughter in a live video.
Mr. Zuckerberg's proposed fix, which would increase Facebook's
roster of 4,500 reviewers by two-thirds over the next year,
addresses the amount of time it takes Facebook to remove graphic
content, as opposed to preventing its site from being used to
display such content. The Cleveland video was up for roughly two
hours; the Thailand video stayed up for 24 hours.
"If we're going to build a safe community, we need to respond
quickly," Mr. Zuckerberg wrote, adding that videos posted on
Facebook of people hurting themselves and others in the past few
weeks has been "heartbreaking."
The new steps nudge Facebook toward a more active role in
policing its site, including trying to stop incidents before they
happen. In his post, Mr. Zuckerberg noted that Facebook last
week received a report about a man contemplating suicide on live
video. Facebook immediately contacted law-enforcement officials,
who were able to prevent the incident.
Facebook will still rely on its vast user base to report and
flag content, but the bulked-up team should allow it to respond to
flagged material more quickly. Mr. Zuckerberg noted that Facebook
will continue forging relationships with local officials and will
alert them if reviewers sense a user is about to harm themselves
or others.
Mr. Zuckerberg also said Facebook would make it easier for users
to report problems to the company so reviewers can more quickly
determine if a post violates its standards. The company is also
investing in artificial intelligence, in hopes that AI can one day
detect violence as it is unfolding, but that technology is a long
way off.
Meanwhile, it could be a challenge for Facebook to ramp up its
content-moderation team to 7,500. Most reviewers are contractors,
not full-time employees, and burnout is high, experts and former
workers say. A small subset of those workers monitors live videos.
"It's an incredible commitment of resources. It took them almost
10 years to devote 4,500 jobs to doing this," said Kate Klonick,
resident fellow at Yale Law School's Information Society Project
and author of a recent paper on content moderation at technology
companies. "The only thing I question is being able to maintain
quality on content moderation review as they take on, train and
update a system for this huge number of new workers."
A Facebook spokeswoman declined to say whether the new hires
would be contractors or employees, or where they would be based.
They will
handle a variety of objectionable content, including hate speech
and child exploitation, across text, images and video.
Facebook rushed out its live-streaming tool, Facebook Live, to
users last year, with Mr. Zuckerberg touting the format as raw and
visceral. But the company wasn't prepared for the extent to which
crimes, including suicides and killings, shown on live video would
capture the public's attention.
"We're working to make these videos easier to report so we can
take the right action sooner, whether that's responding quickly
when someone needs help or taking a post down," Mr. Zuckerberg
said.
Write to Deepa Seetharaman at Deepa.Seetharaman@wsj.com and
Joshua Jamerson at joshua.jamerson@wsj.com