Facebook Bug Exposed Names of Content Moderators to Terrorists -- Update
June 16, 2017 - 9:15PM
Dow Jones News
By Georgia Wells
Facebook Inc. inadvertently exposed the names of some moderators
to suspected terrorists and other groups whose content the workers
were tasked with reviewing, a flaw the company said Friday has been
fixed.
About 1,000 of Facebook's moderators were affected by the flaw,
which disclosed their names in an activity log, a spokesman said.
Clicking on a name, though, would take the viewer to the public
version of the moderator's Facebook profile page. In the vast
majority of cases, he said, moderator names weren't viewed by
administrators of these groups.
Facebook investigators believe suspected terrorists may have
viewed the profiles of fewer than six workers, the spokesman said.
According to the investigators, none of the cases involved
suspected members of the terror group ISIS, the spokesman said.
The problem began in the fall and was fixed in November, the
spokesman said. News of the flaw and subsequent fix was reported
earlier by the Guardian.
In response, Facebook made a number of changes to prevent
workers' information from becoming available externally again. The
company is also testing new accounts that won't require workers to
log in with personal Facebook accounts.
"As soon as we learned about this issue, we fixed it and began a
thorough investigation to learn as much as possible about what
happened," the Facebook spokesman said.
The fumble comes as Facebook is under scrutiny to do more to
police inappropriate content. The company has been leaning more on
artificial intelligence in recent months to block potential
terrorist posts and accounts on its platform without requiring
reviews by human moderators.
One tool combs Facebook for known terrorist imagery, such as
beheading videos, to stop them from being reposted. Another set of
algorithms attempts to identify and block propagandists from
creating new accounts after they have already been kicked off the
social network.
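Facebook hasn't published how its re-upload blocker works, but the approach it describes amounts to fingerprint matching: compute a fingerprint for each new upload and compare it against a database of fingerprints taken from previously removed material. The sketch below is a minimal, hypothetical illustration of that idea; the names are invented, and a plain SHA-256 digest stands in for the perceptual hashes a production system would use to survive re-encoding and cropping.

```python
import hashlib

# Hypothetical sketch of hash-based re-upload blocking.
# A real system would use perceptual hashing; SHA-256 is a stand-in here.

KNOWN_BAD_HASHES = {
    # Digests of previously removed imagery (placeholder value).
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest used as the media fingerprint."""
    return hashlib.sha256(media_bytes).hexdigest()

def should_block_upload(media_bytes: bytes) -> bool:
    """Block the upload if its fingerprint matches known removed content."""
    return fingerprint(media_bytes) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    sample = b"example media payload"
    print("blocked" if should_block_upload(sample) else "allowed")
```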
Facebook's previous attempts to replace humans with algorithms
haven't always succeeded. In August, the company put an algorithm
in charge of its "trending" feature, but within days the lists
featured false stories and celebrity gossip in place of serious
news.
Currently, human moderators do much of the work of deleting content
deemed to violate Facebook's terms of service, such as hate speech
and child exploitation. In May, Facebook Chief
Executive Mark Zuckerberg said the company would hire 3,000 more
staffers to review content in an attempt to curb violent or
sensitive videos.
Typically, these actions don't appear on Facebook's timeline or
logs. But because of a bug introduced last fall, when a moderator
revoked the privileges of a group administrator, a note of this
action was created in the activity log for the group.
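Facebook hasn't described the underlying code, but the failure it reported resembles a familiar audit-logging pitfall: a group-visible log entry that records the acting account on every administrative event, without special-casing internal reviewer accounts. The sketch below is entirely hypothetical (the names `Actor` and `log_admin_change` and the redaction rule are invented for illustration) and shows the kind of check whose absence would surface a moderator's name in a group's activity log.

```python
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    is_internal_moderator: bool  # internal reviewers should not appear in group-visible logs

def log_admin_change(group_log: list, actor: Actor, target: str) -> None:
    """Append a group-visible entry, redacting internal moderators.

    Hypothetical illustration: writing actor.name unconditionally, without
    the check below, would expose the moderator's identity to group admins.
    """
    shown_name = "Facebook" if actor.is_internal_moderator else actor.name
    group_log.append(f"{shown_name} removed {target} as an admin")
```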
Write to Georgia Wells at Georgia.Wells@wsj.com
(END) Dow Jones Newswires
June 16, 2017 21:00 ET (01:00 GMT)
Copyright (c) 2017 Dow Jones & Company, Inc.