Time and again, the documents show, Facebook’s researchers
have identified the platform’s ill effects. Time and again, despite
congressional hearings, its own pledges and numerous media exposés, the
company didn’t fix them. The documents offer perhaps the clearest
picture thus far of how broadly Facebook’s problems are known inside the
company, up to the chief executive himself.
By Jeff Horwitz
Mark Zuckerberg has said Facebook allows its users to speak on equal footing with
the elites of politics, culture and journalism, and that its standards
apply to everyone. In private, the company has built a system that has
exempted high-profile users from some or all of its rules. The program,
known as “cross check” or “XCheck,” was intended as a quality-control
measure for high-profile accounts. Today, it shields millions of VIPs
from the company’s normal enforcement, the documents show. Many abuse
the privilege, posting material including harassment and incitement to
violence that would typically lead to sanctions. Facebook says criticism
of the program is fair, that it was designed for a good purpose and
that the company is working to fix it.
By Georgia Wells, Jeff Horwitz and Deepa Seetharaman
Researchers inside Instagram, which is owned by Facebook, have been studying for
years how its photo-sharing app affects millions of young users.
Repeatedly, the company found that Instagram is harmful for a sizable
percentage of them, most notably teenage girls, more so than other
social-media platforms. In public, Facebook has consistently played down
the app’s negative effects, including in comments to Congress, and
hasn’t made its research public or available to academics or lawmakers
who have asked for it. In response, Facebook says the negative effects
aren’t widespread, that the mental-health research is valuable and that
some of the harmful aspects aren’t easy to address.
By Keach Hagey and Jeff Horwitz
Facebook made a heralded change to its algorithm in 2018 designed to improve its
platform—and arrest signs of declining user engagement. Mr. Zuckerberg
declared his aim was to strengthen bonds between users and improve their
well-being by fostering interactions between friends and family. Within
the company, the documents show, staffers warned the change was having
the opposite effect. It was making Facebook, and those who used it,
angrier. Mr. Zuckerberg resisted some fixes proposed by his team, the
documents show, because he worried they would lead people to interact
with Facebook less. Facebook, in response, says any algorithm can
promote objectionable or harmful content and that the company is doing
its best to mitigate the problem.
By Justin Scheck, Newley Purnell and Jeff Horwitz
Scores of Facebook documents reviewed by The Wall Street Journal show
employees raising alarms about how its platforms are used in developing
countries, where its user base is huge and expanding. Employees flagged
that human traffickers in the Middle East used the site to lure women
into abusive employment situations. They warned that armed groups in
Ethiopia used the site to incite violence against ethnic minorities.
They sent alerts to their bosses about organ selling, pornography and
government action against political dissent, according to the documents.
The documents also show the company’s response, which in many instances was inadequate or nonexistent. A Facebook spokesman said the company has
deployed global teams, local partnerships and third-party fact checkers
to keep users safe.
By Sam Schechner, Jeff Horwitz and Emily Glazer
Facebook threw its weight behind promoting Covid-19 vaccines—“a top company
priority,” one memo said—in a demonstration of Mr. Zuckerberg’s faith
that his creation is a force for social good in the world. It ended up
demonstrating the gulf between his aspirations and the reality of the
world’s largest social platform. Activists flooded the network with what
Facebook calls “barrier to vaccination” content, the internal memos
show. They used Facebook’s own tools to sow doubt about the severity of
the pandemic’s threat and the safety of authorities’ main weapon to
combat it. The Covid-19 problems make it uncomfortably clear: Even when
he set a goal, the chief executive couldn’t steer the platform as he
wanted. A Facebook spokesman said in a statement that the data shows
vaccine hesitancy among U.S. users on Facebook has declined by
about 50% since January, and that the documents show the company’s
“routine process for dealing with difficult challenges.”
By Georgia Wells and Jeff Horwitz
Facebook has come under increasing fire in recent days for its effect on young
users. Inside the company, teams of employees have for years been laying
plans to attract preteens that go beyond what is publicly known,
spurred by fear that it could lose a wave of users critical to its
future. “Why do we care about tweens?” said one document from 2020.
“They are a valuable but untapped audience.”
Adam Mosseri, head of Instagram, said Facebook is not recruiting people too young to use its apps—the current age limit is 13—but is instead trying to understand how teens and preteens use technology and to appeal to the next generation.
By Wall Street Journal Staff
A Senate Commerce Committee hearing about Facebook, teens and mental
health was prompted by a mid-September article in The Wall Street
Journal. Based on internal company documents, it detailed Facebook’s
internal research on the negative impact of its Instagram app on teen
girls and others. Six of the documents that formed the basis of the
Instagram article are published here.
By Stephanie Stamm, John West and Deepa Seetharaman
The Wall Street Journal reviewed 10 years of Facebook annual employee
lists, which showed names, titles and managers for Facebook’s staffers
and contract workers. The data show which teams under which executives
have expanded the fastest, providing an unusually detailed public view
of the company’s shifting power centers and priorities.
By Jeff Horwitz
Frances Haugen, a former Facebook product manager who gathered documents that
formed the basis for the Journal’s series, said she had grown frustrated
by what she saw as the company’s lack of openness about its platforms’
potential for harm and unwillingness to address its flaws. A Facebook spokesman, Andy Stone, said the company strives to balance free expression with safety.
“To suggest we encourage bad content and do nothing is just not true,”
he said.
By Deepa Seetharaman, Jeff Horwitz and Justin Scheck
Facebook executives have long said artificial intelligence would address the
company’s chronic problems keeping what it deems hate speech and
excessive violence off its platforms. That future is farther away than
those executives suggest, according to internal documents. Employees say
Facebook removes only a sliver of the posts that violate its rules, and
that Facebook’s AI can’t consistently identify first-person shooting
videos, racist rants and even, in one notable episode, the difference
between cockfighting and car crashes, according to the documents.
Facebook, in response, says it takes other actions to reduce how many
people view content that violates its policies and that the prevalence
of that material has been shrinking.
By Sam Schechner and Jeff Horwitz
Facebook is struggling to detect users who create multiple accounts on its
flagship platform, according to internal documents, raising questions
about how the social-media giant measures its audience. One Facebook
presentation called the phenomenon of single users with multiple
accounts “very prevalent” among new accounts, after an examination of
roughly 5,000 recent sign-ups indicated that as many as 56% were opened
by existing users. Facebook says those numbers are incorporated into
estimates it discloses of duplicate accounts, and that such accounts
pose a challenge for many large internet companies.
By Jeff Horwitz and Justin Scheck
Internal Facebook documents show that people inside the company have long
discussed a systematic approach to restrict features that
disproportionately amplify incendiary and divisive posts. Facebook
rejected those efforts because they would impede the platform’s usage
and growth. Instead, Facebook is making ad hoc decisions about groups it
deems harmful, such as a movement by far-right activists after the Jan.
6 Capitol riot to form what they called a Patriot Party. The company’s
approach puts it in the role of refereeing public discourse, which strays from its public commitment to neutrality. Facebook acknowledges tension
in its work on such viral social movements, and says it has invented new
technologies and balanced difficult trade-offs to develop its
solutions.
By Newley Purnell and Jeff Horwitz
Facebook researchers documented how its platform has contributed to divisive,
inter-religious conflict in India, according to internal records. The
company’s researchers found hate speech spiked by 300% amid bloody
riots, and that Indian users held the company responsible for failing to
prevent or police it. Facebook has traced some of the stream of hate to
influential entities tied to India’s ruling government but hasn’t taken
action amid concerns about “political sensitivities.” Facebook says
hate speech against Muslims is rising world-wide, and that the company
is working to improve enforcement on its platforms.
By Keach Hagey and Jeff Horwitz
Facebook employees and their bosses have hotly debated whether and how to
restrain right-wing publishers, with more-senior employees often
providing a check on agitation from the rank and file, according to
internal documents viewed by the Journal. The documents, which don’t
capture all of the employee messaging, didn’t mention equivalent debates
over left-wing publications. Other documents also reveal that
Facebook’s management team has been so intently focused on avoiding
charges of bias that it regularly places political considerations at the
center of its decision making. Facebook says it enforces its rules
equally and doesn’t consider politics in its decisions.
By Georgia Wells, Deepa Seetharaman and Jeff Horwitz
Facebook researchers have found that 1 in 8 of its users report engaging
in compulsive use of social media that affects their sleep, work,
parenting or relationships, and the problems were perceived by users to
be worse on Facebook than on any other major social-media platform,
according to documents reviewed by The Wall Street Journal. The
documents highlight the company’s research into the possible negative
impacts on the day-to-day lives of a broad swath of users. Facebook said
it has built tools and controls to help people manage when and how they
use its services.
By Keach Hagey and Jeff Horwitz
About 40% of traffic to Facebook pages in 2018 went to pages with content that was plagiarized or recycled, according to the company’s internal reports.
The researchers said the tactic is an effective way to build a large
audience and has been used by foreign and domestic groups that post
divisive content and peddle false information. “This is the basic game
plan used by many bad actors,” one researcher wrote. Facebook says it
has taken steps to address the issues, including removing fake accounts
and reducing distribution of unoriginal news reporting.
By Keach Hagey, Georgia Wells, Emily Glazer, Deepa Seetharaman and Jeff Horwitz
Part of Facebook’s response to the disclosures made by the whistleblower was
to push politics to the forefront. The company’s goal, according to
Republicans and Democrats familiar with its outreach, was to muddy the
waters, divide lawmakers along partisan lines and forestall a
cross-party alliance that was emerging to enact tougher rules on
social-media companies in general and Facebook in particular. “When our
work is being mischaracterized, we’re not going to apologize,” said
Facebook spokesman Andy Stone. “We’re going to defend our record.”
Source: The Wall Street Journal
See the Facebook Papers from the SEC Office of the Whistleblower:
https://facebookpapers.com/sec-documents/
See the Facebook Papers from Gizmodo:
https://gizmodo.com/facebook-papers-how-to-read-1848702919