Go deface and purge CityxGuide.com, Backpage.com, and 1backpage.com, because they are a waste of electricity.
2021-03-05 03:57:22 UTC
Sacha Baron Cohen was busy last year. In “The Trial of The Chicago 7,” he
portrayed 1960s antiwar activist Abbie Hoffman. In “Borat Subsequent
Moviefilm,” he revived his role as a Kazakh journalist touring America.
These films may seem historical or comedic, respectively, but Baron Cohen
says their themes — abuse of political power, misogyny and disinformation —
are reflective of our current reality. He blames two things: Donald Trump
and social media.
In this episode of “Sway,” Kara Swisher and Baron Cohen discuss whether
Silicon Valley C.E.O.s should be liable for the content on their platforms,
what the rift in the Democratic Party means for future elections and — of
course — what else happened with Rudy Giuliani.
Credit: Illustration by The New York Times; photograph by George Pimentel/WireImage, via Getty Images
Related Episodes
Opinion | Kara Swisher
Bryan Cranston Won’t Play Donald Trump
Jan. 19, 2021
Opinion | Kara Swisher
In Hollywood, Women Are Seen as ‘a Risk’
Nov. 30, 2020
Thoughts? Email us at ***@nytimes.com. Transcripts of each episode are
available midday.
Special thanks to Kathy Tu, Michelle Harris, Shannon Busta and Liriel Higa.
“Sway” is produced by Nayeema Raza, Heba Elorbany, Matt Kwong, Daphne Chen
and Vishakha Darbha and edited by Nayeema Raza and Paula Szuchman;
fact-checking by Kate Sinclair; music and sound design by Isaac Jones;
mixing by Erick Gomez.
Kara Swisher is the host of “Sway,” an Opinion podcast, and a contributing
writer. She has reported on technology and technology companies since the
early days of the internet. @karaswisher
"Mark Changed The Rules": How Facebook Went Easy On Alex Jones And Other
Right-Wing Figures
Facebook’s rules to combat misinformation and hate speech are subject to the
whims and political considerations of its CEO and his policy team leader.
By Ryan Mac and Craig Silverman, BuzzFeed News Reporters
Last updated on February 22, 2021, at 1:14 p.m. ET
Posted on February 21, 2021, at 9:59 a.m. ET
In April 2019, Facebook was preparing to ban one of the internet’s most
notorious spreaders of misinformation and hate, Infowars founder Alex Jones.
Then, CEO Mark Zuckerberg personally intervened.
Jones had gained infamy for claiming that the 2012 Sandy Hook elementary
school massacre was a “giant hoax,” and that the teenage survivors of the
2018 Parkland shooting were “crisis actors.” But Facebook had found that he
was also relentlessly spreading hate against various groups, including
Muslims and trans people. That behavior qualified him for expulsion from the
social network under the company's policies for "dangerous individuals and
organizations," which required Facebook to also remove any content that
expressed “praise or support” for them.
But Zuckerberg didn’t consider the Infowars founder to be a hate figure,
according to a person familiar with the decision, so he overruled his own
internal experts and opened a gaping loophole: Facebook would permanently
ban Jones and his company — but would not touch posts of praise and support
for them from other Facebook users. This meant that Jones’ legions of
followers could continue to share his lies across the world’s largest social
network.
"Mark personally didn’t like the punishment, so he changed the rules,” a
former policy employee told BuzzFeed News, noting that the original rule had
already been in use and represented the product of untold hours of work
between multiple teams and experts.
“Mark personally didn’t like the punishment, so he changed the rules.”
“That was the first time I experienced having to create a new category of
policy to fit what Zuckerberg wanted. It's somewhat demoralizing when we
have established a policy and it’s gone through rigorous cycles. Like, what
the fuck is that for?” said a second former policy employee who, like the
first, asked not to be named so they could speak about internal matters.
“Mark called for a more nuanced policy and enforcement strategy,” Facebook
spokesperson Andy Stone said of the Alex Jones decision, which also affected
the bans of other extremist figures.
Zuckerberg’s “more nuanced policy” set off a cascading effect, the two
former employees said, which delayed the company’s efforts to remove
right-wing militant organizations such as the Oath Keepers, which were
involved in the Jan. 6 insurrection at the US Capitol. It is also a case study
in Facebook’s willingness to change its rules to placate America’s right
wing and avoid political backlash.
Internal documents obtained by BuzzFeed News and interviews with 14 current
and former employees show how the company’s policy team — guided by Joel
Kaplan, the vice president of global public policy, and Zuckerberg’s whims —
has exerted outsize influence while obstructing content moderation
decisions, stymieing product rollouts, and intervening on behalf of popular
conservative figures who have violated Facebook’s rules.
In December, a former core data scientist wrote a memo titled, “Political
Influences on Content Policy.” Seen by BuzzFeed News, the memo stated that
Kaplan’s policy team “regularly protects powerful constituencies” and listed
several examples, including: removing penalties for misinformation from
right-wing pages, blunting attempts to improve content quality in News Feed,
and briefly blocking a proposal to stop recommending political groups ahead
of the US election.
Since the November vote, at least six Facebook employees have resigned with
farewell posts that have called out leadership’s failures to heed its own
experts on misinformation and hate speech. Four departing employees
explicitly cited the policy organization as an impediment to their work and
called for a reorganization so that the public policy team, which oversees
lobbying and government relations, and the content policy team, which sets
and enforces the platform’s rules, would not both report to Kaplan.
Facebook declined to make Kaplan or other executives available for an
interview. Stone, the company spokesperson, dismissed concerns about the
vice president’s influence.
“Recycling the same warmed over conspiracy theories about the influence of
one person at Facebook doesn’t make them true,” he said. “The reality is big
decisions at Facebook are made with input from people across different teams
who have different perspectives and expertise in different areas. To suggest
otherwise is absurd.”
An integrity researcher who worked on Facebook’s efforts to protect the
democratic process and rein in radicalization said the company caused direct
harm to users by rejecting product changes due to concerns of political
backlash.
"At some point Zuckerberg has to be held responsible for his role in
allowing his platform to be weaponized."
“Out of fears over potential public and policy stakeholder responses, we are
knowingly exposing users to risks of integrity,” they wrote in an internal
note seen by BuzzFeed News. They quit in August.
Those most affected by Jones’ rhetoric have taken notice, too. Lenny Pozner,
whose 6-year-old son Noah was the youngest victim of the Sandy Hook
shooting, called the revelation that Zuckerberg weakened penalties facing
the Infowars founder “disheartening, but not surprising.” He said the
company had made a promise to do better in dealing with hate and hoaxes
following a 2018 letter from HONR Network, his organization for survivors of
mass casualty events. Yet Facebook continues to fail to remove harmful
content.
“At some point,” Pozner told BuzzFeed News, “Zuckerberg has to be held
responsible for his role in allowing his platform to be weaponized and for
ensuring that the ludicrous and the dangerous are given equal importance as
the factual.”
“Different Views on Different Things”
Samuel Corum / Getty Images
Mark Zuckerberg and Joel Kaplan chat after leaving a meeting with Sen. John
Cornyn (R-TX) in his office on Capitol Hill on September 19, 2019 in
Washington, DC.
Kaplan’s close relationship with Zuckerberg has led the CEO to weigh
politics more heavily when making high-profile content policy enforcement
decisions, current and former employees said. Kaplan’s efforts to court the
Trump White House over the past four years — from his widely publicized
support for Supreme Court nominee Brett Kavanaugh to his interventions on
behalf of right-wing influencers in Facebook policy decisions — have also
made him a target for civil rights groups and Democratic lawmakers.
In June 2020, three Democratic senators asked in a letter what role Kaplan
played “in Facebook’s decision to shut down and de-prioritize internal
efforts to contain extremist and hyperpolarizing activity.” Sen. Elizabeth
Warren called him out for overseeing a lobbying effort that spends millions
of dollars to influence politicians. With a new presidential administration
in place and a spate of ongoing antitrust lawsuits, Zuckerberg must now
grapple with the fact that his top political adviser may no longer be a
Washington, DC asset but a potential liability.
“I think that everybody in DC hates Facebook. They have burned every
bridge,” said Sarah Miller, executive director of the American Economic
Liberties Project and a former member of Joe Biden’s presidential transition
team. Democrats are incensed with the platform’s tolerance of hate speech
and misinformation, while “pulling Trump off the platform” has brought new
life to Republican gripes with the company, she said.
“Facebook has fires to put out all across the political spectrum,” Miller
added.
“I think that everybody in DC hates Facebook. They have burned every
bridge.”
When Kaplan joined Facebook to lead its DC operation in 2011, he had the
connections and pedigree the company needed to court the American right. A
former clerk for conservative Supreme Court Justice Antonin Scalia, he
served as a White House deputy chief of staff under President George W. Bush
after participating in the Brooks Brothers riot during the 2000 Florida
presidential election dispute. During a Senate confirmation hearing in 2003
for a post with the Office of Management and Budget, Kaplan was questioned
about his role in the event, which sought to stop the tallying of votes
during the Florida recount.
Though he initially maintained a low public profile at Facebook, Kaplan —
COO Sheryl Sandberg’s Harvard classmate and former boyfriend — was valued by
Zuckerberg for his understanding of GOP policymakers and conservative
Americans, who the CEO believed were underrepresented by a liberal-leaning
leadership team and employee base.
A chart that shows top leadership at Facebook
BuzzFeed News; Getty Images
By 2014, he’d been promoted to vice president of global public policy. In
that role, Kaplan oversaw the company’s government relations around the
world as well as its content policy team. That arrangement raised eyebrows,
as other companies, including Google and Twitter, typically keep public
policy and lobbying efforts separate from teams that create and enforce
content rules.
The candidacy and election of Donald Trump made Kaplan even more valuable to
the company. He served as Zuckerberg’s policy consigliere, helping Facebook
navigate the sea of lies and hate the former president conjured on the
platform as well as the outraged public response to it. In December 2015,
following a Facebook post from Trump calling for a “total and complete
shutdown” of Muslims entering the US — the first of many that forced the
company to grapple with the then candidate’s racist and sometimes violent
rhetoric — Kaplan and other executives advised Facebook’s CEO to do nothing.
“Don’t poke the bear,” Kaplan said, according to the New York Times, arguing
that taking action against Trump’s account would invite a right-wing
backlash and accusations that the site was limiting free speech. It’s an
argument he’d repeat in various forms over the ensuing five years, with
Zuckerberg often in agreement.
During that time, Kaplan rarely communicated openly on Facebook’s internal
message boards or spoke at companywide meetings, according to current and
former employees. When he did, however, his appearances were clouded in
controversy.
Do you work at Facebook or another technology company? We'd love to hear
from you. Reach out to ***@buzzfeed.com, ***@buzzfeed.com,
or via one of our tip line channels.
After a Facebook team led by then–chief security officer Alex Stamos found
evidence of Russian interference on the platform during and after the 2016
US presidential election, Kaplan was part of a leadership group that argued
against disclosing the full extent of the Kremlin’s influence operation.
When the company did end up publicly releasing further information about it
in October 2017, it was Kaplan, not Stamos, who answered employee questions
during an internal town hall.
“They could have sent me,” said Stamos, who subsequently left the company
over disagreements related to Russian interference. “The person who was
presenting [evidence of the Russian campaign] to VPs was me.”
It was Kaplan’s appearance at Kavanaugh’s September 2018 Senate confirmation
hearings, however, that pushed him into the national spotlight. Sitting
behind the nominee, he was visible in TV coverage of the event. Employees
were furious; they believed Kaplan's attendance made it look like Facebook
supported the nominee, while dismissing the allegations of sexual assault
against him.
Jim Bourg / Reuters
Joel Kaplan sits with family members, friends, and supporters of Supreme
Court nominee Brett Kavanaugh during his testimony before a Senate Judiciary
Committee confirmation hearing in Washington, DC, on Sept. 27, 2018.
Kaplan subsequently addressed the incident at a companywide meeting via
videoconference, where angry workers, who felt his on-camera appearance was
intentional, hammered him with questions. The confirmation also caused deep
wounds inside Kaplan's own organization. During a Facebook public policy
team meeting that fall to address the hearing and the vice president's
appearance, one longtime manager, who had written a blog post detailing her experience of being sexually assaulted, tearfully argued to a male colleague: “It doesn’t matter how well you know someone; it doesn’t mean they didn’t do what somebody said they did.”
None of this changed Kaplan’s standing with Zuckerberg. The CEO went to DC
in September 2019 and was shepherded around by Kaplan on a trip that
included a meeting with Trump. Kaplan remained friendly with the Trump White
House, which at one point considered him to run the Office of Management and
Budget.
“Many people feel that Joel Kaplan has too much power over our decisions.”
In May, when Zuckerberg decided to not touch Trump’s “when the looting
starts, the shooting starts” incitement during the George Floyd protests,
workers became incensed. At a subsequent companywide meeting, one of the
most upvoted questions from employees directly called Kaplan out. “Many
people feel that Joel Kaplan has too much power over our decisions,” the
question read, asking that the vice president explain his role and values.
Zuckerberg seemed irked by the question and disputed the notion that any one
person could influence the “rigorous” process by which the company made
decisions. Diversity, the CEO argued, means taking into account all
political views.
“That basically asked whether Joel can be in this role, or can be doing this
role, on the basis of the fact that he is a Republican … and I have to say
that I find that line of questioning to be very troubling,” Zuckerberg said,
ignoring the question. “If we want to actually do a good job of serving
people, [we have to take] into account that there are different views on
different things.”
Facebook employees said Zuckerberg remains stalwart in his support for
Kaplan, but internal pressure is building to reduce the public policy team’s
influence. Colleagues “feel pressure to ensure their recommendations align
with the interests of policymakers,” Samidh Chakrabarti, head of Facebook’s
civic integrity team, wrote in an internal note in June, bemoaning the
difficulty of balancing such interests while delivering on the team’s
mandate: stopping abuse and election interference on the platform. The civic
integrity team was disbanded shortly after the election, as reported by the
Information.
“They attribute this to the organizational incentives of having the content
policy and public policy teams share a common root,” Chakrabarti said. “As
long as this is the case, we will be prematurely prioritizing regulatory
interests over community protection.”
Stamos, who is now head of the Stanford Internet Observatory, said the
policy team’s structure will always present a problem in its current form.
“You don’t want platform policy people reporting to someone who’s in charge
of keeping people in government happy,” he said. “Joel comes from the Bush
White House, and government relations does not have a neutral position on
speech requests.”
“Fear of Antagonizing Powerful Political Actors”
Facebook's campus
Josh Edelson / Getty Images
In August, a Facebook product manager who oversees the News Feed updated his
colleagues on the company’s preparations for the 2020 US election.
Internal research had shown that people on Facebook were being polarized on
the site in political discussion groups, which were also breeding grounds
for misinformation and hate. To combat this, Facebook employees who were
tasked with protecting election integrity proposed the platform stop
recommending such groups in a module called “Groups You Should Join.”
But the public policy team was afraid of possible political blowback.
“Although the Product recommendation would have improved implementation of
the civic filter, it would have created thrash in the political ecosystem
during [the 2020 US election],” the product manager wrote on Facebook's
internal message board. “We have decided to not make any changes until the
election is over.”
The social network eventually paused political group recommendations — just
weeks before the November election — and removed them permanently only after
the Capitol insurrection on Jan. 6. Current and former employees said
Facebook's decision to ignore its integrity team's guidance and initially
leave group recommendations untouched exemplifies how political calculations
often quashed company initiatives that could have blunted misinformation and
radicalization.
In that same update about group recommendations, the product manager also
explained how leaders decided against making changes to a feature called In
Feed Recommendations (IFR) due to potential political worries. Designed to
insert posts into people’s feeds from accounts they don’t follow, IFR was
intended to foster new connections or interests. For example, if a person
followed the Facebook page for a football team like the Kansas City Chiefs,
IFR might add a post from the NFL to their feed, even if that person didn’t
follow the NFL.
One thing IFR was not supposed to do was recommend political content. But
earlier that spring, Facebook users began complaining that they were seeing
posts from conservative personalities including Ben Shapiro in their News
Feeds even though they had never engaged with that type of content.
When the issue was flagged internally, Facebook’s content policy team warned
that removing such suggestions for political content could reduce those
pages’ engagement and traffic, and possibly inspire complaints from
publishers. A News Feed product manager and a policy team member reiterated
this argument in an August post to Facebook’s internal message board.
“A noticeable drop in distribution for these producers (via traffic insights
for recommendations) is likely to result in high-profile escalations that
could include accusations of shadow-banning and/or FB bias against certain
political entities during the US 2020 election cycle,” they explained.
Shadow-banning, or the limiting of a page’s circulation without informing
its owners, is a common accusation leveled by right-wing personalities
against social media platforms.
"In the US it appears that interventions have been almost exclusively on
behalf of conservative publishers."
Throughout 2020, the “fear of antagonizing powerful political actors,” as
the former core data scientist put it in their memo, became a key public
policy team rationalization for forgoing action on potentially violative
content or rolling out product changes ahead of the US presidential
election. They also said they had seen “a dozen proposals to measure the
objective quality of content on News Feed diluted or killed because … they
have a disproportionate impact across the US political spectrum, typically
harming conservative content more.”
The data scientist, who spent more than five years at the company before
leaving late last year, noted that while strides had been made since 2016,
the state of political content on News Feed was “still generally agreed to
be bad.” According to Facebook data, they added, 1 of every 100 views on
content about US politics was for some type of hoax, while the majority of
views for political materials were on partisan posts. Yet the company
continued to give known spreaders of false and misleading information a pass
if they were deemed “‘sensitive’ or likely to retaliate,” the data scientist
said.
“In the US it appears that interventions have been almost exclusively on
behalf of conservative publishers,” they wrote, attributing this to
political pressure or a reluctance to upset sensitive publishers and
high-profile users.
As BuzzFeed News reported last summer, members of Facebook’s policy team —
including Kaplan — intervened on behalf of right-wing figures and
publications such as Charlie Kirk, Breitbart, and Prager University, in some
cases pushing for the removal of misinformation strikes against their pages
or accounts. Strikes, which are applied at the recommendation of Facebook’s
third-party fact-checkers, can result in a range of penalties, from a
decrease in how far their posts are distributed to the removal of the page
or account.
Kaplan’s other interventions are well documented. In 2018, the Wall Street
Journal revealed that he helped kill a project to connect Americans who have
political differences. The paper said Kaplan had objected “when briefed on
internal Facebook research that found right-leaning users tended to be more
polarized, or less exposed to different points of view, than those on the
left.” Last year, the New York Times reported that policy executives
declined to expand a feature called “correct the record” — which notified
users when they interacted with content that was later labeled false by
Facebook’s fact-checking partners — out of fear that it would
“disproportionately show notifications to people who shared false news from
right-wing websites.”
“It makes it hard to be visibly proud of where I work.”
Policy executives also reportedly helped override an initiative proposed by
the company’s now-disbanded civic integrity unit to throttle the reach of
misleading political posts, according to the Information.
Such interventions were hardly a surprise for those who have worked on
efforts at the company to reduce harm and misinformation. In a December
departure note previously reported by BuzzFeed News, an integrity researcher
detailed how right-wing pages, including those for Breitbart and Fox News,
had become hubs of discussion filled with death threats and hate speech — in
clear violation of Facebook policy. They questioned why the company
continued to work with such publications in official capacities.
“When the company has a very apparent interest in propping up actors who are
fanning the flames of the very fire we are trying to put out, it makes it
hard to be visibly proud of where I work,” the researcher wrote.
A Line From Alex Jones to the US Capitol
Kent Nishimura / Los Angeles Times via Getty Images
Alex Jones leaves after speaking at a Stop the Steal rally in front of the
Supreme Court on Jan. 5, 2021, in Washington, DC.
The strategic response team that had gathered evidence for the Alex Jones
and Infowars ban in spring 2019 drew upon years of examples of his hate
speech against Muslims, transgender people, and other groups. Under the
company's policies for dangerous individuals and organizations, Jones and
Infowars would be permanently banned and Facebook would have to remove
content that expressed support for the conspiracy theorist and his site.
In April 2019, a proposal for the recommended ban — complete with examples
and comments from the public policy, legal, and communications teams — was
sent by email to Monika Bickert, Facebook's head of global policy
management, and her boss, Kaplan. The proposal was then passed on to top
company leadership, including Zuckerberg, sources said.
The Facebook CEO balked at removing posts that praised Jones and his ideas.
“Zuckerberg basically took the decision that he did not want to use this
policy against Jones because he did not personally think he was a hate
figure,” said a former policy employee.
The teams were directed to create an entirely new designation for Jones to
fit the CEO’s request, and when the company announced the ban on May 2, it
did not say it had changed its rules at Zuckerberg’s behest. The decisions,
however, would have far-reaching implications, setting off a chain of events
that ultimately contributed to the violent aftermath of the 2020 election.
Two former policy employees said the process made content policy teams hesitant to recommend new actions, resulting in a “freeze” on new designations of dangerous individuals and organizations that lasted roughly a year. The delay effectively allowed many extremist groups to use Facebook to recruit, organize, and grow their membership through most of 2020.
“Once the Alex Jones thing had blown over, they froze designations, and that
lasted for close to a year, and they were very rarely willing to push
through anything. That impacted the lead-up to the election last year. Teams
should have been reviewing the Oath Keepers and Three Percenters, and
essentially these people weren’t allowed to,” said the policy employee,
referring to right-wing militant organizations that Facebook started to
remove in August 2020.
The Washington Post reported on Saturday that the Justice Department and FBI
are investigating links between Jones and the Capitol rioters.
The company could have acted much earlier, one Facebook researcher wrote on
the internal message board when they quit in August. The note came with a
warning: “Integrity teams are facing increasing barriers to building
safeguards.” They wrote of how proposed platform improvements that were
backed by strong research and data had been “prematurely stifled or severely
constrained … often based on fears of public and policy stakeholder
responses.”
“We were willing to act only *after* things had spiraled into a dire state.”
“We’ve known for over a year now that our recommendation systems can very
quickly lead users down the path to conspiracy theories and groups,” they
wrote, criticizing the company for being hesitant to take action against the
QAnon mass delusion. “In the meantime, the fringe group/set of beliefs has
grown to national prominence with QAnon congressional candidates and QAnon
hashtags and groups trending in the mainstream. We were willing to act only
*after* things had spiraled into a dire state.”
Though the 2020 election is long over, current and former employees say
politics continue to seep into Facebook product and feature decisions. Four
sources said they were concerned about Kaplan’s influence over which content
is recommended in News Feed. Given his role courting politicians, they said,
there is a fundamental conflict of interest in both appeasing government
officials or candidates and deciding what people see on the platform.
For weeks prior to the election, misinformation was spreading across
Facebook, undermining trust in the integrity of how votes would be counted.
To improve the quality of content in the News Feed, executives decided the
site would emphasize News Ecosystem Quality (NEQ), an internal score given
to publishers based on assessments of their journalism, in its ranking
algorithm, according to the New York Times.
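To make the mechanism concrete, here is a toy illustration of what weighting a publisher-quality score into a ranking blend could look like. It is an assumption-laden sketch, not Facebook’s actual system: NEQ’s real definition, scale, and weighting are not public, and the function and parameter names below are hypothetical.

```python
# Illustrative sketch only -- NOT Facebook's ranking algorithm.
# It shows how "emphasizing" a publisher-quality score (a stand-in for NEQ,
# assumed here to be normalized to 0..1) could shift a feed ranking.
def rank_score(predicted_engagement: float, publisher_quality: float,
               quality_weight: float = 0.5) -> float:
    """Blend an engagement prediction with a publisher-quality score.

    Raising quality_weight makes journalism quality count for more in the
    final ordering; lowering it back toward zero undoes that emphasis.
    """
    return (1 - quality_weight) * predicted_engagement + quality_weight * publisher_quality


# Same engagement prediction, different publisher quality:
print(rank_score(0.9, 0.2))  # 0.55 -- low-quality publisher ranks lower
print(rank_score(0.9, 0.9))  # 0.90 -- high-quality publisher keeps its score
```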
This and other “break glass” measures improved the quality of content on
people’s News Feeds so much that John Hegeman, the vice president
responsible for the feature, pushed to continue them indefinitely, according
to three people familiar with the situation who spoke to BuzzFeed News. Yet
Hegeman’s suggestion was opposed by Kaplan and members of the policy team.
The temporary measures eventually expired.
Hegeman did not respond to a request for comment.
In the days following the insurrection, Facebook reemphasized NEQ in its News Feed ranking algorithm. Stone said that change was temporary and has already been “rolled back.”
“Our Leadership Isn’t Doing Enough”
In the aftermath of the 2020 election, some departing Facebook employees have openly
criticized leadership as they’ve exited. “I’ve grown more disillusioned
about our company and the role we play in society," a nearly eight-year
veteran said, adding that they were saddened and infuriated by leadership’s
failure to recognize or minimize the “real negatives” the company introduces
to the world.
“I think the people working in these areas are working as hard as they can
and I commend them for their efforts,” they wrote. “However, I do think our
leadership isn’t doing enough.”
Beyond a profound concern over the influence of Kaplan's policy team, a
number of Facebook employees attributed the company's content policy
problems to Zuckerberg and his view that the platform must always be a
balance of right and left.
“Ideology is not, and should not be, a protected class,” a content policy
employee who left weeks after the election wrote. “White supremacy is an
ideology; so is anarchism. Neither view is immutable, nor should either be
beyond scrutiny. The idea that our content ranking decisions should be
balanced on a scale from right to left is impracticable … and frankly can be
dangerous, as one side of that scale actively challenges core democratic
institutions and fails to recognize the results of a free and fair
election.”
In October 2020, Facebook responded to ongoing criticism of its policy
decisions by introducing an Oversight Board, an independent panel to hear
appeals on content takedowns. But the former policy employee with insight
into the Alex Jones ban said that significant changes to rules and
enforcement will always come down to Zuckerberg.
“Joel [Kaplan] has influence for sure, but at the end of the day Mark owns
this stuff,” they said. “Mark has consolidated so much of this political
decision-making power in himself.” ●
UPDATE
February 22, 2021, at 12:14 p.m.
This story has been updated to clarify that an employee's call for empathy for victims of sexual assault during a Facebook policy meeting in the fall of 2018 was directed at a colleague, not Joel Kaplan.
How to delete your Facebook account
It may be time to leave the world’s biggest social network
By Micah Singleton and Barbara Krasnoff Jan 15, 2021, 9:27am EST
If you buy something from a Verge link, Vox Media may earn a commission. See
our ethics statement.
Illustration by Alex Castro / The Verge
If you’ve finally given up on the world’s most popular social media
network, it’s not too complicated to remove yourself from the service. But
before you delete all of those pictures, posts, and likes, you should
download your personal information from Facebook.
Your Facebook archives contain just about all of the pertinent information
related to your account, including your photos, active sessions, chat
history, IP addresses, facial recognition data, and which ads you clicked,
just to name a few. That’s a ton of personal information that you should
probably maintain access to.
To download your archive using the web:
Click on the “down” arrow in the upper right corner.
Go to “Settings & Privacy” > “Settings.”
In the left-hand column, click on “Your Facebook Information.”
In the center, find and click on “Download Your Information.”
You can select which info you want to download (or you can just download all
of it). At the top of the page, there are drop-down lists that let you
create a date range (if you want to), download your data in either HTML or
JSON, and choose between high, medium, or low media quality.
When you’re ready, click on “Create File.” You’ll get notified via email
when your file is ready.
You can choose which Facebook data you want to download.
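If you pick the JSON format, the archive you download is just a ZIP of folders full of JSON files, so once it’s unzipped you can skim it locally. Here’s a minimal Python sketch; the folder name facebook-archive and the assumption that each file holds either a list of records or a single top-level key are illustrative guesses, since the exact layout varies between exports.

```python
# Rough sketch for skimming an unzipped Facebook JSON export.
# Paths and file layout are assumptions; adjust them to your own archive.
import json
from pathlib import Path

ARCHIVE_DIR = Path("facebook-archive")  # wherever you unzipped the download

def count_records(json_path: Path) -> int:
    """Return a rough record count for one export file."""
    with json_path.open(encoding="utf-8") as f:
        data = json.load(f)
    if isinstance(data, list):
        return len(data)
    # Many export files wrap their records in a single top-level key.
    first_value = next(iter(data.values()), [])
    return len(first_value) if isinstance(first_value, list) else 1

if __name__ == "__main__":
    for path in sorted(ARCHIVE_DIR.rglob("*.json")):
        try:
            print(f"{path.relative_to(ARCHIVE_DIR)}: {count_records(path)} records")
        except (json.JSONDecodeError, OSError) as err:
            print(f"Skipping {path}: {err}")
```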
After you’ve finished downloading your archive, you can now delete your
account.
Beware: once you delete your account, it cannot be recovered.
When you are ready to delete your account, go back to the page headed “Your
Facebook Information” and click on “Deactivation and Deletion.” Here, you
can choose between temporarily deactivating your account or permanently
deleting it.
If you want to delete it, either click on “Delete Account” on that page, or
click this link, which will take you to the same account deletion page.
You’ll get another chance here to download your archive or choose
deactivation rather than deletion. Once you click “Delete My Account,” your
account will be marked for termination and inaccessible to others using
Facebook.
Before you delete your account, you get another chance to deactivate or download your info.
The company notes that it delays termination for a few days after it’s
requested. If you log back in during that period, your deletion request will
be canceled. So don’t sign on, or you’ll be forced to start the process over
again. Certain things, like comments you’ve made on a friend’s post, may
still appear even after you delete your account. Facebook also says that
copies of certain items like log records will remain in its database, but it
notes that those are disassociated from personal identifiers.
If you’re really serious about quitting Facebook, remember that the company
owns several other popular services as well, like Instagram and WhatsApp, so
you should delete your accounts there, too.
Update January 15th, 2021, 9:15AM ET: This article was originally published
on September 28th, 2018 and has been updated to allow for changes in the
Facebook interface.
https://justpaste.it/5m8ly
portrayed 1960s antiwar activist Abbie Hoffman. In “Borat Subsequent
Moviefilm,” he revived his role as a Kazakh journalist touring America.
These films may seem historical or comedic, respectively, but Baron Cohen
says their themes — abuse of political power, misogyny and disinformation —
are reflective of our current reality. He blames two things: Donald Trump
and social media.
In this episode of “Sway,” Kara Swisher and Baron Cohen discuss whether
Silicon Valley C.E.O.s should be liable for the content on their platforms,
what the rift in the Democratic Party means for future elections and — of
course — what else happened with Rudy Giuliani.
Image6c099b86112cf4400c283b5059a5eb49.jpg
Credit...Illustration by The New York Times; photograph by George
Pimentel/WireImage, via Getty Images
Related Episodes
Opinion | Kara Swisher
Bryan Cranston Won’t Play Donald Trump
Jan. 19, 2021
Opinion | Kara Swisher
In Hollywood, Women Are Seen as ‘a Risk’
Nov. 30, 2020
Thoughts? Email us at ***@nytimes.com. Transcripts of each episode are
available midday.
Special thanks to Kathy Tu, Michelle Harris, Shannon Busta and Liriel Higa.
“Sway” is produced by Nayeema Raza, Heba Elorbany, Matt Kwong, Daphne Chen
and Vishakha Darbha and edited by Nayeema Raza and Paula Szuchman;
fact-checking by Kate Sinclair; music and sound design by Isaac Jones;
mixing by Erick Gomez.
Kara Swisher is the host of “Sway,” an Opinion podcast, and a contributing
writer. She has reported on technology and technology companies since the
early days of the internet. @karaswisher
"Mark Changed The Rules": How Facebook Went Easy On Alex Jones And Other
Right-Wing Figures
Facebook’s rules to combat misinformation and hate speech are subject to the
whims and political considerations of its CEO and his policy team leader.
Picture of Ryan MacRyan MacBuzzFeed News ReporterPicture of Craig
SilvermanCraig SilvermanBuzzFeed News Reporter
Last updated on February 22, 2021, at 1:14 p.m. ET
Posted on February 21, 2021, at 9:59 a.m. ET
In April 2019, Facebook was preparing to ban one of the internet’s most
notorious spreaders of misinformation and hate, Infowars founder Alex Jones.
Then CEO Mark Zuckerberg personally intervened.
Jones had gained infamy for claiming that the 2012 Sandy Hook elementary
school massacre was a “giant hoax,” and that the teenage survivors of the
2018 Parkland shooting were “crisis actors.” But Facebook had found that he
was also relentlessly spreading hate against various groups, including
Muslims and trans people. That behavior qualified him for expulsion from the
social network under the company's policies for "dangerous individuals and
organizations," which required Facebook to also remove any content that
expressed “praise or support” for them.
But Zuckerberg didn’t consider the Infowars founder to be a hate figure,
according to a person familiar with the decision, so he overruled his own
internal experts and opened a gaping loophole: Facebook would permanently
ban Jones and his company — but would not touch posts of praise and support
for them from other Facebook users. This meant that Jones’ legions of
followers could continue to share his lies across the world’s largest social
network.
ADVERTISEMENT
"Mark personally didn’t like the punishment, so he changed the rules,” a
former policy employee told BuzzFeed News, noting that the original rule had
already been in use and represented the product of untold hours of work
between multiple teams and experts.
“Mark personally didn’t like the punishment, so he changed the rules.”
“That was the first time I experienced having to create a new category of
policy to fit what Zuckerberg wanted. It's somewhat demoralizing when we
have established a policy and it’s gone through rigorous cycles. Like, what
the fuck is that for?” said a second former policy employee who, like the
first, asked not to be named so they could speak about internal matters.
“Mark called for a more nuanced policy and enforcement strategy,” Facebook
spokesperson Andy Stone said of the Alex Jones decision, which also affected
the bans of other extremist figures.
Zuckerberg’s “more nuanced policy” set off a cascading effect, the two
former employees said, which delayed the company’s efforts to remove
right-wing militant organizations such as the Oath Keepers, which were
involved the Jan. 6 insurrection at the US Capitol. It is also a case study
in Facebook’s willingness to change its rules to placate America’s right
wing and avoid political backlash.
Internal documents obtained by BuzzFeed News and interviews with 14 current
and former employees show how the company’s policy team — guided by Joel
Kaplan, the vice president of global public policy, and Zuckerberg’s whims —
has exerted outsize influence while obstructing content moderation
decisions, stymieing product rollouts, and intervening on behalf of popular
conservative figures who have violated Facebook’s rules.
In December, a former core data scientist wrote a memo titled, “Political
Influences on Content Policy.” Seen by BuzzFeed News, the memo stated that
Kaplan’s policy team “regularly protects powerful constituencies” and listed
several examples, including: removing penalties for misinformation from
right-wing pages, blunting attempts to improve content quality in News Feed,
and briefly blocking a proposal to stop recommending political groups ahead
of the US election.
Since the November vote, at least six Facebook employees have resigned with
farewell posts that have called out leadership’s failures to heed its own
experts on misinformation and hate speech. Four departing employees
explicitly cited the policy organization as an impediment to their work and
called for a reorganization so that the public policy team, which oversees
lobbying and government relations, and the content policy team, which sets
and enforces the platform’s rules, would not both report to Kaplan.
Facebook declined to make Kaplan or other executives available for an
interview. Stone, the company spokesperson, dismissed concerns about the
vice president’s influence.
“Recycling the same warmed over conspiracy theories about the influence of
one person at Facebook doesn’t make them true,” he said. “The reality is big
decisions at Facebook are made with input from people across different teams
who have different perspectives and expertise in different areas. To suggest
otherwise is absurd.”
An integrity researcher who worked on Facebook’s efforts to protect the
democratic process and rein in radicalization said the company caused direct
harm to users by rejecting product changes due to concerns of political
backlash.
"At some point Zuckerberg has to be held responsible for his role in
allowing his platform to be weaponized."
“Out of fears over potential public and policy stakeholder responses, we are
knowingly exposing users to risks of integrity,” they wrote in an internal
note seen by BuzzFeed News. They quit in August.
Those most affected by Jones’ rhetoric have taken notice, too. Lenny Pozner,
whose 6-year-old son Noah was the youngest victim of the Sandy Hook
shooting, called the revelation that Zuckerberg weakened penalties facing
the Infowars founder “disheartening, but not surprising.” He said the
company had made a promise to do better in dealing with hate and hoaxes
following a 2018 letter from HONR Network, his organization for survivors of
mass casualty events. Yet Facebook continues to fail to remove harmful
content.
“At some point,” Pozner told BuzzFeed News, “Zuckerberg has to be held
responsible for his role in allowing his platform to be weaponized and for
ensuring that the ludicrous and the dangerous are given equal importance as
the factual.”
“Different Views on Different Things”
Samuel Corum / Getty Images
Mark Zuckerberg and Joel Kaplan chat after leaving a meeting with Sen. John
Cornyn (R-TX) in his office on Capitol Hill on September 19, 2019 in
Washington, DC.
ADVERTISEMENT
Kaplan’s close relationship with Zuckerberg has led the CEO to weigh
politics more heavily when making high-profile content policy enforcement
decisions, current and former employees said. Kaplan’s efforts to court the
Trump White House over the past four years — from his widely publicized
support for Supreme Court nominee Brett Kavanaugh to his interventions on
behalf of right-wing influencers in Facebook policy decisions — have also
made him a target for civil rights groups and Democratic lawmakers.
In June 2020, three Democratic senators asked in a letter what role Kaplan
played “in Facebook’s decision to shut down and de-prioritize internal
efforts to contain extremist and hyperpolarizing activity.” Sen. Elizabeth
Warren called him out for overseeing a lobbying effort that spends millions
of dollars to influence politicians. With a new presidential administration
in place and a spate of ongoing antitrust lawsuits, Zuckerberg must now
grapple with the fact that his top political adviser may no longer be a
Washington, DC asset but a potential liability.
“I think that everybody in DC hates Facebook. They have burned every
bridge,” said Sarah Miller, executive director of the American Economic
Liberties Project and a former member of Joe Biden’s presidential transition
team. Democrats are incensed with the platform’s tolerance of hate speech
and misinformation, while “pulling Trump off the platform” has brought new
life to Republican gripes with the company, she said.
“Facebook has fires to put out all across the political spectrum,” Miller
added.
“I think that everybody in DC hates Facebook. They have burned every
bridge.”
When Kaplan joined Facebook to lead its DC operation in 2011, he had the
connections and pedigree the company needed to court the American right. A
former clerk for conservative Supreme Court Justice Antonin Scalia, he
served as a White House deputy chief of staff under President George W. Bush
after participating in the Brooks Brothers riot during the 2000 Florida
presidential election dispute. During a Senate confirmation hearing in 2003
for a post with the Office of Management and Budget, Kaplan was questioned
about his role in the event, which sought to stop the tallying of votes
during the Florida recount.
Though he initially maintained a low public profile at Facebook, Kaplan —
COO Sheryl Sandberg’s Harvard classmate and former boyfriend — was valued by
Zuckerberg for his understanding of GOP policymakers and conservative
Americans, who the CEO believed were underrepresented by a liberal-leaning
leadership team and employee base.
A chart that shows top leadership at Facebook
BuzzFeed News; Getty Images
By 2014, he’d been promoted to vice president of global public policy. In
that role, Kaplan oversaw the company’s government relations around the
world as well as its content policy team. That arrangement raised eyebrows,
as other companies, including Google and Twitter, typically keep public
policy and lobbying efforts separate from teams that create and enforce
content rules.
The candidacy and election of Donald Trump made Kaplan even more valuable to
the company. He served as Zuckerberg’s policy consigliere, helping Facebook
navigate the sea of lies and hate the former president conjured on the
platform as well as the outraged public response to it. In December 2015,
following a Facebook post from Trump calling for a “total and complete
shutdown” of Muslims entering the US — the first of many that forced the
company to grapple with the then candidate’s racist and sometimes violent
rhetoric — Kaplan and other executives advised Facebook’s CEO to do nothing.
“Don’t poke the bear,” Kaplan said, according to the New York Times, arguing
that taking action against Trump’s account would invite a right-wing
backlash and accusations that the site was limiting free speech. It’s an
argument he’d repeat in various forms over the ensuing five years, with
Zuckerberg often in agreement.
During that time, Kaplan rarely communicated openly on Facebook’s internal
message boards or spoke at companywide meetings, according to current and
former employees. When he did, however, his appearances were clouded in
controversy.
Do you work at Facebook or another technology company? We'd love to hear
from you. Reach out to ***@buzzfeed.com, ***@buzzfeed.com,
or via one of our tip line channels.
After a Facebook team led by then–chief security officer Alex Stamos found
evidence of Russian interference on the platform during and after the 2016
US presidential election, Kaplan was part of a leadership group that argued
against disclosing the full extent of the Kremlin’s influence operation.
When the company did end up publicly releasing further information about it
in October 2017, it was Kaplan, not Stamos, who answered employee questions
during an internal town hall.
“They could have sent me,” said Stamos, who subsequently left the company
over disagreements related to Russian interference. “The person who was
presenting [evidence of the Russian campaign] to VPs was me.”
It was Kaplan’s appearance at Kavanaugh’s September 2018 Senate confirmation
hearings, however, that pushed him into the national spotlight. Sitting
behind the nominee, he was visible in TV coverage of the event. Employees
were furious; they believed Kaplan's attendance made it look like Facebook
supported the nominee, while dismissing the allegations of sexual assault
against him.
Jim Bourg / Reuters
Joel Kaplan sits with family members, friends, and supporters of Supreme
Court nominee Brett Kavanaugh during his testimony before a Senate Judiciary
Committee confirmation hearing in Washington, DC, on Sept. 27, 2018.
ADVERTISEMENT
Kaplan subsequently addressed the incident at a companywide meeting via
videoconference, where angry workers, who felt his on-camera appearance was
intentional, hammered him with questions. The confirmation also caused deep
wounds inside Kaplan's own organization. During a Facebook public policy
team meeting that fall to address the hearing and the vice president's
appearance, one longtime manager tearfully argued to a male colleague “It
doesn’t matter how well you know someone; it doesn’t mean they didn’t do
what somebody said they did,” after writing a blog post detailing her
experience of being sexually assaulted.
None of this changed Kaplan’s standing with Zuckerberg. The CEO went to DC
in September 2019 and was shepherded around by Kaplan on a trip that
included a meeting with Trump. Kaplan remained friendly with the Trump White
House, which at one point considered him to run the Office of Management and
Budget.
“Many people feel that Joel Kaplan has too much power over our decisions.”
In May, when Zuckerberg decided to not touch Trump’s “when the looting
starts, the shooting starts” incitement during the George Floyd protests,
workers became incensed. At a subsequent companywide meeting, one of the
most upvoted questions from employees directly called Kaplan out. “Many
people feel that Joel Kaplan has too much power over our decisions,” the
question read, asking that the vice president explain his role and values.
Zuckerberg seemed irked by the question and disputed the notion that any one
person could influence the “rigorous” process by which the company made
decisions. Diversity, the CEO argued, means taking into account all
political views.
“That basically asked whether Joel can be in this role, or can be doing this
role, on the basis of the fact that he is a Republican … and I have to say
that I find that line of questioning to be very troubling,” Zuckerberg said,
ignoring the question. “If we want to actually do a good job of serving
people, [we have to take] into account that there are different views on
different things.”
Facebook employees said Zuckerberg remains stalwart in his support for
Kaplan, but internal pressure is building to reduce the public policy team’s
influence. Colleagues “feel pressure to ensure their recommendations align
with the interests of policymakers,” Samidh Chakrabarti, head of Facebook’s
civic integrity team, wrote in an internal note in June, bemoaning the
difficulty of balancing such interests while delivering on the team’s
mandate: stopping abuse and election interference on the platform. The civic
integrity team was disbanded shortly after the election, as reported by the
Information.
“They attribute this to the organizational incentives of having the content
policy and public policy teams share a common root,” Chakrabarti said. “As
long as this is the case, we will be prematurely prioritizing regulatory
interests over community protection.”
Stamos, who is now head of the Stanford Internet Observatory, said the
policy team’s structure will always present a problem in its current form.
“You don’t want platform policy people reporting to someone who’s in charge
of keeping people in government happy,” he said. “Joel comes from the Bush
White House, and government relations does not have a neutral position on
speech requests.”
“Fear of Antagonizing Powerful Political Actors”
Facebook's campus
Josh Edelson / Getty Images
ADVERTISEMENT
In August, a Facebook product manager who oversees the News Feed updated his
colleagues on the company’s preparations for the 2020 US election.
Internal research had shown that people on Facebook were being polarized on
the site in political discussion groups, which were also breeding grounds
for misinformation and hate. To combat this, Facebook employees who were
tasked with protecting election integrity proposed the platform stop
recommending such groups in a module called “Groups You Should Join.”
But the public policy team was afraid of possible political blowback.
“Although the Product recommendation would have improved implementation of
the civic filter, it would have created thrash in the political ecosystem
during [ the 2020 US election,]” the product manager wrote on Facebook's
internal message board. “We have decided to not make any changes until the
election is over.”
The social network eventually paused political group recommendations — just
weeks before the November election — and removed them permanently only after
the Capitol insurrection on Jan. 6. Current and former employees said
Facebook's decision to ignore its integrity team's guidance and initially
leave group recommendations untouched exemplifies how political calculations
often quashed company initiatives that could have blunted misinformation and
radicalization.
In that same update about group recommendations, the product manager also
explained how leaders decided against making changes to a feature called In
Feed Recommendations (IFR) due to potential political worries. Designed to
insert posts into people’s feeds from accounts they don’t follow, IFR was
intended to foster new connections or interests. For example, if a person
followed the Facebook page for a football team like the Kansas City Chiefs,
IFR might add a post from the NFL to their feed, even if that person didn’t
follow the NFL.
ADVERTISEMENT
One thing IFR was not supposed to do was recommend political content. But
earlier that spring, Facebook users began complaining that they were seeing
posts from conservative personalities including Ben Shapiro in their News
Feeds even though they had never engaged with that type of content.
When the issue was flagged internally, Facebook’s content policy team warned
that removing such suggestions for political content could reduce those
pages’ engagement and traffic, and possibly inspire complaints from
publishers. A News Feed product manager and a policy team member reiterated
this argument in an August post to Facebook’s internal message board.
“A noticeable drop in distribution for these producers (via traffic insights
for recommendations) is likely to result in high-profile escalations that
could include accusations of shadow-banning and/or FB bias against certain
political entities during the US 2020 election cycle,” they explained.
Shadow-banning, or the limiting of a page’s circulation without informing
its owners, is a common accusation leveled by right-wing personalities
against social media platforms.
"In the US it appears that interventions have been almost exclusively on
behalf of conservative publishers."
Throughout 2020, the “fear of antagonizing powerful political actors,” as
the former core data scientist put it in their memo, became a key public
policy team rationalization for forgoing action on potentially violative
content or rolling out product changes ahead of the US presidential
election. They also said they had seen “a dozen proposals to measure the
objective quality of content on News Feed diluted or killed because … they
have a disproportionate impact across the US political spectrum, typically
harming conservative content more.”
The data scientist, who spent more than five years at the company before
leaving late last year, noted that while strides had been made since 2016,
the state of political content on News Feed was “still generally agreed to
be bad.” According to Facebook data, they added, 1 of every 100 views on
content about US politics was for some type of hoax, while the majority of
views for political materials were on partisan posts. Yet the company
continued to give known spreaders of false and misleading information a pass
if they were deemed “‘sensitive’ or likely to retaliate,” the data scientist
said.
“In the US it appears that interventions have been almost exclusively on
behalf of conservative publishers,” they wrote, attributing this to
political pressure or a reluctance to upset sensitive publishers and
high-profile users.
As BuzzFeed News reported last summer, members of Facebook’s policy team —
including Kaplan — intervened on behalf of right-wing figures and
publications such as Charlie Kirk, Breitbart, and Prager University, in some
cases pushing for the removal of misinformation strikes against their pages
or accounts. Strikes, which are applied at the recommendation of Facebook’s
third-party fact-checkers, can result in a range of penalties, from a
decrease in how far their posts are distributed to the removal of the page
or account.
Kaplan’s other interventions are well documented. In 2018, the Wall Street
Journal revealed that he helped kill a project to connect Americans who have
political differences. The paper said Kaplan had objected “when briefed on
internal Facebook research that found right-leaning users tended to be more
polarized, or less exposed to different points of view, than those on the
left.” Last year, the New York Times reported that policy executives
declined to expand a feature called “correct the record” — which notified
users when they interacted with content that was later labeled false by
Facebook’s fact-checking partners — out of fear that it would
“disproportionately show notifications to people who shared false news from
right-wing websites.”
Policy executives also reportedly helped override an initiative proposed by
the company’s now-disbanded civic integrity unit to throttle the reach of
misleading political posts, according to the Information.
Such interventions were hardly a surprise for those who have worked on
efforts at the company to reduce harm and misinformation. In a December
departure note previously reported by BuzzFeed News, an integrity researcher
detailed how right-wing pages, including those for Breitbart and Fox News,
had become hubs of discussion filled with death threats and hate speech — in
clear violation of Facebook policy. They questioned why the company
continued to work with such publications in official capacities.
“When the company has a very apparent interest in propping up actors who are
fanning the flames of the very fire we are trying to put out, it makes it
hard to be visibly proud of where I work,” the researcher wrote.
A Line From Alex Jones to the US Capitol
Kent Nishimura / Los Angeles Times via Getty Images
Alex Jones leaves after speaking at a Stop the Steal rally in front of the
Supreme Court on Jan. 5, 2021, in Washington, DC.
The strategic response team that had gathered evidence for the Alex Jones
and Infowars ban in spring 2019 drew upon years of examples of his hate
speech against Muslims, transgender people, and other groups. Under the
company's policies for dangerous individuals and organizations, Jones and
Infowars would be permanently banned and Facebook would have to remove
content that expressed support for the conspiracy theorist and his site.
In April 2019, a proposal for the recommended ban — complete with examples
and comments from the public policy, legal, and communications teams — was
sent by email to Monika Bickert, Facebook's head of global policy
management, and her boss, Kaplan. The proposal was then passed on to top
company leadership, including Zuckerberg, sources said.
The Facebook CEO balked at removing posts that praised Jones and his ideas.
“Zuckerberg basically took the decision that he did not want to use this
policy against Jones because he did not personally think he was a hate
figure,” said a former policy employee.
The teams were directed to create an entirely new designation for Jones to
fit the CEO’s request, and when the company announced the ban on May 2, it
did not say it had changed its rules at Zuckerberg’s behest. The decisions,
however, would have far-reaching implications, setting off a chain of events
that ultimately contributed to the violent aftermath of the 2020 election.
Two former policy employees said the process made content policy teams
hesitant to recommend new actions, resulting in a “freeze” on new
designations for dangerous individuals and organizations that lasted roughly
a year. The delay in labeling such groups, they said, effectively allowed
many extremist organizations to use Facebook to recruit, organize, and grow
their membership through most of 2020.
“Once the Alex Jones thing had blown over, they froze designations, and that
lasted for close to a year, and they were very rarely willing to push
through anything. That impacted the lead-up to the election last year. Teams
should have been reviewing the Oath Keepers and Three Percenters, and
essentially these people weren’t allowed to,” said one of the former policy
employees, referring to right-wing militant organizations that Facebook
started to remove in August 2020.
The Washington Post reported on Saturday that the Justice Department and FBI
are investigating links between Jones and the Capitol rioters.
The company could have acted much earlier, one Facebook researcher wrote on
the internal message board when they quit in August. The note came with a
warning: “Integrity teams are facing increasing barriers to building
safeguards.” They wrote of how proposed platform improvements that were
backed by strong research and data had been “prematurely stifled or severely
constrained … often based on fears of public and policy stakeholder
responses.”
“We’ve known for over a year now that our recommendation systems can very
quickly lead users down the path to conspiracy theories and groups,” they
wrote, criticizing the company for being hesitant to take action against the
QAnon mass delusion. “In the meantime, the fringe group/set of beliefs has
grown to national prominence with QAnon congressional candidates and QAnon
hashtags and groups trending in the mainstream. We were willing to act only
*after* things had spiraled into a dire state.”
Though the 2020 election is long over, current and former employees say
politics continue to seep into Facebook product and feature decisions. Four
sources said they were concerned about Kaplan’s influence over which content
is recommended in News Feed. Given his role courting politicians, they said,
there is a fundamental conflict of interest in both appeasing government
officials or candidates and deciding what people see on the platform.
For weeks prior to the election, misinformation was spreading across
Facebook, undermining trust in the integrity of how votes would be counted.
To improve the quality of content in the News Feed, executives decided the
site would emphasize News Ecosystem Quality (NEQ), an internal score given
to publishers based on assessments of their journalism, in its ranking
algorithm, according to the New York Times.
This and other “break glass” measures improved the quality of content on
people’s News Feeds so much that John Hegeman, the vice president
responsible for the feature, pushed to continue them indefinitely, according
to three people familiar with the situation who spoke to BuzzFeed News. Yet
Hegeman’s suggestion was opposed by Kaplan and members of the policy team.
The temporary measures eventually expired.
Hegeman did not respond to a request for comment.
In the days following the insurrection, Facebook once again emphasized NEQ
in its News Feed ranking algorithm. Facebook spokesperson Andy Stone said
that change was temporary and has already been “rolled back.”
“Our Leadership Isn’t Doing Enough”
In the aftermath of the 2020 election, some departing Facebook employees have openly
criticized leadership as they’ve exited. “I’ve grown more disillusioned
about our company and the role we play in society," a nearly eight-year
veteran said, adding that they were saddened and infuriated by leadership’s
failure to recognize or minimize the “real negatives” the company introduces
to the world.
“I think the people working in these areas are working as hard as they can
and I commend them for their efforts,” they wrote. “However, I do think our
leadership isn’t doing enough.”
Beyond a profound concern over the influence of Kaplan's policy team, a
number of Facebook employees attributed the company's content policy
problems to Zuckerberg and his view that the platform must always be a
balance of right and left.
“Ideology is not, and should not be, a protected class,” a content policy
employee who left weeks after the election wrote. “White supremacy is an
ideology; so is anarchism. Neither view is immutable, nor should either be
beyond scrutiny. The idea that our content ranking decisions should be
balanced on a scale from right to left is impracticable … and frankly can be
dangerous, as one side of that scale actively challenges core democratic
institutions and fails to recognize the results of a free and fair
election.”
In October 2020, Facebook responded to ongoing criticism of its policy
decisions by introducing an Oversight Board, an independent panel to hear
appeals on content takedowns. But the former policy employee with insight
into the Alex Jones ban said that significant changes to rules and
enforcement will always come down to Zuckerberg.
“Joel [Kaplan] has influence for sure, but at the end of the day Mark owns
this stuff,” they said. “Mark has consolidated so much of this political
decision-making power in himself.” ●
UPDATE
February 22, 2021, at 12:14 p.m.
This story has been updated to clarify that an employee's call for empathy
for victims of sexual assault during a Facebook policy meeting in the fall
of 2018 was directed at a colleague, not Joel Kaplan.
How to delete your Facebook account
It may be time to leave the world’s biggest social network
By Micah Singleton and Barbara Krasnoff Jan 15, 2021, 9:27am EST
Illustration by Alex Castro / The Verge
If you’ve finally given up on the world’s most popular social media
network, it’s not too complicated to remove yourself from the service. But
before you delete all of those pictures, posts, and likes, you should
download your personal information from Facebook.
Your Facebook archives contain just about all of the pertinent information
related to your account, including your photos, active sessions, chat
history, IP addresses, facial recognition data, and which ads you clicked,
just to name a few. That’s a ton of personal information that you should
probably maintain access to.
To download your archive using the web:
Click on the “down” arrow in the upper right corner.
Go to “Settings & Privacy” > “Settings.”
In the left-hand column, click on “Your Facebook Information.”
In the center, find and click on “Download Your Information.”
You can select which info you want to download (or you can just download all
of it). At the top of the page, there are drop-down lists that let you
create a date range (if you want to), download your data in either HTML or
JSON, and choose between high, medium, or low media quality.
When you’re ready, click on “Create File.” You’ll get notified via email
when your file is ready.
You can choose which Facebook data you want to download.
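If you chose the JSON format, you can get a quick overview of what the export actually contains once you've unzipped it. The short Python sketch below walks an extracted archive and reports how many records each JSON file holds. It is illustrative only: the folder name "facebook_archive" and the assumption that each file wraps its records in a top-level list or object are mine, since the exact layout of the export isn't documented in this article.

import json
from pathlib import Path

# Hypothetical location of the unzipped Facebook export (assumption, not a documented path).
ARCHIVE_DIR = Path("facebook_archive")

def summarize_archive(root: Path) -> None:
    """Walk the extracted archive and report how many records each JSON file holds."""
    for path in sorted(root.rglob("*.json")):
        try:
            with path.open(encoding="utf-8") as f:
                data = json.load(f)
        except (json.JSONDecodeError, OSError) as err:
            print(f"{path}: could not read ({err})")
            continue
        # Count entries generically, without assuming any specific key names.
        if isinstance(data, list):
            count = len(data)
        elif isinstance(data, dict):
            count = sum(len(v) for v in data.values() if isinstance(v, list))
        else:
            count = 1
        print(f"{path.relative_to(root)}: {count} records")

if __name__ == "__main__":
    summarize_archive(ARCHIVE_DIR)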
After you’ve finished downloading your archive, you can now delete your
account.
Beware: once you delete your account, it cannot be recovered.
When you are ready to delete your account, go back to the page headed “Your
Facebook Information” and click on “Deactivation and Deletion.” Here, you
can choose between temporarily deactivating your account or permanently
deleting it.
If you want to delete it, either click on “Delete Account” on that page, or
click this link, which will take you to the same account deletion page.
You’ll get another chance here to download your archive or choose
deactivation rather than deletion. Once you click “Delete My Account,” your
account will be marked for termination and inaccessible to others using
Facebook.
Before you delete your account, you get another chance to deactivate or
download your info.
The company notes that it delays termination for a few days after it’s
requested. If you log back in during that period, your deletion request will
be canceled. So don’t sign on, or you’ll be forced to start the process over
again. Certain things, like comments you’ve made on a friend’s post, may
still appear even after you delete your account. Facebook also says that
copies of certain items like log records will remain in its database, but it
notes that those are disassociated from personal identifiers.
If you’re really serious about quitting Facebook, remember that the company
owns several other popular services as well, like Instagram and WhatsApp, so
you should delete your accounts there, too.
Update January 15th, 2021, 9:15AM ET: This article was originally published
on September 28th, 2018, and has been updated to account for changes in the
Facebook interface.
https://justpaste.it/5m8ly