Facebook created "Project P" - for propaganda - in the hectic weeks after the 2016 presidential election and quickly found dozens of pages that had peddled false news reports ahead of Donald Trump's surprise victory. Nearly all were based overseas, had financial motives and displayed a clear rightward bent.
In a world of perfect neutrality, which Facebook espouses as its goal, the political tilt of the pages shouldn't have mattered. But in a videoconference between Facebook's Washington office and its Silicon Valley headquarters in December 2016, the company's most senior Republican, Joel Kaplan, voiced concerns that would become familiar to those within the company.
"We can't remove all of it because it will disproportionately affect conservatives," said Kaplan, a former George W. Bush White House official and now the head of Facebook's Washington office, according to people familiar with the meeting who spoke on the condition of anonymity to protect professional relationships.
When another Facebook staff member pushed for the entire list to be taken down on the grounds that the accounts fueled the "fake news" that had roiled the election, Kaplan warned of the backlash from conservatives.
"They don't believe it to be fake news," he said, arguing for time to develop guidelines that could be defended to the company's critics, including on the right.
The debate over "Project P," which resulted in a few of the worst pages quickly being removed while most others remained on the platform, exemplified the political dynamics that have reigned within Facebook since Trump emerged as the Republican Party's presumptive nominee to the White House in 2016. A company led mainly by Democrats in the liberal bastion of Northern California repeatedly has tilted rightward to deliver policies, hiring decisions and public gestures sought by Republicans, according to current and former employees and others who have worked closely with the company.
Trump and other party leaders have pressured Facebook by making unproven claims of bias against conservatives amid rising signs of government action on the issue, including investigations by Congress and the Justice Department. Republicans also have leveraged Facebook's fears of alienating conservative Americans to win concessions from a company whose most widely shared news content typically includes stories from Fox News and other right-leaning sources.
These sensitivities - in conjunction with the company's long-standing resistance to acting as "an arbiter of truth" - have affected Facebook's responses to a range of major issues, from how to address fake news and Russian manipulation of American voters on the platform to, more recently, the advertising policies that have set the political ground rules for the 2020 election, say people privy to internal debates.
Such factors have helped shape a platform that gives politicians license to lie and that remains awash in misinformation, vulnerable to a repeat of many of the problems that marred the 2016 presidential election.
Facebook, unlike Google and Twitter, also has refused calls to restrict politicians' access to powerful ad-targeting tools - which Trump used with particular relish four years ago - that allow messages to be tailored to individual voters, based on characteristics Facebook has gleaned over years of tracking user behavior.
"I think Facebook is looking at their political advertising policies in explicitly partisan terms, and they're afraid of angering Republicans," said Alex Stamos, head of the Stanford Internet Observatory, a research group, and a former Facebook chief security officer. "The Republicans in the D.C. office see themselves as a bulwark against the liberals in California."
The company says its decisions are guided not by political calculations but by global policy goals of expanding connections among users and protecting them from government overreach, in line with chief executive Mark Zuckerberg's commitment to allowing speech on the social media platform to remain as unrestricted as possible.
"After 2016, we made massive investments in new teams and technology to make our products safer and to secure elections," said company spokesman Andy Stone. "People on both sides of the aisle continue to criticize us, but we remain committed to seeking outside perspectives and building a platform for all ideas."
Kaplan declined to comment for this article.
But critics - both outside Facebook and within its ranks - see something more akin to corporate realpolitik, a willingness to accede to political demands in an era when Republicans control most levers of power in Washington.
"Facebook does not speak Republican," said a former employee of Facebook's Integrity Team, which was created to ensure safety and trust on the platform, who spoke on the condition of anonymity to discuss a former employer freely. "This is what they know about Republicans: Tell them 'yes' or they will hurt us."
In the 16 years since its birth as a website to connect students at Harvard, Facebook has emerged as perhaps the world's most far-reaching source of news and information, especially since it added the potent subsidiaries Instagram, WhatsApp and Messenger, creating a stable of globe-spanning communication tools with billions of users. Facebook's technology played a role in fomenting democratic revolutions across the Arab world and helping to rally domestic political movements such as Black Lives Matter. But the platform also was used to help fuel a genocide in Myanmar, a U.N. report concluded, and has been used to live-stream violence, including video of a massacre at a New Zealand mosque.
Facebook's power is coveted by American politicians, who know that the vast majority of U.S. voters have accounts. Trump already has spent more than $32 million on the platform for his reelection effort, while Democratic candidates, collectively, have spent more than $107 million, according to Facebook's Ad Library, one of its transparency initiatives. Andrew Bosworth, a top corporate executive considered a confidant of Zuckerberg, said in a post in December that Facebook was "responsible for Donald Trump getting elected" in 2016 through his effective advertising campaign - a comment that underscored the stakes of the company's policy moves.
Facebook's quest to quell conservative criticism has infused a range of decisions in recent years, say people familiar with the company's internal debates. These included whether to allow graphic images of premature babies on feeding tubes - a prohibition that had rankled antiabortion groups - or to include the sharply conservative Breitbart News in a list of news sources despite its history of serving, in the words of its former executive chairman Stephen K. Bannon, as the "platform for the alt-right."
Breitbart spokeswoman Elizabeth Moore, citing the popularity of the news site and what she called a strong track record of accuracy, said, "It would be an insane oversight to disenfranchise our massive audience that uses Facebook and craves our news content."
But its inclusion has sparked criticism among those who say the move was mainly to address Republican complaints about the company.
"I don't think they do this as a conservative company. I think they do this as a scared company," said Jeff Jarvis, a journalism professor at the City University of New York who has worked with Facebook on several media projects.
The price has been high in terms of anger from Democrats, such as Sen. Elizabeth Warren, D-Mass., who has promised to lead efforts to break up Facebook should she win the presidency. Liberal financier George Soros, writing recently in the New York Times, called for stripping control of Facebook from Zuckerberg and accused the company of having "an informal mutual assistance operation" with Trump.
Yet by at least one metric, Facebook's moves have succeeded - in appeasing a disruptive, unpredictable president. Just last month in Davos, Switzerland, Trump said of Zuckerberg on CNBC, "He's done a hell of a job."
- - -
Soon after Facebook's meeting on Project P, former Trump campaign manager Corey Lewandowski came to Facebook's Washington headquarters offering to advise the company on how to handle the new White House, according to people familiar with the meeting, who spoke on the condition of anonymity to describe sensitive internal matters.
The shifting power in Washington was a serious issue for the company. Its employees had donated just $5,171 to Trump, compared with $1.1 million to fundraising committees affiliated with Democrat Hillary Clinton, with nearly half that amount coming from two of Zuckerberg's closest confidants, chief operating officer Sheryl Sandberg and then-chief product officer Chris Cox, according to the political analytics firm GovPredict.
But the meeting with Lewandowski sparked outrage within an office still reeling from the election. Particularly upset were several Democrats, including Catlin O'Neill, the director of U.S. public policy, a former chief of staff to House Speaker Nancy Pelosi, D-Calif., and the granddaughter of legendary Pelosi predecessor Thomas "Tip" O'Neill, D-Mass., said people familiar with the visit and its aftermath.
Facebook decided not to retain Lewandowski, who declined to comment on the details of the visit aside from saying by text, "Please be sure to include the facts that I have never worked for them or been paid by them - they solicited me for a meeting and I attended."
But the encounter left many within the company uneasy about what Trump and his allies might do - or perhaps worse, what he might tweet.
The company gradually implemented policies to combat false, misleading news reports through new transparency initiatives and a system of third-party fact-checkers, a move that upset some Republicans. It also adopted its first policy against "coordinated inauthentic behavior" - essentially using bots, fake accounts or other amplification tactics to manipulate the platform, as Russians and others had in 2016 - and bolstered its security team to police violations.
Complaints eventually grew, however, that conservatives were being unfairly targeted by these moves and by long-standing content policies, such as the prohibition against hate speech. Moves to ban conspiracy theorist Alex Jones and right-wing media star Milo Yiannopoulos in 2019 for being "dangerous," for example, generated allegations of censorship by "Big Tech" among more mainstream conservatives.
As these and other complaints against Facebook grew among Republicans, Trump often amplified them over rival social media platform Twitter, where his following tops 72 million users.
"Facebook was always anti-Trump," he tweeted on Sept. 27, 2017, amid the scandal over Russian efforts to use social media to help elect him. The following month, he added, "Crooked Hillary Clinton spent hundreds of millions of dollars more on Presidential Election than I did. Facebook was on her side, not mine!"
Trump leveled similar charges against other technology companies, as he did in December 2018: "Facebook, Twitter and Google are so biased toward the Dems it is ridiculous!" But often Facebook bore the brunt of the president's wrath, as it did after a pair of pro-Trump social media personalities, "Diamond and Silk," accused the company of censoring them after they received a warning about posting "unsafe" content. (The company later said it had acted in error.)
"The wonderful Diamond and Silk have been treated so horribly by Facebook. They work so hard and what has been done to them is very sad - and we're looking into it," Trump tweeted in May 2019. "It's getting worse and worse for Conservatives on social media!"
- - -
The role of helping the company maneuver through this treacherous new political landscape became a core responsibility for Kaplan, Facebook's vice president for global public policy, who had joined the company in 2011, after eight years in the Bush White House and a stint as an energy lobbyist.
The former Marine Corps officer had clerked for Supreme Court Justice Antonin Scalia and, despite supporting former Florida governor Jeb Bush and Sen. Marco Rubio (Fla.) for president, met with Trump in December 2016 after the incoming administration expressed interest in having him head the Office of Management and Budget. Kaplan later played a key role in organizing support for Trump Supreme Court pick Brett M. Kavanaugh, a longtime Kaplan friend.
As Trump came to office, Kaplan was a Republican in a company increasingly self-conscious about its oversupply of Democrats in its top ranks. This included Sandberg, who had worked in the Clinton administration and hired numerous friends and former colleagues into Facebook - creating a class of internal allies known informally as "FOSS," for Friends of Sheryl Sandberg.
Kaplan, who had dated Sandberg when they were students at Harvard, managed to be both a FOSS and one of the few Republicans in the room when major decisions got made. The combination lent him credibility when he warned, as he often did, that a looming decision might inflame perilous relations with conservatives.
The rising clout of Facebook's Republicans went beyond Kaplan. Katie Harbath, a onetime campaign aide to former New York mayor Rudolph W. Giuliani, gained increased prominence. Kaplan also dispensed with the tradition of having members of both major parties share power atop the Washington office by hiring a fellow Republican, former Federal Communications Commission chairman Kevin Martin, as his deputy - strengthening the conservative cast of the office at its highest levels.
Kaplan proved to be adept at assuaging conservative concerns about Facebook. Even before Trump won the presidency, the company faced a crisis in May 2016 when tech publication Gizmodo published a story claiming that contractors managing Facebook's "Trending" module were suppressing conservative stories.
Kaplan tapped a small team of Republicans, including Harbath, to organize a visit for prominent conservatives, such as political commentators Glenn Beck and Tucker Carlson, to Facebook headquarters. The meeting with Zuckerberg and Sandberg calmed the controversy - at least for a time - but conservatives soon would come back with other complaints.
"It's the squeaky wheels who get the grease," said another person familiar with the company's effort to mollify conservatives, who spoke on the condition of anonymity. "They were the squeaky wheels."
As for the "Trending" topics feature, Facebook fired the contractors described in the Gizmodo story and gave the job of determining "Trending" topics to an algorithm. That change allowed the feature to become a vehicle for spreading the false news reports that marred Facebook in the months leading up to the election. One recommended story claimed - falsely - that Fox News host Megyn Kelly had been fired for supporting Clinton.
- - -
Security researchers at Facebook found the first signs that Russians were seeking to influence the U.S. election months before the 2016 vote, discovering accounts apparently under control of foreign military hackers.
Those initial discoveries, although shared with the FBI, were not made public. But when U.S. intelligence officials announced in January 2017 that they, too, had detected Russian interference on social media, an internal debate developed within Facebook about what to reveal publicly and when.
The result, after three months of wrangling, was a 13-page white paper in April that did not include the words "Russia" or "Russian." Instead, there was this oblique reference: "Our data does not contradict the attribution provided by the U.S. Director of National Intelligence in the report dated January 6, 2017."
Several issues were at play in these debates, including whether Facebook's researchers had enough clear evidence to name Russia definitively. Company officials also pushed to make sure the white paper was rigorous enough to withstand the expected Republican backlash. But some company employees found the resulting document incomplete, and the officials' caution fueled complaints that they were acting in part to avoid inflaming tensions with a White House consumed with battling allegations that Russia had helped elect Trump.
"If we say Russia, it will center us in this discussion and anger the administration," a person familiar with the political dynamics in Facebook's Washington office recalled hearing.
Stone, the Facebook spokesman, said, "The goal of the white paper was to share our findings in a straightforward manner, which is why there was broad agreement with the security team's recommendation to refer to the Intelligence Community Assessment and not name any specific nations."
The worry about political fallout grew in subsequent months as Facebook's security researchers discovered that the Internet Research Agency, whose owner was a close ally of Russian President Vladimir Putin, had used 470 fake accounts and pages to manipulate U.S. voters. When Facebook revealed this Russian interference in September 2017, the fears of angering the White House proved prescient.
Trump soon began tweeting about the company, and conservatives in Congress used the resulting hearings to accuse Facebook of bias against conservative voices on the platform. Such complaints grew the following year, when the Cambridge Analytica scandal broke regarding the use of sensitive Facebook data to direct campaign messaging.
Zuckerberg's visit to Capitol Hill in April 2018 to address the Cambridge Analytica scandal featured frequent allegations of bias. Sen. Ted Cruz (R-Tex.) cited Facebook's warning to Diamond and Silk as exemplifying "a pervasive pattern of political bias." In a House hearing the next day, Rep. Billy Long (R-Mo.) asked Zuckerberg, "What is 'unsafe' about two black women supporting Donald J. Trump?"
While Zuckerberg attributed the incident to "an enforcement error," the next month the company announced that it would conduct an audit of allegations of bias against conservatives at Facebook. Leading this inquiry was not an independent social media researcher but a prominent conservative lawyer, former senator Jon Kyl (R-Ariz.).
The resulting interim report, completed in August, catalogued numerous complaints by conservatives but offered no concrete evidence of bias or any systematic, data-based review of the question. Still, it offered two concessions: Facebook would hire more staff "dedicated to working with right-of-center organizations and leaders." And the company would loosen a long-standing advertising policy against graphic medical photos; the result was to allow antiabortion groups to depict premature babies reliant on feeding and other medical tubes in political messaging.
The audit and its concessions pleased many conservatives but rankled some on the other side of the political spectrum, who had begun to sense that, in their dealings with Facebook, they were on a losing streak against an organized, forceful and consistent campaign of pressure by conservatives. Civil rights leaders, for example, had been asking for an audit of racism on the platform for several years. It finally was announced in May 2018, on the same day as the conservative bias audit.
"We've been in conversation with them, in some iteration, for four years, without much success," said Malkia Devich Cyril, a senior fellow for the activist group MediaJustice who was part of a Black Lives Matter delegation that visited Facebook in 2016. "As individuals they might have liberal or progressive leanings, but as a company their interests are being served by conservative economic policy."
- - -
The political stakes for Facebook became increasingly clear last summer. A major corporate initiative, a cryptocurrency called Libra, landed in Washington with a discernible thud.
"Facebook Libra's 'virtual currency' will have little standing or dependability," Trump tweeted in July, making clear his intention to impose federal regulations on such an initiative. "We have only one real currency in the USA, and it is stronger than ever, both dependable and reliable. It is by far the most dominant currency anywhere in the World, and it will always stay that way. It is called the United States Dollar!"
About the same time, the Justice Department began a broad antitrust review of major technology companies, including Facebook.
Zuckerberg - who had lashed out at Warren over her calls to break up Facebook, telling employees in a July meeting that he would "go to the mat and . . . fight" any such effort - took a more conciliatory tone with Trump.
As talk of federal investigations grew in September, Zuckerberg visited the White House. Trump tweeted, "Nice meeting with Mark Zuckerberg of @Facebook in the Oval Office today." Included was a picture of the young tech billionaire shaking hands with the president.
Zuckerberg also hosted a group of conservatives at his home in Palo Alto, Calif., in June. One participant, longtime anti-tax activist Grover Norquist, praised the company for hiring staff specifically to work with conservatives.
"There has been what seems to be a serious effort to reach out to us," said Norquist, president of Americans for Tax Reform.
Two important victories for Trump and conservatives came amid this outreach by Zuckerberg.
The first was when Nick Clegg, Facebook's vice president for global affairs and communications, announced in September that the company's system of third-party fact-checkers would not review claims by politicians. Although Facebook said this was merely the ratification of existing practice, the announcement provoked fury among Democrats wearied by thousands of well-chronicled falsehoods, embellishments and misstatements by Trump and worried that he would exploit the loophole in the coming campaign season.
An immediate test further underscored these fears: A Trump campaign ad made claims against former vice president Joe Biden, at the time leading in the polls for the Democratic presidential nomination, that independent fact-checkers called dubious. Biden's campaign demanded that the ad be removed, but Facebook refused, reiterating that it would not act against false statements from politicians.
Those defending the decision, inside and outside the company, pointed to the traditional leeway given to political speech in the United States and to Zuckerberg's own reluctance to curb user expression in all but the most extreme circumstances.
He said in a speech at Georgetown University in October that restricting political speech threatens "the ability to speak freely [that] has been central in the fight for democracy worldwide."
But critics saw yet another effort by Facebook to steer clear of Republican wrath.
"Right now Trump is president, and the company is obviously very attuned to the political winds," said Vanita Gupta, president of the Leadership Conference on Civil and Human Rights, a Washington-based umbrella group. "They all know [at Facebook] that the Justice Department and state attorneys general are sniffing around at regulations and litigation."
The second victory for conservatives came soon after, when Facebook rebuffed calls to limit the ability of politicians to use advertising tools that allow the narrow targeting of individuals based on their home address, gender, education level, income, marital status, job or other characteristics. Brad Parscale, a digital adviser to Trump's 2016 campaign and now campaign manager for the reelection effort, had boasted of the power of these targeting tools and made clear his eagerness to use them again.
Some Democratic political operatives and the Democratic National Committee also expressed concern to Facebook about losing access to such cheap, effective means for reaching voters. But other prominent Democrats, as well as politically independent technology researchers, warned that what they called "microtargeting" could threaten the sanctity of elections by undermining the accountability and transparency of political speech.
These critics warned that voters had no way to know what messages reached their friends or neighbors, giving politicians license to tailor messages based on what people wanted to hear rather than what was best for the public overall. A lie delivered to just 100 carefully targeted people on Facebook, for example, was much less likely to be caught and corrected than one delivered on a billboard or in a television ad.
Ellen L. Weintraub, a Democrat who then was chair of the Federal Election Commission, warned in a Washington Post opinion piece that such targeting had a history as "a potent weapon for spreading disinformation and sowing discord."
For these reasons, Google prohibited politicians from using its most powerful targeting tools. Twitter decided to ban political ads altogether. Sen. Ron Wyden, D-Ore., urged Facebook to follow the lead of these other companies "rather than continuing to chase political advertising dollars."
Facebook seriously considered such a move during a months-long internal debate that weighed several types of restrictions, including possibly banning political ads altogether, company officials said. They pointed out that such advertising produces a very small percentage of the company's multibillion-dollar revenue while generating a disproportionate amount of headaches.
But when news leaked that Facebook was considering such changes, Trump made clear his opposition. His campaign tweeted, amid red siren emoji, "IMPORTANT @facebook wants to take important tools away from us for 2020. Tools that help us reach more great Americans & lift voices the media & big tech choose to ignore!"
Facebook ultimately announced in January that it would increase the transparency of ad targeting ahead of the 2020 election but impose no new limits for politicians.
In a blog post, Rob Leathern, Facebook's director of product management, made clear that the company had heard the political clamor on the issue.
"Unlike Google, we have chosen not to limit targeting of these ads," Leathern wrote. "We considered doing so, but through extensive outreach and consultations we heard about the importance of these tools for reaching key audiences from a wide range of NGOs, nonprofits, political groups and campaigns, including both Republican and Democratic committees in the US."
- - -
The Washington Post’s Josh Dawsey contributed to this report.