For the first time, 60 Minutes is publishing whistleblower complaints filed with the Securities and Exchange Commission against Facebook by former employee Frances Haugen.
The filings, submitted by Haugen’s lawyers, state, “Our anonymous client is disclosing original evidence showing that Facebook, Inc. (NASDAQ: FB) has, for years past and ongoing, violated U.S. securities laws by making material misrepresentations and omissions in statements to investors and prospective investors, including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”
Haugen’s attorneys have filed at least eight whistleblower complaints with the SEC based on tens of thousands of internal Facebook documents secretly copied by Haugen before she left the social media company in May. 60 Minutes obtained the SEC letters from a Congressional source.
Haugen revealed her identity Sunday on 60 Minutes, in an interview with correspondent Scott Pelley. It was her first recorded interview.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen told Pelley. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”
Among the allegations in the SEC filings are claims that Facebook and Instagram were aware in 2019 that the platforms were being used to “promote human trafficking and domestic servitude.” The filings also allege Facebook “failed to deploy internally-recommended or lasting counter-measures” to combat misinformation and violent extremism related to the 2020 election and January 6 insurrection.
Following the 60 Minutes report on Sunday, Facebook’s vice president of integrity, Guy Rosen, said on Twitter, “We have the most comprehensive and transparent effort to fight hate speech of any major tech company.”
Facebook declined an on-camera interview with 60 Minutes before the report ran, instead issuing a written statement.
Haugen’s lawyer John Tye told 60 Minutes that “as a publicly traded company, Facebook is required to not lie to its investors or even withhold material information.”
Tye said his client is provided legal whistleblower protection from lawsuits by the Dodd-Frank Act, which became a federal law in 2010.
Haugen, a 37-year-old data scientist with a degree in computer engineering and a master’s degree from Harvard Business School, told Pelley that Facebook “picks metrics that are in its own benefit” when it comes to publishing data about hateful content and misinformation.
“The prevalence of hate speech on Facebook is now 0.05%, and is down by about half over the last three quarters,” Facebook’s Rosen tweeted Sunday night. “We can attribute a vast majority of the drop in prevalence in the past three quarters to our efforts.”
Haugen’s whistleblower complaints, which you can read in full below, make allegations against the $1 trillion social media company and cite some of the internal Facebook documents Haugen copied and provided to federal law enforcement.
“The SEC filings lay out the scope of the internal research that Haugen brought forward,” said 60 Minutes producer Maria Gavrilovic. “It helped 60 Minutes understand the severity of the allegations brought by the whistleblower.”
60 Minutes contacted the SEC regarding Haugen’s allegations and was told it “does not comment on the existence or nonexistence of a possible investigation.”
Facebook’s role in the 2020 election and January 6 insurrection
A whistleblower complaint filed on behalf of former Facebook employee Frances Haugen cites internal documents that reference what she claims was the company’s role in stoking political division.
The complaint, titled “Facebook misled investors and the public about its role perpetuating misinformation and violent extremism relating to the 2020 election and January 6th insurrection,” highlights internal Facebook experiments that she says found the company’s algorithm “can veer people interested in conservative topics into radical or polarizing ideas and groups/pages.”
The filing quotes an internal Facebook study that found new test accounts created by Facebook, which followed “verified/high quality conservative pages” including the official pages of Fox News, Donald Trump, and Melania Trump, began receiving recommendations for polarizing content within one day. The same Facebook study said, “Page recommendations began to include conspiracy recommendations after only 2 days.”
In a statement to 60 Minutes, Facebook said:
“We banned hundreds of militarized social movements, took down tens of thousands of QAnon pages, groups and accounts from our apps, and removed the original #StopTheSteal Group. This is in addition to our removal, and repeated disruption of various hate groups, including Proud Boys, which we banned in 2018. Ultimately, the responsibility resides with those who broke the law, and the leaders who incited them. Facebook has taken extraordinary steps to address harmful content and we’ll continue to do our part. We also aggressively worked with law enforcement, both before January 6 and in the days and weeks since, with the goal of ensuring that evidence linking the people responsible for January 6th to their crimes is available for prosecutors.”
Facebook’s removal of hate speech
A whistleblower complaint filed on behalf of former Facebook employee Frances Haugen claims the social media company doesn’t take sufficient action regarding hateful content posted to its platform.
The filing cites an internal Facebook study that said, “We only take action against approximately 2% of the hate speech on the platform. Recent estimates suggest that unless there is a major change in strategy, it will be very difficult to improve this beyond 10-20% in the short-medium term.”
Another internal Facebook document cited in the whistleblower filing said, “We’re deleting less than 5% of all of the hate speech posted to Facebook. This is actually an optimistic estimate.”
In a statement issued to 60 Minutes on Friday, Lena Pietsch, Facebook’s director of policy communications, said, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago. We have a strong track record of using our research — as well as external research and close collaboration with experts and organizations — to inform changes to our apps.”
Teens and mental health
One of the whistleblower complaints says that Facebook CEO Mark Zuckerberg misled members of Congress in March when he testified about Facebook and Instagram’s effect on the health of young girls. In responding to a question, Zuckerberg said that he did not believe his platform harms children.
The SEC filing cites internal Facebook research that found:
• 13.5% of teen girls on Instagram say the platform makes thoughts of “Suicide and Self Injury” worse
• 17% of teen girl Instagram users say the platform makes “Eating Issues” (e.g. anorexia and bulimia) worse
• “We make body image issues worse for 1 in 3 teen girls.”
In a statement issued to 60 Minutes, a spokesperson for Instagram said, “Contrary to [the] characterization, Instagram’s research shows that on 11 of 12 well-being issues, teenage girls who said they struggled with those difficult issues also said that Instagram made them better rather than worse.”
An October 2019 internal Facebook document quoted in Haugen’s whistleblower complaint cited the company’s knowledge of Facebook, Instagram, and WhatsApp being used for what it called “domestic servitude.”
The filing cited an internal Facebook document that said:
“Our investigative findings demonstrate that … our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks … The traffickers, recruiters and facilitators from these ‘agencies’ used FB profiles, IG profiles, Pages, Messenger and WhatsApp …. “
In an exchange last week with Tennessee Senator Marsha Blackburn (R) on allegations of human trafficking on the social media platform, Facebook’s global head of safety, Antigone Davis, stated, “…in fact, we have policies against sex trafficking on our platform.”
Facebook’s algorithms and the promotion of misinformation and hate speech
Another whistleblower complaint filed on behalf of Haugen alleges that Facebook misled investors and the public when it said it prioritizes “meaningful social interactions” (MSI) through its algorithms. The complaint claims Facebook actually promotes polarizing misinformation and hate speech.
The filing claims that in 2018, Mark Zuckerberg announced a shift from prioritizing time spent on Facebook to prioritizing MSI, with an emphasis on showing content from friends and family in users’ newsfeeds. But internal Facebook studies cited by the complaint show how prioritizing MSI actually furthers misinformation and other divisive, low-quality content. One cited report reads, “the more negative comments a piece of content instigates, the higher likelihood for the link to get more traffic.”
In a statement to 60 Minutes, Pietsch said, “Research also shows that polarization has been growing in the United States for decades, long before platforms like Facebook even existed, and that it is decreasing in other countries where Internet and Facebook use has increased. We have our role to play and will continue to make changes consistent with the goal of making people’s experience more meaningful, but blaming Facebook ignores the deeper causes of these issues – and the research.”
Facebook’s “XCheck” program and the whitelisting of VIPs
A whistleblower complaint filed on behalf of Haugen alleges that Facebook misled investors and the public about equal enforcement of its terms since high-profile users are “whitelisted” under its “XCheck” program.
An internal Facebook report cited by the complaint says that in 2020, “XCheck” (pronounced cross-check) entities were shielded from the majority of integrity actions on the site. “That means,” the report said, “for a select few members of our community, we are not enforcing our policies and standards. Unlike the rest of our community, these people can violate our standards without any consequences… since we currently review less than 10% of Checked content.”
In a previous statement to the Wall Street Journal, Facebook spokesperson Andy Stone stated that this system, “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”
Global division and ethnic violence
Another whistleblower complaint filed on behalf of Haugen claims Facebook misled investors and the public about bringing “the world closer together.” The filing claims internal Facebook documents “show that Facebook’s language capabilities are inadequate, leading to global misinformation and ethnic violence.”
One study cited by the complaint states that “in the Afghanistan market, the action rate for hate speech is worryingly low.”
The complaint goes on to say Facebook’s written translations do not account for regions where significant numbers of users cannot read, nor do they appropriately manage safety systems for different dialects.
Internal records cited in the complaint show how these linguistic shortcomings may lead to violent and incendiary content: “Anti-Muslim narratives targeted pro-Hindu populations with [violent and incendiary] intent… There were a number of dehumanizing posts comparing Muslims to ‘pigs’ and ‘dogs’ and misinformation claiming the Quran calls for men to rape their female family members. Our lack of Hindi and Bengali classifiers means much of this content is never flagged or actioned.”
Facebook responded in a statement to 60 Minutes, stating, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”
In a 2018 blog post titled “An Independent Assessment of the Human Rights Impact of Facebook in Myanmar,” Alex Warofka, a Facebook product policy manager wrote, “…we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”
Shrinking user bases and advertising metrics

A whistleblower complaint filed on behalf of Haugen claims Facebook misled investors and advertisers about shrinking user bases in important demographics, declining content production and the true number of recipients of “reach and frequency” advertising.
The complaint cites internal records confirming that teens and young adults in more developed economies are using the platform less.
The complaint also claims that for years, Facebook has misrepresented its true number of individual users to advertisers, not properly accounting for “single users with multiple accounts.” An internal report cited in the complaint shows that if single users with multiple accounts were properly handled, there would be “audience size reduction” for reach and frequency campaigns as follows: “…18% of current R&F revenue using broad targeting… will see a decrease in audience size [greater] than 10%. A majority of R&F campaigns using broad targeting will actually see an audience shrinkage in the 5-8% range.”