The revelations have generated increased political support for new regulation in the United States and Europe, including some calls for Mr. Zuckerberg to step aside as Facebook’s chief executive, putting Facebook on the defensive. The growing rancor could lead to new government investigations and force the company to disclose more details about how its software works.
“Facebook is failing to prevent harm to children, it’s failing to stop the spread of disinformation, it is failing to stop the spread of hate speech,” John Nicolson, a lawmaker from Scotland, said during the hearing. “It does have the power to deal with these issues, it’s just choosing not to.”
Ms. Haugen left Facebook with scores of internal research reports, slide decks, discussion threads, presentations and memos that she has shared with lawmakers, regulators and journalists. The information provides an unvarnished view of how some within the company tried to raise alarms about its harmful effects, but often struggled to get Facebook leaders to act.
Facebook defended its practices and said it had spent $13 billion and hired 40,000 people to work on safety issues.
“Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites,” said Mitch Henderson, a company spokesman. “People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.”
After leaking internal company documents to The Wall Street Journal that resulted in a series of articles that began in September, she revealed her identity this month for an episode of “60 Minutes” and testified before a Senate committee. She also shared the documents with the Securities and Exchange Commission.
Since then, she has shared the Facebook materials with other news organizations, including The New York Times, resulting in additional stories about Facebook’s harmful effects, including its role in spreading election misinformation in the United States and stoking divisions in countries such as India.
Ms. Haugen’s visit to Europe is a reflection of the region’s aggressive approach to tech regulation and a belief that its policymakers will act faster than the United States to pass new laws aimed at Facebook and other tech giants.
“For all the problems Frances Haugen is trying to solve, Europe is the place to be,” said Mathias Vermeulen, the public policy director at AWO, a law and policy firm that is among the groups working with Ms. Haugen in the United States and Europe.
In London, Ms. Haugen told policymakers that regulation could counteract a corporate culture at Facebook that rewards ideas that get people to spend more time scrolling through their social media feeds, but treats safety as a less important “cost center.”
Facebook’s influence is particularly bad in areas of Africa, Asia and the Middle East where its services are widely popular but the company does not have language or cultural expertise, Ms. Haugen said. Without government intervention, she told lawmakers, events in countries such as Ethiopia and Myanmar, where Facebook has been accused of contributing to ethnic violence, are the “opening chapters of a novel that is going to be horrific to read.”
She suggested policies that would require Facebook to perform annual risk assessments to identify areas where its products were causing harm — such as the spread of coronavirus misinformation, or harms to teenagers’ mental health. She said Facebook could be required to outline specific solutions and share the findings with outside researchers and auditors to be sure they are sufficient.
Without government-mandated transparency, Facebook can present a false picture of its efforts to address hate speech and other extreme content, she said. The company says artificial intelligence software catches more than 90 percent of hate speech, but Ms. Haugen said the number was less than 5 percent.
“They are very good at dancing with data,” she said.
British policymakers are drafting a law to create a new internet regulator that could impose billions of dollars worth of fines if more isn’t done to stop the spread of hate speech, misinformation, racist abuse and harmful content targeting children.
The policy ideas gained additional momentum after the murder this month of David Amess, a member of Parliament, which led to calls for the law to force social media companies to crack down on extremism.
In Brussels, Ms. Haugen is scheduled to meet on Nov. 8 with European Union officials drafting laws that would force Facebook and other large internet platforms to disclose more about how their recommendation algorithms choose to promote certain material over others, and impose tougher antitrust rules to prevent the companies from using their dominant positions to box out smaller rivals. European policymakers are also debating a ban on targeted advertising based on a person’s data profile, which would pose a grave threat to Facebook’s multibillion-dollar advertising business.
Despite growing political support for new regulation, many questions remain about how such policies would work in practice. Any new laws in Britain and the European Union are not expected to be passed until next year at the earliest. In the United States, lawmakers are focusing on the harmful effects that Facebook and other social media platforms have on children.
Regulating Facebook is particularly complex because many of its biggest problems center on content posted by users all over the world, raising difficult questions about the regulation of speech and free expression. In Britain, the new online safety law has been criticized by some civil society groups as being overly restrictive and a threat to free speech online.
Another challenge is how to enforce the new rules, particularly at a time when many government agencies are under pressure to tighten spending.