Facebook whistleblower's explosive testimony: Company makes 'disastrous' choices, prioritizes profit

WASHINGTON — A Facebook whistleblower who raised alarms about several of the company's business practices testified Tuesday before Congress after a series of incriminating revelations about the company.

Frances Haugen, a former project manager at Facebook who leaked a massive trove of internal documents to the Wall Street Journal, told a Senate subcommittee that Facebook "put their astronomical profits before people" and asked for congressional action to rein in the tech giant.

"We can have social media we enjoy that connects us without tearing our democracy apart or democracy, putting our children in danger, and sowing ethnic violence around the world," Haugen said.

The documents Haugen released contained several explosive revelations about the company's tactics in its pursuit of growth, including plans to market its products directly to children, findings underscoring the severity of the platform's public health misinformation crisis, and internal research showing that its Instagram platform is destructive to young girls' mental health.

"The choices being made inside of Facebook are disastrous for our children or our public safety for privacy and for our democracy. And that is why we must demand Facebook changes," Haugen told senators on Tuesday.

Facebook hasn't outright denied any of the Journal's reporting, but it claims the characterizations are "misleading" and has strenuously pushed back on them.

Former Facebook product manager Frances Haugen testifies before the Subcommittee on Consumer Protection, Product Safety, and Data Security in Washington on Oct. 5, 2021.

Lawmakers questioned Haugen over the implications of the documents, which come as opinion on Capitol Hill has already turned sharply against the tech giant on both sides of the aisle.

Facebook responds to whistleblower's testimony

Following Haugen's Capitol Hill appearance, her former employer rebutted her testimony.

“Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question," Lena Pietsch, Facebook's director of policy communications, said in a statement to USA TODAY. "We don’t agree with her characterization of the many issues she testified about."

In an emailed statement, Facebook also detailed steps it has taken, or has called for, to protect users.

"Every day, we make difficult decisions on where to draw lines between free expression and harmful speech, privacy, security, and other issues, and we use both our own research and research from outside experts to improve our products and policies," the statement reads. "But we should not be making these decisions on our own which is why for years we’ve been advocating for updated regulations where democratic governments set industry standards to which we can all adhere."

Rep. Adam Schiff says Haugen needs to testify before Jan. 6 select committee

After Haugen's testimony in the Senate wrapped, California Democratic Rep. Adam Schiff said on Twitter that she should be called to testify before the House select committee investigating the Jan. 6 Capitol riot.

"According to this Facebook whistleblower, shutting down the civic integrity team and turning off election misinformation tools contributed to the Jan 6 insurrection," Schiff tweeted. "The Select Committee will need to hear from her, and get internal info from Facebook to flesh out their role."

Haugen says she lost faith in Facebook's commitment to protecting users after it disbanded the civic integrity team following the 2020 presidential race. Facebook said it distributed the team's work to other teams. But Haugen says the company stopped paying close attention, contributing to the Jan. 6 attack on the Capitol.

– Katie Wadington and Jessica Guynn

Misinformation propelled by serving it to those who are isolated

Sen. Todd Young, R-Indiana, dove into misinformation on Facebook, asking Haugen how it could be addressed.

Misinformation can be spread by a small number of users, she said.

But users who are constantly exposed to misinformation – "the misinformation burden," she called it – suffer lasting effects: when people "are exposed to ideas that are not true over and over again, it erodes their ability to connect with the community at large, because they no longer adhere to facts that are consensus reality," Haugen said.

Facebook knows, through its own research, that those most exposed to the most misinformation are often isolated in some fashion – divorced, recently widowed or having relocated to a new city, she said.

Since highly engaged users may propel misinformation, she suggested, platforms could build in curbs on those users or their posts.

"The fact that Facebook knows that its most vulnerable users, people who recently widowed, they're isolated, the systems that are meant to keep them safe like demoting this information stop working when people look at 2,000 posts a day," Haugen said. "It breaks my heart, the idea that these rabbit holes can suck people down and then make it hard to connect with others."

– Mike Snider

Parents need to be equipped to help children navigate online spaces

Facebook whistleblower Frances Haugen expressed concern that many parents around the world are poorly equipped to help their children navigate the dangers of social media, given the relative novelty of the technology.

“Very rarely do you have one of these generational shifts where the generation who leads — these parents — have such a different set of experiences that they don’t have the context to support their children in a safe way,” Haugen told lawmakers.

She called on schools and the National Institutes of Health to create guidelines to help parents guide children through the pitfalls of the internet.

“It should be easy for them to know what is constructive and not constructive, because Facebook’s research alone shows that kids today feel like they are struggling alone with these issues because their parents can’t guide them,” Haugen said.

The former Facebook product manager added that calls from some online critics for parents to simply "take a child's phone away" are unfair given the complexity of the issue in today’s society.

“The reality is that these issues are a lot more complicated than that,” she said. Haugen argued governments should provide aid to parents in navigating online platforms, like Facebook’s products, “because if Facebook won’t protect the kids we at least need to help parents to protect the kids.”

“Parents are anguished,” Democratic Sen. Richard Blumenthal said in response to Haugen’s testimony. The Connecticut lawmaker noted that Facebook disregarded recommendations in its internal research that sought to remedy many of the issues facing children and youth.

– Matthew Brown

Haugen has national security concerns about Facebook

Sen. Dan Sullivan asked Haugen whether Facebook might be an active haven for terrorists and global rivals, including China and Russia.

"Do they provide a platform for those leaders who, in my view, clearly don't hold America's interests in mind?" the Alaska Republican asked.

Haugen worked on Facebook's counterespionage team, which found China surveilling the network and signs that the Iranian government was doing espionage "on other state actors," she said.

"So this is definitely a thing that is happening," Haugen said, "and I believe Facebook's consistent understaffing of the counterespionage information operations and counterterrorism teams is a national security issue, and I'm speaking to other parts of Congress about that."

In response, Sullivan asked, "So you are saying in essence that the platform, whether Facebook knows it or not, is being utilized by some of our adversaries in a way that helps push and promote their interests at the expense of Americans?"

Facebook is "very aware that this is happening," she said. "I believe the fact that Congress doesn't get a report of exactly how many people are working on these things internally is unacceptable because you have a right to keep the American people safe."

As the questioning ended, Blumenthal suggested that Sullivan's questions and Haugen's answers "may have opened up a new area for another hearing."

"I have strong national security concerns about how Facebook operates today," Haugen said.

– Mike Snider

Former Facebook product manager Frances Haugen arrives to testify before the Subcommittee on Consumer Protection, Product Safety, and Data Security in Washington on Oct. 5, 2021. Haugen asserts Facebook prematurely turned off safeguards designed to thwart misinformation after Joe Biden defeated Donald Trump in last year’s elections. She believes the action contributed to the deadly Jan. 6 invasion of the U.S. Capitol.

Facebook's underpinnings explored as cause of conflict 

Haugen broke down how Facebook's own algorithms can affect user behavior on its platforms and lead to social problems such as the Jan. 6 revolt in Washington and conflict in places such as Ethiopia.

A process called engagement-based ranking determines which content users see on Facebook or Instagram, based on subjects and topics they have looked at previously. The company has done tests confirming how the networks' "amplification algorithms" keep users engaged while steering them toward new topics that could be harmful, she said.

"The way that they pick the content in Instagram for young users, for all users, amplifies preferences, and they have done something called a proactive incident response where they take things that they've heard for example like can you be led by the algorithms to anorexia content, and they have literally recreated that experiment themselves and confirmed. Yes, this, this happens to people," she told Sen. Amy Klobuchar, D-Minn. "So Facebook knows that they, that they are leading young users to anorexia content."

Facebook needs legislation to provide oversight, she said; otherwise it will never put aside engagement-based ranking. Without that pressure, "Facebook is going to say ... you're not gonna like Facebook as much, if we're not picking out the content for you," Haugen said.

Without the algorithm, users might not engage as much. As the platforms stand, "we spend more time on their platform (and) they make more money," she said.

"They know that other people will produce more content, if they get the likes and comments and ratios, they prioritize content in your feed, so that you will give little hits of dopamine to your friends, so they will create more content, and they've run experiments on people producer-side experiments where they have confirmed (this)," Haugen said.

She suggested Congress make changes in Section 230 of the Communications Decency Act, which shields online platforms from being responsible for what is posted by third parties on their sites, to make Facebook "responsible for the consequences of their intentional ranking decisions."

That could help prevent the viral spread of misinformation and content that results in violent incidents such as the Jan. 6 Capitol riot and repressive actions in Myanmar and Ethiopia, she said.

"I think the moment which I realized we needed to get help from the outside … that the only way these problems would be solved is by solving them together, not solving them alone, was when civic integrity was dissolved, following the 2020 election," Haugen said. "It really felt like a betrayal of the promises that Facebook had made to people who had sacrificed a great deal to keep the election safe by basically dissolving, our community and integrate and just other parts of the company."

– Mike Snider

Will the Facebook Files move Congress to act on social media?

Following the latest series of Wall Street Journal reports exposing a range of malpractices by the tech giant, lawmakers are rallying to regulate the company, a call that has been issued many times before.

“Here’s my message for Mark Zuckerberg: Your time of invading our privacy, promoting toxic content and preying on children and teens is over. Congress will be taking action,” Sen. Ed Markey, D-Mass., said in remarks during the Senate hearing.

“You can work with us or not work with us, but we will not allow your company to harm our children and our families and our democracy any longer,” the senator continued. “We will act."

Facebook is among a cadre of Big Tech companies that have come under increasing scrutiny from lawmakers across the globe as their influence and power have become more apparent.

While Washington has been slow to act on issues including antitrust, privacy, data portability and algorithmic bias among technology giants, regulators in the European Union have passed laws cracking down on Big Tech companies including Amazon, Alphabet, Apple and Facebook.

California lawmakers have also enacted policies targeting Big Tech from Sacramento on issues like privacy and taxation. The laws are seen by many industry watchers as a potential blueprint for if and when Washington finally moves on the issue.

On Tuesday, Markey and other lawmakers blamed Washington’s inaction on lobbying from the technology industry.

"We have not done anything to update our privacy laws in this country,” Sen. Amy Klobuchar, D-Minn., lamented during the hearing, “because there are lobbyists around every single corner of this building that have been hired by the tech industry."

Haugen, the whistleblower, argued that any regulation of social media should also include a dedicated regulatory agency staffed with people well-versed in the technology, because most of those who best understand it already work in the industry.

– Matthew Brown

Facebook likely to continue work on Instagram Kids, whistleblower speculates

Children's advocacy groups and some in Congress called on Facebook earlier this year to stop its work on a planned kids version of its social media app. But when Sen. Brian Schatz, D-Hawaii, asked Haugen about the project, she doubted that work had stopped.

"I would be sincerely surprised if they do not continue working on Instagram kKds, and I would be amazed if a year from now we don't have this conversation again," she responded.

Haugen said Facebook needed to ensure the "next generation is just as engaged" with Instagram. "And the way they'll do that is by making sure that children establish habits before they have good self-regulation."

"By hooking kids?" Schatz asked.

Haugen: "By hooking kids."

She went on to note the research she had provided showed that "problematic use" of social media peaked at age 14.

"It's just like cigarettes. Teenagers don't have good self-regulation," she said. "They say explicitly, 'I feel bad when I use Instagram and yet I can't stop'." 

– Mike Snider

Haugen: Algorithms encourage virality in face of incitement to violence on platforms

Facebook whistleblower Frances Haugen discussed how the company's use of its algorithms to boost engagement leads to dangerous levels of violence and conflict around the world, harms the company declines to prevent because doing so would cut into profits.

Haugen said Facebook needed to be “less twitchy, less reactive, less viral" as it develops products if it is going to avert the worst effects for individuals and society.  

Sen. Amy Klobuchar, D-Minn., asked Haugen about Facebook’s role in the 2020 election, during which the company briefly throttled engagement-boosting algorithms for American users ahead of the presidential vote.

“It seems that Facebook invests more in users who make them more money, even though the danger may not be evenly distributed based on profitability,” Haugen told Klobuchar.

“Facebook is presenting a false choice,” Haugen said, arguing that the company seeks profit through increased engagement at any cost. She said one of the major avenues for this, the engagement-based ranking of posts on Facebook and Instagram, is especially damaging.

“The choices that were happening on the platform was how reactive and twitchy was the platform, how viral was the platform, and Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous,” Haugen told lawmakers.

“And because they wanted that growth back, they wanted the acceleration of the platform back after the election, they returned to their original defaults. And the fact that they had to break the glass on Jan. 6 and turn them back on, I think that’s deeply problematic.”

– Matthew Brown

Haugen: Facebook needs to admit 'moral bankruptcy' to move forward

In answering questions from Connecticut Democratic Sen. Richard Blumenthal about Facebook CEO Mark Zuckerberg, Facebook whistleblower Frances Haugen said there is no one at the company holding its cofounder accountable.

"The buck stops with Mark," she said.

Facebook's corporate strategy has led to its current trials, she added.

"The metrics make the decision. Unfortunately that in itself is a decision and in the end, (Zuckerberg) is the CEO and the chairman of Facebook, he is responsible for those decisions," she said.

After Blumenthal called Zuckerberg "the algorithm designer in chief," Haugen described how the CEO and chairman's management style has led to a troublesome cycle.

"Facebook has struggled for a long time to recruit and retain the number of employees in needs to tackle the large scope of projects it has chosen to take on," she said. "That causes it to understaff projects, which causes scandals, which then makes it harder to hire."

That is why the company "needs to come out and say, 'We did something wrong. We made some choices that we regret,'" she said. "The only way we can move forward and heal Facebook is we first have to admit the truth."

– Mike Snider

Haugen: 'The choices being made inside Facebook are disastrous'

In her opening remarks, Facebook whistleblower Frances Haugen recounted her experience at Facebook and pleaded with lawmakers to tackle the tech giant's behavior head on.

“The choices being made at Facebook are disastrous,” Haugen said in her opening remarks, adding that many of the decisions made through its business practices and products have “led to actual violence that harms and even kills people.”

Haugen highlighted instances of radicalization on Facebook’s platforms around the world, including mob violence and genocide in countries such as Myanmar and Ethiopia.

She said the company was fully aware of its platforms’ effects on people, especially children. “This is about Facebook choosing to grow at all costs,” Haugen said.

“Almost no one outside of Facebook knows what happens inside of Facebook,” Haugen said, comparing the company’s opacity to other tech giants like Alphabet, which owns Google and YouTube.

Haugen said Facebook “intentionally hides” its inner workings from the American public and governments around the world in an effort to obscure the company's effects.

“Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices against the common good. Our common good,” she said.

Haugen called on lawmakers to intervene and rein in the social media company’s behavior, comparing Facebook's conduct to the way the tobacco and pharmaceutical industries avoided accountability in the past.

Haugen said that because of the black box nature of Facebook’s algorithms, the government and the public are left to judge the company’s algorithms by their end result, which is less effective than seeing the technology from the inside.

“A safer, free speech respecting social media is possible,” Haugen said, arguing that the many revelations about the company “are only the first chapters in a story so terrifying, no one wants to read the end of it.”

– Matthew Brown

Sen. Marsha Blackburn calls out Facebook's role in addiction in kids

Sen. Marsha Blackburn, R-Tenn., charged that Facebook continues to put profit ahead of the safety of the children and teen users on its platform – a bipartisan issue that could unite legislators against the tech giant.

She cited research provided by Facebook after Haugen's revelations, which found 66% of teen girls and 40% of teen boys on Instagram experienced negative social comparisons. Another finding: 52% of teen girls who experienced negative social comparison on Instagram said it was caused by images related to beauty.

"Social comparison is worse on Instagram because it is perceived as real life, but based on celebrity standards," Blackburn said.

The resulting social media consumption cycle can lead to "a downward emotional spiral encompassing a range of emotions from jealousy to self-proclaimed body dysmorphia," Blackburn said.

Facebook also accepts that users can become addicted, Blackburn said, using a term it conveniently calls "problematic use," which is "most severe in teens, peaking at age 14."

"Big tech companies have gotten away with abusing consumers for far too long," Blackburn said. "It is clear that Facebook prioritizes profit over the well-being of our children and all users."

– Mike Snider

Facebook CEO Zuckerberg, COO Sandberg silent amid crisis

As lawmakers and the public again train their attention on Facebook amid its most damaging scandal in years, the company’s top executives are silent.

Facebook Chief Executive Officer and co-founder Mark Zuckerberg is absent from the national spotlight. The company’s chief operating officer, Sheryl Sandberg, is also missing in action.

Sen. Richard Blumenthal, D-Conn., has said he will call on Zuckerberg to testify about the latest reports on the company’s internal research. Facebook has not said whether its CEO will appear before Congress.

Since the latest reports from the Wall Street Journal, the company’s vice president for global affairs and communications, Nick Clegg, has been the main spokesperson for the company, pushing back on the latest reports.

Instagram's top executive, Adam Mosseri, has also made media appearances since the latest revelations, including announcing that the company would halt work on its Instagram Kids project amid public backlash.

Zuckerberg and Sandberg’s absence from the public eye mirrors past major crises for the company, including the 2018 Cambridge Analytica scandal.

– Matthew Brown

Facebook's Monday outage stranded billions of users

Tuesday's hearing comes less than 24 hours after Facebook and its associated apps came back to life following one of the longest outages in its history.

Monday’s outage of Facebook, Facebook Messenger, Instagram and WhatsApp marooned billions of users who rely on the social media giant and its apps for everything from connecting with friends to running their businesses and logging into websites.

The social network and the Facebook-owned platforms stopped working around 11:30 a.m. EDT Monday, according to the site Downdetector.com. At around 5:40 p.m., some users were able to access the platforms, but not all functions were back.

Facebook said late Monday that “the root cause of this outage was a faulty configuration change” and that there was “no evidence that user data was compromised as a result.” 

– Terry Collins

Follow Matthew Brown online @mbrownsir.
