Frances Haugen, a former Facebook employee, on Tuesday told US lawmakers that the social media giant has made “disastrous” choices with regard to children, public safety, privacy and democracy, and that the company needs “help” as it cannot fix these problems on its own.
Asked specifically what needs to be done, Frances Haugen, who was a product manager at Facebook handling democracy and misinformation issues and worked on counterespionage as part of the civic misinformation team, suggested changes to a section of a 1996 law that protects online platforms from liability for their content, so as to make Facebook reveal its decisions on algorithms, and the setting up of a “dedicated oversight body”.
“I used to work at Facebook. I joined Facebook because I think Facebook has the potential to bring out the best in us. But I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Frances Haugen said in her testimony at a hearing of the subcommittee on consumer protection, product safety, and data security. “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes as they put their astronomical profits before people.”
“Congressional action is needed. They won’t solve this crisis without your help,” she added.
In response to a question about her work on counterespionage, Haugen said, “I have strong national security concerns about how Facebook operates today.”
Frances Haugen said the company was unable to investigate all the cases they detected because of understaffing.
Frances Haugen left the company with thousands of documents on internal research that showed Facebook prioritised profits over safety as it pushed user engagement by amplifying misinformation and hate speech. She gave these documents to The Wall Street Journal, US Congress and regulators. In an interview with CBS’ 60 Minutes, she had said the social media company prioritised “growth over safety”.
“I understand how complex and nuanced these problems are. However, the choices being made inside of Facebook are disastrous for our children or our public safety, or our privacy and for our democracy. And that is why we must demand Facebook make changes,” Frances Haugen told lawmakers in her opening remarks.
“The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages,” she said.
When asked what she would want Congress to do, Frances Haugen had two specific suggestions. One, reform Section 230 of the Communications Decency Act of 1996, which grants online platforms immunity from liability for their content. But, she said, the reform must make Facebook reveal its “decisions about algorithms … Facebook should not get a free pass on choices it makes to prioritise growth, and virality and reactiveness over public safety”. And, two, there needs to be a “dedicated oversight body, because right now the only people in the world who are trained to analyse these experiments to understand what’s happening inside of Facebook are people who, you know, grew up inside of Facebook or Pinterest or another social media company”.
There has been bipartisan concern in US Congress over big tech’s growing reach and influence, especially Facebook’s. The same subcommittee had heard from a Facebook official at a hearing last week.
“Facebook knows its products can be addictive and toxic to children. And it’s not just that they made money, again, it’s that they value their profit more than the pain that they cause the children and their families,” said Senator Richard Blumenthal, the Democratic chairman of the subcommittee, in his opening remarks, while whistleblower Frances Haugen nodded along. “Facebook exploited teens using powerful algorithms and amplified their insecurities.”
Senator Marsha Blackburn, the top Republican on the subcommittee, said, “Having seen the data that you’ve presented and the other studies that Facebook did not publicly share, I feel pretty confident that it is Facebook who has done the misrepresenting.” She added, “Here is what else we know – Facebook is not interested in making significant changes to improve kids’ safety on their platforms, at least not when that would result in losing eyeballs on posts or decreasing their ad revenues.”