
Why Facebook should be worried after the hearing


Former Facebook employee and whistleblower Frances Haugen testifies during a hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ in Washington. File/Reuters

Ahmed Baba, The Independent

In Facebook’s early days, its motto was “move fast and break things”. Little did they know, the company would go on to break societies around the globe.

This is a moment of reckoning for a social media giant with a user base the size of multiple nations. It’s been compared to similar reckonings in Big Tobacco, Big Oil, and Big Pharma. We’ve long felt Facebook’s damaging impact on our society, democracy, children, and health. Now, in an unprecedented move, we have someone from inside Facebook with tens of thousands of documents laying Facebook’s “moral bankruptcy” bare.

A damning picture is coming into focus: Facebook knows exactly how destructive its products are and is not doing everything in its power to fix them. These new allegations show that, time and time again, Facebook executives choose the maximisation of profit over the public good.

On Tuesday, former Facebook employee Frances Haugen delivered bombshell testimony before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security. She joined Facebook in 2019 as a product manager on its Civic Integrity team. Haugen has worked at other tech companies, from Google to Pinterest, but she says she had never encountered anything like what she saw at Facebook, which is what led her to become a whistleblower. After weeks of build-up through revelations in the Wall Street Journal and a 60 Minutes interview, Haugen officially went on the record outlining Facebook’s alleged corruption in no uncertain terms.

Haugen began her opening statement with a claim that epitomises the problems at Facebook: “I believe Facebook’s products harm children, stoke division, and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”

Haugen’s testimony and corroborating documents allege that Facebook has repeatedly misled the public, US government, and investors about its own research on how its products spread disinformation, foster anti-vaccine misinformation, foment hate speech, and impact the safety of children.

Let’s start with disinformation and hate speech. Haugen points to an algorithm change in 2018 that was intended to boost engagement, and as a result, made an already destructive platform worse. According to Haugen and the documents she’s provided, Facebook ran a test where a user merely liked some political pages like “Donald Trump” and “Melania Trump”. Within hours, they were being recommended QAnon content. Within days, they were being presented with other extremist content. We’ve all known that Facebook works as a radicalisation machine, increasingly recommending more extreme and dishonest content. But now, it’s being alleged that Facebook knows this too.

While Facebook has publicly stated it has made massive progress fighting disinformation and hate speech, Haugen has provided an internal document highlighting a study that indicates the opposite: “We estimate that we may action as little as 3-5% of hate and 0.6% of V&I (Violence and Incitement) on Facebook despite being the best in the world at it.” Haugen went on to argue that this kind of hate speech proliferation has led to ethnic violence in countries like Myanmar and Ethiopia. And closer to home, we’ve seen other troubling results.

Facebook’s role in the disinformation campaigns executed during the 2016 and 2020 elections is widely acknowledged. What is now being alleged involves some of the behind-the-scenes decisions. Haugen claims that Facebook disbanded the Civic Integrity group she was working in immediately after the election and turned off safety measures that had been implemented ahead of it. She claims that this helped foster the environment that led to the 6 January attack on the Capitol.

Haugen said that after the election, she thinks Facebook removed those safeguards because they wanted to boost engagement. In response to me tweeting about this assertion, Facebook Policy Communications Director Andy Stone responded: “I’m glad you wrote ‘thinks’ because the truth is we left a number of the measures on through Jan. 6, added additional measures following the violence at the Capitol, and made some of the changes permanent, like not recommending political groups.”

Whether or not Facebook kept some measures on while turning off others, the results weren’t great. The 6 January insurrection was planned and promoted on a number of social media sites, including Facebook. The lies about the 2020 election spread like wildfire on the platform and years of radicalisation culminated on the Capitol steps. While I think the decision to ban former President Trump was a good move, some would argue that it was too little, too late.

One of the main reasons this Facebook scandal is different from others is that it involves children. Tuesday’s hearing largely focused on the testimony and documents Haugen provided regarding Facebook’s research on its impact on teenage girls. Among those documents is a Facebook study that found “13.5 per cent of teen girls say Instagram makes thoughts of suicide worse, 17 per cent of teen girls say Instagram makes eating disorders worse”.

Earlier this year, Facebook CEO Mark Zuckerberg told a congressional hearing that “overall, the research that we have seen is that using social apps to connect with other people can have positive mental health benefits.” That runs contrary to the data Haugen has provided. That data hasn’t stopped Facebook’s desire to widen its reach among kids, which it calls a “valuable and untapped audience”. It wasn’t until these documents were revealed that Facebook put its “Instagram for kids” project on hold.

As we look at the totality of the revelations, this is really the bottom line: it’s not just that Facebook’s business model incentivises capturing attention at all costs; it’s that the company takes it to extremes. It’s a choice. Some have given Zuckerberg the benefit of the doubt or claimed he doesn’t truly grasp the damage his company has caused. But that excuse can no longer credibly be made.

This is why calls for regulation are becoming increasingly popular. Haugen asked for this during her testimony: “The choices being made by Facebook’s leadership are disastrous – for children, for public safety, for democracy – that is why we must demand Facebook make changes.” Haugen’s call for changes to Section 230, increased transparency, and oversight of Facebook’s engagement-based algorithm seemed to get widespread, bipartisan agreement among Senators.
