In 2016, the worst fears were that a wildfire of Russian propaganda on Facebook persuaded a bunch of Americans to vote for Donald Trump. In 2018, people spun yarns that the political consulting firm Cambridge Analytica brainwashed us with data it vacuumed up from Facebook users. Neither was quite right.
In those firestorms, there may have been too much credit given to the Kremlin, Cambridge Analytica and Facebook — and too little to human free will. And in Facebook’s crisis du jour, kicked off by a whistle-blower’s claims that the company repeatedly chose its short-term corporate interests over the good of humanity, some nuance has likely been lost. Instagram’s internal research on the app’s influence on teenage girls’ mental health doesn’t appear conclusive, as some researchers told me and as NPR has reported.
So yes, we’ve all gotten stuff wrong about Facebook. The company, the public and people in power have at times oversimplified or sensationalised the problems, misdiagnosed them or botched the solutions. We focused on how the heck Facebook allowed Macedonian teenagers to grab Americans’ attention with fabricated news, and did less to address why so many people believed it.
Each public embarrassment for Facebook, though, is a building block that makes us a little savvier about the influence of these still relatively new internet technologies in our lives. The real power of the scandals is the opportunity to ask: Holy moly, what is Facebook doing to us? And what are we doing to one another? Kate Klonick, a law school professor, told me that when she started as a Ph.D. student at Yale Law School in 2015, she was told that her interest in internet companies’ governance of online speech wasn’t a subject for serious legal research and publication. Online life was not considered real life, she explained. Russian election propaganda, Cambridge Analytica and other Facebook news in the years that followed changed that perception.
“Those stories have done one huge thing: They’ve started to make people take the power of technology companies seriously,” Dr. Klonick said.
That is one thing that’s different about this Facebook episode from all the ones that came before. We are wiser. And we are ready. There is a coterie of former tech insiders and outside professionals who have studied Facebook and other tech superpowers for years, and they are armed with proposed fixes for the harms that these companies perpetrate.
Another difference in 2021 is the presence of Frances Haugen, the former Facebook product manager who gave hours of testimony before a Senate subcommittee on Tuesday. (Bookending her testimony this week were twin outages of Facebook and the company’s other apps.) Haugen seems to be the right messenger with the right message at the right time.
Blame is a blunt instrument, but at each Facebook crossroad, we learn to wield blame more judiciously. Facebook and other online companies are not responsible for the ills of the world, but they have made some of them worse. We get it now.
The answers aren’t easy, but Haugen is directing our attention straight at Facebook’s molten core: its corporate culture, organisational incentives and designs that bring out the worst in humanity. And she is saying that Facebook cannot fix itself. A wiser public must step in.
Ovide is a tech writer with The New York Times. ©2021