Facebook’s disinformation problem is harder than it looks

https://www.cjr.org/the_media_today/facebooks-disinformation-problem-is-harder-than-it-looks.php

By Mathew Ingram For Columbia Journalism Review

That Facebook can distribute dangerous amounts of misinformation around the world in the blink of an eye is not a new problem. But attention to it intensified when President Joe Biden told reporters during a White House scrum that Facebook was “killing people” by spreading disinformation, hoaxes, and conspiracy theories about COVID-19, and in particular about the efficacy of various vaccines. As Jon Allsop reported in the CJR newsletter on Wednesday, Biden backtracked somewhat on his original statement after pushback from the company and others: Facebook said that the country needed to “move past the finger pointing” when it comes to COVID disinformation, and that it takes action against such content when it sees it. Biden responded that his point was simply that Facebook has enabled a small group of about a dozen accounts to spread disinformation that may be discouraging people from getting vaccinated, and that this could lead to more deaths.

Biden appears to have gotten his information about this “disinformation dozen” from a group called the Center for Countering Digital Hate, which recently published research that, it said, showed the bulk of disinformation about COVID-19 and vaccines comes from a handful of accounts. The implication of the president’s comment is that all Facebook has to do is get rid of a few bad apples and the COVID disinformation problem will be solved. As Farhad Manjoo of the New York Times put it, however, Biden “reduced the complex scourge of runaway vaccine hesitancy into a cartoonishly simple matter of product design: If only Facebook would hit its Quit Killing People button, America would be healed again.” Biden’s comments may make for a great TV news hit, but solving a problem like disinformation at the scale of Facebook is much harder than he makes it sound, in part because it involves far more than a dozen bad accounts. And even the definition of what qualifies as COVID disinformation has changed over time.

As Jon Allsop described yesterday, part of the problem is that media outlets like Fox News seem to feel no compunction about spreading “fake news” about the virus in return for the attention of their viewers. That’s not a problem Facebook can fix, nor will ridding the social network of every COVID or vaccine hoax make much of a dent in the influence of Fox’s hysteria, which Yochai Benkler, an information researcher at Harvard’s Berkman Klein Center for Internet & Society, has argued was far more influential during the 2016 election than any social network. But even that is just the tip of the disinformation iceberg. One of the most prominent sources of COVID and vaccine disinformation is a sitting member of Congress: Marjorie Taylor Greene of Georgia. Another, Robert F. Kennedy Jr., belongs to one of the most famous political families in US history, and his anti-vaccination conspiracy theories put him near the top of the Center for Countering Digital Hate’s “disinformation dozen” list. What is Facebook supposed to do about their repeated misstatements?

Twitter blocked Taylor Greene’s account for 12 hours because she was spreading anti-vax hysteria, and Facebook could easily do likewise. But then what? The platforms could play Whack-a-Mole with such statements forever, or they could take definitive action and ban Taylor Greene and/or Kennedy outright for spreading disinformation. But as Manjoo pointed out in his Times column, doing so would hand right-wing critics even more ammunition for their cries of censorship, on top of what Donald Trump’s ongoing social-media ban already supplies. And the problem extends beyond people like Taylor Greene and Kennedy, obvious trolls like Alex Jones of Infowars, or even professional “bot” accounts that trade in disinformation for profit and influence: things that once seemed like obvious COVID disinformation no longer do.

Take the idea that the virus might have originated in a research lab in Wuhan, China. Not that long ago, this was treated by almost everyone, including Facebook, as disinformation, and sharing theories to that effect would get your account blocked. In recent months, however, experts have begun to entertain those theories more seriously than they did in the past, in part because of a track record of poor record-keeping and safety slip-ups at a number of similar laboratories. The idea that COVID might have escaped from a lab accidentally doesn’t have much more evidence to recommend it than it did six months or a year ago, but discussing the possibility is no longer a direct ticket to Facebook or Twitter oblivion. Pinpointing every piece of disinformation around something like COVID would be hard enough if there were agreement about all aspects of it, and there isn’t.

Joe Biden and his advisors, along with other critics of Facebook, might think that getting rid of disinformation is an easy task, and that the company is simply dragging its feet because it doesn’t want to disrupt its business. There is probably more than a little truth to that. But it is also true that finding the right line between disinformation control, public-health awareness, and outright censorship is not easy, and blocking accounts en masse for normal speech about an ongoing problem is not going to solve anything.
