Look Closer to Home: Russian Propaganda Depends on the American Structure of Social Media

The State Department’s Undersecretary for Public Diplomacy, Richard Stengel, wanted to talk about his efforts to counter Russian propaganda. So he called up David Ignatius, long a key cut-out the spook world uses to air its propaganda. Here’s how the column that resulted starts:

“In a global information war, how does the truth win?”

The very idea that the truth won’t be triumphant would, until recently, have been heresy to Stengel, a former managing editor of Time magazine. But in the nearly three years since he joined the State Department, Stengel has seen the rise of what he calls a “post-truth” world, where the facts are sometimes overwhelmed by propaganda from Russia and the Islamic State.

“We like to think that truth has to battle itself out in the marketplace of ideas. Well, it may be losing in that marketplace today,” Stengel warned in an interview. “Simply having fact-based messaging is not sufficient to win the information war.”

It troubles me that the former managing editor of Time either believes that the “post-truth” world just started in the last three years or that he never noticed it while at Time. I suppose that could explain a lot about the failures of both our “public diplomacy” efforts and traditional media.

Note that Stengel sees the propaganda war as a battle in the “marketplace of ideas.”

It’s not until 10 paragraphs later — after Stengel and Ignatius air the opinion that “social media give[s] everyone the opportunity to construct their own narrative of reality” and a whole bunch of inflamed claims about Russian propaganda — that Ignatius turns to the arbiters of that marketplace: the almost entirely US-based companies that provide the infrastructure of this “marketplace of ideas.” Even there, Ignatius doesn’t explicitly consider what it means that these are American companies.

The best hope may be the global companies that have created the social-media platforms. “They see this information war as an existential threat,” says Stengel. The tech companies have made a start: He says Twitter has removed more than 400,000 accounts, and YouTube daily deletes extremist videos.

The real challenge for global tech giants is to restore the currency of truth. Perhaps “machine learning” can identify falsehoods and expose every argument that uses them. Perhaps someday, a human-machine process will create what Stengel describes as a “global ombudsman for information.”

Watch this progression very closely: Stengel claims social media companies see this war as an existential threat. He then points to efforts — demanded by the US government under threat of legislation, though that goes unmentioned — by social media companies to remove “extremist videos,” with extremist videos generally defined as Islamic terrorist videos. Finally, Stengel puts hope on a machine learning global ombud for information to solve this problem.

Stengel’s description of the problem reflects several misunderstandings.

First, the social media companies don’t see this as an existential threat (though they may see government regulation as such). Even after Mark Zuckerberg got pressured into taking some steps to stem the fake news that had been key in this election — spread by Russia, right wing political parties, Macedonian teenagers, and US-based satirists — he sure didn’t sound like he saw any existential threat.

After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.


This has been a historic election and it has been very painful for many people. Still, I think it’s important to try to understand the perspective of people on the other side. In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.

And that’s before you consider reports that Facebook delayed efforts to deal with this problem for fear of offending conservatives, or the way Zuckerberg’s posts seem to have been disappearing and reappearing like a magician’s bunny.

Stengel then turns to efforts to target two subsets of problematic content on social media: terrorism videos (defined in a way that did little or nothing to combat other kinds of hate speech) and fake news.

The problem with this whack-a-mole approach to social media toxins is that it ignores the underlying wiring, both of social media and of the people using social media. The problem seems to have more to do with how social media magnifies normal characteristics of humans and their tribalism.

[T]wo factors—the way that anger can spread over Facebook’s social networks, and how those networks can make individuals’ political identity more central to who they are—likely explain Facebook users’ inaccurate beliefs more effectively than the so-called filter bubble.

If this is true, then we have a serious challenge ahead of us. Facebook will likely be convinced to change its filtering algorithm to prioritize more accurate information. Google has already undertaken a similar endeavor. And recent reports suggest that Facebook may be taking the problem more seriously than Zuckerberg’s comments suggest.

But this does nothing to address the underlying forces that propagate and reinforce false information: emotions and the people in your social networks. Nor is it obvious that these characteristics of Facebook can or should be “corrected.” A social network devoid of emotion seems like a contradiction, and policing who individuals interact with is not something that our society should embrace.

And if that’s right — which would explain why fake or inflammatory news would be uniquely profitable for people who have no ideological stake in the outcome — then the wiring of social media needs to be changed at a far more basic level to neutralize the toxins (or social media consumers have to become far more savvy, in the absence of any training that would help them do so).

But let’s take a step back to the way Ignatius and Stengel define this. The entities struggling with social media include more than the US, with its efforts to combat Islamic terrorist and Russian propagandist content it hates. It includes authoritarian regimes that want to police content (America’s effort to combat content it hates in whack-a-mole fashion will only serve to legitimize those efforts). It also includes European countries, which hate Russian propaganda, but which also hate social media companies’ approach to filtering and data collection more generally.

European bureaucrats and activists, to just give one example, think social media’s refusal to stop hate speech is irresponsible. They see hate speech as a toxin just as much as Islamic terrorism or Russian propaganda. But the US, which is uniquely situated to pressure the US-based social media companies facilitating the spread of hate speech around the world, doesn’t much give a damn.

European bureaucrats and activists also think social media companies collect far too much information on their users; that information is one of the things that helps social media better serve users’ tribal instincts.

European bureaucrats also think American tech companies serve as a dangerous gateway monopolizing access to information. The dominance of Google’s ad network has been key to monetizing fake and other inflammatory news (though Google did start, post-election, to crack down on fake news sites advertising through its network).

The point is, if we’re going to talk about the toxins that poison the world via social media, we ought to consider the ways in which social media — enabled by characteristics of America’s regulatory regime — is structured to deliver toxins.

It may well be that the problem behind America’s failures to compete in the “marketplace of ideas” has everything to do with how America has fostered a certain kind of marketplace of ideas.

The anti-Russian crusade keeps warning that Russian propaganda might undermine our own democracy. But there’s a lot of reason to believe red-blooded American social media — the specific characteristics of the global marketplace of ideas created in Silicon Valley — is what has actually done that.

Update: In the UK, Labour’s Shadow Culture Secretary, Tom Watson, is starting a well-constructed inquiry into fake news. One question he asks concerns the role of Twitter and Facebook.

Update: Here’s a summary of fake news around the world, some of it quite serious, though without a systematic look at Facebook’s role in it.

Marcy Wheeler is an independent journalist writing about national security and civil liberties. She writes as emptywheel at her eponymous blog, publishes at outlets including Vice, Motherboard, the Nation, the Atlantic, Al Jazeera, and appears frequently on television and radio. She is the author of Anatomy of Deceit, a primer on the CIA leak investigation, and liveblogged the Scooter Libby trial.

Marcy has a PhD from the University of Michigan, where she researched the “feuilleton,” a short conversational newspaper form that has proven important in times of heightened censorship. Before and after her time in academics, Marcy provided documentation consulting for corporations in the auto, tech, and energy industries. She lives with her spouse in Grand Rapids, MI.

7 replies
  1. Trevanion says:

    Very thoughtful.

    Unstated, but something I would assume you would include in the “wiring” or “structure” of the social networks and their ability to deliver toxins is the extreme distance the pendulum has swung (esp. in Silicon Valley) toward a winner-take-and-keep-all business model and the USG’s continuing ho-hum approach to market concentration (something bought and paid for, of course).

    • emptywheel says:

      Yes–thanks for adding it. Google and Facebook (Twitter not as much) have a real gatekeeper effect. That’s why it’s so important when their algos can be abused.

  2. bloopie2 says:

    Good take down of US propaganda, as usual. But I’m going to put in my “not having a good day at work” quibbles.
    “The point is, if we’re going to talk about the toxins that poison the world via social media, we ought to consider the ways in which social media — enabled by characteristics of America’s regulatory regime — is structured to deliver toxins.” I’ve read this post three times and can’t find a listing of the “ways in which social media … is structured to deliver toxins”, ways that don’t also deliver good stuff. And what does this mean: “then the wiring of social media needs to be changed at a far more basic level to neutralize the toxins”—what do you mean, technically, in practice—what is this “wiring” that can be changed? At the least, I can’t find any finger pointing herein along the lines of “Google does ABC and it is bad and it should change to XYZ” or some such.
    Addressing a similar (or the same?) issue, the New Yorker has an item about how Silicon Valley lacks empathy—it promotes technological advance without thinking of or caring about the potential human cost (search news for “Silicon Valley Has an Empathy Vacuum”). The driverless truck manufacturers, the Airbnbs, the Ubers, the Amazons, all being pushed to make money for investors (i.e., those who already have more money than they need to live on), at the cost of eliminating lower and middle class jobs and wealth. Again, it’s nice to see someone point out the problem, but the author provides no viable answers. No suggestion of how to make Amazon stop in place where it is, no more expansion, allowing the remaining retailers to survive. No suggestion of how to make Airbnb stop in place where it is, no more expansion, allow the remaining hotels to survive. Or Uber, and taxi drivers. Or “everyone making driverless trucks”, and the million or more truck drivers.
    I personally don’t have the answers. How would thousands of unemployed truck drivers, with no money, fight back? Going on strike–oops, no. Train them, increase their skills, so they can create new technology to put another group of people out of work? It’s so damn frustrating. Maybe there are no answers to my “quibbles” and that’s why you didn’t essay them.

    • emptywheel says:

      A big part is just competition. These industries have all become too concentrated. I also think Wall Street has come to expect too much profit, which against the background of financialization makes the companies uninterested in investing for value anymore (though that’s probably more true elsewhere).

      As to how the wiring works? Do click thru to the Scientific American piece. I think it’s a combination of us being wired to prefer clickbait, but also of the algo preferring clickbait. Some people have suggested that Facebook should return to having editors, and that may fix things. Or maybe just tone down the clickbait response some, diminishing the value of fake news, terror videos, and hate speech.

      There also needs to be a better means to make norms do their work. FB is pretty strict about not pulling hate speech bc it’s free speech. Maybe, but if it violates the norms of a community there ought to be a way to shun the person, as would happen in real life.

      Facebook is clearly working on some of this (Twitter less so, because it’s less of a gateway and therefore under somewhat less pressure), but they seem hesitant to do the things that would normalize the community.

  3. earlofhuntingdon says:

    Mr. Ignatius wrongfoots himself from the start, apparently taking Mr. Stengel’s framing at face value:

    “In a global information war, how does the truth win?”

    Are we at war or only managing longstanding conflicts of interest?  Is truth the main, or even a principal, aim of diplomacy?  Winning perhaps, but “truth”?  Surely, the philosophy department is better suited to its pursuit than the hegemon’s diplomatic arm.

    Nevertheless, Mr. Stengel’s question deftly establishes his street cred as assistant propagandist in chief by wrapping a misdescription inside a disputed claim, leading to a misdirection.  One would have thought that being the former managing editor of Time, whose founder often let his political slant overwhelm his reporters’ facts, would have been credential enough. Ironically, Mr. Stengel’s topic, like the paraphrase from Churchill, is about Russia.  His original, from 1939:

    “I cannot forecast to you the action of Russia.  It is a riddle, wrapped in a mystery, inside an enigma; but perhaps there is a key. That key is Russian national interest.”

    One might similarly describe how America, and its State Department’s propaganda arm, pursues its own national interest.



  4. earlofhuntingdon says:

    Mr. Stengel hopes that our tech giants will one day lead us in the pursuit of the truth, or at least facts, rather than pursue profits.  One can but hope that a billion here or a billion there will one day be regarded as not enough money to distract them from the pursuit of truth.  But I wouldn’t bet Mr. Stengel’s soon-to-expire job on it; rents in San Francisco and San Jose are more likely to drop by half.

  5. Hieronymus Howard says:

    The fake news is the MSM.   Got so I could not / cannot do NPR “Morning Edition” with my coffee.   It is TASS (a Soviet news agency) revisited.   & now they’re doubling down in the wake of the Triumphian insurgency.
