Dr. Samuel Woolley has a lot of experience warning the public. The author of the new book The Reality Game: How the Next Wave of Technology Will Break the Truth, Woolley met with Factal before an appearance at Town Hall in Seattle on January 9th.
At Town Hall, Woolley took care to point out that he’s an ethnographer, not a computer scientist. He’s talked to the makers and purveyors of disinformation. “There’s a person behind every bot,” he said from the stage (video).
One might expect Woolley to be cynical about the future we face. He’s the co-editor of the book Computational Propaganda, a term Woolley helped coin for automated manipulation on social media. In his role at the journalism school at the University of Texas at Austin, his job is to research the newest ways people lie on the internet.
Yet Woolley has found reasons for optimism. Every chapter of The Reality Game features solutions. At one point during our interview, he pauses during an answer.
“The core belief behind this book is that there’s a lot of power in people,” he says. It doesn’t take an ethnographer’s training to recognize he’s telling the truth.
Factal: So on the first page of the book, you explain that you prefer not to use the term “fake news.”
Dr. Sam Woolley: That’s right.
Factal: You suggest using the terms “disinformation” and “misinformation” instead. What’s the difference? And why does the difference matter?
SW: So first of all, the reason that I don’t use the term “fake news” is because I think that it has been co-opted by the people that make and spread fake news.
What we’re in now is a bit of a conundrum where the term “fake news” has become weaponized by the powers that be, to spread dis- and misinformation. When I say dis- and misinformation, I mean two different things.
Disinformation is the purposeful spread of misleading or false content. So, for instance, it would be a government saying that an event never happened when it actually did.
Misinformation is the accidental spread of false or misleading information. So that would be like if my grandmother accidentally shared an article on Facebook that turned out to be completely false.
Factal: And so the disinformation often feeds into misinformation?
SW: Disinformation is often the precursor to misinformation. So oftentimes, there are campaigns that are waged by governments or militaries or even regular people in attempts to manipulate public opinion, to mold the way that people think about certain issues or to make people apathetic or to make people angry. And oftentimes, that disinformation gets spread as misinformation by people without any kind of bad intentions.
Factal: This past week has been full of news that’s been muddied by disinformation. How quickly do disinformation and computational propaganda latch onto breaking news?
SW: Breaking news and disinformation and computational propaganda are all very closely tied together, and oftentimes we see things happening within a matter of minutes. So, for instance, take the recent Iranian missile attack against Iraqi bases that were housing US troops.
There was breaking news that was released purporting to show images of a missile that was being launched at the base. Verification quickly caught that the picture of the missile was actually a picture from 2017, from a prior attack.
What ends up happening most of the time, if groups like Factal don’t catch it, is that the content quickly gets picked up by bots, depending on the intention of the people who originally spread it.
If the account that is tweeting the image or video or content out is a well known account, a popular account, an influencer – then there’s an expectation that bots, and a lot of other accounts, will latch on to the stuff that is spreading and either spread it purposefully for the sake of misinformation, or disinformation.
It’s kind of like the ouroboros, the snake eating its own tail. All of this stuff is really interconnected, and oftentimes journalists mistakenly share disinformation.
Journalists are the targets of disinformation campaigns, with a specific intention to get them to share breaking news that is actually false.
Botnets aren’t trying to get you or me to change our minds about something. [… They’re trying to] trick the algorithm on a site like Twitter or YouTube into re-curating the content for people like me or you on the trends page. – Dr. Samuel Woolley
Factal: So, sometimes, breaking news is a vehicle for disinformation where the news event has not occurred and there’s not even really an inciting incident?
SW: I call this “manufactured consensus”. It’s actually the name of my next book, which is going to be more of an academic book with Yale Press.
Manufacturing Consensus riffs on the work of Herman and Chomsky’s Manufacturing Consent, which basically talked about the idea that the media controls to a large degree how people think and what they think about.
In Manufacturing Consensus, the argument I make with these bots and things is that I say that the armies of bots on social media – combined with trending algorithms that pick up stuff based upon how much [activity] is going on, so if you have 10,000 bots it makes it look much more popular – it creates the illusion of popularity for a candidate or for a cause or for an idea.
And so oftentimes people will use bots – on Twitter specifically, but also on YouTube, also on Facebook, in an attempt to make something trend. And there’s a public misperception there.
Botnets aren’t trying to get you or me to change our minds about something. There’s not like a bot that’s attacking Tyler and saying “Hey Tyler, you should believe that Liz Warren is the best candidate for president and here’s why.”
It’s actually that the bot is using hashtags and a combination of other key terms and things to trick the algorithm on a site like Twitter or YouTube into re-curating the content for people like me or you on the trends page. Then we’re sort of seeing a repackaging of that false content.
It gives it credibility, and the same exact thing happens when a journalist spreads it, right? If a journalist picks something up because they noticed that something’s trending and then they spread it, then we’re more likely to believe it as well.
Factal: Today, I told my friend that I’m attending [your talk at] Town Hall and that it might be a bummer. Will it?
SW: [laughs] There’s going to be things in my talk that will be disconcerting and that should worry people. There’s a reason that the American book has the subtitle “how the next wave of technology will break the truth”. It’s meant to be provocative. It’s intentional, it’s meant to make people scared and it’s meant to make people think about the ways in which technology can become a tool for manipulating people.
The thing is, though, in the United Kingdom, my book just came out today, actually, through a different publisher. And there’s another part of the subtitle, which is, “how the next wave of technology will break the truth, and what we can do about it”.
Every single chapter in the book ends with solutions. So there’s a chapter on AI, there’s a chapter on deepfakes, there’s a chapter on VR/AR, and every single chapter ends with solutions.
The solutions are user-based, policy-based or government-based, and they’re also corporate and [privately] based. What can happen in all of these sectors? What can we do as people, what can the government do, what can companies do?
I’m of the opinion that we spend a lot of time relitigating the past and thinking a lot about 2016 and, you know, making the Russians into a boogeyman. That move actually gave them a lot more steam and a lot more power and made politics more divisive in this country rather than less. In actuality, the situation is much more complex. Most of the tools that were used in 2016 were super simple, rudimentary bots that were used to spread this stuff, and organic armies of people that were trying to get memes to trend.
I think you’re actually going to find out when you come to the talk tonight that there’s a great deal of hope in the book, and a great deal of optimism in the book about getting out in front of the problems that we face now.
But what this book’s about is preventing the misuse of way more powerful technology, the kind everyone thought was used in 2016 but actually wasn’t, and that might actually be used later.
Factal: So, not a bummer?
SW: Yeah. I’m going to talk about things that are a bummer, but I am going to talk about what we need to do.
Factal: How do you propose tech creators ensure that their products promote the values of democracy, the values of human rights? If we’re all participating in “the reality game”, which rules are most in need of a rewrite?
SW: There’s a quote in the front of the book by a woman named Betty Reid Soskin, who’s a 98-year-old park ranger in San Francisco at the Rosie the Riveter Museum. She’s amazing. She has talks on YouTube, where she theorizes about democracy.
She worked in the shipyards that launched a lot of battleships during World War Two. She’s phenomenal, and what she said is: “Democracy will never be perfect, we all have to build democracy in our own time.” We all have to work towards that more perfect union. [Democracy] was never meant to be finished, it’s a work in progress.
People go around saying that democracy is done, that tech has ruined democracy. I don’t buy that. Everyone should look back to the core tenets of democracy, but also to the core tenets of human rights, and think very carefully about how we can recreate those, or re-place those tenets into our day-to-day life now.
Factal: So some of the deleterious effects of misinformation and disinformation are really obvious, like, people thinking that democracy is over with. What are some of the less obvious effects?
SW: There’s a few effects of disinformation and misinformation that fly a little under the radar. One of them is that disinformation and misinformation often lead to disengagement. A lot of the attacks that we saw in 2016, and a lot of the continued attacks we see around the world now that look like computational propaganda, are actually built to make people apathetic or disengage with politics.
For instance, lots of the messaging that was spread during the 2016 election was targeting minority communities with text messages or with messages over social media that said “why vote?” or even in other circumstances trying to get people to vote on a different day, things like that.
Authoritarian foreign powers [have a vested interest] to get people to disengage with democracy, to make them think they don’t have any kind of voice.
The core belief behind this book is that there’s a lot of power in people and in numbers of people.
Another thing that’s unexpected that often shows up with dis- and misinformation is polarization. If disinformation is the wave that we’re experiencing right now, polarization is kind of the tide.
If disinformation is the wave that we’re experiencing right now, polarization is kind of the tide. We get really excited about the wave that’s happening, but the tide’s there all the time. – Dr. Samuel Woolley
We get really excited about the wave that’s happening, but the tide’s there all the time. Polarization is what’s driving disinformation, but also is what’s caused by disinformation so it’s, again, a chicken and egg thing.
We have a lot of work to do specifically in the United States to overcome these kinds of divides that have been built.
Disinformation is just a really useful [tactic] to continue to drive polarization. The fact of the matter is, with the rise in populism and things like nationalism in countries like the United States, India, Brazil, the Philippines and Turkey, it benefits the leaders that are ruling those countries to continue to allow polarization to thrive.
So that’s why many of those people who lead those countries are actually primary users of the term “fake news”. They attack the news media, they try to attack academics. The book actually begins with [President] Rodrigo Duterte in the Philippines attacking my research team.
Factal: Not particularly effectively, though.
(Ed. note: Duterte’s quote to begin Chapter 1 is “Oxford University? That’s a school for stupid people.”)
Factal: What resources do you recommend for people inspired and motivated by The Reality Game?
SW: There’s organizational resources, so, other groups like Factal, private and nonprofit, who are doing great work.
There’s First Draft, who are sort of leading the charge in the next era of fact checking. There’s the Center for Media Engagement [at the University of Texas at Austin, where Woolley is program director for computational propaganda research], which works directly with journalists and journalistic entities to train them on all sorts of things related to digital media and politics.
There’s the Poynter Institute in Florida, which does excellent work with journalists. There’s the Atlantic Council, which has a lab that works specifically on this stuff [the Digital Forensic Research Lab, or DFRLab]. The German Marshall Fund has the Alliance for Securing Democracy.
And then there’s researchers out there that are doing great work, like the SMaPP Lab – social media and political participation – lab at NYU. There’s Kate Starbird here in Seattle [at the Center for an Informed Public].
As for other things people can start doing in terms of engagement: people should work to get really savvy with what their rights are, and what they can change on platforms. It’s kind of amazing when you go into Facebook settings or Instagram settings or Twitter settings, what you can actually change.
I think that everyone should get a little more woke to the mechanisms that exist already.
People should start demanding platforms that are designed with democracy in mind. It’s time that some of the social media platforms that exist today become legacy media as quickly as they became new media.
And I think we need some new platforms. We need platforms that put democracy and human rights front and center and that say we’ve built this thing to be — not just ethical, because I think ethics has been touted a lot by tech companies in order to obfuscate.
(Ed. note: Later that night, at Town Hall, Woolley will mention the Ethical OS Toolkit onstage – he worked on it while at the Institute for the Future. It describes itself as “a guide to anticipating the future impact of today’s technology.”)
“Ethics” is in the eye of the beholder. You can’t really argue the same about “human rights”. Human rights are, as they say, unalienable.
Factal: Do you see any potential new mediums or new platforms that you think are built on that foundation?
SW: I think that there’s still a lot of room for people to innovate on a grand scale. Right now there’s lots of great companies that are just starting up that could become that, companies like Factal, but at the moment they’re still scaling.
Facebook’s incredibly powerful and so is Google, and they both have the ability to buy companies that they see as competitors and shut them down. And unfortunately, this tends to happen.
Hopefully, we’ll be able to see a new perspective, from the venture world, and from the capital world so that there is actually a lot of money that gets directed to these new companies. The reality is that democracy and human rights are pretty good for business in today’s era.
Factal: What do you hope readers of your work and specifically of The Reality Game will do differently after reading it?
SW: I hope people do two things. I hope that they challenge themselves to always read the whole article, read below the headline and never share an article unless they’ve done that.
And second, I challenge them to have difficult conversations with people they love and that they’re close to, because that’s how minds actually get changed.
Factal: Thanks so much for the time, and we’ll see you at Town Hall tonight!
SW: Thank you.
Tyler Adams is senior marketing manager at Factal, a breaking news verification platform relied on by enterprise companies and NGOs alike. Top photo: A screenshot of Woolley’s interview with PBS Frontline.