The pursuit of truth in a post-fact world

Ali Velshi


A few years ago, I started thinking about the issue of fake news. I delivered my first talk on the topic right here at Queen’s, for a TEDx talk in 2015. Back then, the main idea about fake news was that it was lucrative. It was cheaper, and more sensational, than real news. People would click on fake news links, and someone would make money from it. My concern back then was simply that people were going to put all sorts of bad information out there. What would happen, I wondered, if someone decided that, not only is fake news profitable, but with it, you can start to influence people’s thoughts politically? How naïve I was back in 2015.

Misinformation and disinformation

So I’d like to talk to you about where this has all gone since then, and I want to start with two important terms: disinformation and misinformation. They’re sometimes used interchangeably, but they actually mean different things. And the distinction is important.

There is intent in disinformation. Misinformation? This happens to all of us: when we retweet something we believed was credible but wasn’t. And it’s a basic – but really important – concept to understand about the world in which we now live.

Disinformation is somebody deliberately putting out bad information. Misinformation is somebody retweeting that bad information, believing it, posting about it.

Both are really dangerous. And sometimes the same person will do both. Donald Trump has a habit of doing both, actually. Sometimes it’s just his own stuff that he tweets that isn’t true, but he often retweets other people’s untrue stuff.

So let’s talk about how we solve this problem.

And the only way, I think, that you solve this is, first, by understanding how to identify misinformation, and second, by not being part of the disinformation problem.

I remember when social media first was invented. And I thought to myself, “This is amazing. No one will ever be able to lie again.” Because you’ll put a lie out and everybody will check it and you’ll – within moments – be embarrassed.

Clearly, I was on the wrong side of that one, too. If you can get your disinformation into an amplification channel, it then exists in a much bigger way.

As a journalist, fact-checking is certainly one of the first things you learn.

When I was doing this for the first time, when I worked here at the Queen’s Journal, I didn’t yet understand that was my only job. I didn’t understand that was even most of my job. We were supposed to check things. We called it sourcing and fact-checking. I assumed that if I interviewed someone, they weren’t outright lying to me, but they might have some facts wrong. Or they might be misrepresenting something. So I would check the facts and make sure there were other sources who confirmed them. I didn’t realize I was going to be in a world where people actually come on my TV show (which millions of people see) and lie.

The death of shame

In the old world, back in 2015, when I’d correct somebody on their lie, by the end of the interview I would have expected a phone call from their office to apologize or to clarify the statement. Maybe there’d even be a resignation. But now, they’ve posted the lie on their website before my show is even over. So, in the process of the mastery of disinformation and its amplification into misinformation, we have seen a subsequent phenomenon, which is the death of shame.

It used to be that being caught out in a lie was career-ending, reputation-damaging, bad for your income. Fake news for profit was the nonsense that used to occur, but, generally speaking, it wasn’t all that damaging for the rest of us. “Right” didn’t really matter. Somebody got rich off of it. You were misinformed for a few hours; you got embarrassed because you repeated it at a party and someone let you know it was a hoax.

But now, it’s about politics.
It’s about power.
It’s about wars.
It’s about coronavirus.
There’s some really bad information out there. If you look up all the stuff you’re supposed to do about coronavirus, it’s kind of boring. Wash your hands. Stop touching your face. But if you go to the wrong sources, they will tell you to drink bleach, which, I suspect, will kill the coronavirus. And you. It will kill you.

So this is the problem: a combination of things has gone on. First, we trust our institutions less than we used to. We’re not necessarily turning to the places we should be turning to for health advice. Most people don’t go to the World Health Organization or the Centers for Disease Control for their information on a daily basis. So disinformation makes it to Google, which we do turn to. During the 2016 presidential election in the States, the top 20 fake stories about politics were far more popular than the top 20 real stories. Fake stories have better headlines.

And second, the algorithms are better. Fake news stories will be slotted into your feed in a way that is more lucrative to the social media companies. And so fake news spreads faster than real news. If I am creating a fake news piece, I can tailor my story to you, the audience. I don’t have to figure out a way to tell a story about what actually happened; I can understand that you are driven by fear – or by whatever you’re driven by – and I can use that to aim my entire message at you. And if you read some of what has been written about the 2016 election, what you will see is some remarkable targeting. We know much more about this now than when I gave my talk here in 2015. But it’s all deliberate.

Fear of “the other”

A disinformation campaign feeds on the idea of there being an “other” – someone to blame. The 2016 U.S. election – and many elections in the Western world in the last 10 years – have all centred around “the other.” In the U.S., the traditional American conservative mantras of low taxes, abortion, and guns have been supplanted by a fear of immigrants overrunning the American borders.

Once you have created “the other,” people are susceptible to lies about them. And things like disease and crime fit neatly into this narrative. So what you have is a willing audience. In some cases, you have an amazing distribution method for the narrative.

This is my larger point concerning the coronavirus. It is serious. There’s no question it’s serious. It has already infected many times more people than SARS did and it’s got a higher mortality rate. We don’t know enough about it. We haven’t tested enough people to know. But it’s not actually going to destroy the world. It will change some of our behaviours, but it will not wipe humanity out. But acting on the misinformation about coronavirus will, ultimately, make defeating this very real threat harder than it may have been if we relied solely on the facts.

But before coronavirus, it was socialists that were going to wipe America out. Before that, it was immigrants. It’s always something. Something is always going to be a fundamental threat to your way of life and how you exist. And social media preys on that and people prey on that.

Covering the lies

I live in a world where probably 50 per cent of what I do is cover Donald Trump. And people have said to me, “Why don’t you just not do it? Why don’t you just not put him on TV? Why don’t you just not cover the lie?”

But it’s important to talk about the lie and to correct it.

For instance, Donald Trump has said incorrect things about the trade deficit with Canada. And he would say it and then we would correct it on TV. We have researchers and fact-checkers, and I’d go over and over the numbers. Once, twice, three times. And then he’d repeat the misinformation again, the next day, and the day after that. At that point, I realized, “This is not just misinformation; it’s not just him repeating something he heard. There is intent behind spreading this information.”

So, I have a responsibility when it comes to reporting on what the President of the United States says and what the facts are. He creates policy. In the course of one of the 150 to 200 tweets he sends in a day, he will, at some point, create a policy that will affect you. And I don’t want to have decided that that was the day I wasn’t covering Donald Trump.

There are days when he’s just tweeting, tweeting, tweeting and some of it means nothing. And then suddenly it does mean something. He tweets and it affects markets. That affects your investments.

Recently, the central bankers of all the G7 countries had a meeting and they agreed that they didn’t need to cut interest rates. But Donald Trump had been tweeting about interest rates so much that the Federal Reserve, which is supposed to be independent, cut interest rates in the U.S. by half a percentage point, which is a very unusual thing.

And then Canada cut its interest rates because Donald Trump bullied his guy into cutting U.S. interest rates when most of the smartest people in the world said this was not the moment to do that.

So what happens when you cut interest rates? Anything that bears interest becomes unattractive. So even people who otherwise wouldn’t be in the stock market put money into the stock market. So if you’re upset about the interest rate that you’re getting in the bank today, thank Donald Trump. Now you might be happy because your stock market came back a little bit, but that’s a bit of a short-term solution.

So, it’s not just the bald-faced lies that we need to worry about, it’s this general ecosystem of bad information that makes its way out there and then becomes government policy. So why do I cover this? Because whether you like Trump or not, whether you’re interested in American politics or not, this just affected every Canadian financially. Misinformation leads to policy that affects actual people.

Bearing witness and holding power to account

The number one duty of a journalist is to bear witness. So sometimes our job is just to tell you that it happened, which is exhausting when it comes to Donald Trump and his tweeting. But it’s important that you know, because the number two calling of my profession is to hold power to account.

So sometimes, on my show, I have to go down a rabbit hole, talking about something Trump – or someone else – said, and then correcting the disinformation, when I had planned to tell my viewers about something else that was really interesting and probably useful. Because I’ve got to hold them to account and tell you why it was dishonest. And by that time, my show is over.

I don’t necessarily mean to pick on Trump. He just happens to be an interesting, well-known example of disinformation in action. But I have interviewed so many public officials who will say things to me that I can prove to be untrue. In fact, I have a bank of all that stuff ready to go when I do an interview; I’ve got charts and data. It doesn’t matter. People will just lie to me. They’ll just say things, and then what they say goes into the ether and a lot of people will see it. And those people will not only internalize it, they’ll spread it. And then they’ll take it to the ballot box with them.

Back in the old days, misinformation was presented at the Thanksgiving table. Everybody had an uncle or somebody who pronounced, “This is what the government is up to.” And they know because someone “in the know” told them. Distrust in government has been going on for a long time. But now with the ability to spread it professionally, it’s not just your crazy uncle at Thanksgiving. It could be somebody you really trust, someone in the social media world, and, in fact, someone in the real media world.

People are curating their information. They curate who they get it from. And with social media, this is compounded. Once you – inadvertently or deliberately – click on something very politicized, then an algorithm kicks in: “Ah, you’re worried about these issues?” and it will feed you more. And by the way, this doesn’t mean that you clicked on something that says, “Are immigrants criminals?” They’re much more sophisticated than that. Facebook knows – from what you do, what you post, what you say, where you say you are – Facebook knows more about what you think than you know about what you think. They really have behavioural tendencies mastered.

So you’ve got this combination of algorithms that work to distribute information that is going to be the most salacious to you, and bad actors who are willing to manipulate that information.

In the end, I don’t know how much we can govern this just yet. I don’t know if we know enough; I don’t know if we’ve got the technology yet to do it.

But we can stop the spread of disinformation.

These days, in many schools, kids are being taught how to understand the consumption of information. I don’t recall getting much formal education on how to consume information, but it was less complicated when I was growing up.

But these days, this is important. Maybe this should be the kind of thing that Queen’s could do for the community, some seminars and public conversations about what healthy consumption of news looks like. Because this is a safety issue. It’s about your health. It’s about your prosperity. It’s really important that you have good information.

So, what can we do?

First, do no harm.
And that is the starting point for everybody here. If you consume information from social media, just consume it first. Don’t spread information you don’t know to be true.

And push back sometimes. If you’re part of a large social media community, you can say “Hey, where are you getting that info from? Because I’ve seen otherwise.”

You can cause the discussion to happen. So much misinformation is unintentional. It’s from people we trust. We – all of us – like to share information. Sometimes we like to be the first in our group to know something and to share something.

Stick to sharing the things that you actually know about. Let’s go back to using social media for healthy dialogue about issues, back and forth, as opposed to for misinformation.

Second, triangulate your information.

Find multiple, credible sources for your information. We curate the information we get by choosing the sources we get it from. If you are a liberal, find a conservative outlet that you can trust, even if you may not agree with it on everything. If you’re conservative, find a liberal outlet that you can trust. Again, the point is not for you to agree with it all the time. The point is to understand what’s out there in the ether, so that you can triangulate multiple, credible sources for your information.

News has always been biased. Don’t fool yourself about that. People have biases. But you can adjust for biases. Fundamentally, bias is not what’s going to destroy the fibres of democracy; lies are.

Ali Velshi, Artsci’94, LLD’16, is a journalist and TV anchor at MSNBC in New York. The author of two books, he is currently working on his third. Follow him on Twitter: @AliVelshi.

The Queen's Alumni Review is the quarterly magazine for Queen's University alumni. Compelling stories and photos make it a must-read for all who love Queen's.
