Polarisation is seen as a threat to democracy – and social media is seen as a cause. But what can be done? Does the blame really lie with tech alone? And what could the virtual public square look like if we dared to hit "reset" and redesigned our apps from scratch? A radical and counter-intuitive conversation between Chris Bail, head of the Polarization Lab at Duke University, and Samira Shackle, editor of New Humanist magazine, on tribalism, extremism, and not logging off. For fans of Azeem Azhar, Jonathan Haidt, Nick Srnicek and Shoshana Zuboff.

Podcast listeners can get a year's subscription to New Humanist magazine for just £13.50. Head to newhumanist.org.uk/subscribe and enter the code WITHREASON

Hosts: Samira Shackle and Niki Seth-Smith
Executive producer: Alice Bloch
Sound engineer: David Crackles
Music: Danosongs

Transcript:

Samira Shackle:

Hi there. This is With Reason from New Humanist magazine, with me Samira Shackle

Niki Seth-Smith:

and me Niki Seth-Smith.

SS:

This, as ever, is where we think about questions of reason and unreason belief and disbelief, criticism and debate, all through conversations with thinkers who speak to our turbulent times and who aren't scared to challenge lazy thinking.

NSS:

And today, we're talking to someone whose work is all about improving the quality of our public conversations, of our democracy, really. Here's Chris Bail, who heads up the Polarization Lab at Duke University in the US, where he's professor of sociology and public policy. Samira, I'm going to enjoy listening in on this before catching up with you at the end of the show. So I'll leave it to you to fill us in on Chris.

SS:

So reading Chris's work has made me think, again, not just about how we use social media, but also how we talk about it. His book, Breaking the Social Media Prism challenges the accepted wisdom on things like echo chambers and algorithms. And it suggests that if we really want to solve political tribalism online, the solution isn't just with some isolated thing called technology, but it's also inside ourselves. And he offers some quite surprising conclusions, actually. So, you know, we hear this idea that if we were to just step out of our echo chambers, then you know, we wouldn't be so divided. But his research shows that it might actually make us more polarized being exposed to opposite views, not less. So the book draws on a ton of clever online experiments and interviews. And it offers not just a picture of how things are, but how they could be. So not least, he introduces new apps and bots to help us engage in better conversations with the so-called other side. So Chris writes that his interest in political polarization is personal. So I asked him why?

CB:

You know, I was born in a very comfortable suburb of the United States and had a very comfortable upper middle class upbringing, until I was 11 years old, when my father announced that we were moving to the French Congo, so that he could do work on the HIV pandemic, which was then ravaging sub-Saharan Africa. And so I went from the comfortable life of, you know, playing video games and eating McDonald's, to arriving in a country that was really in the throes of civil war. And I saw shocking things, you know, things that still haunt me to this day. And as I got to know the people involved, our neighbours and friends, I discovered something that I think many of us discover, which is that people can be terrible to each other, even when they're very similar. And despite our enormous similarities, these small differences seem to have, you know, the potential to create violence and really awful conflict. And so, you know, the other thing that's happening at this point in time, this was now the early 90s, is the internet was taking off. And so the obvious questions are, you know, about how the internet is going to shape our ability to tolerate each other and get along with each other. And this has really been a lifelong obsession for me to figure that out.

SS:

So I guess that experience in the place that's now the Democratic Republic of Congo, was a reminder in a way that polarization maybe isn't an abstract thing, or some sort of worry of the chattering classes, it's something that can really have life or death consequences.

CB:

Exactly. Yeah. And even in the US today, you know, some surveys are suggesting that as many as 40% of Republicans and Democrats are willing to justify violence for political ends, so that, you know – we've not yet seen that on anywhere near the scale of, say, a civil war in the Congo and I'm not necessarily suggesting that we're headed in that direction – but certainly, you know, the potential for violence is obvious.

SS:

Well, that's an astonishing statistic. Yeah, I guess you're very well placed to know statistics like that. And to be watching this because you run the Polarization Lab, which brings together scholars from social sciences and statistics and computer science to think about how to bridge those partisan divides in America. Now, I wonder if you could tell me a bit about the methods that the lab uses to study and understand polarization?

CB:

Sure, you know, we are in a kind of golden age for social science, and social media and the internet more broadly have really unlocked a treasure trove of information about human behaviour. And you know, for many years now, we've been able to collect large amounts of data about what people think and say to each other online, and that created, you know, a real excitement that we would be able to crack kind of long-standing challenges in the social sciences. But you know, more recently, we've discovered that social media is only part of the story. And of course, you know, we still need the human connection, and to understand how technology comes in and out of people's lives. And so studying political polarization and developing new technology to address it is the mission of our Polarization Lab. What I write about in my new book is how to make our platforms less polarizing, but we need to approach it from multiple angles.

SS:

It sounds like a very worthwhile thing to be working on at this moment where we hear so much about polarization. And yeah, I wanted to think about that term a bit, because it's a term I think that's been used a lot in recent years, of course, to do with US electoral politics, also here in the UK around the Brexit referendum and the political divides that ensued after that. I wonder, are we more polarized than ever? Or is it just that we're more aware of it? Or that we talk about it more, perhaps, or that it's more visible because of social media?

CB:

Yeah, I mean, you know, unfortunately, so much of our evidence comes from the US. And, you know, I think we focus on the US too often. And there is evidence that polarization is going in different directions in different countries. But in the US, it's fairly clear that polarization is high. Now, it depends how we define polarization, as you said. On the one hand, we can define polarization as, you know, how much people agree or disagree about social policies. And those levels actually have been pretty steady over the last 20 years in the US, but 20 years ago, the US was a deeply divided place. What's more concerning, and seems to be rising much faster, is what is often called affective polarization, or, you know, negative attitudes between members of rival political parties, independent of the content of their ideas. And about 10 years ago, for the first time, out-party hate, hating the other party, became more common than loving or approving of your own party. And so we're really seeing a downward spiral in this affective polarization in the US today.

SS:

That's fascinating. I guess, obviously, as you're saying, going back 20 years, the US was a pretty polarized place, then as now. So in the kind of social media era, is it that those existing trends have been accelerated and exacerbated? Or do you think that something different is happening because of the way that we use social media politically?

CB:

It's very difficult to isolate the impact of social media. I think everybody wants to know the answer to that question: is social media the problem, is it the root cause? In the US, it's pretty clear that there were historical antecedents to the internet and social media that were shaping polarization, things like perhaps the fall of the Soviet Union, which maybe had a unifying effect on Americans, things like the rise of cable news and the market segmentation of the media, which may have created incentives for you know, more outrage in the news. There's a number of contributing factors. But you know, of course, today, anybody who spends any time on social media can see that social media at the very least, does not seem to be helping.

SS:

Yeah, yeah, absolutely. And I guess we should clarify, probably, having brought up this phrase, social media: are we talking about Facebook and Twitter, which I guess are some of the biggest platforms? Are we talking more generally than that?

CB:

Yeah, I mean, you know, certainly I would include Facebook and Twitter in that definition. You know, some people might include YouTube; I think that's a little more of a grey area. But, you know, a tool that allows us to connect and share information and have conversations across, you know, great geographical distances, and asynchronously, is basically how I would define social media.

SS:

When we are talking about social media in this broad sense, another term that often gets used is “echo chambers”. And again, you know, we've already talked briefly about the US electoral politics and Brexit here. But those are two subject areas where I think the term echo chambers has been used very, very frequently, although you point out in your book that the term was actually introduced in the ‘60s, which was interesting. So I wondered if you could just tell me a bit about the conventional wisdom on how echo chambers work and how they contribute to polarization. And actually, how your research has challenged all of that. So I know you did some experiments on this at the lab?

CB:

Sure. Yeah. You know, unfortunately, it seems like a lot of the prevailing narratives might not be right. So the idea is something like this: the echo chamber refers to our kind of all too human tendency to surround ourselves with like-minded people. So we did this long before social media, we know. But social media makes it much easier to do that. And the concern, of course, is that this creates a kind of myopia, where we only kind of come to reinforce our pre-existing views. And we become less able to, for example, see that there's two sides to every story or perhaps empathize with each other. And of course, you know, there's additional concern that algorithms within social media companies have amplified this and reinforced this human tendency to surround ourselves with like-minded people, creating what's sometimes called a filter bubble.

So I think for so many of our unexpected events, like the Brexit referendum or the election of, you know, former President Trump, this seems like such a great explanation. You know, many of us were shocked because we simply weren't seeing the other side. And in 2017 this was very much my own view. But at the time, I realized that no one had really scrutinized this idea. And so we decided to do a study to try to see if taking people outside their echo chamber would make them more moderate. We basically recruited about 1,220 Republicans and Democrats who use Twitter. And we asked them to complete a lengthy survey about their political views. And then about a week later, we invited half of them to follow a Twitter bot that we created that exposed them to 24 messages a day from a member of the opposing political party. And when we went to re-survey them a month later, to try to identify the effect of stepping outside one's echo chamber, we were pretty surprised. We had, of course, hoped that this would create moderation, and we saw no evidence of moderation. To the contrary, liberals seemed to become a little more liberal when they followed the Republican bot. And Republicans became substantially more conservative when they were exposed to our Democratic bot.

SS:

Yeah, it's absolutely fascinating, because, as you say, the accepted wisdom, which I think is often almost spoken about as if it's a truth, is that, you know, if we could just break these echo chambers and filter bubbles, then we'd have the key to unlock some greater level of understanding. And yet you say, quote, “The common wisdom about social media, echo chambers, and political polarization may not only be wrong, but also counterproductive.” And you actually suggest that our whole relationship with social media isn't what we think it is. We might think we're using it as a mirror, but in fact, it's more of a prism. That is obviously also the title of your book. I wondered if you could explain what you mean by this phrase, the social media prism?

CB:

Let’s start with a statistic: about 73% of all tweets about politics in the United States are generated by 6% of people, and that 6% of people have extreme views. And so what this creates is kind of a prism-like effect, where we see the amplification of the extremes and moderates can seem all but invisible. I think we've all had that experience, you know, logging onto social media and asking ourselves, you know, where are all the moderates? On the other side, I'm only seeing extremism. And what we're seeing, in fact, is what I call the social media prism, which is the tendency for us to misunderstand the breadth of polarization, because of the structure of social media and how it incentivizes certain types of human behavior that lead to an outgrowth of extremism and discourage moderation.

SS:

You talk about the way in which political identities can override our rational instincts on social media.

CB:

Yeah, I think we all want social media to be a competition of ideas, you know, in its idealized state, that's maybe what social media should be, right? We should be discussing the issues of our day, presenting rational arguments and arriving at some type of compromise. But clearly, I mean, anybody who's spent time on social media knows this is the exception, not the rule. It's probably more like a competition of our identities than our ideas. And the reason for this, I think, is very simple, which is that, you know, identities are central to everything we humans do. It's what makes us unique creatures. Each day, knowingly or unknowingly, we present different versions of ourselves, we observe how other people react. And then we tend to start to cultivate the identities, or the presentations of ourselves, that make us feel good about ourselves, that give us status.

So social media, I think, has done two things to the ways that we create our identities that have largely deleterious consequences for everyone. The first is that we have unprecedented flexibility to present different versions of ourselves: I can, at the very least, emphasize certain parts of my identity and downplay others. And the second thing social media has done is change the way that we monitor our social environment for clues about which of the identities we present are kind of working. So we have like buttons, you know, notifications that give us instant gratification, not through some kind of simple flashy eye candy, which I think is the relatively simplistic idea that's out there, but because it's really helping us do something that we have a strong instinct to do. And that's to learn how to gain social respect and status and feel good about ourselves. This is our core instinct. So social media, in other words, has thrown this instinct into hyperdrive. And of course, the tools we use to monitor our social environment, things like likes and follower counts, and so on, create a very distorted picture of what's really going on.

SS:

And this idea of status seeking, when thinking about it in the context of polarization, I guess also connects to our need for a group identity, which is also something that you talk about in the book.

CB:

Yeah, well, first, let me detail that point a little more. I'd love to tell you some stories of the people that we met in doing this research. And the story I'd love to tell you is about this guy we'll call Ray, we use a pseudonym to protect his confidentiality. One of the unique things about this book is that we're not only looking at things from 30,000 feet, at millions of social media users, but we're also trying to take the perspective of the individual social media user, and particularly how things can be so different online and off. And when we first interviewed him, this guy, Ray, is a moderate conservative, and when you meet him, he's very polite, he's deferential. He even goes out of his way to, say, decry racism and complain about people arguing online, who, as he says, probably live with their mom in her basement. And so, you know, when we began to track him online, I suspected we would find this relatively civil, kind man. And instead, what we discovered is that he transforms from Dr. Jekyll into Mr. Hyde every night. In fact, he's one of the most prolific political trolls that I've ever seen, and I've been studying this stuff for more than a decade. He just shares meme after meme denigrating “liberal hypocrisy” in some of the most vile and unspeakable ways possible. And so, you know, the interesting question is, what creates this transformation? Why does this guy Ray, you know, make this transformation? And when we began to merge the data from our interviews with the social media data and the survey data that we had collected from him, we discovered that Ray is actually a recently divorced, middle-aged man who lives with his mother, exactly the type of person that he had been, you know, denigrating in the first part of our interview. So he was really, you know, maintaining two different lives. And as a political minority in his everyday life, you know, he works in a place with lots of liberals, I think social media created a kind of refuge for him and a source of status that was deeply motivating to him, no matter how deleterious for the rest of us.

SS:

And when you talk about people getting enmeshed in political extremism, and I guess uncivil behaviour as well online, you compare it at one point to the way in which people behave when they're in a cult or joining a cult. So yeah, I wanted to ask about that. Because I guess, and again, it comes back to these different definitions of extremism, but we might think about that cult-membership sort of comparison when thinking about far-right groups, or ISIS, or the other kinds of things where people get radicalized online, but maybe not so much when thinking about someone who's an extreme supporter of the Republicans or Democrats. So yeah, I wondered about that.

CB:

The question we need to ask is what motivates us to use social media. I mean, does anyone really think they're changing anyone else's mind? Recent research by the Pew Research Center suggests that, you know, almost nobody is changing their mind based on something they see on social media. And I think many of us know that. But so many people are still motivated to argue kind of out into the ether. So what are they doing? Are they really trying to persuade others? Or are they trying to signal to people on their side the strength of their identity and the strength of their commitment? And that's where I observed these cult-like dynamics, you know, strong penalties when people, for example, unfollowed each other, and antagonism among the extremists themselves, you know, really not only egging each other on to take more extreme positions, but then also attacking those who don't fall in line, especially members of their own party.

SS:

You're listening to With Reason from the Rationalist Association and New Humanist magazine with me, Samira Shackle, where I'm talking to Chris Bail, who heads up the Polarization Lab at Duke in the US, all about political tribalism online. More from Chris in a minute, but time now for a quick word from our deputy editor, Niki Seth-Smith.

NSS:

If you're enjoying Samira’s conversation with Chris and want to hear more from With Reason, take a couple of seconds right now to tap “subscribe” in the app that you're using. It helps us to keep making episodes for you and means you'll be the first to hear when Season Three begins. And if you're especially interested in today's subject of polarization, you might want to click back through our archive, you'll find the anthropologist Joe Webster talking about the value of listening to people you disagree with. And you'll also find the poet Michael Rosen reflecting on the peril and promise of social media that's in the episode before this one. Back now to Samira and I'll be seeing you at the end of the show.

SS:

Yeah, so Chris, one of the things that you found in your research is that political moderates are silenced on social media, while more extreme voices get a bigger audience. So yeah, how does that work? Is it a mix of technological or algorithmic factors and also the human side, so I guess, like, moderates disengaging from platforms and extremists amplifying each other by arguing with each other?

CB:

Let me tell you another story about one of the people we met in the research to answer this question. We'll call her Sarah. She's in her early 30s. She's a moderate conservative. She's from New York City, which is a very liberal part of the United States. But her father is a police officer, and that makes her, you know, have somewhat conservative views on things like, you know, police violence and handguns and things like that. She's also half Puerto Rican, though, so she has, you know, some liberal family members who are very strong in their views, and she went to a prestigious, liberal college. So, you know, she has lots of friends who are liberal. She's kind of exactly the type of person who, you know, if there is a middle in divisive debates on things like race and policing, Sarah is the type of person we need to hear from.

But the problem is, on social media she seems all but invisible. And here's why. So, you know, when we first interviewed her, we asked her, as we did most people, you know, tell us about the last time you used social media, what happened. And she says, “Well, you know, the thing that sticks out was a few months ago, I was up late one night, I'd just got my children to bed. And I saw a post about handguns by the National Rifle Association”, which is this group in the US that promotes the right to own guns. And she simply commented on the post saying, “Yeah, it's Americans' right to own guns. And my husband is a responsible gun owner and deserves the right to protect our family.” That's pretty innocuous in the landscape of American gun debates, right? Then she tells us within minutes, her phone was lighting up with replies. You know, the first one was someone who had apparently scrolled through her Twitter feed, saw that she had children, and then posted, “I hope your kids find your gun and shoot you.” And this is the type of experience which, unfortunately, is all too common. In fact, the Pew Research Center, a large survey organization in the States, recently discovered that the most common reason that people get harassed online is their political views. So when you think about it, for someone like Sarah, engaging in a political debate, and particularly sharing her moderate views, risks inviting harassment not only from extremists on the other side, but also on her own side. It's kind of a lose-lose thing. And for people like Sarah, who, by the way, has a very happy life offline, and, you know, wonderful kids and a nice house, there's really nothing to be gained by voicing moderation on social media. Whereas for someone like Ray, the Dr. Jekyll and Mr. Hyde character I was discussing earlier, there's every incentive to be extreme. And so we need to think more about the incentive structure of social media and why it encourages extremism and mutes moderates like Sarah.

SS:

That’s interesting. And so would you say that those sort of personal decisions that people are making matter more than some of the things we hear about, like, the idea that posts with more engagement, which might be the more extreme content, gets boosted and so on?

CB:

Yes, yeah. You know, there are so many prevailing narratives out there; we've already talked about the echo chamber. But there are some other prominent ideas, like the idea that foreign governments, notably Russia, have tried to interfere in the US and elsewhere and divide people against each other. And, you know, this is a very intuitive idea, and I think many of us were very concerned about that. But once again, we discovered that no one had really scrutinized this idea. And we had a unique opportunity, with some data that we had collected, to measure what happens when someone actually interacts with one of these accounts linked to the Internet Research Agency, which was allegedly run by the Russian government. And of course, we expected people to be changing their views and becoming more antagonistic towards each other. And instead, what we discovered is that interacting with these accounts didn't seem to change people's attitudes, or even their behavior, very much. And there again, this was a neat, tidy explanation: oh, it's just the echo chambers and the foreign misinformation campaigns, and if you put them together, it explains what happened. But really, you know, the somewhat disturbing conclusion that I've come to, but one that also eventually gave me hope, was that it's about our human behavior. It's that we are the ones producing the vitriol and we are the ones polarizing ourselves. Even if Facebook, Twitter and other platforms enacted sweeping reforms, which by the way I think they should, and I think there's a lot that platforms could do better, we would still be left with a really polarized landscape. And it's up to us, the social media users, to change that, I think, and that's what gives me hope. Because even though we might be driving the problem, it means we also have the power to produce solutions.

SS:

And going back to Sarah. Yeah, I can certainly empathize with that, I think, you know, people like her deciding not to post about politics because it's kind of not worth the hassle, almost, with unpleasant arguments and so on. I definitely post less about UK politics on Twitter, for instance, than I used to. But you make the case that someone like Sarah, a moderate, should be fighting that instinct and posting more. Is that kind of what you're saying? For the greater good?

CB:

Yeah, you know, I'm a little worried about the “delete your social media account” movement. Now, for the record, I for a long time thought that social media was a net negative, and I still have deep concerns about our current trajectory. But what I became even more worried about is whether there are any alternatives. So, you know, I think so many of us would love to say, you know, just meet up with each other in offline settings, you know, maybe get a beer at a pub in London, and maybe that can produce a better understanding of how Brexit happened than anything that could possibly happen on social media. And that's probably right. But at the same time, turn our attention towards young people, who are on social media in unprecedented numbers; in fact, they grew up with social media. And then we begin to ask the question of geographic political segregation in the US, for example. A recent study from Harvard indicates that, you know, the majority of Republicans and Democrats will almost never interact with each other in offline settings. And then you throw the pandemic on top of that, and it really seems that social media, for better or worse, and probably for worse in the short term, is going to continue filling that role, and may be one of the few places where we can actually have cross-party conversations. And so, you know, if we all do it, that counts. My concern is, of course, that the people deleting their accounts are people like Sarah, the moderate Republican woman I just described. And then that's only going to make things worse, it's only going to drive the misperception that extremism is pervasive, and moderation simply doesn't exist.

SS:

Yeah, it's interesting, because I guess as a researcher looking at polarization, it seems very clear why a moderate should continue to post, but I guess lots of people do just want to get through their day without being shouted at by lots of strangers. So yeah, I wonder, what do you kind of think is the solution to that? And we can talk more about solutions later. But yeah, I wondered what you sort of think about those motivations and incentives?

CB:

Well, I think we need to think about them from sort of the bottom up and the top down. I think we have the most leverage from the bottom up, to the extent that, you know, we social media users are creating these cleavages ourselves, through these all too human tendencies to protect our own group and denigrate the out-group. But I think the first thing we can do is learn to see the social media prism. And the reason I call this book Breaking the Social Media Prism is because I think just simple awareness goes a long way. So most people who hear that statistic I mentioned earlier, that 73% of tweets are created by 6% of people who have unusual views, it helps them understand that when they're seeing someone with opposing political views, it's probably not a representative member of the other party. And it kind of breaks the chain of what we sometimes call false polarization, the gap between perceived polarization and actual polarization that can make us all feel so helpless. And so, you know, there are a number of ways we can do this. Awareness is great, and a lot of books like mine end with a set of prescriptions about how we should all be better humans and, you know, be more aware of the darker angels of our nature, and blah, blah, blah. But that's hard to do. You know, it's hard to just stop defending your side's identity and stop attacking the other side's identity; these things become kind of unconscious.

And so one of the things that we wanted to try to innovate was some new technology that would help people, you know, public tools that people could use to avoid extremists, for example. So if you go to PolarizationLab.com, your audience can try out our trollometer, which is a tool that allows you to input characteristics of a social media user and monitor the language that they use and see the likelihood that they might be a political troll or an extremist. You can also learn to see how the social media prism refracts you. So you know, you can use our apps to, for example, get an ideology rating of your tweets. And you can take a quiz that allows you to compare what you look like online to what your views actually are.

And secondly, you know, in addition to this awareness, we really need help finding people on the other side who have moderate views. So we created something called the bipartisanship leaderboard, which is a ranking of prominent Twitter users, politicians, journalists, media organizations, even celebrities, and we track how often they get likes from members of both parties. And the idea is to try to create a kind of status around moderation, where right now there's none. So we're really trying to innovate, and, you know, technology, at its best, can optimize for democracy. But we need to know what we're shooting for. And we need to know that, currently, the status incentives are just horribly, and even horrifically, misaligned.

SS:

Yeah, that's so interesting. I take the idea that with these tweaks (although obviously, as we've said, so much of this has to do with human behavior) you can still make that change.

I want to move now to our archive bit. So this is something we do on With Reason where we turn to the New Humanist archive for a piece that speaks to our guest's work. So today, Chris, I wanted to ask you a couple of questions prompted by a piece we published last year by Nicola Cutcher, which is called Does the Left Have a Problem with Empathy? And it's about how political polarization plays out in the interpersonal realm. So it's these trends, like people (this is here in the UK) writing notes on their dating profiles, for instance. So when it comes to polarization and the way it affects our personal lives, do you think that there is any difference between left and right? One of the things that she looks at in the piece is this idea that people on the left are somehow harsher in their moral judgments of right-wingers than the other way around. I wonder if you have any thoughts on that?

CB:

Yeah, it's interesting. You know, what's the saying, the US and the UK are two countries divided by a common language? We're so similar and yet so different. In the US, the debate is kind of running the opposite way; we've come up with this concern about asymmetric polarization. And the concern is often that the Republican side is lacking moderation and empathy, and not the liberal side. Now, that seems most evident at the elite level, where, you know, we see elected officials seem practically incapable of any kind of compromise right now, and that seems to be driven, historically, by conservatives in the United States. But when we get down to ordinary people, you know, I think what we sometimes call the empathy gap is very easy to observe. Another issue here is, you know, these things kind of sway with elections, right? It could be that in the UK right now, liberals, being out of political power to some degree, you know, with Boris Johnson, are just more angry, and that things might swing if a Labour politician were in power. And conversely, of course, in the US right now, with Biden in power, it could be the Republicans that are particularly angry, although we saw no shortage of liberal anger during the Trump years, of course.

SS:

Yeah, yeah, of course, certainly no shortage of that. Nicola also cites a study by Jonathan Haidt, the influential American thinker on polarization, in which he found that people on the left, I guess liberals in the US parlance, were worse at understanding people on the right than the other way around. So when asked questions about the views of people on the right, liberals were sort of worse at guessing what their opponents' views would be than vice versa. And I wondered if that tallies with your research or not, really. Did you find that people on both the liberal and Republican sides make the same kind of wrong assumptions about others' beliefs?

CB:

More the latter. We've mostly found that misperceptions are pervasive on all sides. And there's some interesting new work that took this far beyond the US and the UK, into, you know, actually dozens of countries, and just looked at all sorts of rival groups. And what we see is, time and again, we misperceive the intentions and, you know, strength of commitment and desire for compromise of the other side. So, you know, in the US, there's an interesting study: basically, a set of Americans were asked, if you're a Democrat, who do you think the Republicans are? And the Democrats say something like, well, Republicans are very wealthy, they are evangelical Christians, and they live in rural areas. Then if you ask Republicans, well, who are the Democrats, they'll say, they're young, they are ethnic minorities, and they live in cities. And of course, the reality is that the average Democrat and the average Republican is a middle-aged white person from a suburb. And so, in an interesting study, some political scientists have corrected that misperception and shown really appreciable gains in depolarization. So it seems, to some degree, that correcting these misperceptions is an obvious first step here.

SS:

Well, before we wrap up, let's talk about what can be done about all this, how we can reduce political tribalism online, you spoke about some of the tools that you've looked at. But I wondered as an individual as a, as a social media user, what could I do right now today to help the situation?

CB:

I think, you know, consider this a kind of civic duty. You know, we're all pretty content to detach ourselves from politics when it becomes difficult, and certainly in Britain and the US we're in an extraordinarily difficult time. And, you know, we have every inclination to kind of retreat away from social media, because we've had such negative experiences. But if we recognize this as a collective responsibility, then I think we can begin to rethink how important it is to carefully reflect when we use social media. Is the person I'm interacting with an extremist who maybe isn't worth my time? Or is this the type of person that might actually go and talk to others about the issue that we're discussing? And could it have a scalable effect? So we need, you know, more reflective social media users. But we also need top-down solutions. It would be naive to say, you know, everybody just go use the tools on Polarization Lab and everything is gonna work out. Those are tools to promote awareness and to create a conversation. What we really need is a paradigmatic shift. You know, we've been too content to allow platforms that were really originally created for very banal, or even sophomoric, purposes, things like helping people rate each other's physical attractiveness, or, you know, in the case of Instagram, arranging alcohol-centric gatherings. Why should we expect these platforms to serve the complex communications needs of modern democracies? But more fundamentally, we've never asked the question: what should be the design principles of social media? If we could redesign social media from scratch, you know, knowing what we've seen so far, how would we redesign it differently? And how would we optimize the platforms to promote social cohesion instead of the incivility which is so obviously spreading right now?

SS:

Yeah. So my own perspective on this right now is that I've just published a story about COVID conspiracy theories, and my personal Twitter mentions have been a complete bin fire for the last 24 hours. So, I mean, yeah, maybe I've got that perspective of someone who's sort of wilfully disengaged for the last bit of time. But it's nice anyway, that being as it is, to hear someone talk in an optimistic way, that's not just diagnosing the problems, which I think are very, very obvious, but thinking about how things can improve.

CB:

So here's the thing. You know, if we take the long view for a moment, we're at the very early history of social media. We know that all these technological shifts take time to really settle out. Even in the short history of social media, every two or three years, you know, some new platform has come along. And I think, given broad dissatisfaction with social media right now, there's a real opportunity to innovate, and some entrepreneur out there who, you know, is willing to do the work is going to discover a much better design. The problem right now is that the conversation that we're having about design principles is just speculation, and sometimes even self-interested speculation. So we have apostate tech leaders, you know, now telling us that they uniquely understand human behaviour and how to fix it. We have politicians really trying to take down the other side in debates about government regulation. But what we don't have is really cold, hard evidence about what works and what doesn't work.

And the other problem there is, of course, the platforms themselves, which have thus far been very unwilling, especially recently, to allow research on these key design principles on their platforms that would allow us to move the needle. And there are many good reasons for this: PR concerns, legal issues, ethical issues. And so in the Polarization Lab, what we decided to do was create our own social media platform for scientific research, and pay people to use it. And this gave us a unique opportunity to turn on and off different features of social media, and try to figure out which ones were polarizing and which ones could be used to maybe increase social cohesion.

SS:

Well, that's probably an optimistic place to end. Thank you so much, Chris.

CB:

Thanks so much.

SS:

So that was Chris Bail, Professor of Sociology and Public Policy at Duke University in the US, where he heads up the Polarization Lab. Yeah. So, Niki, you've been listening along to that conversation. Did anything in particular jump out at you?

NSS:

Yeah, I was really interested in that point, that people might be more motivated by expressing their sort of hatred or dislike of the other, like the other political group or political party, than they are actually by the sense of affiliation and love and belonging for their own side. I had an inkling that that sort of might be the case. But yeah, it was quite striking to sort of find that there's been a survey saying that it really is, at least in the American context.

SS:

Yeah. I found that very striking as well. There's a study that he mentions in the book where sociologists took groups of boys and basically assigned them completely random identities, and they ended up sort of becoming enemies during this brief summer-camp-type arrangement they put them in, attacking each other's camps and so on, which I think is, you know, kind of Lord of the Flies-esque, but maybe tells us something about the way in which even quite arbitrary group identities can become so powerful. But it would be interesting to know how what he was talking about translates to the UK context. Certainly, seeing political debate play out here in recent years, you sort of wonder if we do have shades of that, don't you?

NSS:

Yeah. And if you accept that theory that maybe the motivation is not to change people's minds, but just to kind of signal your own tribalism, I guess that that is what puts off moderate people from posting. And you were saying that you had an experience around this recently to do with your Guardian article, which of course would be, you know, very, very fair and balanced, talking about COVID scepticism.

SS:

Yeah, so it's a piece that was looking at the broad anti-lockdown and COVID-sceptic movements and conspiracy theories around COVID. And it's predictably made lots of people who subscribe to those views very angry. And my instinct is just to sort of disengage. I've had lots of people kind of shouting at me on Twitter and so on. Yeah, I think there are lots of people who do just kind of want to fight, and I generally prefer not to engage, really, because I can't really see what benefit there would be, especially when you can just kind of close Twitter down and not look and not reply, you know, you can sort of walk away from it. Which, I guess, I mean, maybe not on this specific issue where you're getting attacked over something, but I guess is maybe the opposite of what Chris wants moderate people to do. You know, he wants you to sort of wade in and change the discourse.

NSS:

It was quite interesting, quite heartening in a way to sort of have that idea that it’s a civic duty as a moderate to keep posting, I've never sort of really heard that message being put to me in that way before. But I mean, what do you think? You're not gonna delete your Twitter account anytime soon?

SS:

Yeah, I mean, it's definitely something I've thought about. It's quite hard as a journalist to meaningfully delete your social media, I think, because you need that sort of constant drip of self-promotion.

Well, assuming that you've not deleted your social media accounts, that's it for Series Two of With Reason. Please do share it online. Give us a rating or review on whatever app you're using right now to listen, and you can always tweet us @NewHumanist. We hope to be back with you in the summer. And until then, remember, you can find reading lists and transcripts for every episode of With Reason on the New Humanist website. And you can also subscribe to the magazine. That means four beautifully designed editions delivered straight to your door for just £13.50 if you head to newhumanist.org.uk/subscribe and enter the offer code WithReason.

NSS:

This podcast was presented by me Niki Seth-Smith and Samira Shackle. Our executive producer was Alice Bloch and our sound engineer was David Crackles. Stay well and see you back here soon. Bye for now.

Further Reading: