How to Be a Better Human
Andrew Marantz doesn’t want you to give up on the internet
January 30, 2023
[00:00:00] Chris Duffy:
You are listening to How to Be a Better Human. I'm your host, Chris Duffy. I grew up in New York City, and so one of the big differences between my childhood and friends who grew up in other places is how little time I spent in cars.
Starting in around fifth grade, I could walk over to a friend's house or hop on the bus on my own. I remember the first time I took the subway on my own, my dad secretly followed me the whole way, and he rode in the car right behind me. I actually didn't know it at the time; he only told me that he did this recently. But pretty soon after that first trip, he got comfortable with me on the train, and I was comfortable too. And if I wanted to go somewhere, I would just take the subway, regularly, all over the city.
I have so many memories of sitting calmly on the train as it roared along, peacefully reading my book. So it was kind of surprising when, as an adult living in a different state, I owned my first car and finally started driving to work regularly. I was really surprised by road rage, and not just other people's road rage, but my own, because I cannot think of a time when I ever cursed someone out or flipped someone off while I was walking down the sidewalk or riding the subway. I just can't imagine that.
But inside of a car, I would often just find myself boiling with rage. I remember this one time, a car cut me off at a crowded intersection, and I had to slam on my brakes to avoid crashing. And I was so angry at this person for doing this dangerous move and for being such a fool.
And I was yelling curse words at them, and they were yelling curse words at me. And then I pulled alongside the car, ready to throw my middle finger out the window at them, and I saw that this person I was so furious at was an elderly woman. And I just instantly felt so embarrassed. And listen, to be clear, she was still cursing me out. She was still furious. She was flipping me off, but I just felt like, what am I doing? It was probably a 30-second interaction maximum, but it felt so bad, and part of the reason it felt so bad is that I was kind of shocked to discover that I had that inside of me. There are situations where, for each of us, it is very difficult to be our best selves. In fact, there are situations that bring out the absolute worst in us.
Journalist Andrew Marantz has spent years studying the way that social media and the internet often put us in those situations where we are metaphorically cursing out the person in the car across from us. So what does that mean for our culture, for our society, for democracy? And is there anything we can do to change that situation for the better? Here's a clip from Andrew's TED Talk.
[00:02:27] Andrew Marantz:
Facts do not drive conversation online. What drives conversation online is emotion. See, the original premise of social media was that it was gonna bring us all together, make the world more open and tolerant and fair, and it did some of that.
But, the social media algorithms have never been built to distinguish between what's true or false, what's good or bad for society, what's pro-social and what's anti-social. That's just not what those algorithms do. A lot of what they do is measure engagement: clicks, comments, shares, retweets, that kind of thing.
And if you want your content to get engagement, it has to spark emotion. So we've ended up in this bizarre dynamic online where some people see bigoted propaganda as being edgy or being dangerous and cool. And people see basic truth and human decency as pearl-clutching or virtue signaling or just boring.
And the social media algorithms, whether intentionally or not, they have incentivized this, because bigoted propaganda is great for engagement. Everyone clicks on it, everyone comments on it, whether they love it or they hate it.
[00:03:27] Chris Duffy:
We will be right back with more from Andrew Marantz after this quick break. And I promise you the conversation that we are gonna have is going to be fully optimized to spark your emotional engagement. So, do not go anywhere.
[BREAK]
[00:03:49] Chris Duffy:
And we are back. Today, we're talking with journalist Andrew Marantz about how social media and the internet are built for emotion.
[00:03:55] Andrew Marantz:
Hi, I'm Andrew Marantz. I'm a staff writer at The New Yorker, and I also wrote a book, which is called Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.
[00:04:08] Chris Duffy:
So obviously you study some pretty dark stuff. You have done a lot of research into worlds that many of us would avoid. How did you get into this in the first place? Why is this how you decided to spend your time?
[00:04:21] Andrew Marantz:
Yeah. I wonder that a lot. I, um, what, what I started with was kind of noticing, okay, it seems like the way we communicate and understand the world is increasingly through these algorithmic social platforms.
Seems like those algorithmic social platforms are really gnarly places to be, and this is before even getting into any of the empirical social science of it. It just feels personally, anecdotally, like it's sometimes really gross to hang out on the internet. And then, my kind of narrative reporter instinct was to say, “Okay, rather than, you know, just talking to experts or coming up with a kind of polemic screed about why the internet is bad, why don't I go try to find the people who are making it bad and see if I can hang out with them and specifically watch them do what they do?”
So part of it was talking to them, but part of it was actually sitting next to them as they did it and saying, “Okay, if you're gonna try to use social media to break democracy, can I be there while you do it?” And still, weirdly to me, a lot of them said, “Yeah, sure.”
[00:05:26] Chris Duffy:
So much of the parts of your book that are, like, fascinating and gripping and read, like, “how can this be real?” are those moments where you're like right next to someone who is doing something that I think many people, regardless of where you sit on the political spectrum, would think of as, like, objectionable and, and really deeply problematic.
And yet, they're kind of just happy to be getting attention in any way. And so you feed into that as a journalist of, like, “Great, like I'll give you access because that's more attention. Who cares if it's gonna make me look bad?” Obviously, as a journalist, mostly you're focused on documenting this stuff and getting it out in the open, but you must have also had ideas about how we can make the internet less of a gross, troubling, scary place.
[00:06:04] Andrew Marantz:
Yeah, first I wrestled all the time with the ethical conundrum of having a transactional relationship with people who I fundamentally distrusted, somebody who doesn't think our democracy should exist or doesn't think I should exist as a Jew or doesn't think, you know, trans people should exist. Like that was very, very uncomfortable for me to be playing into, as you say, giving them attention that they craved.
And so I had a constant… There, there's no one-size-fits-all answer to that ethical conundrum. Right? When do you give those people attention and when do you not? And so, my editors and fact-checkers and I, and you know, people in my life, we would just constantly try to gauge, okay, when does the value of, as you say, exposing or showing patterns or informing readers about how this stuff works, when does that outweigh the cost of essentially entering into an attentional transaction with this person who I fundamentally think is kind of a bad-faith player?
[00:07:00] Chris Duffy:
And you've been doing this for so long too, that, you know, you started this reporting at a time when people were like, “Oh, come on. What people say on the internet doesn't matter. The internet isn't real life.”
[00:07:09] Andrew Marantz:
Yeah.
[00:07:09] Chris Duffy:
And then obviously, we've all seen that what people say on the internet often has very real consequences in the real world. And the whole idea that the internet isn't real life has kind of fallen apart a little bit, even if there are elements of truth in that sentence.
[00:07:23] Andrew Marantz:
Yeah, yeah, yeah. I think we are, we're thankfully sort of beyond the “Is Twitter real life or not” debates of, you know, 2016 or '17. I think there is still truth to, let's say, the idea that maybe you can win this or that political campaign by ignoring what people on Twitter want you to do and paying more attention to what constituents on the ground want you to do. Right?
So there are still versions of “the internet is not real life” that you could kind of salvage, but the notion that, yeah, as you say, when I started doing this stuff in 2014, 2015… Yeah, I did get a fair amount of people saying, “Okay, so there's some fringe people who have a website somewhere. Like, who really cares?” And yeah, you don't, you don't get that much of that anymore.
[00:08:08] Chris Duffy:
Are there things that the regular person can do to make the internet less of a cesspool or less dangerous or even on the positive side, like a friendlier, nicer place that they enjoy being more?
[00:08:20] Andrew Marantz:
Sometimes what, what people will ask me for is like, “Okay, what are the, you know, five commandments of internet life that will equip me for any situation,” and like anything, the nostrums, the, the rules of thumb are not gonna get you all that far. So, going back to what we were just talking about, 2015, 2014, the kind of basic logic of “don't feed the trolls” had not yet been mainstreamed. So you got a lot of this, and you still see some of it: amplification along the lines of, “Can you believe this awful person said this thing?”
And that, I think, is one good place to start: you do not always have to amplify everything that you strongly dislike or strongly like or have an emotional reaction to. I think it's surprisingly alluring, the temptation to say, “I saw this thing and it freaked me out, and I want everyone to know about it.”
And I just think it's important to step back and remember: that was how Donald Trump ran for president. Not even in 2016, but he tried to launch a run in, in 2012, essentially around, uh, being a birther, you know, not believing that Barack Obama was born in the United States, a thing that nobody had to pay attention to, right?
Like, “conman business guy says ridiculous thing” does not inherently have to be a story. It was only a story because it incited and enraged enough people that they felt, “We have to tell people that this guy is saying this outrageous thing.”
[00:09:50] Chris Duffy:
The idea of, like, “you could always not” as the first rule of the internet. I love, I mean I think it's really funny and obviously like instantly, anyone who's been online realizes like, “Oh yeah, that is an important thing that people could know.”
It also makes me think… I used to work in an elementary school. You are the parent of two young children, and sometimes when I am out in the real world or when I'm interacting with people online, I just realize how all of us, all adult humans, are also just small children, because the number one thing that a kid does is, like, if you give them a big reaction, they're gonna keep doing that thing. Whether it's a positive reaction or a negative reaction, like, “Oh, I got attention, that's a thing that I can do.” So, I wonder if you, as a parent of young kids, see the, like, connection between how you interact on social media and how you interact with your sons.
[00:10:35] Andrew Marantz:
Yeah. I, I, I see it all the time. You know, I'm the parental figure in my house, and the parental figure in the house that is social media is Mark Zuckerberg or Elon Musk. And so, they are making paternalistic choices, whether they admit it or not, about what the, the users of their platform are doing. I call them, in my book, the “new gatekeepers,” and I call them that sort of pointedly because that's the last thing they want to be called, right?
The people who run social media platforms, they want to be seen as liberators. They want to be seen as “we're taking down the gatekeepers. We're disrupting. We're innovating.” And so, they don't take responsibility for the power that they have, which, as anyone who has seen the Spider-Man movies knows, is against the rules.
These people have this immense power and responsibility to be shaping people's behavior, and they are shaping people's behavior, whether through commission or omission, whether through intention or recklessness. They are shaping people's behavior. If you are a bizarre, negligent, erratic parent, the way Elon Musk is, both at Twitter and apparently in life, you can't shape behavior in a coherent way.
So you're giving people all kinds of constant, contradictory, frenetic informational signals and direct incentives about what they should be doing. With children, it is all about attention and dopamine and these tiny feedback loops. And as we all know, the whole premise of the business model of social media is a giant dopamine slot machine.
In some of the recent parenting stuff I've seen in my own life, attention is the big one, but there are different kinds of attention, and there are also different ways to get to the root cause behind it, right? So, we've entered the toilet word phase of my five-year-old's life.
[00:12:21] Chris Duffy:
And, uh, social media has never left that phase.
[00:12:24] Andrew Marantz:
Exactly. He's playing the slot machine and seeing that when I say one of these, you know, seven words you're not supposed to say on the playground, it gets a big laugh. He's going back to that again and again. That's kind of all you need to know about the basic mechanism of the thing. And then we are so attuned as human beings to finding the patterns that will make us feel more loved and accepted that that has become one of the biggest business models in the world.
If the only lever I have is disciplinary carrots and sticks, if all I can do is say, “I will punish you if you say the naughty word, or I will reward you if you don’t,” both of those are extremely limited. Right? Because I'm not touching the root cause of why this is happening. I also can't say, “Be a different kind of person than you are.” Right? I can't just say, “Have a fully developed frontal cortex and don't be interested in what the kids on the playground think of you,” right?
[00:13:20] Chris Duffy:
Yeah.
[00:13:20] Andrew Marantz:
That’s also not realistic. But sometimes, and it doesn't happen all the time, I can get to a root cause and say, “Okay, why were you so interested in getting this kind of attention?”
And sometimes, he will say, “The kids were kind of ganging up on me.” And then you're at the level of root cause instead of at the level of “Do I take away your granola bar or do I not?” The very direct parallel with social media stuff is that we often start and end the conversation at, okay, this person is saying toilet words, or this person is saying Nazi words, or this person is, you know, doing this behavior that we find distasteful. Do we ban the account or not? Do we freeze the account or not? Do we set a rule around it or not? And those are all valid questions, but they are not getting anywhere near anything like a root cause.
[00:14:13] Chris Duffy:
So in your work, in your journalism, and in your book, you often have sat with individual people who are bad actors, right? Whether that person is as extreme as a, a neo-Nazi or, you know, extreme on a different end: someone who is deliberately putting disinformation out online, trying to, to spread things that they know are false because they know it'll get attention or achieve an end. So there, there obviously are real bad actors out there, but one of the big things that I think changed the way that I see social media, in general, is this idea you talk about in your book, which is that overall, even if everyone were a good actor, social media is designed to prioritize certain types of emotions and not others.
[00:14:53] Andrew Marantz:
Yeah. People will say, they'll try to make some argument about human nature or they’ll say, well, you know, “Human beings gravitate toward things that are on the extremes”, or “Human beings want to be emotionally stimulated” or some sort of vague thing. And what that leaves out is that not all emotions are created equal, and not all emotions are equally incentivized.
These social media algorithms that we have come to see as normal and default have made very specific choices about how to boost certain things in the algorithm, and the most basic choice they've made is to boost things that are emotionally engaging around what social scientists refer to as high-arousal emotions, things that make your blood boil, things that make your heart rate increase. These are literally measurable in a lab. Fear, rage, excitement. Some of them are positive, some of them negative. Uh, part of the reason my book is called Antisocial is not just because I was with some bad people, but because there are pro-social and anti-social feelings, meaning pro-social things that bring us together, anti-social things that drive us apart.
It just so happens that there are more high-arousal emotions that tend toward anti-social ends, as we have now seen. So you don't get extra points on the board because you made someone think or made someone feel something. The only way you get extra points on the board is if somebody takes an action in response to your post.
They retweet it, they like it, they share it. They dislike it, they hate-retweet it, whatever. The things that make you more likely to take an action are high-arousal emotions. So if somebody listens to us talking right now, and they have a very strong emotion of, “I feel edified and engaged and connected, and I feel part of a community”, those are really good pro-social emotions that I hope we can try to foster in people.
But those are not emotions that are strongly associated with “I will necessarily take an action.” I mean, maybe they will tell a friend, maybe they will, you know, mention it to someone, but it doesn't necessarily mean that they're gonna smash that like button. If you are feeling really keyed up and like, “Can you believe that this jerk said this outrageous thing?”, that’s what makes you smash the button. And so it's really that simple. It's just the mechanic of the thing is built around emotions that, on average, are not good for us.
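To make the mechanic Andrew is describing concrete, here is a minimal, hypothetical sketch of engagement-weighted ranking in Python. The field names, weights, and example posts are all invented for illustration; no platform publishes its ranking function, and this is not any company's actual algorithm, just the general shape of "score by actions taken, never look at content quality."

```python
# A toy model of an engagement-ranked feed. All names and weights are
# hypothetical. The point: the score counts *actions*, not truth, decency,
# or whether the emotion behind the action was pro- or anti-social.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int = 0
    comments: int = 0
    shares: int = 0
    angry_reacts: int = 0  # a hate-share counts just as much as a love-share

def engagement_score(post: Post) -> float:
    # Comments and shares weighted higher because they spread the post further.
    return (1.0 * post.likes
            + 2.0 * post.comments
            + 3.0 * post.shares
            + 1.0 * post.angry_reacts)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing here ever inspects post.text.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("A calm, careful explainer", likes=40, comments=5, shares=2),
    Post("Can you believe this jerk said that?!",
         likes=30, comments=60, shares=25, angry_reacts=80),
])
print([p.text for p in feed])  # the outrage post ranks first
```

High-arousal reactions generate more of exactly the actions a function like this counts, which is the "mechanic of the thing" Andrew describes: the ranking never has to prefer outrage on purpose for outrage to win.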
[00:17:25] Chris Duffy:
So what can we do if we're online and we're noticing that we're in a high-arousal state? Like, how do you channel your emotions in a way that's productive and doesn't just give in to the negative, anti-social parts of this?
[00:17:34] Andrew Marantz:
I'm just gonna fully give the dad answer to all of these. Take a breath. You know, stretch your body and wait till your body feels safe. There are giant structural things that are beyond any one person’s capacity to change single-handedly. So that's the demobilizing part.
The mobilizing part is: you are the coin in the slot machine. It's you. Your attention. So you can choose where that attention goes or doesn't go. It's hard to choose because these supercomputers are designed to scramble your brain. But you can step back and disengage. Sometimes that means not doing anything; sometimes that means closing the laptop or throwing your phone, you know, under your bed. But sometimes it just means having a two-second break between the thing you feel compelled to do and the thing that you end up doing. And also, sometimes it means just knowing how something works. Like, the way I think about narrative journalism is I don't always think that I can shed light on something and automatically change it.
You know, it's not necessarily “I showed that this person was innocent, and then, you know, they were released from prison.” It's great when that happens, but other times it's just about being aware, in the same way that, you know, reading Kitchen Confidential and understanding how they make fish in a big restaurant just might make you think twice before you order the fish. When you have just a basic awareness of how this stuff works, you just move through it differently.
Yeah, it doesn't necessarily mean that you throw your phone in a river or disengage or, or delete everything. You just kind of feel less like a cog and a little bit more like your hand is on the levers of the machine, just in your own personal way.
[00:19:22] Chris Duffy:
I actually would strongly encourage everyone to throw their phones into a river. But I wanna make sure you have a plan for how to listen to podcasts once that phone is gone. So, you know, make sure, make sure you've downloaded them on your computer or on some other device.
Okay, and while you are doing that downloading so that you can safely destroy your phone while still finishing this podcast, we are going to take a quick break for some podcast ads.
[BREAK]
[00:19:46] Andrew Marantz:
So the number one thing that has to happen here is social networks need to fix their platforms. So, if you're listening to my voice and you work at a social media company, or you invest in one, or, I don't know, own one, this tip is for you. If you have been optimizing for maximum emotional engagement and maximum emotional engagement turns out to be actively harming the world, it’s time to optimize for something else.
[00:20:18] Chris Duffy:
Okay, so that was another clip from Andrew's TED Talk, and he was addressing the responsibility that tech companies have to bring about change. But what about the role of the government? What policies would be helpful in dealing with these issues? You're not saying that we should just have the equivalent of Prohibition, where we ban all social media and just say that's gonna work.
[00:20:36] Andrew Marantz:
Yeah.
[00:20:36] Chris Duffy:
But obviously we have limitations and, and we have laws around alcohol and that helps mitigate some of the harm. So, what are some of the policy changes that you think people should push for?
[00:20:44] Andrew Marantz:
People who think that they are flummoxed by what to do here, I just want to uplift you and tell you you’re right to feel that way, because I've been thinking about this for a few years and I am right where you are.
I, look, I have views on what the FTC could do, what the SEC could do. I think that Meta is too big a company. I think Amazon is too big a company. I'm personally in favor of taking antitrust action against companies that are that big. That kind of thing, which I would advocate for, I think still in a way isn't thinking big enough. Because a way to get at how big this problem is, right, is that with alcohol or cigarettes, pharmaceuticals or food or cars, I feel that the government has a strong, robust role to play in regulating those things, making them safe. There are times when the government has totally failed at that. There are times when it's been moderately more successful. We, we can find policy fixes to a lot of those things.
I think there are two slippery slopes. There's the slippery slope of doing nothing and letting the most powerful social communication tools in history become garbage fires of bigotry. And I think there's also a slippery slope of what if the government gets too involved in legislating and regulating which speech should exist.
And we're getting to a, I think, a slightly dangerous place where people who take one seriously have a hard time taking the other seriously. And so you see a lot of kind of free speech absolutist stuff. Again, Elon Musk: famously, his reasoning for taking over Twitter was “I'm a free speech absolutist, and I think that any speech that's legally allowed in the United States should be allowed on Twitter.”
Then about five minutes into owning the company, he realized that what everyone had told him was true, which is that that's an incoherent mission statement. As a content moderator, you cannot run a social media company that way, and a lot of the chaos we're seeing is a result of that.
[00:22:47] Chris Duffy:
I've also learned from you, this is a phrase that I, that I've learned from you, the idea that freedom of speech does not mean freedom of reach. And that being allowed to say whatever you want does not mean that whatever you say has to be promoted through a microphone to millions of people who don't know you.
[00:23:03] Andrew Marantz:
Exactly. Exactly. So it's just so simplistic as to be disingenuous, I think. The question is not whether anyone can say whatever they want. The question is how much do the algorithms amplify and promote it in ways that are invisible to the average user? So that whole canard of, “Well, if you love free speech, then you'll give me, as a plutocrat, a free pass when I do nothing to prevent my thing from becoming a garbage fire.” That, that whole cascade of logic, I think, has been discredited. At least to me. Not everyone feels that way, but we've come a long way from, again, the beginning of when I started covering this, when it was much easier to get away with saying stuff like that.
[00:23:45] Chris Duffy:
I hear what you're saying. That, and I think it's a really important point, that there are two competing concerns, right? The idea that we don't wanna just pretend that free speech on its own will just work, and that we also don't wanna pretend that, like, massive regulation of what you're allowed to say can work. Right? But, obviously, there is a middle ground between those two. And, as someone who's thought about it a lot, I am curious to hear what policies you think need to change.
[00:24:12] Andrew Marantz:
Yeah.
[00:24:12] Chris Duffy:
Or, or even if you don't have this specific proposal, like—
[00:24:14] Andrew Marantz:
Yeah.
[00:24:14] Chris Duffy:
What is it that you think needs to be, like, tweaked more that's not getting tweaked in the right way?
[00:24:18] Andrew Marantz:
So I proposed breaking up companies that are too big, or not allowing further acquisitions that seem against, you know, the spirit of antitrust. Section 230, which makes it hard to hold these companies legally liable, that's another area where policymakers can tweak things. The reason I say it's not big enough is that I really think we should be cautious about investing too much power in any government entity to make recommendations about what speech should exist. So, where does that leave us in terms of policy recommendations?
I think it's kind of like the climate thing, where there are a lot of incentives that can be set at a governmental level, and there are a lot of things that can be done at a corporate level, but ultimately, there's a huge sort of ethical shift that needs to happen, which is: we need to stop burning the dead carbon in the ground, and we need to start finding entirely new economies that can power us.
We really need to move beyond high-arousal, emotional, algorithmic social media, full stop. So the things that people should advocate for, responsible government regulation of social media and all that stuff, I think that's all good to think about, but I would actually encourage people to think even bigger than that.
[00:25:35] Chris Duffy:
Another thing that you have obviously looked at a lot is extremism and how people become extremists online and get radicalized. This show, you know, has people listening all over the world. I'm sure there are lots of people listening who are outside of the US. Online extremism is happening across the globe. It's not just a US issue, even though these companies are often based in the US. How do you see the global picture of online extremism?
[00:25:58] Andrew Marantz:
As long as we have a high-arousal, slot-machine-based information system, there will be people who will be radicalized by it in various ways. It seems like Kanye West, Elon Musk, Donald Trump, a lot of people have exhibited similar symptoms of what you could call, you know, algorithmic brain poisoning.
I spent a lot of time with people who were brought down a particular rabbit hole of male supremacy, white supremacy, antisemitism, whatever the case may be. And it was very easy to see that as a fringe thing. But when I saw Kanye West going down this path, I could immediately guess, almost like a kind of bingo card, “Oh, I wonder if he's gonna start talking about the JQ.” JQ is short for “the Jewish Question.” “Oh, there he goes. He started talking about the Jews.” Like, he says out loud everything he's thinking. Uh, given what he had been looking at before, it was not very hard for me to see where the algorithms would push him next.
Personally, that breaks my heart as a, as a Kanye fan, but systemically, it's not clear to me how you can get that to stop happening at scale.
[00:27:09] Chris Duffy:
Let's take it out of the realm of it being someone who is kind of distant and famous. Instead, you know, millions of people have this experience where someone that they love, a friend or relative, starts to say some things. It may not be as extreme as, like, overt antisemitism, but they've found things online that are troubling or that you disagree with. You feel like they're becoming more extreme with what they're engaging with online. What can we do if we have a person in our lives and it feels like they're starting on that path, but they're not at the end? Which is obviously a much more complicated problem.
[00:27:41] Andrew Marantz:
I think the first step is to try to meet people where they are. Try to listen, and listening obviously doesn't imply agreement or acceptance or excuse-making, but try to listen so that you actually know what they're saying. Because often you hear your, let's say, teenager say something and it freaks you out, and your immediate response is a kind of anxious, turtle-going-into-its-shell response, and you just sort of say, “I know I don't like what you just said, and so, you know, stop it.” Or “I'm gonna show disapproval,” or “I'm going to forbid you from going to that, you know, YouTube channel,” or whatever. I feel like anyone who's tried that as a parent has had not that much success with it.
Let's just say in this example, your, you know, teenager says to you, “I've been reading some really interesting things about, you know, demographics and birth rates and, you know, the future of European civilization”. I would understand if you recoiled from that and said, “I don't like that. Stop talking about that.”
The question for me is, what about that stuff is the person finding interesting or engaging with? Or what, what need is it meeting for them? Some of it is all the most basic stuff: loneliness, longing for community, wanting to feel seen, wanting to feel like your identity is being reinforced, and some of it is just actual intellectual curiosity gone wrong.
We can't really legislate that away or regulate that away, or even on a person-to-person level forbid people from asking those questions. I would like to see people providing better answers to those questions. What, what about your life is making it hard for you to make meaning and find community? What is causing you to feel alienated?
What does it mean to have a personal and group identity that is constructive and not destructive, and positive-sum and not zero-sum? And the internet is a big place, so some of that stuff is gonna be really boilerplate, you know, “it's a small world after all” kind of stuff, but some of it can get pretty weedsy, and I think, for some people, that would fill a vacuum that is currently out there.
[00:29:52] Chris Duffy:
Something that you talk about in your book is how a lot of the really dangerous stuff online starts out, kind of, as a joke. And as a comedian, right, I see this all the time, people being like, “Oh, you're… Don't take it so seriously. It's just a joke.” And you talk about how a lot of the extremist groups online start out by saying something and being like, “It's just a joke. It's just a joke. It's just a joke. Laugh about it.” And then, bit by bit, you're like, “Okay, well, you hear those ideas a lot in jokes. What if it wasn't a joke?” And then, “Oh, I'm just being ironic.” And then it gets more serious the deeper you go down the rabbit hole. And you know, I'm saying this as a comedian, I don't want people to be, like, wet blankets who can't take a joke or laugh about anything.
But I wonder if there is a way to have these conversations where you kind of get a preview of what's coming down the road. Because, in my uninformed opinion, it seems like that would stop people: if you're like, “This joke leads to you being a neo-Nazi,” people would be like, “Hold on, that's not where I want to go.”
[00:30:47] Andrew Marantz:
Yeah.
[00:30:47] Chris Duffy:
But if you take all those steps, all of a sudden you are surrounded, and you realize, “Oh, I'm friends with all these people who believe these things. Maybe I believe that too.” And it takes a different kind of person to step out of that.
[00:30:56] Andrew Marantz:
Yeah, yeah. The comedy thing is a great example, right? All these people say it's just a joke. Why are you being a wet blanket? Why don't you get a sense of humor? And then, I mean, not to take it there, but that was also the response to Hitler, right? Why can't everyone see that he's just a clown? Why? Like he's literally being mocked by Charlie Chaplin. Why doesn't everyone get a grip? This guy's never gonna have real power.
I mean, that was the discourse in American media in 1936. So, you know, not everyone is Hitler, not everyone is a Nazi, but leaving yourself open to the possibility that some people are makes these things very confusing.
I spent a lot of time in the book with the Proud Boys, a “street gang of Western Chauvinists,” as they call themselves, which is another bad thing to be, in my personal view. And they were started by a guy who, you know, said he was a professional comedian and said, “Why can't everybody see that this is a big joke? You know, we're talking about memes and, you know, songs from Aladdin, and why is everyone taking this so seriously?” And then flash forward four years and, you know, Donald Trump is on a debate stage telling the Proud Boys to stand back and stand by. So it's very easy to seem like a pearl-clutching alarmist, but leaving that possibility open in your mind, I think, is a really important exercise. What would it mean if you were to take seriously the idea that this joke is a step down a potentially slippery slope that could get really scary?
What would that even mean? Again, to be really clear, I don't think that means that you ban the speech, and I don't even think it necessarily means that you don't want people telling those jokes. I think it just, it changes how you reflect on it and how you would encourage people in your life to respond to it.
[00:32:41] Chris Duffy:
Do you have any sort of, like, positive takeaway for the future of social media? I feel like you don't think that it's just doomed. So I'm curious, what is your positive takeaway for the future of social media, or for the future of the internet?
[00:32:54] Andrew Marantz:
You can create a social media for yourself that is based around the stuff you want it to be. Seeing your friends’ baby pictures and feeling good about that. Being in touch with people you've lost touch with. All the stuff that they advertise to you because it's the thing that people want? You can create that for yourself. It's just harder than you think it is because there's always other stuff trying to suck you back in.
[00:33:22] Chris Duffy:
Okay. So, Andrew, we're coming to the end of our show, and one final question for you. So the show's called How to Be a Better Human. I'm curious, what is one way that you, yourself are trying to be a better human right now?
[00:33:32] Andrew Marantz:
Uh, I'm trying to meditate more. I feel like a lot of this stuff just boils down to what are you actually doing with your attention on a moment-by-moment basis, and you can train yourself to be better at that by just sitting and actually doing it.
And it's very easy for me to tell myself that I don't have time to do that, because I'm too busy, you know, catching up on White Lotus or whatever. But it's something that I'm always getting better at finding time to do, and I'm always glad when I do it.
[00:34:00] Chris Duffy:
Thank you so much, Andrew. I really appreciate it.
[00:34:02] Andrew Marantz:
Yeah, yeah, yeah, for sure.
[00:34:06] Chris Duffy:
That is it for today's episode of How to Be a Better Human. Thank you so much for finishing the episode. Thank you especially if you really did throw your phone into a river. You went the extra mile, and I just wanna let you know that does not go unnoticed. Thank you to today's guest, Andrew Marantz. His book is called Antisocial.
I'm your host, Chris Duffy, and you can find more from me, including my weekly newsletter and information about my live show dates at chrisduffycomedy.com. How to Be a Better Human is brought to you on the TED side by Anna Phelan, who's posting baby pictures of herself; Whitney Pennington-Rodgers, who is supportively liking every single one of your posts; and Jimmy Gutierrez, who's posting only the very finest memes. This episode was fact-checked by Julia Dickerson and Erica Yuen, who are strongly advocating for an emoji react button that says “Sources needed.”
From PRX, our show is brought to you by Morgan Flannery, who is taking a deep breath and stepping away from her feed; Rosalind Tordesillas, who is turning off notifications; and Jocelyn Gonzales, who keeps telling me that a million downloads isn't cool, but a billion downloads sure is.
And of course, thank you so much to you for listening to our show and making this all possible. We will be back next week with more episodes of How to Be a Better Human.