MEGAN BASHAM, HOST: It’s Thursday the 4th of June, 2020. Glad to have you along for today’s edition of The World and Everything in It. Good morning, I’m Megan Basham.
NICK EICHER, HOST: And I’m Nick Eicher. Before we get started, I just wanted to remind you that today is the day we will release our second episode of the special series, “Ask Doctor Horton.”
It’s questions and answers with WORLD’s medical correspondent, practicing physician Charles Horton. Still, lots of questions about the coronavirus.
You’ll find that special program later this afternoon right here in this podcast feed. So do check it out.
BASHAM: OK! First up on today’s program, President Trump takes on Twitter, his favorite social media platform. And by favorite, I mean his most frequently used social media platform.
Last week, Twitter placed warning labels on two of the president’s tweets. The labels called his comments about mail-in voting in California “potentially misleading.”
EICHER: The president’s response was swift and sweeping.
First, he labeled Twitter a censor, and a politically motivated one at that. And then came a move that’s been simmering since at least last year’s social media summit: an executive order that, among other things, directs the Federal Communications Commission to look into whether platforms are policing content in good faith.
BASHAM: Joining us now to talk about these changes and how they could affect the rest of us is Jason Thacker. He’s an associate research fellow at the Ethics and Religious Liberty Commission. And he specializes in technology. Good morning, Jason!
JASON THACKER, GUEST: Hey, good morning. Thank you for having me, Megan.
BASHAM: At the root of this fight is Section 230 of the Communications Decency Act. Can you start by explaining what that says?
THACKER: Yeah. I think most Americans don’t even realize this law exists. But it was passed by Congress in 1996, when the internet was basically a fledgling medium. Many of the largest service providers were under constant attack from litigation, often concerning content that was posted on their platforms. So Section 230 of the Communications Decency Act helps to protect these internet providers from liability for third-party content that’s posted on their platforms. Essentially, 230 allowed a more open and free market of ideas, and it led to some really great developments on the internet, things like Facebook and Twitter and even Craigslist, because it gave these companies additional protection from litigation for good faith efforts on their part to moderate content, and also to protect users from otherwise pretty objectionable content and material.
BASHAM: So the law shields companies from lawsuits if they remove “objectionable” content. And it does seem like there could be legitimate reasons for that protection. One example that comes to mind: the several instances where mass shooters tried to stream their crimes live on Facebook. Is it fair to say we probably want internet companies to have that discretion?
THACKER: Yeah, I mean, at the end of the day, we do want these companies to have the ability to moderate content on their platforms. They are private companies, after all. These are their products. And so most of these companies in recent years have developed community policies or community standards, the way they want their users to interact on the platform. And without 230 there would be a lot of litigation against these companies over false information, misinformation, and even defamatory posts that seek to attack certain people for their beliefs. So we do want that level of protection for these companies, not just because it protects them from lawsuits, but because it really does create that more free and open internet. And so that’s where we really need to be looking to Congress to say, “Let’s clarify this.” But that’s going to be something that’s left up to Congress, because this was an act that Congress passed in 1996. It’s not something that even the president really has the power to change.
BASHAM: Gotcha. It seems tough to apply a law from 1996 to what we’re dealing with now. So, this debate has split the social media giants. Facebook CEO Mark Zuckerberg publicly disagreed with Twitter’s decision. Zuckerberg said his platform would not engage in fact-checking politicians.
In the meantime, Snapchat just announced they will no longer promote the President’s account, saying, “We will not amplify voices who incite racial violence and injustice by giving them free promotion…”
How do you think these different positions could play into any efforts to modify Section 230?
THACKER: Yeah, the two main approaches here map onto the distinction between Facebook’s and Twitter’s policies. That split actually emerged last year, when they announced their rules on political ads. Facebook took a more hands-off approach, saying they wanted the market to regulate itself. They didn’t want to be in the business of fact checking, of saying what’s true and what’s not true, because it’s fraught with complexity. Twitter, on the other hand, took a very different approach and said, no, they would engage in fact checking. They would label false or manipulative videos. They would seek to ban or suspend certain accounts for spreading misinformation or inciting violence. And in both of these approaches you can see the intentions and some of the good reasons why they would implement those types of things. Because at the end of the day, this debate over Section 230 is less about conservative or liberal, or good or bad on the internet. It really comes down to the role of free speech in our society, and how we as a nation want to engage in free speech on these online platforms that are a lot bigger than they were in 1996.
BASHAM: The president clearly has a love-hate relationship with Twitter. It has been key to his ability to communicate directly with followers, and he clearly doesn’t want to lose that platform. But CEO Jack Dorsey doesn’t seem inclined to back down. Do you think there’s a middle ground solution here?
THACKER: I surely hope so. I mean, I think we should be looking to Congress and saying, let’s have these debates. They’re tough, they’re difficult, they’re fraught with complexity, but we’re a deliberative people. And so I’m really hoping that we can. The one thing I think Christians should keep in mind, especially in light of these big platforms and their influence, is that they are indeed private companies. So do we want the government to be able to tell a private company what it can and can’t do? There are a lot of religious liberty implications to that, as well as free speech implications and questions about how our democracy is set up. But there are a lot of common sense, good sense measures that can be implemented to protect the vulnerable, to protect the weak, and to make sure that we’re promoting truth instead of falsehoods.
BASHAM: Jason Thacker is with the Ethics and Religious Liberty Commission of the Southern Baptist Convention. He’s also recently released a book about artificial intelligence. It’s titled, The Age of AI. Thanks for joining us today, Jason!
THACKER: Thanks for having me, Megan.