#1396 The Platforms of Our Discontent (Social Media, Social Destruction) (Transcript)

Air Date 2/5/2021

Audio-Synced Transcript

JAY TOMLINSON - HOST, BEST OF THE LEFT: [00:00:00] Welcome to this episode of the award-winning Best of the Left Podcast in which we shall learn about the role that social media plays in the radicalization of discontented communities, and engage in the debate over content moderation and de-platforming of individuals.

 Clips today are from The Weeds, Newsbroke from AJ Plus, On the Media, Off-Kilter, a piece of a speech from Sacha Baron Cohen, Big Tech, Vox Conversations, The Mehdi Hasan Show, and Your Undivided Attention.

Why everyone hates Big Tech, with The Verge's Nilay Patel - The Weeds - Air Date 7-19-19

NILAY PATEL: [00:00:34] I think one thing everyone will agree on, just universally, is that these companies are not necessarily well-run. And even if they were perfectly run, the nature of writing and enforcing speech regulation is such that you're still gonna do a bad job. The United States has been trying to develop a free speech policy in our courts for 220+ years and we're pretty bad at it, but four guys at Facebook aren't going to do a good job in 20 years.

So there's that problem: where does the line cross from being a pretty funny joke to being overtly bigoted? It really depends on context. We all understand this. It depends on who you think you are speaking to, whether it's a group of your friends or whether suddenly Twitter's algorithm grabs you and amplifies you to millions of people. How many little Twitter scandals are a throwaway comment that somehow went viral and now someone's crying? We understand it happens every day.

The other problem, and I think this is where I come back to it: there's only this tiny handful of companies. These companies are monopolies in their space. So you see Republicans saying you're violating my free speech rights. The President is saying they're violating our free speech rights. Well, these companies are not the government, right? They're private companies. They should be able to do whatever they want; by statute, they are allowed to do whatever they want. But there's nowhere else to go. So if you feel like tweeting is important, and the President feels like tweeting is important, and you're constantly being bombarded with moderation decisions for your base, it does feel like these companies are censoring us. And then you might say, well, they're overmoderating; they've overstepped their bounds; they might as well just be liable for everything the way a newspaper would be, even though the statute doesn't say that at all.

MATTHEW YGLESIAS - HOST, THE WEEDS: [00:02:21] You know, we were joking around about Prodigy vs. AOL, but it's true, right? In the '90s, you had a bunch of different nascent internet platforms, and you would dial into them: there was Prodigy and there was CompuServe and there was America Online, and I think there were a couple more. And I think the vision people had at that time of how this would evolve is that it would continue to be a rich space of competition, in which consumers would subscribe to one or two of these, and different companies would have their own moderation policies, and there would be variation. Part of the basis of competition would be trying to pick an approach to moderation that people liked; different people would have different tastes, different people would participate in different ways, and it would be sort of all good, right? But instead, we live in a world where there's a conversation on Twitter that does not have any close analog anyplace else. YouTube is where people find videos, right? Everybody who gets short internet videos gets them from YouTube. So if you can't publish to YouTube, you're kind of out of luck. Everybody cares a lot about these companies' policies. We see them as having big, systematic social impact, and not just being a matter of, well, I don't like this, so I'm going to go elsewhere.

NILAY PATEL: [00:03:46] Yeah. If you're a heavy Twitter user, which you probably shouldn't be, you're probably more impacted day-to-day by a random Twitter policy decision than by any decision your local government makes, right? A crazy scenario to be in, but it's where we are. If you are a YouTube creator -- and The Verge covers YouTube creators very closely; they're always kind of mad at YouTube -- YouTube is the gateway to your economic freedom, and YouTube is not great at handling its creator class.

So you see the enormous amount of power these companies have, and you see the lack of market competition. If you're a YouTuber and you're like, I hate YouTube, where are you going to go? Where's the other platform that's going to provide you a career the way YouTube provides you a career? So then if YouTube says, hey, you crossed this line; we made this moderation rule, and six months ago we enforced it this way, but times have changed, we're going to enforce it slightly tighter; we're demonetizing you now; we're deleting your channel -- all of a sudden you're like, wait a minute, that was my livelihood. That was my business. And you just took it away from me because you decided to.

And again, these companies are not well run. There's not a lot of transparency in that process. There's no appeals process. If the state did that kind of thing, we'd be looking at a decade's worth of lawsuits, right? When YouTube does it, you're just done. There's nowhere to go.

Let’s Break Up Facebook, Google, Amazon - Newsbroke - Air Date 1-30-21

MATT LIEB - HOST, NEWSBROKE: [00:05:00] It was the perfect distillation of a right-wing complaint that's been levied for years: the idea that somehow tech platforms are silencing them despite all the evidence to the contrary.

CANDACE OWENS: [00:05:11] Facebook is a problem. I think that they are trying to intentionally go against conservatives and silence voices.

CNN HOST: [00:05:18] The top performing post of the week was this post by Candace Owens, a far right-wing commentator and a favorite on Fox, denigrating George Floyd.  

MATT LIEB - HOST, NEWSBROKE: [00:05:26] Hmm, shocking. You know, the height of gaslighting is being told that the right wing is being silenced on Facebook by the very ghoul who has been red-pilling all of our mothers on Facebook. As of July, 2020, the top posts on Facebook included three Fox News posts, three Ben Shapiro posts, and one from a guy named Dan Bongino. If you're not familiar with Dan's work: 

DAN BONGINO: [00:05:49] You can take your mask mandate and shove it right up your ass. 

MATT LIEB - HOST, NEWSBROKE: [00:05:54] Nah, Dan. That's not where masks go. You put your mask on your face. You're confusing your face with your ass. Which makes sense, because you talk out of it. But there is a piece of the right's outrage that does kind of hold water.

FOX NEWS HOST 1: [00:06:06] It's being carried out by big tech, which I would argue has in many ways become more powerful than the government. 

MATT LIEB - HOST, NEWSBROKE: [00:06:12] Some real broken clock energy here, but it's true. A small handful of tech CEOs like Mark Zuckerberg have way too much power to decide who gets a platform and who gets banned. Because while we might all relish Trump being absolutely owned by his favorite app, it's important to remember that it only happened after it was clear that he wasn't going to be president again, and after the Capitol was stormed by people trying to overturn the election. Almost like they were like, no, let's wait and see what happens with this whole coup thing. Listen, from spreading fake news and conspiracies, to downplaying COVID-19, to inciting violence against Black Lives Matter protesters, to the multiple atrocities he's inflicted upon grammar, Trump was practically begging to be banned a long time ago. But they didn't ban him. Instead, they made all the money they could from him, consequences to the Republic be damned, and then booted him at the last possible moment. Because at the end of the day, tech conglomerates are motivated by power and profit, and that will always take precedence over the public good.

TECH CONFERENCE SPEAKER: [00:07:18] When people say "platform," what they really mean is: we're a media company but want to operate in a netherland of unprecedented multiples and no accountability. When people say things like "impossible" or "First Amendment" or "we can't be arbiters of truth," what they're really saying is: we're not going to do anything that's unprofitable.

MATT LIEB - HOST, NEWSBROKE: [00:07:33] Yeah, it makes sense. I mean, why would tech CEOs, basically unelected business dictators, care to protect democracy? Nothing about their business structure or perspective is democratic. Google, Amazon, Facebook, Microsoft, and Apple have the highest market values of all public corporations in America, and they didn't get there by playing fairly, but by buying out and crushing any competition. Then they turn around and funnel millions of dollars a year into lobbying governments to allow them to keep doing it. And that's to say nothing of the fact that they don't pay any taxes. They've gotten as far as they have because of their Julius Caesar complex. And they embrace it. How else can you explain Mark Zuckerberg's haircut?

Solving the Facebook Problem at Home and Abroad - On The Media - Air Date 5-22-19

BOB GARFIELD - HOST, ON THE MEDIA: [00:08:19] Once a progressive pipe dream, the call to break up Facebook edged mainstream earlier this month thanks to a passionate New York Times op-ed by Chris Hughes, Mark Zuckerberg's college roommate and former business partner:

CHRIS HUGHES: [00:08:34] This is me back in my college days, and this is my roommate Mark. Together we founded Facebook in 2004. Now 15 years later, I think Facebook has grown too big and too powerful.  Every week brings new headlines about privacy violations, election interference, or mental health concerns. I haven't been at the company in over a decade, but I feel a sense of responsibility to account for the damage done.

BOB GARFIELD - HOST, ON THE MEDIA: [00:09:00] At first, Hughes' plea was met with high fives from the press. Senator Bernie Sanders quickly jumped on the bandwagon, joining longtime tech critic and fellow Democratic presidential hopeful Senator Elizabeth Warren. Then came a series of hard questions. How exactly would breaking up Facebook, which also owns WhatsApp and Instagram, address free speech concerns or help stifle the spread of propaganda on the platform? And how would American regulations affect the majority of Facebook users, many of them in the Global South, and in particular, in Myanmar?

MICHAEL LWIN: [00:09:37] Facebook's monthly active users in Myanmar is about 22 million. In Myanmar, Facebook is the internet. 

BOB GARFIELD - HOST, ON THE MEDIA: [00:09:44] Michael Lwin is an American-born antitrust lawyer living in Yangon, Myanmar. From his vantage point far, far from the US, he says that the call to break up Facebook could have some wide-ranging, unintended consequences. This coming from someone who's seen what harm Facebook has done. It's widely agreed that the massacre of Rohingya people which started in 2017 was fueled by crude propaganda campaigns spread via Facebook. Lwin says that the reason those campaigns worked is bound up in the country's recent history.

MICHAEL LWIN: [00:10:21] What happened was, you have this country that opened up. Having been a military dictatorship, the education system is in shambles: it's rote learning in schools. And suddenly they get exposed to all of this information. And so there were various coordinated propaganda campaigns that were spreading rumors about Muslims committing acts of violence, raping Burmese Buddhist women, that were not true. And people had no way by which to judge which information is true and which information is not true.

BOB GARFIELD - HOST, ON THE MEDIA: [00:10:58] And the messages that people received incited violence by Muslims against Buddhists and Buddhists against Muslims. 

MICHAEL LWIN: [00:11:09] Sure. So examples would be things like "kill all Muslims, all they do is breed," stating that they should burn down their villages, put them back in their country, they're not from here.

BOB GARFIELD - HOST, ON THE MEDIA: [00:11:21] And Facebook did not intervene, or at least intervened only too late when it realized that people were dying as a consequence of what they were reading on the Facebook platform. 

MICHAEL LWIN: [00:11:32] You know, the people in the human rights NGO community had told me that Facebook's public policy division at the top were made aware of this in 2014, maybe earlier. So they were certainly tipped off. I think Facebook finally publicly acknowledged that their platform was used to incite violence in Myanmar this year or last year. So they've admitted it, but that's four or five years too late.

BOB GARFIELD - HOST, ON THE MEDIA: [00:11:57] So it's fair to say that you have a particular reason for wishing that this problem could be resolved: the use of Facebook to transmit hate speech and instigate violence. But if you are motivated towards a breakup, you're not alone. Facebook co-founder Chris Hughes just called for it. Elizabeth Warren has been beating that drum for a while. Other Democratic presidential hopefuls have piled on. But now I'm going to ask you, in your antitrust role: do you have any qualms about a forced breakup from an antitrust point of view?

MICHAEL LWIN: [00:12:36] Yeah, I think I was still working as an antitrust lawyer during the Instagram and maybe the WhatsApp acquisitions. And I was saying to my antitrust colleagues, we don't have any basis for this under the current antitrust regime, but this is clearly going to be anticompetitive if you understand the business strategies of the firm. Facebook's market share among millennials is going down, and among Gen Zers, I think that's what they're called.

But that market share is largely being ceded to Instagram, which it already owns, right? And the same sort of thing is going on in the Global South. If Facebook's losing market share, it's only to Instagram or WhatsApp. And that model needs to be reconsidered given the pace of technological change.

BOB GARFIELD - HOST, ON THE MEDIA: [00:13:18] So there's this laundry list of problems that Facebook has enabled: election interference, alleged stymieing of free speech, coordinated propaganda campaigns leading, at least in the case of Myanmar, to ethnic cleansing, horrendous privacy breaches, the Hoover method of data collection, and so on. From your perspective, would breaking up Facebook adequately address those problems?

MICHAEL LWIN: No. And that doesn't mean that I'm against breaking up Facebook. I think breaking up WhatsApp and Instagram from Facebook, from an antitrust perspective, may not be a bad idea. I think breaking Facebook up into Facebook US, Europe, Global, for example, may not be a bad idea. Depends on how it's done. But I don't think that resolves the core issue of misuse of the platform. Chris Hughes wrote a very good and informative New York Times op-ed. There have been other commentators on this; there have been legal academics. I think all of them are extremely US-centric, which makes sense; they're sitting in America. But I think the most egregious abuses in terms of violence and death on Facebook's platform have not been in the US, right? They've been in a bunch of other countries.

Zephyr Teachout on "Break 'Em Up" - Off-Kilter - Air Date 8-4-20

REBECCA VALLAS - HOST, OFF-KILTER: [00:14:34] Monopolists have begun to charge whatever prices they want, and in the process also to pay whatever wages they want, and to dictate politics however they see fit. Tell a little bit of the story behind this language that you're using here, which is, frankly, incredibly terror-provoking, and should be for anyone reading it, but which is a story of multiple decades of allowing this to happen, and also of very intentional policy choices happening outside of the public spotlight in a lot of cases, with people understanding it to be the only way.

ZEPHYR TEACHOUT: [00:15:12] Yeah. Everything that I'm saying in this book would have been just absolutely mainstream up until about 1980. Everybody understood that monopolies posed this democratic threat, that we had to be constantly vigilant about excessive private power. And I'm talking about, in particular, the period from FDR's administration through 1980. Though Teddy Roosevelt gets a lot of the public credit, the FDR administration, particularly the second term, was one of the most important antitrust, anti-monopoly moments in American history. And FDR really took on, as a central part of his project, concentrated private power, recognizing the toxic role it had played in leading to a fragile and unequal society, and eventually the crash.

Now, in 1980, Ronald Reagan literally rewrote the antitrust guidelines. He took what had been an approach that said antitrust and anti-monopoly law is about preserving equality, about preserving actual competition between lots and lots of decentralized competitors, about protecting against private power, and collapsed all of that into new antitrust guidelines. Basically, up until 1980, we had what is often called the structuralist approach, which is to say: we need to have a decentralized economy, private power needs to be decentralized, and we are going to stop mergers before a company gets so big that it threatens to govern us.

What Ronald Reagan did in 1980, when he rewrote the antitrust guidelines, appointed a series of anti-civil rights and anti-antitrust leaders at the FTC and DOJ, and appointed a huge number of judges who really didn't believe in the American anti-monopoly tradition, was to basically rip out all that history and replace it with a simple-sounding and clean-sounding idea of what antitrust is about. Which is: it's just about prices, just about consumer prices. Any merger should go through by default; don't worry about private power; the only thing you should worry about is whether a merger might lead to higher prices.

Now, that in itself was a tragedy, but the real tragedy came when Democrats got in power and didn't do anything about it. In fact, the Democratic administrations after Reagan have continued to adopt the same ideology. There may be some differences around the margins, but effectively they've all adopted the view that big is fine, there's no threat, and all we really care about is consumer prices. And unfortunately, even those on the left really adopted this, and I mean the left within the Democratic party. You see a real focus on people's roles as consumers, and on seeing that role as an especially powerful one, instead of understanding that when we purchase something we are also workers, we are also members of a political community; you can't just separate out that individual role.

And you see basically the vanishing of anti-monopoly from not just the Republican party, not just the Democratic party, but from being a core part of what the left does. This book is for people from across the political spectrum, but I spend a lot of time talking to progressives, working with progressives, organizing with progressives, and in this book I'm trying to shake the shoulders of the people I work with to say: we've got to stop just accepting the power dynamics as they exist and spending all our time fighting over policy choices with these behemoths sitting here dictating policy. Instead, we should spend a lot more of our activism on antitrust, a lot more of our activism going at the root of power itself, and not just at the toxic policy choices that come out of it.

ADL International Leadership Award Presented to Sacha Baron Cohen - Anti-Defamation League - Air Date 11-21-19

SACHA BARON COHEN: [00:19:35] When Borat was able to get an entire bar in Arizona to sing “Throw the Jew down the well,” it did reveal people’s indifference to anti-Semitism. When—as Bruno, the gay fashion reporter from Austria—I started kissing a man in a cage fight in Arkansas, nearly starting a riot, it showed the violent potential of homophobia. And when—disguised as an ultra-woke developer—I proposed building a mosque in one rural community, prompting a resident to proudly admit, “I am racist, against Muslims”—it showed the acceptance of Islamophobia.

That’s why I appreciate the opportunity to be here with you. Today around the world, demagogues appeal to our worst instincts. Conspiracy theories once confined to the fringe are going mainstream. It’s as if the Age of Reason—the era of evidential argument—is ending, and now knowledge is delegitimized and scientific consensus is dismissed. Democracy, which depends on shared truths, is in retreat, and autocracy, which depends on shared lies, is on the march. Hate crimes are surging, as are murderous attacks on religious and ethnic minorities.

What do all these dangerous trends have in common? I’m just a comedian and an actor, not a scholar. But one thing is pretty clear to me. All this hate and violence is being facilitated by a handful of internet companies that amount to the greatest propaganda machine in history.

The greatest propaganda machine in history.

Think about it. Facebook, YouTube and Google, Twitter and others—they reach billions of people. The algorithms these platforms depend on deliberately amplify the type of content that keeps users engaged—stories that appeal to our baser instincts and that trigger outrage and fear. It’s why YouTube recommended videos by the conspiracist Alex Jones billions of times. It’s why fake news outperforms real news, because studies show that lies spread faster than truth. And it’s no surprise that the greatest propaganda machine in history has spread the oldest conspiracy theory in history—the lie that Jews are somehow dangerous. As one headline put it, “Just Think What Goebbels Could Have Done with Facebook.”

On the internet, everything can appear equally legitimate. Breitbart resembles the BBC. The fictitious Protocols of the Elders of Zion look as valid as an ADL report. And the rantings of a lunatic seem as credible as the findings of a Nobel Prize winner. We have lost, it seems, a shared sense of the basic facts upon which democracy depends.

When I, as the wanna-be-gangsta Ali G, asked the astronaut Buzz Aldrin “what woz it like to walk on de sun?” the joke worked, because we, the audience, shared the same facts. If you believe the moon landing was a hoax, the joke was not funny.

When Borat got that bar in Arizona to agree that “Jews control everybody’s money and never give it back,” the joke worked because the audience shared the fact that the depiction of Jews as miserly is a conspiracy theory originating in the Middle Ages.

But when, thanks to social media, conspiracies take hold, it’s easier for hate groups to recruit, easier for foreign intelligence agencies to interfere in our elections, and easier for a country like Myanmar to commit genocide against the Rohingya.

It’s actually quite shocking how easy it is to turn conspiracy thinking into violence. In my last show Who is America?, I found an educated, normal guy who had held down a good job, but who, on social media, repeated many of the conspiracy theories that President Trump, using Twitter, has spread more than 1,700 times to his 67 million followers. The President even tweeted that he was considering designating Antifa—anti-fascists who march against the far right—as a terror organization. 

So, disguised as an Israeli anti-terrorism expert, Colonel Erran Morad, I told my interviewee that, at the Women’s March in San Francisco, Antifa were plotting to put hormones into babies’ diapers in order to “make them transgender.” And he believed it.

I instructed him to plant small devices on three innocent people at the march and explained that when he pushed a button, he’d trigger an explosion that would kill them all. They weren’t real explosives, of course, but he thought they were. I wanted to see—would he actually do it?

The answer was yes. He pushed the button and thought he had actually killed three human beings. Voltaire was right: those who can make you believe absurdities can make you commit atrocities. And social media lets authoritarians push absurdities to billions of people.

Joan Donovan on how platforms enabled the Capitol Hill riot Part 1 - Big Tech - Air Date 1-21-21

JOAN DONOVAN: [00:25:35] When it comes to Stop the Steal, it was pretty clear over the past few months that Trump was going to, by any means necessary, claim that the election was being stolen and/or rigged. His first line of attack, of course, was against mail-in ballots. But Stop the Steal as a campaign had been online prior to that -- Roger Stone had registered the domain back in 2016, anticipating a Trump loss and a big moment to try to mobilize these MAGA folks.

So the groundwork for Stop the Steal actually existed prior to November 3rd. But November 3rd, you start to see a constellation of folks come together around Stop the Steal. And the group starts growing so fast on Facebook that Facebook eventually shuts it down. There were 330,000 people when I checked in on it.

So that kind of spread is digitally enabled by the algorithmic recommendation systems, as well as by these networks that Facebook had tried to remove: these QAnon networks that had figured out how to, quote unquote, "go camo," as they called it, and ingratiate themselves in other spaces on Facebook. I keep joking that Donovan's first law of disinformation is that if you leave disinformation to fester long enough, it'll infect the whole product. And QAnon is a good example of that: Stop the Steal is being pushed across all of these different networks. And then the mitigation strategy by Facebook, of course, was to try to stop them from joining a very large group, which forced this network-distributed model of local Stop the Steal groups to happen.

But through it all, not only [are] these companies trying to mitigate it, but then you have politicians stepping in, utilizing that response as a tool to make people think that they're being silenced and suppressed in some way. And this forces these groups to move into other spaces, most particularly Gab and Parler. But it's not the case that they're just using Gab and Parler. I want to dispense with this idea that somehow they move off of Twitter and Facebook and are just siloed.

TAYLOR OWEN - HOST, BIG TECH: [00:28:01] Which was the idea that one QAnon ban moves an entire community off Facebook.

JOAN DONOVAN: [00:28:05] Exactly. It doesn't happen that way. And we have to not get caught up in the corporate logic of the walled gardens here. We have to realize that people use multiple platforms at the same time. And as we've been thinking about this and writing it up, it's looking as if it's the biggest disinformation campaign that we've seen through the internet. I can't think of anything bigger. I can think of things that are definitely disinformation that have happened in the past -- Saddam Hussein had nuclear weapons; that was the kind of lie that was told to justify intervention. But this one is very different, because it's using the logics and the appearance of protest to serve Donald Trump, who's the sitting president. He's not a challenger. And so in that way, he's not an average citizen, and he's not just someone having his accounts taken away from him. If this was the president of another country doing this kind of media manipulation, the UN might step in.

TAYLOR OWEN - HOST, BIG TECH: [00:29:21] The platforms responded at the very end of that process by cutting off access. Was that the right time for them to do it? Or where do you stand on that debate about when the platforms should have cut off accounts?

JOAN DONOVAN: [00:29:34] I think we needed a set of rules that we never got, related to platforms and the responsibilities of politicians. Different users make different uses of the same technology. Which is to say: if your social media use is largely political, you are a politician; you use social media strategically for political wins. If you're a marketer, though, and your goal is to make money, then the strategic use of social media is about incentivizing people to buy your product. So different users have different use cases, and also different incentives for using social media, and bending it to their will then has a lot to do with how flexible the rules are.

But Facebook did a very particular thing, which is they created a carve-out for politicians. They said they were not going to apply the rules to politicians, and that they wanted people to see politicians for who they were, quote unquote, warts and all. And this, I think, was a strategically bad decision, because what they failed to account for is that politicians don't just have political power; they also have network power.

When you have power in a network, you can direct people to do different things. You can organize large-scale protests. That has been, for many, a virtue of the internet: relatively anonymous individuals can use the internet, its technology and infrastructure, to coordinate, organize, and plan large-scale social movements. And we rewarded that use case with lots of praise over the years, everything from Occupy to Black Lives Matter to Standing Rock.

 It's been immensely transformative, but in the wrong hands for the wrong ends it's a different technology entirely. It's a different infrastructure. 

TAYLOR OWEN - HOST, BIG TECH: [00:31:41] It seemed like Twitter, in their post about the ultimate account ban, signaled that intentionality of users. They said, look, we're not just looking at what he said; we're looking at how he wanted it to be interpreted, and how it was received and interpreted by others. Is that a move in the direction you're talking about? Where, when we look at speech, we need to --

JOAN DONOVAN: [00:32:12] --look at the context and the consequence. This is where the rubric of incitement is important. In the midst of the siege on the Capitol, we focus on the deaths, but how many people were injured? How many reporters were punched and kicked and dragged to the ground? How is our government going to come together across party lines ever again, knowing that Republicans were in favor of not certifying the electors, right? These are existential questions. But the damage that they caused stems from that moment I described earlier, which is that Trump was airing the grievances and then had told people they had no other option than to make the Capitol hear them.

Peter Kafka and Kevin Roose on big tech's power and responsibility - Vox Conversations - Air Date 1-18-21

PETER KAFKA: [00:33:08] We can't make the internet go away. But for purposes of the thought exercise: if Twitter and Facebook and YouTube went away, are we better off or worse off? Are we better off having centralized structures that are built for growth and for frictionless distribution, versus just stuff accumulating in weird corners of the internet where maybe we can't see it?

KEVIN ROOSE: [00:33:34] That's a good question. It's something I've thought about a lot. I mean, the argument is, if you break up Twitter or Facebook or YouTube, then five smaller things replace it, and maybe they don't have the same AI detection systems or capabilities. Maybe they're not run by people who particularly care about removing violent incitement from the internet. Maybe it's just a bunch of Gabs and --

PETER KAFKA: [00:33:56] a guy in the Philippines running 8kun.

KEVIN ROOSE: [00:33:58] Right? Exactly. So I think this is a real question, and I think you have to evaluate it on two questions. One is this sort of contagion effect: how likely are people on a given platform who don't go looking for extremist content to be exposed to it anyway, through algorithms, recommendations, suggested pages and groups? And then there's the question of the hardened extremists, the people who are already part of extremist movements, and what their community dynamics are.

So I think the upside of having lots of little social networks would be that the damage in some ways could be contained. You could say, okay, these are the bad networks where the violence and the hate speech and the incitement is going on. And we can deal with those, but they're not likely to jump the fence to these other social networks and contaminate discourse there. 

PETER KAFKA: [00:34:52] That's the bad part of town. Don't go there. Don't go to that club. 

KEVIN ROOSE: [00:34:56] Exactly. Yeah. And I think this sort of federated model is something that we haven't really probed yet. The example that I tend to use is Reddit. Reddit has notoriously had problems with moderation and hate speech and all kinds of garbage, but it was able to contain them. Because the structure of Reddit is this network of mini-networks, that bad behavior was largely contained to those subreddits. And when they decided to nuke those, it wasn't like they had to go cleaning up their entire . . . you know, history subreddits didn't become infected with stuff about QAnon. They were separate parts of Reddit, and so you could deal with the problem in one part because it hadn't yet spread everywhere. So the upside of having lots of little social networks that don't have billions of users is that you're less likely to encounter this kind of poison-in-the-water effect.

Insurrection on Capitol Hill Heightens Focus on Social Media Radicalization - The Mehdi Hasan Show - Air Date 1-13-21

MEHDI HASAN - HOST, THE MEHDI HASAN SHOW: [00:36:00] Last year, The Guardian found many of the conspiracy theories that contributed to last Wednesday's events were thriving on Facebook even after a supposed purge of QAnon groups; over a hundred pages, each with thousands of followers, popped up shortly after. When Facebook removed the Stop the Steal group, which had over a quarter of a million members, in November after it called for violent protests, new groups with new names and the same plans formed online. We're talking about a platform, remember, that according to an internal presentation reported in the Wall Street Journal knows its own algorithms exploit the human brain's attraction to divisiveness. In fact, according to a 2016 internal presentation, they found that 64% of people who joined an extremist group on Facebook did so because the group was recommended to them in "Groups You Should Join" and "Discover" pages. And Facebook did nothing about it. Now it's reportedly seeing signals that more violence may be to come, and the social media giant is trying to prevent extremist rhetoric becoming popular on its platforms. Talk about slamming the barn door shut after the crazy conspiracy theorists have bolted.

So what can be done, if anything? Who better to help us dive into all this than Facebook's own former chief security officer until 2018, Alex Stamos. Alex is the director of the Stanford Internet Observatory and an adjunct professor at Stanford University's Center for International Security and Cooperation.

Alex, thanks so much for coming on the show. Let me ask you this very bluntly. If Facebook didn't exist, would we have seen the kind of attack on the Capitol that we saw last week, those kinds of crazed people?

ALEX STAMOS: [00:37:44] Well, I think if Facebook itself didn't exist, there would be something else that people would be using.

What's happened since the election is we've had this fracturing of the right-wing ecosystem, where these different platforms are being used for different kinds of radicalization or mobilization. And when you talk about Facebook, Instagram, and some of the bigger, more popular platforms, a lot of that is about driving the idea that the election was stolen and creating the sense of grievance that is then taken advantage of by these groups and perhaps amplified on different platforms. But they do that based upon the basic, incorrect misinformation that was spread mostly by very large accounts on places like Facebook and Twitter.

MEHDI HASAN - HOST, THE MEHDI HASAN SHOW: [00:38:28] Yeah. The problem, of course, is that even with all of that fragmentation and with what's been going on, I just feel like with every group that's involved in this, all roads lead back to Facebook, whether it's QAnon, whether it's Stop the Steal, whether it's actual far right, neo-Nazi type groups. And then you have Sheryl Sandberg say, well, it's not our fault.

It's the blaming of Parler, the scapegoating of the new conservative kid on the block, which is a bit weird given Parler had, what, less than 10 million users. Facebook has 3 billion and a long history of ignoring this problem. I mentioned the Wall Street Journal reporting on those internal slides, which made the problems clear to the Facebook boss class, and they just ignored it. Alex?

ALEX STAMOS: [00:39:11] So QAnon is interesting, because it's a great example of how you can have a group with a radical core that exists off of the large platforms, but that is able to use the large platforms to recruit new members to the cause. And if there's anything that QAnon and some of these extremist groups are like, it's how the Islamic State used Facebook and Twitter and some of the large platforms in the 2016 to 2017 era. The core of the Islamic State organization used a platform like Telegram for its communication, but then they would go to a Facebook to try to recruit people who were in the periphery, pull them into their ideology, and then pull them off platform, where they could be highly radicalized.

And so it is true that Parler plays a part, that Gab and a bunch of point-to-point encrypted messengers are important. But the important part could be a Facebook, because they have the audience; you have to consider how your audience is being attracted to, sucked off into, pushed to those places. I think Facebook has not done a good enough job of policing, of understanding that even if content on the platform is itself not violating a specific rule, it can be part of an overall strategy to radicalize people and to suck them to a place where you might have really direct organization or calls to violence.

Joan Donovan on how platforms enabled the Capitol Hill riot Part 2 - Big Tech - Air Date 1-21-21

TAYLOR OWEN - HOST, BIG TECH: [00:40:31] I'm wondering how you think our policy conversation needs to change in a way that it can get at structure and get at some of these complexities of the way the network functions and of the actors in it and of the design of the system, all the thing we know to be true. How can the policy agenda adapt to that? Because it's not there now, I don't think. 

JOAN DONOVAN: [00:41:00] I mean, it's a good question. I'm inclined not to believe that these companies really want legislation. I'm inclined to believe that there's a move happening where, on the one hand, they'll say in a very chaotic moment: do whatever you gotta do, we're over here, legislate. And then, in the background of all of that, they're employing teams in DC and other people to move the needle away from business models, away from oversight, and to offload the responsibility onto other professions, like the whole fact-checking world. The offloading of responsibility onto other professional sectors is something that we talk about as a true cost of misinformation, which is to say that we could actually figure out what the costs are to journalism that has to pick up the slack.

The other part that I'm inclined to think about, when they say yes, we want this, but no, not that way, is to look at the openness of the advertising systems, how they're used adversarially, and how we've seen the weaponization of these advertising systems. When they do roll back people's use of them, we get different kinds of effects than when their advertising systems are fully open to political operatives. And The Markup has some really great research about this using their Citizen Browser project, where they were able to show that the moment Facebook reopened the pathway for political advertising, there was an attack on Warnock in Georgia built around a cheap-fake campaign, which is basically a very short clip of him saying "God damn America" from 2013, where he was quoting someone else. So that is to say, I think the scale needs to be the target of the policy, as well as the business model around openness that produces so many of these ill effects.

And the fact of the matter is that when these companies say we welcome regulation, they mean of a certain kind and type, and one that specifically doesn't require them to either break up their businesses or create limits for the kinds of profit that they take in or profit sharing downstream. 

TAYLOR OWEN - HOST, BIG TECH: [00:43:32] Facebook's campaign calling for regulation, which is advertised everywhere now -- in newspapers, online, on all the main political newsletters -- very clearly has more effort going into it as a PR campaign than as meaningful reform. And you consistently hear "we want smart regulations" --

JOAN DONOVAN: [00:43:53] Yeah, but that's a move, right? You offload responsibility and say these politicians wouldn't do it either. Like, "We tried; we were just doing our thing" --

TAYLOR OWEN - HOST, BIG TECH: [00:44:02] "We've been asking for it. We've been begging" --

JOAN DONOVAN: [00:44:05] I've been begging for you to clean the dishes; you won't do it. And the dishes are still dirty though.

TAYLOR OWEN - HOST, BIG TECH: [00:44:11] And you blame me for breaking the plates! 

JOAN DONOVAN: [00:44:15] Yeah. Which is why I'm just a lonely researcher over here in my silo, unable to see beyond my own self-interest.

But I do know one thing, which is that all of the evidence is there, that these companies are trying to blame everybody else for the design, like it's actually a design problem. 

The other thing is, you imagine technology to be fast and flexible and responsive, but it's really not when you layer on so much bureaucracy, as well as a CEO model that is more of a charismatic figurehead model than a programmatic approach. You end up in this kind of mess where -- I make this joke a lot, but it's still funny to me -- Zuckerberg and Dorsey are the highest paid content moderators on the planet, thanks to Trump.

TAYLOR OWEN - HOST, BIG TECH: [00:45:05] And where was Dorsey? He was in the Polynesian islands, wasn't he? Making that decision too --

JOAN DONOVAN: [00:45:11] Yeah. Get him out of his sleep chamber, get him to comb that beard, and then get him to read some of these Trump tweets and make some decisions: well, does it stay up? Does it get a label? Does it get this? Does it get that?

TAYLOR OWEN - HOST, BIG TECH: [00:45:25] You testified to Congress recently: 

JOAN DONOVAN: [00:45:28] This emerging economy of misinformation is a threat to national security. Silicon Valley corporations are largely profiting from it while key political and social institutions are struggling to win back the public's trust.

TAYLOR OWEN - HOST, BIG TECH: [00:45:41] But you ended by saying, what would it mean to uninvent social media? What did you mean by that? And what are the stakes in that? 

JOAN DONOVAN: [00:45:52] I'm a big fan of Donald MacKenzie and the STS scholarship on uninventing the bomb, right? This idea that we brought a technology into this world that means we can end this world, right? That's what the bomb means: the biggest bomb means everybody dies, the mother of all bombs.

And so I think a lot about the social shaping of tech and this idea that we don't really often think about how to roll back innovation. We don't think about how to uninvent things.

We don't have an imagination for a future without something that has been poorly designed and threatens our entire existence. And in the case of war technologies, of course, it's a little bit clearer where we go, which is to say: we have auditing systems, we have negotiations, there are peace treaties. There's an immense amount of hand-wringing and bureaucracy that is applied to nuclear weapons technologies, because this technology exists now, was brought into this world, and is so dire and so terrible that, in the wrong hands, it could cause massive pain.

And so when I think about how you uninvent social media, it's thinking with that lens, thinking with those ideas: how do we approach regulation and prioritize those who are going to be harmed -- killed -- by unbridled social media, open and working at scale? You end up in this position because you don't imagine a world without the technology as it is designed.

TAYLOR OWEN - HOST, BIG TECH: [00:47:50] Yeah. And as it is today, importantly. Because these are constantly evolving, but we always take the current moment as the baseline. 

JOAN DONOVAN: [00:48:00] Exactly. Exactly. And the current moment right now is that we finally have seen the important impacts of openness and scale on the public. Which is to say that, looking back, Charlottesville looks like a precursor, an ominous "if you don't do something now, something worse will come" warning. Only because the future has to happen, right? It has to play out. And for every person that could have made a decision and didn't, or tried to make a decision and was thwarted, that kind of inaction is so important for us to understand sociologically, as the rationale by which we will get into another situation like this, when these groups figure out how to remount an attack with newer and bigger consequences.

Your Nation's Attention for the Price of a Used Car - Your Undivided Attention - Air Date 10-6-20

ZAHED AMANULLAH: [00:49:03] In 2014-2015, I think the world was looking at the dramatic propaganda that was coming out of ISIS in Syria. For most people, when they think about extremist recruitment online, they think of those ISIS videos and the production value and all of that. In developing countries, those resources aren't necessarily there, but the recruitment is no less effective. In Kenya, the first thing that we did when we went there to analyze what was going on was to map the extremist landscape, and part of that was done by getting researchers to explore some of the pages that were being put up on Facebook that appeared to be recruiting for Al-Shabaab.

Al-Shabaab had a long history going back to the embassy bombings in 1998, when they were influenced by Al-Qaeda, and they grew in Somalia and were pretty well-known by 2014-2015. Al-Qaeda was still operating, ISIS was relatively new at the time, but Al-Shabaab was the real threat. Al-Shabaab was explicitly recruiting from the Kenyan coastal regions because of the grievances that people in the coastal regions had toward the Kenyan government, and they found ready and willing recruits there, many of them young people.

If you engaged with those pages recruiting for Al-Shabaab, if you liked those pages, if you shared a comment from those pages, that was a window to get at vulnerable people. And so, as soon as our researchers would like a page, they would get inundated with friend requests, tagged in photos, messaged, and invited to online meetups and offline meetups. And again, all of this wasn't very explicit. It wasn't "we're Al-Shabaab, come join us"; it was more of a slow grooming process, not unlike what you'd see with gangs or other groups around the world.

TRISTAN HARRIS - HOST, YOUR UNDIVIDED ATTENTION: [00:50:45] This mirrors a lot of what past guests on Your Undivided Attention have talked about. Renee DiResta, the Russia disinformation researcher, has talked about how, with conspiracy theories, if you followed one conspiracy theory on Facebook, let's say Pizzagate or something like that, [Facebook] would say, "Oh, people like you also tend to like..." the anti-vaccine group, the chemtrails group, the flat earth group, et cetera.

And so one thing I think people underestimate is the power, in terms of the human brain, of surround sound. If you join a few of those groups, then you start to get a surround sound effect where your news feed is gradually filled up more and more with more of those memes. And I think that the non-linear impact of surround sound versus just hearing it once from one group is something we tend to underestimate.

What do you think about that? 

ZAHED AMANULLAH: [00:51:27] And that's exactly how we were able to be exposed to so many groups over a short period of time. It was a truly immersive experience. We were really worried that there was actual physical risk if we were to go too much further down that road. So we actually had to come up with another way of monitoring content without putting our researchers and our partners in Kenya at risk.

We interviewed a lot of NGOs that themselves had a higher risk tolerance than we did, and a lot of them were willing to meet people. Some of them had worked with former Al-Shabaab extremists. We actually had to ask them to pull back and say: okay, we're not going to do this engagement anymore, because it's clearly getting out of hand. What we are going to do is focus on the messaging, based on your experiences and based on what we've learned from the online mapping, and create something that will resonate with the target groups that are being influenced by these extremist groups. We really just wanted to turn our focus to that.

We put together a really, really complex picture of the languages the groups used: mostly Swahili, also Arabic. There was some English content, but that English content appeared to be translated through Google Translate from Arabic source text. Some of that text came from Al-Qaeda. So it was a bit of a haphazard mix of materials, but all with the same purpose: basically, how do we spread this net out using this new tool that has mass reach, and get as many young people as we can to join our movement?

So we had this body of data that we put in the lap of Facebook, and we said, "You have to know that this is happening right under your nose in East Africa. Can we talk to your subject matter experts who are supposed to be looking for this stuff?" And they didn't have any. And again, not just people who speak the language, but people who understand the nuances that groups use to get noticed, and the trends and so forth.

TRISTAN HARRIS - HOST, YOUR UNDIVIDED ATTENTION: [00:53:16] What was that like for you, in that moment, when you discover all of these patterns and then you bring it to Facebook? And this is in what year?

ZAHED AMANULLAH: [00:53:21] 2017, before the elections took place in November. In fact, when we started the project, the first thing we did was to reach out to their teams in East Africa. Google had just set up, and Facebook was just putting it together; they didn't even have an office in Nairobi for us to go to. Their subject matter experts were in Palo Alto. We felt it was a huge vulnerability. At the time we were six months from the presidential election, and we knew that this had the potential to destabilize society in a way that the election ten years earlier, in 2008, which killed over a thousand people, didn't. So all of a sudden, this turned from a project that we thought was purely academic into something where we thought lives were really at stake.

TRISTAN HARRIS - HOST, YOUR UNDIVIDED ATTENTION: [00:54:01] What's an example of something that you would know if you were on the ground with subject matter expertise, that you wouldn't know if you're sitting in Menlo Park, having lunch at nice cafeterias, and thinking about East Africa?

ZAHED AMANULLAH: [00:54:14] Well, the window for us was talking to the dozens of civil society organizations in Kenya who had firsthand experience: getting specific pictures of young people who disappeared overnight and were later found with Al-Shabaab in Somalia, or who had escaped from Al-Shabaab and were captured by the government and disappeared by the government; piecing together stories about the messaging that they had engaged with online and how that may have led to their disappearance.

It was putting that picture together. We're so used to looking for keywords, in English I might add, that we missed that picture. In English we talk in memes and coded language and dog whistles; imagine a dog whistle in Swahili. So that's the kind of nuance that we're missing in developing countries around the world, especially where you have languages that aren't universally spoken, like Swahili.

TRISTAN HARRIS - HOST, YOUR UNDIVIDED ATTENTION: [00:55:01] I think people tend to underestimate the propaganda power of some of these organizations. The production value might look low to an outside audience. Could you talk a little bit about what people might underestimate about the power of terrorist propaganda?

ZAHED AMANULLAH: [00:55:14] When we look at some of the ISIS propaganda that came out, the production value of it was the big game changer: this stuff could be done on a production level equivalent to what Hollywood could produce. But you don't need high production values to be effective in recruiting. A lot of the stuff that we saw coming from Al-Shabaab in the region was actually crude. That didn't mean it was less effective. What's important is that something appears genuine and authentic. And remember, Al-Shabaab is building on grievances that already exist, and to do that, they didn't need to be that sophisticated. Their recruitment was, and continues to be, very effective.

TRISTAN HARRIS - HOST, YOUR UNDIVIDED ATTENTION: [00:55:48] And what lets us know that it's so effective, even though it has low production value? Because I think one of the themes that you're tracking here is that the cost of doing recruitment is going down over time, and the cost of doing highly personalized recruitment is going down: the ability to reach exactly who you want, the youth in the specific geographic areas that have the grievances you want to target, and then the actual, literal dollar cost. I think you said, when we first met, that it costs less than $10,000 to reach the entire country of Kenya.

And if you think about it, $10,000 might be hard to find in Kenya, but it's not hard to find in just about any Western country, where you want to basically say: hey, I'd like to own the next election, or I'd like to cause grievances or tribalist violence. And look how cheap that is.

ZAHED AMANULLAH: [00:56:31] It was free, really. It wasn't even low cost. All they had to do was create accounts, create pages, and look out for those who were engaging with that content. That was all they needed to reach potential recruits. And surely this is happening all over the world, wherever you have this mix of grievance, conflict, and extremist groups that know how to use these platforms.

Final comments on Social Media Externalities

JAY TOMLINSON - HOST, BEST OF THE LEFT: [00:56:51] We've just heard clips today, starting with The Weeds discussing the difficulty of managing speech. Newsbroke from AJ+ broke down the profit motives of the platforms, which explain why they waited until the very last minute to ban Trump. On the Media discussed breaking up Facebook and what it would and wouldn't solve. Off-Kilter spoke with Zephyr Teachout about her book Break 'Em Up. We heard a piece of Sacha Baron Cohen's 2019 speech to the Anti-Defamation League. Big Tech spoke with Joan Donovan about the emergence of the Stop the Steal movement. Vox Conversations discussed the structural benefits of more dispersed mini-networks like those that exist on Reddit. The Mehdi Hasan Show discussed social media radicalization in the context of the insurrection. And continuing her discussion in part two, Joan Donovan, still on Big Tech, discussed some of the policy changes we need to consider.

That's what everyone heard, but members also got a bonus clip from Your Undivided Attention, in which they discussed the role of platforms in the destabilization of entire countries and the nearly frictionless recruitment strategies of terrorist organizations. For non-members, that bonus clip is linked in our show notes and is part of the transcript for today's episode, so you can still find it if you want to make the effort. But to hear that and all of our bonus content delivered seamlessly into your podcast feed, sign up to support the show at bestoftheleft.com/support or request a financial hardship membership, because we don't make a lack of funds a barrier to hearing more information. Every request is granted, no questions asked.

And now we are skipping voicemails for today. If you would like to record a message of your own to be played in a future episode, please do: you can call us at (202) 999-3991, or write me a message at [email protected], where I can turn your message into a VoicedMail to be played on the show.

Just a few things today. First, an update: I had talked on the show about a listener offering regular voicemailer V from Upstate New York a one-year membership as a gift, and I just wanted everyone to know that he got it. He got his gift membership and is getting immense joy from the bonus episodes. Which is a perfect excuse for me to tell you that, after some experimentation and testing, our plan now is to consistently give members conversational bonus episodes every other week or so, featuring not just myself, and not just myself and Amanda as it's been for a while, but now also our two research producers, Deon and Erin, who are going to be joining us. In our latest episode, which was just released this week, we discussed the other epidemic facing society right now, aside from the pandemic, and of course I'm speaking of the epidemic of optimism. Specifically, techno-optimism, as in: let's move fast and break stuff and just be optimistic that everything will work out fine. Timely for today's episode, because that is of course what the researchers have been working on recently. So you get a little behind-the-scenes understanding of our process as we research upcoming shows.

So we've done two episodes like that so far with the whole team, and I think they've been quite good, so you're going to want to check that out. Besides, it makes me feel like a real podcaster in a different way than this show does: getting multiple people on the line, fighting with recording logistics, and editing the final result of a conversation to make it sound as good as possible. It's like going back to basics for a podcaster, so that's been fun, for me at least. And we've been getting good responses from members as well.

Moving on now, there's one point I just really wanted to highlight from today's episode. In that last clip, Joan Donovan was talking about the additional cost, the real cost, of having misinformation permeate society because of these platforms. It made me think of the term "social media externalities." Externalities is the term that we use when talking about fossil fuel companies and how they spew carbon dioxide and other pollutants into the air or water or both. If we do not have regulation that forces those companies to pay for that pollution, then it's just an externality: they get to put the cost of their pollution onto someone else, and someone else has to pay that cost. It could be the cost of extreme weather destroying property that wouldn't have been destroyed without the increased energy of climate change. It could be the health costs of people suffering from that pollution.

So those are real costs that are actually being borne by real people, but not by the people or the corporations creating those costs. And social media externalities work the same way, which is exactly what Joan Donovan was alluding to, and I just wanted to emphasize that. She is pointing to the fact that misinformation and disinformation in society have real costs. You could even put it in dollars and cents, in terms of media companies, which are already struggling in this new media world, having to put resources toward fact-checking and debunking. They could be telling us about things that are just true and that we should be paying attention to, but instead a huge amount of effort is going into fact-checking and debunking. So that's a real cost. A really interesting point that she brought up.

And with that, I'll wrap up. Just a quick reminder about our referral program. The people who have been engaging in the referral program have been going wild over our super secret artwork. That's the big prize: you just have to refer five friends, and the reward you get is our secret artwork. Amanda and I put together the artwork ourselves. It's custom to us; you can't get it anywhere else. It's not for sale. We won't give it to you if you just ask nicely. The only way to get it is to refer five friends. And the people who've been experiencing this secret artwork cannot say enough good things about it. So if you want to engage in that, and obviously there's no reason not to, you're going to want to go to bestoftheleft.com/refer.

 Now, that is going to be it for today. Thanks to everyone for listening. Thanks to Deon Clark and Erin Clayton for their research work on the show as well as their appearances now on our bonus episodes. Thanks to the Monosyllabic Transcriptionist Trio, Ben, Dan, and Ken for their volunteer work, helping put our transcripts together. Thanks to Amanda Hoffman for all of her work on our social media outlets, activism segments, graphic design, web mastering, and on and on. And thanks of course, to those who support the show by becoming a member or purchasing gift memberships at bestoftheleft.com/support; that is absolutely how the program survives. 

For details on the show itself, including links to all of the sources and music used in this and every episode, all that information can always be found in the show notes on the blog and likely right on the device you're using to listen. So, coming to you from far outside the conventional wisdom of Washington, DC, my name is Jay!, and this has been the Best of the Left Podcast, coming to you twice weekly thanks entirely to the members and donors of the show, from bestoftheleft.com.

 

