Air Date 9/27/2024
[00:00:00]
JAY TOMLINSON - HOST, BEST OF THE LEFT: Welcome to this episode of the award-winning Best of the Left podcast.
Let's just say that it's not a coincidence that right wing authoritarians are on the rise at the same time as people around the world are having a harder time than ever figuring out what's true. That said, society is beginning to fight back.
Sources providing our Top Takes in about 45 minutes today include The New Abnormal, The Commonwealth Club World Affairs, Wisecrack, Zoe B., Tech Tank and On The Media. Then in the additional Deeper Dives half of the show, there'll be more in three sections: Section A. Social misinformation; Section B. Live by the algorithm; and Section C. Solutions.
Trump: Project 2025 Will Lay ‘Groundwork’ for Second Term - The New Abnormal - Air Date 9-22-24
JASON PARGIN: I'm someone who is terminally online. I have been since I [00:01:00] first got an AOL connection in 1996, I think, and have had many experiences of refreshing a feed, not of journalists, but of people who are rapidly trying to gather information ahead of the journalists, because the journalists are waiting to actually get sources and confirm things.
So I just did this yesterday, after there was another assassination attempt on the president, like I was refreshing Twitter and watching the bad information flow in and watching good information get discarded. It's something I've been doing now for more than half my life. So ultimately it becomes a story about that, about the movement of information through this bizarre system we've built.
ANDY LEVY - HOST, THE NEW ABNORMAL: Yeah. And look, there's a lot of stuff in the book about the internet and about social media. One of the characters in the book, a former FBI agent named Joan Key, says, or is thinking in the book, that TikTok is possibly the most addictive piece of software ever created.
You, in real [00:02:00] life, are someone who uses that platform a lot, and really well, I should add. Do you share Joan Key's belief in this case?
JASON PARGIN: In a book like this, I tried to include characters that represent every view, and I never wanted to seem like one person is voicing, like they're there to yell at the reader about "You need to get off your phone," because I am a creature of this ecosystem.
I would not have a writing career if the internet had not come along. My first novel was originally posted as a blog. It was one of the first big books like that, back in the era when letting bloggers publish books was seen as weird. So I cannot lecture anyone else about being glued to a screen.
But there is no question: the algorithms are not built for truth. They are not built to make you smarter. They are built to keep you doing one thing and one thing only, which is glued to a screen. So, automatically from the jump, no matter who is running it, no matter how honest the company is or how much you like them, [00:03:00] their incentive is not your benefit. Their incentive is to keep you glued to a screen. In a perfect world, if there was a news story and you wanted to find out what had happened, you'd find out what happened and say, okay, now my curiosity has been satisfied. I have learned what has happened. I'm going to turn it off and go outside and play with my children in the sun. The algorithms fail if you ever turn it off. So it's more like a casino thing. Obviously the casino's job is not to help you win money. The casino's job is to get you to stay there, glued to a stool, pulling the lever on the slot machine.
So the thing we are using to get us information -- and with TikTok, Gen Z uses TikTok as their Google.
ANDY LEVY - HOST, THE NEW ABNORMAL: Right.
JASON PARGIN: When they want to find out something, they search on TikTok. This is a habit. TikTok is not built for that. TikTok is built to hypnotize you. And I have more than half a million followers on there. I've been on there for two years now and have [00:04:00] grown a huge following. And I tell myself that I'm trying to be one of the good ones, like I'm trying to spread things that are actually true or things that are interesting and not trying to play up on people's fears or anything like that.
But if you try to do what you feel like is good, honest content, you will find yourself swimming against a current. And it's not that TikTok is specifically more evil than any other company. I don't trust Mark Zuckerberg more than I trust the owners of TikTok. They all have an incentive that runs counter to my interests.
ANDY LEVY - HOST, THE NEW ABNORMAL: Except for Elon Musk, who is the world's greatest altruist, I think.
JASON PARGIN: [Laughs] He just wants us to be better. That's all, his only motivation.
ANDY LEVY - HOST, THE NEW ABNORMAL: Yeah. So in the book, Abbott, who is the Lyft driver, and Ether, the woman who hires him to drive her and the black box, they head out across the country. And because they don't have any cell phones or computers or anything, they're blissfully unaware that they've become the subject of a ton of online scrutiny. Talk about the role of Reddit, both [00:05:00] in the book and in real life. You describe it as where the internet's unfathomable gush of data was gathered, sorted, and shaped into a satisfying narrative.
JASON PARGIN: That's -- it's so interesting to me because I've been around long enough to see the internet evolve from a series of chat rooms to a series of message boards to a series of blogs, and then finally around 2009, 2010, you start to see the social media era. And then when smartphones came along and started to become pervasive around 2012, everything totally changed, like everything had to be geared totally around something that could be easily browsed on a tiny screen.
My whole deal is that I have watched the nature of the flow of information change with each of those. For example, where there used to be a whole galaxy of message boards of niche interests, now it's pretty much just Reddit. The internet has one big message board and it's all divided up by the subreddits. It's like if you're a fan of, I don't know, [00:06:00] the Philadelphia Eagles, once upon a time, there'd be a bunch of niche fan websites and message boards where you could talk to other fans. Well, now by far the biggest gathering of Eagles fans is going to be the Eagles subreddit.
ANDY LEVY - HOST, THE NEW ABNORMAL: Right.
JASON PARGIN: So it has basically a kind of monopoly on that kind of discussion, which is permanently recorded, long form discussion where it's all threaded out and people can reply to each other. But because that now dominates, Reddit culture, which is a very distinct type of culture where it's heavily male, it's heavily libertarian, it's heavily atheist, kind of acts as an umbrella over everything in a way that I don't think anybody fully appreciates, especially if they were born after the internet was pervasive. Because the one advantage I feel like I have is that I was in my twenties before the first time I logged on, because I'm extremely old. It's hard to understand how platforms shape [00:07:00] information and shape misinformation until you've seen the difference between how people used to talk to each other versus how they talk to each other now. And with Reddit, which plays a role in the story, Reddit wants you to do this: they want you to follow a live thread on any kind of a breaking story.
And so, for example, infamously during the Boston Marathon bombings, Reddit sleuths all got together and immediately found the guy who did it, who it turned out not only didn't do it, but wasn't even alive at the time.
ANDY LEVY - HOST, THE NEW ABNORMAL: Right.
JASON PARGIN: And that's the kind of thing where the rush to be first, where we're not going to wait for a journalist to tell us what happened; we are going to ourselves dig through social media. We're going to put clues together. We're going to take this still photo from a security camera and decide, Hey, I've decided I know exactly who this guy is.
The way like voting on comments works, the way certain things rise to the top, the way it governs what becomes visible and what doesn't, [00:08:00] that all winds up pulling the strings on the discussion in a way that is not necessarily visible to you, if this is the only way you've ever known it.
NBC's Jacob Ward: How Technology Shapes Our Thinking and Decisions Part 1 - Commonwealth Club World Affairs (CCWA) - Air Date 1-31-22
JACOB WARD: So around January 6th, for instance, I spent that day monitoring all of the online streaming folks, typically on YouTube, who were streaming from the Capitol. And they're pulling in the live feeds on people's phones, which of course have gotten so many of those people arrested. And at the top of the YouTube screen, if you have a certain number of subscribers, you're allowed to institute what's called a super chat, which allows you to charge money for pinning somebody's comment to the top of the window for a few minutes, and you can set whatever price you want. YouTube of course takes a percentage of that and it's 20 bucks, 50 bucks, whatever, a hundred bucks. And I'm just watching [00:09:00] people bing, bing, bing, they're making a few thousand bucks a minute.
Now...
DJ PATIL - HOST, COMMONWEALTH CLUB: That's a few thousand bucks a minute live streaming over at the seat of democracy!
JACOB WARD: An insurrection at the Capitol. You cannot make it up. Exactly. You would not believe it if you saw it. So it is, yeah, it's Idiocracy, but not funny, right?
And so that profit incentive is a huge driver of this. When I was covering a lot of the Stop the Steal protest rallies at NBC, I would go see these people there who are streaming live. And it's so interesting because, to a space alien, they might look like their job is the same as mine. They've got lights, their hair's done, they're doing their makeup, right? And then they go live. But these people are leading the chant. And when you look at their Instagram feed, or you look at their super chat on YouTube, you can see they're making money in that moment. Now I am also being paid to cover this, but I don't get paid more per comment. You know what I'm saying?
DJ PATIL - HOST, COMMONWEALTH CLUB: You have an industry behind you ideally called [00:10:00] journalistic ethics.
JACOB WARD: Yeah, that's right. And I get fired if I make it up. If I lie, I get fired. So there's some guardrails around it.
But anyway, for me, learning about the grift was really powerful. And then there's a very brilliant woman named Nandini Jammi who runs something called Check My Ads. She was the co-founder of Sleeping Giants, which you're probably familiar with. And she has been doing all of this research about the ways in which online advertising funds all of these very scary publishers of all kinds of scary stuff. And she turned me on to the research that really set her on her path, which showed that there are all kinds of blacklist services that will spike certain published articles -- basically make it such that advertisers who don't want to be advertising next to sensitive topics won't appear next to certain news articles. And she discovered -- the research actually discovered -- that it is people covering [00:11:00] really important stuff who are being blacklisted off of these lists, such that for some Pulitzer Prize-winning brilliant people at the New York Times, for instance, no online advertising was appearing next to their work. So it's actually costing the New York Times money to run that kind of really important journalism.
So for me, when I think about misinformation again, I'm thinking, okay, there is a system here -- both dumb pattern recognition that nobody is equipped to question, and incentive structures -- that is fueling this stuff.
I also blame, as much as the next person, our tendency to just try and be tribal and crass and get attention. The attention economy is a really important part of this, all that stuff. But there's some specific machinery in there that I think we should be starting to think about how we're going to take a hammer to it.
DJ PATIL - HOST, COMMONWEALTH CLUB: One of the ones I will highlight, because this is how much I enjoyed the book, is -- I think it's in chapter two -- you talk [00:12:00] about this experiment of what happened when kids are just basically told they're on the green team versus the orange team, and what affiliation does as a powerful psychological motivator. And it gets me to one of the big questions in here. The way I almost want to describe it is: could you talk about this addiction of people who have too much time on their hands, because technology is freeing them up? They can take drugs, get into a stupor, and detach from the world. The same thing happens with gambling. You see a version of that. You see a version of this with people who don't have alternatives to spend their time on -- work or other things -- getting into these forums where they get radicalized, not just here in the United States; we see it around the world. And it's almost this version of, we're using technology to free ourselves up from time, but then, when your free time comes together with despair -- the note I wrote is "free [00:13:00] time plus despair equals opportunity to take advantage of people, and technology accelerates it."
I'd love for your reaction to that.
JACOB WARD: Yeah. I'm very interested in what you say. I'm not sure that I blame free time as much as I blame despair in the equation that you have there. And I also think that social isolation is a huge part of that as well.
So, one of the common threads in the book: there's a 14-year-old kid who lost his mom, and was deeply isolated in Florida, who wound up going down this rabbit hole of, quote unquote, "race realism," and all of this stuff, and wound up adhering to all kinds of white supremacist ideology.
And he turned out to be Muslim. That's right. His parents were Bosnian Muslims who escaped genocide. And he nonetheless wound up down this rabbit hole and became somebody who believes in white supremacy. That kid couldn't have been sadder or lonelier than he was. He was deeply looking for connection and was not able to find it.
Another [00:14:00] character, who fell prey to online casino simulators, was also a deeply lonely and sad person. And here's the thing: what I'm starting to understand is that there are marketing mechanisms out there that find people who exhibit those conditions. I don't know about you, but a lot of my pandemic -- as soon as I turned in the book, I went hard at TikTok for a while and would doomscroll my way through hours of it, until -- and here's what happens when you get to a certain point in TikTok -- a video comes up that says you've been scrolling really fast; you should slow down. There's a little warning that says you've been going too fast, slow down. And then eventually it'll say, you've been looking at videos for quite a while. You want to take a break?
Meanwhile, every ad I get is for ADHD medication. And I'm sure anyone out there listening to this who's been on TikTok recently has gotten these ads as well. Huge amounts of ADHD medication.
Now, maybe everybody's getting that. Maybe that's just a blanket kind of advertising campaign. I don't think so. [00:15:00] I think that inside that company, there is a pattern recognition system that says this guy is exhibiting the classic signs of X, Y, and Z; serve him an ADHD ad. It is not just the affinities that we have and the hobbies we exhibit and what we post about. It is the way we behave that is showing our inner state. And I think that we are being analyzed in that way. That is the loop. That's what's starting to grab us. And as they get better at noticing that -- I'm not actually diagnosed with ADHD, people. And there's a whole problem with advertising ADHD medication to people who have not been clinically diagnosed.
But putting all that aside, the qualities they have spotted in me, and are feeding me information as a result, basically the way TikTok is for me, it's like doing drugs. I do it for a couple of months, and then I have to erase it off my phone, because I cannot control myself with that app.
And so, yeah, there's an inner state being analyzed here, that I think is a huge part of this. And, maybe it is extra time on our hands, but I don't know about you, like half [00:16:00] of Americans can't put together an extra $400 in an emergency right now.
I don't think time is our problem.
Streaming is Changing Politics...Is That A Good Thing? - Wisecrack - Air Date 9-20-24
MICHAEL BURNS - HOST, WISECRACK: It's clear that authenticity is critical to the success of political streaming, but of course, it's not just about being cool and relatable, because some of these guys aren't cool or relatable. They're kind of weird and unrelatable.
It's that the very structure of the medium creates moments of truthfulness in both the streamer and their audience that are almost impossible to achieve in other forms of digital media.
Now we can better understand this via the work of French philosopher Henri Bergson and his concept of pure duration.
Now, for Bergson, duration is a way of thinking about time, or a type of time, which operates in distinction from clock time. It's not mathematical or mechanical. Like I have a little watch right now, and the second hand is going 50, 51, 52, 53, right? [00:17:00] Sequential, linear, mathematical, things of that nature.
But duration is more to do with our experience of time. Imagine what it feels like to wait for 45 minutes in the waiting room of your dentist. Feels like an eternity. Now imagine what it feels like to spend 45 minutes catching up with your best friend that you haven't seen in a while. Probably flies by before you know it, they're on their way, and you wish that you could hang out for longer. Same amount of, linear time, but we experience it way differently.
And we see this at play in the unique temporal experience of watching a live stream. Now, unlike watching a cable news show, which is broken up into distinct segments and commercial breaks equaling precisely one hour, streams can often go on for hours without any segment breaks or time constraints.
To be clear, some streamers do have sponsors -- take a little break to talk about that -- but it's not quite the same as cutting away to, I don't know, some commercial about mesothelioma.
AD CLIP: How will this affect my loved ones?
MICHAEL BURNS - HOST, WISECRACK: But all of this creates a sense of flow that [00:18:00] affects both the streamer and the audience. And in doing so it can collapse the way that we think about and experience time.
Now think about it this way: okay, you're pretty unlikely to get home from work and just say, now I'm going to sit in front of a screen for five hours, watching a guy talk about politics, play some games, hey, maybe do some research. Instead, you probably just tune into your favorite streamer, and before you know it, you're caught up in the discussion, you're jumping in the chat, you're participating in an active way, and, then five hours have passed, and holy crap, you have to be up for work soon.
Now, for Bergson, this dissolution of clock time creates the space for "pure duration," where we're purely in the moment, fully experiencing our own internal lives.
Now, Bergson unfortunately died way before you could watch Hbomberguy stream Donkey Kong 64 for 57 hours straight, but we can consider duration in terms of music.
CLIP: Chunky Kong is my favorite.
MICHAEL BURNS - HOST, WISECRACK: Now, music does follow a specific temporal structure, right? It [00:19:00] conforms to a certain clock time. Maybe a song is in 4/4 at 110 beats per minute. But when you're caught up in a song that you love, your experience is one of pure duration. You're not thinking about measures and time signatures. You're just fully in it. This is why Deadheads can vibe out to a song for 45 minutes and then want more as soon as it ends. Cause when that music never stops, you feel free from everything else.
This feeling of temporal freedom is the exception to the rule of our normal lives. Which, let's be honest, are mostly governed by clock time, and more specifically, work time or work clock time.
Now, as Bergson writes, "We spend the majority of our lives outside ourselves, hardly perceiving anything of ourselves but our own ghost." He goes on to write that "we live for the external world rather than for ourselves. We speak rather than we think. We are acted rather than act ourselves." This means that our identities and our internal states of consciousness are shaped by rigid clock time. Maybe I'm this [00:20:00] person when I'm at home, and I'm this person when I'm at work.
But considered from a perspective of duration, I'm all of these things at once. And my identity and consciousness are constantly unfolding.
And streamers are able to tap into this, creating an experience of duration in which different elements of their consciousness or perspective emerge over time. Think about the stream marathons of someone like Ludwig, where dramatic variations in his behavior can create a palpable sense of authenticity.
Sometimes even a streamer's mistakes or lapses in judgment or random asides make them feel most real to us.
CLIP: Hawk Tua on that phobia.
MICHAEL BURNS - HOST, WISECRACK: Take this time that Hasan Piker, seemingly forgetting that he was streaming, got a little bit distracted.
But the sort of realness that emerges via duration doesn't just lead to personal details or accidental bro behavior. It also creates a particular experience for us, the audience, one in which I am actively listening and engaging and experiencing myself thinking in real time.
When this is happening, it can [00:21:00] feel like my identity is unfolding in dialogue with both the streamer I'm watching and the community that I'm engaging with. This can all create the conditions for the political streamer to truly guide the audience when dealing with tricky or contentious issues and policies. And when they do this, they might help the audience see past rhetorical traps and empty platitudes, ideally creating the feeling that you're learning alongside your favorite streamer.
And the authenticity of the medium, mixed with the freedom of duration, can then lower our guards. And that makes us maybe more likely to reconsider our opinions, or remain open to ideas we don't immediately agree with.
That might sound like I'm being too gracious, but I've seen this a little bit on our tiny little YouTube streams, where people in the chat kind of open their minds, debate with each other, change their minds on stuff. It's a very beautiful thing to see. Other times they just say, "This channel fell off. Why is he here?"
Now, we can see some of this in Piker's analysis of a debate between Ben Shapiro and Malcolm Nance, in which the audience gets all the [00:22:00] benefits of having a political commentator who can translate, add context, and fact-check in real time. This enhances the audience's understanding and equips them to critically engage for themselves.
And it's not just that political streamers are able to offer us more honest and authentic analysis than traditional news media. They also have a pretty dialectical relationship with their audiences, participating in an open back-and-forth that can shift the conversation or lead to digressions and insights that a streamer maybe wouldn't arrive at on their own.
The duration of these streams and the communities they engender really do open up new types of interaction between host and audience, one where, unlike most experiences of our media consumption, you feel like being in the audience actually matters. We see this in an instance like Piker responding to his audience when they had concerns about the rise of anti-woke rhetoric.
The unique temporal nature of political streaming might then be one of the reasons that it creates such impassioned communities, which is all well and good. But if you're like me, you [00:23:00] might still be a little skeptical of the actual political efficacy of any of this. Because sure, political streamers draw audiences as large as traditional news outlets, but are they as politically influential? Now, this question feels especially relevant when we know that networks like Fox News and CNN or podcasts like Pod Save America have exhibited the power to influence voters.
Here's the thing that makes streaming so interesting on a theoretical level: the temporality that comes through an experience of duration might actually be the problem. By keeping folks locked into a consumptive trance, it's arguably doing the opposite of facilitating offline political activism.
Which raises the question, are we too busy watching political streams to actually participate in politics?
YouTube and the Death of Media Literacy - Zoe Bee - Air Date 9-2-24
ZOE BEE - HOST, ZOE BEE: For a lot of people with poor media literacy, numbers are a really easy metric for understanding media. Why would you worry about actually analyzing a film to understand its strengths [00:24:00] and weaknesses when you could just look at its Rotten Tomatoes score instead?
And on the industry side, making art objective and quantifiable makes it easier to figure out the most efficient method of dispensing that art, as we've seen with the rise of binge watching thanks to streaming services. And we're not just seeing this with movies, either. There's been a recent rise in book summary apps, and I find that absolutely fascinating.
Now, obviously, things like CliffsNotes have been around forever, but now we have ShortForm, Blinkist, and Magibook, which aren't helpful guides to books. They're literally just summaries of books. And something that all of these services emphasize is how easy they're making things. They take confusing ideas and explain them in plain and simple ways. Never get confused by a complicated book again. I don't know if you've seen these ads for Magibook, but they are wild. There's a couple of videos that I've seen from folks breaking down [00:25:00] exactly what they do and how terrible it is, so I won't get too much into it here, but suffice it to say, making something simpler doesn't necessarily make it better.
But if you see books not as art that's made with intention, but instead as a quantifiable and infinitely reducible data point, of which you need to sell as many units as possible, then of course it makes sense to reduce it down until it's as easy as possible to consume. But this focus on ease of use is also a big factor in another media literacy issue: the spread of misinformation online.
Part of this is because, as Mike Caulfield puts it, the primary use of misinformation is not to change the beliefs of other people at all. Instead, the vast majority of misinformation is offered as a service for people to maintain their beliefs in face of overwhelming evidence to the contrary. In other words, it's confirmation bias.
Social [00:26:00] media actually thrives on confirmation bias. Because while changing your beliefs is hard, maintaining them, having them catered to, being told what you want to hear, is easy. And like I've said over and over again, brains like easy. So, social media companies aren't incentivized to combat these algorithms, and instead just focus on giving users what they want.
And that's to say nothing of how much YouTube and Facebook actually directly profit off of grifters. But just because these systems aren't incentivized to change, doesn't mean that we're stuck like this. Now, unfortunately, a lot of solutions for these big picture problems are unsatisfying. Like I mentioned earlier, most articles and organizations focus on education as a panacea to media literacy issues, and I think that's just a lazy answer.
One of the books I read for this video is a non-fiction graphic novel, A Firehose of Falsehood: the Story of Disinformation. And I think that it can be a [00:27:00] really valuable resource, like it has some really interesting stuff on the history of propaganda, but this whole book talks about the firehose, the big institutional forces that overload us with information.
And its solution to that issue? Wear a raincoat. So, first of all, that metaphor really falls apart, because the danger of a fire hose isn't the water. It's the pressure. Sure, wearing a raincoat will probably keep your clothes from getting wet, but it's still gonna hurt. And second of all, like I said earlier, I'm skeptical of any solution that comes down to just teach more media literacy in school.
When you have such powerful forces bombarding you with information and misinformation and disinformation, it doesn't really seem fair to put the responsibility entirely on the individual. Rather than trying to protect yourself from [00:28:00] the firehose, doesn't it make more sense to just turn the hose off?
Now, to be fair to this book, the authors do offer other suggestions for how to protect societies from the proverbial firehose of falsehood. They bring up regulations on businesses, especially social media, as well as safeguarding free press and repairing the public sphere. And these are solutions that have been echoed by other media literacy scholars like danah boyd in her article, "Did Media Literacy Backfire?"
Now, I don't love every single thing she says in this article, but there are some gems that I think are really helpful for understanding the scope of our society-wide media literacy struggle. She argues that media literacy programs and solutions that focus on expert fact checking and labeling are likely to fail. Not because they are bad ideas, but because they fail to take into consideration the cultural context of information consumption that we've created over the last 30 years. Addressing so called fake news [00:29:00] is going to require a lot more than labeling. It's going to require a cultural change about how we make sense of information, whom we trust, and how we understand our own role in grappling with information.
In other words, we've built this huge, tangled, attention economy web and there's so much baggage associated with it, like all the algorithm stuff I brought up at the beginning of this section, that surface level issues like labeling misinformation aren't ever going to be enough. What we need to do is untangle the web.
The question of how we untangle the web is complicated. Some people, like Ben and Elliot, argue that we need big picture, radical, societal change.
ELLIOT: For one to just start radically changing systems of education, systems of media, just sort of radically undermine capital. You know, the more that you undermine capital and put power into the hands [00:30:00] of workers and in the hands of non-proletarian, working class people, in the hands of people who are underprivileged, in the hands of people who are disabled—the more that you do that, the less you'll feel there's a problem with media literacy.
ZOE BEE - HOST, ZOE BEE: But until the revolution happens, danah boyd suggests in her article that we instead get creative and build the social infrastructure necessary for people to meaningfully and substantively engage across existing structural lines.
This won't be easy or quick, but if we want to address issues like propaganda, hate speech, fake news, and biased content, we need to focus on the underlying issues at play. No simple band aid will work. Part of the problem with this, though, is that there's been an erosion of trust in a lot of the institutions that make up this social infrastructure.
For instance, people are becoming more skeptical of higher education, but especially of the humanities. [00:31:00] And the humanities are the home of media literacy studies.
BEN: Actual humanities education of, like, thinking and reading and writing is on the decline on the average in our society because we don't, like, assign a value to it, really.
LILY ALEXANDRE: I would situate this in the context of a growing resentment for, like, humanities and liberal arts in general. You know, I get the feeling that people don't understand, or can't quantify the benefit that those disciplines give. And, like, universities all over the world are slashing humanities funding to make room for more, like, STEM funding.
ZOE BEE - HOST, ZOE BEE: I'm not saying that these solutions are doomed to fail. I actually have a lot of hope for these kinds of things. I'm fatally optimistic about this kind of stuff. But, like boyd said, this won't be easy or quick.
A Citizen’s Guide to Disinformation - TechTank - Air Date 9-3-24
NICOL TURNER LEE - HOST, TECHTANK: We used to have decades where we were attacking disinformation based on these [00:32:00] falsehoods that, you know, essentially persuaded the collective conscience. It speaks to what Darrell's talking about in the area of climate and what we previously talked about in elections; they're also becoming like these advertising commercials. Is that right, Elaine? That people like to pick up on in the book?
ELAINE KAMARCK: Well, you know, the advertising example is an interesting one because we are fairly sophisticated consumers these days of advertising, and we are not sophisticated consumers of stuff that looks like news. And so one of the ways disinformation spreads is people make up fake newspapers.
You know, they'll just make up the name of the, you know, the "Santa Fe Evening Sun", right? And it turns out that newspaper doesn't exist, but they'll send along an article from a newspaper that doesn't exist, and people will say, Oh, that's from a newspaper, and they assume [00:33:00] that this news, this fake newspaper has all the fact checking and editorial oversight that a normal newspaper has, and it doesn't.
So, that's where we're really getting confused out there, and where the citizen is getting confused. And one of the things we urge the citizen to do is, look, if this doesn't make sense to you, it probably is something you should look into, okay? Don't take this stuff at face value and don't pass it on to your 400 friends until you think it's really true. And that sort of thing is, I think, a public service that we're trying to do in this book.
NICOL TURNER LEE - HOST, TECHTANK: Yeah, that's the part that I really found to be interesting, right? Because across those verticals that I mentioned that you discussed, you're essentially trying to embolden citizens back to action, back to agency over these issues.
You know, one other area I do want to dive into before we pass on what those tools [00:34:00] are for citizens, you've got a fantastic chapter on disinformation and race relations, which I found to be interesting because I think we still have, you know, a lot of memories of the 2016 election and the use of foreign operatives to sort of play off of our history and use that to the advantage of spreading more disinformation and essentially disenfranchising voters.
Why was it important, Darrell, to put that in this book? Right? Because I think, in and of itself, it's one of those areas where people will probably say, yes, there's disinformation, but this is a long history of a whole lot of other stuff. I just, I found it fascinating the way you talked about it in the book.
DARRELL WEST: I mean, disinformation is at the heart of race relations and, in fact, racism itself. Historically, there's the myth of African American inferiority. There were prominent Harvard professors who spread this and disseminated that viewpoint. In more recent times, we see [00:35:00] an overlap between race and crime and the idea that Blacks commit more crimes than whites, and therefore that became a vehicle to toughen sentencing patterns, and we ended up in a situation where, you know, it seems like three quarters of the people in prison now are racial minorities. So, there is a long history of disinformation that links up with racism. So, we just wanted to point that out.
We also have seen foreign governments and foreign agents play on the racial divisions in America. Like, whenever anything happens, basically these other countries see it as an opportunity to further divide Americans from one another. And so they will spread false rumors. You know, we just saw this in England, where there was violence that was committed and people blamed it on immigrants. And then there was a wave of anti-immigrant violence that took place. So, even in the contemporary period we're seeing a close tie [00:36:00] between disinformation, race relations, and ethnic conflict.
ELAINE KAMARCK: And one of the things that social science has shown for many years now is that communities that have a high number of immigrants in them actually have less crime, not more crime, than the general public.
And of course there's a reason for that if you think about it. Common sense. And we, by the way, keep coming back to common sense in this book. If you're an illegal immigrant in the United States, you are going to pay every parking ticket. You are going to be careful to walk in the darn crosswalk. I mean, you are not going to run the risk of putting yourself in the midst of the legal system, which could end up having you deported. So, ironically, the very fact of being an undocumented person in the United States makes you less likely, not more likely, to commit a crime. And [00:37:00] yet, given the information out there and the disinformation out there, you would think that immigrants here are coming here to rape and murder.
NICOL TURNER LEE - HOST, TECHTANK: That brings me to a question, too. I mean, we just came off the heels of the Democratic National Convention, where I think there was this play on maybe unraveling some of the disinformation that is out here on some of those issues as well, Elaine and Darrell. My question for the two of you is, as we're thinking about your book, and particularly this time, are we going to catch this disinformation trend in time or, Elaine, is it going to be in the next three to four weeks that it's going to ramp up? I mean, I think we're seeing those tropes play out in such significant ways that your book is so timely on this, right? So I'm just curious, coming off of the heels of this, is this something that, you know, citizens have to be aware of?
ELAINE KAMARCK: Boy, I think it is. And the question really goes to campaigns. You know, there's a lot of discussion in your world and [00:38:00] Darrell's world about the sort of legal aspects to fighting disinformation. But the fact of the matter is, in a fast moving environment, the law is just too slow. Okay? In a fast moving environment, the responsibility for fighting disinformation rests with the opposite campaign. And campaigns are going to have to spend a great deal of resources on literally just the constant monitoring of the internet, of the huge, huge internet, and the constant real time fight against disinformation. Because waiting for it to be proven, and waiting for somebody to have a subpoena brought to them, et cetera, you know, that does not work in elections. And so I think the campaigns are going to be spending a lot of time and a lot of money doing it. And we've gone so far as to propose that, in fact, under the federal election law, there'd be an exemption for monitoring disinformation in terms of [00:39:00] spending, just as there's an exemption for accountant and lawyer fees.
DARRELL WEST: I agree with that. I think the problem in an election campaign is this stuff just happens so fast. It gets seen by millions of people sometimes in a matter of hours. And so it's hard for other candidates to respond. It's hard for the media to respond. And then there's the risk that people are going to end up making up their minds based on false narratives. And I think this is particularly worrisome with the undecided vote. Like, 95 percent of Americans have made up their mind in this presidential election. They are not likely to be persuaded by disinformation, but there's a question about that last 5 percent: On what basis are they going to make up their minds? And I think that's the part that I worry about in the coming months.
Enshittification Part 3: Saving The Internet - On the Media - Air Date 5-19-23
CORY DOCTOROW: Well, I've got some good news for you, Brooke, which is that podcasting has thus far been very enshittification-resistant.
BROOKE GLADSTONE: [00:40:00] Really?
CORY DOCTOROW: Yes, it's pretty cool. Podcasting is built on RSS.
BROOKE GLADSTONE: I know that. It stands for Really Simple Syndication that lets pretty much anyone upload content to the internet that can be downloaded by anyone else. The creators of RSS were very aware of how platforms could lock in users and build their tech to combat that. In turn, podcasts are extremely hard to centralize.
CORY DOCTOROW: Which isn't to say that people aren't trying.
BROOKE GLADSTONE: Like Apple?
CORY DOCTOROW: Oh, my goodness, do they ever. YouTube. Spotify gave Joe Rogan $100 million to lock his podcast inside their app. The thing about that is that once you control the app that the podcast is in, you can do all kinds of things to the user, like you can spy on them. You can stop them from skipping ads.
The BBC for a couple of decades has been caught in this existential fight over whether it's going to remain publicly funded through the license fee or whether it's going to have to become privatized. It does have this private arm that Americans are very familiar [00:41:00] with BBC Worldwide and BBC America, which basically figure out how to extract cash from Americans to help subsidize the business of providing education, information, and entertainment to the British public.
BROOKE GLADSTONE: The BBC created a podcast app called BBC Sounds?
CORY DOCTOROW: That's right. One of my favorite BBC shows of all time is The News Quiz.
GAME SHOW HOST: Welcome to The News Quiz. It's been a week in which the culture secretary suggested that BBC needs to look at new sources of funding, so all of this week's panelists will be for sale on eBay after the show.
[laughter]
CORY DOCTOROW: You can listen to it as a podcast on a four-week delay. [chuckles] You can hear comedians making jokes about the news of the week a month ago, or you can get it on BBC Sounds. From what I'm told by my contacts at the Beeb, people aren't rushing to listen to BBC Sounds. Instead, they're going, "There is so much podcast material available, more than I could ever listen to. I'll just [00:42:00] find something else," and that's what happened with Spotify too.
BROOKE GLADSTONE: Spotify paid big bucks like hundreds of millions of dollars to buy out production houses and big creators like Alex Cooper and Joe Rogan in an attempt to build digital walls around their conquest's popular shows just to see their hard-won audiences say, "Hmm, I'll pass."
CORY DOCTOROW: Now, Spotify is making all those pronouncements, "We are going to, on a select basis, move some podcasts outside for this reason and that." Basically, what's happening is they're just trying to save face as they gradually just put all the podcasts back where they belong on the internet instead of inside their walled garden.
BROOKE GLADSTONE: Maybe it's because of the abundance of content or because, like the news business, people are used to getting it for free. Podcasting seems resistant even though no medium is safe from what Doctorow is describing. Enshittification sits at the intersection of some of our country's most powerful players, entrenched capitalist values, [00:43:00] and the consumer's true wants and needs. How do you see our future?
CORY DOCTOROW: I have hope, which is much better than optimism. Hope is the belief that if we materially alter our circumstance even in some small way that we might ascend to a new vantage point from which we can see some new course of action that was not visible to us before we took that last step. I'm a novelist and an activist and I can tell the difference between plotting a novel and running an activist campaign. In a novel, there's a very neat path from A to Z. In the real world, it's messy.
In the real world, you can have this rule of thumb that says, "Wherever you find yourself, see if you can make things better, and then see if, from there, we can stage another climb up the slope towards the world that we want." I got a lot of hope pinned on the Digital Markets Act. I got a lot of hope pinned on Lina Khan and [00:44:00] the Federal Trade Commission's antitrust actions, the Department of Justice antitrust actions, the Digital Markets Act in the European Union, the Chinese Cyberspace Act, the Competition and Markets Authority in the UK stopping Microsoft from doing its rotten acquisition of Activision. I got a lot of hope for people who are fed up to the back teeth with people like Elon Musk and all these other self-described geniuses and telling them all to just go to hell. I got a lot of hope.
Note from the Editor on bad media diets and brain worms
JAY TOMLINSON - HOST, BEST OF THE LEFT: We've just heard clips starting with The New Abnormal explaining the change algorithmic casinos have brought to the internet. Commonwealth Club World Affairs spoke with NBC reporter Jacob Ward about the differences between fact-based journalism and the wild west of live streamers. Wisecrack dove deeper into the effect of live streamers on politics. Zoe Bee expounded on media literacy. TechTank discussed the impacts of [00:45:00] disinformation that society is not trained to handle. And On The Media explored enshittification and how podcasts have avoided the worst of it. And those were just the Top Takes. There's a lot more in the Deeper Dive section.
But first, a reminder that this show is supported by members who get access to bonus episodes, featuring the production crew here at Best of the Left discussing all manner of important and interesting topics, often making each other laugh in the process. To support our work and have all of those bonus episodes delivered seamlessly to the new members-only podcast feed that you'll receive, sign up to support the show at BestOfTheLeft.com/support. There's a link in the show notes, through our Patreon page, or from right inside the Apple Podcasts app. Members also get chapter markers in the show, but I'll note that anyone, depending on the app you use to listen, may be able to use the time codes in the show notes to jump around the show, similar to chapter markers. So check that out. If regular membership isn't in the cards for you, shoot me an email requesting a financial hardship membership, because we don't let a lack of [00:46:00] funds stand in the way of hearing more information.
And one more quick note, we are in search of a new volunteer transcriptionist. If you would like to join the team and help put our transcripts together, please send an email to [email protected]. Thanks.
Now, before we continue on to the Deeper Dives half, I have a few random thoughts, slightly more random than usual.
The first is to give credit to a sitcom from about eight years ago, because I just saw it and they just made a clever joke about giving kids phones and the problem with the internet. It's the show You're the Worst. And there's a boy in it, maybe about 13 or so. And in one episode, he needs to be found by an adult. He's sort of out in the world. And so an adult finds him and asks, "Hey kid, don't you have a phone?" And the kid says, "Nah. My parents are afraid I'll become a YouTube celebrity if I had a phone." Which I find pretty great on multiple levels. There's the classic subverting the expectation of what the parents would be [00:47:00] concerned about, classic joke form, but then there's the underlying reality that being a YouTube celebrity is actually awful, and parents should be concerned about that. So, congrats on that joke from 2016 I just heard.
My second thought is something that I came up with during a recent bonus show, but I need to share it here so it gets wider traction. I think I have an important contribution to the future of internet literacy. I was describing my theory about how the modern internet and phones are a disruptive technology to society, similar to how cars were a hundred years ago. People used to be able to simply walk in the road whenever they wanted, without fear of death, just like we used to be able to go on the internet without fear of an algorithm feeding us conspiracy theories. Right?
Well, when cars came along and started killing people on a regular basis, there [00:48:00] were a few ways that we tried to change culture and laws to protect people from being run over. And at the time -- I am personally familiar with this -- "Jay" was an insult term, meaning basically a sort of dumb bumpkin kind of person. So "jaywalking" wasn't just a misdemeanor; it was actually an insult. So it was a way of using shame to help push people to make better decisions for their own benefit.
So I thought we could do the same for modern internet and all of the ways it's trying to feed us disinformation. People need to be pushed to be more discerning in what they believe about what they see online.
So I figured that RFK Jr. is a good modern example, having recently admitted to getting sucked in by online misinformation and not being able to recognize AI images and the like on a regular basis -- that he could be a good point of reference. So when someone believes something [00:49:00] false that they saw online, you should ask them, "When did you get your brain worm? You believe stuff you saw online without checking the source? I didn't know you also had a brain worm." And by insulting people, we can steer them to make better decisions. That's my idea.
And the last thing I have to add is the phrase "deep doubt." I came across it in an article while prepping for this episode, and I think it describes well the informational predicament we find ourselves in. I'll link to the article from Wired, "Welcome to the era of deep doubt." It goes into more detail and history on the subject.
But to cut to the Suggestions portion, and then to add my own bit to it: getting trustworthy information basically comes down to this. Number one, try to seek out reliable sources. It's sort of obvious, but you've got to remind people, right?
But, number two is also really important, because no single source is [00:50:00] right all the time. So after trying to find reliable sources, number two, seek out multiple sources. Only by hearing from multiple perspectives can we ever really hope to get a well-rounded, contextualized perspective on anything.
And then number three, this is my own addition: it is long overdue time to Make RSS Great Again. If you're not familiar with RSS, don't worry. It stands for Really Simple Syndication, and you are actually using it right now, in all likelihood. So it's not scary. Podcasts run on RSS. Blogs, which have fallen a bit out of fashion recently, also run on RSS. And you can get just about any source of media through an RSS reader instead of an algorithmic feed. Instead of saying, I like this, I want to be shown more of it in my [00:51:00] feed, you can just subscribe to a source in an RSS reader, and then get every article, every video, every piece of information that source sends out. You know exactly what you're getting. No guessing, no casino involved.
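To make that concrete for the technically curious: an RSS feed is just a public XML file listing everything a source has published, and a reader app only has to fetch and parse it. Here's a minimal sketch in Python using only the standard library; the feed contents and episode names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document -- the same format a podcast app polls
# for new episodes. (Contents are invented for illustration.)
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Podcast</title>
    <item><title>Episode 1: Media Literacy</title><link>https://example.com/1</link></item>
    <item><title>Episode 2: Algorithms</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

def list_entries(feed_xml: str) -> list[tuple[str, str]]:
    """Return a (title, link) pair for every item in an RSS feed,
    in the order the publisher listed them -- no ranking involved."""
    root = ET.fromstring(feed_xml)
    return [
        (item.findtext("title", ""), item.findtext("link", ""))
        for item in root.iter("item")
    ]

for title, link in list_entries(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

A real reader would periodically re-download the feed's URL and compare the item list against what it has already shown you. That simple polling loop is the whole "recommendation" system: no engagement scoring, no casino.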
Personally, I subscribe to dozens of sources that I consider to be reputable, and they span the spectrum of perspective and temperament. I even include some with more radical opinions or more conservative opinions, just to round out my perspective. I read each source understanding its context. And by reading multiple sources talking about the same issue from different perspectives, I get a greater context and a more three-dimensional view.
Now look, don't get me wrong. My style isn't what I recommend for everyone. It's my job to read this much, and it's not healthy. So don't take it as a suggestion. But the basic idea of curating your [00:52:00] own set of trusted sources, making sure to throw in some more light and funny stuff for entertainment, is easily superior to the algorithmic alternative.
And look, it's not that I don't see the appeal of the algorithms. They're a slot machine, after all. Feeds are made to be fun, so fun that they're addictive. But to me, they're like going to a restaurant that serves really good food, but 10% of the time you order, your food will be just a little poisoned. We're not talking about a lethal dose, but you're going to get ill clearing your system, one way or the other. And maybe it's only one out of a hundred, maybe it's even only one out of a thousand times. But one of those meals you get from this great restaurant with all the fun stuff is going to give you a brain worm. And you're going to be in for a world of hurt. So, that's the risk you're taking. To me, no thanks. Not worth it.
SECTION A: SOCIAL MISINFORMATION
JAY TOMLINSON - HOST, BEST OF THE LEFT: And now we'll continue to dive deeper in our quest for truth [00:53:00] in three topics. Next up, Section A: Social misinformation. Followed by Section B: Live by the algorithm, and Section C: Solutions.
Charges against Telegram CEO sparks debate over balance of free speech and responsibility - PBS NewsHour - Air Date 8-29-24
STEPHANIE SY: French authorities have charged Telegram founder Pavel Durov with several offenses related to his messaging app. The charges include complicity in the distribution of child abuse images, drug trafficking and failure to comply with law enforcement requests.
Durov, who operates Telegram from Dubai, was apprehended over the weekend and was ordered to pay five million euros in bail. The arrest of the Russian-born tech billionaire has sparked a free speech rallying cry in some circles and has raised questions about how other social media executives may be held accountable for their platforms.
Joining me to discuss the implications of this arrest is Pranshu Verma, technology reporter for The Washington Post.
Pranshu, thank you so much [00:54:00] for joining the "News Hour."
Before we get into the ramifications of this arrest, tell us why Telegram is in law enforcement's bullseye and what brought about this unprecedented arrest of the company's founder?
PRANSHU VERMA: So Telegram is a wildly popular messaging app, mostly in places like Russia, the Middle East, and South Asia.
About 950 million people use it. And it's a way to send private chats or public broadcasting messages to large — hundreds of thousands of people. And it's also a way to send individual kind of encrypted chats as well. So it melds two types of messaging into one app.
And now this kind of app is pretty good in some ways, for example, if you're a dissident and you want to organize a protest against an authoritarian government. But it's also become a haven for some of the worst activity online, such as the sharing of child sexual abuse imagery. [00:55:00]
And so that's kind of what's made Telegram into the bullseye of the French authorities now: they are basically saying that the owner of Telegram, Pavel Durov, is complicit in making Telegram spread child sex abuse imagery, kind of spread organized crime, and also not complying with law enforcement when law enforcement wants user data about criminal activity.
And so, as you saw this weekend, this all culminated into a head when Pavel Durov landed in France outside of a Paris airport and was arrested and has now been issued charges around these types of activities.
STEPHANIE SY: But the other platforms, as you know, including platforms like Facebook and Instagram, have also been accused of hosting nefarious activities, including sexual abuse imagery of children, extremism, scammers.
How is this different? How does Durov differ from his peers when it comes to that? Do the other platforms, for example, cooperate much more with governments and law enforcement?[00:56:00]
PRANSHU VERMA: There's no doubt that platforms like Meta and Twitter do host similar types of content.
But what Telegram is very specifically known for, and actually boasts about, is its reluctance and often complete noncompliance with law enforcement in sharing user data. So even if a law enforcement official comes to Telegram, it is their policy, as they boast even on their own site, to have shared zero bytes of user data with governments to date.
And that's what makes it really different from all the other tech companies is that kind of strong noncompliance.
STEPHANIE SY: Elon Musk and other tech figures have posted their support of Mr. Durov on X. And a lot of people are asking what his arrest means for the heads of other similar platforms.
Should folks like Mark Zuckerberg, for example, be concerned about facing similar accusations? And could you see him being arrested if he travels to Europe? [00:57:00] And do you see these charges being levied against a tech executive in the United States?
PRANSHU VERMA: Yes, this is the big question here. It's opened up a can of worms.
Are the people who own the tech companies liable for the content that is on their platforms? Now, in the United States, there's a rule in law that shields companies from being held liable for the content that they have put on their sites.
But in Europe, there is a little bit more of a strength around holding tech companies accountable. And you have seen now, in this case, kind of the most muscular act to date of a government official — of a government holding a private official of a company to account. And it is unlikely that we would see it in the United States where somebody like a Mark Zuckerberg or Elon Musk is detained for what is on Facebook or on Twitter.
But now the question becomes, what happens if that activity happens in Europe or elsewhere, and will governments kind of respond in kind? And we don't know the answer yet.
Beyond the Grifterverse - Pillar of Garbage - Air Date 9-13-24
PILLAR OF GARBAGE - HOST, PILLAR OF GARBAGE: [00:58:00] Let's start by taking a look at the DOJ indictment. The document opens with some relevant context, which is probably worth including here. I quote, After Russia invaded Ukraine in February 2022, RT was sanctioned, dropped by distributors and ultimately forced to cease formal operations in the United States, Canada, the United Kingdom and the European Union.
In response, RT created, in the words of its editor in chief, an entire empire of covert projects, designed to shape public opinion in Western audiences. As we'll go on to see, that shaping of public opinion is sometimes specific and targeted, but RT's broad goals in the West are perhaps best summed up by the words of an RT journalist to an academic researcher the indictment quotes a few pages later.
"I asked my editor, what is RT's line for this? And he said, anything that causes chaos is RT's line." That context aside, the body of this [00:59:00] indictment is the story of one of those covert projects: a Tennessee-based online content creation company it refers to as US Company One, but which we can identify, by matching up some website copy the document provides later, as Tenet Media.
At the end of 2022, following a period of direct work for RT, Tenet's two founders were approached to work with a fictitious Paris-based investor by the name of Edouard Gregorian, a figure whose name was repeatedly misspelled by his would-be representatives and for whom a Google search returns no results.
Hold on to that fact for later, by the way. It's clear from private communications, though, that the company's two founders knew Gregorian wasn't the real deal. Between themselves, they referred to their backers as, quote, the Russians, the same term they'd used in prior correspondence to refer to RT.
There's also the fact that one founder, while awaiting a response to an invoice sent to the ostensibly Paris-based Gregorian, Google searched "time in [01:00:00] Moscow". Despite this, Tenet never disclosed to its viewers that the content was sponsored by RT, and Tenet's founders never registered with the Attorney General as agents of a foreign principal, which is, you know, illegal.
But who are those founders? Well, since US Company One is Tenet Media, we can fairly easily find out that Founder 1 and Founder 2 are Lauren Chen, the Turning Point USA, PragerU and BlazeTV affiliate we've discussed previously on this channel, and her husband, Liam Donovan. And while it's not straightforwardly apparent who everyone mentioned in the document is, it is clear that the collaborators Chen and Donovan eventually hired and paid with that Russian money included Tim Pool, Lauren Southern, her BlazeTV co-star Dave Rubin, and her old Turning Point buddy Benny Johnson, among others.
The indictment reveals that Chen was making tens of thousands of dollars per month coordinating Tenet, and that some of these contributors were getting paid hundreds of thousands of dollars per [01:01:00] video. All in all, the indictment states nearly 10 million dollars trickled from the Russian state into RT, through shell corporations, and into the hands of Tenet and their contributors.
Tenet's YouTube channel officially launched in the autumn of 2023, and in the year since has amassed over 300,000 subscribers and over 16 million views. I would be showing you footage of all this, by the way, but the channel's been deleted by the time I'm editing this. Over that period, the influence RT held over Tenet's output reportedly grew more direct.
The initial contacts who set up the deal with Chen and Donovan (who, by the way, all shared an IP address, as did the so-called Gregorian, suggesting that all these personas were in fact RT handlers) directed Tenet to work with Kalashnikov and Afanasyeva, the RT employees being charged, under the guise of hiring them as an editing firm.
But before long, the pair were corresponding with commentators directly, and by June 2024 had even started [01:02:00] posting their own content directly to Tenet's sites. Allegedly, they also successfully persuaded Chen to have commentators cover directly supplied talking points. And all this was ticking along nicely (at time of writing, their last published video was released just yesterday) until that indictment was released, and Tenet's viewers, Tenet's collaborators, the internet at large, and the mainstream media discovered the business was funded by the Russian state.
We'll get into the fallout and what this all means shortly, but just before we do, it's probably worth addressing the elephant inside the elephant inside the elephant in the room.
Russia bad? The less politically engaged among you might wonder why the Russian connection here even matters. Why is this being treated as a smoking gun? Some of the more politically engaged among you might wonder why I'm going with the Russia bad framing here. Isn't that just some jingoistic cold war relic?
Aren't we past that? If [01:03:00] America bad, wouldn't that make Russia good? Well, in a nutshell, no. The flaws a lot of folks, particularly on the left, find in the US and the West, warmongering, imperialism, corruption, worker exploitation, inequality, restrictions on civil liberties, all that, are just as present, if not far more so, in Putin's Russia.
You guys, it's still colonialism even if they didn't use boats. Anyway, what you need to know here, and I'm obviously simplifying things, is pretty much that the jabronis in charge of today's Russia have a bit of a nostalgia fixation, and think it'd just be swell if things went back to the good old days.
Not to the USSR and whichever genuine leftist sentiments may or may not have kicked off that period of history, but to Russia's imperial period. The only problem there is all the Eastern European, Caucasian, and Central Asian countries and citizens that kind of like not being under the Russian yoke, some of whom have made allies in the West for precisely that reason.
[01:04:00] Naturally then, the weaker the US, NATO, and the EU are, the more feasible Russia's clawing back of influence or land becomes, the more successful the jabronis themselves look, and the less likely it is they end up falling out of a window. Remotely weakening foreign countries without launching any missiles is pretty tough though, or it was before we decided to collectively hook ourselves up to the brain rot matrix that is social media, and before the people developing that matrix figured out that throwing in a bunch of black box recommendation algorithms would make them more money by dialing up the brain rot.
Or in more scientific terms, there's a functional misalignment between human psychology, which evolved to learn and adopt beliefs based on a host of social factors, and these algorithms, which are designed simply to maximise engagement. The result of this is the tendency for algorithmic media to amplify our own biases and create false polarisation.
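The "designed simply to maximise engagement" objective can be made concrete with a tiny sketch. This is an editorial toy example, not any real platform's code: the item fields, the names, and the 0.5 weight are all invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical feed items: every field and weight here is made up,
# purely to illustrate an engagement-only objective.
@dataclass
class Item:
    title: str
    predicted_clicks: float  # chance the user interacts with the post
    outrage_score: float     # emotionally charged posts drive engagement

def engagement_score(item: Item) -> float:
    # The objective only measures expected interaction; nothing in it
    # asks whether the content is true or socially healthy.
    return item.predicted_clicks + 0.5 * item.outrage_score

def rank_feed(items: list[Item]) -> list[Item]:
    # Show the most "engaging" items first, whatever they are.
    return sorted(items, key=engagement_score, reverse=True)

feed = rank_feed([
    Item("Calm policy explainer", 0.20, 0.05),
    Item("Inflammatory rumour", 0.35, 0.90),
    Item("Local news update", 0.25, 0.10),
])
print([item.title for item in feed])
```

Even in this toy, the rumour outranks the accurate items simply because the score rewards interaction; that gap between what the objective optimises and what readers need is the functional misalignment described above.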
So, and again, this is [01:05:00] something of a streamlined history. In the time since social media became an everyday part of western life, the Russian government has put more and more resources into waging an information war on this front, through tactics like bot farms, paid trolls, and, of course, content.
The Intentions of the Adversary: Disinformation and Election Security - Disinformation - Air Date 5-21-24
PAUL BRANDUS - HOST, DISINFORMATION: Information warfare efforts are robust, widespread, and increasingly sophisticated. But what makes them even more effective, a RAND study says, is that those efforts are taking full advantage of our own weaknesses and divisions, in other words, what we are doing to ourselves. In this regard, the Russians are hardly alone.
MAREK POSSARD: And so in many cases, it's not that Russia or China or these other countries are doing this. What, in fact, is happening is they're waiting for us to kind of essentially create a tactical opportunity that they exploit, and then they can amplify it further. So we're doing it to ourselves, and then our [01:06:00] adversaries essentially exploit it. And I think that's what happens in many cases with our elections, where there might be some one-off case. There might be a court case that one is trying to have adjudicated. And then our adversaries are going to jump in the mix and start trying to amplify this stuff online or in other mediums.
PAUL BRANDUS - HOST, DISINFORMATION: They're just piling on to things that we are doing to ourselves.
MAREK POSSARD: Oh, yeah. I mean, this issue of partisanship, and broader truth decay in our society, are really tactical opportunities for our adversaries. It is a Christmas gift to the Russians. It is a gift to the Chinese and the Iranians and other countries that are trying to harm our democracy.
PAUL BRANDUS - HOST, DISINFORMATION: And you say, rather disturbingly, that these are not individual silos, that these seemingly unrelated threats could happen simultaneously. Tell me more about that. How might that unfold? What should we be looking for?
MAREK POSSARD: I think the key thing we should be looking for here [01:07:00] is how one type of seemingly disconnected threat could suddenly relate to another threat. So if there's an attack on our critical infrastructure, such as our utility companies, and there is a partisan reaction and suddenly you start seeing it grow, we do have to ask ourselves, why is it growing? Is it actually homegrown, in terms of the reaction to some type of crisis, which could be an attack on critical infrastructure, a hurricane, or a cyber attack, or are our adversaries trying to amplify this further? And I think one thing that we're not particularly prepared for is having multiple adversaries jumping in at the same time. And so if there is some national crisis or a regional crisis that may affect our ability to carry out an election in a state or locality, and then suddenly you have Russian trolls online, you might have Iranian operations operating separately, to really just kind of mishmash this [01:08:00] crisis, are we going to be in a position to be able to adjudicate it accordingly, essentially, and say, what do we need to do to get this done, to carry out our elections? Or are we going to essentially just self-consume ourselves during this crisis?
PAUL BRANDUS - HOST, DISINFORMATION: At a recent summit between Russian President Putin and Chinese President Xi Jinping, Xi said there are "no bounds" to our relationship, meaning military cooperation, intelligence cooperation, economic cooperation, on and on and on. Is there any evidence that you have seen that they are coordinating their efforts to interfere with our election in any of the ways that you have described?
MAREK POSSARD: So I haven't actually looked at that question specifically, so I don't want to speak to whether or not that's happening. I will say, as a hypothesis, it wouldn't be super surprising. These are very cheap operations to carry out. You don't need to invest in a [01:09:00] 10-year weapons system and dump hundreds of millions of dollars in R&D to carry this stuff out. It really is a matter of: you have a direct pipeline, particularly with social media, to individuals. Essentially, all you're doing is figuring out who you want to target, and you pump out content. It would be not super surprising, to say the least, if Russia and China were somehow coordinating, either explicitly or implicitly. And that very well could just be that there might be a tactical opportunity that Russia finds and exploits, and then China jumps in in their own way, not necessarily coordinating every step of the way, but again, just finding that opportunity and being able to communicate it. You don't need to do a whole lot to gin people up, particularly in this type of election cycle, where I would suspect we're going through a political realignment. And so there are a lot of exploitable opportunities, to say the least.
PAUL BRANDUS - HOST, DISINFORMATION: And to your earlier point about our election system being decentralized, you really don't have to do [01:10:00] that much. And in the case of Russia and China, you really only have to look at the handful of swing states that are going to determine the election. It's conceivable they could target just two or three states that they think are going to make a difference and just focus on those. So, talk about, you know, asymmetric warfare: a very tiny effort could actually make a huge difference.
MAREK POSSARD: Exactly. And I think the key is that there's not really a huge payoff in trying to hack our voting machines, or trying to turn individuals who are local election workers, because it's such a decentralized system. There are a lot of different standard operating procedures across states. But you're right, there are a few states where the vote margins are very narrow. And if you can find a crisis that somehow relates to or is happening inside that state, it's not that hard to gin people up. And then you start casting doubt [01:11:00] more broadly on the election system based on one or two examples, or one or two crises that are occurring in a swing state, that could potentially directly relate to the national outcome because of our Electoral College. It's an opportunity for the Russians and the Chinese, the Iranians and others.
PAUL BRANDUS - HOST, DISINFORMATION: Let's shift, if we could, just for a minute to artificial intelligence. This is something that is far more top of mind than it was in 2020. Tell me about the impact of that in 2024 relative to four years ago.
MAREK POSSARD: So I have to hypothesize here. I would probably suspect that it's just going to pollute the information space. Well, I think there are two things. One, it's going to pollute the information space because it's just going to create more bullshit on the internet. And when you have more crap on the internet, it's difficult for individuals to disentangle what is true from what is a falsehood, particularly because you might have images and video that look [01:12:00] very, very realistic. The second thing is it allows our adversaries to scale their operations relatively easily. These are already cheap operations to run. You don't need a lot of money to stand up a server and maybe stand up some individuals to produce content, but this will allow you to automate that. Essentially, you're reducing the costs to enter into this kind of operation. And we already have a lot of crap on the internet. And now you're going to have AI producing more crap, essentially polluting the information space, making it harder for regular citizens to make an informed decision based on whatever policy issue is popping up in a discussion on a given day.
Is Social Media Fueling Far-Right Riots? - Tech Won't Save Us - Air Date 8-15-24
HUSSEIN KESVANI: There are lots of live streamers and stuff who go to these protests because again, like another way of making money out of chaos is by live streaming, apparently.
And so you have these guys and you've got them everywhere who will go to protests and be like, Oh, why are you protesting? And like a lot of these channels very clearly have an intention of [01:13:00] like, you know, we promote right wing views, but we try to do it under the guise of like, oh, we're just sort of going around asking questions.
You know, these people aren't racist, they just love their country, etc. But watching these interviews is really interesting, because even the people who are there, it's not to say that, oh, they don't really know why they're there. And maybe it's because I also have spent so much time online, but you can sort of see how the internet's kind of cooked their brains a little bit, or, like, quite a lot at times. I would encourage people to watch it, because it's interesting to see what happens when you're navigating a real world environment, but you sort of believe that what you've seen on the internet is real. Like, that is kind of the world that you understand. So, for example, one of them was asked, Oh, why have you come to this, you know, Southport protest, the right wing thing? And they'll be like, Oh yeah, I'm here to pay my respects to the little girls who have passed away.
Okay, fine. But then, a second later, they'll be like, yeah, but you know, all these [slur inserted there], like, you know, they've come in and they take all the money and they take all the jobs and they're setting up mosques everywhere. And, you know, there used to be a pub down the road and now that's a mosque, and everything [01:14:00] like that's not true.
Like, there's no evidence of that. It feels like an assemblage of stuff that they've kind of read or they've seen on their phone. Maybe some of it they've also invented in their head as well. So your sort of reality kind of becomes an assemblage of stuff consumed online, put together by people who sometimes have nefarious ambitions, but sometimes they'll just put stuff out there because you can do that.
Right? Like, what's the effect of just putting out fiction? You never know. I've also seen right wing Twitter accounts praise themselves for being able to insert pieces of misinformation that they felt got a lot more traction than they expected. It's kind of a game to these people.
I always go back to this thing that Adam Curtis said years and years ago, about how eventually the internet will become this place where you mostly go for entertainment, but you kind of never know what's true and what isn't. And it's not to say that the internet will be full of lies, but it's more like you'll approach everything with this idea that you don't actually know whether what you're reading is true or valid or whether it's not.
And for some people, that'll be a really scary experience, because [01:15:00] it'll be really dislocating and really detaching. But for other people, it'll be immensely entertaining, because again, so much of being online and so much of experiencing online is primarily for entertainment. The riot livestreams are a form of entertainment; it's not really journalism.
The stuff that Tommy Robinson does is primarily entertainment, and he knows it. In the week before the Southport attack, Tommy Robinson had a very big demonstration in London, where he screened an hour and a half long film, and the entire film was about why, even though a court told him to stop harassing this teenager, he refused to stop harassing a teenager. It was an hour and a half long film about how he was a victim because he couldn't stop harassing a teenager. It's, like, insane. But again, it frames his ideas. These people see it as entertainment, but it's entertainment that leads to kind of material effects. And so again, to kind of go back to British media, which is very right wing, it has sort of struggled to keep up the pace with it.
They cannot be the same type of entertainment platforms as all these other sort of anarchic [01:16:00] creators, some of them having right wing agendas and some of them having fascistic agendas, but some of them just wanting to cause chaos and mischief. And because they'll never be able to match that, their only choices, really, are to try to shut them down.
And in some cases, what's been interesting is, the right wing papers in the UK, oddly enough, today have had front covers which are like, Oh yeah, the night when the fascists were taken down. And it's insane to look at, cause it's like, well, but you didn't like the anti-fascists.
You've been printing stuff for years and years saying how they were destroying the country. You have kind of laid the foundations for something like this to happen. But then, you know, other right wing outlets, your sort of GB News, TalkTV, which are the very right wing end of it, are having to accommodate a lot of these content creators, purely on the basis that they know they're never going to get as much traction as these guys.
So I think it's a very messy media environment, one where a lot of chaos can ensue, but one in which the content creators, who don't have any strings [01:17:00] holding them in and no real regulation, are sort of like, I can create great entertainment by just framing Muslims as more of an existential threat than the Daily Mail ever could.
PARIS MARX - TECH WON'T SAVE US: Yeah, this will drive engagement and will really rile people up, so I'll get my viewers. But you were talking about how you can tell that some of these people have had their brains, like, cooked by the internet. And it's clear that one of those people is the owner of Twitter/X himself, Elon Musk, who has become a major right wing influencer in his own right.
We talked in the past, and, you know, there's been this ongoing conversation and discussion about how Facebook has helped to fuel right wing politics in the past because of the way that it has decided to treat its platform. We know that YouTube has pushed kind of right wing extremism in its algorithms.
Twitter is not immune from that, but since Elon Musk has taken over, the changes that he has made to the platform have made it so people on the right basically get boosted a lot more; all of this kind of right [01:18:00] wing misinformation, these right wing narratives, get boosted. And then he is also doing the work of boosting them, whether it is the anti-migrant stuff, whether it is the great replacement stuff, as you were talking about.
The explicitly anti-Muslim narratives. And now, with these riots going on in the UK, he has been participating directly in that, tweeting that "civil war is inevitable." I believe that was after the first night these went on, or the second, like, very early on. And just recently he retweeted this fake headline posted by Ashley Simon, who is co-leader of Britain First, a far right party, that talked about how the UK government was going to set up detainment camps on the Falkland Islands for these protesters, which was completely false, taken from a Telegram group, made up. But Elon Musk quote-tweeted it and said, "detainment camps." And, you know, it took a while to delete it. Like, what do you make of, one, I guess, how social media platforms in general kind of fuel this stuff, but also, when you have someone like Elon [01:19:00] Musk who is participating in that, how does it become so much more difficult then to try to rein this stuff in?
HUSSEIN KESVANI: I think this is such a good example of how the fiction is all that's important to these people, because in the aftermath of the riots, one of the things some of the right wing people who participated in them weren't expecting was that right wing media would kind of turn against them, or have the appearance of turning against them. Which is not to say that they've stopped believing in the same things; it's more just, all the optics of this are really bad, right? And so, we can't be seen with the rioters.
And so, one of the things I think they've really latched onto is the idea of, well, we have to be perpetually seen as victims, right? And so, yeah, we tried to burn down a hotel with children in it, but actually we were just doing it because we were scared for our children.
We were scared for, like, you know, our children's safety; why does no one talk about our children's safety, and so on. And so the element of victimization is really important. And this is where fake news content, or fake kind of images and stuff, become so important, because really what's happening is that the [01:20:00] reinforcement of the victim narrative is so essential for perpetuating this movement.
Like, they need these types of grievances and everything. And I also imagine that that probably is the thing that resonates with Elon Musk as well. He has to kind of perpetually see himself as a victim, because, you know, his platform's not doing great and advertisers don't want to be on it. And I imagine he's also becoming more and more alienated from people who used to be his friends. So he's probably not having the best of personal times right now.
But Elon has also always fallen for scams or fake stuff quite a lot. My impression, though, is that the reason why he seems to be going a lot harder on the UK is partly because we have a new prime minister who is, by his standards, a left wing socialist.
He is not a left wing socialist, by any means.
PARIS MARX - TECH WON'T SAVE US: What? I thought communism had returned to the UK.
HUSSEIN KESVANI: I feel like the UK is kind of, because of the riots... every time a riot happens, the MPs are always like, Oh, we need to ban BlackBerry, or we need to ban whatever the contemporary form of technology is.
And at the moment it's like, we need to ban social media, or we need to put really big [01:21:00] controls on, like, TikTok and Twitter and all that stuff. And Twitter in particular, because I feel like for lots of journalists and lots of middle aged people and stuff who still use Twitter as their primary news source, it was very obvious, and it's become very obvious to power users and stuff, like, Oh no, this is filled with fascists now.
Like, it's very evident that even despite how much you try to not see right wing stuff, it becomes more and more impossible, because of who's boosting what and the messy blue check system and all that type of stuff.
SECTION B: LIVE BY THE ALGORITHM
JAY TOMLINSON - HOST, BEST OF THE LEFT: Now entering Section B: Live by the algorithm.
How to save culture from the algorithms, with Filterworld author Kyle Chayka Part 1 - Decoder with Nilay Patel - Air Date 3-11-24
NILAY PATEL - HOST, DECODER: You have in the book a meditation on the concept of taste. And, I mean, literally throughout history: you dive into the history of people thinking about taste, and what it is and where it comes from.
And that dynamic you're describing is, someone has a little bit different taste or they make something a little bit different, and then suddenly everyone else has the same taste. Uh, the example that I actually really want you to talk about and help me understand is the Stanley cups right now, [01:22:00] where I've read a lot of, you know, pieces just about what a Stanley cup is and whether they have lead in them.
And why all these people are buying them. But it feels like it is a Filterworld product: the algorithm lit upon a cup, like literally just a cup, and then everyone was like, I am the cup. Like, the cup is my lifestyle. As you think about Filterworld, can you put the Stanley cup in the context of suddenly everything is the same?
KYLE CHAYKA: I think so. I mean, the Stanley cup was interesting in that the chief marketing officer of Crocs moved to Stanley, and Crocs had gone through this viral trend of being adopted by a lot of influencers and TikTok creators and stuff. And so this guy kind of turned the same process or strategy onto Stanley, and I think it's partly that they seeded the ecosystem, giving Stanley cups to influencers and stuff.
And it's just the fixation that the internet has on one thing at a time. So a Stanley cup [01:23:00] starts to become this go-to lifestyle accessory. First, for, like, Mormon bloggers, actually. That was an early adoption group. And then it becomes almost a kind of currency on TikTok and on Instagram, where memetically, like, if all my friends have this thing, I also have to have this thing.
If all the other influencers are making Stanley Cup content, then I also have to make Stanley Cup content. I think it's almost like, on Twitter we could see this happen with discourse subjects, like there was one subject of conversation each day, and either you were jumping into that conversation or no one cared what you were talking about, except now it's, we're doing that with visual trends and physical objects, it's like, you have to be holding up that Stanley Cup or no one's going to want to watch your content, so you're kind of forced to participate in the meme, in the trend, or otherwise you get ignored.
NILAY PATEL - HOST, DECODER: One of the tropes of Decoder is that distribution has an outsized impact on the actual content [01:24:00] that people make, which is a very obvious idea, but we just come back to it over and over and over again, because we spend so much time talking about platforms. You're thinking about Filterworld as a broad concept, right?
Algorithms on the internet shape the culture in some big way, but YouTube has a different kind of algorithm, right? Like, Stanley cups are not happening on YouTube. God forbid Stanley cups happen on X, what was formerly known as Twitter. That's a particular kind of TikTok trend that bleeds into everything else.
Do you think about the different platforms and their different aesthetics and what they prioritize and what kinds of culture they make?
KYLE CHAYKA: I think they all have different flavors. My pet theory, I think, is that each algorithmic feed, each platform, generates its own kind of signature culture that fits into it.
So we're familiar with, like, [01:25:00] Instagram face, the kind of influencer plastic surgery aesthetic. Um, we're familiar with TikTok influencer voice, which is the kind of monotone, syncopated, packing-as-many-words-into-a-sentence-as-possible delivery. So I think there are forms of content that work for each different platform.
And on YouTube, I mean, my favorite example of YouTube culture is lo-fi chill hip hop beats to study slash relax, which is this ambient, 24/7, never-ending stream of chill drum beats with acoustic instruments and electric synths behind it. And it's all different artists composing these songs, but they're just turned into this wash of, you know, ambiguous, semi-meaningless music.
And it's, like, that works for YouTube in a way, because you just leave YouTube on. It's this, uh, streaming background, and that's not how you use TikTok. On TikTok that wouldn't work, because you're constantly flipping through the feed, you're going to new videos. A TikTok video, to be successful, has to grab you and throttle your attention immediately, whereas this YouTube content can be ambient and chill and [01:26:00] homogenous in a soothing way. So I think there are these, like, quirks or forms that emerge from the structures of the platforms themselves.
NILAY PATEL - HOST, DECODER: Your earliest approach at writing about Filterworld was Instagram and Airbnb, what you were calling airspaces. As you expanded the concept into Filterworld, it's everything. Different social networks take on different levels of prominence in the culture. So Instagram is still huge, but I would say it is not a driver of culture in the way that it once was.
That role is now TikTok's. How do you see that waxing and waning? Why do you think that change happened? Is it just that young people use TikTok and that's it? Or is there something else going on?
KYLE CHAYKA: People get bored, I think, partly. Partly it's like, I mean, just as fashion trends change, technological trends change.
And I think we discount that too often. Like, when we use the same platform for five or six years, we tend to start getting itchy and wanting something else. And I think it's also been this kind [01:27:00] of gradual evolution of the internet from text, to more professionalized images, to audio and video, to TikTok, which is this kind of full-featured television, essentially.
Like, when we watch it, it's as if we're watching television. So I think the multimedia race has gone on and on, and that's changed things. It's also just, more and more of culture has moved onto the internet, I think. Like, digital platforms have absorbed different areas of culture that used to be more offline, whether it's, you know, a television equivalent like TikTok, or podcasts that used to be radio. Over the past decade, more things have gotten more online.
And I think that's been a major shift.
NILAY PATEL - HOST, DECODER: I'm always curious about the effects that participating in the platforms has on people and creators. The Stanley cup, to me, is actually a really fascinating example. When you are a big distribution network and you say anyone can participate [01:28:00] here, people will sort of naturally gravitate towards exactly what worked for someone else.
It's the first, easiest, most instinctive thing to do. And so you can see why things spread memetically: a bunch of kids are like, well, that worked for them, I will do the same dance and participate in the culture. And that is a conversation, in a way. Whereas, I think, broadcast programming directors, who were professionals, were like, well, we can't just copy that thing, we have to do something different, right? There's that element of, well, I'm paid to have better ideas than the next person, while a bunch of people working for free are like, I'm just going to do the easiest thing. You can sort of see the tension there. But I'm curious if you see it in a particularly different way with TikTok, because there's something about the culture of TikTok that not only rewards that repetition, but directly incentivizes it and makes repetition the actual content.
KYLE CHAYKA: Yeah, I'm sorry, I'm saying "pet theory" all the time, but another [01:29:00] framework that I have is that, in the current iteration of the internet, we're all just like middle schoolers running through the hallways. And so it's like, when you see some other kid wearing his hat backwards or something, you're like, Oh man, I'm going to wear my hat backwards right now.
It all filters out very quickly. The model of culture we have right now is more bottom-up: like, trends filter from a grassroots level upward and then get noticed. And I think TikTok rewards that repetition because you rehash someone else's content in order to participate. Making a new version of the same meme, as you were saying, is how you fill the vacuum of TikTok.
And it is how you interact with someone else and have that conversation. So rather than coming up with something new or trying to make a trend of your own, it's like the core behavior is [01:30:00] replicating a trend that already exists. And that's incentivized by the algorithmic feed, by the kinds of aesthetic tools that you have, like the recommendations of sounds to use, or video editing tricks to use.
And I mean, I think you can see that replication happen all over the place, like as Twitter, or X, became more algorithmic. You saw a rash of prompt tweets, just, like, people asking you to list your five favorite breakfast foods or something. And suddenly, because that worked for some people, everyone was doing it and being like, name the five opinions you have that everyone else hates. It's like, that's not good content.
It's a kind of race to the bottom, I think.
NBC's Jacob Ward: How Technology Shapes Our Thinking and Decisions Part 2 - Commonwealth Club World Affairs (CCWA) - Air Date 1-31-22
JACOB WARD: Here's what we're going to do, I think. We're going to, first of all, need to look deep inside these companies and make them civilly, and maybe even criminally, liable for, you know, the ways in which they've tried [01:31:00] to manipulate our behavior. I think that it's going to start costing these companies money.
Right now, human attention is treated as this kind of ephemeral thing. It's endless. But people smarter than me have been saying, no, no, no, it is like mining, and we need to regulate it. Not that we do a great job of regulating mining, but we need to get into these companies, I think, probably through lawsuits, and begin showing what they are knowing and doing.
Now, that's, for me, the first step. But I also think there needs to be a recognition about all of that, like the Star Wars we're watching. You know, when I watch Star Wars these days, Han Solo is being told by C-3PO, never tell me the odds, you know, leave me alone, nerd, right? He's always saying, you know, don't tell me. You know, Oh, you know, Captain Solo, the chances of survival are 10,566 to one.
Right? And he says, never tell me the odds. Listen to C-3PO. C-3PO should be the hero of that movie, because he's [01:32:00] right: you should not do this, you know. And our whole culture is geared, and has been since the 19th century, on this idea of rugged individualism: growth at all costs is good, we're going out to the West and pioneering our way out to a better life, you know, as opposed to thinking as a community about how we are going to support one another, and what if it all goes wrong?
For me, a big part of that is going to have to be making it socially acceptable to say, here are my mental predilections. So for me, I tell anybody who you know, wants to talk to me about it. Like I no longer drink. I think it's unfair to people who suffered from alcoholism to refer to myself as an alcoholic.
I'm not sure I fall fully into that category, but I absolutely cannot drink. I've learned that about myself. And I have also learned as a result that when people say to me, Hey, let's meet up and go to a bar. I say to them, no, I would love to take a walk with you. I would love to do this other thing, but I cannot go to a bar with you.
I used to drink and I don't anymore, and that's going to mess me up. Right. Being able to [01:33:00] say, TikTok has got me, right? Being able to say, I'm having trouble with this thing, you know, making it socially acceptable to look at the odds, right? To listen to C-3PO, I think, is going to be a really important thing.
And then the last thing is, I think we need to stop letting modern culture, as it's being dictated by some of the biggest companies, tell us our norms. So for me, right now I'm in the process, at the school that my children are at, of creating a pact with all the parents in the grades that we are in to not give our children personal smartphones until they enter high school at the very earliest.
And I can't tell you how complicated that conversation is. It's a very hard thing to have that conversation, because it involves admitting to your own difficult relationship with smartphones. You've got to sort of admit, as a parent, that you don't have any idea what your kid is really doing with them and what that might be.
And that you may not even know your kid fundamentally at all. It's a really hard conversation, but we have managed to get through it. And in fact, I'm on the [01:34:00] hook right now for being the guy who's supposed to write up the new revised pledge after a huge amount of really smart input. It makes my palms sweat to realize that I am on the hook for that right now, but it's going to require communities coming together and saying, nope, I'm not going to do that. Because, you know, the statistics show that the vast majority of parents get their cues about what's an appropriate use of technology from the ads for technology, from the cutesy Alexa ads in which the kid and the dog trigger Alexa by accident. It's not adorable.
You know, they're normalizing behavior that we have not actually signed off on. And I think that we should start coming up with some civic structures for saying no. It's too quick to say, oh, don't be a Luddite, as if that's some sort of terrible thing. If you read up on the Luddites, they're a pretty interesting group.
You know, I'm not saying we need to kick it all out of our lives. I love being here with you tonight, DJ, in this way. This is an incredible empowerment of our slow-thinking brain, what you and I are doing right now. Fantastic. But we have to recognize the [01:35:00] profit motive, the power, the way it's going to feel inexorable as pattern recognition systems make their way into our lives, and that we have to come up with some civic structures for pushing back on them.
And I think we can, we've done it before. We're going to do it again. I just think we need to speed it up a little bit.
Internet Poisoning (with Jason Pargin) - The Daily Zeitgeist - Air Date 7-23-24
JACK O'BRIEN - HOST, THE DAILY ZEITGEIST: Do you feel like there are trends? I think this is something we tried to do at Cracked sometimes: in addition to debunking, like, myths that get spread around, it's like, here are the types of lies that our brain or the internet tends to gravitate towards.
And, you know, like one that I would say, I feel like we're seeing this process of, like, internet focus-grouping and writers-rooming a real event in real time with the [01:36:00] attempted assassination of Trump, as we've referred to. And I think one of the themes that we're seeing there, and also in the CrowdStrike story, is that people have a real aversion to incompetence as being the explanation. Or, you know, accident, somebody fucking up.
It's just not a satisfying plot point in your movie. Like, if Die Hard had, just, like, the story had resolved itself because the hacker had accidentally detonated a bunch of the bombs while Hans Gruber was on top of the building, you know, something like that, and then it's just a fuck-up along the way.
That doesn't happen in movies, really, because it's not satisfying the part of our brain that craves novelty, and, like, good storytelling resists that [01:37:00] sort of thing. And so I believe, like, it's a bigger part of the story of the JFK assassination than we tend to think. And I think it's probably a bigger part of the story of the Trump attempted assassination than some people are willing to, like... I think it seems to be pretty surface level that there is a fuck-up there. But first of all, do you agree that that's a trend? And then, are there other kind of trends that you've noticed as you've kind of been studying these sort of habits?
JASON PARGIN: Like, I get that part of it is you just want to simplify the world. So for example, I have one extremely unpopular political opinion (this is the perfect time to get it out, when you're trying to sell a book and you've got it up your ass), which is that I think most of the world's problems, most of the things that frustrate you in your life, are not anybody's [01:38:00] fault.
I think the world's an imperfect place, and I think it's hard to run a society in a way that's perfectly fair to every single person. I think, uh, you know, lots of times when prices go up or whatever, it's not necessarily that some evil person has a scheme. It's just market forces, and a company is trying to maximize revenue because the shareholders demand it.
And, like, the blame for things spreads in so many directions that it just kind of disappears, because it's just a system that we're all trying to survive in. And that is incredibly unsatisfying. We would love to hear that there's a villain, because in a movie, if there's a problem like this... I don't know if you've seen the, um, Jason Statham film, The Beekeeper.
JACK O'BRIEN - HOST, THE DAILY ZEITGEIST: I have not, but I've heard a lot about it.
JASON PARGIN: Yeah, it's a great boomer fantasy of, like, everything that is terrible about the world, going all the way up to the [01:39:00] president. There's, like, a cabal of just cartoonishly evil people that, if you could kill them, the world would finally be at peace.
And that's very satisfying to think of, because, yeah, every movie's got to have a villain, a human villain that is causing the problems. Like even a film like The Martian, which is supposed to be all about, like, troubleshooting and smart people and competence porn, they still had to have, like, the villain character, the one guy who was being obstinate and saying no to all of their plans, because there's gotta be a bad guy.
And, uh, this is something that I think is true across the whole political spectrum. Everybody wants there to be a bad guy, and not just sometimes. Like with the pandemic: sometimes pandemics happen. We exist in nature. And, I don't know, I think most people did their best and most people didn't freak out.
And most people did what they thought was most reasonable. And I [01:40:00] don't think we like that. I think we like the thought of there being somebody we can yell at and hate. And then if we could get rid of them, everything would be fixed. That seems to be, to me, the most common bias, which is, I want to believe that somewhere there is a person, a bad person who has caused this, because then I've got an opponent and then if we could defeat them, everything would be fine.
The end. Most things in life are not like that, I believe.
MILES GRAY - HOST, THE DAILY ZEITGEIST: But in that version, does that sort of absolve people of any responsibility for, like, the actions of an organization that they belong to, you know, or are the figurehead of? Or how do you look at that piece of it? Like, I get the yearning to be able to say, this is where it's all focused,
and it's in these four or five people, kind of thing. But at what point... obviously there are systems that have lives of their own, but are you saying that everyone is just completely powerless against those things and nothing can be done? Or how do you square that part?
JASON PARGIN: I think that, for example, I could go on Reddit right now and I could find [01:41:00] memes talking about how the boomers ruined the world, how the boomers, when they were alive, jobs were easy.
Lifetime employment, houses were cheap. They had everything. And then they intentionally screwed over the next generation after them because they were so greedy and, you know, sociopathic and narcissistic. But if you could actually go grab a random boomer off the street, somebody in their seventies, and say, hey, why did you ruin the world?
He's going to say, I worked at a muffler shop for 40 years. What are you talking about? I rented for most of my life. I got to take a vacation, like, once every five years. You're talking about the CEOs and the politicians, not me. But it's like, no, we've now distilled all of the boomers into, like, one evil person.
And guess what, gang: whatever generation you are... like, let's say there's some Gen Z kids listening to this. A couple of generations from now, they're going [01:42:00] to blame you for what happens with AI. And you're going to say, I didn't do anything with AI. I thought it was stupid. I barely used it. And the kids in the future can say, well, why didn't you stop it?
And you're going to say, I don't even know who did it. I don't even know who was in charge of it. Every company just started doing AI, and suddenly there was AI in all my devices. And they're going to be like, well, why didn't you vote to stop it? Why didn't you boycott those companies?
And you're going to say, I was just trying to live my fricking life. I was trying to survive. No, I did not have time to go firebomb a server farm where they were operating ChatGPT-5. I was just trying to live. And so what you find is you get that same answer all the way up to the president, saying, look, people voted for me to carry out an agenda.
They could have voted for somebody else. This was the agenda. This is what I did. I did what I thought was right. This is the most terrible truth that nobody likes to face: that most people are doing their [01:43:00] best, and the flaws that happen are because you have different factions in society with different interests.
For example, housing prices. Every time somebody talks about why housing is so expensive, they want to come up with this theory that, like, there's one corporation secretly buying up all the houses. It's like, no. They may be doing that, but the issue is that half the country are already homeowners, and they like the fact that their house costs twice as much, because that's their retirement.
It's not a secret cabal of guys in a shadowy room. It's an entire section of the country, and their interests are separate from yours. And they're not billionaires. They're just retired dentists or whatever. And it's like, well, no, my entire retirement is based on this: I'm going to sell this house when I turn 70, and I'm going to move to Florida and rent a condo.
But yes, it costs 400 percent more than what it did when I bought it in 1995. No, I don't want housing prices to go down. This is, um, you know, my retirement right here. So [01:44:00] there's times when some people just want different things than you. And if you're always trying to look for a specific villain or a cabal or a conspiracy, you're going to be disappointed more often than not.
A lot of times it's just people acting out of short-term interests, or out of ignorance, or, you know, they're just being oblivious.
MILES GRAY - HOST, THE DAILY ZEITGEIST: Yeah, but, I mean, that feels like sort of a bleak... In that instance, how would we solve things if we're always willing to say, well, this person is just trying to do their best? Not that I don't get the point about trying to find, like, a cabal or a darker angle to explain certain things. But at a certain point, how would that worldview... how do we try to change things from that perspective?
JASON PARGIN: But things have changed. None of us would prefer to go back and live in the year 1924. Think about what you lose if you go back. Think about how many civil rights get rolled back. Think about how much shorter people [01:45:00] lived, how many more babies died in childbirth. Think about how much more contaminated the food was back then, and how nobody had air conditioning.
Like, we have improved the world immeasurably because, while everybody was yelling at each other, the normal people were just out doing their jobs and building houses and building safer cars. And there's bureaucrats that are just quietly passing, you know, ordinances that make things slightly safer.
JACK O'BRIEN - HOST, THE DAILY ZEITGEIST: Yeah.
JASON PARGIN: You know, none of us would go back and live a hundred years ago. Things were worse by, I think, every possible measure.
How to save culture from the algorithms, with Filterworld author Kyle Chayka Part 2 - Decoder with Nilay Patel - Air Date 3-11-24
NILAY PATEL - HOST, DECODER: One of the things that's interesting about that idea is the influencers are sort of buffeted by algorithmic pressure, right? So they have to go pay attention to everything everyone else is paying attention to, and that loses specificity. But then some of them rise above the others, and you get very powerful individuals who can make different decisions or take brand deals or whatever needs to happen there.
That's a very commercial thing: you can go give Stanley cups to a bunch of Mormon influencers and [01:46:00] make Stanley cups a thing, which is just a fascinating reality. Like, I don't think that has been true in the past. Next to that is the decline of media institutions, which are not supposed to be collections of individuals in that way.
They're supposed to be brands unto themselves, with their own kind of taste. You have the Meryl Streep monologue from The Devil Wears Prada in your book, right? It's an example of how people think about these institutions. In that case it's Runway, which is a stand-in for Vogue. And Vogue still exists,
right? It's still... it's still here, for now. Whatever's going on with Condé Nast is going on with Condé Nast, but Vogue still exists, and celebrities still want to be on the cover, because that institution still has power. My view is the platforms do not want any institutions to have power. They would rather negotiate with an infinite supply of burned-out individuals that all kind of do the same thing.
The history of, I think, 2010s media is the decline of these media institutions. Do you see a return to that? Like, someone [01:47:00] else has to play that validating role. Someone else has to provide a celebrity, um, something that feels like a magazine cover, something that rises to that level. And that feels like the antidote to Filterworld, right?
People seek this validation. People talk about Anna Wintour as though she's in the Illuminati, like literally as though she's in the Illuminati, but that's not forever. And there needs to be something that replaces it.
KYLE CHAYKA: Yeah. I mean, there needs to be, like, a tastemaking force that works, and there needs to be a way that cultural ideas or people can get distribution that is not just algorithmic. Though, like, what you're saying induces this nightmare for me of, like, TikTok covers. Like, TikTok releases a digital cover for its celebrity of the month and just makes them famous.
That's a scary thought. Um, but I think...
NILAY PATEL - HOST, DECODER: But to be clear, I don't think they could. I think that would be empty, right? I think the TikTok audience would reject that kind of top-down tastemaking from the [01:48:00] platform itself. YouTube famously tried to do YouTube Originals: we're going to make TV shows now.
And everyone was like, why? And they just disappeared, and PewDiePie went back to making PewDiePie videos, right? Like, there's something about the nature of the platform. So they actually can't do the thing themselves. They need something else to provide that role. And I don't know what that next thing is.
I think it behooves us all to figure it out, but I don't quite see it yet.
KYLE CHAYKA: No, no. And no one trusts those platforms enough to give them their tastemaking judgments. But I think we're in a weird swing: media institutions are totally crumbling, and I think we are seeing some rebuilding of that.
Like, I mean, you are a tastemaker. The Verge is a curatorial force that both produces original content and directs attention at specific ideas. You're the last person on earth... I keep saying that's what we have, but I think, like, newsletters...
NILAY PATEL - HOST, DECODER: I want to point out that the editor in chief on this podcast was like, I don't know where to leap us.
I do know where she's like from just put it there as a tastemaker. [01:49:00] Sorry.
KYLE CHAYKA: But so, like, the rebuilding of those tastemaking forces is happening, I think, in newsletters. I mean, you look at, like, Blackbird Spyplane, the men's fashion newsletter; you look at Magasin, the women's fashion newsletter. For some reason it's happening in fashion very quickly and obviously.
But I think those places will build up and grow and hopefully sustain themselves, which they will have to do by hiring more writers, like, more people. They will have to decentralize from the single-person personality cult, just as magazines did, just as Anna Wintour has done. And so I think we'll see them get a little bit bigger and consolidate their presence
and power, and, like, YouTube channels will publish articles and make podcasts and everything else. But we are in this, like, rebuilding phase, I think. Yeah.
NILAY PATEL - HOST, DECODER: I want to end with an exercise you did in the book, called an algorithmic cleanse. You divorced yourself from Filterworld. [01:50:00] Uh, I feel like everyone did a version of this when Elon bought Twitter and everyone kind of reconsidered their relationship to Twitter, but you went all the way, right?
Explain what that cleanse was like, how you actually executed it and how you came out of it at the end.
KYLE CHAYKA: Yeah, this was toward the end of 2022. So it was as Musk was buying Twitter and I just hit a point where I felt so saturated by algorithmic feeds. And I'd spent the whole process of writing this book thinking about them.
I don't know, I had to escape. I was like, uh, I had to just run from this whole ecosystem. And so I, you know, paused my accounts. I logged out of everything on my computer and on my phone. I deleted Spotify, I deleted Instagram and Twitter and everything else. And I just went cold turkey for about three months, so I was no longer getting any feeds of information.
I wasn't getting recommendations of anything. And I kind of had to figure out new ways of seeking [01:51:00] out content. Like, I had to look at the newspaper, I had to go to a library, I had to point my browser to theverge.com and see what was on the homepage. I mean, really what I found was that the internet is no longer built for not being on feeds.
Like, even two years ago, websites were not thinking so much about their homepages. Like, newsletters were less of a thing. I feel like we've come to rely so much on distribution and broadcast that we media creators don't think enough about just having a place where people go to find things they're interested in.
NILAY PATEL - HOST, DECODER: There's a real tail-wagging-the-dog element of this, right? Where you can want to have a different media diet. I have set up RSS readers many times over the past two years. I used to read all of my news in RSS. I used to sit in school, my laptop open, not paying attention, and, like, go through my RSS reader.
And I remember [01:52:00] saying to some of my friends, I'm out of the internet. I finished the internet today. Because I'd read everything in the RSS reader. And there was a great diversity in content. No one thinks that way anymore. You open an RSS reader, you plug your favorite websites into it, even ours. Candidly, even ours. And you get a bunch of stuff.
And some of that stuff is, like, obviously made for SEO. And some of that stuff is obviously made for other platforms. And very rarely do you see, oh, there's an audience here that wants to read every article on this website, and that is a package. But it's coming back. Like, people want to do that, right?
Like, you can see it there. You felt that way. I have felt that way. We write articles about RSS readers and people read them. There's demand for it. Do you think that demand is ever going to get filled?
KYLE CHAYKA: I hope so. I mean, I tend to think, wasn't that the great promise of Silicon Valley and all these tech startups: we are going to give users things that they want?
Like, there's this thirst for a new form of delivery of content, better curation, like more [01:53:00] holistic ideas of what we should consume. And I hope that products arise to give us that. I think people are, like, restlessly questing for it right now in RSS, in newsletters, in a kind of parasocial podcast-video, whatever, ecosystem.
But I don't know, like, I like internet technology. I like when startups do new stuff. I hope that they take on this challenge and figure it out.
SECTION C: SOLUTIONS
JAY TOMLINSON - HOST, BEST OF THE LEFT: And finally, Section C: Solutions.
Why is Brazil's supreme court shutting down social media platform X? - DW News - Air Date 8-30-24
FABIO DE SA E SILVA: X is basically refusing to accept Brazilian laws and to comply with Brazilian judicial orders. There are some investigations going on in Brazil into individuals who used social media to violate Brazilian law, electoral law as well as criminal law.
Some of those investigations are being presided over by Justice Moraes. And in the course of those investigations, there have been some orders for Twitter, or X, to bring down some of those profiles [01:54:00] and so on. And in the beginning, X would comply with those orders, just like other social media platforms. But after Mr.
Musk bought the platform, he made it very clear that he did not accept those orders and that he thinks they amount to censorship.
PHIL GAYLE - HOST, DW NEWS: Well, before Justice Moraes followed through with his threat, X announced on X that it would not comply with the court's instruction. Here's some of what appeared on the company's global affairs account today:
soon we expect Judge Alexandre de Moraes will order X to be shut down in Brazil, simply because we would not comply with his illegal orders to censor his political opponents. Now, these enemies include a duly elected senator and a 16-year-old girl, amongst others. So, Professor, what is Elon Musk referring to there?
FABIO DE SA E SILVA: He's referring to some of those investigations that I mentioned. So in this case, there was a senator who was using his account [01:55:00] to, for instance, incite the military against the civilian government. In the other case that he mentioned, the profile of a young girl was used, apparently by her father, to dox a police officer who was working on one of those investigations.
And so it was in that context that Justice Moraes ordered the platform to bring those profiles down. But Mr. Musk, as I mentioned, is refusing to do that. And he's claiming that this is censorship, which I do not agree with, because, as I mentioned, these decisions are being adopted in the context of investigations that look into violations of Brazilian laws.
PHIL GAYLE - HOST, DW NEWS: It's an odd decision anyway, because, censorship or not, you would think that, working within a particular jurisdiction, you would just follow the law. So is there something more going on there? Why does Elon Musk think that it's okay to disregard the laws [01:56:00] in Brazil when he wouldn't do that in the United States or even the European Union?
FABIO DE SA E SILVA: I believe, Phil, that not only Mr. Musk but others around the world look at Brazil nowadays as a potential case in which a stronger push to regulate social media has been attempted, so far not successfully, not only through the actions of Justice Moraes, but also through Congress, which was deliberating over a bill last year.
There have been attempts to place some limits on what social media platforms can do and what kinds of obligations they should have to, for instance, moderate content and prevent misinformation as well as hate speech from being disseminated on their platforms. I also think, in the case of Mr.
Musk, there is a commercial interest, because apparently his business wasn't doing well in Brazil, so he was already trying, or planning, to move it away from the country. And there seems to be also [01:57:00] some kind of political sympathy, on the part of Mr. Musk, for the Brazilian far right.
PHIL GAYLE - HOST, DW NEWS: It's interesting you say that it's not doing well commercially in Brazil. I saw an estimate today that something like 40 million Brazilians, roughly a fifth of the population, access X at least once a month, which sounds like a massive market for Elon Musk to sacrifice in this way.
FABIO DE SA E SILVA: It is. It is a reasonable, or sizable, market for any social media platform, which is a reason why many people doubted that Musk would take things as far as he did.
The information that we have here is that the branch wasn't doing well commercially in terms of the ads that they are able to sell, for example, right? But yes, you're right that the platform is widely used in Brazil. As happens elsewhere as well, it's a kind of niche platform.
So it's very much used by journalists [01:58:00] or by some internet businesses, or by some academics, to engage in exchanges of ideas. But yeah, it's a decent number, and I think Brazilians are going to miss X if it's really banned in the next couple of days.
PHIL GAYLE - HOST, DW NEWS: And so tell us about the judge at the center of this case, Supreme Court Justice Alexandre de Moraes.
Who is he?
FABIO DE SA E SILVA: He is actually, you know, a relatively conservative lawyer. He was a prosecutor in the state of São Paulo before entering politics. I think he brings some of that knack for investigation from his origins in the public prosecutor's career. He got to the Supreme Court during the term of President Temer, who replaced President Dilma Rousseff after she was impeached.
And, you know, he's always been seen as a very conservative, tough-on-crime judge, or legal scholar, which he also [01:59:00] is. That makes things very interesting, because nowadays the Brazilian far right suggests that he's working in line with the president, when they actually come from very different camps politically.
The Battle for Truth: Social Media, Riots, and Freedom of Expression - Institute of Economic Affairs - Air Date 8-16-24
MATTHEW LESH - HOST, INSTITUTE OF ECONOMIC AFFAIRS: The latest reporting about their response is that they're gonna let the existing Online Safety Act come into force, but they're gonna review it down the track, with a particular focus on Twitter, on X, on Elon Musk, on this idea that we need some kind of clampdown to reduce the spread of that mis- or disinformation.
I'm wondering what you make of that, Clare.
CLAIRE FOX: Well, one of the difficulties is definitions, and when you come to the law, this matters, because the way that dis- and misinformation is treated, it's as though we all understand this is fake news, post-truth, lies, malicious lies at that.
In the riots situation, it's worth reiterating here that Hope Not Hate put forward, and I assume, by the way, tweeted or posted [02:00:00] in good faith, that there were Muslim women having acid thrown in their faces at the height of the riots. And I was absolutely horrified, and I believed it to be true.
I thought, oh my god, things have got so out of hand. I was just thinking of the horrors of it. Only to discover, you know, the police just said the next day, oh no, that wasn't true. And Hope Not Hate aren't being rounded up as we speak, right? So, you know, it was like, oh, that was an honest mistake. And the hundred riots that were predicted one night didn't materialize, and the next day Hope Not Hate actually said, oh yes, we think that probably was a hoax. But, you know, it did lead to very positive headlines and lots of anti-racists on the streets.
So I think that the difficulty we've got with terms like dis- or misinformation is: what do they mean?
KRISTIAN NIEMIETZ: Yeah. I think a lot of people on the pro-clamping-down-on-disinformation side seem to think that this is just another form of content moderation, that this will be the equivalent of, say, having a spam filter.
But the difference, of course, is that we can all agree what spam [02:01:00] is, and the definition of that doesn't change over time. It doesn't depend on information that may change, or on political leanings, whereas this is very much not like that, and inevitably any rules about it are going to get weaponized. It's going to be just a social media equivalent of debanking. With debanking, the issue is just that banks are being hyper-cautious, and rationally so: verifying that a transaction is not dodgy, that it is not money laundering, is very costly, and no individual customer is super important to them, so it's just rational to say, well, I'll just shut down the account. And it's going to be just like that. It would be very difficult for a social media content moderator to verify: did you maybe make an honest mistake, is the claim definitely false, all that kind of stuff. It's just far easier to say, well, this particular post is not that important to me or to the company in the grand scheme of things.
We will just shut it down, out of an excessive, but rationally excessive, caution.
MATTHEW LESH - HOST, INSTITUTE OF ECONOMIC AFFAIRS: But obviously the exception here being [02:02:00] Elon Musk and the current version of X, where there does seem to be a kind of philosophical, ideological bulwark, at least to some extent, against this. I mean, in terms of this misinformation debate, I'm always reminded just how much our idea of what the truth was changed through COVID, and how even what the authorities were telling us changed.
So, you know, very early on in the pandemic, the WHO repeated the Chinese line that COVID-19 wasn't actually spreading between humans. You know, there's all this stuff about how masks were useless, and then they were compulsory, and then maybe they were useless again. There were all these instructions about wiping down groceries, despite the virus being airborne.
And that's all legitimate in a way, assuming the authorities are telling you the best information they have at the time, and maybe they weren't at all times. But for the most part, even if you give them the benefit of the doubt, it's very difficult to manage a process around disinformation, because there is no one final truth and authority. So it's inevitably a politicized process, as you see with all these fact checkers. And in order to deal with disinformation, you have to give somebody the power over you.
[02:03:00] I think there's a second element to it as well. It's not just about who has the power over you; sometimes allowing false information to be spoken is actually important. This is actually where I think Twitter/X is doing very well, which is the whole community notes situation.
The crowdsourced truth-finding: rather than taking down information you think is false, users come together, respond to it, and vote to figure out what is a truer claim, or what added context is needed. I found that extremely useful, because you're not going to stop people from having a bad idea in their head, but you might be able to give them an alternative viewpoint and have that debate, rather than just saying what we need to do is stop having any kind of debate in the first place.
KRISTIAN NIEMIETZ: It's also more effective, this self-regulation mechanism through community notes, because people who get a community note attached to a tweet are embarrassed about it, you know, it's like getting ratioed. And that's possibly more effective than just trying to withhold the information: seeing it, but seeing it invalidated by the community note, is a stronger way of making the case that there is a factual error here.
CLAIRE FOX: I think that's right. I think it's a sort of [02:04:00] bottom-up, you know, form of intervention that's very helpful. Because there's this notion, which the government are also pushing, that from the age of five, you know, all teachers are to tell pupils how to identify disinformation and misinformation.
And I did a debate for a teachers' group the other night, where I posed to them various queries of what was true, what wasn't true, and so on and so forth. They couldn't handle it, and I said, now, you think the five-year-olds are going to deal with this, right? They had a worldview, and when I challenged that worldview, they wanted to say it was misinformation or disinformation, but in the end they couldn't do that; I pointed out these are matters of contentious politics. The point about community notes, and this was the point I made to them, is that young people see that there are different arguments about facts, about what is factually accurate or not.
Actually, that's how you learn what critical thinking is. Because if you do get community noted, as you pointed out, I [02:05:00] think, I might be a bit more careful about checking my sources in the future. You're going to think about it much more. But the main problem with the approach to misinformation and disinformation of removing it is that suppression leads people down rabbit holes.
It's precisely the suppression of speech which drives conspiracy-mongering, which makes people cynical about everything they hear, that actually undoes the authority of truth in a genuine sense, so that it doesn't matter what you see, you don't believe it, you know. And you sort of say, oh well, the government have said that, or the police have said that, I don't believe them, because they've tried to manipulate things.
So it's much better to have this kind of atmosphere where you have competition going on between different versions of events. That's not to be relativistic about truth, but so that you can piece together as an individual what it is you think is the case. When you were talking about community notes, though, one of the things that struck me, when you've got this collective, [02:06:00] all-together-we'll-come-up-with-the-truth approach, is Wikipedia.
I mentioned Wikipedia because Wikipedia is kind of well regarded by the establishment; it's certainly a legal entity. And I can safely say that there's been mis- and disinformation on my Wikipedia page since Wikipedia started. And it gets on my nerves, right?
And I can go and speak at a conference and people can read out my bio from Wikipedia and I'm mortified, right, because I sound like some complete lunatic. They haven't checked it because they believe it's true and many young people do the same. They think Wikipedia gives them a version of the truth.
Anyway, I mention this because I'm not trying to ban Wikipedia. I'm not trying to encourage the government to lock them up. But they do spread misinformation all the time, even though it's collectively done and the world has not collapsed as a consequence.
Governments Are Suddenly Shutting Down The Internet - Here’s Why - ColdFusion - Air Date 8-15-24
DAGOGO ALTRAIDE - HOST, COLDFUSION: It seems so unbelievable that the internet could be completely shut down, but the world's second largest [02:07:00] economy did that in one of its regions, for almost an entire year. We all know about China's Great Firewall. It's an advanced system for filtering the internet for population control.
ARCHIVE NEWS CLIP: China runs the world's most complicated censorship machine.
The government actually requires Chinese internet companies to employ armies of human censors to police user generated content on their platforms.
DAGOGO ALTRAIDE - HOST, COLDFUSION: But the Chinese government also has control over internet service providers. This allows them to enforce national or regional shutdowns as needed. In 2009, the internet was cut off for 312 days in the Xinjiang region in response to riots.
Meanwhile in 2019, Russia passed the quote, sovereign internet law that gives the government the power to isolate its internet from the rest of the world.
JOHN HEIDEMANN: Telecommunications companies are large corporations, and there's usually only a few of them. It's very easy for the government to reach out to the heads of those corporations and say, we think it's in the interest of the nation. [02:08:00] The internet has something called routing, which is how we decide where to send traffic.
And routing is managed by telecommunications companies. Other countries sometimes have very sophisticated means of routing, intercepting some communications but not all.
DAGOGO ALTRAIDE - HOST, COLDFUSION: But where does the USA stand in all of this? The Obama administration tried to pass an internet kill switch bill called the, quote, Protecting Cyberspace as a National Asset Act of 2010.
It was introduced to the Senate, but was heavily criticized and never passed. In the United Kingdom, if there's an emergency that can cause, quote, serious damage to political, administrative, or economic stability, the government can shut down the internet. The Communications Act of 2003 and the Civil Contingencies Act of 2004 give emergency powers to the government to suspend the internet, and this is done by ordering service providers to shut down internet operations.
A UK government representative said, quote, it would have to be a very serious threat for these powers to be used, something like a major cyber [02:09:00] attack. These powers are subject to review, and if they were used inappropriately, there could be an appeal to the Competition Appeal Tribunal. Any decision to use them would have to comply with public law and the Human Rights Act.
End quote. The passing of this act hasn't been without its detractors and critics, and as usual with government power, there is the risk of abuse; some governments flat-out abuse this power. After Libya's devastating flood disaster in 2023, Derna, one of the cities hit hardest by the floods, experienced a significant communication blackout and complete internet shutdown.
Initially, some people thought it was because of the natural disaster, but the real reason was to stop online criticism and potential riots against how badly the government was handling the crisis, and this was at a time when 11,300 people were reported dead and 40,000 displaced. The government obviously didn't have their priorities right, and if this isn't an absurd abuse of power, I don't know what is.
JOHN HEIDEMANN: A lot of countries have proposed we should have a kill switch. The United States, the UK, [02:10:00] Australia, have all proposed kill switches, and I was just looking before joining you. What I saw was a news report saying Australia actually has a kill switch.
DAGOGO ALTRAIDE - HOST, COLDFUSION: In Australia, an internet shutdown mechanism exists under Section 581 of Australia's Telecommunications Act of 1997.
It grants the Australian Government significant authority over telecommunications networks, including the power to stop internet access in the interest of national security. In 2003, in the wake of the War on Terror, the Act was amended so that the Attorney-General, the chief law officer of the Commonwealth of Australia, could direct a telecommunications carrier to kill the internet, quote, either generally or to a particular person or particular persons, end quote.
Before executing the Act, he must first ask the Prime Minister and the Minister for Communications for approval. The amendment was rushed through, giving only four working days for anyone to raise concerns. Concerns were indeed raised, so the language was changed. The law now couldn't be used to turn off the internet for an individual or [02:11:00] organisation, only the internet as a whole.
So, that's comforting, I guess. In 2024, Malaysian Minister Azalina Othman Syed announced the government's plans to implement an internet kill switch. The minister issued a statement that the, quote, new legislation that includes the provisions regarding the procedure and enforcement of a kill switch, end quote, is for the purpose of analysing digital security.
That's probably as thinly veiled as it gets. The legislation will reach the Malaysian parliament in October. As mentioned earlier, around 39 countries around the world have in one way or another completely shut down access to the internet. We, as citizens of nations, need protection, if only in the interests of being prudent.
In 1948, the United Nations General Assembly adopted something called the Universal Declaration of Human Rights, an international legal document that outlines fundamental human rights and freedoms, such as the right to life, equality and non-discrimination, the right to work, the right to education, [02:12:00] and so on.
But what about a right to the internet? Just 15 years ago, this might have sounded absurd, but today, the internet is the infrastructure on which modern society is built. Because of this, it's no wonder that the UN has declared internet access a human right and deemed internet kill switches illegal, and this was in 2016.
In practice, however, while there's significant international pressure against internet shutdowns, there's no binding international law that bans governments from such actions. In Bangladesh, the primary reason to shut down the internet, as cited by the government, was to stop misinformation and rumours from spreading.
But does it actually help?
ARCHIVE NEWS CLIP: The busy streets of Dhaka are deserted with burnt vehicles and bricks strewn across the roads. And the protesters have gone on a rampage at many police stations and government establishments. There's a complete internet and telecom shutdown that is in effect in a move to curb the violence.
JOHN HEIDEMANN: So misinformation is a real challenge. And I guess if you shut the internet off, nobody's looking at Facebook. In [02:13:00] that sense, it's, quote, successful. It's a very heavy-handed maneuver, though. The other thing I was thinking about with misinformation: in the United States right now, there's been some debate about what role the government should have in intervening in the spread of misinformation on social media.
I don't think anyone's proposing shutting down the internet, but people are talking about the role of interactions between the government and social media sites. And how do you label misinformation on social media? And I think those are things we all have to grapple with. And there's different points of view about that.
DAGOGO ALTRAIDE - HOST, COLDFUSION: Shutting down the internet to curtail protests, as in Bangladesh, in Egypt in 2011 during the Arab Spring, and during election periods in Venezuela, as in 2019: these are the more common excuses to kill the internet. But there's a very strange reason why some countries might end up doing it. In 2024, from May 26th through to June 13th, the internet in Syria went dark.
The reason? High school exams. This is actually a common tactic used in many countries, [02:14:00] including Syria, Iraq, Algeria, and many others. The purpose is to stop students cheating in exams using online methods. Whether this method really is that effective remains to be seen, but that hasn't stopped the 12 shutdowns recorded in 2023 alone.
But the question is, what does everyone else in the country do at that time? I guess they just sit around and twiddle their thumbs. But it does bring up an interesting point though. What is the cost of an internet shutdown?
Beyond the tragic loss of life and injuries, the recent internet shutdown in Bangladesh has seen immense economic damage, as you can imagine. According to NetBlocks and their Cost of Shutdown Tool, the total financial impact to Bangladesh has been around $393 million for five days of internet shutdown.
For the United States, daily e-commerce trade is valued at $2 billion and the daily digital economy is about $5.5 billion. So a complete internet shutdown for one day in the United States could amount to losses north of $7 billion at a bare [02:15:00] minimum. But back to Bangladesh and the internet shutdown.
It's estimated that there are over 1 million freelancers operating out of Bangladesh. That's a million people who depend entirely on the internet for their careers. And this isn't just for work within the country, but to deliver projects abroad. And suddenly, their entire livelihood is gone, and the worst part is that they have no idea when it's coming back.
And this isn't to mention the impact on hospitals, banking, and other critical industries that rely on the internet to operate. So it's not just the dollar value; human lives and daily life are at stake with internet shutdowns.
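The back-of-the-envelope arithmetic above can be made explicit. This is a rough sketch only; the dollar figures are the estimates quoted in this segment (NetBlocks-style daily-loss figures), not independently verified data.

```python
# Rough shutdown-cost arithmetic using the per-day figures quoted above.
# These constants are the segment's estimates, not audited numbers.
US_DAILY_ECOMMERCE = 2.0e9        # ~$2 billion/day in e-commerce trade
US_DAILY_DIGITAL_ECONOMY = 5.5e9  # ~$5.5 billion/day digital economy

def shutdown_cost(days: int, daily_losses: list[float]) -> float:
    """Minimum estimated loss: sum of the daily loss streams times days offline."""
    return days * sum(daily_losses)

# One-day US shutdown by these figures: $2B + $5.5B = $7.5B, i.e. "north of $7 billion".
one_day = shutdown_cost(1, [US_DAILY_ECOMMERCE, US_DAILY_DIGITAL_ECONOMY])
print(f"${one_day / 1e9:.1f} billion")  # → $7.5 billion
```

The same function, with Bangladesh's reported $393 million over five days, implies a daily loss stream of roughly $79 million, which is consistent with the figure cited.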
JOHN HEIDEMANN: Imagine the economic damage you would have if your internet was shut off for five days, you know.
I think a government's got to think very carefully before they take such a decision because of the economic implications, much less the social implications.
Enshittification Part 2: The Mechanisms That Helped Big Digital Go Bad - On the Media - Air Date 5-12-23
CORY DOCTOROW: I grew up going to a great little carnival in Toronto called the Canadian National Exhibition, the CNE. By 10 a.m., there'd be someone walking around with a giant teddy bear that they won by throwing three balls in a peach basket. [02:16:00] As hard as you tried, you could never match the feat. So how did they get this giant teddy bear, and why? Well, basically, the company made sure they won. The first person who came along and looked like a likely mark, they'd say, tell you what, I like your face. You got just one ball in the basket; I'll give you a keychain. And if you do it again, I'll let you trade two keychains for the giant teddy bear. And the point was that if you carry that giant teddy bear around all day, other people are going to go, hey, I can get a giant teddy bear too, and put $5 down and fail to win the giant teddy bear. They may not even get the keychain. This guy lugging around this conspicuous teddy bear is doing the marketing for the rigged game. It's where you see Joe Rogan getting $100 million for his podcast, or you see TikTokers with these incredible success stories. Or back when Kindle was getting off the ground, there were independent authors who went to Kindle and reported these incredible findings. Or Substack: all of those early Substack writers who were guaranteed a minimum monthly were talking about how Substack [02:17:00] was the future of journalism. And really, all that was happening is they were being given these giant teddy bears, same as those Uber drivers who fill Uber social media with accounts of how much money they make driving for Uber. They're just luring people to go through the picker to ant pipeline.
BROOKE GLADSTONE: And you say the big teddy bear theory plays out big on TikTok.
CORY DOCTOROW: So the thing that even TikTok's critics admit is that it's pretty good at guessing what you want. That's why most people who use TikTok just tune in to the recommendation feed. People find themselves going viral because so many people have tuned in to this algorithmic feed, and the assumption had been, this is what America wanted to see right now, and so America saw it. But then a reporter from Forbes revealed the existence of something called the heating tool. And it's just a knob that someone at TikTok twiddles to say, we're going to stick this in front of a lot of people, even though the algorithm doesn't think they'll like it. This is a way of temporarily allocating a surplus, giving goodies to the [02:18:00] kind of performer that they want to become dependent on TikTok. So maybe they want sports bros; they find a few of these guys and they give them giant teddy bears. You're viral: 10 million views on every video you post. Are you really going to make two different videos, one for YouTube and one for TikTok, especially when you're getting ten times the traffic on TikTok? And remember, you know, TikTok's got this idiosyncratic format. You really got to customize it for TikTok, so it's not really practical to make it for Instagram, YouTube, and TikTok. Maybe you become a TikTok-first performer, right? And then they can take it away from you if they decide they've got enough sports bro content and now they want, I don't know, astrology influencers. They can stop promoting, stop heating the sports bro content and start heating the astrology content. But also, it's impossible to tell whether a performer or a writer or a creative worker on one of these platforms like Substack is getting a giant payout, whether they've been given a giant teddy bear, whether they even know.
And in fact, and now we're getting into counter-twiddling, if you [02:19:00] get aggressive enough in trying to figure out how they're determining whether or not your videos will be shown to your subscribers, right, if you start to reverse engineer their tools, start to pull apart their app to see if you can find the business rules, they will start to come after you: for violating Section 1201 of the Digital Millennium Copyright Act, which broadly prohibits reverse engineering; for violating the Computer Fraud and Abuse Act; for tortious interference with contract; for trademark violation, patent violation, copyright infringement. And again, this just boils down to felony contempt of business model.
BROOKE GLADSTONE: I don't know what that is.
CORY DOCTOROW: That's just the idea that even if Congress never passed a law saying never displease a shareholder, you can mobilize existing laws like copyright law so that displeasing a shareholder becomes illegal. So, like, let's talk about iPhones just for a second. I make an app for an iPhone. You own an iPhone; you spent [02:20:00] $1,000 on that iPhone. I made the app and I hold the copyright to it. I don't want to share 30% of all my revenue with Apple, and you don't think I should have to. So I give you the app and a tool that allows you to install it on your iPhone, which belongs to you. I, the copyright owner, by letting you use my copyrighted work, violate copyright law: Section 1201 of the Digital Millennium Copyright Act, punishable by a five-year prison sentence and a $500,000 fine.
BROOKE GLADSTONE: Mm hmm. It's amazing that people who talk about the free market lock it up so tight that it can't be responsive to the consumers that are supposed to operate it.
CORY DOCTOROW: Oh, you know, it's even worse, right? Because it's not just that Apple, like all capitalists, hates capitalism when they're on the pointy end of it. It's that Apple did to two other companies the thing that they would now sue you for. So if [02:21:00] you think about when Microsoft Windows reigned supreme in the office, and Macs were getting harder and harder to use because Word for the Mac, or Office for the Mac, was so bad, the way Steve Jobs resolved that was by having some of his technologists reverse engineer Microsoft Office and make iWork, which reads and writes Microsoft Office files. When Apple did it, that was progress. When you or I do it, that is theft. It's allowing maximum twiddling on the incumbent side and preventing any twiddling on the new market entrants' side. You know, ad blocking is the most successful consumer boycott in history; that's what Doc Searls says. And it's only possible because the web is an open platform. But if you wanted to make an ad blocker for an app, the fact that you have to first reverse engineer the app, that you have to bypass digital rights management, makes it a felony. And so the ads in apps are a lot more obnoxious, not just [02:22:00] in terms of their presentation, but in terms of the data that they gather and target with. We're all familiar with the stories about people being targeted by ads based on visiting mosques or abortion clinics, and all of the other terrible abuses. That's because without the constraint of counter-twiddling, the sky's the limit in terms of how much they can tell about you.
BROOKE GLADSTONE: The FTC has lawsuits against Facebook, and the Department of Justice has an antitrust case against Google where twiddling resulted in undetectable fraud. So is there some accountability afoot?
CORY DOCTOROW: I think that there's an attempt to do it. But let's go back to: the best time to fight monopolies was 40 years ago, and the second best time is now. You know, there are people who say that the monopolies that we have in tech, the winner-take-all or winner-take-most monopolies, come out of tech exceptionalism; there's just something about the great forces of history that have made tech so powerful. But, you know, Occam's razor says we should [02:23:00] look to the simplest explanation first. And the simple explanation here is that we used to do antitrust and we didn't get monopolies. We stopped doing antitrust and we got a lot of monopolies. I think the world of the antitrust enforcers in the Biden administration. Lina Khan is extraordinary.
BROOKE GLADSTONE: The head of the Federal Trade Commission.
CORY DOCTOROW: That's right. She was a third-year Yale law student just a couple of years ago, and she wrote this paper that was a direct answer to Robert Bork. Bork's book was called The Antitrust Paradox; her law review paper was called Amazon's Antitrust Paradox. And it was such a stinging rebuttal to Bork that it set the whole antitrust theoretical world on its ear. And just a few years later, she is the youngest-ever chair of the Federal Trade Commission. And she found things like Section Five of the Federal Trade Commission Act, which wasn't that hard to find (it's right between Section Four and Section Six) but hadn't been used in 40 years. It's the section that gives the Federal Trade Commission broad latitude to act against deceptive and unfair [02:24:00] practices, and that's the basis on which she promulgated a rule banning non-compete agreements. We are seeing in Khan what a skilled technocrat can do: if you know where the levers are and you're not afraid to pull the levers, you can make incredible things happen. And we are in an incredible moment for antitrust. And it's not just her; it's Jonathan Kanter at the Department of Justice, it's other commissioners like Rebecca Slaughter, and it's the whole-of-government approach that Tim Wu crafted when he was in the White House, where every department is now being asked to use its legislative authority to go ahead and act to reduce monopoly and monopoly power across the entire economy.
Credits
JAY TOMLINSON - HOST, BEST OF THE LEFT: That's going to be it for today. As always, keep the comments coming in. I would love to hear your thoughts or questions about today's topic or anything else. You can leave a voicemail or send us a text at 202-999-3991, or simply email me at [email protected]. The additional sections of the show included clips from the PBS NewsHour, Pillar of Garbage, Disinformation, Tech Won't Save [02:25:00] Us, Decoder, Commonwealth Club of World Affairs, the Daily Zeitgeist, DW News, Institute of Economic Affairs, ColdFusion, and On the Media. Further details are in the show notes.
Thanks everyone for listening. Thanks to Deon Clark and Erin Clayton for their research work for the show and participation in our bonus episodes. Thanks to our Transcriptionist Quartet—Ken, Brian, Ben, and Andrew—for their volunteer work helping put our transcripts together. And remember, we are looking for a new transcriptionist. Please send me an email if you're interested. Thanks to Amanda Hoffman for all of her work behind the scenes and her bonus show co-hosting. And thanks to those who already support the show by becoming a member or purchasing gift memberships. You can join them by signing up today at bestoftheleft.com/support, through our Patreon page, or from right inside the Apple podcast app. Membership is how you get instant access to our incredibly good and often funny weekly bonus episodes, in addition to there being no ads and chapter markers in all of our regular [02:26:00] episodes, all through your regular podcast player. You'll find that link in the show notes, along with a link to join our Discord community, where you can also continue the discussion.
So, coming to you from far outside the conventional wisdom of Washington, DC, my name is Jay, and this has been the Best of the Left podcast, coming to you twice weekly, thanks entirely to the members and donors to the show, from BestOfTheLeft.com.