[00:00:05] Hey, it's Tucker Carlson. Charlie Kirk
[00:00:07] was assassinated two weeks ago today in
[00:00:10] an event that clearly is going to change
[00:00:13] American history. Changed a lot of
[00:00:15] people inside. And there was a moment in
[00:00:17] the first week where you thought to
[00:00:19] yourself, this is going to have effects.
[00:00:21] A lot of them are going to be bad, but
[00:00:22] some of them are probably going to be
[00:00:23] good because Charlie's life was itself
[00:00:27] so good. Charlie Kirk spent his life
[00:00:29] above all trying to live the Christian
[00:00:31] gospel and trying to live the principle
[00:00:34] of free speech which is to say he talked
[00:00:37] and he also listened. He was most famous
[00:00:39] for traveling from college campus to
[00:00:41] college campus and asking people who
[00:00:43] disagreed with him to confront him. "Ask
[00:00:46] me anything," he said, and he sat there
[00:00:48] patiently as they did and they often
[00:00:49] attacked him. They almost always
[00:00:51] expressed views he found repugnant and
[00:00:53] almost always he took those views
[00:00:55] seriously and answered the questions put
[00:00:56] to him as crisply and honestly as he
[00:01:00] could. That's what he spent his life
[00:01:01] doing and in fact he was assassinated
[00:01:03] while doing that. So if there's any
[00:01:06] lesson from Charlie Kirk's life, well
[00:01:08] the first lesson would probably be
[00:01:10] sincere Christians tend to be really
[00:01:12] decent people. Maybe we should have more
[00:01:14] of them. But the more secular temporal
[00:01:17] lesson is that free speech is a virtue.
[00:01:20] It is in fact the foundation of this
[00:01:22] country. Not only its laws but its
[00:01:24] culture and that we should protect it.
[00:01:27] And maybe if we seek to honor Charlie
[00:01:28] Kirk, we should emulate it. Maybe we
[00:01:31] should begin by asking our politicians
[00:01:33] to do what Charlie Kirk spent his life
[00:01:35] doing, which is to answer the question.
[00:01:37] Just calmly answer the question. We'll
[00:01:38] ask you anything and then you go ahead
[00:01:40] and answer it to the best of your
[00:01:42] ability. Like for example, who blew up
[00:01:43] the Nordstream pipeline?
[00:01:45] What happened to all the money we sent
[00:01:46] to Ukraine? Why haven't you released all
[00:01:48] the JFK files, etc., etc., etc., all the
[00:01:51] questions on your mind that slowly drive
[00:01:53] you crazy because no one will address
[00:01:55] them? Why don't we just ask them
[00:01:56] directly to our leaders and they get to
[00:01:58] answer? Nothing, nothing would honor
[00:02:01] Charlie Kirk's memory more than that.
[00:02:03] That is free speech in action.
[00:02:06] But nothing like that happened. Instead,
[00:02:08] the only real conversation we've had
[00:02:10] about free speech has been about Jimmy
[00:02:11] Kimmel, who is hardly a champion of free
[00:02:15] speech. In fact, just the opposite. He's
[00:02:17] a nasty little censor, talentless, a
[00:02:19] person who has many times on camera over
[00:02:22] the years chuckled and applauded as
[00:02:25] other people, his political enemies,
[00:02:26] have been silenced. A guy who has so
[00:02:29] little influence in American society and
[00:02:30] so little audience. He was on his way
[00:02:32] out anyway, has the job only as a result
[00:02:34] of some kind of weird political
[00:02:36] affirmative action where people who
[00:02:37] agree with studio heads get to have late
[00:02:39] night jobs. He is hardly the person who
[00:02:42] should be taking up the cause of free
[00:02:45] speech or become a symbol of it because
[00:02:47] of course he's the symbol of censorship
[00:02:48] and has been for most of his career.
[00:02:51] And the other thing that we saw, maybe
[00:02:53] even more distressing than that, was
[00:02:55] politicians
[00:02:56] turn not only against free speech, but
[00:03:00] actively and openly announced efforts to
[00:03:03] censor the American population and use
[00:03:06] the memory of Charlie Kirk to do it as
[00:03:09] their justification.
[00:03:11] There are many examples we could pick.
[00:03:13] Here's a particularly raw one. This is
[00:03:16] from Congressman Moskowitz in
[00:03:18] House of Representatives. Eight days
[00:03:20] after Charlie Kirk died. Here it is.
[00:03:23] >> It's crazy what's going on on the social
[00:03:27] media platforms. There are so many
[00:03:30] conspiracy theories on what's going on
[00:03:31] with Charlie Kirk. Israel assassinated
[00:03:33] him, right? There are conspiracy
[00:03:35] theories about your personal social life
[00:03:38] all day. It is totally rampant. Big
[00:03:43] names on the right, Candace Owens,
[00:03:48] right? Talking about how what's been
[00:03:51] released as far as the dialogue between
[00:03:53] the perpetrator and his roommate is
[00:03:56] manufactured by the FBI, manufactured by
[00:03:59] the administration.
[00:04:00] It is totally rampant, allowing foreign
[00:04:03] governments to just penetrate these
[00:04:06] platforms, all of these bots all of the
[00:04:09] time to weaponize Americans. And so if
[00:04:12] we want to do something, then we should
[00:04:13] talk about section 230. We should talk
[00:04:16] about how we're going to make sure that
[00:04:20] we don't let foreign governments poison
[00:04:22] our children's minds. And so I will work
[00:04:24] with you on that, director.
[00:04:26] >> I'll work with you on 230 any day.
[00:04:29] >> So there is the congressman talking to
[00:04:31] the FBI director and there's a lot there
[00:04:33] and we'll unpack it, but the most
[00:04:35] telling line came right in the middle
[00:04:36] and he turns to the FBI director. He
[00:04:37] says, "They're criticizing your personal
[00:04:39] life. They're airing conspiracy theories
[00:04:41] about your personal life." Now, speaking
[00:04:44] for myself, I have literally no idea
[00:04:45] what the congressman was talking about.
[00:04:47] I haven't seen that. Doubtless it
[00:04:49] exists. There are conspiracy theories,
[00:04:50] conspiracy theories about everybody and
[00:04:52] everybody's personal life. If you're in
[00:04:53] public, people are theorizing about you
[00:04:54] on the internet. Kind of the nature of
[00:04:56] the internet and kind of the nature of
[00:04:57] having authority. But you'll notice that
[00:04:59] the congressman thinks this will be a
[00:05:01] compelling argument for the FBI
[00:05:03] director. He basically just says,
[00:05:05] "They're criticizing you and me and
[00:05:07] they're not allowed to do that."
[00:05:09] He's not even pretending that the
[00:05:12] purpose of censoring speech (and that's
[00:05:13] what he's saying: we need to censor the
[00:05:15] speech) would be to
[00:05:17] protect vulnerable
[00:05:19] groups. No: they're criticizing us. They
[00:05:22] can't do that. And then of course he
[00:05:24] goes on to blame unseen foreign actors.
[00:05:26] And by the way, that's something that I
[00:05:27] think most Americans would get behind,
[00:05:29] but that the congressman is not behind
[00:05:30] at all. If they want to take the
[00:05:32] influence of foreign nations out of our
[00:05:33] politics, most Americans would applaud.
[00:05:35] And that would start with not taking
[00:05:36] money from their lobbies. That would
[00:05:39] be a welcome change. But the idea that
[00:05:42] we need to censor what you say because
[00:05:45] the people who run everything don't want
[00:05:48] to be called out or have their personal
[00:05:50] lives misdescribed or the subject of
[00:05:52] conspiracy theories. Well, that's not
[00:05:54] really reform as much as it's just kind
[00:05:56] of classic oldfashioned tyranny, isn't
[00:05:58] it? Shut up. We have guns you don't. And
[00:06:01] we're going to make you. And how are we
[00:06:02] going to make you? That's the question.
[00:06:04] So, you may remember that last week the
[00:06:06] attorney general came out right after
[00:06:08] Charlie Kirk's death and said there is a
[00:06:10] distinction between speech
[00:06:12] constitutionally protected famously in
[00:06:14] the first amendment and the bill of
[00:06:15] rights and something called hate speech
[00:06:17] a category that doesn't strictly
[00:06:19] speaking exist under the law but which a
[00:06:21] lot of people seem to believe
[00:06:23] exists. And hate speech is never defined.
[00:06:26] Like most of the most powerful words
[00:06:29] that we use to punish people, terrorism,
[00:06:31] for example, or racism, it's never actually
[00:06:33] defined. What is that exactly? We don't
[00:06:35] know. Other than it's speech the people
[00:06:38] in charge hate and therefore it should
[00:06:41] be banned. And most people who
[00:06:45] understand the American story, who
[00:06:48] understand our government, who
[00:06:50] understand our culture, who care about
[00:06:52] continuing all of those things, reacted
[00:06:54] with outrage when the attorney general
[00:06:56] said that you can't pass a law that will
[00:06:58] strip from us our God-given right to say
[00:07:01] what we think is true. She addressed it
[00:07:04] in such a ham-handed way that it was
[00:07:06] obvious to everybody exactly what she
[00:07:07] was talking about, and they reacted. We
[00:07:10] did, too. But in real life, that will
[00:07:14] not happen. There will not be, I can say
[00:07:17] confidently, in my lifetime, a law in
[00:07:19] the Congress that says explicitly, any
[00:07:22] American who says something our leaders
[00:07:24] don't like. Anybody who traffics in a
[00:07:27] conspiracy theory about our personal
[00:07:28] lives will be shut down, fined,
[00:07:30] imprisoned.
[00:07:31] An open, transparent censorship law will
[00:07:35] not pass through the House of
[00:07:37] Representatives or the
[00:07:38] United States Senate. It will not be signed
[00:07:40] by the president. Why? Because it's just
[00:07:42] too obvious. So instead, because
[00:07:46] censorship is coming if these people can
[00:07:48] help it, instead they will invoke
[00:07:50] something called section 230. And you're
[00:07:53] going to hear a lot more about this
[00:07:55] without question. It's never, again,
[00:07:57] explained very well. And the reason it's
[00:07:59] not explained is because they don't want
[00:08:00] you to know exactly what they're doing.
[00:08:01] So let me just give you the CliffsNotes
[00:08:03] version of what Section 230 is. Section
[00:08:05] 230 is a provision of the 1996
[00:08:10] Communications Decency Act, and it is the
[00:08:12] piece of legislation often credited for
[00:08:14] creating the internet. It's the
[00:08:15] framework that Congress came up with at
[00:08:17] the dawn of the internet to put
[00:08:20] parameters around this new technology, to
[00:08:22] protect companies as they grew and to set
[00:08:25] laws around it. And one
[00:08:29] of the laws that they made, section 230,
[00:08:32] shields
[00:08:33] internet providers and platforms from
[00:08:36] lawsuits. It gives them legal immunity
[00:08:39] from lawsuits
[00:08:41] on the basis of slander,
[00:08:45] obscenity, things that are on their
[00:08:47] platforms that they didn't create. In
[00:08:49] other words, it creates a distinction
[00:08:51] between a publisher like a newspaper, a
[00:08:54] magazine, a television network, and a
[00:08:56] platform. Google, Facebook, X. And the
[00:09:01] distinction allows the platforms to let
[00:09:03] other people post whatever they want
[00:09:06] without getting sued for it. They cannot
[00:09:09] be held liable. These big companies
[00:09:11] cannot be held liable
[00:09:14] for slander, hate speech, anything
[00:09:17] really on their platforms. And as a
[00:09:20] result of this law, those platforms have
[00:09:22] come to dominate news and information
[00:09:25] globally. In fact, when we talk about
[00:09:27] censorship, nobody's talking about
[00:09:29] censoring the New York Times, the
[00:09:30] Washington Post, NBC News, because
[00:09:31] nobody cares. All meaningful information
[00:09:34] and all meaningful social movements are
[00:09:37] influenced by social media. So, if you
[00:09:39] want to get people whipped into a
[00:09:40] frenzy, if you want to change your
[00:09:42] government, for example, you're not
[00:09:43] going to take out an ad in the New York
[00:09:44] Times, of course not. You're going to
[00:09:46] get something going on the social media
[00:09:48] platforms. So, they are huge. They are
[00:09:51] completely dominant. Information flows
[00:09:54] almost exclusively on them. And all of
[00:09:57] this is possible because of section 230.
[00:10:00] Now, there's been a pretty vigorous
[00:10:02] debate for the last 20 years over
[00:10:04] whether this is a good idea. And there
[00:10:05] are arguments against it. One of them is
[00:10:07] why would Google get a liability
[00:10:09] exemption when I don't have one? You run
[00:10:12] a business. You're just an American
[00:10:14] citizen. You can be sued at any time
[00:10:16] under our famously loose and destructive
[00:10:18] tort laws and you can go out of business.
[00:10:20] You can be bankrupted. You can be
[00:10:21] destroyed. They can do to you what they
[00:10:23] did to Alex Jones, for example. The FBI
[00:10:25] can join up with some activist group and
[00:10:27] take your business away, wreck your
[00:10:29] life. So why should these big tech
[00:10:32] companies be exempt? Now, that's a real
[00:10:34] argument.
[00:10:36] It's similar to the argument about the
[00:10:37] pharma companies. Why should vaccine
[00:10:40] makers get a shield from lawsuits? If I
[00:10:43] make playground equipment, I'm
[00:10:44] vulnerable. If I make the COVID vax, I'm
[00:10:46] not. That's a principled argument. But
[00:10:50] what's interesting about the 230 debate
[00:10:52] is that both parties had been on both
[00:10:54] sides of it at various times. The
[00:10:57] Republicans for years were mad at the
[00:10:59] big platforms because they were
[00:11:01] censoring conservatives, which they
[00:11:02] were. And so they often muttered about
[00:11:04] revoking 230 shield protection unless
[00:11:08] they opened their platforms to all
[00:11:10] points of view. In other words, they
[00:11:11] wanted to use section 230 to end
[00:11:14] censorship. There's no reason
[00:11:16] you should get a special carve-out from
[00:11:17] the US government, from the Congress if
[00:11:19] you don't treat people equally, if
[00:11:21] there's not fairness and neutrality in
[00:11:23] the way you allow opinions to be
[00:11:25] broadcast on your platform. That seemed
[00:11:26] like a fairly reasonable position. But
[00:11:29] things have changed.
[00:11:32] Now you're seeing Republicans
[00:11:34] invoke section 230, pick up the cudgel
[00:11:38] that they hold over these huge tech
[00:11:40] companies and say unless you censor, we
[00:11:43] will revoke section 230.
[00:11:47] And by the way, they are following in
[00:11:50] the footsteps of the leftward edge of
[00:11:52] the Democratic party in doing this. In
[00:11:54] 2020, Beto O'Rourke of Texas ran one of his
[00:11:57] many doomed campaigns for office. This
[00:11:59] one, I think, for president. And he
[00:12:01] said, "Unless they get hate speech off
[00:12:04] the platforms, we're going to revoke
[00:12:06] section 230 and put these people out of
[00:12:08] business." By the way, the threat is
[00:12:09] enough.
[00:12:11] That was the hope. If we threaten them,
[00:12:13] then we don't have to do the censoring.
[00:12:14] We'll make Google, Facebook, Meta, and X
[00:12:18] do the censoring for us. That was the
[00:12:20] idea. Then no one can accuse us of
[00:12:22] violating the First Amendment or being
[00:12:23] for
[00:12:25] speech codes. We'll make someone else do
[00:12:28] it. He lost. But then Joe Biden that
[00:12:30] same year, 2020, said, "Actually, yes,
[00:12:33] we should use this threat to force the
[00:12:37] big tech platforms to censor in ways
[00:12:40] that we like." And by the way, they did.
[00:12:43] They did throughout the Biden
[00:12:44] presidency: Facebook, then-Twitter,
[00:12:49] Google, all censored opinions the Biden
[00:12:51] administration didn't like. And they did
[00:12:52] this ultimately because they feared
[00:12:55] having their legal protection revoked.
[00:12:58] That's why they did that. That's what
[00:13:00] got them to act. And Republicans, the
[00:13:03] sensible ones, looked at this and said,
[00:13:04] "This is completely wrong. It's totally
[00:13:05] immoral. It's illegal for the US
[00:13:07] government to be imposing censorship on
[00:13:09] its citizens. It's against the
[00:13:11] Constitution of the United States, and
[00:13:12] it's against, more important, natural
[00:13:14] law. These are not rights we were given
[00:13:16] by the Biden administration. These are
[00:13:18] rights we were born with, and when you
[00:13:19] take them away from us, you are the
[00:13:21] criminal." And they made that point. All
[00:13:23] of a sudden, you are seeing Republicans
[00:13:27] take the position that Beto O'Rourke and Joe
[00:13:30] Biden took just 5 years ago. Here, for
[00:13:34] example, and this will come as no
[00:13:35] surprise to you at all. You will not be
[00:13:37] shocked to hear this. Here is Senator
[00:13:39] Lindsey Graham of South Carolina running
[00:13:41] for reelection making exactly the same
[00:13:45] case that Beto O'Rourke made. Watch.
[00:13:47] >> Section 230 needs to be repealed. If
[00:13:49] you're mad at social media companies
[00:13:51] that radicalize our nation, you should
[00:13:53] be mad at a bill that will allow you to
[00:13:55] sue these people. They're immune from
[00:13:57] lawsuit.
[00:13:59] >> Oh, it should be repealed. Now, it's not
[00:14:03] clear from that clip exactly why Lindsey
[00:14:05] Graham is calling for the repeal of
[00:14:07] section 230. Why is he threatening the
[00:14:09] tech platforms? And by the way, the
[00:14:10] pretext always changes. They'll tell
[00:14:12] you, "Well, we're against child sex
[00:14:14] trafficking, as if anyone is for it.
[00:14:16] We're against terrorism, a term once
[00:14:18] again they never define and don't have
[00:14:20] to. We're against drugs. We're against
[00:14:23] foreign influence. We're against
[00:14:24] bigotry. Whatever. They will always give
[00:14:27] you an excuse. And that excuse will make
[00:14:30] them sound like the virtuous party, like
[00:14:32] the good guys. We're here to save the
[00:14:35] vulnerable.
[00:14:37] But that's never the real reason.
[00:14:39] Censorship always and everywhere is
[00:14:42] imposed with the intent and always has
[00:14:44] the effect of shielding the powerful.
[00:14:48] They're the ones who don't want to be
[00:14:50] exposed.
[00:14:51] Free speech, by contrast, and this is
[00:14:54] the reason it's in our Bill of Rights,
[00:14:56] is the one great power that the
[00:14:59] powerless have.
[00:15:02] Especially in a world where your vote
[00:15:03] may or may not matter. All you have is
[00:15:05] your voice. All you have is your
[00:15:07] opinion. And that's infuriating to
[00:15:10] Lindsey Graham's donors. And make no
[00:15:12] mistake, when he's calling for revoking
[00:15:14] section 230 and taking it away,
[00:15:17] threatening the big platforms, he's
[00:15:18] doing that on behalf of his donors who
[00:15:20] feel criticized by random accounts on
[00:15:23] the internet. And they hate it because
[00:15:25] the people in charge always hate to be
[00:15:28] called out.
[00:15:31] Censorship has one goal, and that's to
[00:15:34] preserve secrecy. And secrecy has one
[00:15:37] purpose and that's to abet wrongdoing.
[00:15:40] So people who are doing nothing wrong
[00:15:42] are transparent. People who are
[00:15:44] committing evil hide and censorship
[00:15:46] allows them to hide. It's literally that
[00:15:48] simple. And those people are the most
[00:15:50] powerful people in the country. So who's
[00:15:53] encouraging this?
[00:15:55] The donors, whoever they are. But there
[00:15:57] are lots of lobby groups, all of them on
[00:15:59] the left, pushing the Republican-led
[00:16:03] Congress to get behind censorship
[00:16:06] initiatives, using the cover of the
[00:16:08] section 230 debate to get it done, to
[00:16:12] pressure the tech companies into making
[00:16:13] you shut up, into taking your opinions
[00:16:16] off the internet using algorithms
[00:16:18] designed to censor you without even a
[00:16:20] human being entering into the equation.
[00:16:22] No person will decide that your opinion
[00:16:24] is offensive and pull it off. The
[00:16:26] computer will decide that and it will be
[00:16:27] aided by the massive exponential growth
[00:16:31] in computing power that is at the very
[00:16:33] center of tech right now: AI.
[00:16:37] That is the goal: to make certain that
[00:16:40] opinions that are disruptive to the
[00:16:42] people in charge never see the light of
[00:16:44] day. What's amazing and what's
[00:16:46] especially infuriating is that many in
[00:16:49] the Republican party, the party that
[00:16:51] controls all branches of government
[00:16:54] right now, are completely for this,
[00:16:56] strongly for it. Where did they get this
[00:16:59] idea? Is this a betrayal? Oh, it's a
[00:17:02] betrayal. How profound a betrayal?
[00:17:05] Listen to Congressman Don Bacon of
[00:17:07] Nebraska, a former Air Force general,
[00:17:10] describe who he's been talking to about
[00:17:12] censorship. >> And I appreciate Jonathan
[00:17:15] Greenblatt and his ADL staff.
[00:17:17] I know you made us
[00:17:20] better with your feedback and ideas
[00:17:22] and recommendations, and it's been a
[00:17:23] treat to get to know you. We want to
[00:17:25] be in a country that makes clear that
[00:17:28] anti-semitism or any kind of racism is
[00:17:30] repugnant, unacceptable, not allowed in
[00:17:34] my space and we have just zero tolerance for
[00:17:37] it. So we need to hold these companies
[00:17:39] accountable and work with them to take
[00:17:40] it off the airwaves.
[00:17:43] It's hard to believe that's a real clip.
[00:17:44] We actually checked. Is that real?
[00:17:46] Congressman Don Bacon of Nebraska, a
[00:17:50] great and sensible state with tons of
[00:17:52] normal people, a former Air Force
[00:17:53] general. Is he really colluding with
[00:17:56] Jonathan Greenblatt of the ADL to take
[00:18:00] away your right to say what you think?
[00:18:02] Oh, you bet. That's exactly what he's
[00:18:06] doing. And make no mistake, the ADL is
[00:18:09] not an anti-defamation organization. The
[00:18:12] ADL practices defamation and slander and
[00:18:16] bullying and not in service of
[00:18:18] protecting a marginalized group but in
[00:18:20] accruing power and by
[00:18:24] forwarding its goals which are
[00:18:25] ideological.
[00:18:27] And if you don't believe that, go to its
[00:18:29] website and take a look at what the ADL
[00:18:31] considers hate speech. Hate speech:
[00:18:33] another one of those terms never quite
[00:18:34] defined, but the ADL has actually taken
[00:18:36] the time to define it. What do they
[00:18:38] consider hate speech? Well, among other
[00:18:40] things, complaining about drag queen
[00:18:43] story hour is hate speech according to
[00:18:45] the ADL. Huh. Not being enthusiastic
[00:18:49] about the COVID vax. That's hate speech
[00:18:52] and it's dangerous. Noticing that the
[00:18:54] American population has changed
[00:18:56] completely in the past 30 years thanks
[00:18:58] to immigration. That's dangerous hate
[00:19:00] speech. You should be punished for that.
[00:19:02] For noticing it, in your country that you
[00:19:05] were born in. No noticing. You can't
[00:19:07] notice that it looks completely
[00:19:09] different because of decisions that
[00:19:11] someone who never consulted you made
[00:19:13] without your knowledge. Shut up, says
[00:19:15] the ADL. You not only don't have the
[00:19:17] right to speak, we're going to scream at
[00:19:18] you and call you a Nazi and imply that
[00:19:21] you are the dangerous one. The people
[00:19:23] who opened up the borders to 50 million
[00:19:25] foreigners,
[00:19:26] but you're the dangerous one. Sure. You
[00:19:28] know what else is hate speech, by the
[00:19:29] way? Reading the Gospels of Matthew or
[00:19:32] Mark or Luke or John. The gospel itself,
[00:19:34] Christianity itself, is hate speech. I
[00:19:36] know that because three nights ago, I
[00:19:39] recounted the Christian story in its
[00:19:41] essence over like five minutes and was
[00:19:43] immediately denounced by the ADL as
[00:19:45] someone who is dangerous and inspiring
[00:19:47] murder.
[00:19:49] But I'm not the only one.
[00:19:51] The ADL has actively attacked the
[00:19:54] Christian gospel for years, has gotten
[00:19:57] behind a definition of hate speech that
[00:19:59] includes the Christian story. That's not
[00:20:01] an exaggeration. That's not a fevered
[00:20:04] conspiracy theory. That's a fact and you
[00:20:06] can look it up. So this is the guy,
[00:20:10] that's the guy: Jonathan Greenblatt of the
[00:20:12] most aggressively left-wing,
[00:20:14] Democratic-aligned, but much more important
[00:20:17] than that, lunatic, anti-human, anti-American
[00:20:21] group, the ADL. Completely corrupt.
[00:20:25] He's consulting that guy to decide how
[00:20:29] much speech you should have because
[00:20:31] there are ugly opinions on the internet.
[00:20:34] Yeah, that's your Republican party. Was
[00:20:37] he denounced by his fellow Republicans
[00:20:39] in the House? Was he denounced by the
[00:20:41] Speaker of the House, Speaker Johnson?
[00:20:44] No, he wasn't. They barely even noticed
[00:20:46] because they had the same views. Not all
[00:20:48] of them, but an awful lot of them.
[00:20:50] It's unbelievable. And it's
[00:20:52] counterproductive
[00:20:54] because once again, censorship is never
[00:20:57] enacted to help the powerless. It is
[00:20:59] always and everywhere an effort to
[00:21:01] shield the powerful. Always. and in fact
[00:21:04] has a counterproductive effect on the
[00:21:06] people it is supposedly designed to
[00:21:08] help. How would you feel about any
[00:21:11] person you're not allowed to criticize?
[00:21:13] Would that make you like the person
[00:21:15] more? No. It would make you resentful
[00:21:18] and suspicious and would give you the
[00:21:20] well-deserved opinion that this is not
[00:21:24] an egalitarian society in which we're
[00:21:26] all citizens. It's a hierarchical society
[00:21:28] in which the government has decided some
[00:21:30] people have more rights than others. So,
[00:21:32] if you find out you're not allowed to
[00:21:33] criticize someone else, maybe the first
[00:21:35] question you might ask is, "Well, then
[00:21:36] why are people allowed to criticize me?"
[00:21:39] And the answer is because some people
[00:21:40] have more power in our society, or you're being
[00:21:43] used to pit different groups against
[00:21:44] each other or who knows what's going on.
[00:21:47] But none of it is consistent with the
[00:21:49] core promise of this country, which is
[00:21:50] we're all citizens under our government
[00:21:53] and we're all equal before our God who
[00:21:55] made us. It says that,
[00:21:58] but increasingly that's not the country
[00:22:00] we live in. We live in a country where
[00:22:01] some people have more rights than
[00:22:02] others. And that's exactly the kind of
[00:22:05] message you would send if you wanted to
[00:22:06] foment a revolution against your
[00:22:07] government because it enrages people and
[00:22:09] it divides them from each other. Oh, we
[00:22:11] have to protect this group. What does
[00:22:13] everyone else think of that? They
[00:22:15] secretly don't like the group.
[00:22:17] You're not ending bigotry by enacting
[00:22:20] censorship. You're creating it, Dumbo.
[00:22:24] And this is specifically aimed at
[00:22:26] Congressman Bacon, who was somehow an
[00:22:29] Air Force general. He can't be dumb, but
[00:22:32] he's obviously not very thoughtful
[00:22:33] because this is very obvious.
[00:22:36] People don't like other people who get
[00:22:40] special treatment. Were you never a
[00:22:42] child? Did you never learn that?
[00:22:45] Who knows what the purpose is here? It
[00:22:47] doesn't even matter.
[00:22:49] It is happening right before us. The
[00:22:52] people who are elected to protect us,
[00:22:54] who say they're our friends, are selling
[00:22:56] us out. And you can theorize as to why.
[00:23:00] And by the way, all of that theorizing
[00:23:01] is itself unhealthy. Where do conspiracy
[00:23:04] theories come from? Where do you think
[00:23:05] they come from? They come from living in
[00:23:07] a country where the government will
[00:23:08] never explain anything and lies
[00:23:10] constantly. So the next time you see
[00:23:12] someone in power complain about
[00:23:13] malicious conspiracy theories, stop him
[00:23:15] in mid-sentence and say they exist
[00:23:17] because of you. If you would just tell
[00:23:19] the truth, if you would live like
[00:23:20] Charlie Kirk and answer the question
[00:23:23] politely, reasonably,
[00:23:26] fully,
[00:23:28] there wouldn't be a vacuum into which
[00:23:30] lunatics would rush. We would have a
[00:23:33] plausible answer to basic questions
[00:23:35] like, "What the hell is going on?" But
[00:23:37] because you haven't provided that, what
[00:23:40] do you think's going to happen? People
[00:23:42] are going to have some pretty far out
[00:23:44] explanations. And maybe some of them are
[00:23:45] true. By the way, we don't know. Your
[00:23:48] behavior is so suspicious because you
[00:23:51] can't answer any question straight. Any question.
[00:23:55] And you're spending your time talking to
[00:23:56] Jonathan Greenblatt, one of the darkest,
[00:23:59] most corrupt people in our society.
[00:24:02] Truly a divisive figure. Speaking of
[00:24:04] divisive,
[00:24:05] how many Americans have made fellow
[00:24:08] Americans hate each other more
[00:24:09] consistently over the years than
[00:24:10] Jonathan Greenblatt? Very, very few. Very
[00:24:13] few. and you're talking to him.
[00:24:18] So, if this sounds like a paranoid rant,
[00:24:20] like, "Oh, that could never happen."
[00:24:22] Well, you should know that it is
[00:24:23] happening right now in the state of
[00:24:25] California. The state of California like
[00:24:31] two weeks ago, 10 days ago, 8 days ago,
[00:24:33] something like that, has passed a law in
[00:24:36] the state legislature, both chambers of
[00:24:38] the state legislature. It awaits a
[00:24:40] signature from Gavin Newsom that would
[00:24:43] ban hate speech on the internet in
[00:24:47] California. Hate speech. Now, how do
[00:24:49] they define hate speech? I actually have
[00:24:51] the definition. I actually wrote it down
[00:24:52] because I was so shocked by it that this
[00:24:54] is happening. The state of California,
[00:24:56] if Gavin Newsom signs this law, and he
[00:24:58] has until October 13th to do it,
[00:25:02] people will be fined if the censors
[00:25:04] determine that speech constitutes, and
[00:25:07] we're quoting now, violence,
[00:25:09] intimidation, or coercion. What's
[00:25:11] intimidation or coercion? Right. Right.
[00:25:14] Or coercion based on race, religion,
[00:25:16] gender, sexual orientation, immigration
[00:25:19] status, or other protected
[00:25:21] characteristics.
[00:25:24] Obviously, white Christian men are not
[00:25:26] covered under that. And so, the
[00:25:28] society becomes ever more hierarchical
[00:25:31] with a Brahmin class and untouchables at
[00:25:33] the bottom. The opposite of the country
[00:25:35] all of us over 50 grew up in that had an
[00:25:37] egalitarian spirit where some were rich,
[00:25:39] some were poor, some were smart, some
[00:25:40] were dumb, some had good jobs, others
[00:25:42] were unemployed. But all of us were
[00:25:44] considered equal under the law and equal
[00:25:46] in the eyes of God. And that concept is
[00:25:49] the basis of a stable society, any
[00:25:51] stable society. And it was the basis of
[00:25:53] stability in this country. And laws like
[00:25:55] this, and the attitudes that give birth
[00:25:58] to laws like this, have made it
[00:26:01] wildly unstable, wobbly. It's so
[00:26:04] unstable.
[00:26:06] So as of October 13th, that could become
[00:26:09] law. Now that's a censorship law. Now
[00:26:11] they'll say, "No, no, no. We're just
[00:26:12] we're actually getting the platforms to
[00:26:13] censor." Well, right. You're getting
[00:26:16] someone else to do the job for you. But
[00:26:19] if you hire a hitman and he carries out
[00:26:21] the hit, you're the murderer. He
[00:26:23] participated in it, but you hired him.
[00:26:25] And that's exactly what's going on here.
[00:26:26] The state of California under Gavin
[00:26:28] Newsom is about to, we think, censor
[00:26:32] the opinions of Americans, not to
[00:26:35] protect anybody, but to shield
[00:26:37] themselves from criticism so they can
[00:26:39] continue to do what they want to do in
[00:26:41] secret.
[00:26:43] [Music]
[00:26:46] Jonathan Greenblatt,
[00:26:49] the head of the ADL,
[00:26:52] applauds this. And in case you're not
[00:26:54] familiar with Jonathan Greenblatt, and in
[00:26:56] case you want a sense of what he's like
[00:26:58] and what he considers hate speech, let's
[00:27:00] just go right to the tape so you know
[00:27:03] that we're not exaggerating. This is
[00:27:04] a Jonathan Greenblatt video. When you look
[00:27:07] at the prevalence of antivaxer accounts
[00:27:09] that have been amplified and spread
[00:27:11] across Facebook, they don't show up on
[00:27:13] your network, but they show up every day
[00:27:15] to billions of people because Facebook
[00:27:18] profits from amplifying these voices
[00:27:20] which are literally killing people. And
[00:27:23] freedom to express your opinion isn't
[00:27:25] the freedom to incite violence. But but
[00:27:28] for Facebook, it is. And that needs to
[00:27:30] change. That's all. It's simple. There's
[00:27:33] nothing wrong with keeping all of us
[00:27:35] safe from violent white supremacists or
[00:27:38] hateful people.
[00:27:41] >> So criticizing the COVID vax is
[00:27:44] tantamount to murder. I mean, obviously,
[00:27:46] that's prima facie
[00:27:48] insane. It's untrue. It's a deranged
[00:27:51] perspective. But more than anything
[00:27:54] you're seeing who Jonathan Greenblatt
[00:27:56] really is. He is a faithful Praetorian
[00:28:00] guard for the people in charge. This is
[00:28:02] not someone who's ever challenged actual
[00:28:04] power. Not once in his life. That's who
[00:28:06] he works for. That's who he takes money
[00:28:09] from.
[00:28:12] That's what hate speech looks like.
[00:28:14] Anybody in charge
[00:28:17] can make you shut up when you criticize
[00:28:20] them or stand in the way of their aims.
[00:28:24] So, in case you don't think this can
[00:28:26] come to the United States, one final
[00:28:28] clip, and it's a sad one, and it comes
[00:28:30] from the UK. Now, the UK, obviously, the
[00:28:33] country that gave birth to ours, a
[00:28:35] cousin, a country so similar to ours and
[00:28:37] so close, 6 hours by plane overnight,
[00:28:40] that we don't really think of it as
[00:28:41] fully foreign. It's not like going to
[00:28:43] Malaysia or Burundi or even France. It's
[00:28:46] an English-speaking country whose
[00:28:48] customs are recognizable, whose
[00:28:51] government and common law form the basis
[00:28:54] of our government and our law.
[00:28:57] Everything about England
[00:28:59] seems like home, but three degrees off. And yet the
[00:29:03] UK has become a police state. And if you
[00:29:08] don't believe that, if you think that's
[00:29:10] just hyperbole designed to whip you into
[00:29:12] a frenzy, here's a stat that we checked
[00:29:14] and it's it's hard even to believe this
[00:29:16] is true, but this is true.
[00:29:19] 2023,
[00:29:21] so like a year and a half ago, how many
[00:29:24] people do you think were arrested in the
[00:29:26] United Kingdom for speech violations?
[00:29:29] Arrested by the police, handcuffed, and
[00:29:31] brought to jail in 2023.
[00:29:34] Couple dozen. You know, the ones you see
[00:29:35] on X, the ones Fox News talks about. How
[00:29:38] many people in that year were arrested
[00:29:40] for saying things the government didn't
[00:29:42] want them to say? What's your guess? Is
[00:29:44] it more than 12,000? Cuz that's the
[00:29:47] answer. More than 12,000.
[00:29:50] Wow, that seems like a lot. Is that a
[00:29:52] lot? I mean, it's kind of hard to know,
[00:29:53] right? Okay. Well, let's compare it to
[00:29:57] the number, the widely agreed upon
[00:29:59] number from the most totalitarian
[00:30:02] country in the world. A country so
[00:30:05] lacking in basic freedom. A country run
[00:30:07] by a mad man. A country that's so evil,
[00:30:10] we're literally at war with that country
[00:30:12] right now just on principle because we
[00:30:13] so disapprove of how they treat their
[00:30:15] people. And that country of course is
[00:30:17] Russia under Vladimir Putin.
[00:30:20] So, if the UK
[00:30:22] handcuffed 12,000, more than 12,000
[00:30:25] people in one year for saying things the
[00:30:27] government didn't like, how many were
[00:30:29] arrested in Russia, a country with twice
[00:30:32] the population of the UK? Oh, we happen
[00:30:35] to have the number. 3,319.
[00:30:39] So to restate, more than 12,000 people
[00:30:42] arrested in the United Kingdom, England
[00:30:47] in one year for speech code violations,
[00:30:50] 3,300 arrested in Russia, a country with
[00:30:54] twice the population. So that tells you:
[00:30:57] you don't think totalitarianism can come
[00:30:58] to the Anglosphere? Oh, it already has.
[00:31:01] We haven't even touched on Australia,
[00:31:03] New Zealand, Canada. In some ways, even
[00:31:05] worse.
[00:31:06] But what does it look like? What is the
[00:31:08] face of hate crime prosecution? What
[00:31:13] does it actually look like when a
[00:31:14] citizen is arrested for saying something
[00:31:15] the government doesn't like? This video
[00:31:18] is not from China. It's from the United
[00:31:19] Kingdom. This is a British veteran being
[00:31:22] arrested for offending the government.
[00:31:23] Watch
[00:31:26] >> police would realize how ridiculous this
[00:31:29] is.
[00:31:29] >> It is ridiculous.
[00:31:30] >> It is. Did it need to come to this? What did it need
[00:31:34] to come to? Tell us why you
[00:31:36] escalated it to this level cuz I don't
[00:31:37] understand.
[00:31:38] >> I posted something that he posted. You
[00:31:40] come to arrest me. You don't arrest him.
[00:31:42] Why has it come to this? Why am I in
[00:31:44] cuffs? Because of something he shared
[00:31:46] then I shared
[00:31:47] >> because someone has been caused
[00:31:48] obviously anxiety based upon your social
[00:31:52] media post. That's why you've been
[00:31:54] arrested.
[00:31:57] >> Oh yes. The velvet-wrapped jackboot of
[00:32:00] British fascism. You're being arrested
[00:32:03] because someone has been caused anxiety
[00:32:05] by your views. Notice that someone is
[00:32:07] never identified. And of course, the
[00:32:08] answer is someone in power. Someone in
[00:32:10] the government or someone who funds the
[00:32:11] government, someone close to the
[00:32:12] government, someone who has a lot more
[00:32:13] power than you, didn't like what you
[00:32:14] were saying, felt anxious about what you
[00:32:16] were saying. And so, unfortunately,
[00:32:17] we're going to have to handcuff you and
[00:32:19] bring you to jail.
[00:32:24] That happened this year. That's from
[00:32:25] January. And it happens
[00:32:27] every single day. More than 12,000
[00:32:29] people arrested every single year for
[00:32:30] criticizing their government in the UK.
[00:32:32] Our closest ally with whom we share
[00:32:36] intelligence on every level, British
[00:32:38] intelligence. I know everyone's spun up
[00:32:39] about Mossad, very close to Mossad. We're
[00:32:41] closer to British intelligence. That's
[00:32:44] the country we're partnering with to spy
[00:32:46] on our respective populations.
[00:32:50] Yeah. So, it's really, really simple. If
[00:32:53] a government, if your government is
[00:32:56] willing to arrest you for saying things
[00:32:59] that they don't like, if your government
[00:33:01] is arresting you for criticizing them
[00:33:05] one way or another, you need a new
[00:33:07] government. If there is any
[00:33:10] justification for revolution, it's that
[00:33:13] that's unacceptable. That's tyranny. A
[00:33:15] government that does that is not a
[00:33:16] legitimate government. has absolutely no
[00:33:18] right to do that and it should be
[00:33:20] stopped from doing that immediately.
[00:33:22] That's the red line right there. Michael
[00:33:25] Shellenberger is one of the great
[00:33:27] reporters in the United States, a friend
[00:33:29] of ours and someone who as a former
[00:33:32] liberal has probably thought about
[00:33:34] speech uh for more years and with more
[00:33:37] clarity than probably anyone I know. And
[00:33:39] so we're so grateful to have him on to
[00:33:41] assess the state of free speech in the
[00:33:43] United States 2 weeks after Charlie
[00:33:44] Kirk's assassination. Mike, thanks very
[00:33:46] much uh for coming on. Are you worried?
[00:33:51] >> I'm very worried. I mean, I think
[00:33:54] what's maybe under-said recently is
[00:33:57] that, you know, assassination is the
[00:33:59] ultimate form of censorship. You know,
[00:34:01] and it comes from the same place.
[00:34:04] You know, I think that's what everybody
[00:34:06] senses about it is that
[00:34:07] >> there had been efforts, you know, to
[00:34:09] censor, you know, and they had,
[00:34:11] I think, censored Charlie Kirk,
[00:34:12] obviously. I mean the Twitter files we
[00:34:14] discovered that he was on a blacklist.
[00:34:17] There was obviously huge attempts to
[00:34:18] keep him out of universities. He had
[00:34:20] already had you know many death threats
[00:34:22] which is uh not a direct form of
[00:34:24] government censorship but these are
[00:34:25] societal demands that he be silenced.
[00:34:29] And then at a global institutional level
[00:34:32] yeah it's a very disturbing trend that
[00:34:34] we're seeing. I mean I think there's two
[00:34:35] things happening. And there's both a
[00:34:37] organic kind of demand from powerful
[00:34:40] people like the kind that you were
[00:34:42] describing where a politician just
[00:34:44] really, you know, I think it was
[00:34:46] Moskowitz, where he just kind of can't
[00:34:48] stand something and they just want to
[00:34:49] see something taken down and you see
[00:34:51] that from groups and politicians. And
[00:34:54] then there's more of an inorganic demand
[00:34:56] for censorship which we've labeled the
[00:34:58] censorship industrial complex. You can
[00:35:00] call it the censorship industry. And
[00:35:03] that is in place in the European Union,
[00:35:06] in Britain, in Brazil, California would
[00:35:10] like to have that. Basically, all the
[00:35:12] five eyes countries are pursuing that.
[00:35:15] And their strategy, I think, is pretty
[00:35:17] clear at this point, is to encircle the
[00:35:19] United States and to make our tech
[00:35:21] platform censor along the lines that
[00:35:24] they would like to so they can achieve
[00:35:26] censorship through the back door. And
[00:35:29] this has always been their strategy
[00:35:30] because they know that the first
[00:35:31] amendment is a major obstacle for them
[00:35:35] since it requires that the people have
[00:35:37] really radical levels of free speech
[00:35:39] that no country has ever come close to.
[00:35:41] And as you said based on this idea of
[00:35:43] natural rights that we are granted by
[00:35:46] our creator, not given to us by the
[00:35:48] government. The speech comes before the
[00:35:50] government. The speech is how we
[00:35:51] constitute our government. Whereas in
[00:35:53] Europe and everywhere else, the
[00:35:54] government had gradually, you know, let
[00:35:57] people say certain things. You have to
[00:35:58] petition the king. Oh, king, can we
[00:36:00] criticize you for sleeping with Anne
[00:36:02] Boleyn? And the king would decide
[00:36:03] whether that would be okay or not. And
[00:36:05] and that's how it would occur. And the
[00:36:08] creators of this amazing country and as
[00:36:10] you work on free speech, I've only
[00:36:12] worked on it now for really two and a
[00:36:14] half years. I'm a newbie to the
[00:36:16] issue, but one of the things you just
[00:36:18] really appreciate is really how radical
[00:36:20] and powerful and strong that commitment
[00:36:22] to the First Amendment was. It's not
[00:36:24] just hype. You might just think, is that
[00:36:25] just patriotic hype from Americans? It's
[00:36:28] not when you read the history of free
[00:36:30] speech over 2500 years going back to
[00:36:32] Socrates who was put to death uh for
[00:36:35] things that he said, also an act of
[00:36:37] censorship.
[00:36:38] um then you realize just how radical
[00:36:41] and how beautiful the First
[00:36:42] Amendment is, because the Americans that
[00:36:44] created our country said we don't want
[00:36:45] to have a country we don't have a
[00:36:48] government if we can't have full free
[00:36:50] speech with some very narrow exceptions
[00:36:53] and so the exceptions now they sort of
[00:36:54] say well the internet changed everything
[00:36:56] that's what you hear from that's what I
[00:36:58] hear from my progressive friends heard
[00:37:00] it on Martha's Vineyard of all places
[00:37:02] it's all changed with the internet it's
[00:37:04] too dangerous to allow this high level
[00:37:06] of free speech we have to change things.
[00:37:08] Well, Tucker, to to put into to put in
[00:37:10] context how crazy that is. Our Supreme
[00:37:13] Court has ruled not once but twice that
[00:37:16] Nazis can march through neighborhoods
[00:37:19] not only of Jewish Americans but of
[00:37:21] Holocaust survivors. And the line
[00:37:25] that gets crossed
[00:37:27] between speech and actual violence is
[00:37:30] when I say, go kill that person there, or
[00:37:33] go light that house on fire. It's when the
[00:37:35] speech becomes part of the action, you
[00:37:39] know, or coordinating an assassination
[00:37:40] or something. Of course, language in
[00:37:42] that context
[00:37:43] >> has to be illegal because it's part of
[00:37:45] it's part of breaking the law. But
[00:37:48] marching through a neighborhood with the
[00:37:49] most vile ideology is something that the
[00:37:52] Supreme Court has twice upheld. Well,
[00:37:54] now we're supposed to believe that some
[00:37:56] racist comments on a Facebook
[00:37:59] post or as you said, it's really
[00:38:02] political. It's really about stifling
[00:38:04] the conversation around migration,
[00:38:07] gender, climate. Uh I mean, it's
[00:38:10] actually been less on race, more, you
[00:38:12] know, huge amount on on trans issues. I
[00:38:14] mean, we just saw a British gentleman,
[00:38:16] Graham Linehan, a famous uh television
[00:38:19] comedy writer, get arrested when he
[00:38:22] landed in Britain for having urged uh
[00:38:26] biological women to defend themselves
[00:38:28] from biological males that come into
[00:38:30] their bathrooms. So, it is a very
[00:38:32] serious threat. Uh Tucker, I think that
[00:38:35] the thing to keep our eye on is they've
[00:38:37] been trying to basically get governments
[00:38:39] to empower a mostly secret group of
[00:38:42] so-called NGOs that would be financed by
[00:38:45] the government and who would be telling
[00:38:47] the social media companies what to take
[00:38:49] down. In some places, they're more
[00:38:50] subtle with it than in others. That's
[00:38:52] the big threat. The Trump administration
[00:38:54] has done some great things to defund
[00:38:56] that. It was all going to work through
[00:38:58] NSF and then Congress would have to
[00:39:00] bless it and that was the way that they
[00:39:01] were going to do it. So the the Trump
[00:39:02] administration has done a great job
[00:39:05] defunding that and also holding a strong
[00:39:07] line on both Europe and Brazil putting,
[00:39:10] you know, demanding that free speech
[00:39:12] protections uh be there. But obviously,
[00:39:15] you know, we see some some backsliding
[00:39:17] and some behaviors over the last couple
[00:39:19] of weeks that were lamentable. Certainly
[00:39:21] the attorney general's comments, which
[00:39:23] she then later kind of went back on and
[00:39:25] said she didn't mean what she had said.
[00:39:28] And then obviously or maybe not
[00:39:29] obviously, but a dust-up over the FCC
[00:39:33] chair and his comments around Jimmy
[00:39:35] Kimmel which I got to say as the days go
[00:39:37] by and you look in retrospect it just
[00:39:39] seems absurd. I mean you had Democrats
[00:39:41] trying to create this elaborate
[00:39:43] censorship system and then you had some
[00:39:45] bad-mouthing of Jimmy Kimmel. It wasn't
[00:39:48] great. I think
[00:39:51] it was inappropriate. Um, but I
[00:39:54] think there was a lot of other
[00:39:55] complicating factors too and there was,
[00:39:57] you know, economic concerns around Jimmy
[00:39:59] Kimmel and just sort of this demand
[00:40:01] that, you know, that he have to be
[00:40:03] carried on every television station. I
[00:40:05] think in retrospect, uh, we won't look
[00:40:08] back on as it's not a great moment. I
[00:40:10] don't think the Trump administration
[00:40:11] covered itself in glory over the last
[00:40:12] couple of weeks on that. But on on the
[00:40:15] European side of holding strong on free
[00:40:17] speech and also in standing up to
[00:40:19] Brazil, uh, I do give the Trump
[00:40:21] administration credit. All it did was
[00:40:23] save Jimmy Kimmel. I mean, Jimmy Kimmel
[00:40:25] is on his way out. Nobody watches that.
[00:40:27] It's crap. I mean, it has no effect on
[00:40:29] American society. It's just
[00:40:30] masturbatory. Really, he's
[00:40:32] playing for an audience of one: himself.
[00:40:34] And um this kind of allowed him to pose
[00:40:37] as a free speech defender. I got to say,
[00:40:39] On the TikTok thing, I think I sort of
[00:40:41] missed that. I don't know why I wasn't
[00:40:43] paying attention. I should have been,
[00:40:45] >> but TikTok was banned by the Congress,
[00:40:48] uh, or forced to sell. The Chinese-owned
[00:40:50] company ByteDance owned TikTok, and the
[00:40:53] argument in the Congress was well we
[00:40:55] can't have foreign ownership of you know
[00:40:58] a critical service like social media in
[00:41:00] this country and so it has to be owned
[00:41:03] um at least 51% by Americans. Okay. I
[00:41:06] just I don't know why I missed the
[00:41:08] significance of this. Then it turns out
[00:41:10] and they said this openly. It had
[00:41:11] nothing to do with China and members of
[00:41:14] Congress said this, a lot of
[00:41:16] Republicans: I'm voting to shut down
[00:41:18] TikTok because people are starting to like
[00:41:21] Hamas when they watch it. Now, I'm not
[00:41:23] endorsing Hamas, obviously. I'm not pro
[00:41:26] Hamas at all. But Americans have a right
[00:41:29] to like anything they want and to come
[00:41:31] to their own conclusions about some
[00:41:33] foreign conflict or even domestic
[00:41:35] conflict, any conclusion they want
[00:41:37] because they're not slaves. So, is it
[00:41:40] okay for the Congress to decide, I don't
[00:41:41] like the effects, the radicalization
[00:41:44] of this one social media platform, so
[00:41:48] I'm going to shut it down? I mean, is
[00:41:50] that allowed? Can they do that?
[00:41:53] Well, this gets to, let me come to
[00:41:55] it by addressing one of the things
[00:41:57] that I thought you rightly discussed in
[00:41:59] your monologue, which is this uh very
[00:42:01] wonky but important issue of this law
[00:42:04] called section 230 and the nature of
[00:42:07] these platforms that we have. And I
[00:42:09] think it's helpful to think of these
[00:42:10] platforms at this point as utilities.
[00:42:13] They're monopoly utilities, right? You
[00:42:15] could say there's some competition
[00:42:16] between Instagram and TikTok and X
[00:42:19] and there's truth to that, but there's
[00:42:21] often situations in in monopoly
[00:42:23] environments where there's some
[00:42:24] competition. Uh but they really do
[00:42:27] operate, I think, functionally as
[00:42:29] monopolies. And they're they're already
[00:42:31] regulated monopolies by section 230.
[00:42:33] They're already saying to them, you're a
[00:42:35] different category of business. Um
[00:42:37] you're not liable if you take down
[00:42:40] illegal content. You can't be
[00:42:42] sued for having had that content on your
[00:42:44] website. It still requires them to take
[00:42:46] it down. I think my view, and I've
[00:42:49] published a couple of white papers on
[00:42:51] it, I've testified on it, hasn't
[00:42:53] exactly caught fire. My uh proposal
[00:42:56] >> I've got a lot of views like that too.
[00:42:59] >> Yeah. But I think like I think that uh
[00:43:01] you know I mean look one thing you have
[00:43:02] to understand about these these big tech
[00:43:04] companies, they're so powerful. I
[00:43:06] was shocked when I learned that, like,
[00:43:08] Facebook has a different lobbyist for
[00:43:09] House Republicans than for House
[00:43:12] Democrats and a separate lobbyist for
[00:43:14] Senate Republicans, Senate Democrats. I
[00:43:15] mean, these guys really put a lot of
[00:43:17] money into having that control over
[00:43:19] Congress. So, uh that's a little bit
[00:43:21] like the utilities uh power uh at the
[00:43:23] state level, the electric utilities
[00:43:25] power at the state level. So, it's a
[00:43:27] regulated environment, but I do think
[00:43:29] public interest voices like yourself and
[00:43:31] Joe Rogan and others uh out there
[00:43:34] carrying this message is really
[00:43:35] important because I think what's in the
[00:43:37] public interest is that we actually do
[00:43:40] keep 230 but make it contingent on
[00:43:43] allowing all adult users to filter our
[00:43:48] own content, our own legal content. So
[00:43:51] in other words, all the child
[00:43:53] exploitation stuff obviously still being
[00:43:55] policed as we do now. All of the, you
[00:43:58] know, copyright violation, all that
[00:44:00] stuff still being policed as is now. But
[00:44:02] when you would go into social media
[00:44:04] platform, you'd have a chance to
[00:44:05] basically decide what you wanted to see
[00:44:06] and what you didn't want to see. And
[00:44:08] there was even some talk among uh
[00:44:10] Republicans who I respect, but I
[00:44:12] disagree on this issue that were upset
[00:44:13] that the the video of Charlie Kirk being
[00:44:16] assassinated was on X. I mean, it was
[00:44:18] quite shocking. I have to agree with
[00:44:20] that. But I don't think the solution is
[00:44:22] to necessarily take it down, but you
[00:44:23] could certainly create your own filter
[00:44:24] that I want to filter out any scenes of
[00:44:26] people being physically harmed. You
[00:44:28] could have a lot of different filters.
[00:44:30] You could have the Tucker Carlson
[00:44:31] filter. You could have the Greta Thunberg
[00:44:33] filter, but the users would be able
[00:44:35] to do that. And then, of course, the
[00:44:37] platforms, like X already does, can feed
[00:44:39] you their own separate
[00:44:40] stream. I think Elon has gotten pretty
[00:44:44] close to that at X. Not as close as I
[00:44:46] would love to see it, but we're so
[00:44:48] grateful because I mean the impact that
[00:44:51] he's had has been so enormous in terms
[00:44:53] of ending this censorship
[00:44:56] fact-checking mafia. Mark Zuckerberg at
[00:45:00] Meta earlier this year decided he was
[00:45:02] going to copy the Elon Musk model of
[00:45:04] crowdsourcing, which is what the spirit
[00:45:06] of the first amendment is. We
[00:45:07] crowdsource truth with the first
[00:45:09] amendment. And then we just saw Google
[00:45:11] yesterday uh in a letter to chairman Jim
[00:45:14] Jordan in the house said that they would
[00:45:16] move to something more like that. So
[00:45:18] those are those are good directions, but
[00:45:20] for me, that should be the only aim
[00:45:22] of Section 230 reform: to actually, uh,
[00:45:26] expand the speech that's allowed, not
[00:45:29] restrict it. What they want to do is they
[00:45:31] want to basically give the deep state,
[00:45:33] you know, for lack of a better word,
[00:45:35] DHS, NSF, DoD, the power to kind of
[00:45:39] choose the the people that will decide
[00:45:41] what the truth is, as these NGOs who
[00:45:43] would then get NSF money, which is, you
[00:45:45] know, public money, national science
[00:45:47] money, and that they would then get
[00:45:48] special access to the data. I mean, this
[00:45:50] was their whole vision, and they were
[00:45:51] close to achieving it. We only really
[00:45:53] discovered it with the Twitter files.
[00:45:55] That's their holy grail is to be able to
[00:45:57] control it that way. They're
[00:45:58] set back in the United States,
[00:46:00] but they are moving for sure in that
[00:46:03] direction in Europe, uh, Britain, uh,
[00:46:06] Brazil, and certainly California would
[00:46:07] like to do the same. I hope that if
[00:46:09] Gavin does sign that atrocious
[00:46:10] legislation that you were describing,
[00:46:12] Tucker, I hope that the Supreme Court or
[00:46:14] that the courts strike it down. They
[00:46:17] struck down the last California
[00:46:18] censorship initiative which was aiming
[00:46:20] at uh banning uh AI, you know, parodies
[00:46:25] and that got struck down by a judge in a
[00:46:27] very eloquent uh decision. But that's
[00:46:29] kind of where we're at and why I think
[00:46:30] your special on this is so important
[00:46:33] because we're on a knife's edge. On the
[00:46:34] one hand, I think we're making some good
[00:46:36] progress here in the United States in
[00:46:37] exposing the censorship um and shutting
[00:46:40] and defunding it, but I think worldwide
[00:46:42] the trends are in the wrong direction.
[00:46:44] and on college campuses with young
[00:46:46] people. Unfortunately, we've seen an
[00:46:48] increase in support for censorship.
[00:46:50] Really, with every generation, from baby
[00:46:52] boomers to Gen Xers to millennials to
[00:46:54] zoomers, we see support for
[00:46:56] censorship going up. And it's exactly
[00:46:58] like what you were saying. It's about
[00:47:00] protecting feelings. It's this, you
[00:47:02] know, to use a bit of jargon, it's this
[00:47:04] expressive individualism, which is that
[00:47:06] my feelings are like the most important
[00:47:08] thing in the world, and if my feelings
[00:47:09] are hurt, then somebody has to pay a
[00:47:11] price. That sadly is the ideology and
[00:47:14] that's why I think the censorship is in
[00:47:16] this broader
[00:47:18] >> you know cultural decline you know where
[00:47:20] it's like the intolerance and the
[00:47:23] entitlement that people feel are these
[00:47:26] big forces that have been I think
[00:47:27] driving those demands for censorship.
[00:47:29] >> I think it's I think everything you've
[00:47:31] said is absolutely true. I think it may
[00:47:33] even be more insidious than that.
[00:47:35] However, I think that there are decent
[00:47:37] people who've had their best impulses
[00:47:40] hijacked by totalitarians
[00:47:42] and used against them. In other words, I
[00:47:44] think there are like good people,
[00:47:45] Americans, mostly women, to be totally
[00:47:47] blunt about it, who are like, "Oh, we
[00:47:50] can't be mean to this or that group."
[00:47:51] Well, I think that's a good impulse, by
[00:47:53] the way. We shouldn't be mean to any group,
[00:47:55] and the weaker they are, the more
[00:47:57] careful we should be about being mean to
[00:47:58] them because you don't want to be a
[00:47:59] bully, right? Like, that's a good
[00:48:00] impulse. But that impulse is hijacked by
[00:48:03] the censors who are acting on their own
[00:48:05] behalf, not on behalf of whatever
[00:48:07] marginalized group they claim they're
[00:48:08] acting on behalf of. They don't care
[00:48:10] about those groups, obviously. The lives
[00:48:12] of black people in inner cities did not
[00:48:14] improve after the George Floyd riots,
[00:48:16] obviously. But they they hijack that and
[00:48:19] they say, "If you care about the weakest
[00:48:21] among us, you will get on board with
[00:48:24] censorship." I I think that's really
[00:48:27] clever and insidious and evil, but
[00:48:30] effective.
[00:48:32] Yeah, I think that's right. I think it's
[00:48:34] um it's a manipulation. You know, it
[00:48:37] just shows how emotional the culture has
[00:48:40] gotten, you know, that you can appeal to
[00:48:42] those feelings. I mean, Tucker, it's so
[00:48:45] easy to show how much of an abuse of
[00:48:47] power you can get with these hate speech
[00:48:49] laws. I mean, it's worth considering.
[00:48:50] For example, you'll get people that will
[00:48:52] be like, "Oh, well, are you defending
[00:48:54] the right of people to like call for
[00:48:56] genocide?" And you kind of go, "Oh my
[00:48:58] god, well, no. I mean, that's horrible.
[00:48:59] I don't want people to call for
[00:49:00] genocide. And so you kind of go, so
[00:49:02] we'll carve that out. But then you kind
[00:49:03] of go, well, wait a second. So the same
[00:49:05] people that are saying that to you are
[00:49:07] the ones that
[00:49:08] >> point out as soon as you talk about the
[00:49:10] American experiment of the 17th and 18th
[00:49:13] and 19th centuries that the that the
[00:49:16] European settlers committed a genocide
[00:49:18] to create the United States of America.
[00:49:20] So how hard would it be to criticize
[00:49:23] somebody that say praised America,
[00:49:26] praised the western expansion, praised
[00:49:28] the the west opening up and the European
[00:49:30] settlers, uh, frankly, taking over
[00:49:33] this land from
[00:49:35] indigenous people. Someone could say
[00:49:36] that's defending genocide. So you see
[00:49:38] how easy and quickly it comes. I'll give
[00:49:40] you another example. Institute for
[00:49:42] Strategic Dialogue, which is one of
[00:49:44] these deeply sinister I mean they would
[00:49:46] not return any phone calls or whatever
[00:49:48] and they personally uh uh smeared me and
[00:49:51] a lot of others.
[00:49:53] >> Can I ask you to stop? So the the Center
[00:49:56] for Strategic Dialogue didn't want any
[00:49:58] dialogue.
[00:49:59] >> Yeah, the Institute for Strategic
[00:50:01] Dialogue refused to have dialogue.
[00:50:03] >> I'm not surprised.
[00:50:04] >> A creepy, creepy group close to British
[00:50:07] intelligence. I mean, "close to" is
[00:50:10] me being generous; it's
[00:50:12] clearly an intermediary
[00:50:14] for deep state British organizations
[00:50:16] that is very interested in censoring
[00:50:18] Americans and we see this dynamic a lot
[00:50:20] where the British groups
[00:50:22] pick on Americans
[00:50:26] because the US intelligence community
[00:50:29] can't directly go after Americans, so
[00:50:30] they get their British allies to do
[00:50:33] it. And this is a group that,
[00:50:36] along with the Center for Countering
[00:50:37] Digital Hate, an equally sinister
[00:50:39] organization very connected to the
[00:50:40] Labour Party of Britain, labeled
[00:50:43] criticism of George Soros
[00:50:46] as anti-semitic. And Tucker, I'm not
[00:50:48] talking about criticism of George Soros that
[00:50:51] even noticed that he's Jewish. It
[00:50:53] wasn't even that. It wasn't like they
[00:50:55] said "the Jewish philanthropist George
[00:50:57] Soros." Just criticizing
[00:50:59] George Soros
[00:51:01] >> was anti-semitic. That's how crazy it is.
[00:51:05] And you look at
[00:51:07] the number of institutions that have
[00:51:08] been pushing this. It's the European
[00:51:09] Union. It's NATO. It's the United
[00:51:12] Nations. It's Germany, France, Britain.
[00:51:15] The United States is really powerful. And I
[00:51:17] think that the president did a good job
[00:51:19] pushing back against those types of
[00:51:21] people around the world. But it is
[00:51:23] important to remember that the European
[00:51:24] economy is bigger than the United States
[00:51:26] economy. And certainly when you go and
[00:51:28] kind of look at a world with this
[00:51:30] incredible economic power of China and
[00:51:33] the gravity it exercises on all these
[00:51:35] other countries in the world, including
[00:51:36] Europe, including Brazil. I mean, it was
[00:51:39] notable that when Trump punished Brazil
[00:51:41] with tariffs for its censorship and
[00:51:43] banning their
[00:51:45] opposition party leader and leading
[00:51:47] presidential candidate, Bolsonaro, that
[00:51:49] China made up the difference in the loss
[00:51:51] of trade. So you sort of start to see,
[00:51:54] around the world,
[00:51:56] this kind of organic
[00:51:58] decline of real belief in and support for
[00:52:00] free speech, with a kind of global move
[00:52:02] towards this censorship industrial
[00:52:05] complex, a system of censorship by proxy.
[00:52:08] It is disturbing because they can
[00:52:10] exercise economic power over our
[00:52:11] platforms. And, you know, I
[00:52:14] think Elon has given us good reason that we
[00:52:16] can basically trust what he's
[00:52:18] done. I think he's made great decisions,
[00:52:20] for the most part, since he's taken
[00:52:22] over the platform. But, you know, if
[00:52:24] anything were to happen to Elon, I mean,
[00:52:25] these companies, Mark Zuckerberg, Google's
[00:52:28] Sundar Pichai, they've shown themselves
[00:52:30] to be quite cowardly. Facebook was
[00:52:33] worried about losing the help of the
[00:52:35] Biden administration. That was enough to
[00:52:37] get Facebook to censor: they were
[00:52:39] worried that without enough help
[00:52:41] from the Biden administration they couldn't
[00:52:43] retrieve their very valuable user data
[00:52:45] from Europe, which European law restricts.
[00:52:48] And that's why they agreed to do
[00:52:50] censorship that even their own social
[00:52:52] scientists within Facebook said would
[00:52:54] backfire because they were like, look,
[00:52:56] if you go censor mothers sharing
[00:52:59] information about the COVID
[00:53:01] vaccine side effects, it will make
[00:53:02] mothers more nervous. That's what the
[00:53:04] internal people at Facebook said. And
[00:53:06] the Facebook execs were like, "We better
[00:53:08] just give the Biden administration what
[00:53:09] they want, otherwise they're not going
[00:53:10] to help us with our data in Europe." So,
[00:53:12] it doesn't I mean, it's it's not hard to
[00:53:15] imagine, you know, the power that these
[00:53:17] states can exercise on these platforms.
[00:53:19] And I don't think that that threat has
[00:53:21] gone away.
[00:53:23] >> It doesn't seem like a good system if
[00:53:25] one, you know, South African-born
[00:53:28] naturalized American is the only thing
[00:53:31] standing between us and tyranny. I mean,
[00:53:33] I really think that the media wouldn't
[00:53:36] have done this. I mean, I work in the media. My
[00:53:38] whole life I've worked in the media.
[00:53:39] Elon Musk did this. Elon Musk did all
[00:53:42] this, and he did it because, I think he
[00:53:45] says, because he believes in it.
[00:53:46] Whatever the cause, he did it. He opened
[00:53:49] up everything.
[00:53:49] >> Yeah.
[00:53:50] >> So, which I will never stop being
[00:53:53] grateful for, obviously. However, that's
[00:53:55] a pretty thin thread kind of holding
[00:53:58] your civilization aloft, no?
[00:54:02] >> Yeah. Yeah, I mean, the things I really
[00:54:04] worry about are those numbers, you know.
[00:54:06] I mean, the number of young people,
[00:54:12] the number of college students, the
[00:54:15] share of college students that said that
[00:54:17] violence may sometimes be necessary to
[00:54:20] stop a campus speaker was under 20% in
[00:54:23] the year 2020. It's 34%
[00:54:27] right now. That means one-third of
[00:54:29] college students think that violence
[00:54:31] might be necessary to stop a campus
[00:54:33] speaker. That is, I mean, that's
[00:54:36] pathological. I don't know if there's
[00:54:37] another word for it. That's bonkers,
[00:54:39] >> crazy, scary behavior. And so, you know,
[00:54:43] remember George Orwell, he was, you
[00:54:46] know, a leftist, wrote 1984 because he
[00:54:50] had read James Burnham, who's this very
[00:54:52] famous former Trotskyist who
[00:54:54] becomes a conservative and writes, you
[00:54:56] know, this book about the managerial
[00:54:58] state, which is basically about how this
[00:55:01] totalitarianism would kind of emerge out
[00:55:03] of the society and out of the state, in
[00:55:07] these very specific safetyist,
[00:55:11] harm-reduction demands, that
[00:55:14] you would sort of get a whole
[00:55:15] state of busybody,
[00:55:18] nanny-state people that wanted
[00:55:20] to police the speech. I mean, that was his
[00:55:22] prediction back in the late 1940s,
[00:55:25] when 1984 came out,
[00:55:28] that was so brilliant, I mean, it's
[00:55:30] terrifyingly, brilliantly prescient,
[00:55:32] because that's what I worry about. And I
[00:55:34] think, you know, I was obviously very
[00:55:37] moved by the
[00:55:39] response to the Charlie Kirk
[00:55:41] assassination and the desire to go into
[00:55:43] universities. I think we need to figure
[00:55:45] out how to move that number down so that
[00:55:48] the young people,
[00:55:50] and everybody, become more
[00:55:52] comfortable with debate. Yeah. I mean, look, I
[00:55:55] mean, Charlie really was inspiring in
[00:55:58] the way that he would go into places, and
[00:55:59] of course the sign said "prove me wrong."
[00:56:01] He was saying, look, I'm open to debate.
[00:56:04] I mean, it's so, I don't know,
[00:56:07] ironic is not the right word. It's so
[00:56:10] powerful that the person that was
[00:56:13] assassinated was the person doing what
[00:56:16] we need the most of. That the person
[00:56:18] that was killed was the person who was
[00:56:20] doing what we need to see much much more
[00:56:23] of at the high schools and the colleges,
[00:56:25] which is getting people very comfortable
[00:56:28] with having difficult conversations and
[00:56:30] with having conversations with people
[00:56:32] whose values you don't share and who
[00:56:34] believe things that you find
[00:56:35] reprehensible. And that is at the heart
[00:56:37] of it. And I don't know. I mean, it's
[00:56:39] sort of what's terrifying is that, you
[00:56:42] know, those numbers of intolerance kept
[00:56:44] increasing over the last 10 years. I
[00:56:47] hope that we've hit an inflection point.
[00:56:48] I will say, Tucker, one number that did
[00:56:51] change, that I took some
[00:56:53] heart in, was that Pew had found that the
[00:56:55] share of Democrats that thought the
[00:56:57] government should be involved in
[00:56:58] censoring misinformation online had
[00:57:00] risen from 40% in 2018 to 70% in 2023.
[00:57:06] They asked the same question
[00:57:08] earlier this year and it's now declined
[00:57:10] to 58%. So I do feel like there is a
[00:57:12] chance, I mean, it's still
[00:57:14] terrible, but a sense in
[00:57:16] which that trance has broken, you know,
[00:57:19] that hypnotic "we have to fight
[00:57:20] misinformation," that just bonkers,
[00:57:23] un-American impulse. It
[00:57:26] feels like it was broken, but I still
[00:57:28] think there's a lot of that
[00:57:29] cultural work that we need to do to
[00:57:31] really educate kids, because I just
[00:57:33] don't think free speech is intuitive. I
[00:57:35] mean, you go to a playground and you see
[00:57:36] little kids playing and they just are
[00:57:38] yelling shut up, shut up at each other
[00:57:40] all the time. It's our natural instinct.
[00:57:43] You hear something you don't like, you
[00:57:44] want to shut them up. And the
[00:57:46] alternative, to listen to somebody and
[00:57:48] actively disagree with them, and maybe
[00:57:50] think about how to respond, or just
[00:57:52] figure out if you agree or disagree,
[00:57:54] it's like a muscle. It just takes
[00:57:55] practice. And I think we have to teach
[00:57:57] the kids, you know, both the K through
[00:57:59] 12 and the college students how to do
[00:58:01] that. and that doing that is a core
[00:58:04] value that will be rewarded in life and
[00:58:07] we should be celebrating rather than the
[00:58:09] opposite which is this desire to silence
[00:58:11] and shut down. Are you concerned that
[00:58:14] the technological advances that we're in the
[00:58:16] middle of will be harnessed to
[00:58:20] effect censorship without people even
[00:58:22] knowing it? I mean, you did the Twitter
[00:58:23] Files and found what I don't
[00:58:27] think anyone knew. There was this vast
[00:58:29] censorship program at Twitter, but
[00:58:32] most users had the sense that, you know,
[00:58:34] it's a liberal website, whatever,
[00:58:35] they're taking the conservatives off,
[00:58:36] but had no idea
[00:58:39] uh about the specifics until you brought
[00:58:40] them to light. Does AI increase the
[00:58:43] power of the platforms to take
[00:58:46] information off the site without anyone
[00:58:48] even knowing it's been taken off?
[00:58:51] Yeah, I mean just to answer that
[00:58:53] question, I'll preface it by saying of
[00:58:54] course I watched very closely your
[00:58:56] interview with Sam Altman, where you asked
[00:58:59] I think some very important questions,
[00:59:01] namely, what is the moral framework
[00:59:03] that his AI will follow. That
[00:59:06] is the right question, and of course it
[00:59:10] remains an ever-present question; it
[00:59:12] will never go away.
[00:59:14] Ultimately, these decisions about what
[00:59:16] gets censored and what the AI censors
[00:59:18] for us are made by people. And so, you
[00:59:21] look at the worst episodes of censorship
[00:59:23] over the last, you know, 5 to 10 years.
[00:59:26] You can find the people that were
[00:59:28] demanding the censorship. You can find
[00:59:29] the groups they created to demand it.
[00:59:32] You know that it was human
[00:59:34] decisions. And in fact, at the
[00:59:36] company level, in the Twitter Files
[00:59:39] there was a huge amount of debate,
[00:59:41] I mean not enough, but a huge amount of
[00:59:43] debate, around deplatforming the
[00:59:45] president of the United States, like
[00:59:47] removing the account of the president of
[00:59:48] the United States, which is so insane.
[00:59:51] It was a big deal. Obviously, it was
[00:59:53] talked about and then of course you
[00:59:54] could see it. Same thing with the Hunter
[00:59:56] Biden laptop where the FBI ran a
[00:59:58] deception operation against the social
[01:00:00] media platforms in an illegal
[01:00:03] conspiracy that included spreading
[01:00:06] disinformation about the laptop. Um that
[01:00:09] was obviously a very elaborate thing
[01:00:11] that a lot of people could see and were
[01:00:12] getting kind of glimpses into. There was
[01:00:15] also just the humdrum, or the
[01:00:17] ordinary, kind of deamplification.
[01:00:20] So you remember Twitter famously said,
[01:00:22] "Oh, we don't shadowban." That was the
[01:00:24] language that people had used. Well, of
[01:00:26] course they did. They called it
[01:00:28] something different. It was called,
[01:00:30] you know, a "do not amplify" list,
[01:00:32] for example, like a kind of blacklist
[01:00:34] that they ran, or a trends blacklist:
[01:00:36] don't let them show up on the trends
[01:00:37] thing. So there are just a million
[01:00:39] dials of course uh as you know, Tucker,
[01:00:42] to like kind of turn these things up and
[01:00:43] down.
[01:00:43] >> Yeah,
[01:00:44] >> the AI can help, but sometimes, you
[01:00:45] know, like they wanted to go after a
[01:00:47] QAnon conspiracy at one point. I
[01:00:49] reported this in my Twitter files on the
[01:00:51] decision to deplatform Trump and there
[01:00:53] was something around like the Kraken
[01:00:55] which I guess is like a giant squid
[01:00:57] in the ocean. You know,
[01:00:59] they said the Kraken was somehow
[01:01:00] tied into QAnon conspiracy theories and
[01:01:03] they wanted to censor that which is also
[01:01:05] insane. like they wanted to literally
[01:01:07] stop people from talking about Kraken
[01:01:09] like on Twitter and then somebody
[01:01:11] figured out that the Seattle
[01:01:13] hockey team is the Kraken, and that all
[01:01:15] these tweets around hockey were getting
[01:01:17] like swept up in it. So it's like you
[01:01:20] know, I worry about it, but
[01:01:21] ultimately it's not a bunch of censors
[01:01:23] in, like, the Philippines or even
[01:01:26] Palo Alto; the worst forms of
[01:01:28] censorship are being decided at the
[01:01:29] executive level. But as I said, my view
[01:01:32] is that if you have Section 230, which
[01:01:35] is what gives you the power to be a
[01:01:37] monopoly, it's literally
[01:01:40] the permit to operate as a
[01:01:41] functional natural monopoly, then I think
[01:01:44] that you should have to give the
[01:01:46] adult user complete control
[01:01:50] over all legal content, and you can
[01:01:53] censor the illegal content. And I do
[01:01:55] think there's a whole separate
[01:01:57] thing on kids, you know, which I think
[01:01:59] is complicated, because they're
[01:02:03] literally using the kids in Australia to
[01:02:05] create digital identifications as a path
[01:02:06] to censorship, which I think we
[01:02:08] should be alarmed about. But
[01:02:09] nonetheless, as a father who's seen the
[01:02:11] impact of social media on adolescents, I
[01:02:13] do worry about it. But I do think like
[01:02:15] if you're going to have section 230,
[01:02:16] that's that should be the agreement.
[01:02:20] >> Yeah. And I thank you for describing it
[01:02:22] as using the kids because it is the most
[01:02:25] obviously transparently cynical uh
[01:02:28] attempt to censor political speech by
[01:02:31] using the suffering of children about
[01:02:32] whom they care nothing obviously. Um
[01:02:35] there's no demonstrated care for kids.
[01:02:37] Look at how the schools are; you know they
[01:02:39] don't care.
[01:02:40] >> Right, and any pretext will do. I mean,
[01:02:43] the terrorism thing was huge; as you know,
[01:02:46] under the Bush administration it was
[01:02:48] terrorism.
[01:02:49] >> What is that exactly? Can you
[01:02:50] define it for me? No, they can't. I'm
[01:02:53] wondering though what's the recourse? So
[01:02:55] these are decisions you just described
[01:02:56] that are being made at like the highest
[01:02:58] level of well global society. I mean the
[01:03:02] richest man in the world decided to
[01:03:03] restore free speech to the United
[01:03:06] States. The president of the United
[01:03:08] States helped him. Federal
[01:03:11] judges rule on these things, but let's
[01:03:14] say we have a different president and
[01:03:15] there's no Elon or his commitment
[01:03:17] changes and there's a different Supreme
[01:03:19] Court. Uh
[01:03:22] like where's the power to fight back
[01:03:25] against this? Can you imagine a kind
[01:03:27] of civil disobedience that people could
[01:03:29] use to regain their speech? I'm trying
[01:03:32] to think through what that would look
[01:03:33] like.
[01:03:35] Well, look, at the top of my list is that
[01:03:38] we are in something called the NATO
[01:03:40] organization, and it has a treaty that
[01:03:43] requires that we only have as members
[01:03:47] free democracies: we only are going
[01:03:50] to defend countries that allow free
[01:03:53] speech and that allow
[01:03:56] people to choose the candidates of their
[01:03:57] choice. Currently that is absolutely
[01:04:00] under attack in Europe. Romania has
[01:04:02] already barred their
[01:04:05] presidential front-runner, whom you
[01:04:07] interviewed, from running.
[01:04:09] >> Now France is about to ban their
[01:04:11] presidential front runner, Marine Le
[01:04:13] Pen, on a completely trumped-up charge
[01:04:15] that the last prime
[01:04:18] minister was already guilty of and still
[01:04:20] came into office. Germany, I
[01:04:24] just interviewed a mayoral candidate who
[01:04:25] was banned for, like, made-up reasons,
[01:04:28] because he liked Lord of the Rings. I'm
[01:04:30] not even kidding. And he went to a book
[01:04:32] fair where there were some people that
[01:04:33] the intelligence services didn't like,
[01:04:36] and that was, like, the basis of
[01:04:37] the election council preventing him from
[01:04:39] running. And then they have these
[01:04:40] elaborate censorship industrial
[01:04:41] complexes. We're part of NATO. Everybody
[01:04:44] knows that we're the main event. We
[01:04:46] subsidize it to the tune of, you know,
[01:04:49] hundreds of billions of dollars a year.
[01:04:52] Like you, Tucker, I'm willing to die for
[01:04:54] free speech and democracy.
[01:04:58] Like the Spartan slave boy in the
[01:05:00] great Seneca passage, I would rather
[01:05:02] die, you know, a free man than live as a
[01:05:05] slave. And so we are willing to
[01:05:07] die for for freedom. And I think we all
[01:05:09] care a lot about Western civilization in
[01:05:11] Europe, but I
[01:05:15] don't want to put my life on the line to
[01:05:17] defend authoritarian, censorial
[01:05:20] autocracies like France, Germany, and
[01:05:22] Romania, and potentially Britain. So I
[01:05:25] think the president has, you know, been
[01:05:27] pretty strong on it, you know, they're
[01:05:28] still negotiating this right now. But I
[01:05:30] just think that the public, certainly
[01:05:32] MAGA, but whatever leftists are still in
[01:05:35] favor of free speech out there, including,
[01:05:36] as you mentioned, I think a lot of the
[01:05:37] pro-Palestinian folks that
[01:05:40] felt censored on TikTok and
[01:05:41] elsewhere and have been censored in
[01:05:43] other ways, that we should all make
[01:05:46] very clear that we don't want to be a
[01:05:47] part of a military treaty that has
[01:05:51] us risking our lives for illiberal
[01:05:55] autocracies. Like, that's got to be at
[01:05:56] the top of the list. Same thing with
[01:05:58] Brazil. It's like, you know, okay, I
[01:06:01] think people, Americans need to know we
[01:06:03] should pay more for orange juice if it
[01:06:05] means protecting our freedom of speech.
[01:06:07] That like our freedom of speech, it's
[01:06:09] not like a small thing. It's like the
[01:06:11] main event. It's like the reason why
[01:06:12] America is the greatest country that's
[01:06:14] ever existed and certainly the greatest
[01:06:16] country in the world still despite all
[01:06:18] of our problems and the country that
[01:06:20] everybody wants to live in is because of
[01:06:21] the first amendment of free speech. So,
[01:06:22] it just has to be an absolute
[01:06:24] non-negotiable. So, I said this is the
[01:06:27] number one issue. If you don't have free
[01:06:28] speech, you don't have anything. You
[01:06:29] don't have democracy, you don't have
[01:06:31] your dignity, you
[01:06:32] don't have prosperity, your
[01:06:34] infrastructure can't
[01:06:36] work; everything depends on
[01:06:38] free speech. And so it's just got to be
[01:06:40] an absolute issue for the administration
[01:06:43] um in these in these negotiations. And
[01:06:45] yeah, I mean, I think civil disobedience,
[01:06:48] when things get to
[01:06:50] that level, should always be an
[01:06:52] option, particularly for defending
[01:06:54] something as essential and sacred uh as
[01:06:57] the first amendment. Do you have any
[01:06:59] guesses or theories as to what happened
[01:07:01] to Great Britain, which of all countries
[01:07:03] is the closest to ours, has the deepest
[01:07:06] historical ties, and is now arresting
[01:07:08] more than 12,000 people a year for
[01:07:10] saying things the government doesn't
[01:07:12] like. It really, it's hard for me
[01:07:14] even to digest that. But I'm also
[01:07:17] confused by it. Like, how did that
[01:07:19] happen?
[01:07:22] >> Yeah. Yeah, I mean, look, you had
[01:07:23] Christopher Caldwell on uh the other
[01:07:26] week, and I thought he did um an
[01:07:28] incredible job explaining what's
[01:07:29] happened to Europe, but I mean, I think
[01:07:31] we're I think it's fair to say that
[01:07:33] we're at the end of an 80-year cycle
[01:07:35] that began in 1945 with the end of World
[01:07:38] War II. And the United States had the
[01:07:42] role of being really the
[01:07:44] main country, the country that was
[01:07:47] at the center of this new empire. I
[01:07:47] mean, you can call it the American
[01:07:48] Empire, whatever you want to call it.
[01:07:49] And we were trying to prevent another
[01:07:51] war in Europe. And we pushed out an
[01:07:52] ideology that you might call
[01:07:56] the open society ideology. And at first
[01:07:59] it made sense when you're denazifying
[01:08:01] Germany and,
[01:08:05] whatever they did with Japan,
[01:08:06] moderating Japan, and you're trying to
[01:08:08] kind of usher in a liberal democratic
[01:08:11] Western order. It made sense for a few
[01:08:13] decades. Probably didn't make sense
[01:08:15] after 1990.
[01:08:17] And it went too far. Obviously
[01:08:19] we decimated our industries by exporting
[01:08:21] them to China, and,
[01:08:24] with the help of George Soros,
[01:08:26] created this elaborate NGO sector that
[01:08:28] basically pushed two things at the same
[01:08:30] time because I think the only way you
[01:08:31] can understand the censorship and the
[01:08:33] demand for totalitarian censorship in
[01:08:35] the kind of mental space is to just also
[01:08:38] understand the total disorder that
[01:08:40] they're creating in the physical world.
[01:08:43] you know, from, as you're saying,
[01:08:45] the unchecked mass migration, the
[01:08:47] collapse of borders, people in boats,
[01:08:49] you know, people with 14
[01:08:51] criminal prosecutions
[01:08:55] still let out on the street,
[01:08:56] despite their schizophrenia, to commit
[01:08:58] murder against refugees. I mean, that
[01:09:01] disorder, I don't think it's a
[01:09:03] coincidence that those two things
[01:09:06] are unleashed by the same people at the
[01:09:08] same time. I mean, the Soros Foundation
[01:09:09] wants censorship; they also want
[01:09:12] disorder and anarchy and lawlessness,
[01:09:15] you know, at the street level, at the city
[01:09:17] level. So I think,
[01:09:20] you know, the contradictions of their own
[01:09:24] ideology, just sort of
[01:09:26] the guilt around the past and the
[01:09:28] construction of these singularities of
[01:09:30] evil, which were, you know, the Holocaust,
[01:09:32] slavery, indigenous genocide, those became
[01:09:35] the new original sins for
[01:09:37] this new woke religion.
[01:09:40] And you know, it's funny, it was
[01:09:41] interesting, because
[01:09:43] everyone's seen the data, you know,
[01:09:44] that really the migration
[01:09:47] and the illegal migration to the United
[01:09:48] States really wasn't nearly as out of
[01:09:50] control, you know, before Trump. He
[01:09:52] campaigned on it in 2016, but it really
[01:09:54] gets out of control as a kind of
[01:09:55] reaction by Biden and the blob elites
[01:09:59] after 2020. Europe's a slightly
[01:10:01] different story, but um you know, I
[01:10:04] think it's just what it looks like.
[01:10:05] There's just this woke religion has just
[01:10:08] absolutely displaced the older story
[01:10:10] that we had of Western civilization,
[01:10:12] which is that Christianity gave way to
[01:10:15] the Enlightenment. The Enlightenment
[01:10:16] secularized a whole bunch of Christian
[01:10:18] values including the idea that we're all
[01:10:20] born with dignity and rights. And
[01:10:24] that old story just got replaced by this
[01:10:26] really ugly story which is that humans
[01:10:29] are a cancer on the earth that Western
[01:10:31] civilization is just genocidal and um
[01:10:34] you know it's just the opposite of
[01:10:36] really what it's been historically which
[01:10:38] is a massively liberating phenomenon and
[01:10:40] we got stuck in this awful story. It got
[01:10:42] taught in the schools, it got taught
[01:10:44] in the universities, and that
[01:10:46] beautiful open society vision from 1946
[01:10:49] just became its complete totalitarian
[01:10:52] opposite and I think Britain really
[01:10:55] exemplifies that. And it's worth noting,
[01:10:56] by the way, too, because I think you've
[01:10:57] done such a good job here, Tucker, of
[01:10:59] pointing out the left and right origins of
[01:11:01] this. Certainly what we call the
[01:11:02] foreign policy establishment, the blob,
[01:11:04] they were behind the Online Safety Act
[01:11:07] in Britain that passed in 2023. It was
[01:11:10] the Conservative government that
[01:11:12] got it done, but it was the same foreign
[01:11:14] policy blob that was behind our
[01:11:15] censorship industrial complex, which
[01:11:18] clearly emerged out of this effort
[01:11:20] to govern the American empire and then
[01:11:23] was reacting to this just massive
[01:11:25] populist unrest over out-of-control
[01:11:28] migration policies, energy
[01:11:30] policies that were aimed at creating
[01:11:31] scarcity and high prices. The trans
[01:11:34] madness, I mean, that is
[01:11:36] just one where,
[01:11:38] like, it's like a David Cronenberg
[01:11:40] movie, you know, it's these
[01:11:42] atrocities, physical atrocities, that
[01:11:45] you're then not allowed to talk about.
[01:11:49] I mean, they wanted
[01:11:51] censorship on all of it. At
[01:11:52] Twitter they censored Meghan Murphy for
[01:11:54] saying that a man is not a woman. Like,
[01:11:57] that's what she said, and they
[01:11:59] deplatformed her. I mean, you talk about
[01:12:00] like a terrifying scenario we're in a
[01:12:02] scenario where these hideous
[01:12:05] medical experiments are being conducted
[01:12:06] on the bodies of adolescents and
[01:12:08] mentally ill people, and they were then
[01:12:10] trying to censor people talking about it
[01:12:12] and demanding that you believe that
[01:12:15] it's possible to change your sex. I
[01:12:17] mean, you kind of go,
[01:12:18] that's how far gone we got
[01:12:20] as a civilization, you know? It's that
[01:12:23] we convinced ourselves that you could
[01:12:24] perform, you know, biological alchemy
[01:12:27] and then we wanted to silence and
[01:12:29] suppress anybody who told the truth
[01:12:31] about it. So, you know, I
[01:12:34] think there's a black-pill moment where
[01:12:36] one could say that we're pretty far
[01:12:37] gone, you know, if you're already at
[01:12:39] this place. But I do think,
[01:12:41] thanks to what you were
[01:12:43] saying about the opening of the platform,
[01:12:45] to the success of people like you and
[01:12:47] Joe Rogan and the creation of this
[01:12:48] alternative media universe, I do think
[01:12:51] we have a chance to remake that case,
[01:12:54] not just for free speech, but really for
[01:12:56] this amazing, tiny
[01:13:00] moment in history where actually
[01:13:02] everybody that's a citizen of a country
[01:13:03] got to be free. And
[01:13:06] that's a beautiful, wonderful thing. And
[01:13:08] anybody that's ever traveled outside the
[01:13:10] United States, I I think can see that
[01:13:12] and appreciate it.
[01:13:13] >> It's the greatest thing that we have.
[01:13:15] And the reason we have it is because
[01:13:16] we've reminded ourselves generationally
[01:13:19] because we told the story of it that
[01:13:20] this is the greatest thing that we have.
[01:13:21] And I can't think of a greater tragedy,
[01:13:24] a more perverse tragedy than the
[01:13:26] assassination of Charlie Kirk being
[01:13:28] leveraged by people in order to
[01:13:31] construct a world that he hated and
[01:13:33] fought against for his entire short
[01:13:35] life. to use Charlie's assassination as
[01:13:37] a pretext for censorship. I mean,
[01:13:41] the mind struggles even to understand
[01:13:44] that. But that's how brazen people are.
[01:13:45] So
[01:13:46] >> I hate to ask you this, but you've
[01:13:48] thought so deeply about it. Maybe you
[01:13:49] have an answer. I don't. What's the
[01:13:51] motive for all this? Like why would you
[01:13:53] want to conduct hideous medical
[01:13:56] experiments on children? It doesn't
[01:13:57] benefit you. It doesn't benefit them.
[01:13:58] Like what what is this actually?
[01:14:02] >> Yeah. Yeah, I mean, you know, as you
[01:14:03] probably remember, in my
[01:14:05] last book on the homeless crisis, I
[01:14:08] put a lot of emphasis on this desire
[01:14:11] from the left to be compassionate and to
[01:14:13] think of ourselves as good people. Um,
[01:14:15] and really it's
[01:14:17] this immediate, emotive thing, you know, like
[01:14:19] with addiction, people that are crying
[01:14:22] out there saying, "I'm fine, and I'm fine,
[01:14:24] and I'm living in my waste and being
[01:14:26] sexually assaulted every night. I'm just
[01:14:27] fine. Let me just smoke some more
[01:14:29] fentanyl." Everybody should
[01:14:31] know that that person needs an
[01:14:33] intervention so that they stop harming
[01:14:35] themselves in public.
[01:14:37] >> Um, but the emotionalism and the
[01:14:40] sentimentality, that immediate appeal of
[01:14:42] oh no, it's
[01:14:45] somehow cruel to
[01:14:48] enforce laws and mandate care for
[01:14:50] people. So on one hand it does seem like
[01:14:53] this empathy of like oh we have to
[01:14:54] protect people but I also think there is
[01:14:56] something you know darker and more
[01:14:58] selfish and frankly more hedonistic than
[01:15:01] that which is as you were saying I mean
[01:15:04] the the the ability to censor somebody
[01:15:06] is an incredible act of power and
[01:15:08] domination. It's not something the weak
[01:15:11] can do; the weak can't censor people. I mean you
[01:15:13] look at any movement for human
[01:15:14] liberation
[01:15:16] >> like the weak don't have the power to
[01:15:18] censor. The censorship comes from these
[01:15:20] really arrogant, overly empowered,
[01:15:23] overly powerful, entitled elites
[01:15:26] displaying traits of, frankly, antisocial
[01:15:28] personality disorder, with no empathy for the
[01:15:31] people that they're censoring. And so I
[01:15:33] think my view is that there are
[01:15:35] certainly plenty of people that
[01:15:37] feel that they're being
[01:15:38] empathic, but I think for a lot of other
[01:15:40] people it's will to power and nothing
[01:15:42] besides, to paraphrase Nietzsche, where it's
[01:15:46] actually the pleasure of just
[01:15:49] controlling what people can say online.
[01:15:51] I mean, these censors, we have
[01:15:54] now profiled them. We've
[01:15:56] written about them; we haven't
[01:15:57] published all of it, but we try to
[01:15:59] understand the people that are
[01:16:01] doing this very deeply at a
[01:16:03] psychological level, and they're just
[01:16:05] absolutely power-hungry and they're
[01:16:08] completely arrogant, like
[01:16:09] closed-minded and frankly not very
[01:16:11] smart. I mean, that's the thing you
[01:16:13] forget about the totalitarians.
[01:16:15] You know, it's depressing because I
[01:16:17] think there's a story
[01:16:18] that's getting told. I
[01:16:20] won't say by who, but everybody knows
[01:16:21] there's somebody on the right
[01:16:22] that's sort of telling a story about how
[01:16:24] terrible democracy is and how if
[01:16:27] we had an autocracy it would be run by
[01:16:29] somebody competent like Elon Musk and
[01:16:31] everything would work great. Actually
[01:16:32] the history of totalitarianism is that
[01:16:35] it's the incompetent awful idiotic
[01:16:38] bureaucrats that are running things.
[01:16:40] It's not Mozart and Goethe
[01:16:43] and Nietzsche that are, you know, running
[01:16:45] things. It's like these very crude dumb
[01:16:48] people. And so you see someone like Nina
[01:16:50] Jankowicz or Renée DiResta. These are
[01:16:53] really power-hungry, very petty, small
[01:16:56] people. There's so much of just a
[01:16:59] kind of almost like a neediness there.
[01:17:01] You see it in some of them. A
[01:17:03] neediness for people to tell them how
[01:17:05] good they are and how much they care. A
[01:17:07] lot of like, you know, if you remember
[01:17:08] the movie Misery, the Kathy Bates
[01:17:10] character, I always sort of say a lot of
[01:17:11] that Kathy Bates energy of, you know,
[01:17:13] I'm going to take care of you, but it's
[01:17:15] actually I'm going to dominate. So I
[01:17:17] think that people
[01:17:19] sometimes sort of say it's suicidal
[01:17:22] empathy or it's pathological altruism,
[01:17:24] and I know what they mean. I think that
[01:17:26] what's underneath it is something
[01:17:28] darker, more nihilistic that is just
[01:17:31] feeding hedonistically
[01:17:33] off the power of dominating and
[01:17:35] censoring and persecuting others that
[01:17:39] really isn't in service of anything. The
[01:17:45] foundational spiritualities and
[01:17:48] philosophies of the West have aimed for
[01:17:50] power to be used in service of
[01:17:53] beautiful values, and it's not in
[01:17:55] service of that. It's just in service of
[01:17:57] their own individual expression of
[01:17:59] power. And like you said, you know, if
[01:18:00] it's if it's to censor you on COVID or
[01:18:02] anti-semitism
[01:18:04] or uh trans or migration or the Ukraine
[01:18:08] war, they don't care. like they're
[01:18:09] always wanting to find new ways to
[01:18:11] censor because it's coming from
[01:18:12] something so deep, something so deep and
[01:18:15] frankly pathological inside of them.
[01:18:17] >> The war impulse is so similar,
[01:18:19] killing people being the ultimate
[01:18:21] expression of power. You know, you
[01:18:23] can't create life, but you can end it.
[01:18:24] Um, and there are people, and I
[01:18:26] would say Lindsey Graham is one of them,
[01:18:28] but there are many, who just derive
[01:18:30] pleasure from the idea of killing
[01:18:33] people, not just because they're cruel,
[01:18:35] that obviously they are, but because it
[01:18:38] makes them feel alive. And I think
[01:18:39] there's something, you see it in school
[01:18:41] administrators. So, I really
[01:18:43] feel like we're on the cusp of
[01:18:45] something great.
[01:18:46] >> Charlie Kirk's memorial on Sunday made
[01:18:49] me feel that way. I feel like it's not
[01:18:50] all darkness, like, don't take the
[01:18:52] black pill, you know? There is light
[01:18:54] there. So I feel that way, but then
[01:18:58] you see I have to say video of Don Bacon
[01:19:01] the Republican congressman from Nebraska
[01:19:04] the most normal state out of 50 saying
[01:19:06] oh yeah, I'm talking to Jonathan
[01:19:09] Greenblatt, who's like a gargoyle from the ADL,
[01:19:12] which is like the most anti-human
[01:19:13] organization like I've ever dealt with
[01:19:15] in my life and you feel like wow if Don
[01:19:19] Bacon is taking orders from the ADL and
[01:19:21] Jonathan Greenblatt, then like the fix is
[01:19:24] in. Like it's a bipartisan conspiracy to
[01:19:28] strip people of their most basic
[01:19:29] freedom.
[01:19:31] >> Yeah. I mean, I think that for me also
[01:19:34] what comes up, Tucker, and I know that
[01:19:36] this is something that you are concerned
[01:19:37] about too, is that, like you were
[01:19:39] saying before, there's
[01:19:41] censorship, and then, well,
[01:19:43] there's actually so many sides to the
[01:19:44] totalitarianism. There's the censorship,
[01:19:47] there's the disinformation and
[01:19:49] dehumanization that the state or these
[01:19:52] parastatal
[01:19:53] censorship proxy entities
[01:19:55] play. And then there's the secrecy. And
[01:19:58] so I think what we now know, and again,
[01:20:01] I've said very clearly, and praised
[01:20:03] the Trump administration. They've
[01:20:04] actually been helpful in my own case in
[01:20:06] Brazil, where I'm under criminal
[01:20:08] investigation for the Twitter files
[01:20:09] Brazil. And so I'm very grateful to the
[01:20:11] Trump administration. I hope that's
[01:20:13] clear. But nonetheless, I think we can
[01:20:15] see that there are clearly some things
[01:20:18] that they really don't
[01:20:20] want us to know about. And the Epstein
[01:20:22] one, the Jeffrey Epstein situation is
[01:20:25] easily, I think, the most explosive and
[01:20:27] most famous one, where everybody knows
[01:20:30] there's these files, and everybody knows
[01:20:32] that they're at the FBI and DOJ, and
[01:20:33] everybody knows that there's no legal
[01:20:36] barriers to releasing them, and that
[01:20:38] there's all these excuses, and everybody
[01:20:40] knows that it's not just Jeffrey
[01:20:41] Epstein's own personal pornography
[01:20:44] collection. And the story
[01:20:46] has kept shifting, but you know, it
[01:20:48] looks like there may now be enough votes
[01:20:50] in the House pretty soon to force a
[01:20:52] vote on it. I think Speaker Johnson
[01:20:54] could still try to stop that, but I'm
[01:20:56] heartened that the MAGA
[01:20:59] movement actually remained true to
[01:21:02] following that issue through to the end.
[01:21:04] But I think we've seen that there's,
[01:21:06] you know, frankly, a secret government.
[01:21:09] People will say that
[01:21:11] sounds conspiratorial, but I think if you
[01:21:14] realize what the Epstein files are, and
[01:21:17] that it was covering up what was almost
[01:21:20] certainly a sex blackmail operation.
[01:21:23] And by the way, we didn't even have
[01:21:24] proof of the hidden cameras until a
[01:21:26] couple of weeks ago when the New York
[01:21:28] Times published two photos of the hidden
[01:21:30] cameras, one of them pointing right at a
[01:21:32] bed in Epstein's New York apartment. I
[01:21:35] think we know that. And you know,
[01:21:37] Congressman
[01:21:38] Massie was in Congress last week and he
[01:21:40] said there were 20 names, that he knows
[01:21:42] who they are, that are in the files. He
[01:21:44] gave us one of them, the CEO of Barclays
[01:21:46] Bank. And then he kind of listed who the
[01:21:48] other ones were: one of them was a,
[01:21:50] you know, Hollywood producer, a rock star,
[01:21:52] a magician. Um, so we know all these
[01:21:55] things. I think it's a really
[01:21:57] important test. I think it's really
[01:21:58] important that all of us that are
[01:22:01] sympathetic to things the Trump
[01:22:02] administration has done continue
[01:22:04] to not let the Epstein issue go. And
[01:22:07] then I think the other issue, Tucker,
[01:22:08] that I know you care a lot about is
[01:22:10] the UAP issue. The president said,
[01:22:14] after the drones over
[01:22:16] New Jersey, the mostly
[01:22:18] unidentified drones over New Jersey, that
[01:22:20] we were going to be able to find out what
[01:22:22] that is. I have a list
[01:22:24] of the key documents, provided to me by
[01:22:26] John Greenwald, of the UAP
[01:22:29] documents that exist, many of
[01:22:32] which have been released and have just
[01:22:33] been so heavily redacted. They need to
[01:22:35] release these things. They need to
[01:22:38] stop hiding this. And I'll just end by
[01:22:40] saying, on this, to sum it all up.
[01:22:43] Look, the elephant in the room here is
[01:22:44] the CIA. You know, you've got this
[01:22:47] wonderful reform leader in Tulsi Gabbard
[01:22:50] who is a unifying leader. She has so
[01:22:52] much trust from people that were on the
[01:22:54] left, so much trust from the MAGA
[01:22:56] community. She's obviously a good person,
[01:22:59] as anybody that has ever met her or
[01:23:01] seen her can tell.
[01:23:02] >> That's correct. And by law,
[01:23:05] Congress after 9/11 made this law, that she
[01:23:09] is the boss of the intelligence
[01:23:10] community. That is what the law
[01:23:12] requires. But we have this recalcitrant
[01:23:15] CIA where, I mean, come on, guys, we
[01:23:19] have not seen significant change to
[01:23:21] personnel. Apparently only two of the
[01:23:24] people that worked on the bogus
[01:23:25] intelligence community assessment about
[01:23:27] Russian interference in the 2016
[01:23:28] elections are
[01:23:30] gone. Um, and the response from CIA to
[01:23:33] us. I frankly found what
[01:23:37] they told us to be
[01:23:40] patronizing to the point of offensive,
[01:23:43] basically insisting, you know, trust us,
[01:23:45] bro. It's all good now. The CIA is
[01:23:48] fine. The CIA is not fine. The CIA is
[01:23:52] hiding information that the American
[01:23:54] people paid for and have a right to know
[01:23:57] on a lot of issues: UAP, Epstein.
[01:24:02] Uh, Congressman Massie revealed that
[01:24:03] there is a CIA file on Epstein that
[01:24:06] needs to come out. And look, maybe CIA
[01:24:09] shouldn't exist. I mean, Senator Moynihan
[01:24:11] before he died, and Kennedy's
[01:24:16] historian, why am I blanking on his name?
[01:24:18] >> Schlesinger.
[01:24:19] >> Schlesinger.
[01:24:21] >> There's been various proposals to break
[01:24:23] up the CIA. You know, frankly, it's been a
[01:24:25] paramilitary organization ever since
[01:24:27] 9/11. Truman
[01:24:29] wanted an intel organization. We need
[01:24:32] good intel. By the way, congratulations
[01:24:34] on your brilliant documentary. I saw the
[01:24:36] first part of it last night. So, now it
[01:24:37] appears, if I'm understanding correctly,
[01:24:39] that the CIA was probably behind uh the
[01:24:43] 9/11 attacks. It was a botched uh CIA
[01:24:46] operation, it sounds like. I haven't
[01:24:47] finished your series, but here
[01:24:51] you have an organization that's
[01:24:53] responsible for just the worst like
[01:24:55] regime change coups, followed by
[01:24:57] dictators who tortured people; a CIA that,
[01:25:00] you know, infiltrated American student
[01:25:03] groups, that used labor unions to,
[01:25:06] you know, engage in regime change, that
[01:25:10] spawned off people that were involved
[01:25:12] in the censorship industrial complex and
[01:25:14] lawfare, and that, it sounds like what
[01:25:16] you're saying, may have been
[01:25:18] behind, or at least
[01:25:20] didn't stop, or contributed to, the 9/11
[01:25:22] attacks. And then they did the torture
[01:25:25] after 9/11, which not only doesn't work,
[01:25:27] like it creates bad information, but is a
[01:25:30] stain on the moral character of the
[01:25:33] United States. At a certain point,
[01:25:36] you're like, what is this dog of an
[01:25:38] organization doing being just unreformed
[01:25:41] and trampling on all of our basic, you
[01:25:43] know, freedoms? So, I mean, you know,
[01:25:46] I kind of go, I think we just need to
[01:25:48] tell people that we don't really govern
[01:25:49] ourselves as long as you have this, you
[01:25:52] know, mess of an institution called
[01:25:54] the CIA where a bunch of analysts
[01:25:57] kind of appear to run the world. As long
[01:25:59] as that organization remains unreformed
[01:26:02] and we don't really get true disclosure
[01:26:03] about all the things that we know are
[01:26:05] going on, then I think we should be
[01:26:07] pretty unhappy and pretty demanding of
[01:26:09] much more significant reforms than it
[01:26:12] appears uh the Trump administration is
[01:26:14] going to pursue.
[01:26:15] >> I'd settle for real oversight rather
[01:26:17] than, you know, Tom Cotton, who runs the
[01:26:20] Senate Intelligence Committee, is
[01:26:21] basically just an apologist for the CIA.
[01:26:24] There's no oversight at all. He carries
[01:26:26] water for the agency in ways that hurt
[01:26:29] this country, and I'm not exactly
[01:26:31] sure why. Like what is that? And I don't
[01:26:33] know the answer. Um people can speculate
[01:26:36] all they want. I do want to just go back
[01:26:38] and thank you for what you said. It's
[01:26:40] all true. It's true.
[01:26:42] >> Okay.
[01:26:42] >> Um
[01:26:43] >> good. I haven't seen the end of it, but
[01:26:44] I saw the first part. It's amazing by
[01:26:45] the way.
[01:26:46] >> No, no, I'm not talking about our
[01:26:48] documentary series. I was just talking about
[01:26:49] your analysis of the CIA. I mean, how many
[01:26:53] people do you think in the White House
[01:26:54] right now know what the actual CIA
[01:26:56] budget is? You know, I'd be surprised if
[01:26:58] you could find someone. I've
[01:27:00] never met anyone who can;
[01:27:02] they can't tell you because it's
[01:27:03] classified. And supposedly
[01:27:07] the House and Senate Intel Committee
[01:27:09] chairmen know what the full budget is,
[01:27:11] but I would be shocked if they actually
[01:27:13] did. I mean, it's its own country. It's
[01:27:15] autonomous. It doesn't have um
[01:27:17] oversight. It doesn't have command
[01:27:19] control structures. It just kind of does
[01:27:21] what it wants. It lies about what it
[01:27:22] does. There's no way to know for a fact.
[01:27:24] I mean, it's a separate
[01:27:27] government within our borders, just
[01:27:30] like they had in Portland, Oregon at,
[01:27:32] you know, the height of George
[01:27:35] Floyd. But I just want to ask you about
[01:27:36] something that you said about the
[01:27:40] drone sightings, or the lights in the sky,
[01:27:41] over New Jersey last year. There were,
[01:27:45] you know, so many sightings that there's really no
[01:27:47] dispute that it happened. And the
[01:27:48] question is what is it? And the
[01:27:50] president said that he would tell us.
[01:27:52] We've never heard. What was that, do you
[01:27:55] think?
[01:27:57] >> I don't know. And you know, they're
[01:27:58] having also very similar drone sightings
[01:28:00] now over in Denmark, which actually
[01:28:03] shut down both Danish airports on
[01:28:06] Sunday. I mean, yeah,
[01:28:11] why can't we know about it, you
[01:28:12] know?
[01:28:13] >> But what's your sense? I mean, you've
[01:28:15] done a lot on this. I've
[01:28:18] talked to you off camera
[01:28:20] just because you're one of the few
[01:28:21] people whose judgment on this I trust.
[01:28:23] There's so much deception on this
[01:28:25] question. I think parts of it, what's
[01:28:28] the term they use? It's an op. I
[01:28:30] think that's part of the
[01:28:31] explanation, but at its core are
[01:28:34] physical phenomena that have been
[01:28:36] recorded in such volume at such scale
[01:28:39] that like something real is happening.
[01:28:41] And I and I know you don't really have
[01:28:43] the final answer on that, but what is
[01:28:45] your sense?
[01:28:47] >> Yeah, I mean, look, first of all,
[01:28:49] the government is engaged in extensive
[01:28:51] disinformation on this topic, and that's
[01:28:56] just all confirmed, right?
[01:28:57] >> it's been well documented what they've
[01:28:59] done. I mean, there's, you know,
[01:29:01] an alien crash retrieval
[01:29:04] manual that is,
[01:29:08] according to the official story, a
[01:29:10] total fake, a total fabrication. But I
[01:29:12] mean, when you look at it, it is
[01:29:14] extraordinary in its quality. If
[01:29:17] it is a fake, I mean, it's complete with
[01:29:19] the names of the people who checked it
[01:29:20] out, and those people having been checked
[01:29:22] out. Who would do that? Like, why would
[01:29:25] you do that? Well, one story is that it
[01:29:27] was used as passage material to identify
[01:29:29] counterintelligence spies
[01:29:31] in the US intelligence community. But
[01:29:34] nonetheless there has been so much
[01:29:36] government misinformation. There are
[01:29:41] also secret technology
[01:29:43] projects. I mean, one of the guys that
[01:29:45] testified at the UAP hearing last week
[01:29:47] just said that he has seen
[01:29:50] successful reverse engineering of
[01:29:52] technologies. Um, you know, there's a
[01:29:55] whole kind of Pentagon technological
[01:29:58] side of this that many other people have
[01:30:01] done so much better work on and
[01:30:02] reporting on than me. Jesse Michaels
[01:30:04] being one of the leaders of kind of
[01:30:07] unearthing it. I will say I don't think
[01:30:10] all of it can be reduced to
[01:30:12] hard military hardware, either ours or
[01:30:14] somebody else's. I'm very confident that
[01:30:16] there's just way too many cases that
[01:30:18] don't fit that. I also think that
[01:30:21] Jacques Vallée has done really some of the
[01:30:24] most important scholarship on this.
[01:30:26] He just gave a
[01:30:28] presentation on it. He's the
[01:30:30] French character played by François Truffaut
[01:30:32] in Close Encounters of the Third Kind by
[01:30:34] Steven Spielberg, a French researcher
[01:30:37] who's just sort of an international
[01:30:39] treasure of UFO, you know, cases. And
[01:30:43] you know, he's actually gone in a very
[01:30:45] similar direction to the one you've gone. Um
[01:30:46] and I find myself going there a little
[01:30:48] bit too which is that there is a
[01:30:50] spiritual element to this that I don't
[01:30:53] think is just purely attributable to
[01:30:55] technology because the issue is such a
[01:30:58] gestalt
[01:30:59] issue. Because, you know, like in the classic
[01:31:02] gestalt: is it an old
[01:31:03] woman, is it a young woman? Is this a
[01:31:06] spiritual issue sort of manifesting as
[01:31:09] some high-tech hardware, or is it
[01:31:13] some high-tech civilization manifesting
[01:31:17] as something spiritual? I find myself
[01:31:19] really gravitating towards these
[01:31:22] cases, which is also where Vallée
[01:31:24] encouraged a lot of new research. One
[01:31:27] of which, my favorite, is this English
[01:31:29] woman in the countryside who had a UFO
[01:31:31] sighting in the '50s. And I would
[01:31:34] dare anyone to watch that. She
[01:31:36] describes the scene; you know, she's like a
[01:31:38] very working-class English woman. It's
[01:31:40] a beautiful interview with her, done,
[01:31:43] it's like the BBC or somebody,
[01:31:45] and they said, "Describe what you saw."
[01:31:46] You know, she said she hears this noise.
[01:31:47] Her two boys are in the front yard. She
[01:31:49] sees a huge UFO over her house. He asked
[01:31:52] her to describe it and she said, "What
[01:31:54] can I say? It was like a Mexican hat,
[01:31:56] you know, like a typical, you know, uh,
[01:31:58] flying saucer with a dome." Her kids
[01:32:01] were there seeing it. She swears she
[01:32:03] saw it. She says there were two people
[01:32:05] inside and they were beautiful people
[01:32:07] with long blonde hair and sort of
[01:32:09] slightly bigger foreheads and sort of
[01:32:12] looking at her. And it ends so
[01:32:15] interestingly. She says, you know, we
[01:32:16] told people about this and then we were
[01:32:18] ridiculed and then she said, but it's
[01:32:20] okay because I know it happened. It's
[01:32:22] true. And I dare people to
[01:32:24] watch that and come away thinking that
[01:32:26] she was lying. I don't think she was
[01:32:28] lying.
[01:32:29] >> Yes. I also, as you know, have
[01:32:32] interviewed a fair number of psychotic
[01:32:34] people living tragically on the street,
[01:32:36] and people in psychotic states, and that's
[01:32:40] not the kind of story they tell. In fact,
[01:32:41] I even have
[01:32:44] homeless people I've been
[01:32:46] interviewing that are in meth-induced
[01:32:48] psychosis, talking about aliens, and it's just a lot
[01:32:50] of word salad, garbled. It's like
[01:32:53] talking to somebody trying to explain a
[01:32:54] dream they had. It doesn't make sense.
[01:32:56] >> Yes.
[01:32:57] >> So I don't think she lied. I
[01:32:59] think that most actors are
[01:33:01] bad actors; I don't think she's capable
[01:33:03] of having invented that and then
[01:33:04] persuading her children to lie with her.
[01:33:06] I think that that she had that
[01:33:08] experience. I don't think she's
[01:33:10] psychotic. I don't really know if
[01:33:14] anybody knows if
[01:33:16] those beings come from a different
[01:33:18] planet or they're interdimensional or
[01:33:20] they're spiritual or if they have some
[01:33:22] other form and they're just manifesting
[01:33:24] and hologramming like that. I don't
[01:33:26] know. But I think that the
[01:33:31] conversation, you know, thanks to again
[01:33:33] people like you and and Joe and others
[01:33:35] has just widened so that we can see
[01:33:38] what a big lie it's been that
[01:33:41] science has really properly accounted
[01:33:44] for reality. You know,
[01:33:46] Science magazine did a survey of
[01:33:49] scientists, including natural scientists,
[01:33:51] and I think it was somewhere around 60
[01:33:53] to 80% of natural scientists, I'm talking
[01:33:56] physics and biology and chemistry, were
[01:33:59] not able to replicate famous studies in
[01:34:02] their field. They admitted this in a
[01:34:04] survey, and then they were asked, do
[01:34:05] you still trust your field of science,
[01:34:07] and they all said yes. But they can't
[01:34:08] replicate basic scientific experiments.
[01:34:12] They keep changing their mind on the
[01:34:13] creation stories, the big bang. I think
[01:34:16] there's sufficient doubt about
[01:34:18] human origins. But that
[01:34:22] was so taboo, you
[01:34:24] couldn't talk about it in polite
[01:34:26] society. But I do think now we are able
[01:34:28] to have those conversations and I do
[01:34:30] think it's really notable that with this
[01:34:32] political shift there is, I think, a
[01:34:34] spiritual movement. I mean,
[01:34:37] I'm obviously really into it;
[01:34:40] other people in my life are not as excited
[01:34:42] about it. But for me, these
[01:34:46] experiences,
[01:34:48] you know, the evidence, or
[01:34:51] the spiritual side of it, the government
[01:34:53] cover-up, you know, are just huge areas
[01:34:56] that we should be doing so much more
[01:34:58] research and investigation and
[01:34:59] journalism on. I get a little
[01:35:02] frustrated because I think sometimes
[01:35:05] the conversation right now in the
[01:35:07] podcast world is
[01:35:09] just a lot of people repeating and
[01:35:12] speculating about stories that we've
[01:35:14] kind of heard before or sort of know
[01:35:16] about, but we haven't put nearly enough
[01:35:18] pressure on the government to to release
[01:35:21] or unredact the documents that we know
[01:35:23] exist to come clean about what they
[01:35:25] appear to know and are unwilling to
[01:35:28] tell. There should be a real movement
[01:35:30] around this, and there should be
[01:35:31] consequences for members of Congress,
[01:35:32] because that's information
[01:35:36] that belongs to all of us. And if there's
[01:35:38] some evidence of non-human intelligence,
[01:35:41] or a lot of evidence, my
[01:35:43] understanding is, I'm very confident, that
[01:35:45] there are thousands of high-quality
[01:35:48] videos, photos, sensor data, radar data,
[01:35:52] a lot that the military is keeping
[01:35:55] from us and the CIA is keeping from us,
[01:35:58] and we should be really upset about
[01:35:59] that. And I think that
[01:36:03] there's just been a
[01:36:05] lot of conversations where people go
[01:36:06] round and round with the data that
[01:36:08] we know, but what we're missing is the
[01:36:10] fact that the government
[01:36:12] is sitting on so much more of it. And I
[01:36:14] find myself wanting to do more to force
[01:36:16] it out. And I'm getting frustrated.
[01:36:19] But, you know, I think,
[01:36:21] as you've seen, I'm on the one hand very
[01:36:23] grateful to this administration and the
[01:36:25] strong things it's done, you know, on
[01:36:27] free speech and the disclosure it's
[01:36:29] done, certainly disclosing so much more
[01:36:31] than the last administration. But we
[01:36:33] still need a lot more. There's still so
[01:36:35] much that needs to be released on
[01:36:37] Epstein, on COVID origins, the whole
[01:36:40] COVID pandemic response, on the
[01:36:42] weaponization of FBI, the continuing
[01:36:44] rot. I mean, someone at the CIA
[01:36:47] told us there's pathological rot at the CIA, and
[01:36:51] we need to know what's going on
[01:36:52] with the UAPs. Like
[01:36:55] you were saying before, it would
[01:36:57] be irresponsible not to engage in
[01:37:00] conspiracy theorizing and speculation
[01:37:02] given how little information they give
[01:37:03] to us. And if they were really so
[01:37:05] concerned about conspiracy theories and
[01:37:07] speculation and misinformation, then
[01:37:09] they they should be releasing those
[01:37:10] documents. Well, of course they foment
[01:37:12] conspiracy theories and race hatred,
[01:37:16] because it's a distraction from what
[01:37:17] they're doing. I mean, when I was
[01:37:18] younger living in Washington and I began
[01:37:21] to understand that the government was
[01:37:22] systematically lying across agencies
[01:37:25] about a couple of things, probably a lot
[01:37:26] of things, but UAPs were definitely one
[01:37:28] of them. That became obvious a while
[01:37:30] ago. And I remember asking, you know,
[01:37:33] like what is this? And never getting a
[01:37:36] straight answer. Except people would
[01:37:38] say, look, it's destabilizing.
[01:37:40] It would be destabilizing if the public
[01:37:43] knew. And like who wants an unstable
[01:37:45] country? You know, there are some things
[01:37:46] that people just aren't ready for or
[01:37:47] whatever the euphemism they use. But
[01:37:48] that was the explanation.
[01:37:50] >> As I got older, I began to, you know,
[01:37:52] talk to other people and have other
[01:37:54] thoughts, and one of them was: it's totally
[01:37:56] possible that the government
[01:37:57] really does have something to
[01:37:58] hide, is participating in things that
[01:38:00] people would not approve of or would be
[01:38:01] shocked to learn. And all of that gets
[01:38:04] to a a question that's never occurred to
[01:38:06] me till right now, but like who named
[01:38:09] America's
[01:38:10] military headquarters after a pentagram?
[01:38:13] Like who thought that was a good idea?
[01:38:15] And I know you've done a lot of research
[01:38:17] on that period, the war period. Like
[01:38:19] what what was that?
[01:38:21] >> Well, yeah. I mean, you know,
[01:38:23] I haven't seen it yet, so I can't
[01:38:24] evaluate it yet. I don't know a lot
[01:38:26] about it, but yeah, I mean, there's
[01:38:28] a real darkness to the whole
[01:38:31] area.
[01:38:31] >> Yeah. Let's call the
[01:38:33] building that controls nuclear weapons
[01:38:35] the Pentagon.
[01:38:37] >> Huh. I mean, it's sort of like right in
[01:38:38] your face, right? Or no?
[01:38:41] >> Yeah. I mean, I just haven't looked
[01:38:43] into it that much. I do know that like a
[01:38:45] lot of the UFO stuff is very tied in
[01:38:47] with the occult. Yeah.
[01:38:49] >> And apparently, you know, Jesse
[01:38:51] Michaels again apparently did a
[01:38:53] new documentary on the occult. I mean, I'm
[01:38:58] not vouching for it, I don't know about it,
[01:38:59] but I like Jesse. It's about occult behaviors within
[01:39:03] NASA. Uh, so very concerning. I don't
[01:39:07] know what it means. Um, you know, I I
[01:39:10] I'm I'm shocked by how little curiosity
[01:39:13] there is at a societywide level. I think
[01:39:16] that you know the intellectual life of
[01:39:18] this country by which I mean not just
[01:39:21] the universities but also the newspapers
[01:39:23] and the big media companies that is how
[01:39:25] censorship was done over the last 80
[01:39:28] years. The internet is almost a return
[01:39:30] to a pre-radio, pre-broadcast period um when
[01:39:35] people were really free to just print
[01:39:36] whatever they wanted. Uh the internet is
[01:39:38] not there but it's it's a lot closer to
[01:39:40] it. we finally get to kind of learn that
[01:39:43] actually there's all these anomalies
[01:39:46] around human evolution around human
[01:39:50] history around you know archaeological
[01:39:53] sites where things don't seem to add up
[01:39:55] right and you start to get um people
[01:39:57] that were called you know pseudo
[01:40:00] archaeologists starting to kind of win
[01:40:02] some arguments publicly I mean there's
[01:40:04] one happening right now around uh
[01:40:06] Göbekli Tepe with this guy Jimmy Corsetti
[01:40:09] where he's just shown that the
[01:40:11] people that are supposed to be
[01:40:12] excavating the site are destroying it,
[01:40:14] planting trees whose roots will destroy
[01:40:18] these ancient sites and also building
[01:40:20] these really grotesque roofing
[01:40:23] structures in ways that destroy the
[01:40:24] site. They're very weird and suspicious.
[01:40:27] Um there's just a lot of, you know, we
[01:40:29] know that a lot of the Tesla information
[01:40:31] was missing, uh that, you know, should
[01:40:34] have shown some very interesting things.
[01:40:36] And then yeah, I mean I think that the
[01:40:38] relationship with nuclear is one of the
[01:40:40] most interesting parts of this because
[01:40:43] these UAPs they show up around nuclear
[01:40:46] sites. I used to work on nuclear a lot
[01:40:48] and there would be
[01:40:50] these drones
[01:40:53] um these unidentified
[01:40:55] uh, they seemed like objects, but
[01:40:57] unidentified phenomena, around nuclear
[01:40:59] power sites. Uh, the people
[01:41:02] working at them um were often very
[01:41:04] concerned around public perception of
[01:41:06] danger and so they often didn't talk
[01:41:08] about them but they they've certainly
[01:41:10] been over Diablo Canyon nuclear plant in
[01:41:12] California. Um but when the drones
[01:41:15] happened in New Jersey well we caught
[01:41:17] them in an open lie. I mean they just
[01:41:19] said John Kirby at one point said
[01:41:21] something like they had evaluated like
[01:41:22] 3,000 cases of drone sightings in like
[01:41:25] 48 hours which is like absurd. There's
[01:41:27] no possibility they did it. And then we
[01:41:29] started looking, a set of people
[01:41:31] started looking, and you discover that in
[01:41:32] fact there's been these drone swarms
[01:41:36] around US military bases. I mean, not a
[01:41:39] couple either. I mean, I think it was
[01:41:40] somewhere around two or three dozen
[01:41:42] military bases and there's a lot of
[01:41:44] evidence that those those drones um are
[01:41:48] circling around those moments when
[01:41:49] there's nuclear weapons uh in the area.
[01:41:52] So, um, you know, I'm skeptical
[01:41:56] that it's Chinese or Russian because
[01:41:57] the drones are engaged in behaviors that
[01:41:59] I think are very difficult, uh, for
[01:42:02] anybody to do. But, I mean, if these
[01:42:05] objects are behaving in ways that, you
[01:42:07] know, do appear to be using a different
[01:42:09] kind of propulsion or anti-gravity.
[01:42:12] I mean, I'm very skeptical that that's
[01:42:14] ours in the sense that it takes a lot.
[01:42:16] It took a huge effort to create the
[01:42:18] Manhattan Project and to create nuclear
[01:42:20] weapons. It was a massive, massive
[01:42:21] endeavor. And so to somehow easily get
[01:42:24] or to be able to easily hide reverse
[01:42:27] engineering, I don't know how you do
[01:42:29] that. Um I'm really skeptical that we
[01:42:32] have it, but it's absurd that we have to
[01:42:34] just sit around and speculate about it.
[01:42:36] Like, there's basically no
[01:42:38] transparency. Instead, we have a DoD
[01:42:41] organization called Arrow, which as far
[01:42:44] as I can tell is part of a deception
[01:42:46] operation, consistent with the CIA's
[01:42:49] recommendations through the
[01:42:50] Robertson Panel in the 1950s,
[01:42:54] that the main thing the US government
[01:42:56] should do is supposedly debunk the
[01:42:59] UFOs and ridicule the
[01:43:01] people that see them and research
[01:43:04] them. And worse, um there's a lot of
[01:43:06] threats made to people in this field. I
[01:43:09] personally find it one of the scariest
[01:43:11] issues. Um, trans and UAPs are,
[01:43:18] paradoxically, the two scariest issues.
[01:43:20] Um, and because it just seems like a lot
[01:43:23] of people really don't want us to know
[01:43:25] what's going on with it. And uh,
[01:43:27] President Trump made noises like he
[01:43:30] was going to reveal something and Tulsi
[01:43:32] Gabbard just made some noises that she
[01:43:34] wanted to get to the bottom of it. But
[01:43:35] otherwise, Tucker, it
[01:43:38] just seems like they want to
[01:43:41] Jeffrey Epstein the UAP files.
[01:43:45] >> Yeah. I don't think there's any. And if
[01:43:46] you're wondering if there's a spiritual
[01:43:47] component to the whole thing, if
[01:43:49] it's about technology and, you
[01:43:51] know, I don't know, Martians,
[01:43:54] >> right?
[01:43:55] >> Uh probably not going to be uh this kind
[01:43:59] of response to it. I mean, this just
[01:44:01] glows with intensity. Again, it's the
[01:44:06] Pentagon. So, yeah, there's a spiritual
[01:44:08] component to it. I would say I've been
[01:44:10] I've been scared off, too. It's like I
[01:44:11] don't even want to deal with it. But I'm
[01:44:12] I'm grateful for you, Michael
[01:44:14] Shellenberger. Really, I'm so
[01:44:17] grateful you went into journalism. There
[01:44:18] are few people uh with, you know, you
[01:44:20] could be doing a lot of other things.
[01:44:21] There are not that many super smart people
[01:44:23] in journalism with, you know, true
[01:44:25] principles and you're definitely one of
[01:44:28] the very few. And so I'm always grateful
[01:44:30] to talk to you and I'm grateful you're
[01:44:31] doing what you're doing. So thank
[01:44:33] >> thank you and back at you and
[01:44:34] congratulations on uh coming back to
[01:44:37] your famous monologues and I was really
[01:44:40] delighted that you did it on free speech
[01:44:42] and I hope you keep doing a a weekly
[01:44:44] monologue. I think it's uh
[01:44:45] >> getting all spun up. Yeah, I enjoy it.
[01:44:49] >> Thank you. I hope we can have dinner
[01:44:50] soon. Great to see you.
[01:44:52] >> Great.
[01:44:53] >> The great Michael Shellenberger. We'll be
[01:44:56] back next week. Good night.