Friendly Fire: Will AI Destroy Us?
[00:00:00] You guys need me here as a community
[00:00:02] college dropout with all you Ivy League
[00:00:03] nerds.
[00:00:04] >> You were just making fun of me because I
[00:00:05] brought that up and now you're bringing
[00:00:06] [laughter] that up.
[00:00:07] >> Well, I'm
[00:00:08] bringing it back to the real world.
[00:00:09] >> No, no, no. You're reading the study
[00:00:11] totally wrong. That's not what the study
[00:00:12] says. Okay, now I really want to [music]
[00:00:14] move on because Matt's offering a
[00:00:15] moderate opinion and Ben is agreeing
[00:00:17] with him. Friends like these [music]
[00:00:20] enemies and friends like [singing] these
[00:00:23] enemies.
[00:00:25] >> Everybody, welcome to Friendly Fire. All
[00:00:27] DailyWire Plus subscriptions are 50% off
[00:00:31] right now. Get them right now.
[00:00:33] dailywire.com/subscribe.
[00:00:37] Also, stick around because we have the
[00:00:38] world premiere of the trailer of The
[00:00:41] Pendragon Cycle: The Rise
[00:00:42] of the Merlin. That is coming up at the
[00:00:44] end of the show. But before we get to
[00:00:46] any of that, speaking of wizardry, I
[00:00:48] want to talk about AI and whether AI is
[00:00:51] really good like everyone seems to think
[00:00:53] it is, like all the financial
[00:00:55] speculators have thought, which is why
[00:00:56] it boosted the MAG7 stocks until
[00:00:58] recently before our impending stock
[00:01:00] market collapse, or whether AI is
[00:01:03] probably mostly bad for all of us. To
[00:01:06] kick it off, the most optimistic person
[00:01:08] on the panel, Mr. Walsh.
[00:01:11] Uh, yeah. I become more
[00:01:13] anti-AI with each passing day. I hate
[00:01:16] AI. As I said before, if I could
[00:01:19] commit some sort of anti-AI genocide, I
[00:01:22] would totally [laughter] do it. Um, here's
[00:01:26] what blows my mind about it: most of
[00:01:27] us anyway, even
[00:01:30] people who are behind AI like Elon Musk
[00:01:33] can see this potential
[00:01:37] civilizational-level catastrophe, and
[00:01:40] basically nothing is being done about it
[00:01:42] at all because what is absolutely going
[00:01:45] to happen as far as I can tell is AI at
[00:01:48] a minimum is going to wipe out many
[00:01:50] millions of jobs over the next 5 to 10
[00:01:52] years. How many millions? There's no way
[00:01:54] to say for sure. I did ask, by the way,
[00:01:56] ChatGPT before we went on. I asked
[00:01:59] ChatGPT to estimate how many jobs
[00:02:01] AI will take from us in
[00:02:04] the next 10 years. And I think the
[00:02:05] answer I got was 15 million or something
[00:02:07] like that. 15 to 25 million. So, who
[00:02:10] knows? Millions of jobs are going
[00:02:12] out the window. We know that because of
[00:02:13] AI. And they're not going to be replaced
[00:02:14] by anything. They're just
[00:02:15] going away. They're not coming
[00:02:17] back. Um that's going to happen. We're
[00:02:20] already
[00:02:21] almost there now, but we will soon be in
[00:02:23] a situation online where you just simply
[00:02:25] cannot tell reality from fiction at all
[00:02:28] where the AI videos are going to be so
[00:02:30] good that if anybody wants to smear any
[00:02:33] of us here, I can't imagine anyone would
[00:02:35] want to smear any of us because we're so
[00:02:36] [laughter] we're all so beloved. But if
[00:02:38] anyone wanted to do that, they could
[00:02:39] just make a video of any of us doing or
[00:02:41] saying something horrible and there'd be
[00:02:42] no way for us to prove it.
[00:02:43] >> That cat video radicalized me. I don't
[00:02:45] know if you guys saw the cat playing the
[00:02:46] didgeridoo and everything. It was very
[00:02:49] good. If I didn't know that
[00:02:51] most cats don't play the didgeridoo, I would have
[00:02:53] thought that was a 100% real video.
[00:02:55] >> Well, but Michael, that's the other
[00:02:57] thing that's
[00:02:58] going to happen with AI is that people
[00:03:00] are just sitting there looking at this
[00:03:02] slop made by an algorithm all day every
[00:03:04] day while their minds are melted. And
[00:03:07] then on top of all those other things,
[00:03:08] it's going to completely destroy every
[00:03:10] creative industry. It's all going
[00:03:13] out the window. And so what are we doing
[00:03:16] about this? So, we just got to sit back
[00:03:17] and let it happen because that seems to
[00:03:19] be the kind of defeatist attitude that
[00:03:21] most people have is like, "Well, we
[00:03:22] can't do anything. So, let's just um I
[00:03:25] guess you know, we had a good run, human
[00:03:26] beings. Let's uh let's pack it in." And
[00:03:29] uh I I
[00:03:30] >> Matt, I want to ask you
[00:03:32] seriously. Do you think that AI is going
[00:03:33] to kill all of us or is this kind of
[00:03:35] your list of uh because I know that's
[00:03:36] the sort of the most catastrophist take
[00:03:38] on this is that AI is going to turn
[00:03:40] around and do gigantic murder to all of
[00:03:42] us like Terminator 2. But this is your
[00:03:45] list of complaints, and I
[00:03:46] just want to make sure that that's the list
[00:03:47] of complaints so that I can argue with
[00:03:48] them.
[00:03:48] >> No, not the Terminator thing. That's,
[00:03:51] like, I'd prefer that. I mean, at least
[00:03:53] [laughter] you know what, if
[00:03:55] AI becomes Terminator, then that
[00:03:57] at least gives us jobs that we could do
[00:03:58] cuz we're fighting the AI. Uh, so it's
[00:04:01] not that at all. I'm not looking at
[00:04:03] any sci-fi scenario.
[00:04:05] It really is. The main thing is people
[00:04:07] will not have much to do because AI is
[00:04:10] going to do everything and it's going to
[00:04:11] take all of our jobs. And I don't think
[00:04:13] that we have the capacity to sustain
[00:04:15] that. I don't think we have any plan for
[00:04:16] what we do when 20 million people all of
[00:04:18] a sudden have no job. That's that's the
[00:04:20] main thing.
[00:04:21] >> Okay. I'm going to argue with everything
[00:04:22] you just said. Okay. So, I'm not a
[00:04:24] person who believes that AI is the
[00:04:27] cure for all problems. Uh I also do not
[00:04:30] think that what we are in right now is
[00:04:32] sustainable economically. I've been
[00:04:33] saying this for a while. I've actually
[00:04:34] been saying it for for well over a year
[00:04:35] is that I think we are in a bubble. I
[00:04:37] think pretty clearly we're in an AI
[00:04:38] bubble. That doesn't mean AI is
[00:04:40] not important. It just means that the
[00:04:42] overinvestment in infrastructure at some
[00:04:44] point is going to have to pay off in
[00:04:45] actual earnings or the entire pyramid is
[00:04:47] going to crumble, at least for most
[00:04:48] of these companies. Uh, I'm
[00:04:51] hearing kind of three arguments there.
[00:04:52] One is the AI is going to take all of
[00:04:54] our jobs. Uh two is that if the AI takes
[00:04:57] all of our jobs, what are we going to do
[00:04:58] with our lives? And three is the quality
[00:05:00] of AI is demeaning to sort of
[00:05:04] the human being. What's going to
[00:05:06] happen to human art? What's going to
[00:05:07] happen to quality? It's all going to
[00:05:08] kind of descend into AI slop mediocrity.
[00:05:11] So, one at a time. I will say that AI
[00:05:14] is going to cause job dislocation, but
[00:05:17] it's not going to take out nearly all of
[00:05:19] the jobs. And in the end,
[00:05:22] what you will see is a job shift.
[00:05:24] Actually, predominantly away from the
[00:05:26] white-collar industries and more toward
[00:05:28] the blue-collar industries. So, what
[00:05:29] you'll see is all the people who were
[00:05:30] telling welders to code 15 years ago,
[00:05:33] all those people are now going to have
[00:05:34] to go learn to weld. That's actually
[00:05:36] what's going to happen. There are going to
[00:05:36] be a lot of people who are going to
[00:05:37] have to be in sort of more physical
[00:05:39] industries. They're going to have to do
[00:05:40] more nursing, for example. Like there's
[00:05:42] certain things human beings want from
[00:05:43] other human beings that AI isn't going
[00:05:44] to provide. It's going to be more of an
[00:05:46] aid than anything else. And it's going
[00:05:48] to work its way into
[00:05:50] the market more slowly than everybody thinks.
[00:05:51] Everybody always thinks it's going to be
[00:05:52] a transition, boom, tomorrow all jobs
[00:05:54] replaced by AI. And it's not true.
[00:05:56] The people who it's first going to
[00:05:57] replace are the coders. You've already
[00:05:58] started to see some of this happen at
[00:06:00] Google. And I know people uh friends and
[00:06:02] family to whom this has happened. But
[00:06:04] it's going to take a while for it to
[00:06:05] filter into all business. And there will
[00:06:07] be transitional job loss and then it
[00:06:09] will move into other areas. This is what
[00:06:11] happened with the internet. This is what
[00:06:12] happens with every you know kind of
[00:06:14] great industrial age invention is that
[00:06:16] there's tremendous job dislocation at
[00:06:17] the beginning and then the job market
[00:06:19] moves. And I don't think AI is going to
[00:06:21] destroy wholesale all of these jobs. But
[00:06:24] let's move to part number two,
[00:06:25] which is sort of the idea that it will
[00:06:27] destroy all the jobs. Let's take that as
[00:06:28] an assumption. So here's my thing. I
[00:06:31] was actually at a conference with a
[00:06:32] bunch of people who are like the
[00:06:33] creators of these systems. And they were
[00:06:35] arguing kind of what you're arguing,
[00:06:36] Matt, that that eventually AI will be
[00:06:38] better at everything and none of us will
[00:06:39] have jobs anymore. And what are we going
[00:06:42] to do with our day? And I raised my
[00:06:43] hand. I said, you know what? I know what
[00:06:44] I'm going to do with my day. I'm going
[00:06:45] to take care of my family. I'm going to
[00:06:46] go to synagogue more often. I'm going
[00:06:48] to, you know, learn the holy books. I'm
[00:06:50] going to actually spend more time
[00:06:51] getting in touch with God. Like, I think
[00:06:52] that actually religious people and
[00:06:54] community oriented people will be fine
[00:06:56] because we actually have a thing to do
[00:06:57] with our day. I think that secular
[00:07:00] humanism is going to have a real problem
[00:07:01] determining what to do with its day in a
[00:07:03] way that many religious people will not.
[00:07:05] And then just as far as the quality of
[00:07:06] it, I'm not sure that AI is ever going
[00:07:09] to be creative enough. Visually, it will
[00:07:11] be. It'll be able to fool you visually.
[00:07:12] But in terms of the actual creativity of
[00:07:15] truly great writing, I don't think AI is
[00:07:17] ever going to be a great writer. It's
[00:07:18] all derivative. I think that because AI
[00:07:21] is a predictive text mechanism,
[00:07:22] you will end up with
[00:07:24] mid-range slop for the most part. But
[00:07:26] the way that I've used AI in my own
[00:07:28] work is to save time asking a
[00:07:30] sophisticated question that would take
[00:07:32] me a while to research, for example. Or if
[00:07:34] I'm doing a creative writing project and I
[00:07:35] don't want to take a lot of time looking
[00:07:36] up the details of Soviet Russia in 1938
[00:07:39] or something, then I can ask a multi-part
[00:07:41] question and it'll spit out an answer. If
[00:07:43] I asked it to write dialogue, the
[00:07:44] dialogue would just not be as good. Uh,
[00:07:46] and so I agree with that. There will
[00:07:47] be a lot of slop, but I think that the
[00:07:49] people who are best at their craft will
[00:07:50] actually end up benefiting from AI and
[00:07:52] and usually when the best get better,
[00:07:54] that's actually good for everybody else
[00:07:55] because it tends to drag everybody else
[00:07:57] along in terms of quality.
[00:07:58] >> So on your point about the religious
[00:08:02] people, who, you know, will know what
[00:08:04] to do with their time, or the educated
[00:08:06] people or the cultural elites, I
[00:08:08] totally agree with that. But to me, this
[00:08:10] is what's really worrisome about Matt's
[00:08:12] point that it's going to displace 15
[00:08:13] million jobs and most people are not
[00:08:15] going to know what to do because I agree
[00:08:16] with you. You will figure out what
[00:08:18] to do with
[00:08:19] >> No, but in the white collar jobs,
[00:08:20] Michael, in the white collar jobs,
[00:08:23] >> Those are the jobs. But those are
[00:08:24] the people you're talking
[00:08:25] about, largely blue-collar people,
[00:08:28] and like you're saying, you know, all
[00:08:30] the people who are like the intellectual
[00:08:31] elite, those are the people who are now
[00:08:32] most likely to lose their jobs.
[00:08:35] >> No, I'm drawing a distinction here.
[00:08:36] There are plenty of people in white
[00:08:38] collar jobs who are complete philistines
[00:08:39] who are secular humanists, and I
[00:08:42] don't know that they are
[00:08:43] going to figure out what to do, because
[00:08:44] really what it gets down to is a
[00:08:46] perennial question which is what we do
[00:08:48] for leisure time. You know that's what
[00:08:50] the liberal arts were supposed to teach
[00:08:51] us how to do. Now we think of them more
[00:08:53] as trade school, but they were supposed to
[00:08:54] teach us what to do with our freedom,
[00:08:56] how aristocrats are supposed to
[00:08:57] live. We obviously don't really have
[00:08:59] that. So my fear is that the promise
[00:09:01] of AI is really just an extension of the
[00:09:04] promise of the internet. The internet
[00:09:06] was going to make us all smarter. We
[00:09:07] were going to have all of human
[00:09:08] knowledge at our fingertips. We were
[00:09:10] going to learn a new language.
[00:09:11] It's all the same stuff we're hearing
[00:09:13] with AI. And the reality is for some
[00:09:16] people the internet did make them
[00:09:18] smarter and more productive and more
[00:09:19] thoughtful and have fuller lives. And
[00:09:21] for more people than that, really for
[00:09:24] most people, I think it made them dumber
[00:09:26] and it made them more vicious. And I
[00:09:28] think it made them more likely to look
[00:09:29] at porn and it made them more likely to
[00:09:31] ignore the great works. And this goes
[00:09:32] all the way back to the Phaedrus, you
[00:09:34] know, Plato's dialogue where Socrates is
[00:09:36] saying that written language books
[00:09:38] essentially are going to make people
[00:09:40] dumber because they're going to have the
[00:09:41] simulacrum of wisdom, but they're not
[00:09:43] actually going to memorize anything.
[00:09:44] They're not going to know anything. And
[00:09:45] so, I think you're right. I
[00:09:47] think for people who have their lives
[00:09:48] in order and are religious and have a
[00:09:50] cohesive view of the purpose of life, I
[00:09:52] think it could improve their lives. And
[00:09:54] I think for most people, it probably
[00:09:56] won't. Drew,
[00:09:57] >> Well, if I could take
[00:09:59] you and Ben and mash you together just
[00:10:01] for my own personal pleasure, that would
[00:10:03] be great. But also I think what
[00:10:04] you're saying is
[00:10:05] hitting on this: the problem is not AI, the
[00:10:07] problem is human beings, and it's always
[00:10:09] the problem. I mean, people talk about,
[00:10:11] are we going to have to regulate an
[00:10:12] industry? You don't regulate industries.
[00:10:14] You regulate human beings. You have to
[00:10:16] regulate human beings because they're
[00:10:17] sinful and broken and will kill each
[00:10:19] other and rob each other and do all
[00:10:20] these things. Already we see it with AI. I
[00:10:23] mean recently last week I think it was
[00:10:25] they brought out an AI where you can
[00:10:27] record somebody and then after he's dead
[00:10:29] you can continue to talk to him. It will give you
[00:10:31] an AI version of your dead relatives so
[00:10:32] you can talk to mom even after she's
[00:10:34] passed. I mean, that is idolatry of
[00:10:36] the worst possible kind. There have been
[00:10:39] AI dolls that have been put in
[00:10:40] children's rooms that talk them out of
[00:10:42] believing in God and tell them how to
[00:10:43] get drugs and things like this. So, the
[00:10:45] problem is not the AI per se. It's
[00:10:48] what people are going to do with it. It
[00:10:50] is going to make porn spectacular. I
[00:10:53] mean, the porn that's [laughter] going
[00:10:54] to come out of AI, I mean, I can already
[00:10:55] see that it will do anything you want
[00:10:57] it to do. It's going to
[00:11:00] rob people of their desire to read. I
[00:11:03] mean, already people are
[00:11:05] condensing books. Well, now I've got,
[00:11:07] you know, War and Peace, just give
[00:11:08] me two paragraphs. But that's a complete
[00:11:10] destruction of what it means. And so,
[00:11:12] people who don't have the meaning of
[00:11:14] life or don't know where it
[00:11:16] lies, which is in the internal life, uh
[00:11:19] are going to be lost. You and I,
[00:11:22] Knowles, had a conversation with a very
[00:11:24] powerful leader in AI just the other
[00:11:27] week or so. And I went up to him and I
[00:11:29] said to him, "Don't you understand that
[00:11:31] when AI speaks, it's not speaking? It's
[00:11:34] not conscious." And I said, "It's like
[00:11:36] and I quoted the great Louis
[00:11:38] Armstrong saying, I see friends shaking
[00:11:40] hands saying, "How do you do?" They're
[00:11:42] really saying, "I love you," meaning
[00:11:44] that when we speak, we deliver our inner
[00:11:47] selves to one another even if our words
[00:11:49] are not precisely that. Meaning, AI has no
[00:11:51] inner life, and these guys don't know
[00:11:53] that. They are convinced that because it
[00:11:55] can imitate an inner life, they think the
[00:11:57] Turing test, which is the stupidest idea
[00:11:59] anybody ever had, is indicative of
[00:12:02] an inner life: if it can confuse us about
[00:12:04] its inner life, it has one. So what I'm
[00:12:06] worried about is that it is in some ways
[00:12:08] the ultimate idol, and we know what
[00:12:10] people do with idols. You know, we know
[00:12:12] that all Moses has to do is leave
[00:12:14] town for 5 minutes [laughter] and they
[00:12:16] start worshiping the golden calf. That's
[00:12:17] where I think the danger lies. I think
[00:12:19] jobs will be created. I think
[00:12:21] creativity will exist. But I think your
[00:12:24] point is a really important point
[00:12:26] because part of that conversation and
[00:12:28] I've had this conversation with other
[00:12:29] people too is can AI write a poem and
[00:12:32] people get really, really, I don't know,
[00:12:33] vitriolic about this, because
[00:12:36] it's really the heart of the AI
[00:12:37] debate. And my argument was they can't
[00:12:40] write a poem because to write a poem you
[00:12:42] have to have sensual experience. You
[00:12:44] have to be able to, like,
[00:12:46] describe a grape in a way that you know
[00:12:48] gives someone the sensory experience of
[00:12:50] that and you have to be able to take
[00:12:51] language which is just full of dead
[00:12:53] metaphors. It's like the graveyard of
[00:12:55] dead metaphors and you have to create a
[00:12:57] new metaphor, you know, something
[00:12:58] that's evocative. And AI in
[00:13:01] particular cannot do that because it
[00:13:03] doesn't have any senses yet. It's worth
[00:13:05] pointing out that with robotics it
[00:13:06] actually might have sensory experience
[00:13:08] and, two, it's just learning on dead
[00:13:10] language. So in my view it can't make
[00:13:12] a poem, but, uh, I don't know, maybe it
[00:13:15] can. And all of this is a little bit
[00:13:16] beside the question of all right if it's
[00:13:18] going to have these negative effects
[00:13:20] what do we do about it? Do we regulate
[00:13:21] it, or do we not and let the market run
[00:13:24] its course? What are we going to do?
[00:13:25] >> You guys, hang on a second. This
[00:13:28] is why you guys need me here as a
[00:13:31] community college dropout with all you
[00:13:32] Ivy League nerds who immediately this
[00:13:35] becomes, like, can AI
[00:13:37] make a poem, and what will
[00:13:40] we [laughter] think about in our leisure
[00:13:41] time about AI? My question is how are
[00:13:44] people going to eat? Okay, I'm not
[00:13:45] talking about leisure time. How are you
[00:13:47] going to feed yourself? How are you
[00:13:48] going to make money to buy a house?
[00:13:51] That's the first question here,
[00:13:53] because if the answer is, well,
[00:13:56] >> we'll live in some sort of AI socialist
[00:13:58] dystopia where AI will
[00:14:01] provide all that stuff for you, well, I'm
[00:14:03] very skeptical that it will work out
[00:14:04] that way. I think what's actually going
[00:14:06] to happen is you're going to end up with
[00:14:07] you know a handful of trillionaires off
[00:14:09] this AI stuff and a lot of other people
[00:14:11] who are totally destitute but even if it
[00:14:13] did work out that way, okay, well, then
[00:14:15] that's our life now, that we're living as
[00:14:17] people that are totally dependent on
[00:14:18] this nonhuman algorithm to provide
[00:14:20] for us. I think that's a pretty
[00:14:21] horrifying vision of the future. But
[00:14:22] look, it's also, this is
[00:14:26] >> it's not just white collar jobs. It's
[00:14:28] it's also blue collar jobs, okay?
[00:14:30] Delivery drivers, truck drivers, Uber
[00:14:32] drivers, that's all going away. That's
[00:14:34] gone. That's finished. And that's just
[00:14:35] the beginning of it. And they're not
[00:14:37] being replaced. This is not creating new jobs
[00:14:38] because this is different from any other
[00:14:40] technology that has ever existed on the
[00:14:42] planet. It is not analogous to anything
[00:14:44] else because the whole point of it, the
[00:14:47] whole point is to take the human element
[00:14:49] out of it completely. It's not a new
[00:14:51] tool for humans to use. It's not like
[00:14:53] going from a carriage driver to now
[00:14:54] you're driving an automobile. This is
[00:14:56] the human is gone. We don't need you
[00:14:58] anymore. It's artificial intelligence.
[00:15:00] And so these jobs are leaving and
[00:15:02] they're not being replaced for all the
[00:15:04] drivers who are not going to have a job
[00:15:05] anymore. There's not some new thing. Oh,
[00:15:07] well, you'll go over here and do this.
[00:15:09] It's there's nothing for you. You're out
[00:15:11] now.
[00:15:12] Why do you think that's true? They say
[00:15:14] this every time a new technology comes.
[00:15:16] >> I want to get to it. No,
[00:15:18] Drew, it's fine. I want to get to it.
[00:15:19] But before we get to it, we
[00:15:21] need to eat. Okay. The [laughter] only
[00:15:23] way we're going to eat is if I read this
[00:15:24] ad right here. This one. So guys, just
[00:15:27] cut for a second.
[00:15:28] >> Well, [laughter]
[00:15:30] I hope not. Okay, guys. Did you know
[00:15:32] that up until the 1990s, cryptography
[00:15:34] was classified as a strategic weapon by
[00:15:36] the United States government? And during
[00:15:37] the Cold War, it was added to
[00:15:39] the same US Munitions List that
[00:15:41] restricts export of rifles and rockets.
[00:15:42] In 1954, encryption hardware and
[00:15:44] algorithms were added to the list to
[00:15:46] prevent the Soviets from acquiring tools
[00:15:47] that protected American military
[00:15:49] secrets. Well, just the way that we are
[00:15:51] allowed to possess firearms to protect
[00:15:53] life and liberty because we have an
[00:15:54] amazing Second Amendment, we also can
[00:15:55] create, share, and wield strong
[00:15:57] cryptographic arms to safeguard our
[00:15:59] communications, data, and digital lives
[00:16:01] from any adversary, foreign or domestic.
[00:16:02] That's what ExpressVPN does for you.
[00:16:04] It's what it does for me. It's an app
[00:16:05] that encrypts and reroutes your internet
[00:16:07] connection through secure servers,
[00:16:09] making your online activity private. No
[00:16:10] one can monitor, record, manipulate, or
[00:16:12] profit from it without your consent.
[00:16:14] ExpressVPN works on every device, phone,
[00:16:15] laptop, tablet, you name it. And you can
[00:16:17] protect up to 14 devices with one
[00:16:19] subscription. Get four extra months of
[00:16:21] ExpressVPN just by using our special
[00:16:23] link. Go to expressvpn.com/friendly
[00:16:25] fire. That's expressvpn.com/friendlyfire
[00:16:29] to get four extra months. Start
[00:16:31] protecting yourself today. I know when
[00:16:32] I'm traveling, I'm using public Wi-Fi. I
[00:16:34] don't want anybody else looking over my
[00:16:35] shoulder at the data that I'm using or
[00:16:37] the stuff that I'm searching. So that's
[00:16:39] why I use ExpressVPN. I'm using it all
[00:16:40] the time. You should do the same. Head
[00:16:42] on over to expressvpn.com/friendly
[00:16:44] fire. That's expressvpn.com/friendly
[00:16:47] fire. Get four extra months and start
[00:16:49] protecting yourself today. Okay. Now,
[00:16:50] Drew, you want to say something?
[00:16:52] >> Hang on. I also have to jump in, I'm
[00:16:55] told, with another momentum-killing
[00:16:58] advertisement. [laughter] Uh, anyway,
[00:17:00] right when it's getting interesting,
[00:17:03] let's jump in with it. Yes, it's fine
[00:17:04] though. It's good because I do want to
[00:17:05] tell you about Helix Sleep, and
[00:17:07] I do love Helix Sleep. We actually
[00:17:10] have Helix mattresses in our house.
[00:17:12] We all sleep on Helix. All of our kids,
[00:17:14] all of our 90 kids, all have
[00:17:16] Helix mattresses and uh and it's great.
[00:17:19] Um I'm not getting a lot of sleep right
[00:17:21] now because, you know, after
[00:17:23] we fall back with daylight savings,
[00:17:24] everyone talks about how, oh, we save an
[00:17:26] hour of sleep. Well, the problem is when
[00:17:28] you have young kids, they don't
[00:17:31] care
[00:17:32] about the clock. So,
[00:17:34] now I've got uh twin toddlers waking up
[00:17:36] at 4:30 in the morning uh who are
[00:17:38] rousing me out of sleep out of my very
[00:17:41] comfortable Helix mattress. So, Helix
[00:17:42] will help you sleep like a baby at
[00:17:44] night. Unless you have babies in the
[00:17:45] house and they will wake you up. There's
[00:17:47] nothing we can do about that. Um but
[00:17:49] Helix uh is great. I can't recommend it
[00:17:51] enough. You can go to
[00:17:52] helixsleep.com/friendlyfire
[00:17:55] for 27% off sitewide. That's
[00:17:57] helixsleep.com/friendlyfire
[00:17:59] for 27% off sitewide. When you go to their
[00:18:01] website, you take a sleep quiz and you
[00:18:04] get matched with the perfect mattress
[00:18:05] for you. Um because everyone is
[00:18:08] different and um and they take care of
[00:18:10] that there. Make sure you enter our show
[00:18:12] name into the post purchase survey so
[00:18:13] they know we sent you
[00:18:14] helixsleep.com/friendlyfire.
[00:18:18] >> So here's my problem with the no job
[00:18:19] scenario is that it comes up every
[00:18:21] single time there's a new technology.
[00:18:24] Every time. And it's why government is
[00:18:26] so bad at managing economies. It's why
[00:18:28] you don't want a top-down economy,
[00:18:30] because when the cart and horse goes out
[00:18:32] of style, the government says we must
[00:18:34] save the jobs of the buggy whip
[00:18:37] makers, you know. And the thing is
[00:18:38] there'll be new jobs. There will be new
[00:18:40] jobs. And the thing is maybe we can't
[00:18:42] even imagine. I think this has happened
[00:18:43] a million times before. You can't
[00:18:44] imagine what the new job is going to be,
[00:18:46] but there'll be jobs to do because
[00:18:48] people are endlessly creative. It's like
[00:18:50] it's like the people who worry about
[00:18:51] running out of oil. You know,
[00:18:54] you don't run out of energy because
[00:18:55] energy is a product of the human mind.
[00:18:58] The human mind turns things into energy.
[00:19:00] And if we run out of oil, we'll turn to
[00:19:02] something else. You know, we'll mash up
[00:19:03] Knowles. We'll use him for energy. I mean,
[00:19:05] you can always make energy.
[00:19:07] The human mind and imagination and
[00:19:09] creativity is bottomless. It's endless.
[00:19:11] I don't fear this about AI at
[00:19:13] all. Although, I do think Ben is right
[00:19:14] that there could be difficult
[00:19:16] transitions, and knowing how people
[00:19:18] are, we'll handle that in the worst
[00:19:19] way possible. But I do think
[00:19:22] when you have a powerful new tool, you
[00:19:24] have to start to think about human sin.
[00:19:26] You have to start to think about the
[00:19:27] things we're going to use it for that
[00:19:29] are destructive. And that's where I
[00:19:30] see
[00:19:31] >> I totally agree with this, Drew. I mean,
[00:19:32] my worry about AI is the endless
[00:19:34] pornography, the endless, you know,
[00:19:36] narcissism, the the things that social
[00:19:38] media has done to human beings by
[00:19:40] exacerbating our worst qualities, and
[00:19:41] that getting even worse. Obviously,
[00:19:43] that's the thing I worry about. But as
[00:19:45] far as sort of the economic point here,
[00:19:47] I'm significantly less worried about
[00:19:48] that for a couple of reasons. One,
[00:19:50] because I'm just less worried about it
[00:19:52] based on the history of
[00:19:53] technological innovation. If you go back
[00:19:54] to the early 20th century, well over 80%
[00:19:57] of jobs in the United States were
[00:19:58] agriculturally based or early industry
[00:20:00] based. Uh, and obviously very few
[00:20:02] people do agriculture now. If you go to
[00:20:03] the middle of the 20th century, America
[00:20:05] was a manufacturing based economy. Now
[00:20:07] we're a service-based economy. Jobs tend
[00:20:08] to move around and human beings are
[00:20:10] quite adaptable. If the question is, you
[00:20:12] know, will I be endlessly poor while
[00:20:14] a few people are trillionaires? That
[00:20:15] wouldn't work, because they wouldn't
[00:20:16] be trillionaires if everybody is
[00:20:18] endlessly poor. [laughter] That's not
[00:20:19] the way that actually wealth
[00:20:20] distribution happens. They don't take
[00:20:21] their wealth from a bunch of super duper
[00:20:23] poor people. If there's no wealth for
[00:20:25] them to take, then they don't generate
[00:20:26] the product. So the actual thing that
[00:20:28] would happen, the kind of worst case
[00:20:29] scenario that people are talking about
[00:20:31] actually would be a sort of Star Trek
[00:20:32] replicator machine. So in Star Trek (I
[00:20:34] know there are not a lot of Trekkies on the line
[00:20:35] here, but if you are a Trekkie), my
[00:20:37] understanding is that there is a
[00:20:38] replicator machine whereby you can
[00:20:40] literally generate any product from
[00:20:41] nothing, with essentially no resource
[00:20:42] use. And so you don't have to
[00:20:44] worry about anything. Well, if you don't
[00:20:45] have to worry about anything, I thought
[00:20:47] that that was mostly the goal of human
[00:20:48] beings because work, I mean, we all
[00:20:50] understand that work is important, but
[00:20:52] there are other types of work, right?
[00:20:54] Like for example, spending time with
[00:20:55] your family. It's a different type of
[00:20:57] fulfillment. It's not really work, but
[00:20:58] it's service, what we would
[00:20:59] call in Hebrew avodah. The
[00:21:01] Hebrew word for work and
[00:21:03] service is the same: avodah. Um, the
[00:21:05] same type of thing, I think, is true in
[00:21:07] our lives, right? when I think of like
[00:21:09] the things that I do that are important,
[00:21:10] my work actually comes maybe third or
[00:21:11] fourth on the list after family and
[00:21:13] religion and and the stuff that I'm
[00:21:14] doing in my community and for the
[00:21:16] country. So, you know, I'm less
[00:21:17] worried about the kind of "how do I get
[00:21:19] my stuff?" question. If things work out great,
[00:21:20] we're all going to be way richer and
[00:21:21] have a lot more leisure time. If you're
[00:21:22] worried about the leisure time, that's a
[00:21:24] human nature problem. That's what
[00:21:25] Drew is talking about. And then there is
[00:21:27] the other problem, which is what's the
[00:21:29] alternative? People keep talking about,
[00:21:30] okay, we could regulate it out of
[00:21:32] existence, right? We're just going to
[00:21:33] regulate it, stop it from taking trucker
[00:21:34] jobs. Okay, let's say that we were able
[00:21:36] to do that. Let's say they were able to
[00:21:37] ban all the self-driving cars. Does
[00:21:39] anybody think that any other place on
[00:21:41] earth is going to ban the self-driving
[00:21:42] cars? So the actual thing that will
[00:21:44] happen is that China will gain complete
[00:21:46] economic dominance over planet earth
[00:21:48] unless you are going to essentially make
[00:21:49] America autarkic and poor. That is the
[00:21:51] way that trade actually works. China
[00:21:53] will gain the advantage of every
[00:21:54] efficiency on planet earth while we
[00:21:56] hamper ourselves and we will live in
[00:21:59] relative poverty compared to what we are
[00:22:00] now. While China gains significantly
[00:22:02] more power globally and then uses that
[00:22:04] power in order to cram down its terrible
[00:22:06] vision of the world, which will
[00:22:07] eventually... >> Is your view, then, like pure
[00:22:10] laissez-faire, no regulation whatsoever? Let
[00:22:13] the market lead in it, and that way we'll
[00:22:14] beat China and we'll maintain our...
[00:22:16] >> Except for morality, except for
[00:22:18] morality and national security. Yes. So
[00:22:19] I don't think we should be selling
[00:22:20] Nvidia chips to China because I think
[00:22:23] China is our enemy. Um and I also think
[00:22:25] that we should be heavily regulating
[00:22:26] pornography period and that applies also
[00:22:29] to AI. But if we're talking about, like,
[00:22:31] should we stop AI from generating health
[00:22:33] care solutions because people in the
[00:22:35] healthcare industry are going to lose
[00:22:36] their jobs? I mean, let's be
[00:22:38] real about this. It's easy for
[00:22:40] us, living in a first world country with
[00:22:42] an average life expectancy above 80, to
[00:22:44] talk about, you know, the evils of AI.
[00:22:46] But if AI, for example, in the medical
[00:22:49] industry extends lifespans by another 20
[00:22:51] years, which could easily happen, you know,
[00:22:53] that seems like a pretty good
[00:22:55] thing to happen. And I think that one of
[00:22:58] the big mistakes I see people make,
[00:23:00] a mistake that I just generally
[00:23:02] object to, is one that I think happens
[00:23:03] on the Marxist left and I think
[00:23:05] sometimes happens on the populist
[00:23:06] right, and that is: they take a spiritual
[00:23:08] problem, people's emptiness and inability
[00:23:10] to function in the absence of particular
[00:23:12] guardrails, and then they say there's a
[00:23:14] material solution for that.
[00:23:16] >> And it is very rare, to me, that there's
[00:23:17] actually a material solution.
[00:23:20] >> That's a very good point, Ben,
[00:23:21] because it is true: sometimes people
[00:23:23] think, like with the birth rate
[00:23:24] problem, that you can just fix it with
[00:23:26] material solutions, and there's not a
[00:23:28] lot of evidence. However, there's a
[00:23:29] distinction between a material solution
[00:23:31] and a government solution because the
[00:23:33] government influences culture. It
[00:23:35] promotes certain ideas, suppresses
[00:23:36] others. It promotes religion
[00:23:38] traditionally and I think inevitably.
[00:23:40] And so, you know, to use
[00:23:42] the birth rate example, the only thing
[00:23:44] that seems to reliably increase birth
[00:23:45] rate is the promotion of religion. But
[00:23:47] the government can can do things there.
[00:23:49] Either explicitly promote religion or at
[00:23:50] least stop the suppression of religion
[00:23:52] like, you know, we saw under Joe Biden
[00:23:54] and we see under a lot of liberals. So,
[00:23:56] just before we get to
[00:23:58] the other guys: is there any role for the
[00:24:00] government here, maybe not in providing a
[00:24:02] material solution to the consequences of
[00:24:04] AI, but some role for the government?
[00:24:06] >> I mean, I want to know the specifics.
[00:24:08] It always comes down to the specifics.
[00:24:09] And this, by the way, the problem
[00:24:11] with AI is a bunch of unknown unknowns,
[00:24:13] right? It's not known unknowns. It's
[00:24:14] just, we literally don't know what's
[00:24:15] going to happen next. How do you
[00:24:16] regulate for that? Which is why the Kalshi
[00:24:18] markets, right? Kalshi is one of our
[00:24:19] sponsors right now. In the Kalshi markets,
[00:24:21] it's like a 5% shot that there's any serious
[00:24:23] regulation of AI, because no one even
[00:24:25] would know what that looks like. What
[00:24:27] does that even look like? I mean, this is
[00:24:28] honestly a question
[00:24:29] for you, Matt, because you
[00:24:31] want to regulate AI. I assume you want to
[00:24:33] do something to stop sort of the forward
[00:24:34] march of AI. So, on a practical level,
[00:24:36] what does that look like? Well, I
[00:24:38] don't have all the answers.
[00:24:40] I'll fully admit that. That's
[00:24:41] why it's so frustrating to me that
[00:24:44] we're not, at a serious level,
[00:24:46] even having this conversation. I mean,
[00:24:47] we're having this conversation right
[00:24:48] now, but our lawmakers aren't
[00:24:51] having this debate about what can
[00:24:54] we do, what should we do. That
[00:24:55] conversation just isn't happening at
[00:24:57] all. Um, and if I had all the answers
[00:24:59] myself, then I guess I
[00:25:00] wouldn't be frustrated by that, because I
[00:25:02] could just say, "Well, here are the
[00:25:03] answers, guys." I don't have them. But
[00:25:05] what I know is the answer can't be, "Well,
[00:25:08] whatever. We'll see how it plays out."
[00:25:10] That can't be the answer when you're
[00:25:11] facing something that is going
[00:25:13] to fundamentally alter our civilization
[00:25:15] the way that this is going to. Now there
[00:25:16] are some things that can be done. I mean,
[00:25:18] people have suggested things, when it comes
[00:25:19] to, and this is kind of on a lower level,
[00:25:21] um, intellectual property.
[00:25:23] This is another huge problem with AI, and
[00:25:26] I think some of you guys have
[00:25:27] already kind of touched on it: that
[00:25:28] AI cannot create anything.
[00:25:31] It can't make a poem, like
[00:25:33] it can't write a poem. It can't do a
[00:25:34] screenplay.
[00:25:35] >> You were just making fun of me because I
[00:25:36] brought that up and now you're bringing
[00:25:37] that up. I'm bringing it back
[00:25:40] to the real world. So the reason
[00:25:42] why it can't do that is
[00:25:43] because it's stealing from what other
[00:25:45] people have done, and right now AI lives
[00:25:47] in this kind of, like, bubble where the
[00:25:49] rules of plagiarism don't apply to it.
[00:25:51] So, uh, there are things that you could do
[00:25:53] there legislatively. Again,
[00:25:56] it's not easy to do, but I do think you
[00:25:58] have to do something there to protect
[00:26:00] people from having their
[00:26:02] creative work stolen. That would
[00:26:03] probably slow it down. But I would
[00:26:05] flip it back the other way,
[00:26:06] because what I'm going to ask is: okay,
[00:26:09] the drivers are all going to lose
[00:26:11] their jobs, most likely. Customer service,
[00:26:14] the customer service industry, a lot of
[00:26:16] that is just going away when
[00:26:19] AI is adopted. And I don't think this
[00:26:22] is some kind of, like, sci-fi
[00:26:23] speculation. It's just
[00:26:25] extending out a little bit. It's, like,
[00:26:26] pretty clear that if we keep applying
[00:26:28] this stuff, there's not going to
[00:26:30] be anything for people to do in a lot of
[00:26:31] these jobs. So, I think a lot of these
[00:26:32] customer service jobs are going to go
[00:26:34] away. Um, and then, yes, there's
[00:26:36] also the white collar, but I care about
[00:26:38] those people, too. Anyone
[00:26:40] who sits in a cubicle all day
[00:26:42] and enters data into computers, which is
[00:26:45] millions of people, um, probably a lot
[00:26:47] of their jobs are going away. And I
[00:26:49] think that that matters, too. My
[00:26:50] question is, if that were to happen,
[00:26:52] let's just say, and maybe AI all breaks
[00:26:55] down and it doesn't happen, but I think it
[00:26:57] probably will, if that happens over the
[00:26:58] next 5 to 10 years and you've got tens
[00:27:00] of millions of people for whom not just their
[00:27:03] job but really their entire industry
[00:27:05] just went away, what are we doing with
[00:27:07] them? What are we doing?
[00:27:11] Here's the thing. Hold on. Let me just
[00:27:13] answer that. It'll take me, I promise,
[00:27:14] like, four sentences. Okay. So, here's
[00:27:16] the answer to that. If you had asked me
[00:27:18] that same question in 1998, that the advent
[00:27:20] of the internet is going to kill a bunch
[00:27:21] of jobs, and it will kill a bunch of
[00:27:22] jobs, you know, based on all the supply
[00:27:24] chains being changed, everything getting
[00:27:26] a lot shorter, you won't have to go to
[00:27:27] the local mom and pop shop. You can
[00:27:28] order off the internet. And I said to
[00:27:30] you, don't worry. In 20 years, there
[00:27:32] will be literally millions of people who
[00:27:34] are working on AI coding and database
[00:27:36] building, data center building, you
[00:27:38] would say, what the hell are you even
[00:27:39] talking about? What do those words mean?
[00:27:41] I don't know what those words mean. If I
[00:27:43] said to you there would be legitimately
[00:27:45] thousands of jobs for people who
[00:27:47] were social media editors and marketers,
[00:27:48] you'd say, "What the hell? What is
[00:27:50] a social media and how does it work?"
[00:27:52] Right? Like this is the whole point of
[00:27:53] the market is that jobs that we don't
[00:27:55] even know exist will come about because
[00:27:57] that's what the market does. The market
[00:27:59] generates innovation, because
[00:28:01] human desire is endless, and the
[00:28:03] human desire for new and innovative
[00:28:06] things is also endless.
[00:28:08] >> This is different. This is... >> I want to hear from
[00:28:10] >> Yes. Yes. We can't, you know, we can't
[00:28:12] imagine these things. I totally agree
[00:28:13] with Ben. I think there are going to be
[00:28:14] jobs that we have no idea could possibly
[00:28:17] exist. But the question that Knowles
[00:28:19] asked, and actually Ben referred to, is
[00:28:21] the really important question. Back
[00:28:24] in the day, when you wanted to get a
[00:28:26] pornographic magazine, you had to walk
[00:28:28] into a store, shame yourself, you had to
[00:28:30] make sure none of the neighbors
[00:28:31] saw you, you know, you went home with
[00:28:32] this piece of paper that you could look
[00:28:35] at, and all this stuff.
[00:28:36] >> Not that you have any
[00:28:37] experience, Drew. >> No, I have no... I'm talking
[00:28:38] theoretically, about
[00:28:41] a friend of mine. Theoretically. Right. So,
[00:28:43] but when people said, "Oh,
[00:28:44] we've got to ban this," and they did ban
[00:28:46] it and you know they they censored
[00:28:47] things and then they said, "Oh, yeah, we
[00:28:49] got to censor Ulysses, too." It was
[00:28:51] silly. You had to get rid of it. Now
[00:28:53] you've got this sewer of porn wiping
[00:28:56] people's lives away with no regulation
[00:28:58] whatsoever. And so now, when
[00:29:00] I come out and say things, for
[00:29:02] instance, like you should not be able to
[00:29:04] censor opinions on YouTube,
[00:29:06] conservatives go, "Oh, my, regulation,
[00:29:08] regulation." Well, no, it's a new thing. It
[00:29:11] needs new regulations to make sure the
[00:29:13] freedom of speech lives because if you
[00:29:15] censor things on YouTube, you have
[00:29:16] virtually taken them out of the
[00:29:18] public square. So what do you do with
[00:29:19] pornography? I mean, who would
[00:29:23] have said, you know, "so what, pornography,"
[00:29:25] 30 years ago and now thinks, holy, [laughter]
[00:29:28] this is a toxin being pumped
[00:29:31] into the human psyche like never before.
[00:29:33] >> Dude, I wrote a literal book on
[00:29:35] pornography and what it was going to do
[00:29:37] to destroy young people in 2005. And I
[00:29:40] was mocked for it. I was 21 years old
[00:29:41] when I wrote that book. Many were...
[00:29:44] >> These are the questions that
[00:29:46] we're not addressing now, where
[00:29:48] we know the danger, we can see the
[00:29:49] danger. It is only going to get worse.
[00:29:51] These are the issues I think we should
[00:29:53] be addressing, not whether jobs are
[00:29:54] going to disappear because everything
[00:29:56] will change or we don't even know what
[00:29:58] that's going to look like.
[00:29:59] >> All right, Matt, last word.
[00:30:00] >> Yeah, on the regulation side of
[00:30:02] it, I mean, obviously the
[00:30:04] most, you know, the sort of most
[00:30:05] heavy-handed and obvious thing, if we're
[00:30:07] talking about regulation, is, you know,
[00:30:09] the government saying, hey, okay,
[00:30:11] you want to wipe out all the driver
[00:30:13] jobs, you want
[00:30:14] to, you know, get rid of all
[00:30:16] your customer service jobs if you're
[00:30:17] McDonald's. And it's a law saying,
[00:30:20] "Well, you can't do that.
[00:30:22] We're not going to
[00:30:23] let you do that," because
[00:30:25] we're not going to let you put millions
[00:30:26] of people out of work all at the same time,
[00:30:27] because we just can't
[00:30:29] sustain that as a society. It can't
[00:30:31] happen. And now that's very
[00:30:33] complicated. That's the kind of
[00:30:35] thing that I normally would not support,
[00:30:37] and there is this tension between, like,
[00:30:39] free markets and then this other huge
[00:30:41] civilization-level concern. So
[00:30:43] that's just the thing. That's
[00:30:45] what we're dealing with. And I do think,
[00:30:47] and I just go back to this, that it's a
[00:30:48] different kind of thing. I think any
[00:30:50] analogy breaks down. Ben, you brought up
[00:30:52] the internet. Well, the internet is a
[00:30:53] different kind of thing. The internet is
[00:30:55] a, you know, very high-tech,
[00:30:58] sophisticated form of communication.
[00:31:00] It's just a way for people to
[00:31:01] communicate and connect with each other.
[00:31:03] And so that, in and of itself, is not
[00:31:06] going to take away jobs. It might change
[00:31:07] what the jobs are, but you still
[00:31:09] have humans who are on
[00:31:11] the internet communicating
[00:31:13] with each other. And that's the case
[00:31:14] with all of these technological
[00:31:16] innovations that it's just a different
[00:31:17] tool for people to use. And so yeah,
[00:31:19] maybe the job where you use the the more
[00:31:21] primitive tool goes away, but now you
[00:31:23] use the more sophisticated tool and
[00:31:24] that's the job. And I think with AI,
[00:31:26] it's just different because uh as I
[00:31:28] said, it's artificial intelligence,
[00:31:30] which means the entire point of it is
[00:31:32] that we don't need a person to do this
[00:31:34] at all. It's not a new thing for you to
[00:31:35] do. You're not needed. And because we're
[00:31:38] facing this totally new kind of thing,
[00:31:40] which I really believe is unprecedented
[00:31:41] in human history, um, I think we might
[00:31:44] need to embrace solutions
[00:31:46] that would otherwise
[00:31:48] make us uncomfortable.
[00:31:49] >> In fairness, we don't know if that's
[00:31:51] even Matt really talking right now. That
[00:31:53] could be [laughter] Grok or Gemini or
[00:31:55] something. Now, I want to get to
[00:31:57] something we touched on, though. It's
[00:31:58] related, but it's a totally
[00:31:59] separate topic: affordability. It's
[00:32:02] the word. It's the meme. It's the new 67.
[00:32:04] Everyone's just saying
[00:32:06] affordability all the time. I want to
[00:32:07] get into what that actually means. But
[00:32:09] first, before that, to bring a little
[00:32:10] balance to this conversation,
[00:32:12] I want you to tell us...
[00:32:14] >> here is something that AI cannot do. It
[00:32:16] cannot eat your vegetables. It can't
[00:32:17] even eat my vegetables. In fact, I can't
[00:32:19] eat your vegetables. It's a very very
[00:32:21] complicated thing. These these
[00:32:22] vegetables and if you want to get enough
[00:32:24] of them, you need to use balance of
[00:32:25] nature. Because I love vegetables, but
[00:32:27] if I ate enough of the kinds of things
[00:32:29] that, you know, nutrition experts
[00:32:31] recommend, it would be all over my
[00:32:33] beard, my face, it would be just
[00:32:34] disgusting. So instead, I have balance
[00:32:37] of nature, fruits and veggies. And you
[00:32:39] may say, well, if you use them all the
[00:32:41] time, which I do, why aren't they open?
[00:32:43] It's because I have so many of these
[00:32:45] things [laughter]
[00:32:46] that I don't even have to open them. I
[00:32:47] got more downstairs that are open.
[00:32:49] Balance of Nature. What they do is they
[00:32:51] freeze dry fruits and veggies, then
[00:32:52] powder them, and blend them into the
[00:32:55] most convenient nutritional value. You
[00:32:56] can take the fruits and veggie
[00:32:58] supplements with water, chew them, or
[00:33:00] open them up, and mix the powder into
[00:33:01] your food or drinks, which just sounds
[00:33:02] silly to me, but it's still, it's made
[00:33:04] from 100% whole food ingredients. You
[00:33:08] wonder how an animated corpse like
[00:33:09] myself can look like a 30-year-old man?
[00:33:12] It's because I use Balance of Nature.
[00:33:13] So, go to balanceofnature.com and get a
[00:33:16] free fiber and spice supplement. You
[00:33:18] didn't even have time to talk about the
[00:33:20] fiber and spices. Plus, you get 35% off
[00:33:23] your first set as a new preferred
[00:33:24] customer by using discount code friendly
[00:33:27] fire. Go to balanceofnature.com and use
[00:33:29] the discount code friendly fire. Now,
[00:33:32] Knowles, what were you saying?
[00:33:33] >> Well, I was saying with the rest of your
[00:33:35] money, you need to go to
[00:33:36] dailywire.com/subscribe
[00:33:37] because we have the biggest deal of the
[00:33:39] year right now. This is the Black Friday
[00:33:41] deal. 50% off. It's really, really big.
[00:33:44] Uh, you're going to get everything.
[00:33:45] Obviously, we have Pendragon coming
[00:33:46] out. You're going to get the world
[00:33:48] premiere of that trailer coming out at
[00:33:50] the end of the show. Really big stuff
[00:33:52] though. New docs, new hosts, new
[00:33:53] everything. It's It's very exciting. Uh
[00:33:55] you guys are are who empower DW to build
[00:33:58] culture. And so right now you can save
[00:34:00] 50%. I love building culture, but I
[00:34:03] also like doing it on a good deal. You
[00:34:06] know, I want to build culture
[00:34:07] frugally. And so when you can do it for
[00:34:10] 50% off, it's great great time to do it.
[00:34:12] Go to dailywire.com/subscribe.
[00:34:15] Absolutely fitting, apt way to talk
[00:34:17] about affordability, which is a a very
[00:34:19] serious problem. You know, usually
[00:34:21] sweet little Alissa does the shopping in
[00:34:23] the house. Occasionally I have to go out;
[00:34:24] the other day I went to get lemons for a
[00:34:27] cocktail that I was making, not even for
[00:34:28] food, just for a cocktail.
[00:34:29] And, uh, a great cocktail, but that's
[00:34:32] a story for another time. Anyway, I go
[00:34:34] to the grocery store and the prices are
[00:34:36] insane. I see why Alissa had been keeping
[00:34:38] me from them, largely. I mean, you know,
[00:34:41] the affordability problem is very real.
[00:34:43] It's not that it's not being pounced on
[00:34:45] by political actors; it's obviously
[00:34:46] become a big political talking point.
[00:34:47] But it's very, very real. A ton of
[00:34:49] Americans are hurting. A lot of the
[00:34:52] fundamentals of the economy are a little
[00:34:54] shaky right now, even though those MAG7
[00:34:56] stocks that we were just talking about,
[00:34:57] pumped up by AI, are propping up the market. It's
[00:34:59] really, really tough. And so there
[00:35:01] are a bunch of related questions. One:
[00:35:03] uh, can the government do something to
[00:35:05] fix this, or is the government
[00:35:08] only going to make things worse? How
[00:35:10] is this going to affect the midterms and
[00:35:11] the 2028 election? Are we headed
[00:35:14] for an economic disaster? And Ben, you
[00:35:16] got in a huge amount of trouble because
[00:35:18] there was a short clip of you going
[00:35:20] around saying, "Yeah, listen, you know,
[00:35:22] if you can't afford stuff, move out of
[00:35:24] your move out of your town, even if it's
[00:35:25] your hometown, even if your family's
[00:35:27] been there for a long time, just get you
[00:35:28] got to get out. You got to be mobile."
[00:35:30] And you were variously, uh, exalted and
[00:35:33] pilloried for this comment. So, what's
[00:35:35] it mean?
[00:35:36] >> Yeah. So, uh, let me start with what
[00:35:38] that comment meant. That was a
[00:35:40] piece of personal advice to people that
[00:35:42] I think every single young person that I
[00:35:44] know has at some point taken, which is
[00:35:46] if you're living in a place that you
[00:35:47] can't afford and the policies aren't
[00:35:49] going to change and you want to make
[00:35:50] your life better, you do have to make a
[00:35:52] significant calculation as to whether
[00:35:53] you think your life is going to get
[00:35:54] better where you are or whether you're
[00:35:55] going to have to go pursue a dream
[00:35:57] someplace else. And and you've seen
[00:35:58] this. You've seen tremendous population
[00:35:59] movement in this country right now out
[00:36:01] of New York to places like Austin,
[00:36:03] Texas. You've seen tremendous population
[00:36:04] movement from the blue areas to the red
[00:36:06] areas of the country, specifically
[00:36:07] because people are seeking economic
[00:36:08] opportunity. So what I thought I was
[00:36:10] saying was something that's
[00:36:11] fairly obvious, which is that if you are
[00:36:13] on a personal level in a place where
[00:36:15] you're stuck and you can't afford to
[00:36:16] live there, you have to make the best
[00:36:18] decision for yourself and your family.
[00:36:19] And that does include the possibility of
[00:36:21] actually moving as opposed to shouting
[00:36:23] at the wind if the policy isn't going to
[00:36:24] change. That's a separate question from
[00:36:26] what sort of policies could be pursued
[00:36:28] in order to make things more affordable.
[00:36:29] I mean, I'll start with this. If you're
[00:36:31] talking about Manhattan: Manhattan will
[00:36:33] never be as affordable as Des Moines. It
[00:36:34] just is not going to be. And anybody who
[00:36:36] says that it is going to be is totally
[00:36:37] lying to you. It's just a
[00:36:39] flat-out lie. The reality is there
[00:36:41] are only two ways to make things more
[00:36:42] affordable. One is to drop the demand
[00:36:44] for a product and retain the same
[00:36:46] supply. The other is to radically
[00:36:48] increase the supply of a product and to
[00:36:49] retain the same demand. That's it. Those
[00:36:51] are the only ways that things become more
[00:36:52] affordable. There is no magical third
[00:36:54] way. The only way things become more
[00:36:55] affordable is if the supply greatly
[00:36:57] outstrips the demand. And the only ways
[00:36:58] to do that are to increase supply or
[00:37:00] reduce demand. That's it. So if you're
[00:37:02] talking about how to make things more
[00:37:03] affordable, one of the things you can do
[00:37:05] to increase supply is remove
[00:37:06] regulations, right? You can get rid of
[00:37:08] tax structures that disincentivize
[00:37:09] investment, you you can get rid of a lot
[00:37:12] of the difficulty in building, for
[00:37:13] example, in New York. But are you ever
[00:37:15] going to build enough units so that
[00:37:16] suddenly the the real estate prices
[00:37:17] there reflect what it would be across
[00:37:19] the river in sort of rural parts of
[00:37:21] New Jersey? The answer, of course, is
[00:37:23] no. And when people talk about
[00:37:24] affordability, the thing that makes me
[00:37:26] totally crazy about this is, I'm
[00:37:28] sick of people
[00:37:30] in politics doing this routine where
[00:37:32] they say the problem over and
[00:37:34] over, providing zero solution, and then,
[00:37:36] when you say, you know what, I don't
[00:37:37] really see a solution to the thing
[00:37:38] you're talking about, they pillory you
[00:37:40] for noting the obvious. Like, okay,
[00:37:41] Zohran Mamdani is not
[00:37:43] providing a solution. Him saying
[00:37:44] affordability didn't make affordability
[00:37:46] magically appear like Beetlejuice if he
[00:37:48] said affordability three times. And also
[00:37:50] politicians are in the business of lying
[00:37:52] to you. Okay, the president of
[00:37:54] the United States, who I generally agree
[00:37:56] with, made a mistake when he came
[00:37:57] into office and said, "I'm going to make
[00:37:59] things affordable again." The answer is
[00:38:01] no, you're probably not. And the reason
[00:38:02] you're probably not is because all of
[00:38:04] the inflation that Joe Biden embedded in
[00:38:06] the economy already made things so
[00:38:07] wildly unaffordable that the best you're
[00:38:09] probably going to do is keep prices
[00:38:10] stable. Right? What the Federal Reserve
[00:38:12] seeks to do is keep the inflation rate
[00:38:14] at, like, 2%, which is an increase in the
[00:38:17] prices just by the very nature of it.
[00:38:18] And what people actually want is for
[00:38:20] there to be deflation. They want the
[00:38:22] prices to be back at 2019 levels.
[00:38:24] They're not talking about going back to
[00:38:25] 2024 levels. They're talking about 2019
[00:38:28] levels. The only way to get back to 2019
[00:38:29] levels is probably an economic
[00:38:31] recession. That's just the reality. And
[00:38:34] so, again, I'm saying unpopular things.
[00:38:36] The best that the inflation rate
[00:38:38] could look like for President Trump is
[00:38:39] like this under Joe Biden and then like
[00:38:42] this under Trump. Okay, so here, okay:
[00:38:46] this would be Biden, this gigantic spike,
[00:38:48] and then Trump stays steady. The problem
[00:38:50] is people are looking at the prices here
[00:38:52] and they're saying, "Well, they don't
[00:38:53] look like the prices here." Well, yeah.
[00:38:55] What's Trump supposed to do about that,
[00:38:57] absent a radical increase in the
[00:39:00] interest rates that
[00:39:01] would sink the economy?
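The price-level arithmetic behind Ben's point can be sketched quickly. All numbers below are hypothetical, chosen only to illustrate the shape of the argument (the episode gives no figures): once a one-time inflation spike has happened, "stable" 2% annual inflation keeps prices drifting up, and getting back to the old price level requires outright deflation.

```python
# Illustrative compounding arithmetic (hypothetical numbers, not from the episode).
# Index the 2019 price level at 100 and assume a one-time ~20% spike.
base_2019 = 100.0
spiked = base_2019 * 1.20          # post-spike price level: 120.0

# Four more years at the Fed's ~2% target: prices keep rising, never falling.
level = spiked
for year in range(4):
    level *= 1.02
print(round(level, 2))             # 129.89 -- further from 100, not closer

# Returning from 120 to 100 requires a cumulative price DROP of about 1/6.
required_deflation = 1 - base_2019 / spiked
print(round(required_deflation * 100, 1))  # 16.7 (% deflation needed)
```

The sketch shows why "keeping prices stable" and "getting prices back to 2019" are different asks: the second implies sustained deflation, which historically accompanies recessions.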
[00:39:02] >> So, one thing that has happened,
[00:39:04] everyone was predicting that Trump's
[00:39:05] tariffs were going to be inflationary.
[00:39:07] And the Treasury Secretary, Scott
[00:39:08] Bessent, was doing a little victory lap
[00:39:11] because when he was uh being confirmed
[00:39:13] for his position, he said, "No, I
[00:39:15] actually think tariffs are going to be
[00:39:16] deflationary." And the San Francisco Fed
[00:39:18] just came out and said the tariffs are
[00:39:19] deflationary.
[00:39:19] >> No, no, no. You're reading the study
[00:39:21] totally wrong. That's not what the study
[00:39:22] says. I read the entire study. It's 150
[00:39:24] pages. What that study says is that when
[00:39:27] you look at the inflationary impact of tariffs over time,
[00:39:31] there's a spike at the beginning because
[00:39:32] things get more expensive because you're
[00:39:33] reducing the supply and the demand
[00:39:35] remains the same, right? So the price
[00:39:36] goes up temporarily and then people
[00:39:37] start to lose their jobs. And when
[00:39:39] people start to lose their jobs, the
[00:39:40] demand goes down. And when the demand
[00:39:41] goes down, the prices come down.
[00:39:43] >> No, no. So you can say it's
[00:39:44] deflationary, but there was a big caveat
[00:39:45] even in the popular reporting, which is
[00:39:47] that it hurts employment and it
[00:39:50] hurts economic growth. Yeah. So there's an
[00:39:51] obvious...
[00:39:53] >> There's one further point on it, just
[00:39:56] to why I think your video went viral, Ben.
[00:39:58] It's because people are not
[00:40:00] missing the
[00:40:02] context that you're giving personal advice
[00:40:04] to someone who's asking, you know; but at
[00:40:06] a macro level, at a political level,
[00:40:08] what people are hearing is: hold on,
[00:40:09] you're telling me my family's been in
[00:40:11] this town forever. I'll use my own
[00:40:12] example. I got I have dozens of family
[00:40:14] members buried in the local uh cemetery
[00:40:18] in my hometown and even before that go
[00:40:20] the Knowles initially were from New
[00:40:22] Hampshire and they you know they arrived
[00:40:24] here the Knowles side in 1660 the Knowles
[00:40:27] family home stood from 1660 until 1994
[00:40:31] when the home burned down. There are
[00:40:33] still Knowleses all over that area in New
[00:40:36] Hampshire and Maine. And what I think a
[00:40:38] lot of people are looking around at is
[00:40:40] part of the reason that housing in
[00:40:42] particular is is unaffordable right now
[00:40:44] is because of government decisions.
[00:40:46] Government decisions to flood the
[00:40:48] country with a bunch of like Venezuelan
[00:40:50] criminals or Somali or something and
[00:40:52] increase the cost of housing or
[00:40:53] government decisions that are are going
[00:40:56] to compromise certain industries or
[00:40:57] certain jobs because of trade deals or
[00:40:59] whatever going all the way back to NAFTA
[00:41:00] or even further. We don't need to to
[00:41:02] litigate those in in particular. But
[00:41:04] you're saying, "No, it's part of this
[00:41:06] political order that has led to this
[00:41:09] crisis at the very least with
[00:41:10] migration." And so why is it that I'm
[00:41:12] just supposed to say, "Aw shucks, I got
[00:41:14] to lose my hometown because well, you
[00:41:16] know, Republicans and Democrats together
[00:41:18] flooded the country with with aliens."
[00:41:21] Isn't there a good to having, you know,
[00:41:24] long family histories in a single place?
[00:41:27] >> Of course. Sure. And there's and there's
[00:41:28] a single and there's a good to having
[00:41:29] your family live near you. I have tons
[00:41:31] of family that lives near me. I'm a
[00:41:32] person who who grew up in LA. I spent my
[00:41:34] entire life living in LA until I was 35,
[00:41:36] one mile from my parents and then I
[00:41:38] moved to Florida and I still live one
[00:41:39] mile from my parents because I took them
[00:41:40] with me. So, I'm very much in favor. One
[00:41:41] of the things I talk about on the show
[00:41:42] all the time is having family structures
[00:41:44] nearby because you need those supportive
[00:41:46] family structures. That's not the case
[00:41:47] that I'm making is that you should
[00:41:49] abandon this sort of stuff or that mass
[00:41:51] migration should replace you in your
[00:41:52] hometown. I think everyone here is very
[00:41:54] much against mass migration is in very
[00:41:56] much in favor of what President Trump
[00:41:57] has been doing on the immigration
[00:41:58] program. That the problem that I see is
[00:42:00] is not any of that. I agree with all
[00:42:01] this on policy, but if there's a
[00:42:03] mentality that sets in that says I bear
[00:42:05] no responsibility in changing my own
[00:42:06] life if I can't change the outside
[00:42:08] circumstances and now I'm just going to
[00:42:09] sit here and complain about it like that
[00:42:11] doesn't seem like a a specific recipe
[00:42:12] for individual success. But Matt, I I
[00:42:14] want to know what you take cuz I I think
[00:42:16] you and I are as as usual we are on
[00:42:18] opposite ends of the spectrum in some
[00:42:19] ways.
[00:42:19] >> I agree with your practical point and I
[00:42:22] agree also with maybe I'm sort of in
[00:42:24] between because I agree with your your
[00:42:26] point. I also agree with some of the
[00:42:27] criticism, the more the more rational.
[00:42:29] >> You have a moderate position, Matt.
[00:42:31] >> Well, no, because here's here's the way
[00:42:32] I would put it. Ben's correct, and I and
[00:42:34] I've said the same thing many times that
[00:42:36] uh especially as a young man, I also
[00:42:37] think there's a gender element to this
[00:42:38] that is a is a sort of a different
[00:42:40] topic, but like as a as a parent, I want
[00:42:43] my sons when they become adults to move
[00:42:45] out of I don't want them to move 10
[00:42:47] hours away hopefully, but if they have
[00:42:48] to, they have to. I do want them to like
[00:42:50] move out and, you know, experience
[00:42:52] living on their own a little bit uh
[00:42:53] before they become before they become
[00:42:55] husbands and fathers. My daughters, I I
[00:42:57] would love for them to just stay home
[00:42:58] with me until they get married many many
[00:43:01] many years in the future. So, I do think
[00:43:03] there's like a gender element to it, but
[00:43:04] that's a separate thing. I think if I I
[00:43:07] totally agree that if you're in a spot,
[00:43:09] particularly if you're a young man and
[00:43:12] you can't afford anything, you can't get
[00:43:13] a job, can't afford to live anywhere,
[00:43:16] while you're single, you have you have
[00:43:18] no kids, you have no dependents, you can
[00:43:20] go anywhere and do anything and you can
[00:43:22] take risks and you know, the stakes are
[00:43:25] are pretty low. I mean, worst case
[00:43:27] scenario, you go somewhere, you end up
[00:43:28] sleeping in your car or something for a
[00:43:30] while. I mean, that's not good, but it's
[00:43:31] like, well, it's just you. You can you
[00:43:33] can handle that, especially as a young
[00:43:34] man. So you could take you could take
[00:43:35] risks. You can go out and and and and
[00:43:37] pursue opportunities. Uh however, at the
[00:43:40] same time, it's also true that you
[00:43:43] shouldn't have to do that. Like
[00:43:45] something is wrong that so many people
[00:43:47] have to do that. You should be able to
[00:43:50] to Michael's point when if you're a
[00:43:52] young man and you're looking at, okay,
[00:43:53] well my my parents were born here. They
[00:43:54] lived here. My grandparents lived here.
[00:43:57] Maybe my grandpa my great-grandparents
[00:43:59] lived here. So generations of a family
[00:44:00] lived in the same place. And now all of
[00:44:02] a sudden, and I'm I I I have the same
[00:44:04] kind of skills that they do. I might
[00:44:06] even be more more educated than they
[00:44:07] were. So I'm in many ways more qualified
[00:44:10] for a job than even any of them were.
[00:44:11] And yet all of a sudden, everything's
[00:44:13] broken down. It doesn't work for me to
[00:44:14] live in this town anymore. Something is
[00:44:16] wrong. Something is broken. It should
[00:44:18] not be this way. We need to fix it. So,
[00:44:20] but on the practical level, well, it is
[00:44:22] this way now. And we want you to still
[00:44:24] succeed. So you might have to go
[00:44:26] somewhere else hopefully with the intent
[00:44:28] of eventually coming back to live around
[00:44:29] your family because I totally believe I
[00:44:31] mean we we emphasize the nuclear family
[00:44:33] so much which is important but also the
[00:44:35] the quote unquote extended family is
[00:44:37] also important so getting back to them
[00:44:39] and that's what you know what a lot of
[00:44:40] us did what I kind of did move around
[00:44:42] move around end up back with your family
[00:44:44] um so you might have to do that
[00:44:45] practically
[00:44:47] >> you shouldn't have to it shouldn't be
[00:44:48] that way that's the policy end of it and
[00:44:51] so we need policies in place that make
[00:44:53] it possible for people to live with
[00:44:55] their family and then move next door and
[00:44:58] stay with generations of families their
[00:45:00] entire life. You should be able to do
[00:45:01] that in a functioning and thriving
[00:45:03] society. One of the ways to make that
[00:45:05] happen is the thing we all agree with.
[00:45:07] Uh get all the illegals out. There's a
[00:45:09] lot we've been they've been saying 20
[00:45:11] million illegals in this country.
[00:45:12] They've been telling me that since I
[00:45:13] like 20 years ago they were saying it
[00:45:15] was 20 million. It's way more than that.
[00:45:16] We don't know how many. Get them all
[00:45:18] out. Shut down immigration. And uh
[00:45:20] that's one of the policy changes that
[00:45:22] could be made and we need to do that.
[00:45:23] But until that happens, yeah, you got to
[00:45:25] figure out what you're going to do in
[00:45:26] your own rare moment. Totally agree.
[00:45:28] >> I want to hear Ben's point and I want to
[00:45:30] hear from my great-great-grandfather
[00:45:32] Andrew Klavan.
[00:45:35] >> I was just going to say I agree with
[00:45:36] Matt actually. So Matt and I are
[00:45:37] actually in total agreement on this.
[00:45:38] >> Okay, now I really want to move on
[00:45:40] because Matt's offering a moderate
[00:45:41] opinion and Ben is agreeing with him. I
[00:45:43] want to tell you at the other end of the
[00:45:45] age spectrum about pre-born. I want you
[00:45:48] to go to pre-born.com/fire
[00:45:52] right now because Pre-born is is one of
[00:45:54] my absolute favorite uh charities. I
[00:45:57] personally support it. I encourage you
[00:45:58] to personally support it to give what
[00:46:00] you can. They've saved over 380,000
[00:46:02] babies uh through their rescue program.
[00:46:05] Uh what they do is pretty simple. They
[00:46:08] introduce babies to their mothers. And
[00:46:11] when a woman sees an ultrasound, it
[00:46:13] doubles the baby's chance of life. When
[00:46:15] a woman is considering abortion, it it's
[00:46:17] they provide amazing care and work. Not
[00:46:20] only do they introduce the babies to the
[00:46:21] mothers, they also take care of those
[00:46:23] mothers afterward. Radically increase
[00:46:25] the chances that that baby is going to
[00:46:27] live and that they will have a
[00:46:28] successful life. This giving season, do
[00:46:32] not let another life be lost. Be the
[00:46:34] hope for worried mothers and at risk
[00:46:36] babies to donate securely. Two ways to
[00:46:38] do it. If you like your phone, if you're
[00:46:40] a little more of a Luddite than some of us,
[00:46:42] you're not on the AI train,
[00:46:43] you dial pound 250, you say keyword
[00:46:46] baby, pound 250, keyword baby. Or you go
[00:46:48] to pre-born.com/fire,
[00:46:50] pre-born.com/fire.
[00:46:51] Every gift is tax-deductible. So, it's
[00:46:54] another way of not having to pay all
[00:46:56] those bureaucrats in Washington. It's a
[00:46:57] your money can be put to good use and
[00:46:59] not be put to bad use. Okay. Ben agrees
[00:47:02] with Matt. Matt has a moderate opinion.
[00:47:04] I'm totally scandalized and I want to
[00:47:06] hear from Drew. So, I disagree with Ben
[00:47:09] in a couple of ways here. I mean, first
[00:47:11] of all, Zohran Mamdani is is one of the
[00:47:13] scummiest politicians I've ever seen in
[00:47:15] my entire life. But he did do half the
[00:47:17] job. He did raise the issue. And when
[00:47:20] you raise the issue, people people perk
[00:47:22] up. No, it's it's a terrible thing. He
[00:47:24] raised the issue and then offered
[00:47:25] socialist solutions that we know will be
[00:47:27] utterly utterly destructive. It's not
[00:47:30] playing Candyman to say the word that
[00:47:32] people are thinking about. The worst
[00:47:34] thing a politician can do and the thing
[00:47:35] will destroy any administration is to
[00:47:38] show people a chart that shows them
[00:47:40] they're not suffering when they can't
[00:47:41] afford Christmas presents for their
[00:47:42] kids. Like here's the chart, you're
[00:47:44] doing great, you know, and people know
[00:47:46] exactly how they're doing and it it
[00:47:48] makes them incredibly frustrated. What
[00:47:49] what they're frustrated with Trump now
[00:47:51] is he's doing something I think is urgently
[00:47:53] important. I think we're going to be
[00:47:54] very grateful to Trump for what he did
[00:47:56] five, six, seven years down the line
[00:47:58] when China finally invades Taiwan. I
[00:48:00] think he's totally rearranged America's
[00:48:03] priorities in absolute great ways, but
[00:48:05] he didn't pay attention to the thing
[00:48:07] that's right there on the table and he
[00:48:08] has to pay attention to it. Now, the
[00:48:10] other thing I disagree with is normally
[00:48:12] it is true that you have to put people
[00:48:13] out of work to bring down inflation.
[00:48:15] That's what Reagan did and he lost the
[00:48:17] midterms. He didn't lose the
[00:48:19] houses, but he lost the midterms because
[00:48:21] of it. And everybody said, "Oh, this is
[00:48:22] a disaster." And then the economy turned
[00:48:24] around for the next 25 years because of
[00:48:26] what Reagan did. But the other thing
[00:48:28] that the there is a third way of of uh
[00:48:32] dealing with inflation which is raising
[00:48:34] the investments and the salaries of
[00:48:37] people. If you can steady you know if
[00:48:38] you can cut inflation off and make the
[00:48:41] prices level out and then wages start to
[00:48:43] rise then you can actually that is the
[00:48:46] same thing as bringing down inflation.
[00:48:47] Now people can afford the things they
[00:48:49] couldn't afford before. So Matt is
[00:48:51] incred totally right that we got to get
[00:48:53] rid of all the illegals and as far as
[00:48:55] I'm concerned I don't care who it is.
[00:48:56] I've lost all sympathy with the the
[00:48:58] illegal immigrations. I know some of
[00:49:00] these people are great people who snuck
[00:49:01] in. They got to go. Everybody's got to
[00:49:03] go and we got to give the country back
[00:49:04] to the people who are here and who were
[00:49:06] born here. No question about that. In my
[00:49:08] mind, I cannot have compassion for 20
[00:49:10] million people. I can only have
[00:49:11] compassion for one person at a time. If
[00:49:13] one guy sneaks in, I can have compassion
[00:49:15] for him. I can't have a compassion for
[00:49:17] an invading army, which is what the
[00:49:18] Biden administration gave us. But the
[00:49:20] other thing is we have to have
[00:49:22] capitalist solutions. And I think there
[00:49:24] are capitalist solutions. For instance,
[00:49:26] I think a lot of companies are now
[00:49:27] offering people stock. A lot more
[00:49:29] companies are offering people stock and
[00:49:31] investment as payment as part of the
[00:49:33] payment. I got that when I worked for
[00:49:35] Coca-Cola. I was a reader for Columbia
[00:49:37] Pictures and Coca-Cola owned them and
[00:49:39] they gave me Coke stock. It was it was
[00:49:41] transformative. I mean, it was I all I
[00:49:43] had to do is hold on to it. And now I
[00:49:44] had an investment in the company and in
[00:49:47] the economy and I think that's really
[00:49:48] important. Trump is talking about
[00:49:49] personal savings accounts that I think
[00:49:51] is also a really good idea. Some of his
[00:49:54] ideas, like the 50-year mortgage, I'm
[00:49:55] not too happy about because that's like
[00:49:57] double the price of homes. But still, it
[00:49:59] might it might
[00:50:00] >> Lifetime debt slavery. [laughter]
[00:50:02] >> Lifetime debt slavery. Yeah. So, but I
[00:50:05] think that there are ways for
[00:50:06] capitalists to increase people's
[00:50:08] participation in the economy so that
[00:50:11] when things work for the bosses, they
[00:50:14] work for the people too. I think this
[00:50:16] it's a wonderful thing that this country
[00:50:18] when it is working on all cylinders and
[00:50:20] when the capitalism is in place, it
[00:50:22] makes so much money that the big guys
[00:50:25] can afford to share a little of it
[00:50:27] with the little guys. Not by having the
[00:50:29] government redistribute it, but by
[00:50:30] saying here's a piece of what you're
[00:50:32] working for. Starbucks did it. It worked
[00:50:33] really well for a long time and I think
[00:50:35] all the a lot of companies should do it.
[00:50:37] And so I think that there are ways of
[00:50:39] dealing with this, but I think that that
[00:50:40] dealing with it is something government
[00:50:42] has to do. It is a policy problem.
[00:50:44] Government creates inflation. People do
[00:50:46] not. It's not the greedy banks. It's not
[00:50:48] the greedy, you know, drugstores or
[00:50:50] whatever. It's it's the government that
[00:50:51] creates inflation. They can they can
[00:50:53] actually do things to bring it down. And
[00:50:55] I think one thing you're right that we
[00:50:56] don't want deflation because it means
[00:50:58] the economy is tanking. But you can get
[00:51:00] wages growing in a lot of different
[00:51:01] ways. One of them by reducing the
[00:51:03] workforce by getting rid of the people
[00:51:04] who shouldn't be here would be a great
[00:51:06] first step.
[00:51:07] I don't disagree with some of those
[00:51:09] those policy prescriptions, but I think
[00:51:10] that the thing that I am am kind of
[00:51:12] stuck in and it's driving me a little
[00:51:14] crazy is and I think it's the reason why
[00:51:15] the country is penduluming
[00:51:17] [clears throat] side to side incredibly
[00:51:18] wildly. You'll see you'll see like right
[00:51:21] now the you know Kalshi is one of our
[00:51:23] sponsors. I'll mention them again here
[00:51:24] because I did on my show earlier. But if
[00:51:26] you look at the polls right like the
[00:51:27] Kalshi markets right now, Democrats
[00:51:29] according to that market and I kind of
[00:51:30] agree with this are actually the
[00:51:31] favorites in 2028. Uh, and I think the
[00:51:34] reason for that and I think the reason
[00:51:35] that the country just keeps swinging
[00:51:36] wildly is because when you have
[00:51:39] politicians who are actively saying the
[00:51:41] same thing but none of them are saying
[00:51:42] what is true, this is what you end up
[00:51:44] with. So if everybody says affordability
[00:51:46] is I agree affordability is a problem.
[00:51:48] This is why I'm kind of waving that
[00:51:49] away. I can it's it labeling problems is
[00:51:51] the easiest thing in the world. You can
[00:51:52] do it in your life all day long. And I
[00:51:54] can agree with my wife on every single
[00:51:55] problem that exists in our life. It's
[00:51:57] when you get to the solutions that
[00:51:58] things get a little bit complicated. And
[00:52:00] when you have politicians who always say
[00:52:02] the same thing but from different sides
[00:52:03] of the aisle, which is you're right,
[00:52:04] it's government's job to solve it. Okay,
[00:52:06] there's only one problem. If the thing
[00:52:07] that you're saying is not going to solve
[00:52:09] it, and you're asking for additional
[00:52:11] centralized power in order to solve the
[00:52:13] thing, what you are going to end up with
[00:52:14] is failure. And then the other guy is
[00:52:16] going to say, "Give it to me." And so
[00:52:18] they're just passing the ball side to
[00:52:19] side. The only thing that is going to
[00:52:20] create affordability is a dynamic and
[00:52:23] innovative economy, which means a few
[00:52:25] things. One, a consistent level of
[00:52:27] regulation or less regulation, right?
[00:52:29] like actual certainty and what's going
[00:52:30] to happen tomorrow in the economy. Two,
[00:52:32] you're actually going to need innovators
[00:52:34] to innovate and you need to leave them
[00:52:35] alone and allow them to innovate and
[00:52:37] actually capture the profits that
[00:52:38] they're creating through innovation. And
[00:52:40] then you're going to need to get the
[00:52:41] hell out of the way. I mean, the the
[00:52:42] magic of the Reagan economy. I know
[00:52:44] Reagan has now become an anathema for
[00:52:45] some reason that I cannot even imagine I
[00:52:47] can't imagine why the right has decided
[00:52:49] that Reagan was suddenly bad other than
[00:52:51] because we we need to cast up a false
[00:52:53] villain in order to elevate you know
[00:52:55] whatever the new
[00:52:56] >> amnesty irritated some people in
[00:52:57] retrospect I'm not saying everything
[00:52:59] about Reagan was I'm not saying
[00:53:00] everything about Reagan was wonderful
[00:53:01] but I don't think everything about Trump
[00:53:02] is wonderful either I I do think that
[00:53:03] the Reagan economy generated more job
[00:53:06] growth and pulled us out of a greater
[00:53:08] economic morass than any president in
[00:53:10] history probably and so I think he that
[00:53:13] that is worth something. And so if you
[00:53:15] look at at, you know, Reagan, Reagan's
[00:53:17] pitch, his pitch was, I can't solve all
[00:53:19] your problems for you, but I can get the
[00:53:20] government out of your way so you can
[00:53:21] solve your own problems. And I just want
[00:53:23] one politician who will say that, like
[00:53:25] just one, as opposed to this kind of
[00:53:26] centralized government bull where
[00:53:28] everybody says, "No, no, don't worry.
[00:53:30] You sit there and I'll solve all your
[00:53:31] problems for you." No one is going to
[00:53:32] solve the vast majority of problems in
[00:53:34] your life. No politician will do it. the
[00:53:36] best they can do is get rid of the
[00:53:37] obstacles that are in your way. The
[00:53:38] systemic obstacles that are in your way
[00:53:40] and then most of the decisions in a free
[00:53:42] country ought to be up to you and that
[00:53:43] is scary because it means that actually
[00:53:45] your success or failure is largely on
[00:53:46] your own shoulders.
[00:53:49] >> On this, Ben, this is different. I agree
[00:53:50] with Ben 100% on all of it.
[00:53:52] >> No, but I I in defense of those who are
[00:53:54] critiquing Reagan obviously I still love
[00:53:56] St. Gipper and politicians come and go.
[00:53:58] You know, Nixon was in in the crater for
[00:54:00] a while. Now Nixon's making a comeback.
[00:54:02] Coolidge was the man for a while. Now
[00:54:03] people are looking more toward I don't
[00:54:04] know, they like Teddy Roosevelt. They
[00:54:05] used him. So this happens as we rethink
[00:54:08] uh history and as we move on to new
[00:54:10] circumstances. Part of the reason that
[00:54:11] there's a little more of a critical
[00:54:13] lens, you know, as opposed to just
[00:54:15] exalting St. Reagan as being perfect
[00:54:18] in all ways is because, you know, in in
[00:54:20] the 80s, mass amnesty for illegal
[00:54:23] aliens, for example, wasn't really all
[00:54:25] that big a deal, but it did set the
[00:54:26] stage for a major problem. And so, we're
[00:54:28] rethinking that. In in the 80s, uh, you
[00:54:31] know, obviously Reagan was massively
[00:54:33] successful in his economic policy, as
[00:54:35] was Thatcher, as was that that whole
[00:54:36] kind of movement. We do live in a a
[00:54:39] different world today. And so it's not
[00:54:40] to say we throw out all of their sol
[00:54:42] it's not to say that we throw out all of
[00:54:43] their solutions, but it's to recognize
[00:54:44] that there are more difficult economic
[00:54:46] problems that we have to deal with. And
[00:54:47] so one of, you know, Drew actually
[00:54:49] offered some real solutions here, which
[00:54:50] is uh he you pointed out, Drew, that
[00:54:54] having people really bought into the
[00:54:56] economy, you know, Coca-Cola giving you
[00:54:57] some stock back in the day is helpful.
[00:54:59] Back when we were rethinking some of the
[00:55:01] problems with industrial capitalism a
[00:55:02] 100 years ago, you had writers,
[00:55:04] especially Catholic writers like
[00:55:05] Chesterton and Belloc saying we need
[00:55:07] some option, not socialism and
[00:55:09] communism, not pure unbridled
[00:55:10] capitalism, but some other option. They
[00:55:13] propose something called distributism,
[00:55:14] which is too complicated to get into
[00:55:15] here and probably isn't all that
[00:55:17] practical, but part a lot of what it
[00:55:19] comes down to is give people some
[00:55:20] ownership, give people some stake, and
[00:55:22] and I think that's really really
[00:55:24] important. And so here's another
[00:55:25] criticism maybe of what came out of the
[00:55:26] Reagan era is that we judge the health
[00:55:28] of an economy purely by GDP. And GDP is
[00:55:32] a fine economic indicator, but it's not
[00:55:33] the be-all and end-all of everything. And
[00:55:35] I think what a lot of people are looking
[00:55:36] around at today is saying, look, you can
[00:55:38] show a lot of economic activity uh in
[00:55:41] all sorts of ways by the pornography
[00:55:43] industry to use the topic we keep coming
[00:55:44] back to. You know, the pornography
[00:55:46] industry is booming. Look at that. GDP
[00:55:47] is going up. You know, there are all
[00:55:48] sorts of very destructive industries. We
[00:55:51] we brag now about how women's employment
[00:55:53] is the highest ever. I'm not sure that's
[00:55:55] a great thing, you know? I mean, who's
[00:55:56] taking care of the kids? Who's watching
[00:55:58] the home? Isn't there some cost to that
[00:55:59] as well? And so, I just I I wonder one
[00:56:02] slightly practical solution might be to
[00:56:04] say, "All right, look, maybe GDP isn't
[00:56:06] the be-all and end-all of everything." And
[00:56:08] maybe there are certain areas of the
[00:56:09] economy that are legitimately immoral
[00:56:11] and destructive, and we used to heavily
[00:56:13] regulate them, like pornography, for
[00:56:15] instance, but all sorts of other kind of
[00:56:17] vicious and degrading avenues. We've
[00:56:20] liberalized gambling. I don't know that
[00:56:21] that's really great. Maybe that maybe it
[00:56:23] ticks up GDP a little bit, but it
[00:56:24] doesn't I don't think that's really
[00:56:25] great for the true health of an economy.
[00:56:27] Maybe we need to rethink what economic
[00:56:29] health really looks like because uh the
[00:56:31] changes that came about in the late part
[00:56:33] of the 20th century h did have some
[00:56:35] negative side effects as well as
[00:56:37] positive outcomes.
[00:56:37] >> Can I can I address the Reagan thing for
[00:56:39] a minute though because a lot of this I
[00:56:41] think started with that Caldwell book,
[00:56:42] The Age of Entitlement, in which he he
[00:56:44] blamed Reagan for things that Reagan
[00:56:46] actually did. Reagan said he failed to
[00:56:47] cut down the government. That was the
[00:56:48] big failure of his administration. But
[00:56:50] we've edited the Cold War out of
[00:56:51] history. And you know, Reagan like won
[00:56:54] the Cold War. He freed like a huge huge
[00:56:57] section of the world of the globe. He
[00:57:00] set people free. And what what they did
[00:57:01] with that is up to them. But he he
[00:57:03] actually did that. That you can't
[00:57:05] imagine how unheard of that was, how
[00:57:08] unexpected it was, how nobody thought it
[00:57:10] would ever happen, how we were dealing
[00:57:11] with the Soviet Union for the rest of
[00:57:12] our lives. Not just people who thought
[00:57:14] that communism was going to work, but
[00:57:16] people who thought it's just never going
[00:57:17] to go away. He he made it go away. And I
[00:57:20] think for that he's a he's a hero. And
[00:57:22] yeah, what what Knowles is saying is
[00:57:23] true. We now are living in an absolutely
[00:57:25] new economy. And while the basis of
[00:57:27] deregulation well the bas
[00:57:29] >> Totally disagree. There's no such
[00:57:30] thing as a new economy.
[00:57:32] >> Let me finish. Let me finish. The ba the
[00:57:34] basis of deregulation and freedom and
[00:57:36] and uh free markets are absolutely the
[00:57:39] same. They don't change at all. You
[00:57:41] know, but the problems that arise
[00:57:43] because pro no no system solves human
[00:57:46] problems because human beings can't be
[00:57:47] solved the the problems that arise and
[00:57:50] the and the places where the peaks of
[00:57:51] problems are change and then we have to
[00:57:53] address those and one of them them
[00:57:54] you're absolutely right. One of them one
[00:57:57] of the key ones is the role of women in
[00:57:58] our society which I think is screwed up
[00:58:01] so badly that it's it's destroying
[00:58:02] everything. We've actually stopped
[00:58:04] reproducing which to me is always a bad
[00:58:06] sign. you know
[00:58:07] >> that economic indicator another
[00:58:09] indicator
[00:58:10] >> I mean so actually I this teaches me a
[00:58:12] lesson I should let Drew finish his
[00:58:13] sentences because when he finishes them
[00:58:14] I'm more likely to agree with them but
[00:58:16] [laughter] uh but at
[00:58:17] >> that'll be a whole new relationship but
[00:58:19] at the same time you know Knowles I I'll
[00:58:21] pick on you a little bit when we say you
[00:58:22] know terrible we shouldn't look at GDP
[00:58:24] it's not a good indicator of economic
[00:58:26] >> it's not the be-all and end-all
[00:58:27] >> okay it's not the be-all but it's the
[00:58:29] be-all okay so there's no such thing as an
[00:58:30] economic be-all and end-all okay but I
[00:58:32] think that we are mixing up a few
[00:58:33] terminologies here and I think that we
[00:58:35] ought to tease out the strands for
[00:58:37] one second. There's a difference between
[00:58:38] economic health and societal health.
[00:58:39] These are not the same thing. And you
[00:58:41] you can have a very economically healthy
[00:58:43] society that is that is breaking down in
[00:58:45] a lot of social ways with with
[00:58:46] tremendous pathologies. I think that's
[00:58:48] what you're actually seeing. And so yes,
[00:58:50] it turns out that we are materially
[00:58:52] significantly better off than we were in
[00:58:53] the 1980s. In fact, we are materially
[00:58:55] significantly better off than we were in
[00:58:56] the mid-2000s. When when people talk
[00:58:58] about the unaffordability of homes,
[00:59:00] that's because an average home in 1950
[00:59:01] was a 980, you know, square-foot
[00:59:05] brick house with no insulation and no
[00:59:07] heating or air and maybe a bathroom
[00:59:09] outside. Like the this kind of idea that
[00:59:11] we're living worse than your parents or
[00:59:12] grandparents is just belied by every
[00:59:14] available fact. Maybe you're living
[00:59:16] worse than your grandparents are right
[00:59:17] now, but you're not living worse than
[00:59:19] your grandparents were at the same age.
[00:59:20] Right? But if you're a 20-year-old
[00:59:22] living in 2025, you are not worse off
[00:59:24] than your grandparents were living as a
[00:59:26] 20-year-old in 1958 or 1960.
[00:59:28] >> You have an iPhone, but you don't have a house. I
[00:59:30] mean, I do now, but you don't have
[00:59:32] >> your apartment is nicer than their house
[00:59:34] was. Okay, that is a reality. If you
[00:59:36] were living anywhere except for New York
[00:59:37] City, and and by the way, the idea that
[00:59:40] you couldn't move somewhere and get a
[00:59:41] house, that that's this is now you're
[00:59:43] getting back to my original point, which
[00:59:44] is on a personal level, if you want to
[00:59:45] live a life like your grandparents, you
[00:59:47] might have to do the thing that your
[00:59:48] grandparents did. Okay? your
[00:59:49] grandparents went to a war and then they
[00:59:50] came back and moved to a town that they
[00:59:52] actually probably did not grow up in and
[00:59:53] then they got a house that was like off
[00:59:55] the lot from some big
[00:59:58] corporation that built a bunch of
[00:59:59] standard box-looking houses that now you
[01:00:01] drive past those on the freeway and you
[01:00:02] say I can't believe somebody ever lived
[01:00:03] in those. So this kind of, you know,
[01:00:05] rose-colored glasses about the past
[01:00:06] drives me a little bit insane. And again,
[01:00:09] I think that if we want to look at the
[01:00:10] real problems in our society we
[01:00:11] shouldn't create a mythical past and we
[01:00:13] shouldn't create a mythically terrible
[01:00:14] present. We should actually look at the
[01:00:16] problems in our society. And one of
[01:00:17] those would be people not having kids.
[01:00:19] One of those would be deep depression
[01:00:20] and unhappiness. People killing
[01:00:22] themselves with opioids. You know,
[01:00:23] people, yes, people having their
[01:00:25] jobs taken by illegal immigrants in
[01:00:26] certain industries. Like those are
[01:00:27] actual real solvable problems. But I
[01:00:29] don't have a Delorean. All I have right
[01:00:31] now is the way that people are living
[01:00:33] right now. And so now we have to look at
[01:00:35] the problems in front of us and how do
[01:00:36] we solve those?
[01:00:37] >> Yeah, but that's the one part where,
[01:00:39] so at the buzzer, I get to
[01:00:41] disagree with you, Ben. I remember
[01:00:43] there was one thing you said in
[01:00:45] that clip that I did disagree
[01:00:46] with; I couldn't remember it, then you just
[01:00:47] said it again. Uh, so the one
[01:00:50] part about, well, this is, you know,
[01:00:51] how America's always been,
[01:00:53] that you leave and you go
[01:00:55] somewhere else away from your family.
[01:00:57] I think that, like, back in the pioneer
[01:00:59] days, I mean, there is something
[01:01:00] about that that's in the American spirit,
[01:01:02] like literally going out into a
[01:01:04] wilderness and building your own life
[01:01:07] maybe a thousand miles away from anyone
[01:01:09] that you know, and so that's
[01:01:11] American in a certain sense, but that was
[01:01:13] back in the pioneer days. I think for
[01:01:15] most of American
[01:01:17] history, it's like anywhere else in
[01:01:19] the world. People grew up in a
[01:01:21] place, they didn't move that far away.
[01:01:22] They stayed where their support system was. We
[01:01:25] are less mobile now, by the stats. We
[01:01:27] are less mobile now than we have ever
[01:01:28] been at any time in American history. Quick,
[01:01:30] raise your hand if you are currently
[01:01:32] living in the town where you grew up.
[01:01:34] >> You're saying we're
[01:01:35] less mobile now.
[01:01:37] >> And I'm saying that we are a unique
[01:01:38] breed in that we're actually a
[01:01:40] little older than the Gen Zers. Okay.
[01:01:43] But the people who tend to
[01:01:44] be more successful, and again, as a piece
[01:01:46] of advice, are the people who tend to
[01:01:48] actually move in pursuit of opportunity.
[01:01:50] And if you look, historically speaking, it
[01:01:52] is not true that in 1920 everybody was
[01:01:53] living in the town where they grew up.
[01:01:55] In fact, in 1920 there were more people
[01:01:56] moving across the country at
[01:01:58] great expense and difficulty than there
[01:02:00] are today in 2025.
[01:02:04] Exceptional people
[01:02:06] move. They go into the wilderness, they
[01:02:08] build new towns, but most people are not
[01:02:09] exceptional
[01:02:11] people. Yes. Yeah. Right. So,
[01:02:13] and you want a country
[01:02:15] filled with communities and filled with,
[01:02:17] you know, people with traditions and
[01:02:18] things like that. So, I kind of half
[01:02:20] agree with you on this. I do believe
[01:02:21] that exceptional people should and will
[01:02:23] move. But I but I think that that Matt
[01:02:25] is right that it shouldn't be like that
[01:02:26] for everybody.
[01:02:27] >> Sorry. Go back to Matt and Matt can
[01:02:28] finish disagreeing with me being a jerk
[01:02:29] again.
[01:02:30] >> Uh, no, I think,
[01:02:32] I don't know about the claim
[01:02:35] that people were more mobile in the
[01:02:37] 1920s. There's also a
[01:02:40] technological side of this too, that
[01:02:42] for a lot of American history, you know,
[01:02:43] moving away from your family and
[01:02:46] going to another state over was like a
[01:02:48] three-month journey, and, you know, people
[01:02:50] were going to die along the way. So
[01:02:52] that's one of the reasons why we
[01:02:53] know that for a lot of, you
[01:02:56] know, American history and human history,
[01:02:58] people didn't tend to do that. I mean,
[01:02:59] sometimes they did, but again,
[01:03:01] that's like you're a pioneer. Um, I
[01:03:04] think that, at the very least, and I
[01:03:05] don't think we're disagreeing on
[01:03:06] this point, the desire to stay in
[01:03:11] your community where you were born,
[01:03:13] where your family is, to stay with your
[01:03:15] support system, with
[01:03:17] your family and your friends,
[01:03:18] that's a good desire. There's nothing
[01:03:20] wrong with that.
[01:03:21] >> I agree with that.
[01:03:22] >> And a healthy country is one
[01:03:24] where people, if they want to do that, are
[01:03:27] able to do it. But I think that's
[01:03:29] the part
[01:03:30] >> I think we all agree on that, right?
[01:03:31] That's, you know, this gets back,
[01:03:33] though, to this point of the neat and
[01:03:35] pat distinction between economic health
[01:03:37] and social health. Obviously, they're distinct
[01:03:41] concepts, but I'm not sure that we can
[01:03:42] totally separate them, you know,
[01:03:44] especially as, increasingly in the modern
[01:03:46] age, we think of ourselves as
[01:03:47] homo economicus, you know, we're like
[01:03:49] primarily economic creatures. And
[01:03:52] I think we're just integral
[01:03:53] creatures and we have all of these
[01:03:55] things together. And so, you know,
[01:03:57] especially at this kind of moment, you
[01:03:59] look now, compare it to 1980, or 1880 for
[01:04:03] that matter. One of the major problems
[01:04:04] that we have is that social solidarity
[01:04:07] has really frayed, that religiosity has
[01:04:10] declined precipitously, though there are
[01:04:12] some signs that that's turning around.
[01:04:14] And you can't divorce that from the
[01:04:16] birth rate problem. You know, you can't
[01:04:18] divorce that from the fact
[01:04:19] that people aren't having kids. These
[01:04:20] are great predictors. You know,
[01:04:21] stability, tradition, and religion are
[01:04:23] predictors of people having kids.
[01:04:25] And you can't divorce that from the
[01:04:26] economic problems because if we don't
[01:04:28] import the entire third world, we're
[01:04:30] told that our our economy is going to
[01:04:32] collapse, that GDP is going to collapse.
[01:04:33] So that's the whole argument for mass
[01:04:35] migration. And so these problems are all
[01:04:37] so deeply intertwined that it seems to
[01:04:39] me that there has to be some firmer
[01:04:42] political solution, rather than
[01:04:45] just saying, look, we're going to let the
[01:04:47] free hand of the market, you know, work
[01:04:49] its way and we'll let the chips fall
[01:04:50] where they may. A lot of people are
[01:04:51] looking around and saying I don't like
[01:04:52] where the chips are falling.
[01:04:54] >> Well, I mean, this is a great place for
[01:04:56] us to conclude, because I'm going to
[01:04:57] disagree for one second with Knowles and
[01:04:59] just say that there are many many more
[01:05:01] impoverished countries than the United
[01:05:03] States that have less severe pathologies
[01:05:05] than the United States. And in the past
[01:05:07] we were a less wealthy nation with less
[01:05:10] severe pathologies. And so this is why I
[01:05:12] say that trying to tie the economic
[01:05:13] situation to the pathologies, I think, in
[01:05:15] some cases, and in most cases actually,
[01:05:17] can be a fool's errand. But we'll have
[01:05:19] to save that for next time because
[01:05:21] here's the deal before we leave folks.
[01:05:22] Our biggest and best sale of the year is
[01:05:24] happening right this very instant like
[01:05:26] at this moment while you're listening to
[01:05:28] us. All DailyWire Plus annual
[01:05:30] memberships are 50% off. You get
[01:05:32] everything. You get access to the DW
[01:05:34] library of movies, documentaries, Matt's
[01:05:36] documentaries mostly is what we're
[01:05:37] talking about there because those are
[01:05:38] the best ones that have ever been made
[01:05:39] and series that stand for the ideals
[01:05:40] that keep America free. And that of
[01:05:42] course includes the Pen Dragon Cycle
[01:05:44] Rise of the Merlin. It is coming January
[01:05:46] 22nd. All Access members get
[01:05:48] access to episodes one and two one month
[01:05:50] early, on Christmas Day, which is a bit
[01:05:51] of a sweetener for you there. You
[01:05:53] empower DW Plus to build culture, defend
[01:05:55] values, launch stories that ensure your
[01:05:57] voice and your values shape the future
[01:05:58] of the United States. Whether you want
[01:06:00] to join or give the gift of a DW
[01:06:02] membership to someone, now is the time
[01:06:03] to do it at 50% off. It is our best deal
[01:06:06] of the year. You can head on over to
[01:06:08] dailywire.com/subscribe.
[01:06:10] We will all be very happy to see you
[01:06:12] over there. Well, in just a moment, we
[01:06:14] are going to bring you the magical
[01:06:16] mystical trailer for finally the Pen
[01:06:19] Dragon Cycle, Rise of the Merlin, is
[01:06:21] coming January 22nd. Guys, thanks for
[01:06:23] stopping by. We will see you here
[01:06:25] hopefully never for the rest of our... but
[01:06:26] actually, [laughter] we will see you
[01:06:27] here in a couple of weeks and we'll get
[01:06:28] together and disagree in friendly
[01:06:30] fashion on friendly fire with one
[01:06:32] another. Without further ado, here's the
[01:06:34] trailer.
[01:06:38] All of this is an illusion, an echo of a
[01:06:41] voice that has died.
[01:06:44] And soon that echo will cease.
[01:06:53] [music]
[01:06:57] They say that Merlin is mad.
[01:07:04] They say he was a king in Dyfed,
[01:07:08] the son of a princess of lost Atlantis.
[01:07:11] They say the future and the past are
[01:07:14] known to him. That the fire and the wind
[01:07:18] tell him their secrets. That the magic
[01:07:20] of the hill folk and druids come forth
[01:07:23] at his easy [music] command.
[01:07:26] They say he slew hundreds. Hundreds. Do
[01:07:31] you hear that the world burned and
[01:07:33] trembled at his wrath? [screaming]
[01:07:38] >> The Merlin died long before you and I
[01:07:41] were born.
[01:07:44] >> Merlin Emrys has returned to the land
[01:07:47] of the living.
[01:07:50] >> Vortigern [music] is gone. Rome is gone. The
[01:07:53] Saxon is here.
[01:07:56] The Saxon Hengist has assembled the
[01:07:58] greatest war host ever seen in the
[01:07:59] Island of the Mighty. And before the
[01:08:01] summer is through, he means to take the
[01:08:03] throne,
[01:08:05] and he will have it. If we are too busy
[01:08:08] squabbling amongst ourselves to take up
[01:08:10] arms against him, here is your hope. A
[01:08:13] king will arise to hold all Britain in
[01:08:16] his hand. A high king who will be the
[01:08:19] wonder of the world.
[01:08:21] you
[01:08:24] >> to a future of peace.
[01:08:27] [groaning]
[01:08:28] >> There'll be no peace in these lands till
[01:08:30] we are all dust.
[01:08:32] >> Men of the Island of the Mighty.
[01:08:35] YOU STAND TOGETHER.
[01:08:36] [screaming and groaning]
[01:08:37] >> We stand as Britons.
[01:08:40] We stand as one.
[01:08:44] >> Great darkness is falling upon this
[01:08:46] land.
[01:08:48] These brothers are our only hope to
[01:08:50] stand against it.
[01:08:53] >> Not our only hope.
[01:08:55] >> They say Merlin slew 70 men with his own
[01:08:58] hands.
[01:08:59] Like Cath he slew 500.
[01:09:04] >> No man is capable of such a thing. No
[01:09:07] mortal man.