Is Your Phone Listening? Expert Reveals Every Secret to Protect Your Online Privacy
[00:00:04] you've dedicated your life to preserving
[00:00:08] privacy. So let's just start big
[00:00:11] picture. What is privacy and why is it
[00:00:14] important?
[00:00:15] >> So I believe that privacy is core to
[00:00:20] freedom at the end of the day. I would
[00:00:22] even go as far as saying that it is
[00:00:24] synonymous with freedom. And it is
[00:00:28] protecting you, protecting your inner
[00:00:32] core essentially, protecting your
[00:00:35] identity as a human being from
[00:00:38] forces that don't want you to be an
[00:00:41] individual and a human being at the end
[00:00:43] of the day. And so
[00:00:45] >> So that was so nicely put. I think
[00:00:47] what it really boils down to,
[00:00:52] and in that regard I think privacy
[00:00:54] is relatively similar to what was
[00:00:57] originally intended with the Second
[00:00:59] Amendment in the United States:
[00:01:02] it is a tool for you as a human being to
[00:01:05] protect yourself against coercive force,
[00:01:08] against your very soul, your inner
[00:01:11] core.
[00:01:12] So there are forces, and this has always
[00:01:15] been true at every time in history, that
[00:01:17] seek to make people less human, to turn
[00:01:20] human beings into slaves or animals or
[00:01:23] objects, and privacy is the thing that
[00:01:26] prevents that. So the crazy
[00:01:30] principle that exists within this
[00:01:32] universe is that there's this asymmetry
[00:01:35] baked right into the very fabric that we
[00:01:38] exist in. There are certain mathematical
[00:01:41] problems where the effort required to
[00:01:44] undo them isn't just scaling linearly or
[00:01:47] exponentially, but scales so
[00:01:49] violently that the universe itself
[00:01:51] prohibits persons that don't have access,
[00:01:54] don't have permission, from undoing this
[00:01:56] mathematical problem; they literally
[00:01:58] cannot do it. So what that means is
[00:02:00] that with a very small amount of
[00:02:03] energy, a minuscule amount of energy, a
[00:02:06] laptop, a battery and a few milliseconds
[00:02:08] of computation, you can create a secret
[00:02:11] that not even the strongest imaginable
[00:02:15] superpower on earth is able to recover
[00:02:17] without your explicit granting of
[00:02:21] access. That is the fundamental
[00:02:24] principle on top of which
[00:02:26] encryption, cryptography and privacy in
[00:02:28] the modern age are built. And it's
[00:02:31] so fascinating that the universe itself
[00:02:33] allows for this computational asymmetry
[00:02:36] where I can create a secret. I can
[00:02:39] encrypt something. I can make something
[00:02:41] hidden, and you, with the most powerful
[00:02:44] imaginable coercive force, with violence,
[00:02:47] you could imagine continent-sized
[00:02:50] computers running for the entire
[00:02:52] lifespan of the universe, you would not
[00:02:54] be able to apply that force to my secret,
[00:02:58] because I have encrypted it, and the
[00:03:00] universe inherently sort of smiles
[00:03:03] upon encryption and appreciates
[00:03:05] that. So I always found it so
[00:03:07] intoxicating, this concept that this
[00:03:09] is inherently baked into the universe.
[00:03:12] It is an interaction between mathematics
[00:03:14] and physics, sort of, and is a
[00:03:17] fundamental property, just like you
[00:03:20] could say nuclear weapons are a
[00:03:23] fundamental property of reality, right?
[00:03:25] And so encryption and privacy exist in
[00:03:29] this reality, and before
[00:03:32] we as humans figured that out, that
[00:03:35] wasn't necessarily clear, right? It
[00:03:37] could also be that you can never hide
[00:03:39] something, encrypt something, keep
[00:03:41] something to yourself. But it turns
[00:03:43] out you actually can. And so that is
[00:03:46] fascinating, I think. And what it
[00:03:49] conceptually allows you to do is to take
[00:03:51] something and move it into a different
[00:03:54] realm, the encrypted realm, right? And
[00:03:57] if someone else wants to go into that
[00:04:00] realm, follow you there. Um, they would
[00:04:03] need unlimited resources to do so. And I
[00:04:07] would say that's what really got me into
[00:04:09] cryptography and privacy.
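The computational asymmetry he describes, trivial to create a secret, astronomically hard to reverse, can be sketched with a one-way hash function in a few lines of Python (a toy illustration of the general principle, not any particular company's technology):

```python
import hashlib
import secrets
import time

# Creating a secret: milliseconds of computation on ordinary hardware.
start = time.perf_counter()
secret = secrets.token_bytes(32)                 # 256 bits of randomness
digest = hashlib.sha256(secret).hexdigest()      # publish only the hash
elapsed_ms = (time.perf_counter() - start) * 1000

# Reversing it: an attacker who sees only `digest` must search the full
# keyspace, no matter how much hardware they control.
search_space = 2 ** 256

print(f"secret created and hashed in {elapsed_ms:.3f} ms")
print(f"work to brute-force it: up to {search_space:.2e} guesses")
```

Even the "continent-sized computers running for the entire lifespan of the universe" he mentions could not enumerate 2^256 guesses, which is exactly the asymmetry at work.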
[00:04:11] >> Okay. I'm having all kinds of
[00:04:13] realizations simultaneously. First of
[00:04:14] all, that you're an extraordinary
[00:04:17] person. I think that's clear after
[00:04:20] three minutes. Okay. Who are you? Where
[00:04:23] are you from? And are you ready to
[00:04:25] suffer for
[00:04:28] your ideas? Because what you've just
[00:04:30] articulated is the most subtle
[00:04:33] but direct possible challenge to global
[00:04:36] authority anyone could ever articulate.
[00:04:39] But first, how did you come to
[00:04:41] this? Where are you from? Tell us about
[00:04:42] yourself for just a moment.
[00:04:44] >> So, I was born in Germany. I'm 25
[00:04:47] years old, and
[00:04:52] originally, in my
[00:04:56] life, I studied law, and then later I
[00:04:59] studied mathematics and computer science,
[00:05:01] and then at some point I met a few
[00:05:05] people who also had these kinds of ideas
[00:05:08] about privacy technology, distributed
[00:05:11] technology, decentralization,
[00:05:13] and we then decided to found a
[00:05:16] company that builds this kind of
[00:05:17] technology, and that's how I ended up
[00:05:20] here, I guess.
[00:05:22] >> So you're German. You're
[00:05:24] a product of Europe and European culture,
[00:05:26] which, for all of its
[00:05:29] wonderful qualities, they built the
[00:05:30] world, I love Europe and the culture, but
[00:05:32] it's not a privacy culture.
[00:05:34] >> It is not.
[00:05:35] >> No. So especially as a German, how did
[00:05:38] you, why did you come to this conclusion
[00:05:40] when all of your neighbors didn't?
[00:05:42] >> So I think it's interesting, right? If
[00:05:44] you view privacy as this
[00:05:47] inherent, yeah, political thing that
[00:05:50] protects you as a human being,
[00:05:54] there are data protection laws, GDPR,
[00:05:57] right? There are fines against
[00:05:59] surveillance-capitalist tech giants in
[00:06:01] Europe. But as you said, I feel like
[00:06:04] most of that stuff is a charade. It's
[00:06:06] not really about protecting your
[00:06:09] privacy, and we are seeing that in
[00:06:12] the UK, in the European Union. I mean,
[00:06:14] there are so many cases that
[00:06:16] have already made significant
[00:06:18] movements this year. So I
[00:06:21] would say, for me personally,
[00:06:26] it has really been this
[00:06:28] technological and mathematical
[00:06:30] understanding of the power of this
[00:06:32] technology. So, realizing this,
[00:06:36] realizing that the universe allows us to
[00:06:40] do these things and sort of
[00:06:42] has this built right into it,
[00:06:46] got me so fascinated that I really
[00:06:48] thought deeply about this. And what I
[00:06:50] realized is that what
[00:06:52] humans have done in the past is that
[00:06:54] they've allowed information, right, any
[00:06:58] type of information that we now share
[00:06:59] with our mobile surveillance devices,
[00:07:02] to be encrypted and
[00:07:05] put at rest somewhere securely. Right,
[00:07:08] that is how encryption has mainly been
[00:07:10] used. Or to do things like
[00:07:14] Signal is doing, where we do end-to-end
[00:07:16] encrypted messaging, right, where we are
[00:07:18] able to send some message from one human
[00:07:21] being to another via
[00:07:25] some untrusted channel, right, where
[00:07:27] there can be interceptors that
[00:07:29] try to get those messages, but thanks to
[00:07:31] mathematics we are able to send this
[00:07:33] message across the whole universe and it
[00:07:36] arrives at the endpoint with no
[00:07:38] intermediary being able to take a look
[00:07:41] at the message, because of this inherent
[00:07:43] property of the universe. What I
[00:07:44] realized, sort of, has been that there's a
[00:07:48] missing piece, which is: whenever we are
[00:07:50] accessing this information, whenever we
[00:07:52] are interacting with this information,
[00:07:54] whenever we want to utilize it, basically
[00:07:57] we have to decrypt it again, which
[00:07:59] then makes it accessible to whoever
[00:08:03] takes a look at it, right, whoever runs
[00:08:05] the machine that you decide to
[00:08:08] put that data on, which can be AWS, which
[00:08:11] can be cloud providers, big data, big AI
[00:08:13] >> exactly, whoever, right? And so this
[00:08:16] idea that I had was what if we can take
[00:08:19] this asymmetry that is a fact of reality
[00:08:23] and move that to computation itself to
[00:08:26] enable that all of those computations
[00:08:28] can be executed in private as well and
[00:08:32] then we can do some amazing things. Then
[00:08:34] the two of us can decide to compute
[00:08:36] something together. Not just exchange
[00:08:38] information via some secure
[00:08:40] communication channel, but actually
[00:08:42] perform some mathematical function over
[00:08:44] something, produce an output from some
[00:08:47] inputs, but we can keep those inputs to
[00:08:50] ourselves. So Tucker has a secret, Yanik
[00:08:53] has a secret. And with this technology,
[00:08:56] we can produce some value, some
[00:08:58] information.
[00:09:00] While you don't have to share your
[00:09:02] secret, I don't have to share my secret.
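The "compute a joint result without either side revealing its input" idea can be sketched with additive secret sharing, one of the basic building blocks of secure multi-party computation (a minimal illustration with made-up example values, not the company's actual protocol):

```python
import secrets

# Additive secret sharing over a prime field: each input is split into
# random-looking shares, so no single share reveals anything about it.
P = 2 ** 61 - 1  # a Mersenne prime used as the modulus (illustrative choice)

def share(value, n=2):
    """Split `value` into n shares that sum to it modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Tucker and Yanik each hold a private input.
tucker_secret = 1000
yanik_secret = 234

t_shares = share(tucker_secret)
y_shares = share(yanik_secret)

# Each party locally adds the shares it holds; combining the partial
# results yields the sum, yet neither raw input was ever exchanged.
partial = [(t_shares[i] + y_shares[i]) % P for i in range(2)]
joint_sum = sum(partial) % P

assert joint_sum == tucker_secret + yanik_secret
print("joint result:", joint_sum)  # 1234, computed without revealing the inputs
```

Real MPC protocols extend this to multiplication and arbitrary functions, and to many parties, but the core trick is the same: work on shares, never on the secrets themselves.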
[00:09:04] And we can scale that to enormous sizes
[00:09:06] where the entirety of humanity can do
[00:09:08] those things, where countries can do
[00:09:10] those things. But importantly, at its
[00:09:12] core, what we're doing is
[00:09:14] implementing this asymmetry
[00:09:17] that exists within the universe and
[00:09:19] bringing that to the next level, to its
[00:09:21] final form, sort of. And that's how I
[00:09:23] ended up founding Arcium. Yeah, getting
[00:09:25] older can make you realize you don't
[00:09:27] actually want all the things you have.
[00:09:29] That's why mini storage is so big.
[00:09:31] Consumerism kind of loses its appeal.
[00:09:33] What you really want is peace, peace of
[00:09:36] mind. And what's the best way to get
[00:09:37] that? Well, keeping your home and family
[00:09:39] protected would be at the top of the
[00:09:40] list. Enter SimpliSafe. This month, you
[00:09:43] get a 50% discount on your first
[00:09:45] SimpliSafe system. SimpliSafe takes a
[00:09:49] much better approach to home security.
[00:09:52] The idea is, how about we stop people,
[00:09:54] invaders, before they come into the
[00:09:56] house, not just trying to scare them
[00:09:58] once they're already inside your house.
[00:10:00] And they do that with cameras backed by
[00:10:02] live agents who keep watch over your
[00:10:04] property. If someone's lurking outside,
[00:10:06] they will tell the person, "Get out of
[00:10:08] here." And then they'll call the police
[00:10:10] if they don't. 60-day satisfaction
[00:10:12] guarantee or your money back. So, there
[00:10:15] really is no risk here. Not
[00:10:17] surprisingly, SimpliSafe has been named
[00:10:18] America's best home security system for
[00:10:20] 5 years running. Protect your home
[00:10:22] today. Enjoy 50% off a new SimpliSafe
[00:10:25] system with professional monitoring at
[00:10:27] simplisafe.com/tucker.
[00:10:29] simplisafe.com/tucker.
[00:10:33] There is no safe like SimpliSafe. I
[00:10:36] can't think of a more virtuous project.
[00:10:39] And you said it in the first minute:
[00:10:41] the point of the project is to
[00:10:43] preserve humanity, to keep human beings
[00:10:45] human, so that they're not just objects
[00:10:48] controlled by larger forces. They're
[00:10:50] human beings with souls.
[00:10:52] >> And again, I don't think there's
[00:10:54] any more important thing that you could
[00:10:56] be doing with your life. So, thank you
[00:10:58] for that. Can you be more specific about
[00:11:01] our current system and how it doesn't
[00:11:04] protect privacy?
[00:11:06] >> Yes. So I would say
[00:11:11] I think there's
[00:11:14] a lot to unravel if we
[00:11:17] take a look at the systems that
[00:11:20] we are interacting with every single
[00:11:22] day, those tools and
[00:11:25] applications, those social media
[00:11:28] networks, basically everything that we do
[00:11:30] in our digital lives. And all of our
[00:11:32] lives have basically shifted from
[00:11:34] physical reality to this digital
[00:11:36] world, so everything we basically do,
[00:11:40] everything we do in this room,
[00:11:41] everything we do when we are out in the
[00:11:43] street, because all of the technology
[00:11:45] has become part of physical reality,
[00:11:48] has been consumed, sort of. And so all
[00:11:51] of this has been built on top of what
[00:11:54] the former Harvard professor Shoshana
[00:11:56] Zuboff has called surveillance
[00:11:58] capitalism, right?
[00:12:00] really lies at the core. And it's
[00:12:03] relatively straightforward to understand
[00:12:05] what those companies are doing. If you
[00:12:08] ask yourself,
[00:12:09] hey, why is this application that I'm
[00:12:12] using actually free, right? Why is
[00:12:14] nobody charging me to ask this super
[00:12:18] intelligent chatbot questions every day?
[00:12:22] Why are they building data centers for
[00:12:24] trillions of dollars while I don't have
[00:12:27] to pay anything for it? Right? So
[00:12:28] that's the question that you need to ask
[00:12:30] yourself, right? And what you end up
[00:12:33] realizing is that all of those systems
[00:12:35] are basically built as rent
[00:12:38] extraction mechanisms where you,
[00:12:42] as a user, are not really a
[00:12:44] user; you're sort of a subject of those
[00:12:46] platforms. You
[00:12:50] are being used to extract value
[00:12:53] without you noticing, and they're able
[00:12:55] to extract value from you because all of
[00:12:58] your behavior, all of your interactions
[00:13:00] with those systems, are being taken,
[00:13:04] and they perform mass surveillance, bulk
[00:13:08] surveillance. And it's those
[00:13:09] companies, right? We're just talking
[00:13:11] about companies. We're not even talking
[00:13:12] about intelligence or or governments or
[00:13:14] anything. We're just talking about those
[00:13:16] those companies that exist um within our
[00:13:18] our economy. And so they record
[00:13:21] everything they can because every single
[00:13:23] bit of information that I can take from
[00:13:26] your behavior allows me to predict your
[00:13:28] behavior. And where I can predict
[00:13:29] your behavior, I can utilize that, in
[00:13:34] the most simple case, to do something like
[00:13:35] serving you ads, right? But in more
[00:13:38] complex cases, I can do things like
[00:13:40] steer your behavior. I can literally
[00:13:43] control you. I can turn you into a
[00:13:45] puppet that does whatever I want. And
[00:13:48] so those are the systems that we are
[00:13:50] faced with right now. And the internet
[00:13:52] has sort of been this amazing
[00:13:54] emancipator for humanity, right?
[00:13:57] This show is only possible because of
[00:13:59] the internet. Otherwise, with
[00:14:01] traditional media, we wouldn't be able
[00:14:03] to speak about those topics, I feel like.
[00:14:05] >> That's right.
[00:14:06] >> But at the same time, nowadays it
[00:14:08] has transformed into one of the
[00:14:12] biggest threats to human civilization.
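The "record everything to predict your behavior" mechanic he describes can be illustrated with a toy first-order model over a made-up click log (hypothetical data and a deliberately trivial model; real profiling systems are vastly more elaborate):

```python
from collections import Counter, defaultdict

# A made-up log of one user's actions, the kind of trail surveillance
# platforms accumulate. Even this tiny record supports prediction.
log = ["news", "video", "shop", "news", "video", "shop", "news", "video"]

# Count which action tends to follow which (a first-order Markov model).
transitions = defaultdict(Counter)
for current, nxt in zip(log, log[1:]):
    transitions[current][nxt] += 1

def predict(action):
    """Most frequent action observed to follow `action` in the log."""
    return transitions[action].most_common(1)[0][0]

print(predict("news"))   # "video": the log always shows video after news
print(predict("video"))  # "shop"
```

The point of the sketch is that prediction falls out of mere recording; once a platform can predict the next action, steering it (by choosing what to show) is a small additional step.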
[00:14:16] >> At the user level, my level, the
[00:14:19] level of the iPhone owner: is it
[00:14:23] possible to communicate privately, with
[00:14:25] assurance of privacy, with another
[00:14:26] person?
[00:14:28] >> That's an interesting question. So we
[00:14:30] start with this concept of insecure
[00:14:32] communication channels.
[00:14:35] And since every communication channel is
[00:14:38] insecure, what we employ is end-to-end
[00:14:40] encryption. And end-to-end encryption
[00:14:43] allows us to take this information, take
[00:14:46] a message, and
[00:14:50] lock it securely so that only Tucker and
[00:14:52] Yanik are able to unlock it and see
[00:14:55] what's going on, and that is a fact. So
[00:14:57] there have been many
[00:14:59] cases where big players with
[00:15:02] big interests, I guess, have
[00:15:04] attempted to undermine cryptography,
[00:15:06] have attempted to get rid of end-to-end
[00:15:08] encryption, to install backdoors.
[00:15:10] There have been what are commonly called
[00:15:13] the crypto wars in the 1990s, right, where
[00:15:16] the cypherpunks fought for the right
[00:15:19] to publish open-source encryption and
[00:15:22] cryptography, and many, many more
[00:15:25] cases, I guess. But at the end of the day,
[00:15:26] I would say, as a realistic assessment,
[00:15:29] this kind of cryptography is secure and
[00:15:31] it works. Now, that unfortunately is not
[00:15:35] the whole answer because what we have to
[00:15:36] think about is now what happens with
[00:15:39] those end devices right
[00:15:41] >> fair.
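The end-to-end encryption he says "is secure and works" rests on agreeing on a key over an untrusted channel. A toy Diffie-Hellman exchange shows the idea (deliberately small parameters for readability; real messengers such as Signal use elliptic-curve variants of this exchange):

```python
import secrets

# Public parameters: a prime modulus and a generator. 2**64 - 59 is prime,
# but far too small for real security; it only keeps the demo readable.
p = 2 ** 64 - 59
g = 2

# Each side keeps a private exponent and publishes only g^x mod p.
a = secrets.randbelow(p - 2) + 2     # Tucker's private value (never sent)
b = secrets.randbelow(p - 2) + 2     # Yanik's private value (never sent)
A = pow(g, a, p)                     # sent over the insecure channel
B = pow(g, b, p)                     # sent over the insecure channel

# Both sides derive the same shared secret; an eavesdropper who saw only
# A and B faces the discrete-logarithm problem to recover it.
shared_tucker = pow(B, a, p)
shared_yanik = pow(A, b, p)
assert shared_tucker == shared_yanik
print("shared secret established without ever transmitting it")
```

That shared secret then keys a symmetric cipher for the actual messages, so the interceptors he mentions see only A, B, and ciphertext, which is why attackers go after the end devices instead, as he explains next.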
[00:15:41] >> I mean, the message,
[00:15:43] right, that is being sent from Yanik to
[00:15:46] Tucker might be secure, but now, if I
[00:15:49] cannot undermine and apply force to
[00:15:52] this message to understand
[00:15:55] what's inside, well, I'm just going
[00:15:57] to apply force to your phone, and that's
[00:16:00] sort of what's happening. So when we
[00:16:04] look at different applications, for sure
[00:16:06] there is a whole variety of
[00:16:09] messaging applications, right, that do
[00:16:11] not employ encryption and
[00:16:16] security standards and might collect all
[00:16:18] of your messages and images and utilize
[00:16:21] them for those machines, right, that
[00:16:23] extract as much value as possible
[00:16:25] from you. But there are applications
[00:16:27] like Signal that don't do that, that are
[00:16:29] actual open-source cryptographic
[00:16:31] technology that anyone can verify
[00:16:33] themselves, take the code and
[00:16:37] turn it into an actual application, and
[00:16:39] install it on their phone. All of those
[00:16:40] things are possible, right? So, that's
[00:16:42] not the issue. The underlying issue
[00:16:43] really is that you have this device in
[00:16:46] your hand that is sort of closed
[00:16:50] hardware. You don't know how that
[00:16:52] thing works, right? It is impossible to
[00:16:54] understand how that thing works. It is
[00:16:55] impossible to understand how the
[00:16:58] operating system on that thing works.
[00:17:01] And there are flaws in those systems,
[00:17:03] right? Those are closed systems. There are
[00:17:05] flaws in those systems, for some reason,
[00:17:08] because people don't always have the
[00:17:11] best interests of others in mind,
[00:17:12] but also because people make
[00:17:15] mistakes, right? Honest mistakes
[00:17:16] that are non-malicious. And so I
[00:17:19] think that in general also speaks to
[00:17:20] the importance of freely accessible
[00:17:23] hardware that people with technical
[00:17:26] skills can play around with and
[00:17:28] find issues in. But at its core, what
[00:17:32] you're being subjected to right now, I
[00:17:34] would say, is tactical surveillance.
[00:17:37] What that means is that there's some
[00:17:39] actor, it can be some state actor, can be
[00:17:42] someone else, that decides that Tucker
[00:17:45] Carlson is worth surveilling.
[00:17:49] >> I think that has been decided. Yeah,
[00:17:52] >> I think so, too. Yeah, I'm getting that
[00:17:53] sense.
[00:17:55] So, tactical surveillance means
[00:17:57] that you specifically are being targeted,
[00:17:59] and that is in contrast to strategic
[00:18:02] surveillance, which is this idea that
[00:18:05] everyone is being surveilled: let's just
[00:18:07] surveil everyone, collect every single
[00:18:09] bit of information, store that for
[00:18:12] the entirety of human history, and then
[00:18:14] someday maybe we'll be able to use it.
[00:18:16] Right? So those are the two concepts,
[00:18:18] and
[00:18:20] what we've seen over the last few
[00:18:22] years is sort of a shift away from
[00:18:24] tactical surveillance towards strategic
[00:18:26] surveillance. And surveillance
[00:18:30] capitalism has really helped this
[00:18:32] concept, because there's so much data
[00:18:34] that is being logged that can be stored.
[00:18:36] There are so many new devices and
[00:18:38] applications that can be employed.
[00:18:41] And so we see pushes like, for example,
[00:18:43] chat control within the European Union,
[00:18:45] which is sort of an effort
[00:18:48] to implement backdoors within all of
[00:18:50] the messenger applications, to be able to
[00:18:53] scan your applications, to scan
[00:18:55] your messages, to take your messages
[00:18:57] somewhere else and decide whether or not
[00:18:59] those people like what you're saying
[00:19:01] within your private messages. So I
[00:19:04] would say, in general, as a normal
[00:19:06] human being with your iPhone, you are
[00:19:09] still able to privately communicate.
[00:19:11] That is still something that
[00:19:13] exists. However, this ability has
[00:19:17] been greatly limited: if there is someone
[00:19:20] who wants to see your messages, I
[00:19:22] would say they can, unfortunately.
[00:19:25] >> How difficult is it for a determined, say,
[00:19:28] state actor, an intel agency, to say, I
[00:19:31] want to read this man's communications,
[00:19:34] listen to his calls, watch his videos,
[00:19:36] read his texts. How hard is it for them
[00:19:38] to do that?
[00:19:41] >> So, I think we
[00:19:44] can look at different court cases
[00:19:46] that have publicly emerged in regards to
[00:19:48] Apple, for example, right, where Apple
[00:19:52] has refused to give intelligence agencies
[00:19:54] backdoor access to their devices.
[00:19:54] And what's so important about this
[00:19:57] discussion that we are having here is
[00:19:59] that every time you're
[00:20:01] building a system where you add backdoor
[00:20:04] access, so that someone in the future can
[00:20:08] decide to get access and take a look
[00:20:10] at what you're writing, right, what that
[00:20:12] invites is for everyone to do that,
[00:20:14] because a backdoor inherently is a
[00:20:16] security flaw in the system. And
[00:20:19] it's not just some specific intelligence
[00:20:21] agency that decides to read your
[00:20:23] messages, right? It's every intelligence
[00:20:25] agency on earth at that point.
[00:20:28] And so that's why, as a nation, you
[00:20:31] cannot weaken security by getting rid
[00:20:34] of privacy without weakening your entire
[00:20:37] economy, cybersecurity, and social
[00:20:41] fabric at the end of the day, right? And
[00:20:42] the whole strategic positioning of you
[00:20:44] as a nation.
[00:20:47] How difficult it is, I
[00:20:50] would say, from a practical
[00:20:53] operational-security standpoint, depends
[00:20:56] on what you are doing with your phone,
[00:20:59] right? Is your phone a strict device
[00:21:01] that is only used for messaging, or is
[00:21:03] your phone also handling different types of
[00:21:06] media? Are you sending me images? Are
[00:21:08] you receiving messages? So I think
[00:21:11] two years ago there was this case
[00:21:13] where there was a zero-day exploit
[00:21:17] being used across Apple devices,
[00:21:20] because when I sent you an image and
[00:21:23] your messenger had auto-download on, I
[00:21:27] could get full access to your phone
[00:21:29] by sending you a message, and you're
[00:21:32] probably not even my contact, right? I
[00:21:35] just figure out what your phone number
[00:21:36] is, I send you an image, the image gets
[00:21:39] automatically downloaded, some malicious
[00:21:41] code that I have injected gets executed,
[00:21:44] and now I own your phone and I can do
[00:21:46] whatever I want. And then end-to-end encryption
[00:21:48] doesn't help you, right, because I have
[00:21:50] literal access to the end device that
[00:21:52] decrypts this information. And so that's
[00:21:54] very dangerous. That has been fixed,
[00:21:56] but I think what it really highlights
[00:21:58] is that complexity is the issue here. So,
[00:22:01] complexity in the kinds of applications
[00:22:04] that you're running, complexity in the
[00:22:05] underlying operating system that this
[00:22:08] device has, all of that complexity
[00:22:10] invites mistakes and also malicious
[00:22:13] security flaws to be installed in those
[00:22:15] systems.
[00:22:16] >> Um, of course, yeah,
[00:22:17] >> human organizations are the same way.
[00:22:18] The bigger they are, the easier they are
[00:22:20] to subvert.
[00:22:21] >> Yes, of course.
[00:22:22] >> Yeah.
[00:22:22] >> February is the perfect month to get
[00:22:24] cozy cuz it's chilly outside. Our
[00:22:26] partners at Cozy Earth understand this
[00:22:28] and they're helping Americans everywhere
[00:22:30] stay toasty throughout the frigid
[00:22:32] winter. We hope you're seated because
[00:22:34] this detail may shock you. Cozy Earth
[00:22:36] offers bamboo pajamas. Lightweight,
[00:22:40] shockingly soft, these pajamas are a
[00:22:42] true upgrade. They sleep cooler than
[00:22:43] cotton. Plus, they're made out of
[00:22:45] bamboo. That is just wild and awesome.
[00:22:48] From pajamas and blankets to towels and
[00:22:50] sheets, Cozy Earth has something unusual
[00:22:52] and great for everybody, and it's
[00:22:53] entirely risk-free. You get a 100-night
[00:22:56] sleep trial, 10-year warranty. There is
[00:22:58] no downside that we can see. So, share
[00:23:01] love this February. Wrap yourself or
[00:23:03] someone you care for in comfort that
[00:23:05] feels special. Bamboo pajamas. Visit
[00:23:08] cozyearth.com. Use the code tucker for
[00:23:12] 20% off. That's code Tucker for up to
[00:23:14] 20% off. And if you get a post-purchase
[00:23:16] survey, make certain to mention you
[00:23:18] heard about Cozy Earth from us.
[00:23:20] >> So that's, I mean, that's a very
[00:23:22] simple thing, to send someone,
[00:23:25] to text him an image, and all
[00:23:26] of a sudden you have control of his
[00:23:28] phone. I think we can be fairly
[00:23:29] confident that
[00:23:31] people who have adversaries are being
[00:23:33] surveilled, right?
[00:23:34] >> Yes, I think so. I would say that
[00:23:38] tactical surveillance really is
[00:23:40] something that exists. I would say,
[00:23:44] in this battle for privacy, it is
[00:23:46] actually not the most important
[00:23:49] thing to focus on, right? Because
[00:23:52] this kind of tactical surveillance,
[00:23:54] sort of, I feel like, to a certain degree,
[00:23:56] we need to accept, unfortunately. Right,
[00:23:59] not the tactical surveillance that says
[00:24:01] Tucker Carlson is a journalist, I don't
[00:24:03] like that, let me surveil him, right,
[00:24:05] that's not the kind of tactical
[00:24:06] surveillance I'm speaking of. But if
[00:24:08] we have legal procedures and
[00:24:11] actual judicial warrants in place, right,
[00:24:14] I feel like as a society we could accept
[00:24:16] that, to combat
[00:24:19] criminal activity. We can definitely
[00:24:21] accept that, of course.
[00:24:22] >> But the fundamental issue really is, and
[00:24:24] that's sort of so ironic,
[00:24:27] right, that all of the surveillance
[00:24:29] needs to operate under
[00:24:31] secrecy in order to function, right?
[00:24:34] You should not know that you're being
[00:24:36] surveilled. Nobody sort of has oversight.
[00:24:38] Not even the democratic processes
[00:24:41] are able to have oversight, because it's
[00:24:43] all wrapped in secrecy. So that
[00:24:46] really brings us to the fundamental
[00:24:47] issue here, also with strategic
[00:24:49] surveillance, surveilling everyone, just
[00:24:51] deciding, well, I'll take a look at
[00:24:54] everyone's phone, store everything, and
[00:24:56] maybe I don't like someone in the future,
[00:24:58] then I have this backlog of information.
[00:25:00] So the important question to consider
[00:25:03] here is: is there even
[00:25:08] a future where, from a legal standpoint,
[00:25:10] it is possible to implement procedures
[00:25:13] that guarantee that there is no secret
[00:25:16] surveillance in place? I think
[00:25:19] the answer is pretty clear to that
[00:25:21] question.
[00:25:22] >> And it is?
[00:25:25] >> I think it is not. So I think it is important
[00:25:27] to have these laws in place,
[00:25:29] right, that prohibit
[00:25:31] surveillance and that enable
[00:25:34] different kinds of processes with
[00:25:36] warrants, right? Literally the Fourth
[00:25:38] Amendment, right, allowing for
[00:25:41] that to be implemented in the
[00:25:43] 21st century. But what we've seen,
[00:25:48] sort of, is that the tools that
[00:25:50] governments have access to are so
[00:25:54] powerful that it is impossible to
[00:25:59] make a law that prohibits use of them,
[00:26:01] because whoever, within a centralized
[00:26:04] architecture, and that's always the case,
[00:26:06] has access to this technology
[00:26:09] basically becomes a single point of
[00:26:11] failure, and that single point of failure
[00:26:13] will necessarily
[00:26:15] be corrupted by the power that
[00:26:18] exists.
[00:26:19] >> Just a couple
[00:26:22] obvious lowbrow technical questions. Is
[00:26:24] the iPhone safer than the Android or
[00:26:27] less?
[00:26:28] >> That's a good question. So,
[00:26:31] I would say a huge advantage that
[00:26:34] Android devices bring to the table,
[00:26:36] right, is the nature of, I guess,
[00:26:40] a subset of those devices, right, not
[00:26:42] speaking for the entirety, but
[00:26:43] the operating system, for example,
[00:26:45] being publicly verifiable by anyone, right? You
[00:26:48] can understand it, and I think that is so
[00:26:51] important, not just for security but
[00:26:53] also for technological innovation, and
[00:26:56] so I would say that is a huge
[00:26:58] advantage. Now, the devices are
[00:27:01] manufactured by some manufacturer, who
[00:27:04] you need to trust at the end of the day,
[00:27:05] based on how the hardware is built
[00:27:09] and how the firmware is compiled and
[00:27:11] then put on your device. So, there have
[00:27:14] been interesting operating
[00:27:16] systems. I think there's one called
[00:27:17] GrapheneOS, which is a secure, open-
[00:27:21] source operating system as far as I
[00:27:23] know. I haven't looked too deeply into
[00:27:25] that, but on an Android device you could
[00:27:27] theoretically say, I'm going to run my
[00:27:29] own operating system on this, which I
[00:27:31] think is a strong value
[00:27:32] proposition. Now, I myself am an
[00:27:35] Apple user. There is also a sort of
[00:27:38] element of institutional trust involved
[00:27:41] here, right, where you say, okay, I trust
[00:27:43] the manufacturing and software
[00:27:45] process that this company has.
[00:27:48] But in general, if I'm being honest, if
[00:27:51] I weren't lazy, right, what I'd be
[00:27:54] doing is I would actually be looking for
[00:27:58] a minimalistic, secure, open-source
[00:28:01] operating system for my mobile phone, and
[00:28:03] I would build that myself, get some
[00:28:06] hardware, and put it on
[00:28:09] there. So, I would say that would be
[00:28:11] the smartest thing to do if you are
[00:28:13] technically versatile.
[00:28:14] >> I read that you use an iPad, not a Mac. Is there an advantage?
[00:28:18] >> That's what I did back in the day when I started, yeah.
[00:28:20] >> Is there an advantage to the iPad over the Mac from a privacy standpoint?
[00:28:25] >> I think what it boils down to there is what kind of applications can be installed on your system. I would say that, in general, devices like the iPhone or the iPad operate in a more sandboxed way, where applications are actually isolated, rather than how it works on operating systems like macOS or Windows, where you could compromise the entire system much more easily. On the iPhone, you just have an app store with applications, and the level of compromise that such an application can cause, theoretically at least, that's the idea, is limited to that single application. An app you install doesn't have access to your messenger, unless there's some flaw in the system, which there always is, so you never have absolute security. I think what it really boils down to is this idea that emerged in the 1990s of decentralization, right? Moving away from central single points of failure toward decentralization, where we can mitigate a lot of these risks by not depending on one single type of computer, and not even depending on one single computer, but having many computers, which introduces redundancy, resilience, and, I guess, risk reduction and distribution to computer systems.
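The redundancy idea described here can be sketched in a few lines: instead of trusting one computer's answer, you ask several independent replicas and accept only the majority result, so a single faulty or compromised machine cannot corrupt the output. This is a minimal illustration of the principle, not any particular system's protocol; the replica functions are hypothetical stand-ins.

```python
from collections import Counter

def majority_answer(replicas, query):
    """Ask every replica the same query and return the most common answer.

    Tolerates a minority of faulty or malicious replicas: as long as
    more than half agree, one bad machine cannot change the result.
    """
    answers = [replica(query) for replica in replicas]
    answer, votes = Counter(answers).most_common(1)[0]
    if votes <= len(replicas) // 2:
        raise RuntimeError("no majority: too many replicas disagree")
    return answer

# Three independent replicas; one is compromised and lies.
honest = lambda q: q * 2
compromised = lambda q: q * 2 + 1
print(majority_answer([honest, honest, compromised], 21))  # prints 42
```

With a single computer, the compromised function's answer would simply be accepted; with three, its lie is outvoted.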
[00:30:05] So, speaking more broadly, about how the internet in a free society should be built, I guess. Yeah.
[00:31:24] >> You've said a couple of times that the problem is the hardware, not the software.
[00:31:30] >> So, it is the device, right? It's the union of the hardware and the software.
[00:31:35] >> So what's the option? Is there an option at this point? If I am, you know, intent on sending a private message to someone else electronically, is there a way to do it as of right now that's private, guaranteed private?
[00:31:52] >> So, I would say the way that I myself handle it is to have a dedicated phone for that specific use case, and then just have an encrypted messenger on it that you can trust, because maybe you don't even install it via the app store but have built it yourself, and no other interactions take place on that phone. From an operational security standpoint, I would say that is as good as it can get. Otherwise, you can always do creative things: you could write your message, encrypt it by hand, and then type the ciphertext into the phone, so the device doesn't matter at that point. So maybe we need to get away from the devices altogether. What's interesting about what we're doing with Arcium is that we never have a single point of failure. Everything is encrypted, and everything sits within a distributed network, where as long as nobody is able to get access to the entire globally distributed network, to every single participant, you have security. It's difficult to do that with your own phone, but at the end of the day, I think those systems get more secure over time. What is important, however, is to be certain that there are no backdoors explicitly installed during those manufacturing processes. There are some countries where, if you're buying a phone from there, you can be fairly certain something might be installed, because the company itself is owned by the government, and we need legal frameworks for that.
[00:33:40] And what we also require is that the manufacturing process itself mirrors distributed, decentralized systems, where there again is not a supply chain of single points of failure, where if one single worker decides to install some backdoor because they get paid off, they can do so; instead, there is oversight. I think Apple already runs on that model, so I would be relatively comfortable with these kinds of systems. But there are also other interesting technologies. For example, Solana, which is an American blockchain network, actually offers its own phones. They have a very small manufacturer build those phones, because they say, well, those phones need to be very secure, because you now literally store your money on there, since your money is digital and sits on top of a blockchain network. I think those are very interesting approaches, and I'm really looking forward to seeing more phones like this, where there's again a competitive market emerging for who's building the most secure phone. Yeah, I actually think a friend of Julian Assange from Germany, I don't remember his name, had a company manufacturing secure phones. The issue with explicitly built secure phones, however, is always that many of these companies are honeypots. We've noticed that
[00:35:23] >> Yeah.
[00:35:27] >> with EncroChat, or whatever it was called. There was this large-scale police operation to stop drug cartels, which worked out nicely in the end, I guess, but the company itself was just a facade to sell backdoored phones. Yeah.
[00:35:45] >> Right. I mean, it's the perfect honeypot. And by the way, Signal, which I'm not saying is a honeypot, of course, and I use it, as the authorities know. But it was created with CIA money. That doesn't mean it's a CIA operation, but why wouldn't it be? I mean, honestly, I'm not accusing anybody, because I have no knowledge, but it's a pretty obvious move, right?
[00:36:13] >> It would be. I think what's important when we look at Signal is to look at what Signal is. Signal is open-source software that anyone can verify for themselves. What that means is that we have this global community of mathematicians and cryptographers who invented those protocols, who have independently, without getting funding from the CIA or whomever, thought about mathematical problems that they want to solve and are passionate about, and all of those people look at those open-source lines of code and mathematical formulas and find the flaws in those systems. That makes me confident in the design of Signal itself.
[00:37:03] >> Do you use it?
[00:37:04] >> I use Signal, yes. I got my entire family to use Signal.
[00:37:07] >> Okay, good. And I have to say, I know a lot of Intel people use Signal, a lot. All the ones I know.
[00:37:15] >> And so that tells you something right there.
[00:37:17] >> Yes. So I think it would be highly unlikely that Signal itself would actually turn out not to be secure. There has been this interesting case, back in the early 2000s, where there was an attempt to actually undermine strong encryption, with a very exotic name: the Dual Elliptic Curve Deterministic Random Bit Generator, Dual_EC_DRBG. No non-technical person understands what that means, right? And what you need to understand in order to comprehend what happened there is that when we encrypt information, when we, as I said earlier, take something and move it into this different realm, you cannot follow this information into that realm, because that would require you to have practically infinite resources, more energy than the sun will emit over its lifespan. Isn't that crazy? Right.
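The "more energy than the sun" claim can be checked with a back-of-the-envelope calculation. Landauer's principle gives a hard physical minimum of kT ln 2 joules per irreversible bit operation; even just counting through a 256-bit keyspace, at one bit flip per step, dwarfs the sun's total lifetime output. The constants below are standard approximations, and the comparison is deliberately generous to the attacker.

```python
import math

K_BOLTZMANN = 1.380649e-23      # Boltzmann constant, J/K
T_ROOM = 300.0                  # room temperature, K
LANDAUER_J_PER_BIT = K_BOLTZMANN * T_ROOM * math.log(2)  # ~2.9e-21 J

# Minimum energy just to enumerate all 2**256 keys (one bit flip each,
# ignoring the far larger cost of actually testing each key).
keyspace = 2 ** 256
e_brute_force = keyspace * LANDAUER_J_PER_BIT

# Total energy the sun emits over ~10 billion years at ~3.8e26 watts.
sun_lifetime_j = 3.828e26 * 10e9 * 365.25 * 24 * 3600

print(f"brute force needs ~{e_brute_force:.1e} J")
print(f"sun emits       ~{sun_lifetime_j:.1e} J over its lifetime")
print(f"shortfall factor: {e_brute_force / sun_lifetime_j:.1e}")
```

The brute-force floor comes out around a trillion times the sun's entire lifetime output, which is why the speaker can say the universe itself prohibits the attack.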
[00:38:20] So you cannot follow it there. Well, how this asymmetry is fundamentally achieved in cryptography is that the universe runs on energy and uncertainty: particles jitter, stars burst, and so there's this randomness in the universe. If you look at the sky, or if you just look at how things are made up, there's random noise everywhere. And so when we encrypt something, we make use of that chaos and inject it into a message that we are sending, for example. And it's only possible to prevent that message from being decrypted in an unauthorized way if the randomness that has been injected into the message is actually unpredictable.
[00:39:12] >> Unless it's truly random.
[00:39:14] >> It has to be truly random, yeah. I cannot figure out how you arrived at the random number.
[00:39:21] >> No pattern.
[00:39:21] >> No pattern, exactly. True randomness, true entropy. Yes. That's what cryptographers, I would say, spend most of their time thinking about: how can we achieve true randomness? Because if we are able to inject that, using mathematics, it becomes impossible for you to distinguish the message from randomness. You can't find a pattern; hence, you're not able to apply any optimized algorithm to undermine it.
[00:39:48] >> Exactly.
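The idea of injecting unpredictable randomness so that ciphertext is indistinguishable from noise is easiest to see in the textbook extreme case, a one-time pad: XOR the message with a truly random key of the same length. This sketch uses Python's `secrets` module, which draws from the operating system's entropy pool; with a predictable key, a "stacked deck", the same construction would be worthless.

```python
import secrets

def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a fresh, truly random key.

    Without the key, the ciphertext is statistically indistinguishable
    from random noise; there is no pattern for an attacker to exploit.
    """
    key = secrets.token_bytes(len(message))      # OS entropy, not a seeded PRNG
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so applying the key again recovers the text.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"meet at dawn")
assert otp_decrypt(ct, key) == b"meet at dawn"
```

Every possible plaintext of that length corresponds to some key, so the ciphertext alone carries no usable pattern, which is exactly the "no pattern" property being discussed.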
[00:39:48] >> So if you think about it practically, what that means is, let's say we have a deck of 52 playing cards, and I randomly shuffle this deck. There are so many possible ways that a deck can be stacked that it is very likely that no two truly randomly shuffled decks have ever been identical in the history of humanity, which is hard to believe, but that's how statistics and mathematics work, right? So we take this deck and use it as the randomness. Now, if I play with a magician, the magician can pretend to shuffle the deck, but actually they have not shuffled it; they know what the cards look like. With all of this randomness that we are injecting into information, we're basically describing what key is being used to unlock it. And if I don't know what the randomness looks like, if I don't know what the next playing card in the stack is, I have to try every single possible key and try to unlock the message with it.
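Trying every key can be made concrete with a deliberately tiny toy: a single-byte XOR "cipher" with only 256 possible keys, which a loop cracks instantly. Real ciphers use 128- or 256-bit keys, where the same loop would need more steps than there are atoms in the observable universe; the 52-card deck is the same point, since 52! is roughly 8 x 10^67 orderings. The cipher here is illustrative only, not a real algorithm.

```python
import math

def xor_cipher(data: bytes, key: int) -> bytes:
    # Toy cipher: XOR every byte with a single-byte key (keyspace = 256).
    return bytes(b ^ key for b in data)

ciphertext = xor_cipher(b"attack at dawn", key=0x5A)

# Brute force: try key 0, then key 1, ... until the plaintext looks right.
recovered = next(
    key for key in range(256)
    if xor_cipher(ciphertext, key).startswith(b"attack")
)
print(recovered == 0x5A)  # True: 256 keys fall in microseconds

# Why this fails against real keys: the keyspace is astronomically larger.
print(f"52! card orderings : {math.factorial(52):.1e}")
print(f"2**128 keys        : {2**128:.1e}")
```

The loop is exactly the "key number one, key number two" procedure described next; only the size of the keyspace separates a toy from an unbreakable cipher.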
[00:41:03] So you could think of it as: I have this message, and now I want to apply violence to this message in order to recover it. I take key number one and try to unlock it; doesn't work. Then key number two. And you do that for an inconceivably large number of keys. That's why, practically speaking, you cannot brute-force these kinds of mechanisms. Although you can, if you know where to start looking for the keys; if you know that you need to start looking at, say, the millionth key, then you can recover it. And so if the deck is being manipulated, if the randomness is being manipulated, then you can undermine encryption while the process of encrypting itself remains sound. You don't notice it; you actually do what you mathematically need to do to securely send your message, but the value that you use to do so, this randomness, is actually not random. And that's what was attempted with this specific algorithm, Dual_EC_DRBG. What they did was create this concept of kleptography, where randomness is derived in a way that is deterministic: they have some secret value, and from that secret value they derive fake randomness that looks random but is not actually random. The NSA proposed this algorithm to NIST, the National Institute of Standards and Technology, in the early 2000s as the, yeah, best state-of-the-art randomness derivation function, I guess. And it got accepted as an official standard, and then there were companies like RSA, actually a highly sophisticated and respected cryptography company, whose founders are some of the fathers of modern-day cryptography, that built products with it and distributed them to industry and to people using this technology. Nobody knew about it, but it's not actually true that nobody knew about it: there were a lot of cryptographers who raised questions a couple of years later, saying, I don't think this is actually random, it looks suspicious to me. And they showed that if someone theoretically had access to some secret key, then, and they created mathematical formulas around it, and actually mathematically proved that there was insecurity there.
[00:43:41] It was not random, because they noticed a pattern?
[00:43:44] >> What they realized, basically, is that there are just those numbers. The proposal said, hey, let's use this algorithm, and this algorithm contains some constant numbers written into it. And people asked, are those numbers random? Because we're literally deriving our randomness from those numbers. And the answer was, yeah, those are random, we randomly generated them. It turns out there was some other key that could be used to then mathematically recover whatever randomness you used. So that was this secret attempt to undermine cryptography.
[00:44:22] >> By the US government.
[00:44:24] >> Yes, yes. And I think what's so striking about this, again, is that you're not just undermining privacy; you're undermining the entire security of your economy, your country.
[00:44:41] >> Banking codes, everything.
[00:44:42] >> Yes, everything. So, the thing that then happened was that in 2013, Snowden revealed, yeah, a few papers, I guess, and one of those was Project BULLRUN. Within Project BULLRUN, funding had been allocated to that specific effort to undermine cryptography. And once that got published, the corresponding companies and standardization institutes stopped. And it's so striking that you get standardization, because once it's defined as a standard, industry needs to implement it to get certification. It is then literally impossible to use some secure alternative, because certification only gets provided for the backdoored technology. But that got uncovered thanks to Snowden, and then people stopped using it.
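A toy model of the kleptographic trick described above (not the actual Dual_EC_DRBG math, which involves elliptic-curve points): the generator's output looks random to every user, but its internal state evolves deterministically from a seed, so anyone holding the designer's secret can replay the entire stream. The class and seed names here are illustrative inventions, not any real API.

```python
import hashlib

class BackdooredRNG:
    """Looks random to users, fully predictable to whoever knows the seed.

    Toy analogue of a kleptographic DRBG: the state chain is deterministic,
    so the designer, holding `designer_secret`, can regenerate every output.
    (Dual_EC_DRBG hid the equivalent secret behind its published constants.)
    """
    def __init__(self, state: bytes):
        self.state = state

    def next_bytes(self) -> bytes:
        out = hashlib.sha256(b"out" + self.state).digest()
        self.state = hashlib.sha256(b"next" + self.state).digest()
        return out

designer_secret = b"constant baked into the standard"

victim = BackdooredRNG(designer_secret)      # ships inside a product
stream = [victim.next_bytes() for _ in range(4)]

attacker = BackdooredRNG(designer_secret)    # the designer's private replica
assert [attacker.next_bytes() for _ in range(4)] == stream
# The "random" keys the victim generated were never secret at all.
```

Each individual output passes statistical randomness tests, which is why users noticed nothing; the flaw is only visible to someone who knows where the constants came from.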
[00:45:49] >> Was he celebrated? Did he win the Presidential Medal of Freedom for this?
[00:45:53] >> Yes, in an alternative reality, in a different realm, I guess.
[00:45:59] >> One of the great patriots of our time. Relentless. I mean, they'd murder him in a second. He's still in exile, not by choice. Yeah. So, what they also uncovered is that the NSA actually paid this company that built those products 10 million US dollars to use that as a standard. So, yeah. That's why you cannot trust anyone.
[00:46:24] >> As you point out, it's not simply, I mean, this is an intel agency trying to spy on its own people, the ones who pay for it
[00:46:31] >> Sure. Yeah.
[00:46:32] >> to exist. And that's immoral, and something that we should fight against. But they were also sabotaging the US economy and US national security, because if your cryptography is fake, then that means you're exposed on every level throughout your
[00:46:52] >> Yes, yeah. And it's so interesting, because it is their task to increase national security; that's why it was possible for them to do that. At that point they were sort of the leading cryptography research organization in the world. And so that really is striking to me, that you're willing to
[00:47:14] >> to undermine the entire security of your nation, and that, at the end of the day, puts you in a worse strategic position.
[00:47:22] >> I think many people don't realize that.
[00:47:25] >> I'd never thought about it until you mentioned it. But it just highlights, I mean, I love Ed Snowden, and I'm not embarrassed by that, I'm proud. It just highlights, you know, the suffering that he's been through in order to help his own country, and he's still slandered constantly, and it drives me crazy. But this is yet another example of why he did something, more than almost anyone else, to help this country, you know. But so, it sounds like you're convinced that the current state of the art in cryptography is actually secure.
[00:48:02] >> Yes, yeah, 100%. As I said, I think this is a great example to look at: even with those backdoors that had been implemented, there were cryptographers within this global open-source mathematics and cryptography community who rang the bell. Nobody was listening to them, but they actually identified the issue years in advance, rang the bell, and said, this is not secure, this is not random, even within those companies and standardization institutes. Nobody took it seriously, or I guess they took it seriously, but it doesn't matter if the law says you have to use this algorithm. So that makes me very confident that this system works, the system of mathematicians.
[00:48:51] >> Is cryptography global? Which is to say, is Chinese cryptography different from, or stronger than, European or American cryptography?
[00:49:00] >> It's interesting. So you actually have specific encryption standards used by the militaries of the world: the Chinese use different cryptography than the Russians and the Americans. At the end of the day it is the same thing from a mathematical standpoint, but there are some deviations in the level of security and the kinds of numbers used, because everyone builds their own standards, since they mutually distrust each other. But at the end of the day, the underlying mathematics are the same. The cryptographic standards, the way that cryptography works, that is the same. Yeah.
[00:49:43] >> So there's no reason to think the Chinese or the Russians have stronger cryptography than the Europeans and the Americans.
[00:49:50] >> So I think, no. And I mean, it's interesting to think about whether there is cryptography being developed in-house within militaries, or whatever proprietary human organization, that is not publicly known and that is incredibly powerful. What I've been doing with my team, and I'm so glad that I have those incredible cryptographers on my team who actually understand all of those things on a far more detailed level than I do, is build this protocol that allows us to literally take everyone's data. You could imagine the entirety of the United States, right? We take everyone's healthcare data, something like that. And then we say, well, we need to do something with that data; let's say we need to research some disease. Instead of taking that data and passing it to some company that will inevitably expose it, lose it, leak it, or use it against those people, we encrypt it. Nobody ever has to share any information, and we just run whatever computation we collectively agreed to run on this data. We do that, we get the result, we, I don't know, figure out a cure for cancer or whatever. But at no point in time did you ever have to share your data. Your data never left your ownership. And I think that's really core. It sort of is the holy grail of cryptography, I would say, being able to do these kinds of things, because you can now run any type of computer program in private instead of in public, and you can restructure the way that your entire economy and country can work, right?
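The "compute on data nobody reveals" idea can be illustrated with additive secret sharing, one of the standard building blocks of secure multiparty computation (the conversation doesn't specify Arcium's actual construction, so this is a generic sketch): each patient splits a value into random shares, each server sees only meaningless shares, yet combining the servers' local sums reconstructs the true total and nothing else.

```python
import secrets

MOD = 2**61 - 1  # arithmetic modulo a large prime keeps shares uniform

def split(value: int, n_servers: int) -> list[int]:
    """Split `value` into n additive shares; any n-1 of them reveal nothing."""
    shares = [secrets.randbelow(MOD) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# Three patients secret-share their (private) blood pressure readings.
readings = [120, 135, 110]
shared = [split(r, n_servers=3) for r in readings]

# Each server locally sums the one share it holds from every patient.
server_totals = [sum(patient[s] for patient in shared) % MOD for s in range(3)]

# Combining the three server totals yields the aggregate, and only the
# aggregate: no server ever saw an individual reading.
total = sum(server_totals) % MOD
print(total)  # 365
```

Each share on its own is a uniformly random number, so a single compromised server learns nothing; only the final combined sum carries information, which mirrors the "your data never leaves your ownership" claim.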
[00:51:42] And that goes beyond just economic human interactions. It also touches on things like rethinking how we can actually improve democratic processes, because those computations inherently have a property called verifiability. The status quo in the current internet is that you task some cloud provider with running a computer program for you, because you have limited resources. You want them to run that computer program for you, so you pass them some information and an algorithm, and you get an output back. But how do you know that this output is actually correct? It could be that there was an error; it could be that they maliciously tried to undermine the output that they sent you. This technology that we've built actually solves this: verifiability for computations. You can mathematically verify that a computation has been correctly executed, and that itself is an amazing property, an amazing property that you want to see within every system. But you don't get that amazing property without implementing privacy for those systems. Isn't that amazing?
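Verifying an outsourced computation more cheaply than redoing it is a real, classical idea. A simple example (not the zero-knowledge-proof machinery production systems use, which the speaker doesn't detail) is Freivalds' algorithm: a server claims C = A x B, and the client checks A(Br) = Cr for random vectors r, an O(n^2) check instead of the O(n^3) multiplication, catching a wrong C with probability at least 1/2 per round.

```python
import random

def freivalds_check(A, B, C, rounds=20):
    """Probabilistically verify the claim A @ B == C without recomputing it.

    Each round picks a random 0/1 vector r and tests A @ (B @ r) == C @ r,
    which costs O(n^2). A false claim survives each round with probability
    <= 1/2, so 20 rounds give roughly a one-in-a-million error rate.
    """
    n = len(A)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False  # caught the server cheating
    return True

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
honest = [[19, 22], [43, 50]]        # the true product A @ B
cheat  = [[19, 22], [43, 51]]        # server tampered with one entry
print(freivalds_check(A, B, honest))  # True
print(freivalds_check(A, B, cheat))   # False (with overwhelming probability)
```

The design point is the same one made in the conversation: the client spends far less work checking the answer than the server spent producing it, yet a tampered result is almost certainly detected.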
[00:53:00] >> It is amazing. It is. How did you all
[00:53:03] create this? Um, so I I'm I'm very lucky
[00:53:08] that within my within my company, um, I
[00:53:11] have very experienced cryptographers
[00:53:13] who've who've literally worked, um, more
[00:53:16] years on these specific issues than I
[00:53:20] have been in cryptography. Um, and so,
[00:53:23] um, I'm sort of building on on the
[00:53:25] shoulders of giants, of course, right?
[00:53:27] Um, and there has for a very long time
[00:53:30] been been research in those areas being
[00:53:32] able to run those encrypted
[00:53:34] computations. Um, but it has never been
[00:53:37] practical enough where it is fast
[00:53:40] enough, cheap enough, right, and
[00:53:42] versatile enough where you can actually
[00:53:43] do all of those things. Um, and so I
[00:53:47] think what what really guided us is to
[00:53:51] and what really guided me in the the way
[00:53:53] that I I I designed the system is to
[00:53:56] think about okay how can I actually
[00:53:59] build the system so that people are
[00:54:00] going to use it and are going to build
[00:54:03] applications and are going to integrate
[00:54:05] that into systems right um because I
[00:54:07] think with privacy technology in general
[00:54:09] in the past what what has been done is
[00:54:12] that it sort of has has been created in
[00:54:15] a in a echo chamber, in a vacuum almost,
[00:54:18] where you're a smart cryptographer that
[00:54:21] builds amazing technology, but you maybe
[00:54:24] don't understand how markets work and
[00:54:27] how to get product market fit, how to
[00:54:29] actually get those users, right? And so
[00:54:32] we've tried to um build it in a
[00:54:34] different way and that's that's how we
[00:54:35] ended up here. But to be honest, we we
[00:54:38] it was an evolutionary process for us.
[00:54:40] So we originally started with um with a
[00:54:44] different kind of cryptography I would
[00:54:46] say um that was more limited um that
[00:54:49] didn't allow for all of those
[00:54:50] interactions and then at some point we
[00:54:52] sort of decided, okay, and we realized
[00:54:55] that that was not good enough, that was
[00:54:57] not enough, and at that point
[00:54:59] basically everyone was still building
[00:55:01] with that technology and we were like
[00:55:03] let's do something different instead
[00:55:05] let's think about what the future will
[00:55:07] look like, how sort of computation and
[00:55:09] privacy can converge in something
[00:55:11] bigger for the entirety of humanity. And
[00:55:13] that's then how we how we built it in
[00:55:16] very very quick time actually.
[00:55:19] How did you fund it? So, we got um
[00:55:22] investor funding. Um and
[00:55:26] I'm I'm incredibly thankful for all of
[00:55:28] the investors that I've gotten. Um
[00:55:30] Coinbase, for example. So, big
[00:55:34] names in the space of
[00:55:38] blockchain, distributed systems, right? All
[00:55:41] of those networks like Bitcoin,
[00:55:43] all of those networks
[00:55:46] are distributed in nature, decentralized.
[00:55:49] Um, and yeah, there are a lot of players
[00:55:53] within that space that truly believe in
[00:55:55] the value of privacy, that
[00:55:57] privacy is a human right and privacy
[00:56:01] is inevitable as a technology, and that
[00:56:04] like to support it. But not just
[00:56:06] support it, right? Because it is
[00:56:08] something they believe in, but invest in
[00:56:10] it because they sort of have realized
[00:56:13] that this is one of the most powerful
[00:56:16] technologies that that can exist in in
[00:56:19] in humanity, right? Being able to take
[00:56:22] information, move it into this realm,
[00:56:23] and then it can stay in this realm and
[00:56:25] it can be processed and everyone can do
[00:56:27] that. That is incredibly powerful. It is
[00:56:30] emancipating and it is powerful for for
[00:56:32] businesses but also nation states. At
[00:56:34] the end of the day it is a neutral
[00:56:36] technology and so we have investors that
[00:56:38] that believe in that.
[00:56:41] So, one of the applications, we were just
[00:56:43] talking off camera, one of the
[00:56:45] applications for this technology,
[00:56:48] well, one of the big ones, is the movement
[00:56:50] of money in a way that's private.
[00:56:54] How exactly does that work? And let me
[00:56:56] just add one editorial comment. The
[00:56:58] great disappointment of the last 10
[00:57:00] years for me is that crypto transactions
[00:57:02] don't seem to be as private or beyond
[00:57:04] government control as I thought they
[00:57:05] would be. I hope they are someday, but
[00:57:08] watching the Canadian truckers,
[00:57:10] you know, have their crypto frozen was
[00:57:12] just such a shock. I've never gotten
[00:57:13] over it. Uh, will this technology change
[00:57:17] that?
[00:57:18] >> Yes. So, if you think about Bitcoin as
[00:57:21] the state-of-the-art model, or, I
[00:57:25] guess, not state-of-the-art
[00:57:27] but the original kind of blockchain
[00:57:29] network, right? What it is at the
[00:57:31] end of the day is a way for
[00:57:34] distributed people to find consensus
[00:57:37] over some unit of money which is
[00:57:40] actually more like a commodity than
[00:57:42] actually a financial instrument.
[00:57:44] >> That's right. um and they find consensus
[00:57:48] and they create this this currency. Um
[00:57:50] and that's why people think that it's
[00:57:52] that it's fake, non-existent, right?
[00:57:54] Although it's a way more real process of
[00:57:56] creating a currency than fiat
[00:57:58] currency. They mine it by
[00:58:01] taking energy and solving a mathematical
[00:58:04] problem. And once they correctly solve
[00:58:06] that mathematical problem, they get
[00:58:08] rewarded in that newly mined currency.
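The mining loop he describes (spend energy hashing until a puzzle is solved, then claim the block reward) can be sketched in a few lines of Python. The difficulty target and the block data are made-up toy values for illustration, not Bitcoin's real parameters:

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Brute-force a nonce until the block hash starts with
    `difficulty` hex zeros -- a toy stand-in for Bitcoin's target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            # Puzzle solved: in Bitcoin, this is where the miner earns the reward.
            return nonce, digest
        nonce += 1

nonce, digest = mine("txs: Alice->Bob 1 BTC", difficulty=4)
assert digest.startswith("0000")
```

The asymmetry is the point: finding the nonce takes many attempts, while anyone can verify the solution with a single hash.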
[00:58:11] Right? So it's a very very elegant
[00:58:14] design. Um most people think that these
[00:58:18] kinds of networks are anonymous and and
[00:58:21] are dangerous, right? Because I feel
[00:58:22] like it has actually been a narrative
[00:58:24] that um media and different actors want
[00:58:29] the people to to believe.
[00:58:31] >> I I'm just for I just have to add I
[00:58:33] would like them to be anonymous and
[00:58:34] dangerous.
[00:58:35] >> Oh yes. Yeah. Yeah. Yes.
[00:58:36] >> That's what I was hoping for.
[00:58:39] >> So people believe that, um, which
[00:58:42] attracts people, right? And
[00:58:44] also keeps other people from using
[00:58:46] them and trying to outlaw them. In
[00:58:48] actuality, they're not anonymous. Um
[00:58:50] what you what you have in in Bitcoin
[00:58:53] specifically is pseudonymity. So you don't
[00:58:56] see on the blockchain that Tucker Carlson
[00:58:59] has 10 Bitcoin, or whatever, and sent
[00:59:01] Yannik one Bitcoin. You instead see ABCD
[00:59:04] EFG, blah blah blah, whatever, right? A
[00:59:06] random string of numbers and letters
[00:59:09] has sent something to another random
[00:59:11] string of letters and
[00:59:14] numbers. However, they're linked to this
[00:59:18] identity that you have. So for every
[00:59:20] single transaction that you've performed
[00:59:22] in history on top of this distributed
[00:59:24] ledger, you will see all of those
[00:59:27] transactions. Um, so when you
[00:59:31] later, after the show, send me one
[00:59:33] bitcoin, I guess, right? So I would
[00:59:35] see
[00:59:36] >> they're cheaper today than they were
[00:59:37] yesterday.
[00:59:38] >> You noticed? Yeah. So when you
[00:59:43] send me something, what I'll be able to
[00:59:46] to see is all of the other transfers
[00:59:49] that you've performed in the past,
[00:59:51] right? That's that's unfortunately how
[00:59:53] Bitcoin works. And so it has this
[00:59:56] inherent full transparency. There's no
[00:59:59] privacy, because it's so easy to then trace, via,
[01:00:02] I guess, on- and off-ramps, how you
[01:00:05] actually moved money in there, right?
[01:00:06] Because you most likely don't actually
[01:00:09] get this currency through work,
[01:00:11] by applying energy; you buy it for a
[01:00:14] different currency, fiat money, right? Um,
[01:00:16] so your identity is linked, everything is
[01:00:18] public. Um, and so that's a fundamental
[01:00:22] issue. That is actually a dystopian
[01:00:26] scenario where we could end up, if this
[01:00:28] is adopted as the technology where all
[01:00:30] of your money now sits and you're
[01:00:32] sending transactions: where you have this
[01:00:34] big upside of having cash-like
[01:00:36] properties, which is amazing, but you have
[01:00:38] this tremendous downside of literally
[01:00:41] everything being recorded for the
[01:00:44] conceivable future of humanity, right? Um,
[01:00:47] and you have no privacy, um, and that
[01:00:51] inherently
[01:00:52] limits your freedom, um, to use this
[01:00:54] technology. And so that is an issue, um,
[01:00:58] that that exists not just within Bitcoin
[01:01:00] but also other blockchain networks and
[01:01:02] Bitcoin is this pure form.
[01:01:05] That's why within this crypto
[01:01:06] industry there's a lot of competition
[01:01:08] also between different players
[01:01:10] that say Bitcoin is this pure form that
[01:01:13] only allows transfers of money, right, and
[01:01:15] other networks allow execution as well.
[01:01:18] Um, and that has led to what is
[01:01:22] commonly called smart contracts, so
[01:01:24] this concept of computer programs
[01:01:27] that simply exist in the ether.
[01:01:30] Basically, a computer program that can
[01:01:33] execute something that you tell it to do,
[01:01:35] and it is guaranteed to do so. And
[01:01:37] this amazing property that that all of
[01:01:40] the the the founding fathers of of those
[01:01:42] networks basically identified as
[01:01:44] important is so-called censorship
[01:01:46] resistance, which I think is also
[01:01:47] important in in real life.
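The pseudonymity and linkability he describes can be illustrated with a toy public ledger; the addresses and amounts are invented for the example, but the point carries over: anyone who learns that one address is yours can replay that address's entire history.

```python
# A toy public ledger: every node in the network stores the full
# transaction list, so anyone can scan it.
ledger = [
    {"from": "1AbC...", "to": "1XyZ...", "amount": 0.5},
    {"from": "1AbC...", "to": "1QrS...", "amount": 2.0},
    {"from": "1QrS...", "to": "1AbC...", "amount": 1.0},
]

def history(address: str) -> list[dict]:
    """Everything the whole world can see about one pseudonym."""
    return [tx for tx in ledger if address in (tx["from"], tx["to"])]

# Receiving a single payment from "1AbC..." is enough to look up
# every transfer that pseudonym has ever sent or received.
assert len(history("1AbC...")) == 3
```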
[01:01:49] >> Very um and so those networks provide
[01:01:52] censorship resistance. It doesn't matter
[01:01:54] if one computer decides, well, I'm not
[01:01:56] going to accept Tucker's transaction
[01:01:58] because I don't like Tucker. Well,
[01:02:00] there's going to be another computer
[01:02:01] that says, I will accept it. So, that is
[01:02:03] censorship resistance that is inherently
[01:02:05] baked into those systems. And what that
[01:02:07] means is if you interact with this as
[01:02:10] this invisible machine, right? you get
[01:02:13] guaranteed execution for whatever you
[01:02:16] tell it to do. Either send someone money
[01:02:18] or perform some other computational
[01:02:21] logic that is baked into the system. And
[01:02:24] so there have been different
[01:02:27] kinds of pioneers on the front of,
[01:02:30] um, yeah, adding
[01:02:33] cryptographic privacy to those systems.
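The censorship-resistance property described above (one node refusing a transaction doesn't matter, as long as some node is willing to accept it) can be modeled in a few lines; the node policies here are invented purely for illustration.

```python
# Toy model of censorship resistance: a transaction is broadcast to
# every node, and it only takes one willing node to include it.
def broadcast(tx: str, node_policies) -> bool:
    return any(policy(tx) for policy in node_policies)

censoring_node = lambda tx: "tucker" not in tx   # refuses Tucker's transactions
neutral_node = lambda tx: True                   # accepts anything

# A lone censoring node would block the transaction...
assert broadcast("tucker -> yannik: 1 BTC", [censoring_node]) is False
# ...but one neutral node anywhere in the network lets it through.
assert broadcast("tucker -> yannik: 1 BTC", [censoring_node, neutral_node]) is True
```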
[01:02:35] Um, there has for example emerged a
[01:02:38] network called Zcash, which is
[01:02:41] basically Bitcoin with cryptographic
[01:02:43] privacy. Um, and there have also been
[01:02:46] pioneers like the inventors of Tornado
[01:02:49] Cash, who have built a smart contract
[01:02:52] that exists within this ledger and is
[01:02:54] unstoppable. Once you've uploaded it, you
[01:02:57] cannot stop it anymore. So they did that,
[01:02:59] and the kind of code that they
[01:03:02] implemented there gave you privacy on
[01:03:04] top of this public network, um, which was,
[01:03:07] or is, the Ethereum Virtual Machine.
[01:03:10] So they did that and um
[01:03:12] >> Tornado Cash did that
[01:03:13] >> Tornado Cash
[01:03:14] >> were they did they win the Nobel Prize
[01:03:16] >> um
[01:03:16] >> did they get the presidential medal of
[01:03:18] freedom? What happened next when they
[01:03:19] offered privacy? So there were I think
[01:03:24] it was three founders. Um, Roman Storm,
[01:03:27] who's an American citizen, um, Roman Semenov,
[01:03:32] who is a Russian national, and, um, Alexey
[01:03:35] Pertsev, who is a Russian national as
[01:03:38] well, who lives in the Netherlands. Um, he
[01:03:41] has been convicted of assisting in
[01:03:46] money laundering, for five years. Um, and
[01:03:49] and
[01:03:50] >> 5 years in prison.
[01:03:51] >> 5 years in prison. Um, and Roman Storm
[01:03:55] has been convicted in the
[01:03:58] United States of
[01:04:01] conspiring to run a money transmitter
[01:04:05] without a license. Now, why has this
[01:04:07] happened? Why did they suffer such grave
[01:04:10] consequences?
[01:04:12] >> They were arrested.
[01:04:13] >> They were arrested
[01:04:14] >> and brought on trial.
[01:04:15] >> Brought on trial. I mean it's it's
[01:04:17] actually if you look at what what Roman
[01:04:19] Storm has faced it was 40 years in
[01:04:22] prison for this
[01:04:24] >> um and in the United States
[01:04:25] >> in the United States of America and
[01:04:29] why has that happened, right? They
[01:04:31] built a privacy tool. Well, it was an
[01:04:34] illicit actor that used their privacy
[01:04:36] tool, um, and that is a shame, because,
[01:04:40] um, it was an actor that a lot
[01:04:43] of people agree is an illicit
[01:04:44] actor. I think the two of us also agree
[01:04:46] that North Korea laundering stolen
[01:04:49] hacked funds is an illicit actor
[01:04:52] >> misusing a tool, right? So there's no
[01:04:54] question about this. The the underlying
[01:04:56] question really
[01:04:57] >> and we're sure that actually happened.
[01:04:58] >> We are sure that happened. Yes, for sure
[01:05:00] that that has happened. Um, and so they
[01:05:03] stole funds, um, because they were
[01:05:07] able to hack um different systems and um
[01:05:11] then were able to to utilize this
[01:05:14] platform to gain privacy to then move
[01:05:16] those funds somewhere else.
[01:05:17] >> Did Roman Storm participate in the North
[01:05:19] Korean hacked funds theft?
[01:05:21] >> He did not know.
[01:05:22] >> So if I rob a bank and then jump into my
[01:05:26] Chevrolet and speed away, does the
[01:05:28] president of General Motors get
[01:05:29] arrested? Usually he doesn't know. Okay.
[01:05:32] Which is interesting, because he clearly
[01:05:35] provided this tool for you, the car,
[01:05:38] >> and he he knows that people um get away
[01:05:41] with cars, right?
[01:05:42] >> Yes, he does. So kind of weird how he's
[01:05:45] dodged those obvious charges.
[01:05:48] Oh, is that that's really what happened?
[01:05:50] >> That's really what happened. Yeah. And
[01:05:51] and has faced 40 years in jail. Um but
[01:05:54] the jury, um, yeah, could not find a
[01:05:58] unanimous decision on the main
[01:06:01] charges, I guess circumventing sanctions
[01:06:03] and,
[01:06:05] um, helping with money
[01:06:07] laundering. Now, the interesting thing is,
[01:06:09] before they got arrested what has
[01:06:11] happened: um, OFAC, the Office of
[01:06:15] Foreign Assets Control in the United
[01:06:17] States, um, they took the software that
[01:06:21] those developers had written and
[01:06:24] uploaded to Ethereum, where it has
[01:06:27] become out of anyone's control, um,
[01:06:30] unstoppable by nature, anyone can use it.
[01:06:33] They essentially wrote code for a
[01:06:36] software tool for anyone to get privacy.
[01:06:39] Um, that software tool got sanctioned.
[01:06:42] It got put on the SDN list for specially
[01:06:46] designated nationals, where you put the
[01:06:48] names of terrorists, and they put the
[01:06:50] address in this thing, right? Um, of
[01:06:54] the software. So the source code itself
[01:06:57] became illegal. It was deleted from the
[01:07:00] internet. All of the companies closed
[01:07:03] their developer
[01:07:05] accounts. Um, the software they wrote, the
[01:07:09] free speech that they performed by
[01:07:11] coming up with those ideas and
[01:07:13] publishing it to the world, got censored,
[01:07:16] because they were added to a list
[01:07:19] which they don't even belong on, because
[01:07:20] it's, uh, it is not
[01:07:22] >> without any vote in Congress by the way
[01:07:24] or this is just yeah
[01:07:25] >> part of the I think it's under state
[01:07:27] department now but I could be or
[01:07:28] treasury I can't even remember but they
[01:07:30] have enormous power they've destroyed
[01:07:32] the lives of many thousands of people
[01:07:34] without any democratic oversight at all.
[01:07:38] And uh it's pretty pretty shocking.
[01:07:40] >> Yeah. And so so it got added onto this
[01:07:42] list. Um and I think last year um a
[01:07:47] court in the uh state of Texas actually
[01:07:50] ruled that OFAC
[01:07:53] um does not have the statutory authority
[01:07:56] to do any of that. And they then
[01:07:58] silently removed, um, Tornado Cash again
[01:08:01] from the SDN list. However, nobody is
[01:08:04] able to use that tool now, right?
[01:08:06] Because every company for compliance
[01:08:08] reasons, um, casts you out of the
[01:08:12] user base if you have ever touched
[01:08:14] anything related to that.
[01:08:16] >> And Roman Storm is
[01:08:19] >> he was convicted. You said he he there
[01:08:21] was a hung jury on on the strongest
[01:08:23] charges, but on other charges he was
[01:08:25] convicted. He was convicted on one
[01:08:26] charge, um, I think it is called,
[01:08:31] um, yeah, conspiracy to run a money
[01:08:36] processor, a financial institution, right, a
[01:08:39] bank, um, without a banking license,
[01:08:42] conspiracy to start a bank. So they
[01:08:45] put him in jail. Um,
[01:08:47] >> actually
[01:08:48] >> So his is a one-year jail sentence,
[01:08:51] that's on the charge, right? But he's
[01:08:53] currently in the process of appealing
[01:08:54] that. So, um, Roman Storm, um,
[01:08:59] didn't run a bank. He he didn't create a
[01:09:02] bank. He created software, right? He
[01:09:05] made use of his inherent right for
[01:09:08] freedom of speech to build something
[01:09:12] that enables others to make use of their
[01:09:15] right for freedom of speech, right?
[01:09:16] Because that is at the end of the day
[01:09:18] the the freedom of economic interaction,
[01:09:22] right? That is what he helped others
[01:09:25] protect for themselves. He never
[01:09:27] processed a transaction for anyone,
[01:09:28] right? He's not an intermediary. He
[01:09:30] specifically built technology that is
[01:09:32] disintermediated where you yourself use
[01:09:36] that software. Um, yeah. And so, um, the
[01:09:40] remarkable thing is I pay some
[01:09:43] attention, obviously not enough. I was
[01:09:45] not aware of the story until I was
[01:09:47] reading up on you.
[01:09:49] Where's all the coverage on Roman Storm?
[01:09:51] He doesn't even have a Wikipedia page,
[01:09:54] >> I've noticed.
[01:09:54] >> So, um, there are, I think,
[01:09:59] incredible, um, institutions like the
[01:10:02] Electronic Frontier Foundation, the
[01:10:04] EFF, um, and, um, the DeFi Education Fund, but
[01:10:08] also companies like Coinbase who
[01:10:10] actually have invested substantial
[01:10:12] amount of money, um, into defending Roman
[01:10:15] Storm and, um, yeah, Alexey Pertsev as
[01:10:18] well. Um, I think Alexey Pertsev also
[01:10:22] doesn't get enough attention. He's um I
[01:10:25] mean he's now under house arrest in in
[01:10:27] in in the Netherlands um and preparing
[01:10:30] to to appeal this decision. I think
[01:10:32] something like that.
[01:10:32] >> Why are so many of the players in this
[01:10:34] Russian? Um, I think it really boils
[01:10:37] down to them having a deep understanding
[01:10:40] about I think historically, maybe
[01:10:42] culturally, they have an understanding
[01:10:44] about the importance of privacy
[01:10:46] >> in a society to right
[01:10:48] >> to to uphold freedom. Um, which is a
[01:10:52] shame. Yeah. That they
[01:10:54] >> Well, they've they've suffered for that
[01:10:55] knowledge.
[01:10:56] >> Yes.
[01:10:56] >> For 70 years more than um so yeah, it's
[01:11:00] just, it's very striking. It's 140
[01:11:02] million people. It's a tiny country,
[01:11:04] relatively speaking, and yet they are way
[01:11:06] overrepresented, from Pavel Durov on down.
[01:11:09] >> For sure. Yeah, that is. Yeah, that is
[01:11:11] true. Um so I think um I think it's
[01:11:16] interesting um
[01:11:19] how we, all of us, take it
[01:11:22] for granted that these kinds of
[01:11:24] people, um, go out of their everyday
[01:11:27] lives and put a target on their heads
[01:11:30] by shipping this technology. Yes. To
[01:11:32] enable you, um, to gain privacy. Um, and, um,
[01:11:38] simply the knowledge about the existence
[01:11:41] of bad actors in the world, um, has made
[01:11:45] them victims and put
[01:11:47] them in jail, which is insane.
[01:11:50] >> Well, I mean, it's something the rest of
[01:11:51] us should push back against, I think,
[01:11:53] but the hurdle for me is not knowing.
[01:11:56] Again, I didn't even know this was
[01:11:57] happening. I should have guessed. So, if
[01:11:59] you could be more precise about what you
[01:12:01] think the real motive was behind going
[01:12:03] after Tornado Cash and Roman Storm, like
[01:12:06] what why was the US government not
[01:12:10] prosecuting drug cartels in order to
[01:12:12] prosecute Roman Storm?
[01:12:14] >> I think um so that has taken place under
[01:12:17] the previous administration. Um, so I
[01:12:21] think President Trump with his
[01:12:23] administration has um done tremendous
[01:12:26] work in regards to pushing the adoption
[01:12:29] of decentralized technology of really
[01:12:33] allowing us all of the people in that
[01:12:35] space to to try to rethink the financial
[01:12:38] system and and build this technology
[01:12:40] because they've sort of realized that um
[01:12:44] technological innovation
[01:12:47] runs at a faster pace than than
[01:12:49] legislative processes and and um under
[01:12:52] the previous administration that looked
[01:12:54] differently. So I think, um, that has
[01:12:57] helped this technology spread a lot. Um
[01:13:00] and
[01:13:02] it is however important to consider
[01:13:05] privacy. And when the executive order
[01:13:08] banning, um, CBDCs was signed, um, central
[01:13:12] bank digital currencies, um, an explicit
[01:13:15] reason why CBDCs should never be
[01:13:17] adopted in the United States was the
[01:13:20] privacy concern. Because if we look at
[01:13:22] all of those new digital shiny
[01:13:24] currencies being built in Europe um and
[01:13:27] all around the world, I guess besides
[01:13:29] the US, which is great, which which
[01:13:31] which actually um is amazing, I think um
[01:13:36] is that all of them are
[01:13:39] surveillance um machines to even a
[01:13:42] higher degree than the current financial
[01:13:43] system is already, right? It is already
[01:13:45] a um a surveillance system. But what's
[01:13:49] so important about this next generation
[01:13:51] of of money is um we're sort of at a
[01:13:54] crossroads. Do we want our money to
[01:13:58] enable us freedom, freedom of economical
[01:14:00] interaction, freedom of thought at the
[01:14:02] end of the day because whatever we think
[01:14:04] we do, right? Where we we want to put
[01:14:06] our money where our mouth is. Um or do
[01:14:11] we want a monetary system that enables
[01:14:16] automatic subsequent action um based on
[01:14:19] whatever activity you perform in your
[01:14:21] digital life which can mean things like
[01:14:23] now all of your money is frozen and you
[01:14:25] don't have any access to it anymore
[01:14:27] because whatever you just did um was
[01:14:31] deemed as undesirable by big brother I
[01:14:34] guess right so that is that is literally
[01:14:36] the the two possible futures that we
[01:14:38] have. It's two extremes. Um there's no
[01:14:41] no possible future in between. Um and
[01:14:46] what they've the architects of of
[01:14:48] >> So you're assuming cash is over.
[01:14:51] >> Um, cash already is also, um, being heavily
[01:14:54] surveilled. So your bank note has a
[01:14:56] serial number. So if you actually think
[01:14:57] about something like, um, Tornado Cash or
[01:15:00] all of the I mean there's there's a lot
[01:15:02] of applications that for example utilize
[01:15:04] rum to also bring this level of privacy
[01:15:07] right um if you think about all of these
[01:15:09] these these systems
[01:15:12] they are in my mind personally I mean as
[01:15:15] long as you have an internet connection
[01:15:16] if you don't have an internet connection
[01:15:18] maybe um you you cannot spend your money
[01:15:21] right now um but as long as that exists
[01:15:24] even superior to cash because you don't
[01:15:27] have any serial numbers anymore, right?
[01:15:28] >> Wait, so you say cash is being
[01:15:30] surveilled?
[01:15:32] >> Sure. I mean, when I go to the ATM and
[01:15:34] withdraw money, the serial numbers are
[01:15:37] recorded in some database and when a
[01:15:39] merchant um at at Walmart, I guess, or
[01:15:42] wherever, puts that into their cash
[01:15:44] register, um, they can also record the
[01:15:47] serial number. So, um,
[01:15:48] >> is that true?
[01:15:49] >> Yeah, there there has been I I read an
[01:15:51] article a few months ago about a
[01:15:54] tracking system like that within Europe.
[01:15:56] So that is very practical and yeah
[01:16:00] >> I'm going to take a a magic marker a pen
[01:16:02] and distort the serial numbers in all my
[01:16:04] cash now.
[01:16:05] >> Yeah. Right. So I mean it should be
[01:16:08] should still be legal tender, right?
[01:16:10] >> I would think so. I'd never heard of
[01:16:11] that.
[01:16:11] >> I mean I mean there there could be other
[01:16:13] tracking mechanisms. I don't know. But
[01:16:14] but I've read about this technology
[01:16:16] which clearly exists um and is being
[01:16:20] used to even turn the cash system into
[01:16:22] a surveillance system, and,
[01:16:26] um it's not even again I think all of
[01:16:30] this is not even just
[01:16:33] um someone with governmental authority
[01:16:36] deciding to surveil people right it is
[01:16:38] also companies seeing
[01:16:40] economic value in surveilling you, um,
[01:16:43] and then utilizing
[01:16:44] this new technology utilizing the
[01:16:46] internet um to to do that and um it
[01:16:50] boils down to power I would say control
[01:16:52] right um if you have access to as much
[01:16:54] information as possible you can better
[01:16:57] prepare for the future and you can
[01:16:59] predict behaviors of your users or
[01:17:01] different actors and so that's why those
[01:17:03] those systems get implemented um so we
[01:17:06] are at this fork in, um, the
[01:17:10] path towards the future and what the
[01:17:13] people that are architecting those
[01:17:15] central bank digital currency systems
[01:17:18] have realized and that's um so
[01:17:20] interesting to me is this old concept
[01:17:22] that the cypherpunks in the 1990s, um,
[01:17:25] came up with which is code is law um
[01:17:29] which expresses what has happened
[01:17:30] with Tornado Cash, I think, nicely, where
[01:17:33] um it is the ultimate law sort of when
[01:17:36] you have this network that nobody
[01:17:38] controls and there's some piece of
[01:17:39] software and it just executes whatever
[01:17:42] is written within that software
[01:17:43] code. There's no way of stopping
[01:17:46] it, there's no way of doing anything
[01:17:48] about it. And so that's what they mean when
[01:17:50] they say code is law. And the
[01:17:53] architects of those alternative systems
[01:17:54] have realized that there's so much power
[01:17:57] in being able to let's say take your
[01:18:00] chat messages and see that you have said
[01:18:03] something against big brother and big
[01:18:04] brother doesn't appreciate that,
[01:18:06] right and so automatically now your
[01:18:09] money um is frozen and that is code is
[01:18:13] law, right? In the utopian sense and in
[01:18:16] the dystopian sense where software
[01:18:18] automatically can lock you out of all of
[01:18:20] those systems. And I would much rather
[01:18:23] um have a utopian future than dystopian
[01:18:25] future. But at the end of the day, from
[01:18:28] a technological standpoint, those things
[01:18:30] are similar. The only difference really
[01:18:32] is cryptography,
[01:18:35] >> privacy,
[01:18:36] >> privacy.
[01:18:37] >> Because you're offering that on a scale
[01:18:40] even larger than anything Tornado Cash
[01:18:42] or Roman Storm attempted,
[01:18:45] it has to have occurred to you that
[01:18:47] whether or not you have prominent
[01:18:49] investors, like you face some risk.
[01:18:52] >> Sure. So I think um
[01:18:55] what I'm doing with Arcium at
[01:18:58] the end of the day is I'm providing the
[01:19:00] most versatile and superior form of
[01:19:04] encrypted computation: you can
[01:19:08] execute a computer program within
[01:19:10] encryption, and you can have many people
[01:19:12] contribute encrypted data, and you can do
[01:19:14] all sorts of things. You can do things
[01:19:16] starting with um financial transfers,
[01:19:20] right? You can add privacy to financial
[01:19:21] systems. But that doesn't just mean
[01:19:23] we're adding privacy to me and you
[01:19:26] Tucker interacting with each other. We
[01:19:28] can also add privacy to entire markets,
[01:19:30] right? Which again can also have
[01:19:32] downsides. I'm I'm not arguing that that
[01:19:34] there's only upsides with this
[01:19:35] technology. There might be actors that
[01:19:37] then utilize that um not not just
[01:19:41] talking about criminal activity, but
[01:19:42] just unethical activity, right? The way
[01:19:44] that people may interact. So um at its
[01:19:47] core it is neutral technology. Um but
[01:19:50] the use cases that that I'm really
[01:19:52] focused on enabling also are use cases
[01:19:55] like enabling the healthcare
[01:19:57] system to actually utilize data that
[01:20:00] currently is being stored but it is
[01:20:03] being stored in a very inefficient way
[01:20:05] where it's isolated. Right? So with my
[01:20:08] technology, we can take this data and
[01:20:10] use it without ever risking that data to
[01:20:13] be exploited, without ever taking
[01:20:15] ownership of your data because you're
[01:20:17] the patient, you're the human, right? I
[01:20:18] have no right to to take ownership over
[01:20:20] that. And I don't need with that
[01:20:22] technology because you can consent and
[01:20:23] say, let's improve healthcare or
[01:20:27] whatever with my data, but you're not
[01:20:29] getting my data because it's encrypted,
[01:20:31] right? This is um I don't know. It's a
[01:20:33] crazy concept to wrap your head around.
[01:20:35] I I I get it. It enables so much also on
[01:20:38] a national security level that it is
[01:20:40] strictly superior technology. And I
[01:20:42] think this example that I told you
[01:20:44] earlier about verifiability, right? Um
[01:20:47] mathematically
[01:20:49] being able to be convinced that a
[01:20:52] computer um program, a computation that
[01:20:55] has been executed in privacy, right? Um
[01:20:58] has been executed correctly is such an
[01:21:01] amazing concept. And and the way I I
[01:21:04] think about it really is opening up a
[01:21:06] new design space um altogether and
[01:21:11] allowing companies to do actual
[01:21:13] innovation instead of innovating only on
[01:21:16] the front of how can I extract as much
[01:21:18] value as possible from my user by
[01:21:20] surveilling them. Um, so I don't really
[01:21:23] I don't really think about it the way
[01:21:26] that you frame it. I'm building this
[01:21:27] generalized computing platform that can
[01:21:30] be used by anyone um because I don't
[01:21:34] have any control over it. Right? I'm not
[01:21:36] building a controlled infrastructure.
[01:21:38] I'm building open um software that is
[01:21:42] used for good.
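The idea of computing on data while it stays encrypted can be illustrated with textbook Paillier encryption, a classic additively homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a third party can combine values it never sees in the clear. This is a toy sketch with deliberately tiny, insecure primes, and it is not Arcium's actual protocol (which is far more general); it only shows the core concept.

```python
import math, random

# Textbook Paillier key generation with tiny primes (insecure, illustration only).
p, q = 1009, 1013
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)  # simplification valid because we fix g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:  # r must be invertible mod n
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    # L(x) = (x - 1) // n recovers m * lam mod n; multiply by mu = lam^-1.
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# A third party multiplies ciphertexts without ever seeing 20 or 22...
c_sum = (encrypt(20) * encrypt(22)) % n2
# ...yet only the key holder can decrypt, and gets the correct sum.
assert decrypt(c_sum) == 42
```

The design point is the separation of roles: whoever performs the computation learns nothing, while the data owner keeps the sole ability to decrypt the result.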
[01:21:43] >> And I'm grateful that you are. And I
[01:21:45] don't at all mean to make you
[01:21:46] pessimistic or paranoid, but in so
[01:21:48] doing, you're threatening
[01:21:51] current stakeholders.
[01:21:54] Um,
[01:21:56] sure. But I think that's that's that's
[01:21:57] always the case with with new
[01:21:59] technology, right? Of course. I mean,
[01:22:02] when when cars first came along, right,
[01:22:05] there were unions of, um,
[01:22:09] horse-carriage taxi ride providers. They
[01:22:12] did not want to see cars on the road,
[01:22:14] >> of course.
[01:22:14] >> So, there's always um interests that um
[01:22:19] try to utilize both technology
[01:22:22] and law, um, to prevent others from
[01:22:27] >> keep the current monopoly in place. Of
[01:22:28] course, the stakes depend
[01:22:31] entirely on how disruptive the new
[01:22:32] technology is.
[01:22:33] >> Yes.
[01:22:34] >> Ask Nikola Tesla.
[01:22:35] >> Yeah.
[01:22:36] >> Um right. Sorry, Dark. But
[01:22:39] >> so, it's not a concern. It is not a
[01:22:41] concern for me. No.
[01:22:43] >> H I wonder if that's just a quirk of
[01:22:46] your personality where you're just not
[01:22:47] afraid of stuff.
[01:22:49] >> That's actually an issue, I would say.
[01:22:52] I sort of suffer from
[01:22:54] sometimes not being afraid of things.
[01:22:56] But
[01:22:57] >> good.
[01:22:57] >> I think it's
[01:22:57] >> I think you need that in order to
[01:22:59] proceed.
[01:23:00] So from the perspective of the average
[01:23:03] American consumer who's not following
[01:23:05] this carefully, when does your life
[01:23:07] begin to look different as a result of
[01:23:09] this kind of technology? When will you
[01:23:10] see this sort of thing in action? How
[01:23:12] will you experience it?
[01:23:15] >> It's actually a brilliant question. Um
[01:23:21] I think — just trying to run numbers in my head and trying to predict the future.
[01:23:27] >> That's something I've never done, by the way. I've never paused mid-conversation because I've got to run some numbers in my head.
[01:23:35] >> I do this all the time.
[01:23:36] >> Yeah, I never have.
[01:23:37] >> So I think it will affect your everyday life positively once, I guess, an inflection point is reached on multiple fronts, right? I was talking about healthcare and national security, also the financial system. But — I mean, that's a criticism I actually have of Signal: there exists one single point of failure within Signal's technological stack that I've been vocal about and that I dislike, which is what they call private contact discovery. I have a set of contacts on my phone, right? You do the same thing. And if there is an intersection between the two sets we have, where I have you as a contact and you have me as a contact —
[01:24:32] >> Yes.
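The contact matching being described — each side uploads (hashed) contacts and the service computes the intersection — can be sketched with a toy example. This is purely illustrative, not Signal's actual implementation; it shows why naive hashing alone doesn't protect phone numbers, which is why Signal moved this step into trusted hardware in the first place:

```python
# Naive contact-discovery sketch (illustrative only, NOT Signal's code).
# Each client uploads hashes of its contacts; the server intersects them.
# Because phone numbers live in a tiny space, whoever runs (or breaks into)
# the server can brute-force the hashes back to numbers.
import hashlib

def h(phone_number: str) -> str:
    return hashlib.sha256(phone_number.encode()).hexdigest()

def upload(contacts):
    return {h(c) for c in contacts}

def server_intersect(a_hashes, b_hashes):
    return a_hashes & b_hashes

alice = upload(["+15550001", "+15550002"])
bob   = upload(["+15550002", "+15550003"])
mutual = server_intersect(alice, bob)        # the shared contact's hash

# Brute-force recovery: enumerate the small space of dialable numbers.
dialable = ["+1555%04d" % i for i in range(10000)]
recovered = [n for n in dialable if h(n) in mutual]
```

Here `recovered` comes back as `["+15550002"]` — the "protected" hash reveals the shared contact to anyone holding the server's data, which is the single point of failure being criticized.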
[01:24:33] >> I get Tucker suggested on Signal, right? Only in that case. How does that work? How does Signal ensure that those contacts are encrypted and secure? They use trusted hardware for that, and that is a critical flaw within their infrastructure. So there's a technology — trusted execution environments, they're called — manufactured by Intel, for example. And this technology comes with this promise of being secure, of basically being able to do what we're doing with mathematics, but with trust instead. So they say: we build secure machines.
[01:25:14] >> You think we shouldn't trust Intel?
[01:25:16] >> I think so, yes.
[01:25:23] >> You're required to trust Intel.
[01:25:24] >> Yeah. I think it's an insane idea to begin with, and — it's been funny — not just last year but over the last 10 years there have been a myriad of exploits of that technology.
[01:25:37] So in the past it has always been sold sort of as: here's this technology, it does verifiability and privacy, just put your data in it. There's no back door, right? Of course not. Why would there be a back door?
[01:25:54] >> why would Intel cooperate with anyone?
[01:25:56] >> Sure. Right. And you would do that. And then last year there were those researchers who said: well, if you have physical access to this computer, you can just read out all of the data — and not only read out all of the data, you can fake keys and then perform fake computations on behalf of other people. So if you're building a financial system with a computer like this, I can just change numbers. I know what your numbers are and I can change them. And that's not even the core issue I have with it in the case of Signal. So Signal is, I think, still relying on that tech. I think they run this hardware — I mean, I hope they run the hardware, because at least then I have a little bit of a remaining trust assumption that, okay, they will not try to hack those PCs, which is relatively straightforward: you just connect a few cables at the end of the day. And then you can exfiltrate the information, which is the interactions: Is Tucker my contact? Is Yannik Tucker's contact? That's very sensitive information. And so that is a single point of failure, where they — or whoever gets access — could access that information. And we're not even thinking about potential back doors at that point, within that hardware, within the manufacturing process. I mean, I think it would be very naive to assume that there's no back door — similar to what we talked about earlier with Dual_EC, or something like the Clipper chip that was attempted in the '90s. So it's very likely, I would say, that there's some randomness tampering, let's call it that, in place — because you are literally also getting keys from the manufacturing process. It's this proprietary supply chain: they ship that computer to you and it comes with random keys that have been generated on that proprietary production line. So there are many single points of failure, and that's what I don't like about Signal, because I don't want this information out there — what does my address book look like? They can fix that with technology that we've built. They can use our technology; I'm more than happy to just give it to them. I mean, it's open source. And then they can build this thing without a single point of failure — without giving a state a reasonable way to say, well, you actually have this data, give us this data, where they cannot really argue that they don't have it, because someone could connect a few cables to that computer and get it. So it's not the secure device that people claimed in the past it was. So I think that is important to resolve. I actually don't recall how I got to that tangent.
[01:28:56] >> I wonder if any big hardware manufacturer will begin to offer truly secure devices for sale — or it's probably not worth it, right?
[01:29:11] >> So I think it is worth it, right? You as a military want to have secure devices. Everyone, I think, would rather compute on a secure device than an insecure one.
[01:29:25] >> But the manufacturers aren't making their money from the devices. I mean, they're making money — I don't know what it costs to make an iPhone, less than 900 bucks — but it's an annuity. You know, the second you buy an iPhone, you're making money for the company every day you use it, right?
[01:29:41] >> Sure. So I think it is impossible to build secure hardware in that regard, where those claims of full privacy and security are factually true. It's impossible. There have been so many techniques, so many different tools to play around with those devices, that it is literally impossible to implement secure and verifiable systems — because even to verify them you need to take them apart, sort of destroying them in the process. So that does not exist.
[01:30:19] What I think does sort of exist, however, is this concept of decentralization, and that's why it's so powerful. It doesn't really matter if this manufacturer here creates a backdoor, as long as I have 10 or 100 different computers from different manufacturers and there is one that does not have a full system-level backdoor installed — I am secure under the trust model that we've developed at our company. So I think that's why decentralization is so important.
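The trust model he describes — secure as long as at least one of N machines is honest — is what additive secret sharing, a standard building block of multiparty computation, provides. A minimal sketch under that assumption (illustrative, not Arcium's actual protocol):

```python
# Additive secret-sharing sketch (illustrative; not any specific product's code).
# A secret is split across N machines so that ALL N shares are needed to
# reconstruct it. If even one machine keeps its share private (no backdoor),
# the remaining N-1 shares are uniformly random and reveal nothing.
import secrets

P = 2**61 - 1  # prime modulus; all arithmetic is mod P

def share(secret: int, n: int):
    """Split `secret` into n shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

s = 1234567
sh = share(s, 10)          # spread across 10 machines from 10 manufacturers
ok = reconstruct(sh) == s  # only all 10 together recover the secret
# Any 9 shares sum to a uniformly random value mod P: a coalition of 9
# backdoored machines learns nothing about s without the 10th share.
```

The design point is that compromise must be unanimous: one honest manufacturer among N is enough, which is the opposite of the single-point-of-failure trusted-hardware model.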
[01:30:50] >> that was the basis of our political
[01:30:52] system when it was created. That same
[01:30:53] concept power is dangerous and so it has
[01:30:55] to be spread among different holders,
[01:30:57] different entities so it doesn't
[01:30:59] concentrate and kill everybody and
[01:31:01] enslave them. That's obviously going
[01:31:02] away. But that was that was the concept
[01:31:04] of the American republic.
[01:31:07] >> Yeah, exactly. And I think it is important to look at surveillance in the same way: if you have access to surveillance, you basically have access to unlimited power. So whatever surveillance system we implement — be it chat control in the European Union, which I've been very vocally opposed to on X — and I actually just learned last week that the UK implemented their version of chat control on the 8th of January, which is a censorship machine and a surveillance back door installed within all of your messaging applications. And it comes with this claim of: well, we're implementing this because we need to fight child exploitation. There's always one —
[01:32:04] >> Child exploitation. They care about the children.
[01:32:07] >> Yeah. I strongly believe that. So there are basically four reasons to implement surveillance. There's child exploitation —
[01:32:19] >> There's terrorism.
[01:32:20] >> Yes. There's money laundering —
[01:32:22] >> And there's the war on drugs.
[01:32:25] >> Those are the four reasons, right? And they always rotate.
[01:32:27] >> The people engaged in importing drugs
[01:32:29] into our country, laundering the money,
[01:32:32] exploiting the children, and committing
[01:32:34] serial acts of terror against their own
[01:32:36] population. They're all very concerned.
[01:32:37] >> Oh man, I really think we need surveillance now that you say it.
[01:32:41] >> Not of us.
[01:32:42] >> Yeah. So what's so funny is that in 1999, some policing working group of the European Commission — there was a transcript of their discussions, and literally within the transcript, when they were talking about implementing digital surveillance systems, they were like: I think we should switch our arguments over to child exploitation, because that is more emotionally charged, that convinces people. And so it's not just that, for us, it is obvious that that's not what's going on. Right. There's —
[01:33:17] >> When the people who covered up the grooming gangs are making that case, it's like — I don't think it's sincere at this point.
[01:33:23] >> Exactly. So there is a reason why we don't believe that that's the actual reason. But what I'm arguing is that it doesn't even matter. Even if the politicians are convinced that it's about protecting the children, and that surveilling all of the chats is the most effective measure to do that — what's going to happen, thanks to this being implemented as infrastructure that exists everywhere and there being a small circle of people with access to this technology, is that it will get abused. It is very easy to abuse those systems, because the abuse itself happens in secrecy. So there's no oversight —
[01:34:06] >> Of course, and instantaneously, because of the rising computational power. It's not like someone has to go to the Stasi archives to read all the files.
[01:34:14] >> And Sam Altman will gladly help you sift through all of your —
[01:34:20] >> A good guy.
By the way, a lot of these businesses draw the worst people — like, the most unethical people have the most power, in case you haven't noticed. It's wild.
[01:34:29] >> It is wild, yeah. I mean, there's an economic function that sort of rewards this, right? Because if I build an application and you build an application and we just provide some value to our user, and the user pays for that — basically capitalism — all of that works out nicely. But then you decide: ah, what if I take all of this information from my user and use it to extract additional value from him? You're way more profitable through that.
[01:35:03] >> So the incentives —
[01:35:04] >> And so those incentives shift towards that setup; these kinds of applications are the ones that receive investment, so that increases, and unethical behavior gets rewarded in the system.
[01:35:17] >> Just to be clear about what you're saying: are you saying that all texts sent within the UK are now monitored by the UK government?
[01:35:24] >> I'm not 100% familiar with all the intricacies of the Digital Services or Online Safety Act, I think it's called, in the UK. What is happening there is that censorship is being applied to the messages. So you receive some unsolicited image, and that's being censored. What I think is important to understand is that censorship is a byproduct of surveillance, generally speaking.
[01:35:55] >> Yes.
[01:35:56] >> You need to take a look at all messages in order to censor something. And so that's what's happening there. And even if we assume only the best of intentions, you have this infrastructure in place that tomorrow can just be abused.
[01:36:16] >> Well, we should test it. I'm in the UK all the time — I have family there — and I'm going to do a double-blind study with my wife. I'm going to text every person on my contact list: "Overthrow Keir Starmer."
[01:36:26] >> Okay. Yeah.
[01:36:28] >> To thousands of people, exclamation point. And she won't. And we'll see who gets arrested.
[01:36:32] >> Yeah, that's a great experiment. Actually, I need to attend a conference in the UK this year. And it's so funny, because a month ago there was, I think, some proposal that basically specifies that people who work on encryption are sort of persona non grata in the UK. Something like that. I think it's not yet implemented, but I saw that on X.
[01:36:58] >> You mean you can't get into the country if you're for privacy?
[01:37:00] >> Something like that, yeah.
[01:37:02] >> Where are we going, big picture? Where is everyone going to end up, do you think, if the control grid snaps into place — and it is snapping into place? Where do people go? Is the US the only place?
[01:37:16] >> So — all of those... I mean, we're basically, I would say, not just sliding in that direction but galloping. And it has been quite a while since they started trying to implement those in-your-face things, right? They literally call it chat control. I mean, imagine how crazy that is. Literally stating: every single messaging platform, email, whatever — we need to scan it for this made-up reason. But trust us, we will only do that for this made-up reason and no other reason. And it happens on your device, right? So that's why, they say, end-to-end encryption is not undermined — because it is being scanned on your device.
[01:38:07] >> And that's very different from putting microphones in your bedroom. Trust us — it's very, very different.
[01:38:11] >> Yes. I mean, I think people don't realize the extent to which surveillance is possible nowadays. With Wi-Fi routers you can determine movements within your apartment, right? And there was this one case — I mean, it wasn't a big scandal. I don't know if you're familiar with Louis Rossmann, a YouTuber from New York who has been fighting for the right to repair devices and such. He's always been very much an advocate of those efforts, and he made this video where he went through the privacy policy of some internet service provider, and that privacy policy explicitly stated that they're allowed to monetize the movement data they get from those devices that you put in your home. And the funny thing about the case he was highlighting is that, as a person living in that building, you didn't even have the option to choose a different internet service provider, because with, I guess, bulk agreements between the landlord and the internet service provider, you are forced to have those routers — and those routers aren't even within your apartment; they're in the walls or somewhere. And so you're being scanned within the most intimate area of your life, your home, by your internet service provider.
[01:39:58] >> And what about phones listening to people — the microphone on the phone, or the camera on the phone taping you?
[01:40:05] >> So there's an interesting concept of ultrasound listening on those phones, where basically you have a TV advertisement, and we don't hear ultrasound, right? But your phone, with its microphone, could hear it — I don't know if it's ultrasound or whatever frequency. So: within that advertisement we're going to play that sound, your phone can pick it up, and then when you go to our fast-food restaurant on the same day, we know that this advertisement has worked, because your phone previously registered it. There have been a lot of attempts like this; I think that surfaced a couple of years ago. I don't recall the exact name of the technology, but there were actually court cases against it, where they required the company that offered the technology to make the user aware that this is happening — because a lot of apps had this technology installed, they had microphone permissions, and they just installed this library because maybe that library pays the app developer some money. And in the end it is tracking you. So what I'm trying to say is there's a sort of infinite number of ways you can be tracked. I mean, just last year in the US there were those cases surfacing surrounding city surveillance cameras — around 40,000 of these, I think, exist in the US — and those cameras, and also license plate readers, all of that, are incredibly smart, equipped with artificial intelligence to directly track the faces of humans. And there was this one YouTuber, Ben Jordan, who actually exposed that — and funnily enough, after exposing it, got private investigators from said company sent to his home to, I guess, fully destroy his privacy. But he helped expose that none of these cameras were encrypted. So they were recording, in cities across the US, permanently, 24/7, storing everything, everyone being mass-surveilled, while anyone could, via a Google search and some specific query, get access to the camera feed and see what is going on. And he showed videos of playgrounds where children were playing. So that's what I mean when I say that surveillance does not bring us safety or security: in most cases it is doing the opposite.
[01:42:58] >> It's also all networked. It's digital and it's networked. So that means that companies can pull up CCTV cameras from around the world.
[01:43:05] >> Oh yeah, facial recognition. Anyone can. And what I really found so striking about the story is him outlining how he was able to follow people around — able to say, "Oh yeah, they went to church here on Sunday and then they went there for shopping." That is insane, right? And there was this one video of an adult man just going onto a completely empty playground, hopping onto the swing, and just swinging there. If this person had known that he was being watched, he would never have done that, right? And so this idea of escapism is entirely impossible in a world like this —
[01:43:49] >> because there is no escape.
[01:43:50] >> There's no escape, yeah. Also with license plate readers — which aren't license plate readers. They are surveillance cameras that pretend to only do a specific function.
[01:44:04] >> What other functions do they do?
[01:44:05] >> I mean, record everything, and be able to track cars even if they don't have a license plate. You cannot be just a license plate reader if one of your capabilities is to help identify cars that don't have a license plate, right?
[01:44:19] >> Fair.
[01:44:20] >> So, I just recall one case where there was a police officer who used this access to the technology to stalk his ex-girlfriend — which is inevitable with this kind of technology, if you put that power into the hands of individuals who can use it in secrecy. It's not like dropping a nuclear bomb on a country — people will notice. Mass surveillance, nobody notices.
[01:44:51] >> If people made it two hours into this interview, they're obviously interested in you. First, can you pronounce and spell your name?
[01:44:59] >> Yannik Schrade. Y A N N I K, S C H R A D E.
[01:45:07] >> The name of your company and its
[01:45:08] spelling.
[01:45:09] >> Arcium. A R C I U M.
[01:45:13] >> How do you speak English as fluently as
[01:45:15] you do since it's your second language?
[01:45:22] It's funny — as a child, and when I was in high school, there were phases where, because I was consuming so much English content on the internet, I was consciously thinking in English. Yeah, I would say that's it.
[01:45:42] >> you're on Twitter. Where else can people
[01:45:44] go to read your views on technology and
[01:45:47] privacy?
[01:45:48] >> Mainly on my Twitter, at Yr Trade. And I also have a small website — just my personal website, I guess. I don't have a blog there; I write all of my articles basically on Twitter. Sometimes I get the chance to publish my views in some very niche news outlets in Germany, but most news outlets don't really care about privacy. So I stick with X — I really like talking on X, sharing my thoughts, writing articles there.
[01:46:31] When I talked about chat control specifically on X — it's so funny, we haven't even touched on how chat control is aimed to be implemented in the European Union with the current proposal. What happened is that there was this proposal where they said all providers need to have chat control, which is so-called client-side scanning: Tucker's phone is going to check the message that Tucker is sending right now, whether that message is illicit under some definition, and if so, it's going to send a message to the police. That is what client-side scanning is.
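That flow — scan on the sender's device before encryption, report or censor on a match — can be sketched as follows. This is a hypothetical illustration of the general client-side-scanning idea (hash-matching against a watch list); it is not any specific law's or vendor's implementation:

```python
# Client-side scanning sketch (hypothetical illustration of the concept).
# The scan runs on the sender's device BEFORE encryption, which is why
# proponents claim end-to-end encryption is "not undermined" -- and why
# critics call it a back door: the report path bypasses the encryption.
import hashlib

# Made-up watch-list entry for the example.
WATCH_LIST = {hashlib.sha256(b"illicit example").hexdigest()}

def send_message(plaintext: bytes, encrypt, report_to_authorities):
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in WATCH_LIST:
        report_to_authorities(digest)  # fires before any encryption happens
        return None                    # censor variant: refuse to send
    return encrypt(plaintext)          # normal end-to-end path

reports = []
toy_encrypt = lambda m: m[::-1]        # stand-in for real encryption

ciphertext = send_message(b"hello", toy_encrypt, reports.append)
blocked = send_message(b"illicit example", toy_encrypt, reports.append)
```

Here `ciphertext` goes through normally while `blocked` is `None` and one report has been emitted — the "voluntary" scan decides, on-device, what leaves the device at all.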
[01:47:16] And in its most innocent form, I guess, it would just be: we're going to censor the message — because of, I don't know, child exploitation or whatever made-up reason. In the worst case, it would be: we're going to forward that message. And the law that they had received a lot of backlash — also thanks to Elon Musk — and didn't pass. Then, as you would expect, shortly after — I think it was less than a month — they came back with a new proposal, and the new proposal made it voluntary. So the new proposal basically states: hey, Mark Zuckerberg, do you want to voluntarily add a surveillance mechanism to your applications? Which is insane, right? Because of course companies will voluntarily implement those surveillance mechanisms.
[01:48:13] But if you go down the different paragraphs in that proposal, what you will realize is that it is in fact not voluntary. What you will realize is that, in order to combat child exploitation, the European —
[01:48:31] >> Terrorism, money laundering.
[01:48:33] >> Yes. So in order to do that, they're going to introduce a new bureaucratic agency tasked with risk-assessing different platforms. We're going to look at Signal, we're going to look at WhatsApp, we're going to look at Gmail — every single platform we're going to risk-assess — and then we're going to decide how risky that platform is. If it's risky, then we apply coercive measures, and they need to implement all measures to combat whatever illicit activity is targeted — which in the case of child exploitation explicitly means, because that's the only thing you can do, scanning those messages. And so it is not voluntary after all, because it explicitly says that if you don't want to land in the high-risk category, just voluntarily scan, and then you're not in that category.
[01:49:33] >> That's — in the US, that's called extortion.
[01:49:36] >> Yeah.
[01:49:36] >> You don't have to give me your money, but I'll shoot you.
[01:49:38] >> Yeah. But feel free not to give it to me.
[01:49:42] >> It's your choice.
[01:49:43] >> Yeah. Last question: you're 25 years old, which is remarkable. Where do you imagine you'll be at 45?
[01:49:53] >> At 45. You mean —
[01:49:59] >> What will you be doing? What will the world look like?
[01:50:03] >> What the world will look like. I'm a very optimistic person. So while there are those two trajectories, right, that I think not just the United States but humanity in general will take — one or the other — I strongly believe that we will be able to move in the utopian direction instead of the dystopian direction. And so what that means for what I need to achieve is to not just tell people about the importance of this. People sort of know that privacy is important; I think most of your audience realizes that, otherwise I feel like they wouldn't be listening to you. So it is of course about education and such, but more importantly — and that's the core realization that I had — privacy is only going to get adopted if it enables strictly superior technology. And so that's what I'm doing. That's the mission. That's what I'm doing with Arcium: to enable a situation in which you sort of have to adopt it, because it would be irrational not to. And I think we can end up in a world like this, because that's what it needs.
[01:51:23] >> You're exactly right. It's not enough to say, "We're not fully human without it."
[01:51:28] >> Yeah.
[01:51:28] >> The board of directors is going to say,
[01:51:30] "Well, yeah, but look at the returns."
[01:51:32] >> Exactly. Right.
[01:51:33] >> Yeah.
[01:51:34] >> I can't thank you enough. If our viewers knew how this interview came about — they would never believe it. So I'm not even going to say how this interview came about, but it was through a series of chance encounters that really felt like the hand of God.
[01:51:53] Thank you very much for doing this,
[01:51:55] Yanik.
[01:51:55] >> Thanks for having me, Tucker. I
[01:51:57] appreciate it.