After Words: Sam Bankman-Fried, Elite Fraud, and the Cult of Techno-Utopia
[00:00:03] In 2024, Sam Bankman-Fried was sentenced
[00:00:06] to 25 years in prison after committing
[00:00:09] one of the largest financial frauds of
[00:00:11] the 21st century. But what motivates
[00:00:14] someone to steal $11 billion?
[00:00:18] Journalist David Morris calls his book
[00:00:20] on Bankman-Fried a work of, quote,
[00:00:23] "forensic philosophy."
[00:00:25] He goes beyond the facts of the case to
[00:00:27] describe the increasingly popular
[00:00:29] ideologies of techno-utopianism,
[00:00:32] libertarian idealism, and
[00:00:34] utilitarianism.
[00:00:36] And he argues that Silicon Valley's
[00:00:38] imagination of the future puts our
[00:00:41] democracy at risk in the present.
[00:00:44] Coming up on After Words, journalist
[00:00:46] David Morris talks about his book,
[00:00:48] Stealing the Future: Sam Bankman-Fried,
[00:00:51] Elite Fraud, and the Cult of
[00:00:53] Techno-Utopia.
[00:00:55] This is After Words from C-SPAN. This
[00:00:58] week, journalist David Morris talks
[00:01:00] about the fallen cryptocurrency mogul
[00:01:02] Sam Bankman-Fried. He sat down with
[00:01:05] journalist Benjamin Schiller at
[00:01:07] Powerhouse Arena in New York City in
[00:01:09] November of 2025.
[00:01:12] >> Thank you very much.
[00:01:14] >> Thank you so much.
[00:01:15] >> Great to be... I'm going to work on that
[00:01:17] positioning.
[00:01:18] >> Great to be here. Thanks all for for
[00:01:20] coming and uh welcome to our listeners
[00:01:23] and viewers at home. Um so David uh you
[00:01:26] know you and I have worked together I
[00:01:28] think off and on for maybe 10 years. Um
[00:01:31] >> something on that order
[00:01:32] >> something like that. Uh Breaker magazine
[00:01:35] was an early crypto publication and then
[00:01:38] at CoinDesk as um our introducer was
[00:01:41] saying uh broke the story of uh Sam
[00:01:43] Bankman-Fried uh three years ago. So, it's
[00:01:45] been almost exactly 3 years the day that
[00:01:48] he was
[00:01:48] >> Exactly. In fact, November 11th, 2022
[00:01:51] was when the exchange closed down for
[00:01:54] withdrawals. So, we're exactly at the...
[00:01:58] Took me a second. Sorry, I'm recognizing
[00:02:00] people in the audience. Um, yeah, three
[00:02:02] years exactly to the to the collapse and
[00:02:05] closure,
[00:02:06] >> right? So, and it took about a year for
[00:02:07] him to be uh prosecuted and then uh you
[00:02:10] know sentenced to prison for 25 years
[00:02:12] and he's currently in a federal
[00:02:14] penitentiary somewhere.
[00:02:15] >> Yeah.
[00:02:16] >> Um and he's been playing this game of
[00:02:18] presidential pardon uh with the Trump
[00:02:20] administration which has been successful
[00:02:22] for some
[00:02:23] >> crypto criminals or people who went to
[00:02:25] prison, but not so far for
[00:02:27] >> and I think unlikely to be successful
[00:02:29] for him for reasons that we can talk
[00:02:31] about at some point maybe. But um yeah,
[00:02:33] I think I think it's probably going to
[00:02:35] do the whole bid,
[00:02:36] >> right? So uh we're going to get into why
[00:02:39] Sam Bankman-Fried really matters uh as
[00:02:41] as a case in crypto and kind of the
[00:02:43] larger financial story of this this
[00:02:45] country, the world. Um but first of all,
[00:02:47] you're going to give us a quick uh
[00:02:49] reading. So um
[00:02:50] >> so let's make this awkward transition.
[00:02:53] >> I don't feel awkward. I think I think it
[00:02:55] feels very natural.
[00:02:56] >> Um so first of all um thanks for thanks
[00:03:00] everybody for for being here. I have a
[00:03:02] lot of friends and family here. Uh,
[00:03:04] first and foremost, my wife Georgia
[00:03:06] Herdis, to whom the book is dedicated.
[00:03:08] Uh, my great good friend Pam Bell is out
[00:03:10] in the audience. Came all the way from
[00:03:12] Florida. My parents are here in the
[00:03:13] back. Um, so I just want to open... Well,
[00:03:19] I'm going to do one thing really quick
[00:03:21] and watch my time. Um, so I want to read
[00:03:26] a section from chapter 3. I'm going to
[00:03:28] start with the epigraphs from chapter 3
[00:03:32] which oh sorry I'm making you write
[00:03:35] that. Um I'll keep it down here I swear.
[00:03:37] Um so there are two quotes that open
[00:03:41] this third chapter. The first
[00:03:45] Sam will never speak an untruth. It's
[00:03:48] just not in him. Barbara Fried,
[00:03:51] quoted in The New Yorker, 25th September
[00:03:54] 2023.
[00:03:57] Second epigraph, quote, I did not think
[00:04:01] it a fruitful use of time to spell out
[00:04:04] every time I thought Mr. Bankman-Fried
[00:04:07] testified willfully and knowingly
[00:04:09] falsely at trial. There are more than
[00:04:12] the ones I've articulated, but that
[00:04:14] suffices. I've been doing this job for
[00:04:16] close to 30 years. I've never seen a
[00:04:19] performance quite like that. Judge Lewis
[00:04:22] Kaplan, 28th March, 2024.
[00:04:28] Skipping ahead a little bit. So, um, one
[00:04:30] of the frames for the book and the ways
[00:04:32] that I present some of the events that
[00:04:33] happened was him, uh, doing various
[00:04:35] interviews after the collapse. And one
[00:04:38] of the main ones was with Andrew Ross
[00:04:40] Sorkin at the New York Times DealBook
[00:04:41] Summit um, where he uh, lied a lot and
[00:04:45] we're going to talk about one of those.
[00:04:47] This subsection is titled "It's
[00:04:49] a Margin Trading Platform."
[00:04:52] When he began to attempt an
[00:04:54] actual explanation of events for Sorkin
[00:04:56] and the DealBook audience, Bankman-Fried
[00:04:58] unfurled the first truly strange lie.
[00:05:02] One that could absolve him of guilt if
[00:05:03] true, but which was to anyone paying
[00:05:06] even moderate attention obviously false.
[00:05:09] Quote, Alameda Research did have a
[00:05:11] long leveraged position on FTX,
[00:05:14] Bankman-Fried told Sorkin. And the
[00:05:16] international platform, it's a margin
[00:05:18] trading platform.
[00:05:21] After he delivered this explanation,
[00:05:22] Sam's downcast eyes darted up to barely
[00:05:25] glance into the camera, eyes that were
[00:05:27] not cowed, but sharply observant. He
[00:05:30] seemed to be gauging Sorkin and the
[00:05:32] audience's reaction from afar, while
[00:05:34] doing his best to appear supplicant and
[00:05:36] contrite. Peeking from behind the hair
[00:05:39] was a Sam Bankman-Fried the public had
[00:05:41] never really seen. The Sam who was
[00:05:44] relentlessly focused on his goals and
[00:05:46] willing to do anything to win.
[00:05:50] In essence, Bankman-Fried was claiming
[00:05:51] that Alameda Research had legitimately
[00:05:53] borrowed billions of dollars from FTX
[00:05:56] customers, the long position, then lost
[00:05:59] it in bad trades, which he was very
[00:06:01] sorry for not paying attention to. This
[00:06:04] explanation, it would later emerge, had
[00:06:06] been concocted as a plausible fiction by
[00:06:08] an FTX lawyer at Sam's request. It was
[00:06:12] an obvious lie, even if it might slip
[00:06:14] past people who have spent their lives
[00:06:16] worrying about things more important
[00:06:17] than finance. Margin trading is a
[00:06:20] service available on many kinds of asset
[00:06:22] trading platforms that allows traders to
[00:06:25] borrow money to make bets on assets.
[00:06:28] Margin is often used to go long or bet
[00:06:30] that an asset will go up in value. If a
[00:06:34] trader has only $1,000 of their own
[00:06:36] capital, they might be able to borrow
[00:06:37] another $1,000 to, for instance, buy
[00:06:40] $2,000 in Bitcoin. If Bitcoin goes up,
[00:06:43] the trader pays back the borrowed funds,
[00:06:45] but gets to keep their gains. If Bitcoin
[00:06:47] goes down too much, on the other hand,
[00:06:50] their dollar collateral could be
[00:06:51] liquidated to cover the loan, doubling
[00:06:53] their losses.
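The margin arithmetic just described can be sketched in a few lines of Python; the $1,000 figures are the passage's hypothetical example, not FTX data:

```python
def margin_pnl(own_capital, borrowed, price_change):
    """Profit or loss on a leveraged position, after repaying the loan at par.

    price_change is a fraction: 0.10 means the asset rose 10%.
    """
    position = own_capital + borrowed   # total exposure, e.g. $2,000 of Bitcoin
    return position * price_change      # gains and losses accrue on the whole position

# The passage's example: $1,000 of the trader's own capital, $1,000 borrowed.
gain = margin_pnl(1000, 1000, 0.10)      # up 10%: about +$200, a 20% return on own capital
loss = margin_pnl(1000, 1000, -0.10)     # down 10%: about -$200, double the unleveraged loss
wipeout = margin_pnl(1000, 1000, -0.50)  # down 50%: the full $1,000 of collateral is
                                         # gone and the position is liquidated
```

Leverage doubles both sides of the bet, which is why exchanges liquidate collateral before losses can exceed it.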
[00:06:55] FTX did offer margin trading and margin
[00:06:58] traders did consent to have their assets
[00:07:00] available for borrowing by other margin
[00:07:01] traders in exchange for interest
[00:07:03] payments.
[00:07:05] When Sam Bankman-Fried said that Alameda
[00:07:07] had a long position, he was admitting
[00:07:09] that the hedge fund was in debt to FTX
[00:07:12] customers. In his version of events,
[00:07:15] this borrowing was not illegal or even
[00:07:17] unethical. Instead, bad market timing
[00:07:20] was the ultimate culprit. As the crypto
[00:07:23] market tanked, Alameda's highly leveraged
[00:07:25] long crypto trades blew up or became
[00:07:28] worth less than the value of the loans,
[00:07:30] taking borrowed customer money with it.
[00:07:33] All of this was a wisp of reality draped
[00:07:36] across a pile of bull.
[00:07:39] In implying that Alameda had the clear
[00:07:41] right to borrow those customer funds
[00:07:43] simply because FTX was a margin trading
[00:07:45] platform, Bankman-Fried was speaking
[00:07:48] a very clear untruth.
[00:07:52] Forensic accountant Peter Easton would
[00:07:54] present evidence at trial showing that
[00:07:56] for the entirety of 2022, Alameda owed FTX
[00:08:00] billions more in assets than depositors
[00:08:02] had opted to lend through the margin
[00:08:05] program. At the peak in mid-2022, Alameda
[00:08:09] was borrowing $10.8 billion more in
[00:08:12] assets than were voluntarily enrolled in
[00:08:14] margin lending.
[00:08:17] Andrew Ross Sorkin immediately saw this
[00:08:18] for the lie that it was and read
[00:08:20] passages from FTX's terms of service to
[00:08:22] Bankman-Fried, which explicitly promised
[00:08:25] most customers that their funds would
[00:08:26] not be borrowed. These simple spot
[00:08:29] trading accounts without borrowing
[00:08:30] enabled made up the vast majority of FTX
[00:08:33] customers' activity and holdings. But
[00:08:35] Alameda had taken effectively 100% of
[00:08:38] customer funds, spot and margin, for its
[00:08:42] own long position. The spot customers
[00:08:44] hadn't agreed to that and didn't know it
[00:08:46] was happening. We now know the source of
[00:08:49] this strange obviously false claim about
[00:08:51] margin borrowing.
[00:08:53] On 7th November 2022, as FTX was going
[00:08:56] up in flames, executives were launching
[00:08:59] a last-ditch effort to raise money and
[00:09:01] Bankman-Fried was about to pitch
[00:09:03] investors at Apollo Global Management.
[00:09:06] This was a serious bunch and they wanted
[00:09:08] financial statements. Things had
[00:09:11] progressed beyond the point of simply
[00:09:13] cooking the books. So Sam Bankman-Fried
[00:09:15] took then-FTX general counsel Can Sun on
[00:09:18] a tense walk. Sun testified that
[00:09:22] Bankman-Fried asked him for, quote, a legal
[00:09:24] justification for why the funds were
[00:09:27] missing.
[00:09:28] Sun testified that he had only
[00:09:31] discovered the amount of money missing
[00:09:32] from FTX a few days before. When he
[00:09:36] asked for an explanation, Sun says he
[00:09:37] got very vague responses.
[00:09:40] That night, Sun saw head of engineering
[00:09:43] Nishad Singh, a confessed co-conspirator
[00:09:46] who saw the scheme collapsing.
[00:09:49] Quote, his entire face was pale, gray.
[00:09:52] It looked like his soul had been plucked
[00:09:54] away from him.
[00:09:56] Sun was already sure that quote there
[00:09:58] was no legal justification for the money
[00:10:01] being taken.
[00:10:03] Nonetheless, on that night of 7th
[00:10:05] November, Sun willingly laid out to
[00:10:07] Bankman-Fried what he called theoretical
[00:10:09] arguments
[00:10:10] that could explain the missing funds.
[00:10:13] This included the theoretical
[00:10:14] possibility that voluntary lending by
[00:10:16] margin traders accounted for the losses.
[00:10:19] This was not supported by the facts of
[00:10:20] what actually happened and Sun testified
[00:10:23] that he made this clear to
[00:10:25] Bankman-Fried.
[00:10:27] Nonetheless, it became Bankman-Fried's
[00:10:29] unwavering version of events. He
[00:10:32] repeated it not only at DealBook but to
[00:10:34] George Stephanopoulos in an interview on
[00:10:36] Good Morning America just weeks later in
[00:10:38] December 2022.
[00:10:40] Bankman-Fried supporters would also
[00:10:42] implicitly and explicitly echo the
[00:10:45] untruth cooked up by Sun for years
[00:10:47] afterwards.
[00:10:49] This was the most central of
[00:10:51] Bankman-Fried's weird, compulsive,
[00:10:54] self-indicting lies.
[00:10:56] Like most of them, it contained a wildly
[00:10:58] inaccurate claim about basic principles
[00:11:00] of finance, accounting, or corporate
[00:11:03] structure. In this case, about how
[00:11:06] margin lending works.
[00:11:08] It's a genre of lies suggesting that Sam
[00:11:11] Bankman-Fried was not a Machiavellian
[00:11:13] mastermind,
[00:11:15] but in fact confoundingly inept, either
[00:11:18] confused and deluded enough to genuinely
[00:11:21] believe his own absurd explanations or
[00:11:24] so alienated by the sense of his own
[00:11:26] unique brilliance that he couldn't grasp
[00:11:29] how obvious his lies were to the rest of
[00:11:31] us.
[00:11:33] Certain people, it's true, were eager to
[00:11:36] swallow these absurdities.
[00:11:38] In January 2024, months after
[00:11:41] Bankman-Fried's criminal conviction, Ivy League
[00:11:44] law professors Ian Ayres and John
[00:11:46] Donohue would leverage their
[00:11:47] reputations to assert essentially the
[00:11:50] same claim in a column titled quote FTX
[00:11:53] was never really bankrupt. The pair
[00:11:56] argued that FTX was solvent that quote
[00:11:59] it had enough assets to cover all
[00:12:01] liabilities to customers on the basis of
[00:12:04] its less liquid venture investments like
[00:12:06] Anthropic.
[00:12:08] In other words, Ayres and Donohue echo
[00:12:10] Sam's implicit admission that he was
[00:12:13] gambling with customers' custodied funds
[00:12:16] and they too treat that as perfectly
[00:12:18] acceptable.
[00:12:20] It was in hearing Bankman-Fried's
[00:12:22] strange clumsy lies that I was first
[00:12:25] magnetically drawn to the question of
[00:12:27] what exactly was going on in his head.
[00:12:30] In his complete departures from reality,
[00:12:33] we begin to see that Bankman-Fried was
[00:12:35] neither simply hapless nor a figure of
[00:12:38] malicious evil, but a sick man alienated
[00:12:42] from himself, twisted into knots of
[00:12:44] self-deception by what the world told him
[00:12:47] he was meant to be.
[00:12:50] Bankman-Fried's flimsy lies were a last
[00:12:54] gasping effort to cling to the fey
[00:12:56] glamour cast by the magic circle of the
[00:12:59] tech investment world and to justify to
[00:13:03] himself most of all the sacrifices made
[00:13:06] over many years to give others what they
[00:13:08] wanted. These lies and the incompetence
[00:13:12] they imply do support perversely the
[00:13:15] essence of Bankman-Fried's primary
[00:13:17] self-defense
[00:13:18] that as he at one point planned to tell
[00:13:21] the US Congress he simply quote up
[00:13:26] his apparent childish haplessness in turn
[00:13:29] suggests greater blame for the many who
[00:13:32] let him rise. Bankman-Fried's
[00:13:34] ineptitude, ignorance, and indiscipline
[00:13:38] were on display for insiders far before
[00:13:41] it all came crashing down.
[00:13:49] So, it's a really interesting passage. I
[00:13:51] mean, so just to be clear, I mean, uh,
[00:13:53] he's been trying to get a retrial. He's
[00:13:55] trying to get the, uh, conviction thrown
[00:13:57] out, and he's still making exactly this argument.
[00:14:00] >> Yeah. And um on last week, Monday
[00:14:04] um his appeal hearing happened in the uh
[00:14:07] Second Circuit in NYC, up in town. Um
[00:14:11] and I'm going to get her name wrong, but
[00:14:13] I think his appeals lawyer is Alexandra
[00:14:15] Shapiro. Um she's a very famous appeals
[00:14:17] lawyer. Um and they they spent, I'm
[00:14:19] sure, tons of money that I don't
[00:14:21] understand where it came from to have
[00:14:23] her go up and and echo this argument
[00:14:25] that it was all margin trading. I mean,
[00:14:27] it was quite stunning to listen to the
[00:14:28] tape. Actually, they they have a
[00:14:30] recording of this appeals hearing you
[00:14:31] can go listen to. And she just runs down
[00:14:33] the exact theory completely already
[00:14:37] disproven at trial: that, A, all of this
[00:14:40] money was borrowed legitimately and,
[00:14:43] B, we should give him credit for taking
[00:14:46] all of this stolen money and making good
[00:14:47] investments with it.
[00:14:48] >> And they were trying to claim that it
[00:14:49] was like a bank, that there was simply a run
[00:14:51] on the bank, that there weren't enough
[00:14:52] deposits to meet all the withdrawal
[00:14:54] demands. Yeah, run on the bank is the
[00:14:56] language that was used very widely
[00:14:58] in sort of um the autopsy of this
[00:15:03] situation from certain figures. Um and
[00:15:07] it's very deceptive because a bank makes
[00:15:09] money by lending out your deposits and
[00:15:12] that's what it's supposed to do. But a
[00:15:14] crypto exchange um you are promised that
[00:15:19] your assets will be custodied not lent
[00:15:22] out but but protected and and held
[00:15:24] safely and that's the thing that you
[00:15:26] sign up for. That's what you expect an
[00:15:28] exchange to do right. Um so I mean many
[00:15:31] criminals when they get into a situation
[00:15:33] where they're about to be going to
[00:15:35] prison will change their case. They show
[00:15:37] some contrition, some guilt, but uh I
[00:15:40] mean SPF never did that. He stuck to
[00:15:42] this story that he was a hapless genius.
[00:15:44] He just kind of got in over his head.
[00:15:47] >> Yeah. And and you know, I quoted Judge
[00:15:49] Kaplan who you know, I I sat through the
[00:15:51] entire trial. It was exactly
[00:15:54] 30 days. I think we were pointing this
[00:15:55] out. The the trial actually ended um in
[00:15:59] 2023 also on this day, November 11th. Um
[00:16:04] so this is also the second anniversary
[00:16:05] of his conviction today. Um, and uh,
[00:16:09] yeah, I I you know, it's it's nuanced
[00:16:12] because I think he made some gestures of
[00:16:16] contrition or regret at least, but um,
[00:16:19] at the end of the trial, Judge Kaplan at
[00:16:21] sentencing definitely was very explicit
[00:16:24] that at least in his perception, uh, Sam
[00:16:26] never expressed contrition. Um, and he
[00:16:29] certainly since his conviction has um,
[00:16:31] basically
[00:16:33] both publicly and I believe through kind
[00:16:36] of private back channels where I think
[00:16:38] he's paying people to echo a certain
[00:16:40] line somehow. Um, he he's continued to
[00:16:44] stick with this story of it was all
[00:16:46] legitimate. Um, and also somehow it's
[00:16:48] the bankruptcy lawyer's fault is another
[00:16:50] big one.
[00:16:52] So, but what's interesting about this
[00:16:53] book and about this whole case, and I
[00:16:55] think what's so interesting about um
[00:16:57] what you've written here is that this
[00:16:59] wasn't simply like a bad person doing
[00:17:01] bad things and not admitting to it.
[00:17:03] There was a kind of a guiding
[00:17:04] philosophy that led him to behave the
[00:17:07] way he did and to not admit to what he
[00:17:09] did. And that was some version of
[00:17:12] techno-utopianism, which is what you call
[00:17:14] it in the book, which has many different
[00:17:16] strains in it. just explain what you
[00:17:18] mean by techno-utopianism and how did
[00:17:20] that contribute to his downfall do you
[00:17:22] think? Well, um I mean it's interesting
[00:17:24] because I think the landscape has
[00:17:25] changed quite significantly since this
[00:17:27] happened over the last three years.
[00:17:29] Obviously, we've had kind of a great
[00:17:31] rise in the influence of um what I would
[00:17:34] call broadly techno-utopian thought in US
[00:17:37] politics in particular. Um obviously
[00:17:39] Elon Musk very much a techno-utopian. He
[00:17:42] has said he has a lot of, um,
[00:17:44] alignment with effective altruism, which
[00:17:47] is kind of the main philosophy that
[00:17:49] Bankman-Fried was motivated by. Um
[00:17:52] and essentially
[00:17:54] um well the we can we can get into it
[00:17:57] but um the the real fundamental thing
[00:17:59] here is that effective altruism is a
[00:18:01] strain of what's known as utilitarianism
[00:18:04] which is essentially the idea that if
[00:18:06] you really boil it down um the outcome
[00:18:09] of your actions is what matters
[00:18:12] ethically. That ethically all you should
[00:18:14] concern yourself with is outcomes rather
[00:18:17] than rules. And so at trial, for
[00:18:19] example, we had um Caroline Ellison, a
[00:18:22] confessed co-conspirator, say that Sam
[00:18:25] had told her that um she believed that
[00:18:28] FTX user deposits were a good source of
[00:18:31] capital to accelerate FTX's growth. So
[00:18:34] the rationale for all of this sort of
[00:18:36] illicit borrowing was um I am a good
[00:18:39] investor. Therefore, these people, even
[00:18:42] though they've told me they just want me
[00:18:44] to hold on to their money, what I'm
[00:18:45] actually going to do is I'm going to
[00:18:47] take it and I'm going to invest it in
[00:18:49] venture investments and uh I'm going to
[00:18:51] make all the money back. Um and and one
[00:18:53] of the other just to tack on, you know,
[00:18:55] there's so many
[00:18:57] troublesome parts of this, but um a lot
[00:18:59] of the public rhetoric following the
[00:19:01] collapse of FTX has sort of been an
[00:19:04] attempt to claim that there was a full
[00:19:07] recovery by victims of this crime. Um
[00:19:10] and um one of the very gratifying things
[00:19:12] that happened in the hearing on Monday
[00:19:14] is that the federal prosecutor,
[00:19:17] Thane Rehn, who was um one of
[00:19:19] the three leads in the prosecution and
[00:19:21] argued for the appeal on Monday, he
[00:19:23] actually laid out into the official
[00:19:25] record for the first time from the
[00:19:27] prosecution's perspective this point
[00:19:29] that a lot of the people who lost assets
[00:19:32] um they were repaid at par dollar value
[00:19:35] at a point when the market was very low
[00:19:37] and many of their assets have since gone
[00:19:39] up by on the order of 10x. Um, and so
[00:19:41] even when there was a technical recovery
[00:19:44] under American bankruptcy law that went
[00:19:48] beyond the value on the claim date,
[00:19:50] these people are actually in many cases
[00:19:52] out millions or tens of millions of
[00:19:55] dollars that they would have now if
[00:19:58] Bankman-Fried had done what he promised
[00:20:00] and custodied their funds.
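The prosecutor's point about par-value repayment can be illustrated with a toy calculation; every number below is hypothetical, not a figure from the case:

```python
# Hypothetical FTX customer: held 100 BTC when withdrawals froze.
btc_held = 100
price_at_claim_date = 17_000   # assumed BTC price near the November 2022 claim date
price_later = 100_000          # assumed later price; other assets rose on the order of 10x

repaid_at_par = btc_held * price_at_claim_date   # "made whole" in dollars: $1,700,000
if_custodied = btc_held * price_later            # value had the coins simply been held

shortfall = if_custodied - repaid_at_par
print(f"${shortfall:,}")  # prints $8,300,000 under these assumed prices
```

A "full recovery" measured in claim-date dollars can still leave the customer millions short of what custodied assets would be worth today.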
[00:20:02] >> And he basically said that it's not
[00:20:03] stealing because they got their money
[00:20:05] back.
[00:20:06] That's the that's the argument that
[00:20:08] again people are trying to make and that
[00:20:10] again Shapiro, his appeals lawyer, tried
[00:20:12] to make last Monday um on appeal and um
[00:20:16] you know the the the fun part was
[00:20:17] listening to the appeal and and hearing
[00:20:19] the appellate judges not buy it at all.
[00:20:21] Um but uh that is still what they're
[00:20:24] going with incredibly.
[00:20:26] >> We'll get right back to journalist David
[00:20:28] Morris on After Words from C-SPAN.
[00:20:32] Now back to journalist David Morris
[00:20:34] talking about his book Stealing the
[00:20:36] Future: Sam Bankman-Fried, Elite Fraud,
[00:20:38] and the Cult of Techno-Utopia.
[00:20:41] >> Okay, so I mean when I went to school we
[00:20:43] learned about utilitarianism as this idea
[00:20:45] of the happiness of the greatest number
[00:20:47] that the ends justified the means. It
[00:20:49] doesn't sound too bad to me like why
[00:20:51] wouldn't you have a philosophy of
[00:20:53] helping as many people as possible? But
[00:20:56] you show very clearly in the book that
[00:20:57] this leads to real moral failings in
[00:21:00] both SPF but also in his parents, who
[00:21:02] were kind of pushing their
[00:21:04] son.
[00:21:05] >> Yeah. And and I mean I would say that
[00:21:06] like while it sounds good on the
[00:21:08] surface, utilitarianism
[00:21:11] has some really fundamental
[00:21:13] >> conceptual problems. And
[00:21:16] um those are that it is a philosophy
[00:21:19] based on the idea that you can know the
[00:21:22] greatest good for all human beings alive
[00:21:26] um across the world. And um I think many
[00:21:30] people would would agree that that's
[00:21:31] actually kind of not a realistic claim
[00:21:33] that you can even have that knowledge
[00:21:35] about the total state of the world. Um
[00:21:38] even more so utilitarianism assumes that
[00:21:41] everybody has the same definition of
[00:21:43] what is good and what is right and what
[00:21:45] is you know a positive outcome for
[00:21:46] people. Um and so yeah, one of the
[00:21:48] things that I argue in the book is that
[00:21:50] utilitarianism from that perspective is
[00:21:52] sort of fundamentally authoritarian in
[00:21:54] the sense that if you are the
[00:21:56] utilitarian, you are deciding what is
[00:21:58] good for everybody and you're you're
[00:22:00] claiming the knowledge to you know stake
[00:22:04] that position. Um there's another layer
[00:22:07] here which is where we get into the
[00:22:08] technoutopian angle. Um which is that uh
[00:22:12] in addition to utilitarianism
[00:22:14] um Sam and a lot of others in this sort
[00:22:17] of universe of movements believe in
[00:22:19] what's called long-termism. Um and this
[00:22:21] is where you look into you know not just
[00:22:24] decades but potentially centuries or
[00:22:26] even thousands of years into the future
[00:22:29] and you base your decisions on these
[00:22:30] long-term outcomes for humanity. Um and
[00:22:34] and the vision of humanity's future that
[00:22:36] these people have is a very specific one
[00:22:39] based on interplanetary travel and
[00:22:42] colonization, the development of
[00:22:44] artificial general intelligence and the
[00:22:46] ability to upload human minds to
[00:22:48] computers. And that third one in
[00:22:51] particular, there are arguments against
[00:22:52] the feasibility of any of these as
[00:22:55] genuine things that we can accomplish.
[00:22:57] Um but the idea that you can upload a
[00:22:58] human mind to a computer um is both you
[00:23:02] know sort of improbable for a lot of
[00:23:04] scientific reasons and a very specific
[00:23:06] ideological position about just the way
[00:23:09] knowledge works. Um but but the the to
[00:23:12] bring this back around, that
[00:23:14] long-termism, that very long horizon: if you're
[00:23:17] a utilitarian and a long-termist, what
[00:23:20] you're claiming is that you can predict
[00:23:22] the consequences of your actions
[00:23:24] thousands of years into the future on a
[00:23:26] population of trillions of humans living
[00:23:30] in your model spread across the entire
[00:23:32] universe. And the problem with that is
[00:23:34] that you actually can't predict any of
[00:23:36] those consequences. Um but in the course
[00:23:39] of trying to justify the actions that
[00:23:41] will bring about what you think are
[00:23:43] those future consequences, you can
[00:23:44] actually rationalize a lot of harms to
[00:23:47] individuals in the present. And Sam and
[00:23:50] FTX are um I think a sterling example
[00:23:53] that. He really thought he was doing the
[00:23:56] good for the long term but wound up
[00:23:58] hurting a lot of people in in the
[00:24:00] immediate present.
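The long-termist reasoning Morris describes can be made concrete with a toy expected-value calculation; every number below is invented for illustration:

```python
# Toy long-termist ledger: a minuscule chance of helping an astronomically
# large future population, weighed against a definite harm today.
future_people = 10**15        # assumed far-future population spread across the universe
p_success = 1e-9              # assumed probability that your action helps them
present_harm = 10_000         # definite units of harm to real people right now

expected_future_good = p_success * future_people  # about 1,000,000 units of "good"
# Because the future term can be inflated at will, the arithmetic endorses the harm:
print(expected_future_good > present_harm)  # prints True
```

The problem the passage identifies is not the multiplication but the inputs: neither the probability nor the future population is actually knowable, so the model can rationalize nearly anything.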
[00:24:01] >> So if you're a long-termist, you're not
[00:24:03] thinking about next week and some people
[00:24:05] might lose money. you're thinking about
[00:24:07] history and and
[00:24:08] >> and the other really important example
[00:24:10] of long-termism right now because you're
[00:24:12] seeing perhaps some ads for his book in
[00:24:15] the subway around here is um Eliezer
[00:24:17] Yudkowsky, the the leader of the
[00:24:20] Machine Intelligence Research Institute
[00:24:22] and the head of what you would call the
[00:24:24] AI doomer brigade, the people who
[00:24:27] believe that we're going to achieve AGI
[00:24:28] and then it's going to destroy us all.
[00:24:31] Um, and you know, Yudkowsky and a lot of
[00:24:34] people affiliated with him, um, have
[00:24:36] made arguments that seem to rationalize
[00:24:39] terrorism and violence, um, in their
[00:24:42] quest to defeat this AI that doesn't
[00:24:45] exist yet. Um, and so you get into a lot
[00:24:48] of, you know, I'm going to do this
[00:24:50] extreme thing that by most moral
[00:24:53] standards would be
[00:24:55] um, objectionable or hurtful, but
[00:24:57] because I imagine a future where I'm
[00:25:00] stopping the Terminator, um, it's it's
[00:25:03] okay,
[00:25:04] >> right? So SPF was motivated by this
[00:25:06] theory of utilitarianism and thinking
[00:25:09] about the kind of probabilistic outcomes
[00:25:11] of actions that he was making. And this
[00:25:14] all came from his parents that were very
[00:25:15] much like steeped in this philosophy,
[00:25:17] right?
[00:25:18] >> Yeah. Barbara Fried in particular um has
[00:25:20] described herself as something close to
[00:25:22] a uh diehard or hardcore uh both
[00:25:26] utilitarian and determinist which is to
[00:25:28] say she believes that humans don't have
[00:25:30] free will, which is another aspect of
[00:25:33] this constellation of beliefs.
[00:25:35] >> They're just subject.
[00:25:38] >> Yeah, pretty much entirely. And it's
[00:25:40] interesting because she comes at this
[00:25:41] from what I think a lot of people would
[00:25:43] consider a liberal um perspective
[00:25:47] because she's mounting that argument
[00:25:49] specifically in pursuit of prison
[00:25:51] reform. Um and and sort of this argument
[00:25:54] that like well people aren't really
[00:25:55] responsible for their actions.
[00:25:57] Therefore, we should um orient prison
[00:26:00] around um sort of rehabilitation rather
[00:26:03] than punishment. She's very against
[00:26:05] punishment. And this is all stuff that
[00:26:06] she wrote before her son became uh one
[00:26:08] of the biggest criminals in history. So
[00:26:10] it's it's quite interesting and ironic
[00:26:12] from from that perspective.
[00:26:14] >> And how is effective altruism linked to
[00:26:16] utilitarianism, would you say?
[00:26:19] So, effective altruism specifically is
[00:26:21] focused on charitable giving and um they
[00:26:25] believe that using um controlled trials,
[00:26:29] experiments and and things like that,
[00:26:31] you can kind of determine which
[00:26:35] nonprofit is saving the most lives per
[00:26:38] dollar and then you maximize that
[00:26:40] donation. And the the sort of classic
[00:26:42] example is um bed nets for mosquitoes
[00:26:45] which are very inexpensive to buy in
[00:26:48] parts of um you know sub-Saharan Africa
[00:26:50] where they have problems with malaria
[00:26:52] and they save a lot of lives and so
[00:26:53] that's a very effective intervention. Um
[00:26:56] But one of the more interesting
[00:26:58] things that I write about in the book is
[00:27:00] that, you know, buying bed nets to stop
[00:27:02] malaria is a very effective intervention
[00:27:04] for saving lives. But they've
[00:27:08] done it so much that now these nets are
[00:27:11] being used widely in Africa and
[00:27:14] Southeast Asia for fishing. And because
[00:27:16] of the fine mesh of the net, those fishing
[00:27:20] nets actually catch eggs and hatchlings
[00:27:22] and destroy ecosystems where people are
[00:27:25] relying on the fish. And so I think it's
[00:27:28] a great example of the way that when you
[00:27:29] have this maximizing,
[00:27:32] experimental-data-dictated behavior, there are always
[00:27:36] unintended consequences and
[00:27:38] externalities that don't fit into your
[00:27:40] model or your equation.
[00:27:43] >> But the argument for effective altruism
[00:27:45] is that we're not just spending money
[00:27:46] willy-nilly with philanthropy, right?
[00:27:48] It's just
[00:27:49] >> Yeah. And it started off as, I think,
[00:27:52] a fairly defensible position, in terms of let's
[00:27:58] actually look at the outcomes. And
[00:27:59] there are some good examples of very
[00:28:01] ineffective
[00:28:04] nonprofit efforts that feel good, they
[00:28:06] look good on a pamphlet, but they
[00:28:09] don't actually help people that much, or
[00:28:11] maybe they even harm people. Those
[00:28:12] things do exist. And so that's kind
[00:28:15] of the deep irony here: it
[00:28:18] starts out as a relatively defensible
[00:28:20] position, and then they sort of go into
[00:28:26] this long-termist thing, where most
[00:28:28] effective altruists or a lot of
[00:28:30] effective altruists now are primarily or
[00:28:33] even exclusively concerned with far
[00:28:36] future concerns like asteroid strikes or
[00:28:39] the emergence of artificial general
[00:28:41] intelligence that wants to kill us and
[00:28:43] have diverted their efforts from
[00:28:45] actually helping people. And so that
[00:28:48] is the tragic
[00:28:51] irony of effective altruism: at
[00:28:54] its base there is a certain
[00:28:56] sensibility to it, but it's been captured
[00:28:58] so thoroughly by Silicon Valley, by the
[00:29:02] billionaires who mostly fund it, that
[00:29:04] it has become a little bit, I
[00:29:06] would say, perverted,
[00:29:07] >> Right. So if this kind of philosophy
[00:29:09] that SBF and his family espoused is so
[00:29:12] widespread in Silicon Valley, what does
[00:29:15] that mean for the future?
[00:29:17] >> I mean, I think we're seeing it right now,
[00:29:18] and this is actually my next book.
[00:29:21] A lot of these principles
[00:29:23] also undergird the development
[00:29:26] of artificial intelligence, and the
[00:29:29] basically massive overspending that's
[00:29:31] going into something like OpenAI right
[00:29:33] now is largely premised on this
[00:29:37] fantasy, I'll just say it. And by the way,
[00:29:41] for anybody out there who hasn't done
[00:29:42] this, if you look into the science, if you
[00:29:44] look into the way the human brain
[00:29:46] works, you actually cannot simulate it
[00:29:48] with a digital computer. It's literally
[00:29:50] impossible. And there are very
[00:29:51] compelling arguments for why artificial
[00:29:54] general intelligence, at least as an
[00:29:56] analog of human thought, is not
[00:29:58] achievable technically. But this is
[00:30:02] the premise that the effective
[00:30:04] altruists, the rationalists, and these
[00:30:06] other techno-utopian movements are kind
[00:30:08] of putting out there as the way
[00:30:11] we're going. And they have all of this
[00:30:13] huge inflow of investment into OpenAI in
[00:30:16] particular. And then for anybody who's
[00:30:18] paying attention, we now, as of last week,
[00:30:21] have Sam Altman of OpenAI saying that,
[00:30:24] well, you know, we're not quite there
[00:30:25] yet, so I think we're going to need a
[00:30:27] little bit of that stimmy. I think we
[00:30:28] need some government stimulus to
[00:30:30] underwrite our far-future creation of
[00:30:33] the God AI. And, you know,
[00:30:37] they've made themselves systemically
[00:30:38] important in a way that should
[00:30:42] scare all of us about where the
[00:30:43] market and development are going over
[00:30:46] the next, I'm going to say, six months. So
[00:30:48] you know watch your portfolios. These
[00:30:49] people are really doing it. They're
[00:30:50] running the game.
[00:30:51] >> It's a matter of national security, of
[00:30:54] geopolitics.
[00:30:54] >> Well, that's what they say. Yeah.
[00:30:56] >> That's the thing. If you believe that
[00:30:58] you're going to invent God then it
[00:31:01] definitely becomes a matter of national
[00:31:02] security. And that's the
[00:31:06] narrative that they're trying to push,
[00:31:07] >> Right. So, just going back to the
[00:31:10] actual cryptocurrency industry, I mean,
[00:31:12] talk about how impactful SBF
[00:31:15] was when he was still running around, and
[00:31:18] how impactful his trial and
[00:31:20] prosecution was.
[00:31:21] >> Well, you know, I have a
[00:31:23] complex answer here, because, you
[00:31:28] know, I came to CoinDesk in 2022.
[00:31:31] No, 2020.
[00:31:34] >> Time does not have any meaning anymore.
[00:31:36] And I really did not know who SBF was
[00:31:40] in 2020. Um I had been at Fortune where
[00:31:43] I was reporting a lot about the
[00:31:45] development and distribution of COVID
[00:31:47] vaccines and writing about Tesla. I
[00:31:50] mean, I still had one
[00:31:51] foot in crypto, but was not that attentive.
[00:31:54] And Bankman-Fried had founded a
[00:31:56] centralized exchange, which is
[00:31:59] probably one of the least
[00:32:01] interesting things you can do in the
[00:32:02] cryptocurrency world, and so
[00:32:05] I didn't pay that much attention to him.
[00:32:06] I think his importance is
[00:32:11] interesting. He did help launch a chain
[00:32:13] called Solana that is actually still
[00:32:15] important. But one of the things that
[00:32:17] actually came out in the hearing on
[00:32:20] Monday, you know, it's a little bit
[00:32:22] embarrassing, sometimes you put these
[00:32:23] pieces together too late for them to be
[00:32:26] in the book. But again, Thane Rehn,
[00:32:29] one of the prosecutors, said, and this is
[00:32:31] such a small but critical thing:
[00:32:36] The means of the embezzlement that
[00:32:37] happened here was in part that FTX
[00:32:41] customers deposited a bunch of cash, US
[00:32:44] dollars, with the exchange. And then,
[00:32:48] you know, one of the sort of steps that
[00:32:50] they omitted was they never actually
[00:32:52] bought the cryptocurrency that people
[00:32:54] believed they were ordering on FTX and
[00:32:57] therefore those dollars just actually
[00:32:59] went straight into these venture
[00:33:01] investments that Sam Bankman-Fried was
[00:33:02] punting on. And that's
[00:33:04] interesting.
[00:33:04] >> They weren't even buying and selling
[00:33:06] crypto.
[00:33:06] >> So, what's interesting is that
[00:33:07] we obviously had a huge run-up
[00:33:10] in the crypto market from 2020 to 2022.
[00:33:13] And it seems now in retrospect
[00:33:15] that FTX was actually not pumping the
[00:33:19] market, because they were not actually
[00:33:20] going out and buying the assets that
[00:33:23] they told their customers they were
[00:33:24] buying. And I think that there was
[00:33:26] at least some part of my brain at the
[00:33:28] time that was like, oh, we had this market
[00:33:32] run-up because there was this fraud that
[00:33:34] was doing sort of illegitimate things.
[00:33:36] And there was still a lot of that in the
[00:33:38] market, but it was quite interesting to
[00:33:40] hear that, no, FTX was not actually going
[00:33:42] out and making these purchases. So
[00:33:44] that market run-up, FTX did not
[00:33:47] have the direct influence on it,
[00:33:50] the double counting, that it
[00:33:52] seemed like they might have had at the
[00:33:54] time,
[00:33:54] >> Right. And SBF was a national
[00:33:57] figure. He was on the front page of
[00:33:59] Fortune, on the front page of CoinDesk,
[00:34:02] of many other publications. He was
[00:34:04] kind of the golden boy. He went to
[00:34:06] Washington at least three times, I
[00:34:08] think, something like that. He spoke
[00:34:09] before Congress. He was held up as the
[00:34:12] adult in the room, the model of how
[00:34:15] crypto should be
[00:34:16] regulated.
[00:34:16] >> It's quite ironic now because at the
[00:34:18] time, crypto was struggling for
[00:34:20] acceptance and he was seen as a kind of
[00:34:21] safe Yeah.
[00:34:22] >> kind of conveyor of the future. So
[00:34:25] that's why he was so acceptable in
[00:34:28] >> and a lot of that, you know, one of the
[00:34:30] things that we didn't find out until
[00:34:31] later, you know, his image was very
[00:34:33] central to that and it was very
[00:34:35] carefully crafted. So you you'll
[00:34:37] probably mostly remember there were
[00:34:38] these photos of him with his crazy hair
[00:34:40] flying out. At trial, we heard
[00:34:43] Caroline Ellison say, you know, he knew
[00:34:45] that his hair increased his salary at
[00:34:48] Jane Street where he worked. He knew
[00:34:49] that his hair was a very important part
[00:34:51] of his image. They actually made plans
[00:34:53] for the FTX headquarters in the Bahamas
[00:34:56] to be laid out as a building in the
[00:34:59] shape of Sam's hair. But not just
[00:35:02] that, he actually had them trade in
[00:35:06] luxury cars that they had initially
[00:35:08] bought for the executives for cheaper
[00:35:11] cars because he wanted to convey that
[00:35:13] effective altruist image.
[00:35:15] >> He drove a Corolla, right?
[00:35:17] >> The what?
[00:35:17] >> He had a Corolla. He drove a Corolla and
[00:35:20] uh Ellison had like a Honda Civic. Um I
[00:35:24] believe at certain points they described
[00:35:26] um the place where all of these
[00:35:28] executives were living together as a
[00:35:30] dorm. The dorm was in fact a $10 million
[00:35:33] Bahamian penthouse. And so
[00:35:36] there are these very
[00:35:38] intentional efforts by SBF to, I think,
[00:35:42] present himself both as sort of this
[00:35:44] naive innocent, this kind of
[00:35:47] man-child. Another really interesting
[00:35:49] example is that one of the times
[00:35:51] he went to testify before Congress, he
[00:35:53] did so with his shoes untied. And
[00:35:55] that's where he kind of gives the game
[00:35:56] away, right? Because it's like, you're 29
[00:35:59] years old. You know how to tie your
[00:36:01] shoes, and not only do you know that,
[00:36:04] but you really know that when you go
[00:36:05] before Congress, you should probably do
[00:36:07] that. And so this is where he's like
[00:36:09] being very performative about, you know,
[00:36:12] I I compare it to Mark Zuckerberg,
[00:36:14] right? Like Zuckerberg was the first
[00:36:16] Silicon Valley executive to wear a
[00:36:17] hoodie in the boardroom. He like broke
[00:36:19] that barrier. And then Bankman-Fried is
[00:36:22] like the trailing edge of that, where not
[00:36:24] only am I going to wear a hoodie, I like
[00:36:26] don't bathe and I sleep on a bean bag.
[00:36:29] Um, and that is like the image of the
[00:36:32] tech genius that we've arrived at. And
[00:36:34] that's, you know, that was
[00:36:36] his marketing.
[00:36:37] >> And he famously went into a meeting with
[00:36:39] Bill Clinton in shorts, right? In
[00:36:41] those khaki shorts. And again, we had, even
[00:36:44] at the time,
[00:36:47] an interview with an executive who
[00:36:50] worked with him who was like, "Yeah, we
[00:36:52] intentionally wore shorts, because that
[00:36:54] was part of the image that we were
[00:36:56] trying to present,"
[00:36:57] >> Right. And so when he was prosecuted in
[00:37:00] 2022, this was terrible for
[00:37:03] the crypto industry. And it's taken the
[00:37:04] crypto industry like two years to come
[00:37:07] back from that.
[00:37:08] >> Yeah, it
[00:37:10] did take about two years. It's been
[00:37:12] about a year now. Yeah. And it was
[00:37:15] a reputational thing. And
[00:37:19] they blew up a lot of other
[00:37:21] people, because they had made fraudulent
[00:37:24] loans, or taken loans on a
[00:37:27] fraudulent basis, from lenders like, for
[00:37:30] example, Genesis. I don't know, we can
[00:37:32] talk about CoinDesk a little bit.
[00:37:34] >> Sure.
[00:37:34] >> Um
[00:37:35] >> But yeah, we blew ourselves up.
[00:37:38] >> Yeah, there was kind of an
[00:37:40] irony at CoinDesk, because we were the
[00:37:42] publication that was breaking these
[00:37:44] stories about SBF, but the parent
[00:37:46] company of CoinDesk, which was Digital
[00:37:48] Currency Group, or DCG, was itself
[00:37:52] impacted by those stories, and you can
[00:37:54] maybe explain.
[00:37:55] >> Yeah. So there was a trading shop
[00:37:57] under DCG. CoinDesk was another
[00:38:00] company under DCG, and Genesis got
[00:38:04] blown up after FTX collapsed. They lost,
[00:38:08] I don't know, I'm going to guess $300
[00:38:09] million. I'm making that up. Um, but
[00:38:12] basically, it created a crisis at our
[00:38:14] parent company, DCG, that then meant
[00:38:17] Coindesk had to be sold off to a new
[00:38:20] owner who has since become a rather poor
[00:38:23] steward of the institution. But
[00:38:26] more to the
[00:38:29] point, I lost my job. Michelle, I
[00:38:32] don't remember when you got laid off, but
[00:38:34] so yeah, we lost our
[00:38:37] jobs because of the reporting
[00:38:41] that we did about SBF. And on the plus
[00:38:44] side, I will say, not usually a huge
[00:38:47] fan, but the New York Times did a
[00:38:49] profile of us, New York
[00:38:51] magazine did a profile of us, so we
[00:38:53] got at least some publicity
[00:38:57] out of losing our jobs. We'll get right
[00:38:59] back to journalist David Morris on
[00:39:01] After Words from C-SPAN.
[00:39:04] Now back to journalist David Morris
[00:39:06] talking about his book Stealing the
[00:39:08] Future: Sam Bankman-Fried, Elite Fraud,
[00:39:10] and the Cult of Techno-Utopia.
[00:39:13] >> Yeah, just one more question before we
[00:39:15] turn it over to the audience. I mean,
[00:39:16] how much of a legacy does SBF have in
[00:39:19] the crypto industry today? I mean, are
[00:39:21] there other young kids going around with
[00:39:23] these crazy ideas of utilitarianism?
[00:39:27] Well, I think in
[00:39:28] crypto his name is as bad as it
[00:39:31] could possibly be. And it's been very
[00:39:33] interesting watching the reaction to
[00:39:36] a lot of these efforts to reframe the
[00:39:39] story, to, you know, get him a pardon
[00:39:43] or get him off. In the crypto
[00:39:45] industry, I don't think there's a single
[00:39:48] more hated person. Even like other
[00:39:50] fraudsters of the same generation like
[00:39:52] who were the Three Arrows guys? Uh,
[00:39:56] Kyle Davies and
[00:39:59] >> I can't remember.
[00:39:59] >> Su Zhu.
[00:40:00] >> Su Zhu. Yeah.
[00:40:01] >> Who are also complete psychopaths, but
[00:40:04] like they have a better shot of
[00:40:06] rehabilitating their image than Sam.
[00:40:09] The more interesting question, I think, is
[00:40:12] with effective altruism and this
[00:40:15] broader world of Silicon
[00:40:17] Valley ideologies. And effective
[00:40:20] altruism is also in a really bad place
[00:40:22] as a movement, largely thanks to Sam. I
[00:40:25] mean, the fallout was actually
[00:40:27] incredibly broad.
[00:40:31] Will MacAskill is the guy who is
[00:40:35] one of the two heads of effective
[00:40:37] altruism. He brought Sam into the
[00:40:39] movement. He has stepped down from
[00:40:41] almost all of his positions. He
[00:40:43] funneled about $36 million worth of
[00:40:46] largely stolen FTX funds into
[00:40:50] effective altruism groups. Some of
[00:40:52] that was given back, most of it
[00:40:53] grudgingly. So he lost his stature.
[00:40:57] Another sort of second-order
[00:41:00] consequence: Nick Bostrom is
[00:41:02] a guy who's very affiliated with the
[00:41:05] sort of techno-utopian world. He to a
[00:41:08] significant extent invented the idea of
[00:41:11] hostile or unaligned AI. And in
[00:41:16] parallel with the collapse of FTX,
[00:41:18] investigators discovered some
[00:41:20] unfortunate comments that he had made in
[00:41:21] the past. Oxford ultimately defunded
[00:41:24] what was called the Future of Humanity
[00:41:26] Institute, which was one of these
[00:41:28] long-termist think tanks that Bostrom
[00:41:30] had founded. So a lot of people
[00:41:33] had their reputations deeply damaged by
[00:41:37] their affiliation with Sam Bankman-Fried.
[00:41:38] And I should point out it's well
[00:41:40] deserved, because one of the things that
[00:41:42] we found out afterwards was that in 2018
[00:41:44] there was actually a rebellion inside of
[00:41:47] Alameda Research, because a lot of people
[00:41:49] who were there at the time, again, four
[00:41:51] years before what became the FTX
[00:41:54] collapse, people who were at Alameda
[00:41:55] Research when it was founded, were
[00:41:57] talking about the fact that there is no
[00:41:59] accounting here, the money is going
[00:42:01] everywhere, Sam is having sex with
[00:42:05] his executives, which he kept doing,
[00:42:07] apparently. And that was
[00:42:10] ignored, including by Will MacAskill and
[00:42:13] other effective altruist leaders who
[00:42:14] were informed of some of the concerns at
[00:42:16] the time. So, you know,
[00:42:19] there are some gray areas there, but
[00:42:22] you could argue that effective altruist
[00:42:24] leaders participated in a cover-up that
[00:42:27] ultimately led to the FTX catastrophe.
[00:42:31] >> Definitely a lot of people to blame.
[00:42:33] Um, who would like to ask a question?
[00:42:39] >> Pam, go for it. Oh,
[00:42:41] >> okay. Um well, first of all, David, uh
[00:42:44] your grasp of the subject is absolutely
[00:42:46] amazing. Um you've done an excellent job
[00:42:50] explaining it and I actually I've
[00:42:52] learned a lot. So I I have a question
[00:42:53] that may be a little bit simplistic,
[00:42:55] probably a lot. So, if I'm understanding
[00:42:59] correctly, if they weren't investing,
[00:43:03] if they weren't actually buying the
[00:43:04] cryptocurrency
[00:43:06] that they were claiming to,
[00:43:08] why did the markets
[00:43:10] fluctuate like that, if they actually
[00:43:14] were not buying these assets,
[00:43:15] >> Right. I mean, yeah. So this is
[00:43:17] the argument that
[00:43:21] crypto is real: FTX was a bad
[00:43:25] actor in this market, but there was
[00:43:27] still a huge market happening outside of
[00:43:29] FTX, and a lot of it, I'm not going
[00:43:32] to say it was all honest, but a lot
[00:43:34] of it was an honest market of people
[00:43:36] trading and using these assets, and that
[00:43:39] kept the activity going, even
[00:43:42] though FTX was this one relatively
[00:43:44] small player. I mean, FTX was not that big
[00:43:47] really relative to the size of the
[00:43:49] larger market, I think it's fair to say.
[00:43:51] And so they were really just
[00:43:53] kind of piggybacking on activity
[00:43:55] that was happening elsewhere. And, you
[00:43:56] know, the assets they were trading were
[00:43:58] also being traded on a bunch of
[00:44:00] different other venues. So, the prices
[00:44:02] were set elsewhere, essentially.
[00:44:06] >> That makes sense, because I
[00:44:07] was just sitting there trying to get my
[00:44:09] head around that. And the other thing,
[00:44:12] this artificial general intelligence,
[00:44:16] just so that I'm understanding, is that
[00:44:17] like maybe a computer waking up,
[00:44:19] like there
[00:44:20] >> This is the singularity. This is, again,
[00:44:23] the next book, and you
[00:44:26] have to take a couple hops to
[00:44:27] connect this to effective altruism, but
[00:44:30] it's very connected. It's this idea that,
[00:44:32] yeah, we're going to invent an
[00:44:36] artificial intelligence so good that
[00:44:38] it's both conscious, like an actual,
[00:44:41] you know, being, and also smarter
[00:44:44] than humans. Smarter than humans is kind
[00:44:46] of key to this whole
[00:44:47] >> smarter than the whole of human
[00:44:49] intelligence.
[00:44:49] >> Well, smarter than humans and smarter
[00:44:51] than everybody. Which, you know,
[00:44:54] once you're doing
[00:44:56] make-em-ups, you can really make them up.
[00:44:59] And so that's the
[00:45:01] idea. Yeah, that the
[00:45:03] computer becomes, I mean, I
[00:45:06] said God, and that actually is kind of a
[00:45:09] significant and literal part of their
[00:45:11] belief: that the AGI will attain a
[00:45:14] kind of godlike intelligence. And
[00:45:16] now you'll hear Sam Altman, a different Sam,
[00:45:21] perhaps the same cycle, saying that
[00:45:25] once we hit AGI, we will solve all human
[00:45:30] sicknesses, we will solve all economic
[00:45:32] problems, we will solve climate change,
[00:45:33] which is convenient, because they're
[00:45:35] boiling the oceans right now
[00:45:37] with the AGI servers. And so
[00:45:41] these are all these far-future
[00:45:43] promises: we're going to do it, therefore we have
[00:45:46] to take these shortcuts now, make these
[00:45:49] investments now, get these government
[00:45:51] subsidies now, because, I swear to God,
[00:45:54] mom, I promise, in three years we're
[00:45:57] going to have AGI, in five years we're
[00:45:59] going to invent God and we're going to
[00:46:01] cure cancer. That is the
[00:46:04] long-termist promise, and the way that
[00:46:06] they are getting people to give them
[00:46:08] money now for things they haven't done
[00:46:10] yet.
[00:46:18] >> Hi. Thank you. You create a
[00:46:21] really fascinating and really clear
[00:46:25] vision of the techno-utopia
[00:46:29] and the basis of that in
[00:46:34] Bankman-Fried and his
[00:46:37] mother's vision, and I just wonder,
[00:46:40] do you see a nexus
[00:46:43] between the rise of the
[00:46:46] techno-utopia idea and our
[00:46:50] own federal government and the
[00:46:52] oligarchs that support
[00:46:55] that government, or that are behind
[00:47:01] it or involved in it?
[00:47:03] >> Yeah. I mean, let's be really
[00:47:05] clear. I believe that techno-utopianism,
[00:47:09] writ broadly, is a
[00:47:13] slightly obfuscated fascist movement.
[00:47:15] Um, and that's despite the fact that a
[00:47:18] lot of the people involved with it
[00:47:19] probably think of themselves as like
[00:47:21] democratic liberals. Um, they're they're
[00:47:23] in this kind of like, you know, control
[00:47:26] space of we believe we can do the math
[00:47:28] to figure everything out. And, you know,
[00:47:31] as I said, Elon Musk is
[00:47:33] perhaps the most
[00:47:35] prominent adherent of these ideas.
[00:47:38] Peter Thiel is another very big one.
[00:47:41] And obviously they have huge
[00:47:43] influence in the government right now.
[00:47:45] And so you'll see things like the
[00:47:48] sort of more friendly regulations,
[00:47:51] perhaps for AGI, certainly for crypto,
[00:47:54] which we have mixed feelings about.
[00:47:57] And we also have effective
[00:48:01] accelerationism, another sort of term
[00:48:04] that these people adopt, and
[00:48:06] we won't get into accelerationism, but
[00:48:08] the fundamental idea is: deregulate
[00:48:12] everything, get rid of all the rules,
[00:48:14] let technology advance freely, and
[00:48:17] therefore we're going to get to utopia.
[00:48:18] So literally the government
[00:48:20] that we have right now, its
[00:48:22] deregulatory agenda, is driven
[00:48:26] to some degree by techno-utopian
[00:48:28] thinking.
[00:48:29] >> Right. So it's sort of along the lines
[00:48:31] that you were saying before about
[00:48:32] deciding for everybody what is
[00:48:36] good for everybody is a kind of fascism.
[00:48:39] >> Yeah. And specifically
[00:48:41] this idea that if we
[00:48:44] have no controls on technology, it will
[00:48:46] inevitably lead in a good direction.
[00:48:50] That's kind of core. And
[00:48:52] it's really worked out so far for us.
[00:48:54] You know,
[00:48:55] >> it's kind of market forces for
[00:48:56] technology.
[00:48:57] >> Yeah. And it's also
[00:48:59] significant that the EA
[00:49:02] people really believe in market forces
[00:49:06] as a source of information about the
[00:49:08] future. One of the
[00:49:11] things that's happened recently is that
[00:49:12] we have prediction markets that are
[00:49:14] legal in the United States. Prediction
[00:49:15] markets were invented by a guy named
[00:49:17] Robin Hanson, who worked for DARPA.
[00:49:20] He's still around. He's very much
[00:49:22] identified with the effective altruism
[00:49:24] and rationalist movement. And it's
[00:49:26] this idea that if you have people, you
[00:49:28] know, you've probably heard about
[00:49:29] Polymarket. If you have people betting on
[00:49:31] different outcomes, that is
[00:49:33] theoretically a good way to get
[00:49:35] information about what's going to
[00:49:36] happen. But it also becomes this
[00:49:40] very specific crutch for people
[00:49:44] who think that the market is the source
[00:49:46] of all truth. The other
[00:49:51] thing I will say about the sort of
[00:49:52] authoritarian and oligarchical
[00:49:54] connections here is that
[00:49:56] the specific idea that Will
[00:50:00] MacAskill presented to Sam Bankman-Fried at
[00:50:02] a coffee shop outside MIT
[00:50:07] in 2012 was what's called earn to give,
[00:50:10] and this is one of the EA concepts, which
[00:50:13] is: if you are a person who's talented in
[00:50:16] a few specific fields, including
[00:50:18] finance,
[00:50:20] maybe the most effective thing that you
[00:50:22] can do for the future of humanity is
[00:50:24] earn as much money as possible. Even if
[00:50:27] in the course of earning that money, you
[00:50:29] maybe aren't actually doing something
[00:50:31] that's good for the world because you're
[00:50:33] then going to take that money, you're
[00:50:34] going to give it back, and that's going
[00:50:36] to be your contribution. So, one of
[00:50:39] the really interesting things about Sam
[00:50:41] is that there's some indication
[00:50:45] that he was not actually good at very
[00:50:47] much except gambling. He has spoken a
[00:50:52] lot about how much he hated reading
[00:50:54] books. The other excerpt that I
[00:50:57] was considering reading tonight was
[00:51:00] about how much he hates books. And it
[00:51:03] also turns out he went to MIT thinking
[00:51:05] he would be a physics researcher. Turns
[00:51:07] out he wasn't actually good at physics
[00:51:08] either. Despite having sort of this
[00:51:10] mathematics background, he was kind of
[00:51:12] an indifferent student his entire life
[00:51:14] who got leveled up into elite schools
[00:51:16] kind of just because of his economic
[00:51:18] background. And, you know, he
[00:51:22] did not actually wind up becoming a
[00:51:23] genius, let's put it that way.
[00:51:26] >> More questions.
[00:51:30] >> Thank you. So, I was a history minor and
[00:51:33] one thing that all authoritarian regimes
[00:51:36] have is this like cult of personality. I
[00:51:39] unfortunately don't have a copy of your
[00:51:40] book, but I was wondering if you touched
[00:51:42] on that at all, about kind of, we have a
[00:51:44] picture of Mao Zedong in China and then we
[00:51:46] have SBF building a building.
[00:51:49] >> Yeah.
[00:51:49] >> Looks like him.
[00:51:50] >> Yeah, I mean, a million percent. This is
[00:51:52] kind of one of the
[00:51:55] points of the book is that I've spent a
[00:51:57] lot of time not just writing about Sam
[00:52:00] Bankman Freed, but investigating and
[00:52:01] writing about frauds in general. And one
[00:52:03] of the things that's really shocking to
[00:52:05] me or has been shocking to me over the
[00:52:06] years is that when you're sitting there
[00:52:08] trying to explain to somebody how
[00:52:10] they've been defrauded, they are very
[00:52:13] resistant to getting that message. And
[00:52:17] the essential argument that I make in
[00:52:18] the book, and I mean it's not that, you
[00:52:21] know, complicated or advanced, but we
[00:52:23] have these heroes of capitalism. They
[00:52:25] are heroes. Um, and we want to follow
[00:52:28] them. And Elon Musk obviously has
[00:52:30] the most insane cult of personality of
[00:52:32] probably anybody alive right now. I
[00:52:34] mean, beyond Mao, honestly, I think
[00:52:37] Elon has transcended. But
[00:52:41] Bankman-Fried had a very strong cult of
[00:52:43] personality both kind of in the public
[00:52:45] in terms of people willing to invest in
[00:52:47] him, and also privately. I mean, he had
[00:52:50] this inner circle of people who shared
[00:52:52] his philosophical beliefs who were
[00:52:55] willing to participate in multi-billion
[00:52:58] dollar fraud because they trusted him.
[00:53:02] And there was this inner
[00:53:05] group of people who were really helping
[00:53:07] him out. And it was all ideological.
[00:53:10] It was all because they had committed to
[00:53:11] this thing called effective altruism and
[00:53:14] they believed that Sam was kind of the
[00:53:16] avatar of of these uh values. Um but
[00:53:20] yeah, I mean obviously he was on
[00:53:21] magazine covers and the thing that's
[00:53:24] very like particularly infuriating is
[00:53:26] that you know he knew that the reason he
[00:53:30] was getting on magazine covers was not
[00:53:32] because he had founded I mean plenty of
[00:53:34] people I don't think Brian Armstrong has
[00:53:37] ever been on the cover of Fortune
[00:53:38] magazine and he's the founder of
[00:53:40] Coinbase, the uh very large domestic
[00:53:42] crypto exchange. The reason Sam
[00:53:44] Bankman-Fried got on the cover of Fortune,
[00:53:46] though Brian Armstrong, who has been
[00:53:47] running a successful business for a
[00:53:49] decade, never has been, is because he
[00:53:51] made a big deal about his charity
[00:53:52] giving. He was like, "My charity and
[00:53:54] political giving, I'm going to give a
[00:53:56] billion dollars to Democrats," which by
[00:53:58] the way was a number that he pulled
[00:53:59] completely out of his ass. Um, and
[00:54:02] >> he was the biggest donor in that cycle,
[00:54:03] right?
[00:54:04] >> He was the biggest donor, but he just
[00:54:05] kind of made up a future number, which
[00:54:07] is they're very good at making up future
[00:54:08] numbers. Um, I think his his total
[00:54:11] overall wound up being 200 million or
[00:54:14] less before he crashed out. Um, and and
[00:54:17] so yeah, like the cult of personality is
[00:54:20] very real. And and my argument is that
[00:54:23] Bankman-Fried had such success because
[00:54:25] he's presenting this image. He's young.
[00:54:28] He's a young relatively young guy. I
[00:54:30] mean, people people used his youth to
[00:54:32] try and apologize for him, but he was in
[00:54:34] his late 20s, early 30s, relatively
[00:54:35] young. And then he has this promise that
[00:54:37] like, yeah, we're all going to make a
[00:54:39] ton of money, but then we're also going
[00:54:41] to fix all the problems. Um, so like if
[00:54:44] you're just a pure capitalist, I think
[00:54:45] people have cooled on that idea a little
[00:54:48] bit. But if you mix in just a little bit
[00:54:49] of do-goodery, uh, you can get on the
[00:54:52] cover of Fortune. And he really played
[00:54:53] it like a harp from hell.
[00:54:56] >> Thank you. It's really good. Good
[00:54:58] insight.
[00:55:00] >> Any more questions?
[00:55:04] One more or
[00:55:06] >> um so this just kind of speaks to maybe
[00:55:09] where the future of of all of this is
[00:55:11] going now that you know crypto has had
[00:55:13] this huge scandal and um and everybody
[00:55:16] in their mom is getting into it and um
[00:55:18] and you have people who um are just
[00:55:20] known as and I don't want to get into
[00:55:22] politics here but what do you do about
[00:55:24] um government um culpability um you know
[00:55:29] helping with the negligence or just
[00:55:30] greed um you know Donald Trump now um is
[00:55:34] has jumped into the game and he's just
[00:57:36] making money hand over fist and he's not
[00:55:38] pretending that oh I'm you know this you
[00:55:40] know effective altruism no I'm just
[00:55:42] greedy so I'm just wondering um
[00:55:46] >> what do you see that what do you see
[00:55:48] what do you have to say about that
[00:55:50] >> um well I I think that it is like a
[00:55:54] there are two things I think that so Sam
[00:55:57] actually made all of these political
[00:55:59] donations and was very effective at
[00:56:02] getting political influence for the time
[00:56:05] before he killed himself. Um, and that I
[00:56:09] think became a model for the crypto
[00:56:12] industry more broadly, which has then
[00:56:14] since since Sam in the last couple years
[00:56:17] and obviously in the run-up to the
[00:56:18] election lobbied very heavily, spent a
[00:56:21] ton of money. Um, I think was it
[00:56:23] Coinbase or Kraken that gave a bunch to
[00:56:25] the ballroom remodeling that's happening
[00:56:27] right now?
[00:56:28] >> Absolutely. Yeah. I mean, Coinbase
[00:56:30] sponsored the US Army parade the other
[00:56:33] day, for instance,
[00:56:33] >> right? That was a great choice. Um, and
[00:56:36] and so you do have these crypto
[00:56:38] companies that are now very influential
[00:56:40] in politics just from having money to
[00:56:43] give. And you know, I think that, you
[00:56:45] know, Ben can check me here, but the the
[00:56:48] irony of course is that like I think
[00:56:50] both of us believe that um crypto
[00:56:54] actually is a real thing. Like it's not
[00:56:56] a scam in itself. Sam Bankman-Fried's
[00:56:59] fraud could have had nothing to do with
[00:57:03] crypto at all and it still would have
[00:57:05] been the same fraud. He just embezzled a
[00:57:06] bunch of money. Um he didn't do any
[00:57:08] >> embezzling
[00:57:10] anything,
[00:57:11] >> stocks, whatever. He didn't do any like
[00:57:12] crazy hacking or anything really. Um and
[00:57:16] uh and so these crypto companies and I
[00:57:18] mean this is like the real irony for
[00:57:20] we've been watching crypto frauds
[00:57:22] happening for 10 years now and it's like
[00:57:24] if you had done it honestly you actually
[00:57:27] would have come out way ahead of where
[00:57:30] you were because you committed a crime.
[00:57:32] Um but regardless the point is these
[00:57:34] these crypto companies have a ton of
[00:57:36] money. They're using it to influence
[00:57:37] politics and some of it is I think good.
[00:57:41] Some of it actually is like um there
[00:57:43] there is a need to regulate these
[00:57:45] systems. They exist. They're real. Um
[00:57:48] but obviously we are also at the same
[00:57:51] time in the crime is legal era of
[00:57:54] American politics. Um and and that's
[00:57:56] visible everywhere. Um and I think that
[00:58:00] in a weird way
[00:58:02] SBF was like the last guy to go to jail
[00:58:05] for these crimes.
[00:58:07] And I think that um in in some sense um
[00:58:11] we are going to look back on it with
[00:58:13] nostalgia because I think there's a lot
[00:58:15] of it going on that's not getting
[00:58:16] prosecuted right now
[00:58:18] >> obviously and you pointed to one case
[00:58:19] that I won't comment
[00:58:25] >> please. Well, one thing I remember about
[00:58:28] SBF before the collapse is that he was
[00:58:30] like very pro-regulation when basically
[00:58:34] every other crypto founder was not. Do
[00:58:36] you think the fact that he was so
[00:58:38] pro-regulation like we should be
[00:58:40] regulated like we'll work with Congress
[00:58:42] to be regulated kind of
[00:58:45] >> duped more people and like
[00:58:47] >> I'm gonna actually give you a real I'm
[00:58:49] going to give you the the answer and
[00:58:50] then I'm going to drop some insane lore.
[00:58:52] Um which is I mean the real answer is
[00:58:54] yes. Sam Bankman-Fried specifically had
[00:58:57] policy proposals that benefited him. He
[00:59:00] actually wanted um regulation that some
[00:59:04] critics argued would have cut off what
[00:59:06] we call DeFi um which is actually like
[00:59:09] distributed
[00:59:10] uh exchanges and and lending and things
[00:59:13] like that would have stopped that so
[00:59:15] that FTX as a centralized exchange
[00:59:18] controlled by one guy could continue
[00:59:20] making all this money. And I think a lot
[00:59:22] of people like the the furthest fringe
[00:59:24] of this conspiracy theory I would say is
[00:59:27] that Bankman-Fried wanted to change the
[00:59:29] laws so that he controlled so much of
[00:59:32] the market through FTX that he could
[00:59:34] cover up all the money that he had
[00:59:36] stolen. So he was actually going to
[00:59:37] Washington to get the laws changed in
[00:59:40] such a way that he could cover up the
[00:59:41] crimes that he was actually committing
[00:59:43] while he was going to Washington. Um so
[00:59:46] I think that is one um element here and
[00:59:49] the uh the other thing this is the lore
[00:59:51] that I'll drop because he was fighting
[00:59:53] against DeFi platforms like Uniswap um
[00:59:57] and other analogues but going way back to
[01:00:01] 2019 2020 Ben I'm not even sure I've
[01:00:04] talked to you about this there was a
[01:00:06] fork of unis swap which is a an exchange
[01:00:09] fork of Uniswap, which is a decentralized
[01:00:12] on the blockchain. There was a fork of
[01:00:13] Uniswap back in 2019, 2020 called
[01:00:16] SushiSwap. Um and they did what's
[01:00:19] called a vampire attack. They give a lot
[01:00:21] of incentives for customers from Uniswap
[01:00:23] to go over to SushiSwap, this clone um
[01:00:26] DeFi exchange doing swaps, doing all of
[01:00:29] the same stuff that FTX would have done.
[01:00:32] Um, it basically failed, blew up, but it
[01:00:35] was clearly an attack on a decentralized
[01:00:38] exchange that had been established and
[01:00:41] was growing and was becoming a
[01:00:42] competitor to the centralized exchanges.
[01:00:46] I'm trying to figure out how to phrase
[01:00:48] this publicly, but I have spoken to
[01:00:51] sources who have strong suspicions that
[01:00:54] SushiSwap um was actually founded by
[01:00:58] and controlled by Sam Bankman-Fried and
[01:01:01] was intended to attack their DeFi
[01:01:04] competitors to make FTX stronger, which
[01:01:07] is the kind of thing that you can do
[01:01:09] when you're on incredible amounts of
[01:01:12] amphetamines.
[01:01:16] Maybe that's a good place to leave it.
[01:01:18] Uh,
[01:01:20] >> it really is.
[01:01:22] >> Well, David, uh, you're an amazing writer,
[01:01:24] amazing person. This book is amazing.
[01:01:26] Should all read it. So, thanks so much.
[01:01:28] >> Thank you so much, Ben. And thanks to
[01:01:30] all of you for coming.
[01:01:39] >> Thanks for listening to Afterwards from
[01:01:41] C-SPAN. You can watch more conversations
[01:01:44] with authors like this one every Sunday
[01:01:46] on C-SPAN 2's BookTV.
[01:01:49] Book TV and all the C-SPAN networks are
[01:01:52] now available to watch live on YouTube.
[01:01:55] Learn more at tv.youtube.com.
[01:01:59] And next week on Afterwards, Beronda
[01:02:01] Montgomery will talk about the botanical
[01:02:03] knowledge developed by African-Americans
[01:02:06] and her book, When Trees Testify:
[01:02:09] Science, Wisdom, History, and America's
[01:02:12] Black Botanical Legacy.