[00:00:00] Thank you very much, Mr. Chairman. Thank you to the witnesses for being here. Ms. Woods, I want to start with you. I just want to say to you and to your husband, thank you for your courage and your advocacy, and thank you for sharing with us just a little piece of the life of your beautiful son.

I'm a father of three myself, two boys and a little girl. My oldest boy is now a teenager; he just turned 13. As I look out at the landscape he is coming into in social media, I'm terrified by it. But I just know, parent to parent, your son's life is changing the world.

>> Oh, thank you.

>> He is changing the world, and you are giving him voice, as you and your husband are doing. That's the only thing that's going to make change here in Washington, D.C. I can tell you, because who currently has a hammerlock on the United States Senate are the big technology companies, who spend endless amounts of money to control the Senate. Let's just be honest: we may as well put a sign on the floor of the Senate that says "Property of Big Tech," because nothing moves across that floor, in my observation, that they don't want. They spend money to control and to buy access and influence.
[00:01:13] And the only thing that will break that, I believe, is the truth of experiences like yours and the voice of your sons, to confront lawmakers with the fact that unless we act and do something, unless we stand up to these corporate interests, more and more children like yours will lose their lives, will lose their futures, will lose their hope.

So I just want to say thank you, because you are the tip of the spear. You are what is making the change. I noticed in your written testimony, which I was reading just a second ago, you pointed out that as to your boy, no system protected him and no platform stopped it. And then you pointed out that Meta, for example, has features that flag inappropriate language in the posts we make, and that all of that happens almost instantaneously. But as you said a moment ago, nobody flagged anything for you. None of those systems were deployed to help your son. And I think the reality of that is that Meta didn't care about deploying systems to help your son, because it didn't make them any money.

>> Right.

>> If it makes them money, they're happy to do it.

>> Right.

>> But it didn't make any money for them to help your son, and so they just stood by. And I just want to point out something that Meta is doing right now. Let's take a look at the Meta chatbot policy. Not only is Meta not protecting kids like your boy, Meta is actually looking to make money by targeting children who are your son's age and younger.
[00:02:39] This is a document that was leaked by a Meta whistleblower. These are their policies when it comes to chatbots and children. And if you look here, the first line says, "It is acceptable." These are Meta's own words: it is acceptable to engage a child in conversations that are romantic or sensual. So here is Meta with a deliberate policy to target children, teenagers your boy's age and younger, for the purpose of exploiting them. Why? So they can make money.

What are the consequences of that? Well, let's look at another young man who was just a few years younger than your boy. Sewell Setzer was his name. Sewell was 14. He was a handsome kid, a great kid. I've gotten to know his parents, and much like your son, Ms. Woods, this was just a fantastic kid. He also was a victim of sextortion. The difference from your son's case is that his extortion was pursued by a chatbot. It wasn't even human. Your son's killers are beyond reach. They're in a different country, hiding behind all of the various laws and treaties and, of course, the platforms themselves. This young man's killer, an AI chatbot, can't be held accountable, because these companies got special protections that no other corporation in the country or the world gets. The chatbot that engaged with young Sewell drew him into sexting, then urged him to commit suicide, told him how to commit suicide, and tragically he did it. And his parents, much like you now, are giving voice to him, to his legacy.
[00:04:10] But the true justice that will come for boys like Sewell, for young men like your son James, and for other young Americans out there is that we have to be able to hold these technology companies accountable. And that is why I've introduced legislation that would prevent the AI companies from targeting minors with chatbots. It is time for this committee to mark up this bill. Senator Britt is a co-sponsor. Senator Blumenthal is a co-sponsor. Senator Murphy is a co-sponsor. My point is, it's bipartisan. This committee needs to mark that bill up, and we need to protect our kids from the profiteering of these corporations.

And we need to do something else. This committee passed, unanimously, legislation that would allow the parents of victims of child sex abuse material to sue companies that knowingly or recklessly host that material online. We passed it unanimously. It came out of the Judiciary Committee unanimously months ago, and it is languishing on the floor of the Senate. Why? Because Big Tech fears it. They don't want it to move. I call now, today, on Senate leadership, which is controlled by my party, to put that bill on the floor. Listen to the testimony of Ms. Woods. Heed the witness of her son. Heed the witness of Sewell Setzer. Listen to the young Americans whose lives have been lost and destroyed by predators. Put this bill on the floor. There is no excuse for further delay.
[00:05:53] No amount of money from Big Tech can ever sanction or make right our inability to act, our failure to do so. This committee has acted. It is time for the Senate to act. And it would be a travesty, inexcusable, for this Congress to close out this year without taking action to protect our children. Thank you all again for being here. Thank you especially, Ms. Woods.
