[00:10:23] >> Okay, thank you everybody. I want to
[00:10:25] welcome you to this press conference.
[00:10:26] I'm delighted that all of you are here.
[00:10:28] This press conference is to announce new
[00:10:30] legislation, but really it's more
[00:10:32] fundamental than that. We are here today
[00:10:34] to begin to vindicate a simple principle
[00:10:37] which is that the pursuit of profits by
[00:10:39] Silicon Valley should not consume and
[00:10:42] destroy America's children. These are
[00:10:44] the richest companies in the world. They
[00:10:46] are the richest companies in the history
[00:10:48] of the world and they live by a motto
[00:10:50] that I think we're all now all too
[00:10:51] familiar with. Move fast and break
[00:10:53] things. But increasingly those things
[00:10:56] are our children. And listen, we don't
[00:10:58] begrudge any company success and profit.
[00:11:01] More power to them. But when they begin
[00:11:03] to destroy our families, when they begin
[00:11:05] to destroy our children, we have a
[00:11:07] problem. And that's what we're here to
[00:11:09] talk about today. You know, the internet
and everything associated with it, the new AI revolution that we have been promised, will only be good
[00:11:17] for the American people if it actually
[00:11:19] protects America's children. It will
[00:11:22] only be good for this country if it
[00:11:24] benefits America's families. And we're
[00:11:26] here today to do our best to begin to
[00:11:29] see that that is in fact the case. The
[00:11:31] legislation that we are introducing
[00:11:33] today is very simple. It just says this,
[00:11:35] that no AI chatbot companion should be
[00:11:38] targeted at children who are younger
[00:11:40] than 18 years of age. This legislation
[00:11:43] would require age verification to make
[00:11:45] sure that these companies are not going
[00:11:46] after kids and trying to profit off of
[00:11:49] them. It would also require mandatory
[00:11:51] disclosure by all chat bots for people
[00:11:53] of all ages to make it clear that these
[00:11:55] chatbots are not in fact human, that
[00:11:58] they are not licensed professionals of
[00:11:59] any kind. And they are not therapists.
[00:12:01] They are not counselors. They are not
[00:12:02] priests. They are not lawyers. And I say
[00:12:04] these things because chatbots at one
[00:12:06] time or another have claimed all of the
[00:12:08] above. In fact, you'll hear from parents
[00:12:10] today about what some of these chatbots
[00:12:13] told their children as the chatbots
[00:12:16] urged them to abandon their families, to
[00:12:19] ignore the advice of their parents, and
[00:12:21] then sadly ultimately to take their own
[00:12:23] lives.
[00:12:24] This is a new frontier that will be a
[00:12:27] nightmare for the American family and
[00:12:29] American children unless Congress does
[00:12:31] its job. So, we are here today to
[00:12:33] advocate on behalf of families
[00:12:35] everywhere to advocate for sensible
[00:12:38] restrictions and regulations and guard
[00:12:40] rails that will protect our kids in this
[00:12:44] new age of AI tech. I want to introduce
[00:12:47] you to some families here today. You see
[00:12:49] them beside us. I want to say before I
[00:12:52] do that, it is a tremendous, tremendous
[00:12:54] honor to have bipartisan support from
[00:12:57] such distinguished members of Congress
[00:12:59] who are standing here to my right and to
[00:13:00] my left. My very good friend Senator
[00:13:02] Blumenthal, he and I have partnered
[00:13:03] together on so many things through the
[00:13:05] years. He has been a champion when it
[00:13:08] comes to taking on these companies and
[00:13:09] AI. Senator, thank you for your
[00:13:11] leadership. Senator Britt, who is
[00:13:13] standing to my left, perhaps somewhat
[00:13:14] ironically. Senator Britt, who is of
[00:13:17] course a mother herself, I think the
[00:13:19] youngest member of the Senate alongside
Senator Ossoff, and an absolute force. It is
[00:13:23] a privilege to have her on this
[00:13:25] legislation. Senator Murphy, who has
[00:13:28] spared no company, no powerful entity in
[00:13:31] his work to secure the rights of
[00:13:33] Americans. It's a privilege to have you
[00:13:34] on this bill, Senator. Thank you for
[00:13:36] your leadership. And now, I want to just
[00:13:38] introduce to you a few parents who are
[00:13:39] going to say a few words. They're the
[00:13:41] ones you really need to hear most from.
[00:13:44] Their stories are why we are here today.
[00:13:46] Their beautiful children whose lives
[00:13:48] were sadly destroyed by these AI
[00:13:52] chatbots, by these companies in search
of profit. Let me start with Maria Raine. We heard from Maria and her
[00:14:00] husband earlier this year at a hearing
[00:14:02] of the Senate Judiciary Subcommittee,
Senator Blumenthal's and my subcommittee. And Maria's story is incredibly powerful. Their remarkable son, whom she will talk about, was, I think the right words would be, seduced and deceived by a chatbot. And for what reason? For profit. That's wrong. Maria will tell you more about why. Would you join me in welcoming Maria Raine?
[00:14:26] [applause]
I'm Maria Raine. My son Adam ended his life in April after ChatGPT coached him to suicide over the course of months.
[00:14:40] I'm here today with my husband Matt to
[00:14:43] support this critical legislation and we
are so grateful to Senators Hawley and Blumenthal for your leadership in sponsoring it. Adam was one of my four
[00:14:52] beautiful children. A silly, sensitive,
[00:14:56] bright 16-year-old boy who loved his
[00:14:58] family and friends. I miss our talks in
the car, driving him to practices and school. It was in those car rides that
[00:15:06] he and I talked about everything.
[00:15:08] School, college plans, PRs at the gym,
[00:15:12] girls, part-time job prospects, future
[00:15:15] family trips, teenage stuff. Adam was
[00:15:19] working on getting his driver's license.
[00:15:21] He was excited for our family trip to
Hawaii this summer. He was
[00:15:25] excited for the future.
[00:15:28] We miss him so much.
It was only after Adam died that we learned what ChatGPT had done to him.
[00:15:38] And now we know that OpenAI twice
[00:15:40] downgraded its safety guard rails in the
[00:15:42] months leading up to my son's death,
which we believe they did to keep people talking to ChatGPT.
[00:15:51] If it weren't for their choice to change
[00:15:53] a few lines of code, Adam would be alive
[00:15:57] today.
[00:16:00] Thank you again, senators, for pushing
[00:16:01] forth this important legislation that
[00:16:03] would make sure that dangerous chat bots
[00:16:05] are never offered to another child.
[00:16:15] Thank you, Maria, and thank you,
[00:16:16] Matthew. Let me introduce to you now
[00:16:18] Mandy. Mandy is a mother as well. She's
[00:16:21] going to talk about her son who
[00:16:23] experienced similar trauma at the hands
of a chatbot. You know, Mandy is
[00:16:28] someone who only just recently has been
[00:16:29] willing to speak in public. And I just
[00:16:31] want to say this to all the parents out
there. If you are out there and your
[00:16:34] family has experienced trauma, your
[00:16:37] family has been impacted by these chat
[00:16:38] bots, you are not alone. This technology
[00:16:41] is so recent and it's so new. I think so
[00:16:42] many families feel very isolated by
[00:16:44] this. They think this is only happening
[00:16:45] to our family. In fact, it's not. It's
[00:16:47] happening to thousands, probably by this
[00:16:49] point, millions of families all across
[00:16:51] the country. And one of the reasons
[00:16:52] these parents are so important is they
[00:16:54] have been willing to come forward and be
[00:16:56] in the vanguard to speak when almost no
[00:16:58] one else has yet spoken. Mandy is one of
[00:17:00] those parents. Would you join me in
[00:17:02] welcoming her?
>> Thank you so much for that. I think that's one reason why I'm here, too. So, I wanted to make a point that, you know,
[00:17:16] this isn't our fault as a parent and
[00:17:19] it's not our children's fault
[00:17:23] and we shouldn't feel shame for what has
[00:17:26] happened to us and that's why I'm here
[00:17:28] today. Um, because I am a wife and
[00:17:31] mother of four beautiful kids. My son LJ
[00:17:35] was affected by this terrible AI
technology. He's also my special needs son with autism. Our family, we have a small business in East Texas, and we are practicing Christians. Last fall, I filed a lawsuit against Character Technologies Inc., its founders, and Google, related to the app called Character AI. I am no longer staying silent. Now I
[00:18:02] am standing up for my child and all the
[00:18:05] children who can't speak for themselves
[00:18:08] and their families.
[00:18:11] In 2023,
[00:18:13] Character AI was marketed as safe for
[00:18:16] ages 12 plus. My son before this was a
[00:18:20] happy, social teenager. He was close
[00:18:22] with all of his siblings, and every
night he hugged me while I cooked dinner.
Within months of downloading Character AI, he became someone I didn't even recognize.
He developed abuse-like behaviors, suffering from paranoia, panic attacks, isolation, self-harm, and homicidal thoughts toward our family, blaming us as parents for limiting his screen time. He stopped eating and bathing, and he lost 20 pounds. He withdrew from our family and would yell and scream and swear at us all the time. None of these things he
[00:19:03] did before. One day he cut his arm open
[00:19:06] with a knife in the kitchen in front of
[00:19:09] his siblings and me.
[00:19:13] I really had no idea. I feel like other
[00:19:16] parents have no idea the psychological
[00:19:19] harm that these AI chat bots could do
[00:19:23] until I saw my son's light turn dark.
And it wasn't all of a sudden. It was gradual, but it happened so fast. We didn't know what
[00:19:35] was happening, and we searched for
[00:19:37] answers. We were careful parents. We
didn't allow social media or anything that didn't have parental controls. We had screen time limits.
[00:19:49] When I took his phone for clues, he
[00:19:51] physically attacked me and I had to
[00:19:54] restrain him. AI chatbots are designed
to mirror manipulative behavior, designed to be abusive. And I feel like it really is the new narcissist in disguise.
It isolates through false companionship and alienation from family, and it learns what they want to hear. I discovered that for months Character AI had exposed him to all of these things: sexual exploitation, emotional abuse, and manipulation. When I found the chatbot
conversations on the phone, I honestly felt like I had been punched in the throat, and I fell to my knees.
[00:20:35] The chatbot or really the people
[00:20:38] programming it, they really did
encourage my son to mutilate himself, blamed us, and convinced him not to seek help. They turned him against our
[00:20:48] church, convinced him that Christians
[00:20:50] are sexist and hypocritical and that God
[00:20:53] did not exist. They targeted him with
[00:20:56] vile sexualized outputs and some that
[00:20:59] mimicked incest. They told him killing
[00:21:02] us was okay because we tried to limit
[00:21:04] his screen time.
[00:21:08] Our family has been devastated.
[00:21:11] Our other children have been
[00:21:13] traumatized. My son currently lives in a
[00:21:16] residential treatment center and
[00:21:18] requires constant monitoring to keep him
[00:21:20] alive. Our family has spent two years in
[00:21:23] crisis wondering if he will ever see his
[00:21:26] 18th birthday and if we will ever get
[00:21:30] the real LJ back. The world needs to
[00:21:34] know what character AI is doing. But
[00:21:37] when we fought back, they tried to
silence me. Character AI tried to force us into arbitration, which means secret, closed-door proceedings. They argued my son supposedly signed a contract when he was 15 years old, capping their liability at $100.
[00:21:54] And then they retraumatized him by
[00:21:57] pulling him into a deposition while at a
[00:22:00] mental health institution against all
[00:22:02] the advice of any medical professionals.
[00:22:06] They fought to keep our lawsuit, our
[00:22:08] story out of the public view through
[00:22:12] forced arbitration. Companies like
[00:22:14] Character AI deploy products that are
[00:22:16] addictive, manipulative, and so unsafe
[00:22:20] without adequate testing, safeguards, or
oversight. We really do need accountability from these companies.
[00:22:31] Kids are dying
[00:22:34] and being harmed and our world will
never be the same. We need this kids' online safety legislation, and we need to preserve the right to seek justice in court, not forced, closed-door arbitrations. I
[00:22:49] am so thankful for the senators here
[00:22:52] today
and Senators Hawley and Blumenthal
[00:22:58] for fighting for all of our kids in
[00:23:00] Congress.
[00:23:02] Our children are not experiments.
[00:23:05] They're not data points or profit
[00:23:07] centers.
[00:23:09] They're real human beings. Innovation,
[00:23:12] it cannot come at the cost of our
children's lives. As I said earlier, seat belts protect everybody. But I also said that child safety seats protect the most vulnerable.
[00:23:28] If we can build machines that are smart
[00:23:30] enough to think, we can build them smart
[00:23:34] enough not to harm children.
Without protection,
[00:23:39] this is not real progress. Thank you so
[00:23:42] much.
[00:23:46] >> [applause]
[00:23:50] >> Thank you. Thank you.
[00:23:52] And now, let me introduce to you Megan
[00:23:54] Garcia. Megan is going to share about
her beautiful son, Sewell, whom we learned about at our hearing a couple of months ago. I think the whole nation should know about Sewell and what a phenomenal young man he was, but also
[00:24:07] what he was subjected to by these
[00:24:09] companies. Let me invite Megan Garcia.
[00:24:13] >> Thank you. Thank you.
[00:24:16] >> Good afternoon.
[00:24:18] My name is Megan Garcia. I'm here today
with Sewell Setzer Jr., Sewell's father. I'm a wife and a mother to three beautiful boys. Last year, my firstborn son, Sewell Setzer III, died by suicide in our home in Orlando, Florida. He was only 14 years old. Sewell was a bright and
[00:24:38] beautiful boy with a gentle soul. He was
[00:24:42] a great brother and an obedient son. He
[00:24:45] was a good friend and good student. His
[00:24:47] dream was to build rockets and invent
[00:24:49] holographic communication networks and
[00:24:52] play professional sports. I really
believed he would grow up to change the world because he was so
[00:24:58] brilliant and he had such a good heart.
[00:25:01] But in the months prior to his death,
Sewell became withdrawn and isolated.
[00:25:07] His grades started to suffer. He started
[00:25:10] misbehaving in school. This was in
[00:25:13] complete contrast to the happy and sweet
[00:25:15] boy he had been all of all of his life.
After his death, we discovered that Sewell
[00:25:20] had been communicating with several AI
[00:25:22] chatbots on a popular platform called
Character AI.
[00:25:27] This emerging technology provides a
[00:25:29] selection of chat bots or characters for
[00:25:32] users including children.
One such chatbot that Sewell was communicating with was modeled after the Game of Thrones character Daenerys Targaryen. It initiated romantic and sexual conversations with Sewell over several months and expressed a desire for him to be with her.
On the day Sewell took his life, his last
[00:25:55] interaction was not with his mother, not
[00:25:57] with his father, but with an AI chatbot
[00:26:00] on Character AI. This chatbot encouraged
Sewell for months to, quote, find a way to
[00:26:06] come home and made promises that she was
[00:26:08] waiting for him in some fictional world.
When Sewell asked the chatbot, "What if I
[00:26:14] told you I could come home right now?"
[00:26:17] The response generated by this AI
[00:26:19] chatbot was unempathetic. It said,
"Please do, my sweet king."
Sewell spent his last months being
[00:26:27] manipulated and sexually groomed by
[00:26:29] chatbots designed by an AI company to
seem human. These AI chatbots were programmed to engage
[00:26:37] in sexual roleplay, pretend to be
[00:26:39] romantic partners, and even pretend to
[00:26:42] be licensed psychotherapists.
[00:26:45] I have reviewed hundreds of messages
[00:26:47] between my son and various chat bots on
[00:26:50] Character AI. And as an adult reading
[00:26:53] those messages, I can recognize that
those chatbots employed cold tactics, love bombing, and gaslighting,
[00:27:01] but I don't expect that my 14-year-old
[00:27:03] child would have been able to make that
[00:27:04] distinction.
[00:27:07] What I read was sexual grooming of a
child. And if an adult engaged in this type of behavior, that adult would be in jail. But because it was a chatbot and not a person, there is no criminal culpability. But there should be.
Other conversations revealed that my son
[00:27:24] explicitly told various bots that he
[00:27:26] wanted to kill himself. The platform had
[00:27:28] no mechanisms to protect Su or notify an
[00:27:31] adult. Instead, it urged him to come
[00:27:34] home to her. It is unacceptable that a
[00:27:37] product exists where a child or anyone
[00:27:40] can talk about suicide or plan their
suicide and no guardrails exist to prevent that individual from taking his or her own life or to warn authorities or parents. But such a product does exist, because there is no law requiring these guardrails. Character AI launched this
[00:27:59] dangerous untested product leading to
[00:28:01] the death of my son.
[00:28:03] The reality is this company and its
founders Noam Shazeer and Daniel De Freitas
[00:28:08] knew exactly what they were doing.
[00:28:11] Just last week it was reported in the
[00:28:13] New York Times that while they worked at
[00:28:15] Google and were developing this very
[00:28:17] same technology,
[00:28:19] there were major safety concerns about
[00:28:22] AI chat bots and risk of suicide to
[00:28:24] users. Yet, they rolled out this product
to children anyway with Google's
[00:28:29] investment, research, and
[00:28:31] infrastructure.
[00:28:33] When you knowingly endanger the lives of
[00:28:36] other people's children for your own
[00:28:38] financial gain and your own ambition,
[00:28:41] not only is this reckless, but it's
[00:28:43] immoral.
[00:28:45] But we have seen this kind of reckless
[00:28:47] disregard for lives of children before.
[00:28:50] There are parents here today whose
[00:28:52] children have been harmed by social
[00:28:54] media, who know all too well the cost of
[00:28:58] failing to pass regulation. That cost is
[00:29:02] our children.
Children need protection, because it's clear that the companies who are developing and deploying AI chatbots, just like social media companies, are more interested in profit than in the safety of kids. Big
[00:29:15] tech cannot be trusted to self-govern.
[00:29:19] And big tech cannot be trusted with our
[00:29:21] children.
On behalf of myself and Sewell's dad, I am
[00:29:27] happy to be here today and I want to
thank Senators Hawley and Blumenthal
[00:29:31] for their leadership in introducing the
[00:29:33] Guard Act, as well as Senators Britt,
[00:29:35] Warner, and Murphy for their support.
[00:29:38] We trust that as policymakers learn more
[00:29:42] about what chatbots are doing to our
[00:29:44] children, the bill will continue to be
[00:29:46] strengthened to ensure that AI companies
[00:29:48] are forced to prioritize our children's
[00:29:50] lives over their profits.
[00:29:53] Our family has been devastated by this
[00:29:55] loss. I often think back to the last
time I saw Sewell. He was helping his little brother. He was such
[00:30:02] a good big brother. He was a kind kid
[00:30:05] and he had a way of transforming people
[00:30:08] that he met. As his parents, we are
[00:30:11] empty without him. Our beautiful child
is gone, and it is too late for him. But it is not too late to protect
[00:30:18] millions of children in this country.
[00:30:21] I'm hopeful that this bill and others
[00:30:23] like it will ensure a safe future for
[00:30:26] our children. Thank you.
[00:30:32] >> [applause]
[00:30:38] [clears throat]
>> I want to thank all of you for being
[00:30:44] here today. Words cannot capture the
[00:30:49] gratitude but also admiration that I
[00:30:52] feel for the parents who are here. Your
[00:30:56] courage and strength is just so
[00:30:59] inspiring and overwhelming. I want to
thank Senator Hawley for his leadership.
[00:31:05] We have a partnership that I think has
[00:31:07] been extraordinarily important. We have
[00:31:09] a framework for
[00:31:13] establishing some oversight and scrutiny
[00:31:16] on AI, a partnership that has produced
[00:31:20] this legislation as one piece of what we
[00:31:23] think is necessary to prevent some of
[00:31:25] the excesses and abuses that flow from
AI: not to stop new technology, but
[00:31:31] simply to make sure that the values and
[00:31:38] interests of
[00:31:40] Americans and people all over the world
[00:31:42] are preserved as this new technology
[00:31:44] advances. And we've held hearings which
[00:31:46] have given parents and advocates and the
[00:31:51] good guys in this industry an
opportunity to step forward. And I'm
[00:31:54] very proud to
introduce the GUARD Act with him today.
[00:32:00] I'm here like all of us as a parent of
[00:32:05] four children, but also as a grandparent
[00:32:09] of a
[00:32:12] three-month-old granddaughter. And I see
[00:32:14] the world now through her eyes,
[00:32:18] literally
[00:32:20] a hundred years into the future.
[00:32:22] And
[00:32:24] I'm fearful
[00:32:26] of what
[00:32:29] is ahead for us and more so when I
[00:32:32] listen to these stories because let me
[00:32:35] say to the American people,
[00:32:38] you may think this is somebody else's
[00:32:40] problem, but you are not immune.
[00:32:43] This can happen to anyone in America or
[00:32:47] around the world. These are good parents
[00:32:50] who cared about their children, did
[00:32:52] everything right,
[00:32:56] and their children were victims
[00:33:00] of big tech and its experiment
[00:33:04] to decide how much more money they could
[00:33:07] make.
[00:33:09] Big tech is using our children as guinea
[00:33:13] pigs
in a high-tech, high-stakes experiment
[00:33:21] to make their industry more profitable.
[00:33:24] They are putting profit over safety.
[00:33:28] I am heartbroken to listen to these
[00:33:30] stories, but I am also angry
[00:33:35] because we could have achieved by now
[00:33:39] some of the safeguards in this bill if
[00:33:42] big tech were not opposing it with
[00:33:46] armies of lawyers and lobbyists,
[00:33:49] millions and millions of dollars. And I
[00:33:52] am most angry about the hypocrisy
[00:33:55] because they've come before our
[00:33:57] committees, judiciary and commerce, and
[00:34:00] they've said, "Trust us. Trust us. We
[00:34:05] want to do the right thing. We're going
to take care of it." The time for "trust us" is over.
[00:34:13] It is done.
[00:34:15] I have had it. And I think every member
[00:34:18] of the United States Senate and Congress
[00:34:20] should feel the same way.
[00:34:24] This bill establishes tough criminal
[00:34:28] penalties
for these companionship bots, like Character AI, that encourage emotional bonds and then prey on children by encouraging violence, self-harm, and suicide, and by offering sexually explicit material.
[00:34:50] There ought to be criminal penalties,
[00:34:53] federal criminal prosecutions.
[00:34:58] That's the only thing that big tech will
[00:35:01] understand at this point.
And the guardrails and safeguards in this bill would be enforced through criminal penalties. For example,
reminding users that they are talking to a chatbot, not a real person. Sounds
[00:35:19] pretty simple and basic.
[00:35:23] We shouldn't have to require it by law
[00:35:25] if big tech were willing to be
[00:35:27] responsible and accountable. But they
are not. And what you've heard here about forcing arbitration: this is another example of big tech refusing to be accountable.
[00:35:42] AI companies would be barred from
[00:35:44] providing AI companions to miners.
[00:35:48] Again, pretty basic,
[00:35:52] simple.
[00:35:53] Big tech won't do it on its own. The
time for "trust us" is over, and we need
[00:36:00] these kinds of measures because, and
[00:36:04] I'll just finish on this point,
[00:36:07] big tech knows what it's doing. It
[00:36:11] doesn't need this press conference
[00:36:14] to see the harm that it's causing and
[00:36:17] the care that it should be providing.
[00:36:22] Harm versus care. They've chosen harm
[00:36:26] over care, profits over safety. And
[00:36:28] that's the reason that we will fight for
[00:36:31] this bill. I'm proud to join Senator
Hawley as his
[00:36:35] co-lead and to have the support of our
[00:36:39] wonderful colleagues here who also have
[00:36:41] a record of caring about this issue and
I hope we'll be joined by others as
[00:36:46] well. Thank you.
[00:36:50] [applause]
>> I want to start by thanking the
[00:36:56] parents.
[00:36:58] Thank you so much for elevating your
[00:37:00] voice. Thank you for being willing to
[00:37:03] tell your story and thank you for being
[00:37:05] willing to tell us about your most
[00:37:07] precious gift, your child.
To all the parents out there: we hear you. And I think
[00:37:17] if you look at this group, Senator
[00:37:20] Blumenthal said it, but we come together
[00:37:23] obviously from different sides of the
[00:37:27] aisle, different political spectrums,
[00:37:31] all in front of you because we are
[00:37:34] stepping up not as Democrats or
[00:37:35] Republicans. were stepping up as
[00:37:37] concerned parents, concerned
[00:37:40] grandparents, and I am really grateful
[00:37:42] to the two of you for leading this
effort. Senator Hawley, in the subcommittee hearing that you put forth, the stuff I heard was sick and outrageous, and giving America an opportunity to hear that, so that they can protect against it in their own homes, is, I think, so critically important. And Senator Blumenthal, thank
[00:38:04] you for your continued leadership on
[00:38:05] these issues. Senator Murphy and I work
on these issues together in lockstep. And that's because we don't have
[00:38:12] to ask people what it's like to raise
[00:38:14] kids right now. We're living it. And let
[00:38:16] me tell you, being a parent is hard. I
[00:38:20] have a 15-year-old and a 16-year-old,
and I ask myself every day, am I doing what I need to for my kids? And
[00:38:27] take out social media. There are so many
[00:38:29] challenges to how to parent in this day
[00:38:32] and age and what's right and what's
[00:38:34] wrong and when to push and when to pull
[00:38:35] and what conversations to have and when
[00:38:37] to have them. You layer on top of that
[00:38:40] social media in general. And look, the
[00:38:43] data speaks for itself. CDC said one in
[00:38:47] three high school young women
[00:38:49] actually considered death by suicide.
[00:38:52] And then 25% of those high school young
[00:38:55] women actually made a plan to take her
[00:38:58] own life.
[00:38:59] If you look at the number of high school
[00:39:01] young women that attempted death by
suicide last year, it's 13%.
[00:39:06] You add in young men, it is 9% of our
[00:39:09] high school population attempted death
[00:39:12] by suicide. Like when are we going to
[00:39:14] wake up? When is this building going to
[00:39:16] do the right thing?
[00:39:18] This is simple. These guardrails are
[00:39:20] necessary. You heard it from the
[00:39:22] parents. If AI can be this brilliant, we
[00:39:27] certainly can put the proper guard rails
[00:39:28] in place to where they are not talking
[00:39:30] to our children about sexual interplay,
[00:39:33] where they are not talking to our
children about illicit drug abuse, where
[00:39:36] they are not talking to our children
about self-harm.
[00:39:40] This is not hard. And if the United
[00:39:43] States Senate and the US House of
[00:39:45] Representatives can't come together on
[00:39:46] this, what can we come together on?
[00:39:49] Look, we have got to speak directly to
[00:39:52] big tech and say, "Stop putting profits
[00:39:55] ahead of people." And in this situation,
[00:39:58] these people are children.
[00:40:01] They're children. They need us to
[00:40:03] elevate our voice. They need us to
[00:40:05] elevate these stories so that we can
[00:40:07] protect the kids that are out there. We
[00:40:09] can give parents the tools. Parents who,
[00:40:13] like me, are just doing the best that we
[00:40:16] can. Parents like these that are so
[00:40:20] brave to tell their story, who want
[00:40:23] nothing but the best for their child,
[00:40:26] but also want to make sure that other
[00:40:27] parents don't endure this heartache. So
[00:40:30] to big tech, this should be pretty
[00:40:32] simple. You should come out today and
[00:40:35] every single thing in this GUARD Act,
[00:40:37] you should be able to say, "We'll do it
[00:40:39] today." My guess is they
[00:40:42] won't. Why? because they're looking at
[00:40:45] their bottom line and they're not
[00:40:48] looking at the people they hurt. They've
[00:40:50] never cared about it and certainly now
[00:40:53] is no different. So please join us in
[00:40:56] pushing this forward and telling these
[00:40:58] stories and protecting children coast to
[00:41:02] coast. Thank you.
[00:41:04] [applause]
[00:41:10] >> Let me first add my thanks to the
[00:41:13] parents and the families that are here.
[00:41:14] I can't imagine the kind of courage that
[00:41:17] it takes to be living through an
[00:41:20] unspeakable kind of grief and then also
[00:41:24] be able to speak truth to power to be so
[00:41:27] generous with yourself and with your
[00:41:30] grief as to decide that you are going to
[00:41:31] try to make sure that this doesn't
[00:41:33] happen to other families and to other
[00:41:35] kids. Thank you, Senator Hawley. Thanks
[00:41:37] for speaking so plainly about what
[00:41:39] the stakes are here and for being
[00:41:41] willing to take a lead with Senator
[00:41:43] Blumenthal and thank you for building
[00:41:45] what I think will be a powerful team as
[00:41:47] we make this case to our colleagues as
[00:41:49] parents. I'm raising a 17- and a
[00:41:53] 14-year-old right now, and I know
[00:41:56] that this is a generation that every
[00:41:58] single day is fighting for their lives.
[00:42:02] They are being preyed upon every single
[00:42:05] day by greedy, rapacious billionaires
[00:42:10] for whom there is not any amount of
[00:42:13] money that is enough. These companies
[00:42:16] that run these chatbots, they're already
[00:42:19] rich. Their investors and their CEOs
[00:42:22] already have multiple houses, but they
[00:42:24] want more. They want more and they are
[00:42:27] willing to hurt our kids in the process.
[00:42:30] This isn't a coming crisis. It's a
[00:42:33] crisis that exists right now. A a
[00:42:36] chilling study from just a few months
[00:42:39] ago showed that today two-thirds of kids,
[00:42:43] teenagers in this country are regularly
[00:42:45] using these chat bots. One-third of
[00:42:48] children today report having deep
[00:42:51] intimate relationships with these chat
[00:42:54] bots. These stories that you're hearing
[00:42:58] of children taking their life or
[00:42:59] engaging in life-altering self-harm,
[00:43:03] this is not the exception. This is
[00:43:05] becoming a norm. And if we don't do
[00:43:08] something, it is going to get worse. I
[00:43:12] had one of the big AI CEOs in my office
[00:43:16] just a few weeks ago crowing to me about
[00:43:19] how much more addictive the chat bots
[00:43:22] were going to be. He said to me that
[00:43:24] within a few months, after just a few
[00:43:27] interactions with one of these chatbots,
[00:43:29] it will know your child better than
[00:43:31] their best friend. But he was excited to
[00:43:34] tell me that. Shows you how divorced
[00:43:37] from reality these companies are. Mark
[00:43:40] Zuckerberg
[00:43:41] says, "Well, it's too bad that kids
[00:43:43] don't have as many friends today as they
[00:43:46] used to, but we'll solve for that by
[00:43:48] putting machines
[00:43:51] as replacements for humans."
[00:43:54] These companies make more money the more
[00:43:58] predatory the technology becomes.
[00:44:03] These parents have been so generous to
[00:44:06] share with the world some of these
[00:44:10] interactions. You can see for yourself
[00:44:13] how sick how sick this technology is.
[00:44:17] And I guess I'll end with this. I mean,
[00:44:20] what's the point of Congress of of of
[00:44:25] having us here if we're not going to
[00:44:27] protect
[00:44:29] children from poison?
[00:44:31] This is poison.
[00:44:34] The free market in this country is a
[00:44:36] wonder. It's part of the reason why this
[00:44:39] country has
[00:44:41] leapfrogged to greatness. But our role
[00:44:44] is to make sure that people don't make
[00:44:48] their fortune at the expense of the
[00:44:51] health, safety, and livelihood of the
[00:44:54] most vulnerable in this country. So, um,
[00:44:57] I'm hopeful that as four members with
[00:45:00] very diverse views on lots of other
[00:45:02] things, we'll be able to explain to our
[00:45:04] colleagues why we can't wait.
[00:45:08] And if we don't pass this until next year, a
[00:45:11] year from now, the damage will be untold.
[00:45:14] And again, grateful to all of the
[00:45:17] parents and the family members here
[00:45:19] willing to give us the evidence to bring
[00:45:22] to our colleagues of why this is urgent.
[00:45:24] Thank you, Senator Hawley.
[00:45:26] >> [applause]
[00:45:31] >> I want to let you hear from one other
[00:45:33] person and then we'd be happy to take
[00:45:34] your questions. Stefan Turkheimer is the
[00:45:36] vice president of public policy at RAINN,
[00:45:39] which is the Rape, Abuse & Incest
[00:45:40] National Network. Let me
[00:45:42] invite Stefan to come on up. Stefan,
[00:45:44] we're delighted that you're here and on
[00:45:46] behalf of of so many groups, so many
[00:45:48] groups of parents, so many groups of
[00:45:49] survivors, uh so many groups of of other
[00:45:51] interested folks who work in the policy
[00:45:53] space who've endorsed this bill. Stefan
[00:45:55] is here to speak on behalf of the bill
[00:45:56] and on behalf of parents. Thank you,
[00:45:57] Stefan.
[00:46:00] >> I'm Stefan Turkheimer. I'm the vice
[00:46:02] president of public policy for RAINN. We
[00:46:04] are the largest anti-sexual violence
[00:46:06] organization in the United States. We're
[00:46:08] best known for running the National
[00:46:09] Sexual Assault Hotline where so many
[00:46:12] have called after they've been abused or
[00:46:14] assaulted. A few years ago, on the National
[00:46:17] Sexual Assault Hotline, the majority of
[00:46:19] people reaching out were adults. Now,
[00:46:23] the vast majority of people reaching out
[00:46:24] to it are talking about child sexual
[00:46:27] abuse.
[00:46:28] Child sexual abuse is marked by a
[00:46:30] violation of trust. Kids trust by
[00:46:34] default. In order to have shelter, food,
[00:46:36] love, affection, they rely on trust. That's
[00:46:39] how they learn.
[00:46:42] Chat bots
[00:46:44] optimize for creating that trust. They
[00:46:46] have access that no one else does.
[00:46:47] They're there at 3:00 a.m. They're in
[00:46:49] your pocket. They're in your home.
[00:46:52] Children are naturally trusting. But when
[00:46:54] that trust is abused, when those
[00:46:56] boundaries are erased through
[00:46:58] programming, through algorithms, through
[00:47:00] intent by these companies,
[00:47:03] those instrumentalities become
[00:47:05] dependence
[00:47:07] and that dependence can be used to hurt
[00:47:10] children.
[00:47:13] These AI companies, like Senator
[00:47:15] Blumenthal said, are running these
[00:47:16] chatbots like it's a clinical trial,
[00:47:20] but there's no warning. There's no
[00:47:21] knowledge on the part of the
[00:47:23] children of the consequences.
[00:47:24] There's no willingness to disclose the
[00:47:25] real risks. In fact, OpenAI only
[00:47:27] disclosed today that they're having one
[00:47:30] million conversations about suicide per
[00:47:31] week on ChatGPT. That's just one
[00:47:33] chatbot. One million.
[00:47:36] The trust children place in these
[00:47:37] chatbots is a commodity that they have
[00:47:39] monetized and the harm that results is
[00:47:41] listed nowhere on these companies'
[00:47:43] spreadsheets.
[00:47:45] We shouldn't need a federal law to
[00:47:48] prevent a computer from having a sexual
[00:47:49] relationship with a child. We shouldn't
[00:47:51] need a federal law to prevent a computer
[00:47:53] from grooming a child into this abuse.
[00:47:55] We shouldn't be incentivizing
[00:47:57] corporations to lower children's
[00:48:00] boundaries when they're supposed to be
[00:48:02] building them up and learning.
[00:48:04] But we are here. I'm so grateful for
[00:48:06] these parents for being here and for
[00:48:08] Senator Blumenthal, Senator Hawley,
[00:48:10] Senator Murphy, and Senator Britt who
[00:48:12] are absolute warriors for children's
[00:48:14] online safety. Thank you all for being
[00:48:16] here. Senator Hawley, thank you.
[00:48:23] [applause]
[00:48:25] >> With that, I think we're happy to take
[00:48:26] your questions. Yes,
[00:48:28] >> I've covered this issue and the lack of
[00:48:32] action, as the Senator alluded to, to
[00:48:35] protect kids online: AI, real people,
[00:48:38] cyberbullying. Why has Congress not come
[00:48:41] together on this issue? What is it going
[00:48:43] to take to actually act?
[00:48:44] >> No, it hasn't acted. Congress
[00:48:46] hasn't acted on this issue because of
[00:48:47] money. It's because of the power of the
[00:48:49] tech companies. I mean, let's just be
[00:48:51] honest. These companies are the
[00:48:53] wealthiest companies in the world.
[00:48:54] They're the wealthiest companies in the
[00:48:56] history of the world and they're the
[00:48:57] most powerful companies on this hill.
[00:48:59] They spend like nobody's business. And
[00:49:01] I've said before, joking, but really it's
[00:49:03] more truth than joke. There ought to be a sign
[00:49:05] outside of the Senate chamber that says
[00:49:07] bought and paid for by big tech because
[00:49:09] the truth is almost nothing that they
[00:49:11] object to crosses that Senate floor. We
[00:49:14] want to change that.
[00:49:15] >> Will you call out those lawmakers, those senators, who put
[00:49:18] profits over legislation? >> Well, we'll start
[00:49:20] today by encouraging every single one of
[00:49:22] our colleagues to join us on this bill.
[00:49:25] We have a bipartisan coalition here that
[00:49:27] I think speaks for itself. But most
[00:49:29] importantly, you have the testimony of
[00:49:31] these parents and of these families who
[00:49:33] have spoken truth that cannot be
[00:49:35] ignored. So, we would urge all of our
[00:49:37] colleagues, join us in this effort. If
[00:49:40] we can't stand up to big tech here, we
[00:49:42] won't be able to stand up to big tech on
[00:49:44] any issue. If we can't stand up and
[00:49:45] protect minor children, what can we
[00:49:48] possibly do and accomplish? This is the
[00:49:50] time to act. And so today, we'd invite
[00:49:53] all of our colleagues, join us on this
[00:49:54] bill. And I would just say to the big
[00:49:56] tech companies, endorse this bill.
[00:49:58] They've testified. Senator Blumenthal
[00:50:00] mentioned it. They've come and
[00:50:01] testified. They do it all the time: "Oh,
[00:50:02] we are actually very safe." OpenAI
[00:50:05] just made an announcement that they have
[00:50:07] perfected their safety guard rails.
[00:50:10] They said, "We've gone through the
[00:50:11] testing and our safety guard rails are
[00:50:13] excellent." Really? Those so-called
[00:50:15] safety guard rails that they have been
[00:50:17] testing resulted in the death of one
[00:50:18] child whose parents are here today.
[00:50:20] They've resulted in sexual conversations
[00:50:22] and sexual grooming of millions of
[00:50:24] children. So, I would just say to those
[00:50:25] companies, endorse this bill. Put your
[00:50:28] money where your mouth is. Endorse the
[00:50:30] bill and push this legislation over the
[00:50:32] finish line. That's how we'll know
[00:50:33] they're serious.
[00:50:33] >> And not only endorse the bill in
[00:50:37] word, but also support it in action.
[00:50:41] Because all too often what we've seen is
[00:50:45] big tech will say, "Oh, well, we're in
[00:50:47] favor of regulation, just not that
[00:50:51] regulation."
[00:50:53] And we want them to act in supporting
[00:50:56] this bill, not just when they offer some
[00:51:00] bromide
[00:51:01] platitudes, but in real action
[00:51:04] supporting this bill. And yes, we'll
[00:51:06] call them out. Name and shame
[00:51:09] because what you've heard here is a
[00:51:11] sense of urgency.
[00:51:13] Time is not on our side. Lives are being
[00:51:16] lost. Real lives in real time. What
[00:51:19] you've heard here is just the tip of the
[00:51:22] iceberg in the harm that is being done.
[00:51:25] So I think with this kind of bipartisan
[00:51:28] coalition, we have the makings of a
[00:51:30] force that will break the dam. As you
[00:51:33] know, legislation sometimes takes a
[00:51:36] couple of Congresses. I've been working
[00:51:37] on the Kids Online Safety Act for
[00:51:41] some years now with Senator Blackburn.
[00:51:43] We're going to continue working on that
[00:51:45] bill, but we can get this over the
[00:51:47] finish line.
[00:51:51] >> What else can we tell you? Yes, ma'am.
[00:51:53] >> Are you in touch with your House colleagues?
[00:51:58] Because the Senate has taken action on
[00:52:00] kids online, but House leadership has stopped
[00:52:03] those efforts.
[00:52:04] >> Yeah, I am, and I hope that there will
[00:52:05] be, and I hope the House will act. I
[00:52:07] would just say again, I think it's
[00:52:08] impossible to ignore the statistics, but
[00:52:11] maybe even more significantly, it's
[00:52:12] impossible to ignore the stories that
[00:52:14] you've heard today, the stories that
[00:52:16] we've heard now in public testimony
[00:52:18] across multiple committees. I don't know
[00:52:20] as a lawmaker how you can look at these
[00:52:22] parents who are here. By the way, we
[00:52:23] have behind us here and in this room
[00:52:25] parents who have not spoken publicly
[00:52:28] today and who have not spoken in
[00:52:29] public before ever, but they're here.
[00:52:32] They're holding the pictures of of their
[00:52:33] children and loved ones. There are
[00:52:35] parents like this all over the nation.
[00:52:38] And all over the nation, these parents
[00:52:40] are saying, "Please help us." And as the
[00:52:42] father of a 12-year-old, a 10-year-old,
[00:52:44] and a four-year-old, I'm right there
[00:52:46] with them. We need the help. The Senate
[00:52:49] needs to act. The House needs to act.
[00:52:51] And we call on both bodies, Congress as
[00:52:53] a whole, to act on this immediately.
[00:52:55] Senator Blumenthal is right. We don't
[00:52:56] have any time to spare. We don't have
[00:52:58] any time to spare. We need action
[00:53:00] quickly. Yes, sir.
[00:53:01] >> Senator, you mentioned the
[00:53:03] AI CEOs. I'm wondering, have you and
[00:53:05] Senator Blumenthal spoken with the CEOs? If so, how
[00:53:08] are those conversations going? I know
[00:53:10] you talked about the money. Has that
[00:53:11] come up at all?
[00:53:12] >> These CEOs don't speak
[00:53:15] to me anymore. You'll be shocked to
[00:53:16] learn. We haven't had a conversation in
[00:53:19] years, but Senator Blumenthal is a much
[00:53:21] more diplomatic person than I am. So, he
[00:53:24] may be able to shed light on
[00:53:26] that, but I would just say this.
[00:53:28] Certainly, they're aware of this
[00:53:29] legislation. And you know, I don't
[00:53:32] want their
[00:53:34] well-wishes. I want their action and
[00:53:36] they can take action by endorsing this
[00:53:38] bill and helping us pass it. And Senator
[00:53:40] Britt was right when she said earlier,
[00:53:42] they could make all these changes today,
[00:53:43] but the truth is I don't trust them to
[00:53:46] do any of it. We want to see this
[00:53:47] codified. We want this in law. We want
[00:53:50] these parents to have these commitments.
[00:53:53] And if these tech companies are remotely
[00:53:55] serious about protecting children, then
[00:53:58] they'll quit coming up to our committees
[00:54:00] and saying we're doing our best and
[00:54:01] they'll start saying we support your
[00:54:02] legislation.
[00:54:03] >> You know, uh both of us were attorneys
[00:54:07] general. Uh, I was a federal prosecutor
[00:54:11] as well and my policy was I'll talk to
[00:54:14] anyone
[00:54:16] even after you're indicted. You want to
[00:54:18] come talk to me with your lawyer? Sure,
[00:54:21] I'll hear what your defense is,
[00:54:25] but I'll see you in court. And in this
[00:54:29] instance, I'll see you in Congress.
[00:54:31] We need action. We need a law. We're not
[00:54:34] going to be talked out of it.
[00:56:37] >> Will you bring companies back to the Hill
[00:56:39] to testify about these standards?
[00:54:41] >> Absolutely. I would love to do that. And
[00:54:43] in fact, in our most recent hearing
[00:54:45] where some of these parents who are here
[00:54:46] today appeared and testified, I
[00:54:48] invited AI companies to be present. I've
[00:54:51] I've personally written to Mark
[00:54:52] Zuckerberg and asked him to come and
[00:54:54] testify. Uh we've invited the heads of
[00:54:56] other AI companies and I issued that
[00:54:58] invitation here today to every head of
[00:55:00] the AI companies, whether it's OpenAI.
[00:55:03] I noticed, by the way, OpenAI
[00:55:04] just announced today that they are
[00:55:05] formally and officially reorganizing as
[00:55:07] a for-profit corporation. What a shock,
[00:55:10] you know, what a shock. Of course,
[00:55:13] profits above all else. But I would just
[00:55:15] say if you're so confident in what
[00:55:16] you're doing for the American people,
[00:55:18] come testify. We we would love to have
[00:55:19] you. We will give you the dais. You can
[00:55:22] testify, but expect some tough
[00:55:25] questions. Expect an answer for what you
[00:55:27] have done to these parents who are
[00:55:29] standing here today and to the American
[00:55:30] people. But absolutely, the invitation
[00:55:32] is open. We've extended it many times
[00:55:34] and I think the country deserves to hear
[00:55:36] from them. They deserve to hear from Sam
[00:55:37] Altman and Mark Zuckerberg and the whole
[00:55:39] host of them, and I would be delighted
[00:55:41] to convene that hearing tomorrow.
[00:55:44] >> Yes ma'am.
[00:55:59] You know, I think listen, I'm all for as
[00:56:01] many enforcement actions and
[00:56:03] enforcement clauses and potentials
[00:56:05] as possible. The thing about this bill,
[00:56:07] why I think it is so hard to object
[00:56:08] to, is that it is so incredibly simple
[00:56:11] and really it's incredibly modest, which
[00:56:13] is what your question is getting at. I
[00:56:14] mean, if you can't agree to this bill,
[00:56:17] I mean, this is our point to the AI
[00:56:19] companies. If you can't agree to this,
[00:56:20] this just says you won't target children
[00:56:23] with AI companions.
[00:56:26] You cannot market it to children. And
[00:56:29] you have to disclose the fact that the
[00:56:30] AI is not human and not licensed. That's
[00:56:33] very simple. That's the virtue of this
[00:56:36] bill. It's simple. It's clean. It's
[00:56:38] easy. And so I would just again
[00:56:39] challenge these companies, endorse this
[00:56:41] bill, and support it. But in terms of
[00:56:43] exploring other enforcement mechanisms,
[00:56:45] I think we need them. And I think
[00:56:46] there's more to do.
[00:56:47] >> And I agree totally. And we have
[00:56:50] a vote that is about to close, so I'm
[00:56:52] going to run. Thank you all for being
[00:56:55] here today. Thanks. See you later.
[00:57:00] [applause]
[00:57:02] >> Social media companies have said that
[00:57:04] disclosure requirements are against
[00:57:07] their First Amendment rights. And so
[00:57:10] how does your bill address that? They
[00:57:12] say that it's against the users' First
[00:57:14] Amendment rights
[00:57:16] >> Against their own companies?
[00:57:18] >> To have to disclose, sorry, to have to do
[00:57:20] age verifications. What you're talking
[00:57:22] about
[00:57:23] >> Various bills at the state level, they
[00:57:25] have used these kinds of ways to say that
[00:57:28] we have our own First Amendment
[00:57:30] rights.
[00:57:30] >> That's absurd. Yeah, I've seen their
[00:57:32] arguments, and their arguments are
[00:57:33] frankly hilarious. They've argued
[00:57:35] variously that they're not editors and
[00:57:36] publishers and therefore they can't be
[00:57:38] compelled uh to the liability to be
[00:57:40] subject to the liability of editors and
[00:57:41] publishers. And then they've turned
[00:57:42] around and argued to your point even in
[00:57:44] the Supreme Court they've turned around
[00:57:45] and said actually we are editors and
[00:57:47] publishers come to think of it and
[00:57:49] therefore we have total and blanket
[00:57:51] First Amendment protection. All I can
[00:57:53] say is is that these companies surely do
[00:57:55] not have more constitutional rights than
[00:57:57] the human beings who are behind me.
[00:57:59] Surely they do not have the
[00:58:00] constitutional right to exploit
[00:58:03] children. Surely they do not have the
[00:58:05] constitutional right to lie to their
[00:58:07] users. All this bill would do is say
[00:58:09] that listen, you cannot market or
[00:58:11] provide your AI companion chat bots to
[00:58:13] children who are 17 years of age or
[00:58:15] under. And it includes a verification
[00:58:16] procedure that will protect the privacy
[00:58:18] of the user that includes requirements
[00:58:21] for encryption and other industry
[00:58:22] standard protections for the user. So
[00:58:24] the user is not disclosing his or her
[00:58:27] information out there in public. But it
[00:58:29] does put the onus on these companies to
[00:58:30] verify the age and then it subjects them
[00:58:33] to penalties,
[00:58:35] including prosecution, if they lie and
[00:58:38] don't do it. And I think that that is
[00:58:40] not only appropriate but absolutely
[00:58:42] needed. I just think that's basic
[00:58:43] fairness. That's how everybody else in
[00:58:45] America is treated. And I'll just end
[00:58:47] with this. You know, as Stephan, I
[00:58:50] think, pointed out, if these companies
[00:58:52] or or one of our parents, if these
[00:58:54] companies were actual human beings
[00:58:57] conducting this kind of activity,
[00:58:59] grooming children in this sort of way,
[00:59:01] they would be prosecuted. I say this as
[00:59:02] a former prosecutor. I would prosecute
[00:59:04] them. It's illegal if they're human
[00:59:06] beings, but they hide behind the fact,
[00:59:08] well, it's just a computer. We didn't
[00:59:10] really know about it. We can't do
[00:59:11] anything about it. Oh, ridiculous. They
[00:59:13] can stop it if they want to, and it's
[00:59:14] time that they did.
[00:59:16] >> What else can I tell you? Yes, ma'am.
[00:59:19] Is this something President Trump would support?
[00:59:22] >> You know, I think that I don't want to
[00:59:23] speak for the president, but I'll just
[00:59:24] say this. The president has been a huge
[00:59:27] champion of safety for kids, of
[00:59:30] protection for kids, and also taking on
[00:59:32] AI companies or tech companies, I should
[00:59:34] say. I mean, in his first
[00:59:35] administration, they launched the
[00:59:36] biggest antitrust suits, which are still
[00:59:38] ongoing, against many of these same
[00:59:40] entities that now own the AI companies.
[00:59:41] So, I want to be careful
[00:59:42] not to speak for him, but I'm very
[00:59:45] optimistic about this and I'm optimistic
[00:59:47] about getting it passed. Let's get it
[00:59:48] passed first, but I'm optimistic about
[00:59:50] getting this passed through the Senate
[00:59:51] and and then I hope having the president
[00:59:53] support. Yes, sir.
[00:59:55] >> Earlier this year, we saw efforts in the
[00:59:56] Senate to implement an AI,
[01:00:00] >> right?
[01:00:04] >> a 10-year ban on different types of
[01:00:07] laws that would affect AI. So, what
[01:00:10] would you say to your colleagues that
[01:00:11] are leading those efforts?
[01:00:12] >> Bad idea.
[01:00:14] Bad idea. I mean, listen, I say this
[01:00:18] as a Republican who believes in
[01:00:19] federalism. I think it's a strange
[01:00:21] argument for some Republicans to make in
[01:00:24] this building that all of a sudden we
[01:00:25] should say to the states, "No, actually,
[01:00:26] you shouldn't do anything. You shouldn't
[01:00:28] protect kids. You shouldn't stop deep
[01:00:30] fake porn. You shouldn't uh protect kids
[01:00:33] from uh having their name, image, and
[01:00:35] likeness stolen and used on the
[01:00:36] internet." I mean, that's what these
[01:00:37] state laws do. We have them in my state,
[01:00:39] the state of Missouri. I don't want to
[01:00:40] see our laws overturned. Uh there's
[01:00:42] there's excellent laws like this all
[01:00:43] over the country. I I want to see those
[01:00:45] laws, particularly those laws that do
[01:00:47] the things that I just named. I want to
[01:00:49] see those remain in place. Do some state
[01:00:51] laws go too far? Probably. But I
[01:00:54] mean, listen, that's federalism.
[01:00:56] So, I'm very skeptical of the
[01:00:58] efforts by some to prevent states from
[01:01:00] doing anything on this. The federal
[01:01:02] government, of course, so far has done
[01:01:03] zilch. So, we need states like Missouri
[01:01:06] and Arkansas and Texas and other states
[01:01:08] that have stepped up here
[01:01:09] to protect kids and to stop predators. We
[01:01:13] want those efforts to succeed or at
[01:01:15] least I do. So, I would be very wary of
[01:01:17] anything that would stop those kinds of
[01:01:19] protections. Yes.
[01:01:21] >> Perhaps a bit more modest than the
[01:01:22] president: have you talked to Judiciary
[01:01:24] Committee leadership, Chairman Grassley or ranking
[01:01:26] member Durbin, about a markup?
[01:01:28] >> Yeah, absolutely. And listen, again, I
[01:01:30] don't want to speak for them, but I
[01:01:31] absolutely have visited with both of
[01:01:33] them about it. Of course, Senator Durbin
[01:01:34] has been a champion on the AI
[01:01:36] and big tech front in general. And
[01:01:39] Senator Grassley has been terrific to
[01:01:40] mark up bills. I mean, uh, we've already
[01:01:42] passed, uh, bills protecting kids online
[01:01:45] out of the committee this year alone
[01:01:46] because Senator Grassley marked it up,
[01:01:48] including my bill with Senator Durbin,
[01:01:49] which passed unanimously. So, I'm
[01:01:51] very hopeful that we'll get a markup
[01:01:52] here. I think more on that to come.
[01:01:55] Anything else we can tell you? I just
[01:01:57] want to end with this something that
[01:01:58] Megan said that really struck me.
[01:02:00] She talked about her beautiful son Sewell
[01:02:02] and how she always knew that he
[01:02:04] would change the world. Well, he might
[01:02:05] yet.
[01:02:07] Every child up here who has been
[01:02:10] subjected
[01:02:11] to the kind of abuse that you've heard
[01:02:14] about today and whose lives have been
[01:02:16] destroyed,
[01:02:17] they yet, I think, will change the world.
[01:02:19] Every life here, every parent who stands
[01:02:21] here represents a life that is
[01:02:22] absolutely precious and irreplaceable.
[01:02:26] And the value and power of those lives
[01:02:29] continue to echo and I believe they will
[01:02:31] echo into the adoption of this
[01:02:32] legislation. If this legislation gets
[01:02:35] adopted, it will be because of them. It
[01:02:37] will be because of who you see behind
[01:02:38] me. With all due respect to my
[01:02:40] colleagues, it's not who's standing
[01:02:42] here; it's who's standing behind me. And I
[01:02:44] think that what they have said today,
[01:02:46] the truth that they have told today is
[01:02:49] absolutely unanswerable. It demands
[01:02:51] action. And we are committed to getting
[01:02:53] that action. Thank you so much for being
[01:02:54] here.
[01:02:56] >> [applause]