[00:00:00] We had a situation where a mom called me
[00:00:02] and [music] there were girls in her
[00:00:04] class at school, an all girls school,
[00:00:05] that had sugar daddies. I was like, "How
[00:00:07] are they finding the girls?" And it was
[00:00:10] Venmo. I'm like, Venmo?
[00:00:11] >> Venmo? That's the app I used to pay my
[00:00:14] babysitter. That's not something on
[00:00:16] anyone's radar as where the traffickers
[00:00:18] go to find kids. They could find out who
[00:00:20] a 13-year-old girl was just [music] by
[00:00:21] her patterns on Venmo, right? Like money
[00:00:24] at Sephora, girls' night at the movies.
[00:00:26] You know, it's just, uh, for a predator,
[00:00:28] they're smarter than uh [music] parents
[00:00:30] can keep up with, right? So, that was a
[00:00:32] new one. Um Venmo,
[00:00:35] >> that's a Wow.
[00:00:36] >> And so, [music] then they would send
[00:00:37] these girls like $100 for a headshot,
[00:00:40] "you could be a model."
[00:00:41] >> All right, Elizabeth, getting ready for
[00:00:43] the big interview, but we got these
[00:00:45] things we do every once in a while. It's
[00:00:46] a new thing actually for this year. Uh
[00:00:48] we're calling them hot questions.
[00:00:50] >> All right.
[00:00:51] >> And um so it's going to be a really
[00:00:52] heavy interview about
[00:01:00] That was weird.
[00:01:01] >> Mhm. [laughter]
[00:01:03] >> What is that thing?
[00:01:03] >> It's going to be I think my phone just
[00:01:05] fell off.
[00:01:06] >> Oh.
[00:01:06] >> Uh going to be a really heavy interview
[00:01:08] about your brother, his suicide, what
[00:01:12] happened at Camp Kanakuk. And, um, so
[00:01:16] that's kind of the topic of
[00:01:18] the interview: sexual abuse. And um
[00:01:22] and so the hot question we have here is
[00:01:25] we've uncovered a lot about how children
[00:01:27] are being targeted online, especially
[00:01:29] through video games and platforms that
[00:01:31] parents often see as
[00:01:34] harmless. When Ryan Montgomery came on
[00:01:37] the show, he broke down how predators
[00:01:39] are actively using games like Roblox to
[00:01:42] groom and exploit children and how these
[00:01:45] platforms have struggled to stop it.
[00:01:47] What are the most popular games
[00:01:48] that the 764 cult is luring their
[00:01:53] victims out of?
[00:01:53] >> Roblox, Minecraft, Instagram, TikTok,
[00:01:56] Snapchat.
[00:01:57] >> Okay. Tell me about Roblox and
[00:02:00] Minecraft. Cuz I don't know anything
[00:02:02] about gaming. I don't game. I don't
[00:02:04] >> Understood. I don't either, but it's
[00:02:06] part of the investigation. A lot
[00:02:08] of times when I'm investigating, I
[00:02:10] end up on these games or apps, and Roblox
[00:02:13] is the one that I focus on the most
[00:02:15] because that seems like the majority of
[00:02:17] the issues right now.
[00:02:18] >> I know little kids that use that game.
[00:02:20] >> There's 75 million active daily users.
[00:02:23] >> You're a parent yourself. What specific
[00:02:26] steps and safeguards do you put in place
[00:02:28] to protect your own children from online
[00:02:30] predation?
[00:02:32] [clears throat] and what should every
[00:02:34] parent be doing right now that many
[00:02:36] still aren't?
[00:02:37] >> It's a great question and especially
[00:02:40] because things keep evolving and so
[00:02:42] you've got to keep up, right? And um I'm
[00:02:45] really grateful for people like Ryan who
[00:02:47] are exposing this stuff. Uh my kids are
[00:02:50] 13, 11, and seven, almost eight. And so
[00:02:54] I just know too much, right, to
[00:02:57] let them have access to the worldwide
[00:03:01] web. It's crazy out there. And these
[00:03:03] tech companies aren't doing anything to
[00:03:05] protect kids. They build their
[00:03:06] businesses for profit, not for
[00:03:08] protection. So, um, my oldest being 13,
[00:03:12] a lot of his friends have cell phones.
[00:03:14] Even my 11-year-old's friends, and we
[00:03:16] just don't do that. I mean, we let them
[00:03:19] have, um,
[00:03:20] >> they don't have any phone,
[00:03:21] >> no phones for the kids. Um there's a
[00:03:26] kind of group consensus of "Wait Until 8th"
[00:03:28] and then get like a very basic phone
[00:03:30] that doesn't have apps on it just for
[00:03:32] communication.
[00:03:33] >> Um so we'll do something like that for
[00:03:36] our oldest when he's in eighth grade
[00:03:38] probably.
[00:03:39] >> Uh but they have iPads. They're super
[00:03:42] locked down. Uh we do not let them play
[00:03:45] Roblox. And uh here's an example of what
[00:03:49] people don't even realize that I want
[00:03:52] parents to know. When I was the board
[00:03:53] chair of an anti-trafficking
[00:03:54] organization during COVID, I would get a
[00:03:57] lot of tips or parents calling because
[00:03:59] they knew like I was doing some work
[00:04:00] around this and they also knew about my
[00:04:03] brother's story. And uh we had a
[00:04:06] situation where a mom called me and there
[00:04:08] were girls in her class at school, an
[00:04:10] all girls school that had sugar daddies.
[00:04:13] And these were, you know, 15-year-old
[00:04:15] girls and multiple of them had sugar
[00:04:19] daddies. And
[00:04:21] I was like, "How are they finding the
[00:04:23] girls?" And it was Venmo.
[00:04:25] >> I'm like, "Venmo?
[00:04:26] >> Venmo? That's the app I use to pay my
[00:04:29] babysitters or, you know, like reimburse
[00:04:31] someone for co-hosting a birthday dinner
[00:04:34] or something." That's not something on
[00:04:36] anyone's radar as where the traffickers
[00:04:39] go to find kids. And what was happening
[00:04:42] is they would find through the patterns
[00:04:45] because Venmo, if you don't have
[00:04:47] privacy settings on, defaults to
[00:04:49] public. And I don't know if that's
[00:04:51] changed, but in this case, they could
[00:04:54] find out who a 13-year-old girl was just
[00:04:55] by her patterns on Venmo, right? Like
[00:04:58] money at Sephora, girls' night at the
[00:05:00] movies. You know, it's just, uh, for a
[00:05:02] predator, they're smarter than uh
[00:05:05] parents can keep up with, right? So that
[00:05:07] was a new one.
[00:05:08] >> Venmo. That's a... Wow.
[00:05:11] >> And so then they would send these girls
[00:05:12] like $100 for a headshot, "you could
[00:05:15] be a model," you know, the whole like
[00:05:19] love bombing, right? And then it would
[00:05:21] escalate into "if you go lower, I'll pay
[00:05:23] you more money," and then lower, and then it
[00:05:25] became sextortion,
[00:05:27] and then there's so much shame they
[00:05:29] don't tell a parent. The reason this
[00:05:30] came to our attention was because one of
[00:05:32] the families was
[00:05:35] hosting a girl, like as a boarding
[00:05:38] student and she didn't understand what
[00:05:41] was happening and she just said, "Yeah,
[00:05:43] my friends have sugar daddies." And the
[00:05:45] mom that was hosting her was like,
[00:05:46] "What?" And that's how we uh ended up
[00:05:49] figuring out what was going on there and
[00:05:51] getting law enforcement involved and
[00:05:53] stuff.
[00:05:54] >> The school didn't really do much
[00:05:55] about it, but
[00:05:56] >> Wow.
[00:05:58] That's the world we live in now. It's
[00:06:00] like even these apps we think are so
[00:06:02] innocuous um and are [snorts] helpful
[00:06:05] right in day-to-day life. Uh they can be
[00:06:08] used under the right circumstances. I mean, during
[00:06:10] COVID, just as every industry had to pivot,
[00:06:12] so did the predators and so they went
[00:06:15] more online and they find
[00:06:17] vulnerabilities in all of these apps.
[00:06:19] >> Man,
[00:06:20] >> it's crazy.
[00:06:21] >> You know, it's crazy how many of them
[00:06:23] there are.
[00:06:24] >> It's wild. I mean, I don't know how many
[00:06:26] of these shows you've seen or if you
[00:06:28] even know who Ryan Montgomery is, but um
[00:06:31] the first time I interviewed him, I
[00:06:35] just didn't believe this. I was
[00:06:37] like, "Nah, it's not that bad." And so,
[00:06:40] I made him prove it to me. I don't
[00:06:42] know if you saw it, but I was like,
[00:06:43] "Whip your laptop out. Get in any chat
[00:06:45] room. I don't care what it is. I will
[00:06:47] sit here for two days."
[00:06:48] >> Yeah. And probably like within seconds,
[00:06:50] they were
[00:06:51] >> It was 5 seconds.
[00:06:52] >> 5 seconds. Yep. Exactly. I was like,
[00:06:54] "Whoa."
[00:06:55] >> But, you know, and then I look at
[00:06:57] parents. I mean, I have two little ones
[00:06:59] myself and um one of them's kind of just
[00:07:02] now starting to show interest and you
[00:07:06] know, he wants to take my iPad,
[00:07:09] stuff like that. He's starting to learn
[00:07:11] how to work a remote, things
[00:07:12] like that. And, you know, I see
[00:07:14] a lot of parents, and they don't even
[00:07:16] take advantage of the
[00:07:18] stuff that the platforms are offering, like
[00:07:20] YouTube Kids and stuff like that.
[00:07:23] And if you're not in this, if you
[00:07:26] don't understand this world, I can see
[00:07:27] how parents are overwhelmed by just the
[00:07:30] amount of steps you have to take to lock
[00:07:31] down a kid's device.
[00:07:33] >> We're overwhelmed. Well, we don't have
[00:07:34] any kids' devices yet. But
[00:07:36] >> yeah, already overwhelmed. I mean,
[00:07:37] here's another thing I'll say. Keep your
[00:07:39] kids off YouTube. Uh, I've had friends
[00:07:42] whose son was on the bus to
[00:07:45] school and some kid had an iPad and they
[00:07:49] were watching cartoons, and it just rolls
[00:07:50] to the next on YouTube. You know, it'll
[00:07:52] go from one video straight into the
[00:07:54] next. And uh it was animated porn.
[00:07:58] It was cartoon characters doing sexually
[00:08:02] explicit things. And this was on YouTube; they
[00:08:05] were watching something innocent and
[00:08:06] then it rolled to this video and uh that
[00:08:10] kid has had so many issues from that.
[00:08:13] >> Fortunately, he eventually told his
[00:08:14] parents and they got him some help and
[00:08:17] you know that's not your fault, but this
[00:08:18] is why we don't want you on YouTube. Um
[00:08:22] there are even some issues with YouTube
[00:08:23] kids.
[00:08:24] >> There are? What are those issues?
[00:08:26] >> Yeah. Uh so I actually I got in touch
[00:08:29] with um the team at Mr. Beast. We met at
[00:08:34] this thing and um I told them some exact
[00:08:36] examples and they were like we don't
[00:08:40] need to be on YouTube Kids. We're
[00:08:42] YouTube, right? But like YouTube Kids,
[00:08:44] we can call up there and say we're out
[00:08:46] of here if you're like promoting these
[00:08:48] materials on YouTube Kids. So, like the
[00:08:51] chief technology officer at the time for
[00:08:52] YouTube Kids like made a couple phone
[00:08:54] calls and got this stuff taken down, but
[00:08:56] it was the same stuff on YouTube Kids.
[00:08:58] >> Wow.
[00:08:59] >> And these channels had been reported
[00:09:01] multiple times, like not just a couple
[00:09:04] of times.
[00:09:05] >> Yeah, it's interesting.
[00:09:06] >> And until they got that call from
[00:09:07] someone that works for Mr. Beast, they
[00:09:08] didn't do anything about it.
[00:09:11] So, our kids, if they want to watch
[00:09:13] YouTube, it's with a parent in the room.
[00:09:16] And they know this. We're very strict
[00:09:18] about it. And um [snorts]
[00:09:20] it's very limited.
[00:09:21] >> What do they watch? What are they
[00:09:23] allowed to watch on their own?
[00:09:25] >> Anything?
[00:09:27] >> We love family movie night. We do family
[00:09:29] movie night on Friday nights and we pick
[00:09:31] out a movie together and they're
[00:09:32] watching with their parents. Now, I
[00:09:34] don't think my kids need to live in a
[00:09:35] bubble. I think parents need to be uh
[00:09:40] watching this stuff and processing it
[00:09:42] with their kids
[00:09:44] >> so that there can be conversations.
[00:09:46] >> Mhm. And then you have the conversation
[00:09:47] with them too around um if you ever see
[00:09:52] something that feels off to you. There
[00:09:55] are tricky people and there's
[00:09:58] tricky content like that animated
[00:10:00] cartoon porn, right? Um come to us and
[00:10:04] you don't need to deal with that alone
[00:10:06] and we'll figure it out. And then like
[00:10:07] that Mr. Beast situation, getting that taken
[00:10:09] down off YouTube Kids, they were like,
[00:10:11] "Oh, cool. We just made YouTube." And I
[00:10:13] was like, "Yeah, so now you're my
[00:10:14] YouTube Kids moderators. If you ever see
[00:10:16] anything like that, come to us. We'll
[00:10:18] get it taken down." It's not an excuse
[00:10:20] for them to watch more of it, but
[00:10:22] um I think helping kids feel like part
[00:10:24] of the solution is a great way to
[00:10:26] parent. And then having those
[00:10:26] [clears throat] conversations with them
[00:10:28] around what's appropriate, what's not
[00:10:30] appropriate. Um but the average age a
[00:10:33] child's exposed to pornography is 11
[00:10:35] these days,
[00:10:37] >> man.
[00:10:37] >> And they're not seeking it out. I mean,
[00:10:39] the vast majority of the time it finds them,
[00:10:42] or it's that situation on the bus I
[00:10:44] mentioned. And um they don't want to get
[00:10:46] in trouble cuz they don't want to lose
[00:10:47] their iPad so they don't go to mom and
[00:10:49] dad.
[00:10:50] >> Do you, I mean, do you
[00:10:51] get a lot of flak for not letting your
[00:10:52] kids have phones? I mean is this a daily
[00:10:54] battle?
[00:10:56] >> Is it an every other day battle? Yeah.
[00:10:59] >> Every other [laughter] day battle. Yeah.
[00:11:01] >> Especially now that like my boys are in
[00:11:03] middle school. So like other kids have
[00:11:05] them.
[00:11:06] >> Yeah.
[00:11:06] >> And I'm like here's why we don't and
[00:11:08] won't. Um so no phones till 8th grade.
[00:11:12] And again, a very simple phone that's
[00:11:13] for communication. We have a laptop
[00:11:16] cuz my oldest needs a laptop for
[00:11:18] school. The younger two not really yet.
[00:11:20] They're handwriting assignments, but um
[00:11:23] we have a contract for them that they
[00:11:26] sign, and there's one for their
[00:11:27] laptop and there's one for other
[00:11:28] devices. And we agree as a family on
[00:11:30] these terms and they sign the contract
[00:11:32] and we put it on the bulletin board in
[00:11:33] their room by their desk. And um it's
[00:11:36] you know, don't get on group text
[00:11:39] threads. If people add you to a group
[00:11:40] text thread, you have to get permission
[00:11:42] from your parents
[00:11:43] >> to uh be on that thread or to add people
[00:11:46] to your contact list, things like that.
[00:11:48] Um Thorn for Parents is a great resource
[00:11:50] on this. They have templates of these
[00:11:52] contracts. I mean, we just drafted our
[00:11:54] own as like
[00:11:55] >> Thorn for Parents.
[00:11:56] >> Thorn for Parents. So Thorn, um, is doing
[00:12:01] a lot with online and digital safety
[00:12:04] [snorts] and they realize to your point
[00:12:06] like it's overwhelming for parents to
[00:12:08] know how to raise kids in this
[00:12:10] environment and um so they're kind of
[00:12:13] spoon-feeding parents. And if you go to
[00:12:15] Thorn for Parents, there's all
[00:12:18] these links just educating parents
[00:12:21] on like what they need to know and how
[00:12:22] bad these apps and games can be and what
[00:12:26] predators are doing, but then it also
[00:12:29] will spit out a template for you to have
[00:12:31] that conversation with your kid, how to
[00:12:32] have the conversation, and then a
[00:12:34] template contract if you want to go over
[00:12:36] some terms with your kids around
[00:12:38] boundaries with online devices.
[00:12:40] >> Wow, that's good to know. I'm going to
[00:12:42] check that out tonight.
[00:12:43] >> Yeah.
[00:12:43] >> Thorn for Parents.
[00:12:45] >> Well, thank you. That's uh
[00:12:48] >> uh Carly Ryan Foundation, another great
[00:12:50] resource. Um,
[00:12:52] Carly Ryan was lured to her
[00:12:56] rape and murder in Australia.
[00:12:59] And uh, her mom, Sonia, has been
[00:13:01] fighting ever since to uh, change laws
[00:13:04] in honor of Carly. And on the Carly Ryan
[00:13:06] website, they have a resources section,
[00:13:08] and they have one-pagers on all the apps
[00:13:10] from Roblox to Discord to YouTube. And
[00:13:13] that's like a one-pager for parents to
[00:13:15] know how to adjust the settings on that
[00:13:17] app or should you just not have the app?
[00:13:19] Yes, that's really helpful, too. That's
[00:13:20] really helpful.
[00:13:21] >> Wow. I'll I'm checking that out as soon
[00:13:24] as I get home, too, man. Thank you.
[00:13:26] Thank you. Yeah.
[00:13:27] >> You ready to get into the interview?
[00:13:29] >> Yeah.
[00:13:29] >> Let's go.
[00:13:30] >> All right. Let's do it. No matter where
[00:13:31] you're watching the Shawn Ryan Show from, if
[00:13:33] you get anything out of this, please
[00:13:35] like, comment, subscribe, and most
[00:13:38] importantly, share this everywhere you
[00:13:41] possibly can. And if you're feeling
[00:13:43] extra generous, please leave us a review
[00:13:46] on Apple and Spotify podcasts.