AI That Cares: Tim Estes on Building a Safer Internet for Children
Sam Acho: Welcome back to the Sam
Acho podcast featuring Cliff Marshall.
And we have a very, very
special guest today.
Uh, his name is Tim Estes, and I'm
actually gonna read a little bit
of his bio because he's actually
in a world that I think a lot of
you all might be interested in.
So, uh, Tim Estes is a pioneering executive in the artificial intelligence and natural language processing domain.
He founded and led Digital Reasoning, an AI leader in the space of unstructured data analytics, from 2000 for 20-plus years. Estes envisioned a means by which computers could learn to accurately interpret language, understand context, and extract critical intelligence.
As you can see, we're gonna have a conversation about AI, but this conversation is a little bit different.
So, since Tim's time at Digital Reasoning, he actually started a company called Angel AI. And specifically, what I think we're gonna talk about here is this app called AngelQ.
It's an app for kids ages five to 12 that protects them from some of the online, I'll call it predators, if you will. And Tim, you can even go deeper into what we see sometimes as we're browsing, or on social media, or even some AI-generated, uh, information.
And so, without further ado,
Tim, we're so glad to have you.
Tim Estes: Sam Cliff,
thanks for having me on.
This is gonna be fun.
Sam Acho: Yeah, we're excited.
And Cliff, so Cliff and I were talking beforehand, and as soon as Cliff heard we could have you on the podcast, he's like, dude, let's get it scheduled 'cause I've got some questions.
Cliff is a father as well. I'm a father. Tim, you're a dad. On this podcast, we talk about faith, talk about family, talk about football, talk about finance. We try to go first, try to give space. We try to grow hope. And so this conversation is really central to a lot of those topics.
So Cliff, I'll let you go first as
we start our conversation with Tim.
Clif Marshall: Yeah, Tim, I'm so fired
up again to have you on the show.
I guess, you know, Sam just mentioned it.
We talk on this podcast often about
football, uh, you know, finances,
faith, but today we're really gonna focus in, hone in, on the family, because I have two kids, right?
Uh, Sam is a father of four, and
Tim, I know you are a father of two.
And so I just see my kids, a 10-year-old and a 14-year-old, and the interaction they have each and every day on the World Wide Web, the internet. And so I want to know, number one, about your family, and then number two, about how this Angel AI company was started and how your kids would be affected by it.
Tim Estes: Mm-hmm.
Yeah, no, I'm happy to.
Um, and so yeah, I'll probably share more
here than I have in some other places.
Um, you know, because of knowing some of the threats that are out there, I'm fairly careful with what we talk about with our kids. You probably have similar kinds of, you know, guidelines. Y'all want to be open and transparent with people and your audience, but we also have a first obligation to protect our families, right?
Clif Marshall: Right.
Tim Estes: So we're always figuring that out. Um, so I've got, uh, two boys, nine and six, and, um, a lovely bride of 10 years. Um, I think that, where to begin with this?
Um, I've been in this area for a long time. I started my first company when I was 20 years old. I was a third year at UVA, and, you know, I went on this journey where I got put in the weirdest places, from, like, classified facilities, finding terrorists with AI in the early two thousands, to going into Wall Street and these global banks like Goldman Sachs and others, and finding bad guys there.
And so I spent about 20 years going through finding all these bad guys in different areas, and then occasionally doing some really good things, like using it to find potential cancer patients for HCA at scale, so as to find, uh, grandparents that might be sick and get to them sooner.
So I spent a lot of time on that, and, um, in that journey I got exposed to one particular effort, a group called Thorn. Uh, it's based outta Los Angeles. It's, you know, Ashton Kutcher and Demi Moore's charity; they focus on sexual exploitation of children online. Um, and the stories they told me. And this is all happening before I'm a parent, right?
So, you know, it's one of those things; we've all lived through the journey, all of us. You're one kind of man, one kind of believer before you have a family, and then you're a different kind of one on the other side. Um, you know, there is no better force to bring about humility, I think, that God intended than to give us kids, because you feel the limits of your power in influencing that. You know, they're these independent people; they may be little people, but they're their own people. Uh, and we get to kind of nudge them along and we get to watch over 'em, and that's stewardship. Uh, but at the same time, you also feel a little powerless. Like, you can't control it; if you try to control it fully, try to shape it fully, you end up running into, you know, various human limitations and walls.
Um, and there's a big bad world out there. I mean, there's a lot of goodness in the world, right? But there's a lot of badness. And that Thorn experience, I brought it up and walked my story to that point because they came telling us about all these children that were being essentially trafficked for sex, like sold online. And so, uh, Sam had shared this amazing Tim Tebow clip with me, you know, in the last week or so, where he was on Sean Ryan's show and just laid out how horrifically endemic sexual material involving children is, spread over the internet. And we were exposed to that by Thorn, who had done the work. They interviewed survivors, all these kids that had been advertised as escorts online. We're talking like 10-year-olds, 12-year-olds, on the open internet.
There's a place called Backpage. You could get to it, you could get to somebody's ad and call their pimp, two clicks from Google. Okay?
And it was out there for a decade, doing nothing but evil. And we actually worked with them to build a system that would go pull all those ads and figure out which ones were the kids. 'Cause you've got, you know, escorts on there, which is still illegal, still wrong, but they're adults. And you've got these children. And these children are the ones being, you know, it's one of the most egregious evils on the face of the earth to do that to children.
And so we got exposed to how bad that was. And we built a system that literally crawled all those ads; we had like a hundred million of them. It sounds insane that we literally had an index of a hundred million escort ads, probably a million or two that were unique, and then, you know, copies upon copies of others. And I think we estimated something like 6% were actually kids. And so just think about the math of that, what that means in terms of how many kids were being sold and trafficked. Two clicks from Google.
Okay.
So I come up, before I'm a dad, right, against this kind of evil, like, in my face. And we're working with law enforcement. We're trying to help them find these kids and take 'em down, and Thorn's the one who deserves the credit. It was Thorn, you know, that gave us the resources to build the system. They had the vision for it. We were just the technical guys who could figure out how to build out parts of it.
So then I become a dad, and, you know, I'm naturally gonna be concerned: okay, this is what can happen to your kid. You know, don't let him too far out of your sight on vacation. You know, that type of feeling we all have. Um, and so this AngelQ piece emerged as a second company. I finished up the first one in 2020 and kind of wrapped up in '21 with the acquirer.
Um, in '22, I came across this story of a little girl named Nylah Anderson. And so Nylah was a 10-year-old girl; she went to charter school, spoke three languages, really smart, and was just on TikTok, you know, watching dance videos, 'cause it was just silly dance videos, silly singing, whatever, right? That was all in the pandemic. All the kids were doing it. Everybody was being cool about it.
And this algorithm served up a video to this little girl called the blackout challenge. And, uh, she never looked for it. This is something where the technology decided it was gonna put something in front of this little girl because it knew that this negative, crazy stuff would get her eyeballs and they could make money off of it. So this sweet girl goes into her mother's closet, hangs herself by her mother's purse, by the neck, emulating this, and suffocates herself. Her mother, Tawainna, comes home and finds her and takes her to the hospital; she dies five days later.
Clif Marshall: Hmm.
Tim Estes: And that got to me. I say this in a couple of contexts: that Thorn thing came in originally in 2012, so this is like 10 years before this. So 10 years earlier I had a moment of, this is evil, not in my country, do something about it. And then this happens as soon as I get done with my last company, and I'm like, this is evil, not in my country. Do something about it. And, um, I was like, you know, this can't be allowed to happen. And also, when you're a parent, I mean, I'm sure you all can identify with this: something bad happens to somebody else's kid, we don't go, oh, well, that's just too bad. We think about what would happen if it was our kid.
Okay.
God gives us this empathy, encountering suffering and having love for our own close people, to help us open up our hearts to other people, right? And so part of this is, you encounter something that bad, and yeah, it hits you. It's like, this could be my 9-year-old boy. He might be overconfident, watch something like this if he was allowed on that stuff, and then that might be what I came home to, right?
So that stuck with me. And I think that we have this opportunity, you know, in this life, to take those moments and let them activate us. You know, because it is hard, affecting something as big as, you know, TikTok and all these kids exposed to social media. Huge, huge problem. You know, truly a Goliath-scale problem.
Um, and I'm just like, okay, I don't know where you're gonna take me with this, but I wanna put an AI between every child and the internet and give that over to the parents, so they can know that something is kind of working on their behalf; it isn't doing their job for them, but it takes parenting online. And that's where AngelQ comes in. We thought, okay, we could build something that would be an AI that would serve the family, and it would be a way for kids to search and be curious, but be a hundred percent safe. And that's kind of where that goes.
So, a long journey, 25 years, that I've tried to compress into the last maybe three or four minutes. But that's, you know, that's what got me here, and a little bit about me and my family and the things that actually drive us.
Sam Acho: Tim, just to follow up on that conversation, you talked about a couple of those high-water-mark moments, those steps that are etched in your memory. What was the process like of actually starting AngelQ?
Tim Estes: Well, so it's very different the second time around, right? And so this was my second go at it. I had done decently from my last company, so I didn't come in kind of cold. I didn't come in completely unresourced. And so what happened was, I heard that story and I started thinking about what that would look
Clif Marshall: Okay.
Tim Estes: like, what would an AI that could sit between the kids and the internet look like?
I went to the people I knew. Um, one of the smartest guys who had ever worked for me at my last company, a guy named Brian, became the CTO, the Chief Technology Officer, for AngelQ. Um, and he and I started iterating and talking about how it might work.
Um, and then I shared it also with a woman named Randy, who's my chief marketing officer. She was the best mind I knew about how to reach people and build a brand, because it became clear to me that if we're gonna succeed, we needed to think about how parents would perceive what we were building. Like, AI could be very scary. How can you turn it into a positive tool for the parents? What are the values you have to have imbued in that system and in your brand to make that stick? And Randy was the best brand mind I've ever known, and so she became my chief marketing officer.
What you do is you go through and you find the right people. And then the last one was just fascinating. It's someone I went to high school with, that I didn't know that well, um, but had immense respect for, a guy named Josh Thurman.
Um, he had this story: he'd gone to Wake Forest, 9/11 happens, and he decides he's going to go become a Navy SEAL. Not an easy thing to just decide on a whim, 'cause he was actually going to go to, like, Yale Divinity School. So he was either gonna go to med school or Yale Divinity School, and then he ends up becoming a Navy SEAL. Not only a Navy SEAL, he ends up on Team Six. So he ends up on the real deal for 12, 13 years.
I'd heard about him, and, like I said, I knew him from when I was younger. We had some friends in common. Uh, I was not a great athlete by any means when I was in high school; he was an amazing cross-country athlete. And, you know, we had a lot of friends from that, because I was basically the cross-country water boy back in those days.
Uh, and so yeah, I knew him and respected him. But, honestly, true story: I was at my oldest kid's school meeting with the headmaster, just, you know, a normal check-in. They bring the parents in once a year and give you some feedback, whatever. All good stuff. And, uh, it came out that Josh was actually the headmaster's roommate in college. And I have no better way to say it: it was almost like it came from above, like, Tim, you need to go talk to Josh.
And so, just randomly... I hadn't talked to him. I'd met him briefly in 2018; hadn't talked to him since. Um, when he got out of the service, he went and met with me when I ran my last company, and we had a good rapport, but I hadn't talked to him in, at this point, like four years. And I was like, you know what? I think I'm gonna build this thing. He had all this track record of execution, of operational excellence when the highest things were on the line. I thought he could be an amazing operating partner for this.
And it took him a while; he was actually involved in another company he'd helped found, so I kind of had to recruit him out of that. But he basically went ahead and made that call and has been an amazing partner, you know, just like my other two co-founders on this. Um, so I think if I was gonna give you the baseline: you gotta get your core team first. Your team and who you work with is everything. And so that was where I spent the first year.
Um, then once you've got the people... So 2023 was the initial investment, figuring out a little bit of the product, talking to a lot of people who didn't invest in it. You're gonna get 20 nos for every yes. Maybe you might get a hundred nos for every yes. Uh, it doesn't matter, you know. Okay, so last time around, I raised a hundred million dollars for my last company and created kind of the standard for compliance and surveillance in the top-tier banks. So I had a decent track record from my last go-around.
It doesn't necessarily matter, 'cause this go-around's different, right? You're building an app for kids. Like, if you went and built something for Goldman Sachs, we'd probably fund that again. You know, that's kind of how the attitude goes. So investors aren't necessarily motivated by, oh, this mission needs to get solved. It's more, hmm, heck, you're probably gonna make a lot of money, that's good, right?
And so we had to search and find the kind of people who believed. And interestingly enough, my initial investors were people that had been investors in my last company. They just decided, you know, we know you. We've worked with you 20 years. We're in. These are people I didn't ask to invest. They just sort of said, yep, we're in, and that was that. So what I also say is, when you're building something new, it's almost certain that the first people that back you just love you. I mean, they kind of will back you in whatever. So it's your character, what you bring into that, your track record, that opportunity.
Um, so yeah, I think it's getting the team together, you know, cooking up the idea in a way that's thoughtful, and getting it together. Find the people that will basically invest with love initially, and then get far enough, showing enough profit or, uh, progress, that, you know, maybe new people you didn't know would jump in, which is what happened for me in '24. So we started to get some really serious investors. We got Reid Hoffman's seed fund, Village Global, into it, which was awesome.
Um, you know, uh, Jason, or, uh, sorry, Jeremy Achin, who built DataRobot, a multi-billion-dollar AI company. Uh, his fund invested; Jeremy's on the board with me. He's been an outstanding investor.
And then, uh, we had this really awesome group that kind of came in at the end called Magnify, which is tied in with Melinda French Gates and Pivotal Ventures and so on. And the two amazing women that run Magnify, I mean, they're all about famtech. They're literally probably one of the truest seed famtech investors in the country. And we're lucky that our investors just happened to run into them, and then we fit the mission they wanted to be on. So we end up with, you know, three funds that cared about mission, deeply connected, you know, giving us the runway to build this.
And, you know, a year later we built it, we shipped it, and now we're gonna take it to the next level. So, um, yeah, but it's always building; it's building on the trust of other people to sign up and be on your team. Then the work with that team to figure out what it really is, like, what you think it is, it's gonna change, it iterates. Then the people that love you have to kind of pull the trigger to give you the resources to even take a shot at it. You gotta make the most of that shot, because that's what gets you to the next milestone of, oh, we get the backing to do something real. And that's kind of been the journey for the last two and a half years of AngelQ.
Clif Marshall: Tim, I wanted to just quickly read the mission, or part of the mission statement, that you have for AngelQ, just so people get a great understanding of what this is and how important it is. And just hearing your story, I have a ton of respect for you, because there's a problem, right? And that problem is online, and that problem is algorithms and exploitation, and it's just dangerous to children, uh, throughout our world.
But part of your mission statement says: an AI that cares, because its objective function is to improve and protect a child's wellness. One that will do everything in support of them. Look out for their mental, social, and emotional wellbeing. Know them better than anyone, but never sell or exploit that knowledge. Partner with parents to make sure their child is protected and enriched. An AI to bring families together in new ways, both online and offline, every day, all the time, to make a lot of healthier and happier humans. That's the mission behind AngelQ.
Tim, I want to ask you, uh, what does success look like for this company 10, 15, 20 years from now?
Tim Estes: Ooh, that's a, that's a big one. Um, you know, one of the challenges when you're running something is, my mom used to say this, life is so daily. And so you get focused in on the urgent; you know, there's the urgent and there's the important, right? So I love the question. It's gonna make me think of the important for a change. Not every day do you get a chance to think of the important stuff.
Um, so I believe the last 20 years of technology have kind of gone backwards, okay? So the fundamental purpose of technology is to make us more productive, to augment ourselves, to create kind of an augmentation of ourselves. Productive, healthier, wealthier, all this goodness. That's the history of civilization and technology. Something weird happened, you know, between like 2005 and 2010 with the number one applications, the killer apps of the internet: search, and then social. What happened was they went from being useful. Like, initially search was, find the stuff you needed to find, 'cause there's all this stuff out there. And social was, find the people you need to find and learn what's going on with 'em. Those are the two killer apps.
They started using those killer apps to then hack people's brains. And this is not speculative. Sean Parker, one of the co-founders of Facebook, you know, ran it with Mark Zuckerberg early on. He was at a conference in 2017 at Stanford and basically said that their whole purpose was hacking the human mind, and they had thought through what it was gonna do. And then he said, verbatim, God only knows what this is doing to our children.
Okay.
So I give you that kind of anchor point, right? We've had 20 years where they turned technology from being a tool into a destination. The destination is, you're hooked on it, because they basically use digital narcotics, okay? They come up with ways to fire off dopamine, like, you know, by not knowing whether someone's given you a thumbs up yet or not, or not knowing what's gonna pop up next in the feed. There's all these little tricks they're doing all the time. There's a science behind those tricks that makes people need that hit over and over again. So it works very similarly to addictive narcotics.
Um, and what's interesting is, one of my good friends, Clare Morell, she has a great book called The Tech Exit that's coming out in, I believe, like two weeks here, I think next week. Um, she makes a great point in her book, which is that real relationship tends to require physical engagement, oxytocin. It's why, frankly, you hug your wife and you feel something, you know, or your kids for that matter. Virtual cannot replace that. It can wire our brains to want something else and think it replaces it, but it's actually a false thing.
Okay, so I know you asked me one question; I needed to build up to this. Okay? I want to reverse as much
Clif Marshall: Yes.
Tim Estes: of that as
possible in 20 years.
My hope is that the world we have today, where kids are walking around looking down at their phones, and I do think it's metaphorically interesting that they're looking down, not up, changes. I want a world where, for Gen Alpha and the generation that comes after that, it's lame and uncool to be on your technology that much. Why? Because that means the technology is dumb. It's not useful, it's not productive. I want a world where they're on a screen 20 minutes a day, because that's what it takes to talk to their AI, and their AI goes and does all this stuff for them and comes back, and actually values human time.
Alright?
See, the technology is actually there for it to be built that way, to do all this stuff for us, to do things that would take us weeks and get them done in minutes now. Like, that tech is the revolution of the last two or three years. Alright? If that was designed in a way that wasn't about trapping people's attention and trying to hook them... I mean, in what universe is it actually healthy for kids to be looking at a screen eight hours a day? Like, I mean, that's not childhood, that's not even life.
And so I believe that to change that vector, you gotta start kids early and young, which is why at AngelQ we're focused on this sort of five-to-twelve-year-old early piece. And it's really about giving the kids the ability to explore, to explore safely. But there's nothing we built into this thing to keep them going and going. They have to have the drive. You know, you watch a video in AngelQ and it just stops. It doesn't take you anywhere. Why? Because we think if you watch a video after you search for it, then before you watch another one, you should decide. You should stop, pause, and decide: is that what I wanna do next, go spend twenty minutes watching something else? And it's amazing, if you don't trick them into just continuing to watch, most times the kids will just check out. They'll be like, yeah, I think, you know, I'm just gonna do something else. It isn't interesting enough for them, um, because they know there's something better.
So I think that, you know, my dream is we have tens, hundreds of millions of kids that are on AngelQ, or a derivative of it, that they trust as an AI. Not because it's their friend, not 'cause it's their companion. We're super clear: there's some really dangerous stuff going on out there with people trying to turn AI into companions right now, being like a substitute for relationships, and that is not a road we're going down. We do want it to be your C-3PO and your R2-D2. We want it to be the thing that's, you know, kind of cute, kind of fun, maybe you think well of it, but it really is not there to replace your human relationships. It's there to free your time up so you can have more real relationships.
Sam Acho: Tim, I want to follow up on that. There are two things that come to my mind. You talked about how AngelQ gives children the decision: do I wanna watch again? I think about some of the other apps that we use, that other people use, and it's like as soon as the show ends, 5, 4, 3, 2, 1, the next show's already up. And to your point, it's not an accident; that's by design, right? So I think about that, but also, following up on Cliff's question of what success looks like in 15 years, my follow-up is: what would hinder you and AngelQ from being successful?
Tim Estes: I mean, so there's the stuff that normally, you know, you've gotta deal with as a company. So you've gotta keep having the funding to build out and to keep, uh, essentially expanding. I mean, we run on just Apple stuff right now, right? On iOS, we run on iPhones and iPads. We gotta go get this thing on Android. We gotta find clever ways to put it on TV set-top boxes. You know, we have to do that kind of work, and that's gonna take time and resources. So there's that kind of stuff. There's finding the best people to do it. These are all the things that just build a business. But there are a couple of things that are not just related to building a business, that are specific to AngelQ, that could stop us. Two in particular.
One is, I don't think the thing we're trying to pull off works without the parents being a partner. I think a parent has to decide. Um, I mean, a way to think about this is: if your kid is staring at, you know, a device that you've given them, and there are four or five super addictive apps and games on it, and you're giving 'em the freedom to use their time however they want, it's almost like you take 'em out to a restaurant and you don't make 'em eat dinner first. You just say, pick what's on the menu, and they go straight to the desserts. Like, parenting is not that.
Right.
Parenting is knowing that, you know, if a kid had the freedom to choose their way forward, they'd probably eat Skittles every night. Okay, especially younger ones, until they had a bellyache, and maybe they'd pick a different kind of candy thinking that was the problem, before they finally give up on the sweets because it makes 'em sick. So that's normal kid behavior, right? We as parents intervene all the time and nudge them for their own good, and then they grow up and they start respecting that, they start understanding that, and they start growing in their own maturity. Um, we're trying to do that digitally, right? We're trying to do that in the digital space. And what that means is we need to partner with parents on that.
Okay.
If you've got AngelQ, you don't need YouTube on there anymore. And so keeping YouTube on the actual device, the kid's device where they could get access to it, is just asking for trouble, 'cause it's just rabbit holes, left and right. I mean, you mentioned the autoplay; that's not great. What's really bad is when the autoplay is something weird that shows up right after what they were watching, that the algorithm decided is related to it, but it's nothing like what you would want 'em to watch. Or there's a comment, because it is kind of a social network, and the comment might, you know, drag them down some rabbit hole.
In the world we have today, since it's all monetized based on attention, these different apps are all monetized based on attention, they're going to want your kid to go down as many rabbit holes as possible, because that means they're down the hole and they're not getting out. Right? So they're not going to protect your kid from going down these rabbit holes. So that means part of the parenting process is, you know, probably substantial limits on the stuff around AngelQ, and treating it, at best, like dessert, like games.
Yeah, I grew up playing some video games too. I didn't spend all my time doing it, but, you know, I don't think I ended up totally, you know, antisocial because of it. So I think there's a lot of legitimate, um, you know, play there. In fact, probably one or two people on this, uh, podcast, Sam, have actually been characters in games, is my guess. Uh, given the pro history. And so I think that's one way the kids love and follow people. So I totally think it's legit. But, you know, if a kid's gonna spend eight hours a day on Madden, I mean, that's probably not great either, right? So there's gotta be these limits.
Sam Acho: Hmm.
Tim Estes: And I think that we have to work with parents who have had enough, who don't want their kids to be addicted to these things, who don't want to have their kids be dysregulated. Um, and AngelQ becomes kind of a safe go-to option. So, you know, the kids don't need to go do research on Google without you there. They can go do the research with AngelQ. Um, if they're gonna use those other tools, it's probably to do it with them. It's probably taking the effort to say, okay, you wanna see something on YouTube that you can't get on AngelQ? We'll do that together.
Okay, uh, you want to go search on Google and try to dig into something, for whatever reason, because what AngelQ's giving you isn't enough yet? And by the way, every month or two we're gonna be expanding that, so our hope is that the set of things it doesn't do is gonna get smaller and smaller. Um, but our goal would be that we become the safe option when the parent can't be present. That requires the parent to sort of... I mean, it should be a good thing. They can say yes to it, right? Without the risk. Uh, but on the other hand, if they just let anything go, because you're just trying to buy space as a parent and let anything go, that's a hard thing for us to succeed in, because like I said, we're then up against people who are actively trying to addict your children to technology so they can make money.
And we are actively trying not to. You know, those are called dark patterns. If you dig into the literature about this stuff, like, um, endless scroll. Right? Endless scroll is known as a dark pattern. It's because they figured out that if your brain was not given any natural place to stop, it would just keep going, and it would flush out what you just read from your memory before you got to the next thing. And by that happening, you lose the impulse control, and you spend 45 minutes on a social media thing when you would naturally spend five.
Okay?
And they figured this out; Aza Raskin figured it out. Uh, and he's kind of been repenting of it ever since. He went and is now part of this group called the Center for Humane Technology with Tristan Harris. They're the ones that did The Social Dilemma, that movie on Netflix. And they cracked it open. They kind of said, this is why it's all bad. And these are called dark patterns.
We've tried at AngelQ to put light patterns in, you know, natural stopping points, natural places for kids' judgment to come out and decide whether they wanna do more. We're gonna keep improving that, trying to nudge kids off the app to do things offline. That's the big priority for us in the near future, and it's where the best stories we're getting from parents come from. I mean, there are great stories from parents that have had big insights about their kids, but there's also great stories of them going and doing things together. You know, we had one parent that ended up going out, and they adopted a family pet as a result of the investigation they had done on AngelQ to pick out what kind of pet. And so now there's, like, a memory of that.
Sam Acho: No, Tim, this is super helpful. You talked about the parent involvement. So, a lot of people don't know, I got a chance to test out AngelQ early. I got a chance to use it for my family, for my kids. And to your point, there's this ease of use. Kids can go in and have this conversation with this AI, you know, you say, hey, I wanna watch this video, I wanna watch that video, and it shows it, and it's great. It's like the safe place. But the best part about AngelQ for me so far has been the parent involvement.
What do I mean by that? There's a technology within the app that essentially allows parents, yes, to be able to control screen time, but more so, for me, it lets you know a little bit of what your child has been searching, what they've been asking about, the conversations they've been having online. And now, I've got four kids, and with one of 'em, it's been a challenge at the school. And I saw the AngelQ report for the day, and it said, man, you know, your child loves to learn, is really inquisitive, and wants to grow. And we noticed that by the questions he asked. The question he asked was: how do I get smart? How do I get smart?
And that question provided a lot of
insight for me because as you know,
whether you're at school or even
at home, there are conversations
that are happening that you may
know about or may not know about.
And for me it was this reminder of, oh
man, let me remind my child of who they
are so that the world on the outside,
whether it's friends at school or maybe
stuff that he's searching, doesn't make
him think that he is what he isn't.
And so one of the biggest benefits for me, you talked about the parent involvement, is this insight feature. I think it's actually called Angel Insight. It's, it's Insight...
Tim Estes: Insights. It's a weekly kind of digest, uh, with a summary of what your kid's interested in, and then also kind of some suggestions around it.
And so what it did for you, Sam, is what we dreamed it would do. It's basically something to bring parents and their kids closer together. Uh, it's interesting: those insights are not things we're ever using against your child. There's never an advertisement, there's never information going anywhere. The AI's learning about your child is actually just to supply you, so
Sam Acho: Hmm.
Tim Estes: it can be something that helps you make that outreach, you know, to your kid.
Sam Acho: Wow.
That's good.
That's good.
Uh, no, just for me, like you talked about, our kids are in a world where a lot of 'em are spending time online. They may not be living online, but they're spending time online. And so, just as a parent, to be able to partner and have that insight, for me, it just provided another tool to be able to help raise and train my children, and to get some insight into what the world may be trying to tell them.
Tim Estes: Yeah, it's interesting. Sometimes your kid will bring up good questions to you, and sometimes they'll hold back, you know. It's not that they don't trust you as the dad, or, you know, in this case, the mom. Um, it's that they want a bit of a judgment-free zone, at all levels, sometimes, when they're curious and they're uncertain. And, um, I'll give you a recent one. So my son was asking questions about, um, you know, being a Christian, like, were there men around when the dinosaurs were here? He started asking about, like, prehistory, and he asked me about it, and I had, you know, a fairly long, and hopefully not totally abstruse, kind of explanation for him about all that.
But, you know, he's nine. I mean, these are the ages; they're gonna be wrestling with different things. They're learning from different sources, and it'll contradict sometimes, right? You wanna make sure they basically have access to truth, but even more than just access to truth, you want them to start being able to discern the truth, right? To figure it out. Because we're not always gonna be there. You know, we're always there for them if they reach out and we know, but they're gonna have a lot of points in life where... we have to build these resilient kids that turn into resilient adults. And, um, my hope is that what AngelQ does is it actually allows them to explore the ways they naturally want to, but without the risks that should never have been there.
Right?
I mean, it's not like we're cutting off the internet they're entitled to. We're basically saying: the red-light district, there's no access point here.
Clif Marshall: Hmm.
Tim Estes: So, like, that's how it always should have been. It was almost a corruption that these things were built the way they were. I mean, it would be like, you know, having a town where you took every single "alcohol: 18 and over" or "21 and over" sign off every single door, and kids could roll into any bar they wanted. Like, that's how they built the internet. And, um, you know, that just ends badly, and it's not just because alcohol for young kids is bad. Of course it is. But think about the people that are in those environments and what they're actually exposing them to, and what they might be dealing with. And that is not what we want our kids engaging with, because for healthy development, they have to go through a different journey with their family and their peers to build up this resilience. And if you just throw them out in the wild, if you throw them in the middle of Times Square without, like, you know, a guide or anything, and they can wander anywhere, don't be surprised bad things happen.
Uh, and I think the antithesis of that is to have something safe, where they can be curious, they can push, you know, and be interested, and know that they can grow without judgment, and the parents are there alongside that journey to understand it and find ways to intervene and love the kid, you know, through all the hard things.
I mean, my 6-year-old, I haven't talked about this that much, but he's neurodiverse, he's a little bit on the spectrum, and, um, his name is actually Angel. And that's one reason I named this app that. One thing you find, and it's actually a real tragedy: these apps that are designed to be so addictive are 10x as addictive for neurodiverse kids. These apps just abuse these kids. Because what they do is they dysregulate them, they make them hooked, this dopamine-hit thing I talked about earlier, when you take a kid that's kind of struggling with those kinds of difficulties in terms of how their brain developed. And as we know, this is getting super common, especially among boys. Boys that are the age of our boys, it's probably 10 times more likely that they have these kinds of struggles than it was 15 years ago, 20 years ago. It's amazing how much it's exploded. And they realize that they can hook these kids, and the kids can't break away.
Okay.
And so, you know, I can sit here and pound on how evil I think that is, but, you know, I've got one, right? And so, like, that's my boy. This is more personal, right? It's not just abstract. And so part of this is: how do you give kids that are coming from all these different diverse places something that, you know, does have their wellness in mind? Um, and, you know, we have to build a business model that works that way. I mean, we're saying no advertisement. We gotta have people that support and actually use and pay for this. And it's transparent, right? It's like, this is how much it is a month to use, and that's because it costs us, you know, X or Y to actually deliver it, and we're serving it for you.
So, um, I think that's the big stuff going on now: the kids, especially the neurodiverse ones, need a place that's not just safe, but that isn't trying to exacerbate their weaknesses, that isn't trying to play to those weaknesses. And so mine is probably even more of a user of AngelQ, because anytime he's got a question and I'm like, I stumble, he's like, oh, let's ask Angel. You know, it sort of becomes a cheat code for parents. Those of us that are using it, we know what we're talking about.
Like, they wanted to know, and this was crazy, 'cause it's a classic case of: as a dad, you kind of get this little aura of knowing most everything, but you really don't. And so with the kids, you can laugh through the fact that you don't know. So they're like, well, what's a number bigger than a googol? You know, a googol, a number that's a one with a hundred zeros behind it. And AngelQ said, well, a googolplex, and that's a one with a googol zeros behind it. And then of course they don't stop; kids don't know when to stop. So they're like, well, what's bigger than that? You know, it's like, who's gonna win? And it came up with this thing called Graham's number, which, like, I'd done some math in undergrad, and I didn't even know what this thing was. But basically it's a number that is so big you couldn't write it out using all the molecules in the universe.
Clif Marshall: Hmm.
Tim Estes: And so now my little 9-year-old and 6-year-old are, like, quizzing their peers: do you know what Graham's number is? They've got this little piece of knowledge, and they can just use it and love it, and I think that's what their childhood's supposed to be like. Finding new things that are positive, things that are interesting. Sharing with others, and the others sharing with them. And it builds up, you know, a kind of community, a community of learning and curiosity.
I think what's happened is a lot of our young people face, and my kids haven't faced it fully that much yet 'cause they're young enough, but for the parents that have, you know, 10-, 11-, 12-year-olds going into high school, these engines of online communication have basically all been amped up on comparison, right? Like, how do I look versus this other girl? How is my life versus this? And, you know, there's almost no sin that is as pernicious as envy, right? It just tears through things. And you've got a whole engine of envy and insecurity being used to hook our kids to spend more time, because a few companies make billions off of it. And we should get mad about that. We should not settle for that. Uh, and that's kind of where we come in.
And I know one of my frustrations is we're going as fast as we can. We're covering these younger kids, but, man, we're gonna have to build something for the teenagers too, because they've got a whole 'nother fight, which is, you know, growing up in a world where their socialization is mostly virtual, which is just bizarre, right? They hacked the kids, and so the kids are wired now to think of socialization as being virtual first. Like, not, oh, did I meet up with, you know, Harry or Sally and have this conversation, or did I just message them, DM them somewhere? And it's probably nine times the latter and one time the former. And so it's just so different now.
And, uh, I mean, one thing that blew my mind... I don't know, I'd love to take a kind of trip down memory lane. Do y'all remember what it was like when you turned 16? You know, whether you, like, lined up at the door of the DMV to make sure you got your license that day, right? Or maybe
Clif Marshall: Yes.
Tim Estes: you had that whole learner's permit thing going on, or you got some special consideration. You got to be one of the really cool 15-year-olds that just happened to get one of those, right?
That was a rite of passage. And I think it probably was until about 10 years ago. The number of kids getting driver's licenses, from when they hit 16 all the way up to when they graduate college, has dropped dramatically. And the only explanation that I can give for it, and that various social scientists far smarter than me, like Jean Twenge and others, give for it, is that they've been pulled into such an online world, they don't have the drive to actually go out and have real connections and relationships. Like, that's super concerning.
Especially, and let me put it this way: we have a diversity of basically beliefs and, uh, views on all kinds of things on my team at AngelQ. So what I'm gonna say here, I'm gonna say just in my personal capacity, speaking kind of as a believer, 'cause I know on this podcast y'all dig into these areas of, like, faith and how we're supposed to live in this life. Um, there's something deeply wrong and deeply missing when we don't have embodied relationships. We talked about the oxytocin earlier, the physical thing that we're wired by design to have, that possible bonding with people. We have a whole generation that's being conditioned to not engage with people as people, basically engaging in the virtual world with people as things, as, you know, means to be used.
And it's almost like abuse: when somebody takes advantage of somebody else, often there's a history of where they were abused themselves, right? There's a lot of correlation in that. So what's interesting is, I think there's gonna be a similar correlation when you have people that are brought up with living on the other side of screens as the primary way they interact. Um, I think the dehumanizing that's implied in that lives on in all the ways they treat people. I think empathy gets to be super hard. You know, and that's key.
I mean, at the root, you know, of all love is empathy. It's putting yourself, you know, behind somebody else. And that starts with viewing someone not as a means but as an end in themselves, that they're worthy, that they have human dignity, 'cause they were created in the image of God. Like, when you think about what we've substituted, how is any of that gonna break through? Uh, in fact, we've probably created one of the only systems that takes the truth of how people are embodied and in relationships and, you know, kind of obfuscates that truth by changing the way they interact. And it's so ubiquitous that people forget that, you know what, what's great is when I've got friends that will come help me move a couch. You know, not when I've got a hundred likes on Instagram and none of them would show up for me if I needed them.
Right.
So like, that's the difference.
And, uh, like I said, when I say I wanna turn that back, I don't necessarily mean I wish we had never come here. I want it to go to a better place. The better place probably looks a little more like it was than it is today, and it'll be a little different, because there'll be all this magic with AI where it can help them and be present. And I just think the magic is one way out of the hole. Uh, that's the thing I'm hoping, and we're trying to build, and build thoughtfully: if it becomes super easy for computers to do all this stuff, to go find stuff online for you, why in the world should you use your time up that way? Right?
And, uh, if we can, like I said, make it a universe where 14-year-olds see other 14-year-olds looking down at their phones and they're like, what's wrong with you? I mean, I'm over here, and, like, do you have software that's 10 years old or something? Because nobody does that anymore. That's the world I think we can still get to.
Um, but it's a world that's gonna be fought tooth and nail by some companies. I'll just name one, like Meta, for instance. I mean, there's no way out of the model they've chosen; there's an ethical compromise at the heart of their business that I don't see how they can escape. Okay. Google, for all its issues, there probably is a way out. They can be a supplier of technology, and they can charge for that. They can charge companies and they can charge people, and they could probably have an amazing business. It's just gonna be hard transitioning. Meta is fundamentally a digital narcotics company, so I don't know how it gets out. I dunno how Snap gets out. I don't know how these other guys all get out. It's like asking Pablo Escobar, hey, do you want to go sell sugar instead? You know? And
Clif Marshall: Hmm.
Tim Estes: Yeah, I mean, that's what we've got. And so I think we're gonna see some companies, I hope, that dramatically diminish in value or get totally replaced, and we'll have others that maybe pivot. You can see, like, Microsoft and a couple of others that are trying to find a line through this, to try not to go down the same dark path. What's so funny is, people talk about the gaming stuff, you know, the gaming companies. Most of these had great age verification and protections for kids going back, like, 10, 15 years. Things that could have been put into Instagram on day one, and we wouldn't have had 98% of this total garbage on it in terms of the harms.
Clif Marshall: Hmm.
Tim Estes: Like, it was a choice not to; it was widely known. In fact, it was interesting that when you got an Xbox, it took a credit card to be able to validate a kid getting online. Right? That's what it took. Okay. Kids signing up to be able to do gaming and multiplayer gaming online, going back to like 2011 with whatever basic kind of connection they had back then, had to give a credit card. A parent had to give a credit card. 16-, 17-year-olds, or even 13-year-olds, aren't gonna have a credit card; a credit card company won't give one to 'em, it would have to be something really extraordinary. So that was almost like a bulletproof way to know this is the minor, this is the adult, and the adult is signing off for the minor. Nothing like that was done for these other systems.
And, uh, when you strip away the why, I'm gonna get into a little bit of a soapbox moment here, but when you strip away the why, the reason is: us as adults, we can't spend six or eight hours on a screen today. I mean, we could do it as part of our work and all, but we can't be sitting there, like, doomscrolling for four or five or six hours. We just literally can't get away with it. We've got family, we've got work to do, we've got money to bring in. We've got stuff to deal with. Um, only a tiny fraction of adults can do that, but a lot of kids can. And so what they figured out was that if they get these under-18 kids on there, these ones even earlier, like, you know, 10, 11, 12, ones that legally never should have been on, they're worth probably three to four times what an adult's worth. Okay. It's the same ads, same way they're counting.
So when you realize, when you circle back to why they did this, when it's so straightforward to fix... You know, once you get away from the idea that, oh, they can't fix it, it's just too hard, well, that's never been true. So why do they not put friction in? Well, it's because if they could get enough kids addicted to it, those kids would be worth four times an adult in terms of time spent, because they had the time to waste. I mean, they really didn't. What they did is they traded, you know, being good at athletics, they traded having time with friends, they traded, you know, volunteering with their church, they traded learning a new language. They traded all this other stuff for that virtual drug.
And, um, once you put that math together: okay, so there are 20 million kids, or 30 million kids, or 40 million, that are illegally on these apps, but that's the equivalent of 120 million adults. Do the math. That's why they haven't fixed it. You do the math on that, and you're talking about costing them tens of billions of dollars if they actually protect the kids, versus what they have today. Uh, and, you know, tens of billions of dollars ends up translating, with, you know, modern types of earnings calculations, to probably half a trillion dollars of market cap. And so these companies, they're basically eating up our kids because it's probably made them a half trillion to a trillion dollars richer.
And it's that simple. Like, I mean, if someone wants to come and give me another explanation, I am all ears for it, having been in the trenches on this. 'Cause one thing I've spent time on, in addition to trying to build a tool, is I've spent a lot of time in Washington, DC, a lot of time trying to fight for, like, better safety regulations. I don't think a single tool can fix all this. I mean, I love what we're building, Sam. I'm so excited that you've had a great experience with this, even though it's our 1.0; it's so little of what we hope to finally have. But it's arrogant to think that one company can just do all this.
I mean, maybe, all things are possible and we'll see, but I really think it's a societal problem, and that means our political leaders have to step up and say, you know what, why do we give digital narcotics companies basically full immunity on everything they're doing, which is basically where they live today? Like, there's been very little breakthrough in the courts, despite all the harms. Now, I think that might be changing over the next year. There's some stuff pending with attorneys general and others. Um, we'll see. Just like with tobacco, eventually that reckoning came. Um, right now we're in that pre-reckoning phase, right in the heart of that tobacco phase. We're gonna find out if our society's gonna step up and change it. And, um, yeah, that's the thing. That mission is my calling right now. And, you know, I'll be on it till we deal with it.
Clif Marshall: Tim,
one more thing from me.
Um, we're talking about addiction today.
But not drugs and not alcohol.
We're talking about dopamine, right?
And clicks and likes.
And I read a recent Newsweek article on
your Angel, uh, AI product, and it said,
in quote, if social media is already
a digital heroin for our youth,
enhanced AI will become their fentanyl.
Tim Estes: Mm-hmm.
Clif Marshall: And that hit me, uh,
like a ton of bricks because again, I
have a 10-year-old and a 14-year-old,
and I'm living through this right now.
uh, I think as a parent, you know,
especially as you get to the teenage
years, you feel the pressure of, well,
the kids that my daughter's going to school
with, they all have Snap, they all
have Instagram, they all have TikTok.
And, Dad, why are you holding that
against me? And even: Dad, when I
turn 13, for my 13th birthday,
would you allow me to get, uh, Snapchat?
And so obviously there's a huge problem
and you're working on the solution.
I don't know if the solution is
all the kids having flip phones, and I
wish it could be — that could solve a lot
of issues if they all were, you
know, carrying around flip phones.
Um, I have a ton of respect
for the mission that you're on.
And I guess the question I have for
you right now is, if a parent wanted
to go to the App Store on an iPhone,
is the Angel product already out
there where you can purchase it?
Tim Estes: Yeah.
Uh, Angel Q is live on the App Store.
So if you search literally "Angel Q,"
you're gonna find it — for some reason
you may get it autocorrecting to "Angela,"
and there's like four things that
are called Angela in the App Store.
We've noticed this as a funny thing,
but hopefully as we get more and more
people, it's just gonna go straight to it.
But yes, it's there.
Uh, it's, it's a great tool
for five to 12 year olds.
We do have some older kids,
13, 14, that are on it and love
the answers and so forth.
Picking up on your point though, Cliff:
I think for parents that are getting
that conversation with their kid — you
know, they're 13, they wanna be on
Snapchat or something else — um, really,
I fully endorse some of the ideas that
Jonathan Haidt has put out in The
Anxious Generation, and one of those
is no social media till 16.
Like, I think that — and even
then, I mean, I'm almost wondering,
like, is it really needed?
I mean, it's, it's
Clif Marshall: Mm-hmm.
Tim Estes: one of those things where
the reason is you're literally throwing
them into Times Square without a guide.
And it isn't even just that simple:
there are people lurking,
knowing they're coming.
If you've got a 13-year-old girl going
on Snapchat, she's gonna get essentially
digitally sexually assaulted multiple
times within the first year she's on it.
I just guarantee you. She's
gonna get people sending her
stuff, random people, because they
haven't put these protections up.
They, you know, they have some sort
of protections — Instagram Teen accounts
now, that product, and then Snap — they
tried to add some things in the last year
or two because Congress was coming down,
once they finally were forced to try to do something.
Even in the last, um, I think week and
a half, there was a story — I'll send it
offline maybe, if you want to include
it with the notes on the podcast.
But like, there are some kids that, um,
basically did a deep dive.
These are like college kids — so
they're Gen Zers who have grown up
with all this stuff happening to
them — and they run two groups.
One's called Design It for
Us, and the other's called
the Young People's Alliance.
And they've been on Capitol Hill, you
know, arguing for safety standards
for these apps because of what's
happened to their generation.
They went and tested it, and
they were getting just unbelievable
stuff still coming straight through.
It's like they don't want the
safety stuff to work, no matter
what they keep telling people.
Um, and you know, given the math I
gave you earlier, that's probably, you know,
why they may not really want it to work —
like, they'll have to be forced to do it.
So if that's the environment, like,
is the disapproval of your
child worth the risk to your child?
I think that's the hard question.
And it's hard as a parent —
you know, taking your
kid's disapproval is one of the
hardest things as a dad you can do.
Um, at the same time, one of the most
important things you ever do as a dad
is take on your kids' disapproval,
because, you know, you have great intent
for them and you have years and years
of life lessons.
And when you second-guess yourself on
it — I mean, there'll occasionally be
the thing where they're right about
something as they get older and they
get their own means, but they're
still gonna be wrong most of the time.
And, you know, we're here to
be true to ourselves and protect them.
We're not here to be their friend —
we're their dad.
We're not their buddy,
just making them happy with stuff.
We're here to give them a fruitful
life and protect them along the way,
and even more, to fully teach them how
to get the wisdom to protect themselves.
'cause we won't always be there.
Uh, and so that's what I'd say.
So definitely, Angel Q is in the
App Store, people can go pull it down.
I think it'll, you know, do some
great things for your family.
It's not a replacement for, like, every
addictive app by design, but it is
a way that you can leave YouTube
and Google behind and have your kid
do some great stuff in a way that's
thoughtfully done and affirmative,
and learn about them in the process.
Those are the promises I
can say we're gonna keep.
But beyond that, when you're dealing
with older kids, like, do not be
bullied by — you know, bullied may be
the wrong word — but do not be
moved by this pressure on your child.
Your kid is not going to do something
horrible to themselves because they do not
have Snap or whatever, or, you know,
any of the Meta products or Instagram
or TikTok — they're not gonna do that.
But if they go on it,
because we let them on it,
you might end up with a Gavin Guffey.
I don't know if you know the situation or
know that name, but there's a wonderful
human being named Brandon Guffey.
He's a state representative
in South Carolina.
And um, he had protections in his
house, like on his devices — they had
certain standards — and he had a
17-year-old boy who was on Instagram.
And they were about to go on family
vacation the next day, and at
11 o'clock at night or midnight or so...
Um, you know, his son Gavin is on
Instagram and some attractive girl
starts really flirting with him, right,
and starts chatting with him, and then
eventually escalates it over an hour
or two online, 'cause this stuff has no
friction, it's going full speed. Uh, she
sends some pictures, then asks him for one.
And then as soon as he sends
it along, at like 1:00 AM or
12:30 — which is another point:
there really is no good use of digital
technology after, you know, probably
after eight o'clock, unless you can
lock it down to just things that
help their schooling.
And that might be possible,
but it's pretty hard.
Like, it ain't worth it.
There's just nothing.
My wife has a phrase: nothing good
happens after dark on that kind of stuff.
So — but anyway, here's how the story ends.
Uh, the supposed girl immediately flips
on him and starts blackmailing him,
says they're gonna send this
picture out to all his friends and the
rest of it, and just berates him and
tears into him for like an hour or so.
It turns out it's actually a Nigerian
gang — they're called the Yahoo Boys —
and, uh, they did this to all kinds of kids.
And Gavin, uh, in despair — he was a
pretty solid, you know, citizen, a
teenager engaged in his high school,
had a certain kind of reputation —
he killed himself.
So
Clif Marshall: Wow.
Tim Estes: his dad, you know,
had to go wake up to that.
And, uh, then — this is
how sick these people are —
they actually contacted him, the
dad, and the younger brother, and
tried to blackmail them too, after
the older brother had killed himself.
And it was
Clif Marshall: Hmm.
Tim Estes: the sickest thing.
They actually passed a law in
South Carolina called Gavin's Law.
Um, and you know, I've walked the halls
of Congress with Brandon Guffey.
I'd say he's a good, decent human
being who's taken that pain.
I mean, we're parents —
like, there's nothing worse.
Our nightmare, right, is
losing one of our kids.
Everything else, it's like, you
could take anything from me, but not that.
Right?
And this is why, you know, the
story with Abraham and Isaac
is so important, right?
There's nothing greater than
the love you've got for your kid.
And losing that is the hardest
thing to deal with in life.
So any parent that has had that happen
and finds the courage to basically
go on the march with it, take
that pain, have to relive it — and I've
met tons of those parents —
like, we need to listen to
them, and not just to what they
say we should do to protect kids.
We need to listen to what they're
also saying, which is: it really
wasn't worth it to have actually
given my kid access to that.
And,
Clif Marshall: Hmm.
Tim Estes: And I think — so I'll just
leave it with this: you've got
some great listeners on here, and, you
know, everybody's got their opinion
on this, but I'm gonna say, you
know, there's a huge virtue in just
staying strong and saying no to it.
Um, I'm hopeful in the next two to
three years or so, uh, we'll have
alternatives where, for the younger kids,
it just won't be as appealing.
It won't be as interesting.
There won't be an Instagram childhood.
Um, that's what I hope can
happen, but that's hope.
Right now, that's not a reality.
The reality of today, if your kid's facing
it, is, don't send 'em to the wolves.
Sam Acho: Wow.
Wow.
Well, Tim, thank you so much, uh,
for not just joining the podcast
or the conversation, but joining
the fight to not just protect
your children, but protect so many
children who are in the digital space.
Now, we are grateful for
your time, for your insight —
for the last, you said, about
three decades of fighting.
I think the term people called you
was, like, the digital police for a while
before you started what you're doing now.
Tim Estes: by the way.
So yeah,
Sam Acho: The RoboCop.
Well,
Tim Estes: My kids
had so much fun with it.
Sam Acho: I love it.
I love it, man.
We're so grateful for you, for what you're
doing and building at Angel Q, Angel
AI, and even the stories you shared
about your son and Angel, uh, and
also your 9-year-old son as well.
For anyone who wants to learn more about
Tim, there's information about Angel
online, but also go to samacho.com.
You'll be able to get all the info about
this podcast and so many more, and so much
more of what Tim and his team are doing.
So on behalf of me, Sam Acho, Cliff
Marshall, and Tim Estes, we say thank you.
We'll see you next time.