I Found Out My Favorite Influencer Wasn’t Human | The Rise of AI Illusions
Stay in the Gray Podcast · March 22, 2025
00:33:46 · 30.93 MB


What do you do when your favorite Instagram model turns out to be… a codebase?

In this episode of Stay in the Gray Podcast, Ryan returns from the mountains and dives headfirst into the rise of AI—from fake influencers and manipulated media to the looming takeover of jobs, journalism, and politics. It’s funny, it’s creepy, and it’s closer than you think.

🎙 Topics include:
– The AI-generated model that fooled Ryan
– AI in politics, content, and social platforms
– How artificial intelligence is reshaping work and identity
– Why we need boundaries before the line disappears completely

This episode blends comedy and concern, because honestly—what else can you do?
🔥 Hit Follow if you’re not ready to be replaced by a chatbot just yet.


Chapters:

00:00 - Intro

01:33 - Welcome Back! Vacation Recap

01:51 - A Funny Story About AI

04:45 - AI in Politics: A New Era

14:51 - AI and Social Media: The New Influencers

21:22 - AI's Impact on Jobs

26:17 - The Future of AI in Everyday Life

32:06 - Final Thoughts and Farewell



00:00:01
The person who started the page, if you will, responded well,

00:00:05
that'll be a difficult interview seeing that Lucy, the name of

00:00:08
the girl is completely AI generated, but nothing.

00:00:12
I was unaware that you could make a 100% AI generated woman

00:00:16
and she's perfect. I just thought this woman, it's

00:00:19
perfect. Would you be as worked up by

00:00:21
things that I say if you knew I was just a computer generated

00:00:24
program? So I'm hoping that this industry

00:00:26
is one that cannot be overtaken. And if we get to the point where

00:00:29
we just kind of listen to different AI thoughts and ideas

00:00:33
on podcasts and on radio and on television and on all these

00:00:38
things, YouTube, all the platforms, then that's scary.

00:00:42
And we've become the very robots that we are listening to.

00:00:57
Welcome to the Stay in the Gray Podcast.

00:00:59
I'm your host Ryan, and I'm here to bring you raw conversations

00:01:02
about life and those undeniable human truths.

00:01:06
We dive into the chaos one issue at a time, blending comedy,

00:01:10
controversy, and the unexplainable.

00:01:17
Get ready for transparent ideas straight from the Gray areas

00:01:20
where most answers are hiding. So stay curious, stay

00:01:26
inquisitive, please stay laughing, stay in the Gray.

00:01:30
Come check it out. We'll see you then.

00:01:34
I'm back. Welcome me back people.

00:01:38
Much needed vacation, a week in the mountains.

00:01:42
You saw some of the clips posted.

00:01:45
It was great, relaxing, cleared my head.

00:01:48
I have a fresh perspective on life, if you will.

00:01:53
I'm going to go into some stuff today and I've talked about it

00:01:55
briefly before, and I'll lead off with a story about myself

00:02:01
so you can make fun of me. You can call me naive.

00:02:05
My best friend calls me naive all the time to technology, to

00:02:10
anything IT computer related and especially now with AI.

00:02:15
So I've talked about AI before, but I'm going to do a little

00:02:18
different, a little differently now because the other day, well,

00:02:24
I take it back longer ago than the other day, but I came across

00:02:27
this profile of a lovely young woman on Instagram.

00:02:32
And I'm a happily married man. But she was so gorgeous that I

00:02:37
had to go and tell my wife because I felt guilty for

00:02:40
looking at her. I felt, I felt every time I saw

00:02:43
her, I was like, Oh my God. And then she reached out and she

00:02:45
would reply to some of the, the likes or the comments, or

00:02:48
that she'd, you know, and, and there was discussion and it was

00:02:51
completely innocent, obviously, but it was still there.

00:02:54
So I said, by the way, to my wife, you know, I don't know who

00:02:59
this person is, but you have nothing to worry about.

00:03:02
And it turns out I don't have anything to worry about.

00:03:04
Because when I reached out to ask if she would come on the

00:03:06
show to talk about what it was like to be an Instagram

00:03:11
model and to make money on Instagram and to be sought after

00:03:15
for being ridiculously hot. To her credit, the person who

00:03:20
started the page, if you will, responded, well, that'll be a

00:03:23
difficult interview seeing that Lucy, the name of the girl is

00:03:27
completely AI generated. You obviously saw this coming

00:03:30
because I prefaced it that we were going to talk about AI.

00:03:33
However, I felt like an idiot. It was probably about a month

00:03:36
where this woman's pictures would come up and, you know,

00:03:39
obviously they were risque and they were all the, you know,

00:03:41
great. And turns out she's not even

00:03:44
real. Now I admire the art or, or the

00:03:47
the AI, the ability for AI to do this.

00:03:53
Don't get me wrong, it's fun to look at, but at the same time

00:03:57
it's not the same. And I don't mean to take away

00:04:00
from what this young lady who said she's from Canada and she

00:04:04
wishes she looked like her creation.

00:04:06
So I felt bad in that regard. But it's just it's crazy that I

00:04:13
could sit there and be so mesmerized by something that's

00:04:15
completely fake. Obviously there are women that

00:04:18
put their picture up and they and they alter it.

00:04:21
They make themselves look nice with AI generated.

00:04:25
All the different stuff that they do with their pictures, all

00:04:27
the you guys know what I'm trying to say, the filters.

00:04:31
Thank you very, very much. I couldn't remember and for all

00:04:34
the filters and everything, but nothing.

00:04:37
I was unaware that you can make a 100% AI generated woman and

00:04:43
she's perfect. I just thought this woman was

00:04:46
perfect. I don't know.

00:04:47
And so it, it kind of got me thinking about AI again.

00:04:50
And I, and I'm going to go into a few things today and let's get

00:04:55
it out of the way. AI when it comes to politics,

00:04:58
politics has taken over the world.

00:05:01
Probably more accurate to say social issues have taken over

00:05:05
the world because of politics in the campaigning, even with AI

00:05:09
generated speeches and things like that.

00:05:13
And there are stories that I had no idea about, that now I'm

00:05:17
looking, they're out there and I, I never heard about

00:05:20

them, but there, there's news and information being given and

00:05:24

all of it is AI and, and it's crazy to me.

00:05:28
So, so tell me that I'm silly for not knowing about these

00:05:31
stories. Case number one was a Joe Biden

00:05:35
phone call in New Hampshire that people were receiving.

00:05:39
And this guy Stephen Kramer, apparently he's already been

00:05:42
convicted. So I'll say his name.

00:05:44
So this came from the Department of New, excuse me, the New

00:05:46
Hampshire Department of Justice. And basically it was he was

00:05:50
trying to suppress the turn out for the primaries in New

00:05:54
Hampshire. Thousands were called and they

00:05:56
were asked to save their vote till November because this

00:06:00
Tuesday, the primary vote didn't matter.

00:06:03
Let's get it in November, make sure.

00:06:06
And so of course he got caught and but but it was Joe Biden's

00:06:11
voice. It was apparently pretty good.

00:06:15
I never heard it, but from everything that I read about it,

00:06:18
people were fooled. Not that everybody, you know

00:06:22
what, why would you think otherwise?

00:06:25
But we have to start thinking about it now.

00:06:26
And that's that's what got me so excited to talk about this

00:06:30
because it's the possibilities are crazy and the possibilities

00:06:33
are endless. And I tell you what, that's just

00:06:36
one case. Second case, this is BBC that I

00:06:38
was looking at. There was a Zelensky and a Putin

00:06:41
separate videos of these two leaders in a war.

00:06:44
Currently one of them, Vladimir Putin, who has been known as the

00:06:48
aggressor and the leader of Russia, in case

00:06:51

you needed that information, calling for peace now.

00:06:54
Oh, we're sorry. We didn't mean to do this.

00:06:56
And obviously it wasn't Vladimir Putin.

00:06:59
Volodymyr Zelensky, the Ukrainian president, there's a video of

00:07:03
him using AI that where he's basically saying to all of his

00:07:08
people, put down your weapons, we're going to surrender, which

00:07:12
is not what he intended to do and still doesn't intend to do.

00:07:15
And of course, everybody got all, all crazy and say this is

00:07:20
horribly done. How would anybody fall for this?

00:07:22
Well, I guess a few people did and, and, but the fact that

00:07:26
everybody was so ready to jump on the anybody that believed it,

00:07:31
it was a little outlandish because it was that bad.

00:07:33
But the point is, is, is it going to get better?

00:07:35
Is it going to become more accurate?

00:07:37
Are there going to be more and more and more and more of these

00:07:41
to where people start believing it?

00:07:44
And when you're across the world and you see something like this

00:07:47
spread on social media now, apparently Meta and X and all of

00:07:53
them, and YouTube, apparently all took these down, which

00:07:57
which they should, I mean, if it's complete, especially with

00:08:00
something like that now, where's the line with parodies and memes

00:08:04
and videos and all that? Where's the line?

00:08:06
And that's what you have to be careful of because I definitely

00:08:08
don't want free speech taken away and I don't like

00:08:11
censorship. But I suppose when there's an

00:08:15
actual war going on and there are fake videos from the two

00:08:18
leaders, maybe you should go ahead and use your authority to

00:08:23
take care of that. So those are a couple cases,

00:08:26
But, you know, I didn't even know about those.

00:08:28
And that's what's crazy. So should it be allowed in

00:08:31
politics? That's the question.

00:08:33
How would it be monitored, though?

00:08:35
That's my question. I don't understand how there

00:08:40
could be. I mean, you can keep track of

00:08:42
it, I guess to a point, if you had the right authoritative

00:08:46
groups and authoritative figures that would be able to somehow

00:08:50
monitor this. But I don't think it's 100%

00:08:52
foolproof. Do you?

00:08:54
I don't know. So I don't know if this is the

00:08:57
right question. I don't know who can even ask if

00:08:59
AI should be allowed because I don't know if it can be

00:09:01

monitored. But can you imagine the debates?

00:09:04
I mean, fact checks are accurate and instant.

00:09:09
First off, we'd never get through a debate because both

00:09:12
candidates, assuming that they're similar to the ones

00:09:16
we've been having, are fudging the truth.

00:09:19
I don't, I can't imagine that these days with fact checks that

00:09:23
either candidate knowingly flat out lies.

00:09:26
They might embellish thinking that it'll slide by.

00:09:29
And I think that's what happened a lot on both sides.

00:09:32
Possibly with Joe Biden. There was a little mental, you

00:09:36
know, stuff going on there. I don't know for sure, but that

00:09:40
seems to be the general consensus.

00:09:42
But I just feel like it would be better to Fact Check using AI

00:09:48
than it is with these people who can can be biased.

00:09:52
I think that's what happened in two of the three debates.

00:09:54
And I think that at least we'd be able to eliminate that part

00:09:58
of it. I can't imagine that AI holds

00:10:00
any bias at this point. That could change.

00:10:03
But, you know, at this point, speeches, I mean, I have it

00:10:06
written down. Your speeches.

00:10:08
Debates first, then speeches. I mean, right now you're you're

00:10:11
listening to speeches. I think the candidates should

00:10:13
have to write their own and none of this going to ChatGPT for

00:10:17
your presidential speech addressed to Congress or

00:10:21
campaign speeches or you should have to do it yourself.

00:10:24
I think that benefits Donald Trump because I don't think he

00:10:26
gives a shit. He'll just go up there and say

00:10:28
whatever he wants. I think that's one of the things

00:10:30
that's appealing about the man. If you're choosing something.

00:10:33
I think Joe Biden had to write his own speech.

00:10:35
Kamala, write your own speech, Donald, write your own speech.

00:10:38
But they're not. And at least there's a level

00:10:42
playing field that everybody can use AI, but it's just not

00:10:45
genuine. It's not sincere.

00:10:46
And I think everybody knows that it's out there.

00:10:49
And I mean, it's to the point where one of my really close

00:10:51
friends, not long ago, we were in a disagreement and, and, and

00:10:55
he wrote, he said, you know, how did you feel about my response?

00:10:59
Because I used ChatGPT to answer you.

00:11:02
And I said, yeah, I thought that.

00:11:04
I thought that was pretty good. What you you know, how you wrote

00:11:07
it? It was very, very well done and

00:11:10
but so is it better or worse that somebody's able to do that?

00:11:13
Maybe the point got across better, But at the same time,

00:11:16
are you you're taking out true emotion, your true genuine

00:11:21
feelings, I suppose if you're just shifting the words out or

00:11:24
are you, are you able to look through AI and say, all right, I

00:11:27
can see that he's genuinely affected by this.

00:11:31
So that that that's my thought. I mean, you know, are we going

00:11:36
to have... 2036? I guess it would be '32 and '36. '36 or '32?

00:11:41
Geez, the way we're going, what if we just had two AI

00:11:44
presidential candidates and decisions were just made all by

00:11:47
AI? It's getting scary.

00:11:49
People, you know, can kind of chuckle at that, but it's

00:11:51
getting scary. I don't want that.

00:11:54
The human element needs to be there.

00:11:56
And we'll talk about more examples of that.

00:11:58
I just, there've been too many books written about this that

00:12:03
were way before their time and some of it's coming true.

00:12:06
You know what they are Brave New World 1984 and so on and so

00:12:11
forth. But how about the news?

00:12:12
The news I just realized, I don't know why I didn't think of

00:12:16
this, but I don't know why I would have seen it.

00:12:19
The, the China's already been delivering the news and

00:12:23
government approved information to be delivered by AI.

00:12:27
The idea was, I guess 2018 was kind of the first one.

00:12:30
I think, but don't hold me to that. The Xinhua News Agency in

00:12:35
2018. And the idea was that they

00:12:37
wouldn't get tired. You wouldn't have, you know,

00:12:39
anchors that made mistakes. You would have just 24 hours a

00:12:43
day, you know, news stories and anything they, the government

00:12:48
felt you should hear, which doesn't surprise me with the

00:12:51
lovely country of China. But there's an example where

00:12:53
they came on and they had an AI person just bashing this one of

00:12:58
the the presidents of of Taiwan or the candidate in order to

00:13:01
help promote the other. And it was, it was interesting

00:13:06
to read about that. But you know, the issue I have

00:13:09
with it when I looked at an example of it is it looks real.

00:13:13
It looks like there's a person sitting there talking, but it

00:13:16
still sounds robotic. It mean it sounds like what it

00:13:18
is, which is a computer basically giving you

00:13:20
information. And I know that they're

00:13:22
developing that. I don't necessarily want them to

00:13:25
develop that. Are we going to get to the point

00:13:28
to where our brains are so programmed to take in robotic

00:13:32
sounding words and robotic sounding information that we

00:13:35
just accepted that sounds boring as hell?

00:13:39
I don't want that. That sucks.

00:13:41
Count me out. I don't want this AI, this AI

00:13:44
guy telling me the news. I want some, you know, regular

00:13:48
person that fucks up every once in a while.

00:13:51
You're telling me what's going on?

00:13:53
I don't mean weatherman. I want a weatherman that's

00:13:55
passionate when he sees that a tornado's on his way here, get

00:13:58
all freaked out like you do. It's hilarious, the people, in a

00:14:01
way, the tornado is hilarious. But these meteorologists, it's

00:14:04
like they're waiting for some disaster to be like, oh, this is

00:14:07
my time to shine, baby. And it always is entertaining to

00:14:11
me. But take that away.

00:14:13
What do you have? I don't know, I don't mean to be

00:14:16
this, I didn't think I would be this passionate about it, but I,

00:14:19
I kind of am. I'm starting to go, this is not

00:14:21
good. So anybody that thinks it is and

00:14:24
we'll get to that, let me know. Tell me, tell me that things

00:14:28
will be better. Tell me why they'll be better.

00:14:30
The last time I talked about AI, I went through pros and cons and

00:14:35
this time I won't because they're already been given, but

00:14:37
they're kind of obvious. And I'm, I'm giving them to you

00:14:39
through these stories. Italy just released an AI

00:14:44
edition of one of their publications there.

00:14:47
And so journalism's taking a hit over there.

00:14:49
But is it more accurate? Maybe.

00:14:51
I don't know. Let's go to influencers on

00:14:54
social media. I just talked about one, the

00:14:57
lovely Lucy, who, you know, made a married man swoon.

00:15:03
Please don't worry. My wife thinks it's hilarious.

00:15:05
She's she's not worried. She has no reason to worry.

00:15:09
But these influencers, and, and I'll admit I don't know a lot

00:15:12
about these, these people or these influencers.

00:15:16
Lil Miquela, Aitana López, and Emma, or Imma, spelled Imma.

00:15:22
Apparently these are three of the largest influencers who

00:15:26
happen to be AI and there are millions.

00:15:30
It says most of them still think they're real.

00:15:32
I don't know how you attract that.

00:15:34
And if I don't know if I would, I would probably lie.

00:15:36
If I found out, I'd be like, yeah, I knew they were fake.

00:15:39
But people, people don't. And unless you look into it, is

00:15:42
there a reason for you to think they're fake?

00:15:45
Unless I guess you go to the comments section or something

00:15:47
and see other people talking about it.

00:15:49
But again, you see something and it looks real and it sounds real

00:15:53
to a point. You go with it.

00:15:55
No one's questioning it. And so they endorse products and

00:15:58
they shape trends and they even interact with the fans.

00:16:04
Sure, AI can answer anything and I don't know who's pulling the

00:16:09
strings. I don't know how smart AI is.

00:16:12
I don't know if if there there can be something automatic to

00:16:15
people on social media who are who are following that

00:16:18
influencer. But to me, I don't like that.

00:16:22
I don't want somebody somebody. I don't want responses to my

00:16:27
work, the comments to to the things I'm talking about to be

00:16:33
AI generated. That defeats my purpose.

00:16:36
I mean, I suppose it can boost the algorithms, but I think it

00:16:39
also hurts them. You know, let's talk about

00:16:42
things like TikTok and and Instagram and all these bot

00:16:44
accounts. That's an issue.

00:16:46
Can you imagine if all these bots had faces and all these

00:16:49
bots were actually even harder to identify?

00:16:54
They're hard enough now and they're annoying enough now, but

00:16:57
imagine it being even more difficult.

00:16:59
And, and you know, that sucks because they hurt people who,

00:17:02
who are trying to grow and they cause a lot of problems on

00:17:05
social media. I've talked about being scammed

00:17:07
in the past and recently they're, you know, they keep

00:17:10
trying different people and it's AI.

00:17:13
It seems real after you look into it, it turns out that it's

00:17:18
just somebody that's on some sort of program and they, they

00:17:20
know exactly what to say, when to say it, how to respond to

00:17:24
the, to your, to your hesitancy and your doubts.

00:17:27
So that's a little scary, but you know, when it comes to bots,

00:17:31
I mean, you're also talking about what are known as troll

00:17:33
farms. I mean, I, that's a perfect,

00:17:36
perfect phrase to describe what I deal with on a day-to-day

00:17:40
basis on social media troll farms and a lot of them are

00:17:44
bots. And so one of the reasons I've

00:17:47
backed away from I talked about it on my videos from Colorado.

00:17:51
I talked about it on past episodes from season 4.

00:17:56
And one of the reasons that I'm trying so damn, so hard to

00:18:01
back away from the troll responses, from responses that

00:18:07
have no substance because some of them aren't even people.

00:18:11
Some of them are bots that are just trying to to do what

00:18:14
they're doing, which is troll. And so people get everybody

00:18:16
wound wound up and I fall for it.

00:18:19
I'm sure some of them are assholes and some of them are

00:18:21
real people. But overall, why waste my time

00:18:25
with it? Why get myself worked up?

00:18:27
I don't want to have my fate be because of some stupid assholes,

00:18:32
especially some stupid bot assholes that are on social

00:18:35
media. So.

00:18:36
So let's go back to Lucy. Lucy, Lucy, Lucy, you fake, you fake

00:18:42
hottie. People are falling in love.

00:18:45
Well, what they think is love with these AI social media

00:18:50
models. In discussion with the creator

00:18:54
of Lucy, again, I won't give much more information than that

00:18:58
until I hopefully get her to maybe answer some questions

00:19:02
about the process because I am very curious.

00:19:05
And if you are one of those AI influencers or the person who

00:19:10
creates an AI generated woman, is that wrong?

00:19:15
It's legal as of now. Is it wrong?

00:19:18
Should you take backlash? And these are the types of

00:19:21
questions that I would love to ask her.

00:19:23
But in a short conversation with that, she admitted to me that

00:19:29
there have been men that have written to her and they were not

00:19:34
happy. They were threatening her.

00:19:36
They were, I mean, that's about it.

00:19:39
They were threatening her. I mean, they, they were so

00:19:42
mesmerized by this woman and the back and forth that this woman

00:19:47
was giving to them that I don't know if they thought they had a

00:19:50
chance with the beautiful Lucy or if they felt taken advantage

00:19:57
of or, or betrayed or, or however.

00:19:59
But I imagine you have to look out for that and you have to be

00:20:02
careful. But these days, I don't, I don't

00:20:05
know how somebody finds you, but you have to wonder.

00:20:07
And so to me, that was very interesting that people are

00:20:11
taking it to that level. And of course they are.

00:20:14
I don't know why I'm surprised, but some of these models, again,

00:20:18
I'm using my quotes models, these AI models are they have

00:20:24
OnlyFans accounts, or similar platforms like

00:20:28

OnlyFans, and they have these accounts where people are paying

00:20:32
monthly fees to see AI generated women.

00:20:37
I mean, I guess if people, people enjoy that, I mean, I

00:20:39
don't like I said, I don't mind looking at it.

00:20:42
It's different to me than if she was a real woman.

00:20:47
But people are paying, you know, $10 a month to look at, you

00:20:51
know, say, I mean, it's almost like you're looking at some sort

00:20:53
of cartoon. I mean, it's not, it looks very

00:20:56
real, but it's still, it's not, it's not real.

00:21:00
If any of you have subscription to one of these, please let me

00:21:03
know because I'd love to interview you.

00:21:05
I'd love to have you on and, and learn about it, and no

00:21:08
judgement. I mean, that's your thing.

00:21:11
There's no judgement. I, I really just genuinely want

00:21:13
to know what it's like to, to pay for something that isn't

00:21:18
real. And I guess we do it with a lot

00:21:20
of things all the time. So I can't judge.

00:21:23
So how about jobs? I talked a bit about jobs

00:21:27
before, so I'll do it quickly, but there are so many jobs.

00:21:32
And as I looked down a list of jobs that are being affected by

00:21:37
AI, you just, it just keeps going.

00:21:40
You don't realize how many. It's like when you talk

00:21:44
about a band that's that was popular, like wow, they had this

00:21:46
many hits. You just don't realize it.

00:21:48
That's the same here. And I think it's ironic that in

00:21:53
order to find out this information about jobs that are

00:21:56
affected by AI, I am using ChatGPT to help me.

00:22:03
So is that irony? I guess so how about

00:22:07
photographers and illustrators? Well, there's actually one

00:22:10
first. I thought this was interesting

00:22:13
so I put it first. The Arizona Supreme Court

00:22:16
actually uses AI to disseminate news regarding rulings and

00:22:21
enhance the understanding for the everyday man and woman.

00:22:24
And I thought that was interesting.

00:22:27
So it's it's at a high level as well.

00:22:30
But how about creative? It's similar to what I do and

00:22:34
the things I enjoy. Photographers and illustrators

00:22:36
being replaced by AI or is AI better at it?

00:22:41
Maybe on the technical aspect of it, but where, where does that

00:22:45
end? Do you miss having the human

00:22:49
element? The biggest argument in sports

00:22:52
would be, in baseball or softball, to have AI doing

00:22:56

the umpire's job, calling balls and strikes.

00:23:01
A lot of people, traditionalists really, you know, baseball

00:23:05
enthusiasts, most of them that I've talked to, do not want a

00:23:08
computer. It would make it accurate.

00:23:10
Anytime you sit there and go, man that UMP was terrible behind

00:23:13
the plate or else maybe we would have won.

00:23:15
Can't do that anymore. However, the human element of

00:23:19
the game, and the umpiring, the reffing part of the game, is

00:23:24
now taken away and maybe it takes some of the passion for

00:23:29
the game away as well. I don't know.

00:23:31
What do you think? Should umpires be replaced by AI

00:23:34
in baseball? Let me know.

00:23:36
I mean, it's, it's interesting, but all these are interesting.

00:23:38
So in China, 70% of video game designers are gone.

00:23:43
AI taking over. How about translators?

00:23:47
I mean, I know that that's not a huge industry, but it's big

00:23:49

enough to feel the effect. Customer service.

00:23:54
Some businesses have gone, 90% of their staff is gone because

00:24:00
they're now using AI. That's crazy.

00:24:02
I don't know if I blame them and why, why?

00:24:04
Why would you not? You're just taking advantage of

00:24:08
a system, taking advantage of technology.

00:24:11
We do it in many ways with technology and this is just

00:24:14
another one of them. Manufacturing and factories and

00:24:18
docks and you know, shipping and things like that.

00:24:20
That is a big one. That's a lot of different jobs

00:24:25
in that in that field and to replace a lot of that with AI

00:24:30
will feel it. The food servers industry.

00:24:34
Hello, I don't know what that was.

00:24:36
Hello, drive through restaurants again, you're you're becoming

00:24:41
robotic. Instead of having, you know,

00:24:45
somebody talking to you in a drive through, they're getting

00:24:48
it to where now you can just self order and then go pick it

00:24:51
up in restaurants. They're getting to the point

00:24:53
where you know, they're putting those things down the table.

00:24:55
Now where you can order, you can pay, you can do all these things

00:24:58
that were normally done by waiter or waitress.

00:25:00
Tips will be affected. I know a lot of people like why

00:25:04
am I tipping? This person literally just

00:25:06
brought me water and then told me to do everything on this

00:25:09
little computer here. I don't disagree.

00:25:12
And that's a problem for people that work in the industry.

00:25:16
There's a bunch administrative roles.

00:25:18
Of course, retail, like all the checkout, you know, hardly anybody's

00:25:21
checking you out anymore, you're doing it yourself from the self

00:25:23
checkout, journalism and content, content creation.

00:25:27
This is a big one for me because I'm a writer.

00:25:29
Can anyone just put in an idea now and write a book?

00:25:32
Will people be able to tell the difference?

00:25:35
That's the big one. I hope so.

00:25:38
I really hope that AI will not push out the creative minds when

00:25:44
it comes to filmmaking, storytelling, novel writing, any

00:25:49
type of, of, of creative writing.

00:25:51
And that scares me and maybe one of the main reasons that I'm

00:25:54
against developing AI, I suppose, is the best way to put it.

00:25:59
Legal services, finance, duh. I mean, we use calculators and

00:26:04
all that shit all the time. I think that's OK.

00:26:07
This is a little different government agencies, obviously

00:26:10
we're going through a lot of that right now for different

00:26:12
reasons in in eliminating government roles, but this is

00:26:16
another one that will add to that.

00:26:19
Lastly, let's talk about what I do here, which is podcasts.

00:26:26
I am of the opinion that I don't believe that AI can overtake a

00:26:34
podcaster because of all of the things I've just said.

00:26:39
If you look at me, you know I'm not AI, I hope.

00:26:43
I mean, I know this looks perfect, but I am not AI.

00:26:49
If I was and you could tell, would the things I say to you

00:26:55
matter as much? Would they mean as much?

00:26:58
Would there be passion behind it?

00:27:00
All the things you guys like to bitch at me about on the comment

00:27:03
sections, would you be as worked up by things that I say if you

00:27:08
knew I was just a computer generated program?

00:27:12
So I'm hoping that this industry is one that cannot be overtaken.

00:27:17
And if we get to the point where we just kind of listen to

00:27:19
different AI thoughts and ideas on podcasts and on radio and on

00:27:26
television and on all these things, YouTube, all the

00:27:31
platforms, then that's scary. And we've become the very robots

00:27:38
that we are listening to. So lastly, before I end this

00:27:43
evening, I'm going to give you a, a few statistics that some of

00:27:49
them weren't that surprising to me.

00:27:50
And, and, and a couple were. And the Pew Research Center.

00:27:54
Pew in 2023 did a study and 52% of Americans said they were

00:28:01
concerned more than excited about the direction and the

00:28:05
efficiency of AI. Only 10% of Americans are

00:28:10
excited and the rest are neither excited nor concerned.

00:28:17
I love those those those polls that you take where it's like,

00:28:19
you know, the middle is neither agree nor disagree.

00:28:22
OK, well then why am I doing this God damn question anyway?

00:28:25
The Gallup Telescope survey. First off, what the fuck is

00:28:28
that? A telescope survey?

00:28:31
That does not make me feel OK. Is there somebody with the

00:28:34
telescope looking right now and surveying what I'm doing?

00:28:38
Because according to this, 99% of Americans use AI enabled

00:28:43
products within a week's time and 2/3 of those people don't

00:28:49
even realize that they do it. Again, how are they getting this

00:28:52
information? The idea that it's called a

00:28:54
telescope survey does not make me feel OK.

00:28:58
I do not feel comfortable with this.

00:29:00
However, I digress as I'm just giving you statistics.

00:29:04
That's interesting as well though. 99% use AI-enabled

00:29:10
products in a week's time. Here we go people.

00:29:14
It's happening. Lastly, the Edelman Trust

00:29:17
Barometer, and this doesn't surprise me at all.

00:29:20
The wonderful Chinese, our neighbors to the east, if you

00:29:23
will, 72% of these people are encouraged.

00:29:28
They trust AI. In America, the United States,

00:29:33
only 32% trust AI. Some people, when they think of

00:29:37
China, and China, don't get mad at me, I'm just telling you how

00:29:41
it is. A lot of people think of China

00:29:44
because of how many people are there and the fact that a lot of

00:29:47
this technology is happening there, being invented there,

00:29:52
created there. They're one of my examples with

00:29:55
the news being driven by AI. But we get this image of China

00:29:59
in the cities of China and the people walking around and

00:30:02
they're very robotic. They just are.

00:30:05
And a lot of it's just the laws over there, the way

00:30:09
people are brought up and there's nothing wrong with it.

00:30:11
It just seems to be the stereotype of Chinese people.

00:30:15
So this doesn't surprise me because maybe there's a comfort

00:30:18
zone there. Maybe there is a place

00:30:20
where they can say, hey, you know, we don't mind this. AI is

00:30:24
great. It's accurate.

00:30:26
We don't have to question it. And they're comfortable with the

00:30:29
robotic-style approach, however you want to say

00:30:33
it, versus Americans who, you know, aren't quite like that

00:30:36
yet. We're not conforming to that.

00:30:39
And I believe there's still a lot of emotions that supersede

00:30:45
that robotic technological element.

00:30:48
And I don't know if that made any sense, but it did to me in

00:30:51
my head. So good luck with that.

00:30:54
So I just thought I'd

00:30:57
throw some actual statistics and studies at you.

00:31:00
So you know I'm not full of shit, at least on this. But

00:31:04
I'll end here. I think that's something to

00:31:06
think about. It's just crazy to me that it's

00:31:09
kind of being ignored to a point.

00:31:10
I mean, we all know what AI is. We hear about it.

00:31:12
We're fascinated by some of the things that it can do.

00:31:15
I've posted some of it before. The lovely AOC and Elon Musk

00:31:20
romance video was AI generated.

00:31:25
A lot of it is. And I don't know to

00:31:28
what level it was, but it had to be something, right?

00:31:30
And so when you look at things like that, you're entertained by

00:31:35
it, but then you don't think big picture, whether it levels out and we

00:31:39
keep it under control. Not to give the

00:31:44
plot of a sci-fi movie like I, Robot or some of these where the

00:31:50
robots finally turn on us. If we can maintain it and

00:31:53
control it at this level, or at least at a level that we can

00:32:00
progress with it, then OK, let's enjoy it.

00:32:04
But I'll tell you what, it's just something to keep our eye

00:32:07
on as we move forward. So with that being said, I'm

00:32:11
glad to be back. Give me a like, subscribe,

00:32:15
comment, let me know. Trust me, I don't

00:32:18
even need to say comment. I know you people will comment

00:32:21
because I get tons of them. I haven't even looked

00:32:25
yet. But I released a clip today

00:32:28
where all I said was why does everybody have a problem with

00:32:31
the America First mindset?

00:32:35
And I tell you what, I haven't even looked.

00:32:36
But it's easily one of the most commented clips that I've gotten

00:32:39
so far 'cause I just kept seeing them pop in comment, comment,

00:32:41
comment. Oh great, here we go.

00:32:43
So I love the interaction. All in all.

00:32:46
Hopefully some of those comments are just fucking bots, who I

00:32:49
hate and who I won't respond to. But a lot of them are genuine

00:32:54
and I enjoy that part of it. So come at me, challenge me, and

00:32:58
definitely agree. If you do agree, please let me

00:33:01
know so that I don't look like everybody hates me.

00:33:05
My ego is fragile. Not really, but hope you enjoyed

00:33:09
and I'll see you in the next one.

00:33:12
I'm back. No more vacations for a while

00:33:15
and we'll get it rolling. Season 5 is underway, and I

00:33:19
hope you enjoyed the ski videos, and we'll have a lot of

00:33:23
guests. I've got so many ideas and I'm

00:33:25
trying to get it done. Until then, I'll keep doing

00:33:29
these and hope you enjoy. So again, like I always say,

00:33:33
stay inquisitive, stay laughing, stay in the Gray.

00:33:37
I'll see you next time. Love you.