Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Quinn and Cantara, PYX 106. The most
(00:02):
unbelievable story of the day is not the pope passing away.
It is, no, it's Billy Ray
Cyrus and Liz Hurley dating. That is the most
striking story of the day. Cant, for me, no question
about it.
Speaker 2 (00:14):
It is pretty striking. I saw it yesterday
and didn't believe it. Yeah, just wrote it off as
some kind of clickbait.
Speaker 1 (00:20):
I'm telling you, even now, am I
getting fooled
Speaker 2 (00:23):
Here?
Speaker 1 (00:23):
Is there something going on? You think?
Speaker 2 (00:25):
Well, I never read that they were actually dating. I
saw a picture of them kissing in The Sun. The Sun
was throwing in some emojis. But the world has... hell's frozen over.
Speaker 3 (00:34):
He was just, like, booted off of a stage
for being hammered. I know, I know, there's legit concern
about him.
Speaker 1 (00:41):
Shocking story.
Speaker 2 (00:42):
She either makes really bad decisions or it's some kind
of PR spin thing.
Speaker 1 (00:48):
She makes bad decisions, I think, because of the old,
uh, Hugh Grant thing.
Speaker 3 (00:52):
I don't know. That's one thing. I don't know.
Speaker 2 (00:54):
That was, what, forty years ago? I don't even get why that's a
bad decision. What's the bad decision?
Speaker 1 (00:57):
Because he went and got a prostitute to have, uh,
oral sex.
Speaker 3 (01:01):
Her bad decision? His bad decision.
Speaker 1 (01:04):
When she should have seen it in him, right?
Speaker 3 (01:06):
You can't put that on her.
Speaker 1 (01:08):
I'm just for the sake of conversation.
Speaker 3 (01:09):
Forty years ago, kids, we used to put things on women.
Speaker 1 (01:12):
Don't worry.
Speaker 3 (01:12):
We used to do a lot of victim shaming back in the day.
Forty years ago, I—
Speaker 1 (01:15):
I didn't put anything on the woman.
Speaker 2 (01:17):
She makes terrible decisions because of Hugh Grant.
Speaker 1 (01:20):
In hindsight. Jesus Christ. Hugh—
Speaker 3 (01:22):
Grant makes bad decisions? In hindsight.
Speaker 1 (01:24):
Why are you taking such a hardcore
stance on this? I'm telling you, this is incredible.
We should really all be happy for Achy Breaky Heart guy.
I'm not.
Speaker 2 (01:34):
Hey, yeah, we saw this story. Quinn mentioned it earlier
in the news. It's worth repeating. You shared it on
the Quinn and Cantara PYX 106 Facebook page.
Speaker 1 (01:40):
It may not be too sexy right now, but it's
gonna be sexy in ten years.
Speaker 2 (01:44):
There was a half marathon where humans competed, yep, and
twenty-one humanoid robots competed.
Speaker 1 (01:51):
Which is fabulous, isn't it. It's amazing.
Speaker 2 (01:54):
Most of them fell down or burnt out. The fastest
one was an hour and forty minutes behind a human
and that's after three battery changes.
Speaker 1 (02:04):
Oh my god.
Speaker 3 (02:05):
Six of the robots finished the half. Six
of them finished. Six finished.
Speaker 1 (02:09):
So I mean, the whole point is, this is,
again... there's always the first time, you know,
the first time that we do this, and we're
going to see how AI affects things. It's gonna be insane.
We're all gonna be running from robots at some point,
or I guess running with them.
Speaker 3 (02:26):
I don't disagree with that.
Speaker 1 (02:27):
Yeah, no, get ready.
Speaker 2 (02:28):
My issue is this: like, we have people, you, and
then people higher than you. Quinn? No, no, like
with positions within iHeart.
Speaker 3 (02:39):
Oh, okay. Like, just hammering us to utilize this AI
that iHeart pays for.
Speaker 1 (02:44):
Yeah, I love it.
Speaker 3 (02:44):
AI can't even finish the race. AI can't even
beat the fattest human.
Speaker 1 (02:49):
I don't know, are these all AI robots?
Speaker 3 (02:52):
I'm lumping them in, Okay.
Speaker 1 (02:54):
A professor of computer science for AI and robotics at Oregon
State... I think it's a combination.
Speaker 3 (02:58):
"You need to write a promo? Put it in AI."
But why? I've been writing promos for forty years.
Speaker 2 (03:03):
I got it.
Speaker 1 (03:03):
One robot did have three guys running behind it with
a remote control.
Speaker 3 (03:06):
I don't like that. That's not a robot.
Speaker 2 (03:08):
You're flying a drone. I know. The other robot had
a leash.
Speaker 3 (03:12):
On it because it kept tipping over.
Speaker 1 (03:13):
Now, I imagine that there are robots that are much
more advanced, that we have for, you know, military purposes,
that we don't know about yet. I'd like to think so.
So I think maybe they are the ones that don't
fall down as much.
Speaker 2 (03:24):
Before you tell me to put every egg in
the human-race basket into AI, let me see
one of them beat a human at a race.
Speaker 1 (03:33):
Then you're too late, my friend. Then you're too late.
Speaker 3 (03:36):
I don't care.
Speaker 1 (03:36):
Okay, all right. I like your stance. I like your stance.
You're never gonna win this one.
Speaker 3 (03:41):
I think you guys think Google is AI.
Speaker 1 (03:44):
It's the same thing. Google has an AI.
Speaker 2 (03:47):
I don't like it because you can talk to it.
Speaker 1 (03:49):
It's because it talks back to you. That's why. Yeah, yeah,
I love it. Yeah, absolutely.