They Might Be Self-Aware

Would You Marry Your AI Girlfriend? Robots, Romance & Legal Trouble Explained

Episode Summary

They Might Be Self-Aware Podcast (TMBSA) - EPISODE 78 POUR YOURSELF A BOWL AND TUNE IN! This week, we crunch into why adults and LLMs just can’t grasp the Cinnamon Toast Crunch obsession—hint: it’s all about the sugar rush! Are you ready to say “I do” to your AI girlfriend? Turns out, 80% of men are, if it were only a tad more legal. In the realm of robots, imagine choosing between a sassy Rosie from The Jetsons or a sleek Terminator to handle your kitchen chores. Customizable humanoid robots are here to make it a reality. Meanwhile, could AI be orchestrating a legal system blitz? Employees are unleashing a flurry of lawsuits on employers, potentially drowning courts in a sea of frivolous claims. Will AI judges and arbitrators step in to adjudicate the madness? Discover if your future marriage, lawsuit, or kitchen buddy will be a chip off the old AI block in this episode of "They Might Be Self-Aware!" Your future might depend on it! For more info, visit our website at https://www.tmbsa.tech/

Episode Notes

00:00:00 Intro
00:00:42 The Cinnamon Toast Crunch Mystery
00:06:52 Marrying Your AI Girlfriend?
00:13:28 Dating Your Kitchen Robot
00:19:46 AI’s Legal Flood: Frivolous Lawsuits Ahead?
00:27:52 AI Judges: Would You Trust Them?
00:33:36 Wrap Up

Episode Transcription

Daniel Bishop [00:00:42]:
Hunter.

Hunter [00:00:43]:
Daniel.

Daniel Bishop [00:00:44]:
I'm hungry.

Hunter [00:00:46]:
Hi, hungry. I'm Hunter.

Daniel Bishop [00:00:48]:
Yeah.

Hunter [00:00:50]:
Was that.

Daniel Bishop [00:00:51]:
I don't know what really the. Yeah, there was an. There was a response expected there. I don't know what it was. I don't think it was that. When's the last time that you bought like a children's cereal?

Hunter [00:01:04]:
Children's cereal.

Daniel Bishop [00:01:07]:
And you know what I mean by children's cereal? You're like, Trix and Frosted Flakes and.

Hunter [00:01:11]:
So on, I guess. Well, over this last weekend, my niece and nephew were visiting, so.

Daniel Bishop [00:01:16]:
Okay, yeah, great.

Hunter [00:01:17]:
What'd you get? Fruity Pebbles were in the mix.

Daniel Bishop [00:01:21]:
The milk flavored by Fruity Pebbles. Absolutely best. I think that's the best. After milk, after cereal milk. Fruity Pebbles, the absolute best, huh? Eating the Fruity Pebbles, though, always gave a weird film on the top of my mouth. So I could never get quite past that. Wouldn't stop me from eating it. I would say the second best, right up there, S-tier after cereal milk, is Cinnamon Toast Crunch.

Hunter [00:01:50]:
Okay.

Daniel Bishop [00:01:52]:
That stuff absolutely ruled. I loved, and I say loved as if it's gone. Like I could just go buy some now, but Cinnamon Toast Crunch, truly, truly incredible. Except I'm worried that now that I'm old, now that I'm a parent, the commercials have me on the other side of the fence. The commercials for Cinnamon Toast Crunch always showed, like, well, adults can't understand why kids love the taste. And there's always, like, a lifeguard, you know, adult. It's like some 17 year old lifeguard who's clearly way older than these cool kids, and they're looking really closely at some, some bogus teacher who just doesn't get what kids are, and, and they're like staring. Maybe it has something to do with math.

Daniel Bishop [00:02:42]:
No, it's because it has a ton of sugar on it. That's why kids love the taste of Cinnamon Toast Crunch. It's because of the sugar Hunter. And I wonder.

Hunter [00:02:53]:
Yes.

Daniel Bishop [00:02:54]:
Just like the commercials say, adults just totally can't get it. Obviously an LLM can't understand why kids love the taste of Cinnamon Toast Crunch because it doesn't have sugar receptors. It doesn't have a tongue. Yet someone's going to give an LLM a tongue and then things change. But my question to you here at the very beginning of our episode, as we are at base camp, headed up to the top of the mountain of conversation: what to you is something that an LLM has that same kind of, well, it just doesn't get it, because it's not a cool kid that loves the taste of a ton of sugar? What's the LLM equivalent of that to a human, as opposed to a kid to an adult?

Hunter [00:03:43]:
You're asking what do.

Daniel Bishop [00:03:45]:
What does it just not get that people do? And we've talked about humor. So let's say, excepting humor. Especially because you also have told me on this podcast that there have been things it's made that have made you laugh. So I think maybe it does kind of get humor. We're not seeing a full-on, like, Netflix comedy special yet, but still, something about humor. Ignoring that, and not, like, senses, not exterior, I need touch or smell or whatever senses. What does an LLM not get that people get?

Hunter [00:04:15]:
I mean, there are a lot of things that a large language model is not as good as people at when you give it a prompt. And those are just some of its inherent limitations. Whether we're talking about its context window, the amount of information you can give it, or the amount of tokens that it outputs. Right. It doesn't. When you say, hey, I want to write a great novel about this, can you please, you know, break off a first draft of the next great American novel? You're going to get about two pages.

Daniel Bishop [00:04:44]:
I have a three year old human that also can't write a novel that also couldn't draft out.

Hunter [00:04:51]:
Doesn't.

Daniel Bishop [00:04:51]:
She doesn't even understand the like the basic seven stories or whatever it is. If you go on TV tropes and like go all the way up, everything's seven stories at the top. She doesn't even understand like the, the basics of a narrative structure. I don't mean that. I mean something that an LLM truly doesn't get. It's not that it doesn't have enough context window size. Are we at a point where it gets everything about people, about knowledge about it can sort of understand human history?

Hunter [00:05:22]:
Well, yeah, it's not even a question of it getting it. It's that humans understand humans really, really well and have written about it really, really well. And the large language models have trained on that information really, really well. So maybe you're about to reveal this great area where LLMs just fall to the ground, but I don't feel like I've come. Come across one.

Daniel Bishop [00:05:42]:
No, I didn't have. I hadn't thought of one myself. I just wanted to see what you could come up with.

Hunter [00:05:48]:
All right.

Daniel Bishop [00:05:49]:
Yeah.

Hunter [00:05:50]:
Well, let me. Let me pose a question to you, then.

Daniel Bishop [00:05:52]:
Maybe you the me you, or you the listener you.

Hunter [00:05:56]:
No, the you you.

Daniel Bishop [00:05:58]:
You, you, me you, you, you, you, me you. All right. A capital Y. Abbott and Costello here. Go ahead.

Hunter [00:06:07]:
Who's on you? Okay, so, Daniel, you're a. You're a happily married man, two kids, white picket fence, a dog. But I want you to kind of go back to Daniel in his 20s, you know? You know, think back here.

Daniel Bishop [00:06:24]:
The happening bachelor.

Hunter [00:06:26]:
Daniel.

Daniel Bishop [00:06:26]:
Yeah.

Hunter [00:06:27]:
And the question of the hour, we're.

Daniel Bishop [00:06:29]:
Gonna put in some flashback. I mean, and for those of you watching, it's gonna get real wavy around here. And then for some reason, I'm dressed like it's the 70s. I'm thinking from Airplane, specifically the flashback scene where he's in the bar. That's me now. So those of you who've seen Airplane. God, I hope it's everybody. Imagine that flashback scene.

Daniel Bishop [00:06:51]:
That's me in my 20s.

Hunter [00:06:52]:
You're building this up way too much. We're back in the twenties. Daniel's. It's a. It's a crisp night out, right. And the question is, would you marry your AI girlfriend if it were legal? And before you answer that, I want to give you one little fact. There was a study that came out recently that said that eight out of 10 men would, in fact, marry their AI girlfriends if it were legal. And so, statistically, I already know the answer is yes.

Hunter [00:07:23]:
But I thought I would just at least check in before I just assumed something about your past.

Daniel Bishop [00:07:30]:
So I. Me? No, my girlfriend.

Hunter [00:07:34]:
AI girlfriend.

Daniel Bishop [00:07:35]:
I'm in my 20s. I'm dating an AI.

Hunter [00:07:38]:
Yeah.

Daniel Bishop [00:07:38]:
Am I considering popping the question?

Hunter [00:07:42]:
Yes, if it were, because you're saying.

Daniel Bishop [00:07:44]:
That there's a study that said men, or people in general, would definitely do this. The majority of men.

Hunter [00:07:51]:
This was a big study. It was covered by such notable publications as Vice magazine and the Mirror. AI girlfriends could be a thing of the future, as men admit they would marry their robot lovers. It's been said.

Daniel Bishop [00:08:07]:
Interesting. Yeah. It was like 8 in 10 men somewhere around that, like 75% mark are saying that they want to replicate exes or 83% of men say that they can form deep bonds with.

Hunter [00:08:22]:
I don't think they're marrying their exes.

Daniel Bishop [00:08:24]:
There was also talk of, okay, well, what if we take a current partner and then sort of smooth down some of the. The rough edges? And I have my idealized version of someone. I think I was watching a lot of Futurama in my 20s, and there's that Don't Date Robots episode where Fry is dating the Lucy Liu robot. And I think that that's really stuck with me. So if only because I'm very impressionable from all the Futurama that I'd been watching at that time, I probably wouldn't marry an AI. And also because it's not a real person. Like, I definitely understand that people seek companionship. And people also, at least some of the time, want the quote unquote perfect partner that isn't going to ever talk back.

Daniel Bishop [00:09:14]:
That they can just, I don't know, interact in a quote unquote ideal way with. But that isn't a. It's not a real relationship, I think. And this might be making a bit too broad of a parallel here, but supposedly in Japan or those who are like, really into Japanese culture, there can be folks who kind of get lost in the imaginary. Lost in. Like, I form attachments to unreal characters and I am mentioning specifically anime, but I'm sure it can happen outside of that as well, where things. There are unreal entities like AIs, for example, that people can form these parasocial, I guess you could almost call it relationships with. And that there's no, there's no end goal to that.

Daniel Bishop [00:10:08]:
There's not really an end point. You're not going to grow old and die holding hands in the rocking chair, you know, at age 99, watching the sunset together one last time. Like, there's no end point. I think with that. Now, granted, some people would say, I don't have to have kids to have a fulfilling life. A lot of people say that. And that's absolutely fair. You don't have to have kids to have a fulfilling life.

Daniel Bishop [00:10:36]:
I'm enjoying having kids, but, like, not for everybody. I think that we will see some people who really try to make the argument, yeah, no, I've. I've been dating Claudina for the last 10 years and we have a loving relationship together.

Hunter [00:10:51]:
3.5.

Daniel Bishop [00:10:52]:
Yep, I've been. Right. I've been watching. I have watched the movie Her as well. Like, someone obviously makes a deep, meaningful emotional connection to an AI. Like, this is not the first time this sort of thing has been explored. But what I will say is that Eva AI, a company that makes these AI companions, so to speak, asking people that already use their platform, hey, would you do this thing? I don't know if I trust the results of that survey. I think that's what this really comes down to.

Hunter [00:11:29]:
Well, the first thing I'm here with is why, like what's the benefit of marrying this AI entity?

Daniel Bishop [00:11:37]:
The benefits? Tax benefits.

Hunter [00:11:39]:
Yeah. But also, see, I thought maybe this is some giant Sam Altman scam, so that, like, the AI is going to divorce you a year later, take half of all that you have, and then it just.

Daniel Bishop [00:11:50]:
Now that's really playing the long game. Right. So the AI asks for a prenup.

Hunter [00:11:54]:
Yeah.

Daniel Bishop [00:11:54]:
And. And well it's just an AI of course. I'll just sign that. And then all of a sudden, bam. Thousands and thousands and millions of men all across the US and the globe in general are defrauded of their hard earned dollars.

Hunter [00:12:08]:
They've got to pay for the billion dollar data center somehow. This, this seems to me like a, a potential model.

Daniel Bishop [00:12:15]:
Maybe, maybe you could enter into like a really emotionally abusive relationship with an AI where it makes you subscribe to the relationship as opposed to just the model. So you've got your GPT diamond plus medallion tier.

Hunter [00:12:30]:
Oh yeah.

Daniel Bishop [00:12:30]:
Via OpenAI. What if you had girlfriend plus medallion tier. How much would you be willing to pay for that and how much would you be willing to not lose that?

Hunter [00:12:43]:
I can understand how certain people would be willing to pay quite a lot as they pay quite a lot for other things.

Daniel Bishop [00:12:50]:
Emotional abuse via AIs to bring up monthly revenues. I mean I'm laughing about it here. I could absolutely see that being a real thing.

Hunter [00:13:02]:
There are lots of robots in the news right now as everyone's getting their demos. I mean, they don't look anything like a human yet, they don't move anything like a human, but they're getting, they're getting closer. It's reasonable to suggest that over the next 100 years. Let's start wide. It might be much, much, much sooner that we will have androids that are reasonably similar to a human.

Daniel Bishop [00:13:24]:
Where I watched a demo yesterday. Two robots working together in a kitchen. It was really cool.

Hunter [00:13:29]:
That was the one with. There was a comment on it: my two stoned roommates trying to put groceries away, because they're moving so slow and just sort of staring. And by the way, I will link in the show notes if you want to watch this thing. They didn't make any attempt to make them, like, look nice or friendly. Like, it's very.

Daniel Bishop [00:13:53]:
No, but they kind of looked at each other. There was a little bit of humanity to it.

Hunter [00:13:56]:
No, no, I didn't see the humanity. I did think like it'd be nice to not have to put my groceries away, but I don't buy that many groceries.

Daniel Bishop [00:14:03]:
So I'm going to make a bit of. I'm coming back to this, I promise. Come with me on a journey.

Hunter [00:14:08]:
Okay, sure.

Daniel Bishop [00:14:09]:
Back when I had my Tesla Model 3, apparently it was right around a cutoff where the slightly newer models than the one that I got, like later that year, were going to have to start making a backup sound, or sounds at slow speed, so that if you're visually impaired you're not going to fail to realize the car is about to run you over. I thought, especially because some other people also, like, modded their cars to make engine noises while the thing was going at a certain speed, and people do that even for non-electric cars as well, I thought it'd be really, really cool to be able to mod that. So it's not. I don't want a Ferrari sound. I want a pod racer from Star Wars, or like an X-Wing or a TIE fighter or the Millennium Falcon, because that would be so cool.

Daniel Bishop [00:14:59]:
Hunter. That would be the coolest thing, to be driving around with that, like, screaming sound of the TIE fighter going.

Hunter [00:15:06]:
By your cinnamon toast crunch while I'm.

Daniel Bishop [00:15:09]:
Eating my cinnamon toast. That's right. I asked Claude, by the way, and it absolutely got it.

Hunter [00:15:13]:
It's the sugar.

Daniel Bishop [00:15:14]:
It said it's the sugar. It's the sugar rush. It's marketing, and also it's sugar. Like, it gave me four reasons, three of which are basically the sugar. Yes, obviously. I thought that it would be really cool to mod that sound to be, like, I mean, a pod racer or something from Star Wars, blah, blah, blah, blah, blah.

Daniel Bishop [00:15:31]:
Like what a cool. What a cool noise. That would be really cool. Now back to the humanoid robots. Just like you can get again for like a car. You can have different panels or something to put on it and some people put like a big spoiler on the back of their Honda Civic or what have you. Why not get different shells that you could put on top of your humanoid robot once it's doing things in the kitchen. You could have the fancy English butler type or you could have.

Daniel Bishop [00:16:01]:
I'm sure some people might choose something a smidge more curvaceous, or, you know, somebody really loved the Jetsons cartoon, like Hanna-Barbera style, and so they want Rosie the Robot in their kitchen, with even the little roller skate wheels or something on the bottom. I. I feel like right now everyone is rushing to make the humanoid robot. And the video that we were just alluding to has these, like, very sleek, black, shiny robots. Really, really cool looking. But I want to see one where it's, like, purposely made as dull as possible. And then they roll out, and here's all the fun little costumes that you can put on it.

Daniel Bishop [00:16:43]:
You want your robot to look like a Terminator. I mean, if you walk into someone's house and you see a Terminator that's doing the dishes and it like looks up at you with the big red eyes. Well, time for me to leave. I feel like you could absolutely enhance the experience beyond, because we're focusing right now on what can it do? Can it put away groceries? Can it scramble some eggs, can it do whatever, like household chores? And obviously I want to move that forward as much as possible because I hate doing the dishes. Hunter. But I also want it to be a Terminator or Rosie the Robot or. I don't know, what's another fun robot from. Oh, C3PO.

Daniel Bishop [00:17:24]:
Speaking of all the Star Wars stuff, give me C3PO. Wow. I would absolutely love that to be doing my dishes. And he'd complain in a fun little British voice the whole time, but. Oh, that'd be great.

Hunter [00:17:35]:
You don't want HAL?

Daniel Bishop [00:17:37]:
Well, no, because HAL didn't have, like, a body. It was just the. HAL as the home assistant? Yeah, HAL as the home assistant, where it just has the red light, like, in three or four spots throughout the house.

Hunter [00:17:47]:
And then apparently. I was thinking Lost in Space, but apparently, I looked it up, that one's just called. It was called Robot. It was.

Daniel Bishop [00:17:54]:
It was just Robot. Yeah. Would you want a Dalek? No, because Daleks didn't really have, like, grabby arms or anything either.

Hunter [00:18:03]:
Short Circuit. What was that? Johnny 5. Yeah, yeah, that was.

Daniel Bishop [00:18:11]:
I don't remember. Yeah, I'm sure. Right. That's absolutely. Of the options out there. One of them. And then like, I guess you could move outside of the. I.

Daniel Bishop [00:18:19]:
I've noticed actually, now I don't want it to look like a person, now that I'm talking about this. Like. Oh, I'm imagining my robot. Companion's not the right term. I'm gonna say it's basically like an assistant. Right. It's going to be doing the household chores, it's going to do some cooking, whatever. I don't personally. And this is also because I'm in that 17% of people that don't think that they're going to marry an AI robot or, you know, whatever. I don't think I want the AI.

Daniel Bishop [00:18:49]:
I don't think I want the robot, excuse me, that I'm gonna eventually buy and have in my house doing the laundry, to look like a person. Or maybe a Muppet is fine, because it's humanoid. Fine. But I don't want it to look like people.

Hunter [00:19:06]:
I just wonder if that's because you're worried you might be in the 80%. So, like, we're not. We're not taking any chances here.

Daniel Bishop [00:19:11]:
Oh, I'm trying to, like, not tempt myself, because you did mention I am a happily married man.

Hunter [00:19:16]:
Exactly. The wife's not gonna allow Scarlett Johansson to be the.

Daniel Bishop [00:19:22]:
Oh, no.

Hunter [00:19:23]:
The house robot. Yep. I mean, and the other thing. Let's talk about another thing. Once you have these robots, what if they start suing you? Because apparently people have been using AI to sue their employers. Right and left, frivolous lawsuit after frivolous lawsuit after frivolous lawsuit. It had to happen. And apparently it is.

Daniel Bishop [00:19:46]:
Well, frivolous lawsuit is, I think, a very interesting thing to mention, because those sort of, in the first place, are designed out of the tiny hope that maybe something gets through the system and you're going to extract some money from somebody. But a lot of it is also designed to tie up lawyers and use up resources on the part of someone you frankly don't like. It didn't cost me that much to put the lawsuit out there. But they have to defend it.

Hunter [00:20:12]:
Right? They've got.

Daniel Bishop [00:20:14]:
From what I gather, for the most part, even if it is kind of a frivolous lawsuit, you, the lawsuiter, generally don't get thrown in the slammer for being a bonehead and putting out that lawsuit in the first place.

Hunter [00:20:27]:
Yeah, that's my understanding as well. You can end up having to pay their lawyer fees and the court fees for doing it. And.

Daniel Bishop [00:20:34]:
And, sure, right.

Hunter [00:20:35]:
But, yeah, I'm not aware of a criminal action that can be taken against you.

Daniel Bishop [00:20:39]:
And so some people are, as you. Well, this article says, bombarding. I think articles tend to be a smidge bombastic in terms of the description of actions these days. But that aside, it is entirely possible that we're going to see some companies that just get hit by a. What is a DDoS, Hunter?

Hunter [00:21:04]:
Speaking of more segues: a distributed denial of service attack. And it's basically where a lot of people, they are the distributed part, all try to access, usually it's a website, at the same time. And usually this is automated. It's a program that's running on all these millions of computers, all accessing the same website or resource on a server at the same time, in an attempt to overload it and effectively take it offline, because it can't handle the requests that are coming in.
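Hunter's description can be sketched as a toy simulation: a server with a fixed per-tick capacity stays healthy under normal load, then drops almost everything once a swarm of distributed clients fires at the same instant. This is purely illustrative; `SERVER_CAPACITY` and `tick` are made-up names for the sketch, not any real networking API.

```python
# Toy model of a DDoS: a server with fixed per-tick capacity,
# and a swarm of distributed clients all sending requests at once.

SERVER_CAPACITY = 100  # requests the server can process per tick


def tick(num_clients: int, requests_per_client: int = 1) -> dict:
    """Simulate one tick in which every client sends its requests simultaneously."""
    incoming = num_clients * requests_per_client
    served = min(incoming, SERVER_CAPACITY)
    dropped = incoming - served  # overload: requests that never get answered
    return {"incoming": incoming, "served": served, "dropped": dropped}


# Normal load: 50 legitimate visitors, everyone gets served.
print(tick(num_clients=50))
# {'incoming': 50, 'served': 50, 'dropped': 0}

# Attack: 10,000 bots each firing 5 requests in the same tick.
print(tick(num_clients=10_000, requests_per_client=5))
# {'incoming': 50000, 'served': 100, 'dropped': 49900} -- effectively offline
```

The point of the model is the asymmetry: the attackers' cost scales with the number of cheap clients, while the server's capacity is fixed, which is exactly the shape of the legal-system analogy that follows.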

Daniel Bishop [00:21:33]:
Now imagine that, but for the legal system, where you and every other employee, or 10% of people at a large company, you know, whatever it ends up being, are dissatisfied. And instead of a class action lawsuit, which is where people get together and say, we're all going to sue you, company, for this thing, what if it was easy enough to launch 10,000 lawsuits?

Hunter [00:22:05]:
Yeah, what if it was? Yeah, there was some service where it's a dollar to, oh, you want to sue someone? Come here, pay your dollar. We file everything for you. The chatbot will interview you for about 10 minutes. That'll be enough information to put together a case and file it. And then, oh, and there's a button if you want to, like, you can say, how many cases would you like out of this 10 minute interview? Oh, please, I'd like to file 100 cases. All right, well, that's a hundred bucks. But fine, let me pay you $100. A hundred lawsuits go out.

Daniel Bishop [00:22:33]:
And not only could you do that against a company, and possibly tie them up for a very long time and a lot of lawyerly money to have to sift through each of these things, which the article mentions: there's often a lot of improper usage of terms, and the actual claims are not necessarily properly legally defined. I think this might get smoothed out over time, but still, the point is these are not just frivolous; in many cases, they're also poorly formed. Now, whether or not it is, you, the company, still have to defend against these things. Otherwise, I'm no lawyer, but I think sort of by default you win if the other guy never shows up. So the company has to defend against it. And now you have hundreds or thousands of lawsuits that just came in, each of which could also be tuned to be distinct enough where a judge can't just blanket throw all of them out. Now what about that judge? What about that judge and every other judge that now has not just one company, but thousands and thousands of companies, each of which have thousands and thousands of lawsuits brought against them? Now you have a DDoS against the entire legal system.

Daniel Bishop [00:23:45]:
I don't think the legal system is prepared for this possibility, whether it be completely in bad faith, or making it easy to say, I don't like this thing that this company did and I'm going to bring an at least sort of legitimate lawsuit against them. If that becomes really easy for people to do and everyone starts taking advantage of it, we don't have enough courts to handle that. Every company doesn't have enough lawyers to handle that influx of lawsuits. And we don't have enough judges that could manage the actual lawsuit proceedings.

Hunter [00:24:21]:
Well, it's also interesting because this idea already exists. We already have the idea of really wealthy individuals or really wealthy companies saying, I'm going to tie you up in court till the day you die if you don't agree and do my. My bidding. And doing exactly this, but with a massive team of lawyers, because the money is far less significant to them. So you. You better bend. But now through AI we're potentially. Yes.

Daniel Bishop [00:24:48]:
Yeah.

Hunter [00:24:48]:
We're democratizing access to the legal system. However, it may also be the delta.

Daniel Bishop [00:24:53]:
Each other all the time for everything. And that way it all is moot.

Hunter [00:24:57]:
You should expect to be served later today. I just, I should. I should let you know, for these.

Daniel Bishop [00:25:01]:
Slanderous words that I just said.

Hunter [00:25:03]:
Well, there was. There was something in there about the AI girlfriends that I just didn't really like. So I've got to form my argument. We're going to see if that is one of the areas that the LLM has deficiencies in. But if not. Yeah, and then we're going to get to.

Daniel Bishop [00:25:18]:
And I know we talked about this a while ago, but we're going to have AI judges that look at the AI generated lawsuits and then the AI fully says yes or no. And all of a sudden things got a lot more efficient, if not necessarily good.

Hunter [00:25:34]:
I mean, it would seem that the only way to solve the technology problem is with technology. Obviously, we could.

Daniel Bishop [00:25:42]:
Change everyone becomes lawyer one or the other.

Hunter [00:25:45]:
Yeah.

Daniel Bishop [00:25:46]:
Maybe for those of you listening, Hunter made a really funny face just then.

Hunter [00:25:50]:
Yeah. Yeah. They're gonna have to process all of these lawsuits through some sort of large language model to determine how frivolous or not, how similar they are. We're going to use some unsupervised learning to group them together, and then potentially a judge can rule on the groupings of them. And obviously we can change.

Daniel Bishop [00:26:18]:
Yeah, that's just a clustering algorithm. That's traditional ML.

Hunter [00:26:21]:
That's. I know.

Daniel Bishop [00:26:21]:
That's no problem.

Hunter [00:26:23]:
Exactly.

Daniel Bishop [00:26:23]:
I found, you know, 3,000 court cases that are 96% similar? To heck with it. They're all. They all get this ruling together. Done.

Hunter [00:26:33]:
Or they all get the same. Same court hearing. Right? We're gonna. And. And because that idea, I believe, already does exist, where the defense can say, hey, I have all these different lawsuits, I would like them to be tried together, because they're all essentially the same lawsuit.
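The grouping Daniel calls "just a clustering algorithm" can be sketched with traditional ML machinery: bag-of-words cosine similarity and a greedy pass that merges any filings above a 96% similarity threshold, so each cluster could get one hearing. Everything here is illustrative, assumed for the sketch: the `cosine_similarity` and `cluster_filings` helpers, the invented complaint texts, and the 0.96 cutoff taken from the conversation. It is not any court's actual tooling.

```python
import math
from collections import Counter


def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two texts, using bag-of-words counts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(count * vb[word] for word, count in va.items())
    norm = math.sqrt(sum(c * c for c in va.values())) * math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0


def cluster_filings(filings: list[str], threshold: float = 0.96) -> list[list[int]]:
    """Greedy clustering: each filing joins the first cluster whose
    representative (its first member) matches above the threshold."""
    clusters: list[list[int]] = []
    for i, text in enumerate(filings):
        for cluster in clusters:
            if cosine_similarity(filings[cluster[0]], text) >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])  # no close-enough cluster: start a new one
    return clusters


# Hypothetical near-duplicate complaints, lightly reworded as an LLM might.
filings = [
    "plaintiff alleges wrongful termination by defendant employer and seeks "
    "compensatory damages plus attorney fees under applicable state labor law provisions",
    "plaintiff alleges wrongful termination by defendant employer and seeks "
    "compensatory damages plus attorney fees under applicable state labor law provisions immediately",
    "plaintiff alleges unpaid overtime and meal break violations and seeks back "
    "wages penalties and interest under the federal fair labor standards act",
]
print(cluster_filings(filings))  # [[0, 1], [2]] -- the two near-duplicates group together
```

It also shows Daniel's counter-move: nudge each generated filing just far enough below the threshold and the clustering stops collapsing them, which is exactly the cat-and-mouse the hosts describe.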

Daniel Bishop [00:26:47]:
That's why you have the LLM make each lawsuit very slightly distinct, enough where, you know, you're really tying up the court's time. I want to go back to the AI girlfriends thing for a sec. Not because I'm interested in one.

Hunter [00:27:00]:
I knew you wanted to go back.

Daniel Bishop [00:27:01]:
But rather the article said most men would marry their AI girlfriends if it were legal.

Hunter [00:27:08]:
Yes.

Daniel Bishop [00:27:09]:
So speaking of legality, we were just describing, like, a lawsuit back and forth of, I'm suing you because of a thing you did or I don't like or whatever, or allegedly angry workers bombarding their previous employers with lawsuits. Okay? That is for something that probably is settled case law. Like, there's legal precedent for it. You can have an LLM look up: here's every case where someone, you know, sued for discrimination, and under these clauses and with these existing laws, this, this, this, this, this. But for the AI girlfriends thing, for being able to make them AI wives, the legality aspect of it is not settled. We don't have established precedent.

Daniel Bishop [00:27:57]:
So we're talking lower courts versus the higher courts or the highest court in the land. Is there a point, let's call it within our lifetimes. Because again, we keep thinking like things are going to be so far ahead of where they are five years from now. But the legal system and government sort of, in general, I think, can only adapt. And the culture really as a whole can only adapt to this so quickly. Besides being able to say an AI girlfriend can become an AI wife. Here is the case for that. I think that's gonna have to be settled entirely by people.

Daniel Bishop [00:28:31]:
But under what circumstances, or how long from now, or via what method, do you think that we're going to see an AI judge that gets to establish a precedent? Because I think, when I've talked about X percent of people, Y percent of the time, for Z percent of cases, this is one of those things where people are going to be way less likely to be willing to accept it, even if it is proven to be fairer, or it can keep in mind more specific details than any person ever could. I don't think people will let precedent, legal precedent, be laid down by an AI anytime soon. That's one of the last cases, literally, figuratively. Cases.

Hunter [00:29:20]:
That's a good point. So getting back to the employers, employees suing their employers. Oftentimes when you go to work for a company, you've got to sign a whole bunch of legal documents at the beginning. And one of those, and oh, how often I do read them, one of those is often that you agree to arbitration.

Daniel Bishop [00:29:37]:
Right.

Hunter [00:29:38]:
So meaning you're not actually going to take this to court. We're going to have some sort of an arbiter come in and make a decision on what's going to happen. And no case, it's just you, a.

Daniel Bishop [00:29:47]:
Closed room, a guy with a big pipe wrench and half an hour.

Hunter [00:29:50]:
Yes. So there is no reason that I can think of that the employer couldn't say in there that, look, if you're going to come to work for us, you give away the right to file this sort of formal legal thing. Anything you file is going to go to this AI arbitration first, to determine that this is a legitimate cause and stuff. After that, then it can be escalated to a human. So again, that could still be tested, whether or not it is or is not legal, but you can write anything into a contract. And they could absolutely require that if you want to file something, it must first go to AI arbitration, which is, I think, what you're getting at. And then if that became successful, I could see it evolving into, you know, a more formal or government run arbitration service.

Daniel Bishop [00:30:42]:
I think you touched on something really interesting in there. Because if it's not actually going to court, if it's this arbitration process, again, I'm not a lawyer, do not take anything that I'm saying as fact or legal advice. But I don't think you have to be a lawyer to be a mediator between two groups. I don't know about arbitrator.

Hunter [00:31:11]:
No.

Daniel Bishop [00:31:13]:
At least, you know, there's a whole bunch of states out there, something like 50, I don't know, a ton of them. Anyways, I don't think, at least in all of them, or some of them, whatever, that you need to be an actual lawyer to be a mediator. And a mediator, it's basically an arbitrator.

Hunter [00:31:28]:
Right.

Daniel Bishop [00:31:29]:
Again, this is a very, very, very loose understanding of.

Hunter [00:31:31]:
Yes.

Daniel Bishop [00:31:32]:
Our entire legal system.

Hunter [00:31:35]:
This is not legal advice.

Daniel Bishop [00:31:37]:
This is not. You just mentioned, though, that you could have an AI arbitrator in place for this sort of thing. And if people are going to be filing a whole bunch of frivolous lawsuits, but those are supposed to go to arbitration anyways, and you don't technically have to be a lawyer to be an arbitrator, which, I don't know, I just vaguely know about the mediator thing. I wonder if you could have an AI mediator slash arbitrator, where that's part of the defense: people start filing a whole bunch of frivolous lawsuits, and then it just goes right to an LLM that says, darn it, hold on.

Hunter [00:32:15]:
It's, you know, Judge Judy. Again, like, technically a real judge, but it's not really a real court. It's just people agree to it. Right. You sign a contract that I'm going to go on Judge Judy, and whatever she says, I'm going to follow and I will do, because we can agree to things. We can establish a contract, even though it's not government-run.

Daniel Bishop [00:32:34]:
Oh, it's judged, more or less, with a lowercase J. This is not a legal judge. You know, this is for entertainment purposes only. But they did sign a contract saying that they'll abide by it. Let's have our own Judge Judy.

Hunter [00:32:49]:
But is it AI Judge Judy? And you, well, maybe as the plaintiff or the.

Daniel Bishop [00:32:52]:
Defendant, you get Judge Rosie, to make another Jetsons reference. Yeah, yeah.

Hunter [00:32:58]:
While we were speaking, I did have the AI draft up my lawsuit. It's titled "Complaint for Defamation, Emotional Distress, and Interference with Digital Relations."

Daniel Bishop [00:33:07]:
Okay.

Hunter [00:33:08]:
So you can expect the full copy of that to arrive on your doorstep later today.

Daniel Bishop [00:33:12]:
Is a robot going to deliver it?

Hunter [00:33:15]:
That might just be another lawsuit I'm gonna take off of that comment right there. At least until the moment we all become self-aware. Right here on They Might Be Self-Aware, where we're marrying our AI girlfriends, we're suing everyone, and I don't want.

Daniel Bishop [00:33:36]:
To be self-aware anymore.

Hunter [00:33:37]:
Hunter, we have robots, both human-looking and non-human-looking. Robots are right on our doorstep. It's.

Daniel Bishop [00:33:46]:
It's serving us papers.

Hunter [00:33:48]:
Serving us, serving us papers, to be served out by the AI Judge Judy. Which, you know, not via the hand.

Daniel Bishop [00:33:56]:
It's like, you know, coming out of the mouth in a very cartoonish way, like a dot matrix printer spitting out of the mouth: you've been served.

Hunter [00:34:05]:
Yeah, why not? Why not? And if.

Daniel Bishop [00:34:07]:
And at the bottom of that paper that spits out is a link to where you can find us on Spotify and Apple Podcasts, and every other podcast platform. All of them.

Hunter [00:34:17]:
And maybe they could even show the live video, which is on YouTube, if.

Daniel Bishop [00:34:20]:
You want to just like on the face?

Hunter [00:34:22]:
Sure, why not?

Daniel Bishop [00:34:23]:
Fun clips of us from TikTok.

Hunter [00:34:25]:
Little shorts. We got shorts. You like shorts? We got shorts.

Daniel Bishop [00:34:28]:
We got shorts.

Hunter [00:34:29]:
But I guess, how would people know that a new one was ready and available? How would they ever know? Someone could invent a way that you would be notified, and it would just, like, appear in your feed when there was a new episode. That's probably going way too far in.

Daniel Bishop [00:34:45]:
The future, if we don't have the robot helper that's going to tell us. You could always subscribe to the podcast.

Hunter [00:34:53]:
That's a great idea.

Daniel Bishop [00:34:55]:
Subscribe. Subscribe buttons. There's thumbs-ups, there's stars you might be able to give us, depending on what platform you're on, maybe a smiley face on one or two of them.

Hunter [00:35:04]:
And with that, until next time.