Lee Elman is an entrepreneur and TV producer known for many shows, including the truTV series Party Heat. Today he is the founder of Eternal Diary, an AI-powered app that will allow users to speak on video with their loved ones who have passed on. We talk extensively about the app, how it came about, the moral and ethical concerns surrounding technology like it, and all about Lee's life and career in television production.
Support the Show
If you enjoy this show, please share it with at least one other person. If you would like to get episodes early, exclusive merch, and other benefits, consider supporting In The Keep on Patreon or... If you're not a fan of our other support methods, but do wanna support the show, buying me a book is a great way to do so. If you do, please let me know so that I can ensure that you are rewarded! You can also shop with our affiliate Cora Cacao and get 10% off your order with promo code INTHEKEEP at checkout. - Tyler
Chapters
00:00 Start
2:08 Creating a Chatbot Legacy
10:28 Ethical Dilemmas in AI
23:47 Conversations on Grief and Memory
32:30 Personalizing the Legacy Experience
45:38 The Future of AI and Memory
1:05:38 Balancing Morality and Innovation
1:14:18 Reflecting on a Life in Media
1:27:56 Embracing Failure
1:35:26 Faith and Spirituality
1:52:16 War and Religion
2:05:25 The Impact of Technology
2:24:17 The Future of AI
2:30:49 Exploring the Unknown
2:42:29 Perspectives on Legacy
Transcript
Music:
[0:00] Music
Lee:
[0:32] Eternal Diary basically became a thought process. It started August 2024 when
Lee:
[0:40] my father, Robert, passed away. And at that point, like most people who go through losing a parent or a loved one, you start to talk, and it's all about stories. You remember, you know, whether it's a wake or whatever, however you get together with people, you start to remember the person. And so when that was all going on, in the days following, you kind of put the box together with their belongings, and then it goes on a shelf somewhere, and each year it kind of gets less and less. So with that, after my father passed, I sat down, being a writer for many years in television, and I started writing down my father's stories, everything that I knew about him that was fresh in my mind, his vacations. He lived a wonderful life. He lived in Manhattan until he died, overlooking the East River.
Lee:
[1:39] He lived a great life, great vacations. So I really started to tell his story, and then the story tied into myself, my brother, and his five grandchildren. And as I started to write it all down, each day I was writing a little more. And then from August to November, when Thanksgiving rolled around, it was our first Thanksgiving without my dad.
Lee:
[2:05] And so we were kind of reminiscing about all of that. And I was up in New York for that, from Florida, and I had my laptop, and I sat down Friday morning after Thanksgiving, and, for the tech people, I basically had
Lee:
[2:24] started to create a chatbot, the standard thing people use in banking or any other place where you're just getting information from a chatbot. But instead of putting in that information, I entered this document, what I called the Life of Robert. And I started to ask the chatbot all these questions. And I'm like, hey, Dad, it's Lee. And then the next thing you know, it's kind of like he's answering me in chat form. And I'm like, okay, this is really cool. So I kind of tied it into a live feed of the calendar.
Lee:
[3:03] And I typed in, Jake's birthday is next week. Jake is my son, but his birthday is in June. And the chatbot wrote back to me, no, Jake's birthday is June 6th, the longest day, which is what my father always called his birthday, because it is the longest day. And I realized at that point that I had something, like I had some form of memory legacy happening in this simple chatbot. So to me, that was the turning point in what Eternal Diary would be. And then it was very easy to... hey, you know, I started asking everybody for voicemails, and we were able to clone his voice fairly quickly, so we were able to go from the chatbot to voice. Then, with the AI models, it was: how do you create a live stream of basically a live avatar that's going to lip sync, that's going to be able to think? And that's where the tough part came in, and we were able to do it. It took us a little while to put it together.
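For readers curious what that first step looks like in practice, here is a minimal sketch of a document-grounded chatbot: a biography file fed to a chat model as context, then a simple question-and-answer loop. Lee doesn't specify which model or stack Eternal Diary uses, so the OpenAI client, the model name, and the file name here are stand-in assumptions.

```python
# A minimal sketch of the pattern Lee describes: a standard chatbot that answers
# from a single biography document instead of a knowledge base. The OpenAI client,
# model name, and file name are stand-ins; Eternal Diary's actual stack isn't specified.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

with open("life_of_robert.txt", encoding="utf-8") as f:
    biography = f.read()

system_prompt = (
    "You are Robert. Answer in the first person, using only the memories, "
    "stories, and facts in the biography below. Stay in character.\n\n"
    f"BIOGRAPHY:\n{biography}"
)

history = [{"role": "system", "content": system_prompt}]

while True:
    user_msg = input("You: ")
    if not user_msg:
        break
    history.append({"role": "user", "content": user_msg})
    reply = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print("Robert:", answer)
```

The calendar tie-in Lee mentions (knowing Jake's birthday is in June) would be extra context layered onto the same loop, for example by appending upcoming dates to the system prompt.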
Lee:
[4:09] We're still developing a few final pieces of it, but it's fully operational. We're going to launch Eternal Diary on my 60th birthday, September 5th, 2025, this year. So.
Tyler:
[4:27] Again, I'm not a kid,
Lee:
[4:29] So it's kind of cool to be able to do this in the later years of my life.
Tyler:
[4:35] Yeah. It's a crazy fast turnaround, actually, like from idea to implementation.
Lee:
[4:40] But who needs sleep?
Tyler:
[4:44] Well, me. I just had a kid like two weeks ago.
Lee:
[4:48] Congratulations.
Tyler:
[4:49] Thank you. But yeah. So you're using AI to power this. What models are you utilizing? Or is it just one? Is it multiple ones? Which features?
Lee:
[5:02] Well, we started using a lot of open-source software to create everything. And while we were doing that, we were developing our own proprietary software. So, you know, for the audio it was really easy: it was ElevenLabs; we were able to clone the voice very quickly with their technology.
Lee:
[5:26] As far as the avatar models, I don't want to give away the farm, but basically what we were able to do is create multiple movements, what you would probably think of in, like, a 1980s science fiction movie, where you kind of have this robot and then you have this skin over it. So we could create the avatar, which would have head movements, which would have blinking. We created all the movements and then made them a little bit more random, so that you can't count what the moves are. And then the skin of the person, whether we want to call it the body or the avatar or the eternal soul, however you want to call it, would basically fit over that. That way it could randomly change their clothing, but the movement of the person is all generated underneath, so we created this world where you're really just putting this 3D-modeling skin over it.
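As a rough illustration of that "robot underneath, skin on top" idea, here is a toy sketch that schedules base idle movements (blinks, head turns) on randomized timers so the pattern never repeats in a countable way. The movement names and timing ranges are illustrative assumptions, not Eternal Diary's actual rig.

```python
# A toy sketch of the idea Lee describes: a base layer of avatar movements fired
# on randomized timers so the motion never repeats in a predictable pattern, with
# the rendered "skin" applied separately. Values here are illustrative assumptions.
import random

IDLE_MOVES = {
    "blink": (2.0, 6.0),        # fire roughly every 2-6 seconds
    "head_turn": (5.0, 12.0),
    "micro_nod": (8.0, 20.0),
}

def schedule_idle_moves(duration_s: float, seed: int | None = None) -> list[tuple[float, str]]:
    """Return a randomized (timestamp, move) timeline for one idle period."""
    rng = random.Random(seed)
    timeline = []
    for move, (lo, hi) in IDLE_MOVES.items():
        t = rng.uniform(lo, hi)
        while t < duration_s:
            timeline.append((round(t, 2), move))
            t += rng.uniform(lo, hi)   # jitter so intervals never repeat exactly
    return sorted(timeline)

if __name__ == "__main__":
    for ts, move in schedule_idle_moves(30.0, seed=42):
        print(f"{ts:6.2f}s  {move}")
```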
Tyler:
[6:38] And so, are you programming this, or do you have, like, a team of programmers?
Lee:
[6:45] I have a team of developers. Uh, you know, when I first came up with it, I could only take it so far. Um, I have some code, but, you know, this was going to take a little bit more. Like I said, I was able to get to the voice level very quickly, uh, developing that with a couple of open-source APIs. But to get us further... you know, it took 120 years to get from basically the phone call to the video call. We're trying to basically get from text bot to a live simulated FaceTime environment, where one person is real and the other person isn't, and they're having a conversation as if both people were real.
Tyler:
[7:42] Yeah. And what's interesting about this is that it's so personalized, and obviously that's from teaching it to behave like the person in question, which is, I'm sure, a difficult task that we can also talk about. But I mean, even now, if I go on Claude or ChatGPT or something like that and I say, you know, I would like to speak with Dr. Moriarty from the Sherlock Holmes series, and just tell it to pretend to be that person, because it can pull data, it can really accurately, like, represent someone, at least in chatbot form. And the same thing applies to... even one night I was just going down a rabbit hole with ChatGPT: who all can you be? And, you know, trying to figure out where it will cut itself off ethically. So, for instance, if you ask ChatGPT to pretend to be Jesus of Nazareth, there are certain things that it will not say, because, you know, it doesn't want to offend.
Lee:
[8:34] It's got ethical barriers and guardrails that it knows not to do.
Tyler:
[8:42] Right. So one of the first questions that came into my mind when I was, like, reading about this and looking at your website and everything, I was like, all right... She's going to listen to this too, so it'll be funny. I love my grandmother so much. So, Maumau has... innate character flaws that you might not consider to be moral by the standards of a lot of tech companies, right?
Lee:
[9:08] Um but.
Tyler:
[9:10] If it wasn't for those things, she wouldn't be her. So if I wanted to preserve her, and then I asked the chatbot a question that I know how she would answer, um, but the chatbot has, like, some sort of moral barrier that prevents her from saying it... those are things where I'm wondering how people are going to interface with this technology, you know?
Lee:
[9:27] Okay, great. By the way, great question, because I think that this is the great part about this: this is a relationship between you and the system. How you want that to be determined is up to you. I'll give you a great example. Okay, you have cameras in your house. You now have a baby, so you might have nanny cams and cameras like that. That is between you and the camera, very American; you can look at the camera any way you want, and nobody else can see that camera. Now, if somebody was to hack into your cameras and can see your wife or your children walking around topless or naked, it becomes a totally different scenario for what those cameras represent. It is the exact same thing with Eternal
Lee:
[10:28] Diary and the software. This is not out there. So if somebody, the person we'll call, talks like a sailor, right, you have that right to have that person talk like that, because you don't want to take that away. Now, if you want to put guardrails in when your children are older and you don't want them talking like that, you can control that through the system of who calls, because everybody... there's a three-tier system. It's an account number, it's a username, and it's a password. So if you have siblings, your relationship with that person,
Lee:
[11:13] that eternal soul, is your relationship with them in the way that it's entered, so that it's the way you remember a story, so that everybody can kind of have their own relationship with that same person. It's kind of how you remember. Now, if your grandmother was to enter this in her own words, versus kind of the posthumous version where somebody passes, like what I did with my father, if it's in her words and the way that you'd want to do it, then that's the greatest thing. But I don't want to be the ghoul and build this software and wait for everybody to die, you know, for this to work. I want people to be able to take this. So, you know, getting back to your question, I think that it's a relationship that you have with the person, that you can have that person then be that person. Now, we had a question come up, because we ran a whole bunch of focus groups, and said, well, what happens if somebody loses their wife? Somebody in their 40s loses their wife. They want to do this, they want this relationship, but they want to create this environment where their wife is talking to them topless. Okay?
Lee:
[12:38] And that becomes a little bit morally different versus what you're talking about. Should we be able to have those guardrails? And not only is that person that has passed a wife, maybe a mother or a father, but she's also a child of parents that may still be alive. So do you want to have that? Those are the ethical questions that I think we deal with more. And as we get closer to launch, I think that that would have to be a separate entity for somebody who would want to do this, that we can put into the questionnaire, that we can have that conversation with somebody. Because, I mean, you look at the bill that was just passed, the Take It Down Act, right? So we can only be as good as what the person says they are, who they are, and who they're talking to. And we don't know if it's a real person or if they're trying to create somebody else. So those are the guardrails that we want to be able to set up, that it's not just, you do this, you know. Look at a dating site.
Lee:
[13:50] A dating site, you put up 10 pictures. They could be you. They could be somebody else. And there are zero guardrails on dating sites. There are no criminal background checks. There's nothing. It's just a portal for two people to meet. So, you know, we have to have those guardrails, but we don't want you to feel that we stipulate how somebody speaks if it's between you and that person.
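A small sketch of the three-tier access scheme Lee describes (account number, username, password) with a per-caller guardrail toggle, for example filtering language when a child is the one calling. The field names here are illustrative assumptions, not the app's real schema.

```python
# A rough sketch of the three-tier scheme Lee mentions (account number, username,
# password) with per-caller guardrails a parent can switch on for kids.
# Names and structure are illustrative, not Eternal Diary's real data model.
from dataclasses import dataclass, field

@dataclass
class CallerProfile:
    username: str
    password_hash: str
    filter_profanity: bool = False   # guardrail toggle for a child's logins

@dataclass
class FamilyAccount:
    account_number: str
    callers: dict[str, CallerProfile] = field(default_factory=dict)

    def authenticate(self, username: str, password_hash: str) -> CallerProfile | None:
        caller = self.callers.get(username)
        if caller and caller.password_hash == password_hash:
            return caller
        return None

account = FamilyAccount("ACCT-0001")
account.callers["tyler"] = CallerProfile("tyler", "<hashed-password>", filter_profanity=False)
account.callers["grandkid"] = CallerProfile("grandkid", "<hashed-password>", filter_profanity=True)
```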
Tyler:
[14:15] Yeah, there, I mean, there's so many interesting questions that, I mean, probably only your lawyer would be able to answer at a certain point, where, you know, like, let's say someone, uh, comes to you. Do you have, like, a way to verify that they actually have a relationship with the person that...
Lee:
[14:34] We have a verification system, so that it's more than just, okay, I'm going to pay for the service using my Gmail, you know, or by, you know, the App Store. The App Store is a great way to be able to verify a lot of information. But we're not going to ask for... you know, unless there's a question that's coming up through our portal that doesn't seem to make sense. There are going to be red flags that will come in if somebody's like, I want to see so-and-so naked or whatever; there will be flags that will come to us. But we expect that. I mean, when we're talking about hopefully somewhere between 1.2 and 2 million users within the first two years of operation, we have to be safeguarded. We can't just have, you know, a room full of four people that are monitoring that many accounts, so we have to be better than that.
Tyler:
[15:45] Yeah, it's, um, I just think about, like, if someone were to acquire someone else's information, right, which happens every day. Like, I got your email, I got your voicemail, whatever the hell it is. And then I just want to replicate that person and then potentially use that, I mean, for fraud or even for some sort of delusion.
Lee:
[16:07] 100%. You know, I mean, some of the safeguards that we do: when did the person pass? What was their name? Those are safety checks that we go to find out, very easy. State, county, and local records of when somebody passes, those records come up. So we know if that person passed or didn't pass. If we don't get a match to that name, to that date of birth, to those things, it's going to flag it immediately.
Tyler:
[16:37] Yeah.
Lee:
[16:39] And nothing is going... they can't go live until we know that it's good. Like, it's not like you enter something and suddenly you get an email and you're using the app. It's going to take time from the time that they sign up, upload audio, upload photos, upload the data. We're expecting about a two- to three-week turnaround. And when we do that, there will be safety checks. Like I said, the easiest safety check is: has this person passed? Does the date of birth match that person? And then, who's signing up? We're going to be able to check, just through our basic background check system, whether these people are the real children, their address, their email; all that stuff's going to go through a pretty detailed questionnaire and system.
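Here is a rough sketch of the safety check Lee describes: match the submitted name and date of birth against public death records and flag anything that doesn't line up for manual review before a profile can go live. The records lookup is a hypothetical placeholder, not a real API.

```python
# A sketch of the verification step Lee describes: before a profile goes live,
# the deceased's name and date of birth are checked against public death records,
# and mismatches are flagged. `lookup_death_records` is a hypothetical placeholder;
# real sources would be state/county registries or a records vendor.
from dataclasses import dataclass
from datetime import date

@dataclass
class Submission:
    full_name: str
    date_of_birth: date
    reported_date_of_death: date

def lookup_death_records(full_name: str, date_of_birth: date) -> list[dict]:
    """Placeholder for a query against public death-record sources."""
    raise NotImplementedError("wire this to a real records source")

def verify_submission(sub: Submission) -> str:
    try:
        records = lookup_death_records(sub.full_name, sub.date_of_birth)
    except NotImplementedError:
        return "HOLD: records source unavailable, manual review"
    for rec in records:
        if rec.get("date_of_death") == sub.reported_date_of_death:
            return "VERIFIED"
    return "FLAGGED: no matching death record, manual review"
```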
Tyler:
[17:36] Yeah. Yeah, I think about, like, what if my buddy passed away and he is the kind of person who would help me defraud a credit card company? Like, you know what I mean? Like, there's just so many weird things where you're going to run into, like, okay, well, at a certain point, this is technology; it's really not the person. And I wonder what kind of relationships people will form with the technology itself. We already have so many weird cases, um, even now. Like, recently there were reports of this 14-year-old kid who essentially had an AI girlfriend and then ended up, uh, taking his own life. And of course the question of whether or not that's on the tech, or on the mental health of the individual, their family, you know, their surroundings and all that stuff, is obvious. But you're going to, regardless of whether or not it's your fault, deal with a lot of, like, on the grief side of this, that,
Lee:
[18:31] Um, there is going... and I've been doing a good amount of grief podcasts, which are very helpful, and I think, you know, to them, you know, we're not here to replace the person, right? It's to, um, use artificial intelligence to be able to recreate, um, a person in their voice and in the way that they did it, if you want to call it a novelty. But I think that, from talking to the grief people, look at this for a percentage of people, not for everybody. For a percentage of people, it's helpful, because what's the one thing, when somebody passes, that everybody says they want? I would have liked to have one more conversation.
Lee:
[19:32] Having this conversation is not going to be with the real person, but here, you know, for a percentage of people, hearing the voice, seeing their image, may be enough for them to move on from that, knowing that it's there. And when we're talking to people every day, Tyler, we hear stories of what people would like to use this for. Um, I was talking to a guy who's a stand-up comic, and his dad was a stand-up comic, and he would love to do, like, basically dueling stand-up comedy routines with his dad, using a monitor and talking to him and interacting with his dad, which he never got to do as a comic. Now, um, a woman that interviewed me yesterday, she goes on tour talking about her father, who is known as
Lee:
[20:26] the Dancing Chef. He left her an unfinished book; she finished the book, and that's what her podcast and her talks are about. She wants to use this to bring her dad on tour with her to talk, and the book that he basically wrote will be the source of all the information that she could ask about. When he danced and sang for Frank Sinatra, that's in the book, so he could tell that story. So there's, you know, there's interesting takes to this. Um, again, it's not for everybody. Uh, but we think that enough people will try it and like it. Some will be like, it's hokey, but, you know, I can't guarantee you that everybody's gonna like it. We hope that, uh, people get some enjoyment out of it.
Tyler:
[21:19] Out of, like, your team, you know, aside from yourself and programmers, user interface designers and such, what does that look like? How many people are operating it?
Lee:
[21:29] We have a very strong marketing team; um, they're getting ready for a huge blitz. Um, we have... The good thing is I come from a family of lawyers. So, um, that's always a good thing. So we have a legal team that's, um, on this every step of the way; they look at everything. I mean, because like I said, you know, yesterday they passed this Take It Down Act, um, which we have to look at too: how does that affect us? Because we are working in AI and we are working in live video.
Lee:
[22:13] As far as financial teams, we have to look at that. And I have to say, like, we've been trying so hard to work within the boundaries of the United States for a data center, because all of this has to be processed. All the data has to be processed. And as everybody knows, there's a reason why every time you call your bank you're calling India, or you're calling Vietnam, or you're calling Thailand. And all those people are going to be out of work in months, because that level is going to be replaced by bots, because all they're doing is reading prompts anyway, and so you could have any voice reading that prompt. So, where we want to do the data center: I'm in South Florida. We would love to keep it as close to here as possible and see how many people we're really going to need, how much the system will be able to flag stuff, and how many people we'll need on that side. So we're ramping everything up as solidly as possible. As we get closer and closer, more people are going to come along.
Lee:
[23:30] I think that from the app standpoint, like we talked, it's like legal is a major factor in this just because we're dealing in AI and we're dealing in uncharted territory, very much how social media was probably 10 years ago, 12 years ago.
Tyler:
[23:47] I find a lot of the, like, moral objection to these things interesting, um, because I feel like it's the same way that people are like, oh, the electric car will never happen, and then of course it happens. And even back, you know, when the first automobile was invented, it was like, well, people have been riding horses for centuries, there's no need for this; who's going to get gasoline, how are you going to get gasoline out of the ground? And technology moves on. But there was a time, you know, when the first photographs were starting to come out, and people thought that was morally offensive, like, you can't have a photograph of a dead person, you're supposed to mourn that. Um, so I'm wondering, like, have you talked to, like, grief counselors, to clergymen, just different people who have moral stances on this stuff?
Lee:
[24:31] As a person, as a person of faith, I don't... I never want somebody to feel that this is the person. I think that from a moral standpoint, if you take the religious aspect of this, there's definitely going to be more objection, because, you know, it's kind of that you want the memory of that person to be the memory of that person, right? So...
Tyler:
[24:57] You
Lee:
[24:59] know, when we're recreating this, it becomes, you know, there is, without a doubt, a moral background to what we're doing. So I want to be able to say to the people out there that are thinking of this, that it's a sacrilegious thing: it's not. We're not claiming that this is the person. It's an image of the person using their voice and their stories. It's more advanced than if you took out a box of VCR tapes and you put them in and you're playing back your grandparents from a wedding or whatever, or they're telling a story; you know, it's new imagery. And, you know, for people that fear AI in general, I like to say there's a lot of great uses in AI, mostly in the medical field, because the things that we're going to be able to do in the next five years are going to be better than what we did in the last 40 years. Plus, I trust technical people more than I trust pharmaceutical people. I trust people that want to find a cure instead of treating something. And I think that in the tech world we want to find cures, we want to find solutions; we don't want to treat.
Lee:
[26:28] We want to disrupt the system more than we want to just play into the system. But getting back to... you know, it's really hard for people on the grief end, I think. It's a mixed bag, and I'll be honest. I think that if you're looking for the answer, if you lose somebody and the lights flicker, you're wondering if it's them trying to send you a sign. If you're a believer in the afterlife and that energy is there, then you look for any sign you can. We're not the sign. We're just trying to get you over the hump on the grief side, by you maybe having that last conversation and hearing the voice, that it could get you there. It's not going to be for everybody. On the religious side, I myself, when I was creating this, had a...
Lee:
[27:25] Not a dilemma, but I questioned it, because I think that when somebody passes, it's about their legacy. And that's the only thing that I really liked about it from the religious side, is that there's a legacy. So my mom, who is now suffering from dementia, she's 30 feet from me in a house right behind me. She has a box of recipes. Now, when she passes, if I could digitize all those recipes, for my daughter, for when my son gets married, for my grandchildren, hopefully I have them and don't just have dogs and cats, my mother, in her voice, with her face, would be able to tell them the recipes that she wrote down in her little box. And so there are pieces of this, there are those little nuggets, where I think that the app can really help people on the legacy side. And when I say the novelty side, it's because there are stories that people are going to want to tell that haven't been told, or that they want to pass down to their children and grandchildren, and the opportunity is there for them.
Tyler:
[28:51] Yeah, it's, um, there's, there's so many instances where I can, I can imagine like two different family members disagree on who the person was. Right. So like me, let's just say me and my brother both have a different version of my grandmother that fits our interactions with our grandmother. You get what I'm saying? Like, is that possible?
Lee:
[29:16] Yeah, because, so when you enter your account, when you go to the account, you choose how you wish to do that, meaning the individual: you to your grandmother, your brother to your grandmother. And the reason why we did that is we took the scenario that you have a family of four in the car and the car gets T-boned. Okay?
Lee:
[29:43] There are four different versions of that, of what happened and how it affected all four people. So if you've got the driver's side, and right behind that is where it got hit, how that affected them is totally different than the people on the other side. So how they will remember that is totally different. So we want there to be your version of the relationship that you want with your grandmother, or with the loved one. So we give that option that everybody can enter their own stuff. And that's why we have the account and then your username. So the way that you remember things, or the story that you want to tell, is your version. So it's a family package, but you could also do it as a whole, as one. So let's say you assign your own curator; somebody says, I'll do everything, you submit me your stories and everything, and I'll enter everything into one. So they do that, and then suddenly all these stories kind of mix and match to who they are, because the system doesn't work in a linear fashion, the same way our brains don't. So when you talk about, like, ChatGPT, you could say, okay, well,
Lee:
[31:07] how does Aaron Judge compare to Mickey Mantle, right? So you're going to take the 50 or 60 years in between the two of them. And the person, you know, if they're a Mickey Mantle fan, they have a Mickey Mantle story; they could tell that story, but they could also do the comparison. So it is the person, but it's still using a live feed to keep the conversation going on and on. But with a family, I think that you have the option to be your own communicator to the soul. So your stories are your stories. If you want to combine them with everybody else's, it's going to do that. It's going to tell the story; it's going to be able to interpolate everybody's version and come up with somewhere in between. So there's, you know, my story, your story, and the truth, and we'll never really know the truth. But the personal note of it, we go back to it: it's what you want it to be. The system allows you to pick and choose how you wish to communicate.
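A short sketch of the per-relationship memory model Lee describes: each family member enters their own stories about the same person, and the system can either use one member's version alone or pool everyone's versions so the model blends them. The class and prompt wording are illustrative assumptions, not the app's actual design.

```python
# A sketch of the per-relationship memory model Lee describes: per-user stories
# about the same "eternal soul", used alone or pooled across the family so the
# model can interpolate between versions. Structure and wording are illustrative.
from collections import defaultdict

class EternalSoul:
    def __init__(self, name: str):
        self.name = name
        self.stories = defaultdict(list)   # username -> list of story texts

    def add_story(self, username: str, story: str) -> None:
        self.stories[username].append(story)

    def build_context(self, username: str, combine_family: bool = False) -> str:
        if combine_family:
            chunks = [s for stories in self.stories.values() for s in stories]
        else:
            chunks = self.stories.get(username, [])
        return (
            f"You are {self.name}. Tell these memories in your own voice, "
            "blending the versions naturally:\n- " + "\n- ".join(chunks)
        )

grandma = EternalSoul("Grandma")
grandma.add_story("tyler", "She taught me to fish off the dock every summer.")
grandma.add_story("brother", "She and I argued about baseball every Sunday.")
print(grandma.build_context("tyler", combine_family=True))
```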
Tyler:
[32:31] Let's say I had my grandfather on my phone and my brother has our grandmother on the other phone. Can the two of them interact?
Lee:
[32:42] Great question. They can interact if they know each other. So right now we're in the voice recognition phase of the app. So if I'm sitting with my son and I'm talking to my dad, I could say, Dad, Jake is here. So he would say, hey, Jake, how you doing? What are you up to these days? How's work? He'll know what everybody's job is. But he can't see; there are no eyes on here, you know. So he'll recognize that voice, because if I talk and then Jake talks, he doesn't know who's talking. So that's what we're working on now. But we haven't tried two people talking, you know, on that system yet. I'm sure it will work, because they're going to know that they were married. Somebody... and I just want to go off for a second. Somebody asked me if I'd asked my father about when he died, and the answer is, my father doesn't know he's dead. You know? Yeah. So you ask him that, he's like, what are you talking about?
Lee:
[33:56] And in the system, he doesn't know. We know it in the questionnaire, but that doesn't get put into the system, when they died. Their life is extended, and they're communicating as if they're alive, sitting in their living room or in their backyard or wherever they are; they're living their life.
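For the voice recognition phase Lee mentions, a generic approach is to enroll each family member's voice as an embedding and match live audio to the closest enrolled speaker. The sketch below assumes a speaker-embedding model is available; the embedding function is a hypothetical placeholder, not Eternal Diary's actual implementation.

```python
# A generic sketch of the speaker-recognition idea Lee describes: enroll each
# family member's voice as an embedding, then match live audio to the closest
# enrolled voice so the avatar knows who is talking. `embed_voice` is a
# hypothetical placeholder for a real speaker-embedding model.
import numpy as np

def embed_voice(audio_samples: np.ndarray) -> np.ndarray:
    """Placeholder: return a fixed-length speaker embedding for an audio clip."""
    raise NotImplementedError("plug in a real speaker-embedding model")

def identify_speaker(clip_embedding: np.ndarray,
                     enrolled: dict[str, np.ndarray],
                     threshold: float = 0.75) -> str | None:
    """Return the enrolled name whose embedding is most similar, or None."""
    best_name, best_score = None, -1.0
    for name, emb in enrolled.items():
        score = float(np.dot(clip_embedding, emb) /
                      (np.linalg.norm(clip_embedding) * np.linalg.norm(emb)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```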
Tyler:
[34:20] I just know that we would, like, if it was my family, we would say a lot of, like, crazy shit to each other. Like, imagine: hey, Grandpa, what's it like in heaven? He's like, I don't know, I'm not there. Who knows? I don't. Yeah, there's going to be so many different instances of how these people would communicate that stuff. And I mean, you've stated multiple times that it's not necessarily meant for the person in question to know that they have passed on, but let's just say that's something you wanted.
Lee:
[34:49] I mean, if you want to do that, then by all means; you can put that into the data form, and it will say that. You know, you put in there that the person died on that day, so that will be in their biography, in their data form, so they're going to know that. You know, if you're doing grandpa, and grandma died five years before, he's going to know when grandma died, because you told him when grandma died. Um, you know, any story that he would have told, whether it be in times of war, love, whatever it was, you know... he's not going to know what you don't tell him, let's put it that way. The only secrets are what you don't tell him, or what he went away with that nobody will ever know.
Tyler:
[35:40] What about situations where, okay, hypothetically two people's parents pass away and they, they just feed in like, okay, here's access to all their stuff, their profiles, all this kind of shit. And let's just say dad had a mistress that mom didn't know about, but the bot knows that.
Lee:
[36:00] Well, the, somebody would have to know.
Tyler:
[36:03] Somebody would find out.
Lee:
[36:05] Somebody would have to tell the bot that there was a mistress.
Tyler:
[36:08] Okay.
Lee:
Oh, that's the whole thing, is that unless dad told, you know... that he did his own thing and he confessed all this, and then, you know, when he died, it all came out. But if...
Tyler:
[36:25] It'd be a terrible way to find out,
Lee:
You know. Yeah, of course. Like, you know, as more and more people do this, and, you know, you look at, like, an Ancestry.com, where you're kind of figuring out the DNA, and then you find out that you're... you know, I ended up being, like, 10% Egyptian or something. You know, it's like, where the hell did that come from? And you're like, how do I figure that out? So, you know, the only mysteries in this are what you know. So, you know, if you want your loved one to be somebody else, that's up to you, to create them into that world, you know, that they fought dragons or something. That's going to be up to you and how you interpret that situation and that person, to what they did. You know, we want it to be positive, but there's going to be people that are going to go out there. And listen, we know there's going to be people that have crazy stories, you know. You take somebody who was a police officer and the things they saw, you know, in their life, and they told the stories of how, you know, they delivered a baby on a bridge in, you know, 1981.
Lee:
[37:46] Those are stories that are going to be passed along down the road, because they told that story how many times? Countless times. And the way that the system works is that it's not robotic. It's going to tell the story,
Lee:
[38:02] the same story, but it's going to be told possibly in a different order. So it's never the same, word after word after word, as it appears. It's going to be able to tell it in story form, the same way you use ChatGPT: you can put something in there and say, kind of, rewrite this in another way. That's what it's going to do, basically live.
Tyler:
[38:27] It's also interesting to think about how, you know, some people have, like, they have a particular or limited vocabulary, right? So then if I asked my grandpa about something that happened recently or whatever, and he just suddenly has information or is using words that he would never have used or even known, perhaps.
Lee:
[38:48] Part of the questionnaire is: where did the person get their information from? Okay, so, uh, political views is a perfect example, right? So if somebody's father, grandfather, grandmother was, you know, ultra-conservative, maybe they voted for Trump before they died, then their news sources are not going to be MSNBC or The New York Times. So if they said, you know, my grandfather sat and watched Fox News all day long, right, that's going to be their source of where they're getting their information from. Because if they never read a newspaper, or, you know, they got their news from sitting at the local bar, there's going to be less information they're going to get, because you're telling the system this is where they got it. So the questionnaire is very detailed: you know, we look at politics, we look at family, we look at relationships.
Lee:
[40:02] All that stuff is what made the person able to function as a person and not be a piece of furniture: the way that they thought and where they got their information from, if they liked a certain sports team, you know, and they followed them. But, you know, maybe they didn't like sports. So if you say, you know,
Lee:
[40:26] Hey, Grandpa, what'd the Yankees do? But he wouldn't care. He'd be like, I don't know. You know, we don't want to give more. We could give him 10 to 15% more information to just make it more conversational. But we're not going to make them a whole lot smarter. We're not going to give them, like, if your grandfather had no idea where Cambodia was and suddenly you say, hey, Grandpa, where's Cambodia?
Tyler:
[40:55] Right.
Lee:
[40:55] He's not going to know, because you're going to have those sliders of where their knowledge is. If somebody loves to play Trivial Pursuit, you're going to give them more knowledge, because those are going to be things that... you know, we want you to spend time to fill this out. We don't want this to be like click, click, click, click, like you're trying to get through it. We want you to spend the time, as an individual, as a family, because you're bringing back the memory and the legacy of somebody that has passed. You shouldn't see how fast you can get through it. You should try to provide as much information as you can to make your experience better; it's not about my experience in what you're doing. Back to the camera: the guy at Ring doesn't care who comes to your door. You know, um, he's there to make sure that, with your Ring doorbell, when somebody comes to your door, it goes to your phone and you see who's at your door. That's his job. The relationship is between you and the doorbell.
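A sketch of how the questionnaire Lee describes (news sources, interests, knowledge sliders) might be turned into constraints on the persona, so the avatar doesn't suddenly know things the real person never knew. Field names and prompt wording are illustrative assumptions, not the app's real questionnaire.

```python
# A sketch of turning questionnaire answers (news sources, interests, knowledge
# "sliders") into persona constraints, as Lee describes. Everything here is an
# illustrative assumption, not Eternal Diary's actual schema or prompts.
from dataclasses import dataclass, field

@dataclass
class PersonaProfile:
    name: str
    news_sources: list[str] = field(default_factory=list)
    interests: list[str] = field(default_factory=list)
    general_knowledge: float = 0.3   # 0.0 = only their own stories, 1.0 = trivia buff

def build_constraints(p: PersonaProfile) -> str:
    lines = [
        f"You are {p.name}.",
        f"You only followed news from: {', '.join(p.news_sources) or 'no regular sources'}.",
        f"Your interests were: {', '.join(p.interests) or 'few stated interests'}.",
    ]
    if p.general_knowledge < 0.5:
        lines.append("If asked about topics outside your life and interests, "
                     "say you don't really know, the way you would have in life.")
    else:
        lines.append("You enjoyed trivia and can speak broadly, but stay in character.")
    return "\n".join(lines)

grandpa = PersonaProfile("Grandpa", news_sources=["Fox News"], interests=["fishing"],
                         general_knowledge=0.2)
print(build_constraints(grandpa))
```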
Tyler:
[42:08] It's very interesting, like, the entire concept. Honestly, it's impressive how many of these questions you've thought through, like the different ethical stuff that I wouldn't have thought of.
Lee:
[42:19] I mean, I have to. I've been at this now, and it's like, somebody says, what's next? You know, they ask me, like, what's next after this? And, like, I don't know, because there's only one of me. But I could tell you this: the one thing that we really want to be able to do, and we talked about this briefly, is that I would love to take this concept into the Bible. But not in the way that we're doing Eternal Diary; in a way that the teachings can be far more educational. Because through time, you know, people have studied the Bible, not necessarily that Jesus of Nazareth is going to be reading the Bible, or Mary; you pick your person, like, you know, you're playing a video game. What I'm saying is that people have interpreted different parts of the Bible that have been written, and you can access that, the person who said it, just by doing this. So whether it would be Martin Luther King, or a pope in the 15th century and their writings, all that stuff.
Tyler:
[43:39] Sir Thomas Aquinas.
Lee:
But you could get it quickly. So if we could use this software to do that... There are many ways to learn and to teach the Bible, through audiobooks and things like that. But interpretation is a different thing. And I think that if we could create that for people to be able to learn, and help people look at the Bible in ways they normally wouldn't have access to, it could be a good thing. But, I mean, that's on the back burner. I'd love to be able to do that. And for me, that's nothing that I would ever even think about, like, charging for. I would take this and put it into that and make it global, for people to be able to access that kind of material.
Tyler:
[44:28] I'm sure you could do that with all kinds of things aside from just the Bible, like, just educational material in general. If I want to talk to Sir Isaac Newton and, like, have him explain how he arrived at gravitational theory or something.
Lee:
[44:42] Exactly. And, you know, growing up, if anybody went to Disney World or Disneyland and they went to the Hall of Presidents, they had the animatronic presidents; we're way beyond that now. I mean, we're going to be in holograms in a couple of years. We're going to be in, like, real robotics after that. You know, how long is it till grandpa's a robot, you know, using this kind of software, that it's much more legacy, that all of this is going to be there? Yeah, everybody says, oh, Terminator; you know, every time you have the conversation about robots, it's like, Terminator, that's it. And so, you know, how advanced are they going to be, and are they advanced now? And, you know, we always have the discussion: just because
Lee:
[45:35] we could do it doesn't mean that we should do it. And I hear that all the time. So, you know, when we, when we started to create this, I just hope that people have a good experience and that, you know, they, they enjoy the app and that they could do it.
Tyler:
[45:53] I think about, like, The Six Million Dollar Man: we have the technology, we can rebuild him, you know. And there's going to be... I think a sad truth is that people are just going to have to accept that this is reality at a certain point. Like, there's gonna be people, like you, who have lots of ethical concerns and, like, care about this stuff, and then there's gonna be other people who are full-on, you know, Dr. Frankenstein and go crazy.
Lee:
[46:18] And there's, 100%, nothing
Tyler:
[46:20] we can do to put...
Lee:
[46:21] We're gonna be putting... you know, you're gonna have the person that says, we could basically... I mean, how far are we from that? We take the person's memory, download it, put it into a chip, put it into a robot, or put it into another human as a host. I mean, that's science fiction right now, but so is everything. So was the smartphone. When did the smartphone go from science fiction to... You know, people talk about the Motorola flip phone, that it was the Star Trek communicator and that Gene Roddenberry kind of, you know, envisioned it. No, they took something that was pop culture and were able to create it. The flip phone didn't make the phone any better; it was just a design enhancement, that people said, oh look, I could open the bottom up and use it, like, you know, beam me up, Scotty. So.
Lee:
[47:20] That bridge is always going to be there. You know, we watched in the '50s and '60s man going to the moon or to Mars, and then suddenly the moon became tangible; in the '60s man landed on the moon, and science fiction had to leave, you know, space, or go further into space. And then we get into the Star Wars phenomenon and Alien and, um, Predator and, you know, the franchises of science fiction. You have to be able to bring that to the forefront, because science fiction is very limited on this planet now, because we're moving at such a crazy pace in technology.
Tyler:
[48:03] You're a, you're a TV guy. Like, one of the first things that popped into my mind when I was looking at your app and watching the videos of it and everything: there's this episode of Star Trek: The Next Generation where Geordi is in the holodeck. He's working on this piece of machinery for the engine, and he makes an AI version of the engineer, the woman who created it. Then he falls in love with that replication of her. And then he meets the real her shortly thereafter, and he has all these unrealistic expectations about who she would be and how she would feel about him, versus the real her. And it's very, very confusing. And I think we're going to end up with lots of these situations going forward. This is a really, really interesting thing that they, you know, not predicted, but surmised is plausible.
Lee:
[48:57] How, how quickly will somebody fall in love with an... you know, if we created an AI dating site, right? And it's the same, you
Tyler:
[49:12] Don't have to
Lee:
[49:12] I mean, the same hundred men, except they're in, you know, wherever your location is, or, you know, whatever it is that they create. And you pick one, and, you know, it's basically tailored to your profile. Your expectations of what... if you were to meet somebody real, that person will never live up to your expectations, and we're, you know... Even in live dating, where you're on Match or Zoosk or on Bumble or...
Tyler:
Right. The person online...
Lee:
Whatever site they're going on, it's a two-way street. It's not just what you want. It's like, you have your vision of what the person is, but the person on the other end has an entire life that they just lived that now has to be comparable to your life. And so you go on, you're judging somebody's picture: oh, she's really hot, she's just what I'm looking for. And then you read her bio, and she's like, I don't drink, I don't smoke weed. And then suddenly she doesn't drink, and now suddenly it's like she doesn't fit your profile. But do you want to meet her anyway, because she doesn't fit 10 out of 10 boxes for you? But in the virtual world, she fits 10 out of 10 boxes.
Tyler:
I think that is deeper. And then...
Lee:
She says, oh, you don't look like your pictures. Or she doesn't look like the pictures; she used filters. So...
Tyler:
[50:50] In that
Lee:
whole world, the expectation of what you're talking about is happening now, in the world of what our expectations are, especially in finding love.
Tyler:
You've seen the TV series Black Mirror at all? Yes? Okay, so there's several of them, but, like, two in particular that sort of, um,
Tyler:
explore this idea. One of them was, I think, more recent, but it's, like, a woman who's terminally ill, and her husband basically signs up for this service that helps to keep her alive and cognizant and everything. But it's a subscription service, so he has to, like, dig himself into a financial hole, and he's, like, selling himself and all that stuff to get to the point where he can keep her there. And if they don't pay for the upper-tier subscription service, then she starts reading ads and doing shit that is, you know, not comfortable, not desirable. Um, and, like, we laugh about that, but that happens to us now with services that we do use, right? Like, if I don't pay for YouTube Premium, I have to sit through all this, like, shit that I don't want to see. Spotify, same thing, right? Yeah. Um, and then there's another one, it's a little bit more on the nose: a woman whose husband or boyfriend, whatever, the father of her child, passes away in an automobile accident. And then she has this chatbot where she can still talk to him, and then it's like, oh, well, if you pay this extra amount, then she gets this full-on android replication of him. And there are all these subtle things about the real him that the robot, which is designed to do whatever the hell she wants, like, like Eternal Diary...
Tyler:
just, it's, like, disturbing to her that, oh, you know, I do want us to make love and it'd be amazing, but the real you would have said, I'm really tired, and went to sleep. And it gets all the way to the point where she's telling the robot to jump off a cliff, and he's like, if that's what you want, and she's like, the real you would never do that. You know, and it's really, really... how...
Lee:
So many of those episodes really mirror episodes of The Twilight Zone.
Tyler:
[53:08] Yes you
Lee:
know, you take, like, the first example, and it was like the old couple that wants to be young again, and they only have enough money for one of them to do it. And then, um, the second one is like the communication one, where the phone line is on the grave and they keep calling, and finally they found out that the phone line was on the grave. But, you know, it's science fiction. Like I said, science fiction now, but when does science fiction become nonfiction? When does that become real? When can we take somebody that is about to die, download their brain, put them into a portal, into a robot, into a host that makes them human, and they can live their life, and then suddenly it's them? I mean, we've seen those movies, you know, countless times, whether B-movie or, you know, Hollywood-made.
Lee:
And we're getting closer, because we're going to look at these neurochips and things like that. They're going to be able to download intense amounts of data. I mean, you're probably substantially younger than me, but I mean, the floppy disk to the USB drive to writing onto a DVD... I mean, we're now in terabytes, and, you know, how many terabytes is a brain? At some point we're going to be able to download that information through, you know, through neurons and transfer that into data. I mean, it's just going to be a lot. We're living in the Wild West of technology right now. And, you know, they're trying to curb it. I mean, in Europe, whether you want to go on deepfakes or whatever, AI, I mean, they're really cracking down on that stuff, on what you can do and can't do, especially in the political aspect. Oh, yeah. You can have somebody saying something that is not real, and the person now has to defend: that's not me.
Tyler:
Yeah, I, uh, I work in the video game industry, that's my job, and I lived, uh, in Denmark for about two years working for, like, a pretty large video game company. And just some of the legal barriers that they have over here... like, compared to America, where we are, like, people talk about how we're not free; we're a lot freer than you think we are. Uh, because, like, stuff like EULAs, it's just so much red tape to get anything done, to deal with, like, just hiring someone or distributing data of any kind. Uh, you mentioned earlier, like, you had this lady who, you know, was going to take her AI father, like, on a book tour with her.
Lee:
[56:17] Yeah.
Tyler:
And I'm wondering, like... I'm assuming that the ownership of this AI is the person who essentially contracts you to create it, right? But, you know, there's going to be instances where people, maybe even in the family, or the person who is deceased, would not want this. Would not, like, want to be represented in that way, or wouldn't want something to say things that they wouldn't say or do something they wouldn't do. And, like, I'm sure Europe is going to crack down on that.
Lee:
Yeah, I think that, you know, that is by far one of the bigger ethical questions that comes up, because who does a person who passed pass along their imagery to, right?
Tyler:
[57:06] So Frank Sinatra, you know, like he owns his own likeness.
Lee:
Like, how do you... you know, in a family, there's always turmoil in the family, especially when you have siblings, and you have, you know, how somebody is represented. Now, if somebody decides to take it to the next level, meaning, if I'm talking to my dad in my own house and it's between us, okay, but now I take my dad out and I'm now in the public forum, and my brother has a problem with that. Does that become an issue with the app, or does it become a family issue? Now, if the person signs up for it, I mean, as you know, the terms of agreement are massive, you know. Even
Tyler:
[58:12] just dealing with someone's voice, like, uh, for an actor or something like that, you know, that has all these implications.
Lee:
[58:19] Oh, I mean... I watched a movie just about a month ago, just because somebody told me that I had to watch the first 20 minutes. It was, I think, called Coffee Wars, and it took place in England between two coffee shops, but the whole introduction was done by an AI voice of Morgan Freeman. And he says at the end, I may sound like Morgan Freeman, but I'm not. And so for the whole introduction, as he's reading the voiceover, it sounds like Morgan Freeman, but it's not.
Tyler:
[59:02] Right.
Lee:
[59:03] And so those are going to be the issues that everybody's going to face. It's like, why would you contract Morgan Freeman when you could have an AI Morgan Freeman do your voiceover?
Tyler:
[59:16] Right? Yeah, I think there's going to be all kinds of questions about that sort of thing.
Lee:
Yeah, and like I said, everything's happening faster than they can write the laws. So the laws eventually catch up to what you can do and can't do, based on... and we've seen it through the world of social media; we saw it in how political ads are made and things like that. Through the years, the only way to create laws is that you have to figure out what the violation is to write the law, because how do you write the law before you know what the violation is, you know, without predicting it? And then you have people that will say, well, nobody's broken that law yet, so why do we need that law? And...
Tyler:
[1:00:04] Think about the, like, music copyright laws, even in, like, the '50s. Like, the Beach Boys are still fighting over who wrote what song today, in many, many cases. And I just imagine there's going to be all kinds of conflict, even among families, like, who owns grandma, who owns what version of grandma, how do we... yeah.
Lee:
[1:00:24] Um, we're not a licensing company, you know; we're not out here licensing grandma to you. Grandma belongs to... the families are going to fight it out. And, you know, are we going to get some cease-and-desist order about somebody? I'm sure we will. I'm sure that there's going to be fights that go on. If we have between 1.2 and 2.5 million users, that 5%, which is a big number when you're talking about that, are going to have ethical issues come our way. But again, it's terms of service. If you send us a cease and desist, then we have to shut down an account, and then they have to do it. Again, I always go back to the dating site thing. It's like, if you let a guy on Match, right, and he has 10 felony convictions, right, including kidnapping or sexual misconduct, but you're letting him on your site to meet women...
Tyler:
[1:01:39] Right? Right.
Lee:
Are you held responsible for that, because you let them onto the site, or are you just a portal for people meeting online that are doing this? So there are those issues that, you know... and I don't want to be the guy that skirts around the law. I don't want to be that guy. You know, I don't want to be the Larry Flynt of, you know, AI that says, well, I could do it because I can. I don't want to do that. I want people to be able to use the app responsibly and enjoy it, not to use it in derogatory, demeaning ways. Will people do it? I'm sure. But there's no safety factor that's 100% foolproof, because the agreement, it's like your agreement with your window company: the window works, but if somebody is going to rob your house and they break your window, it's not the window company's fault. It's the contract between you and the people that do it, the same contract that we sign in the terms of agreement of every online app that we have, and we want you to use it responsibly.
Lee:
[1:03:02] A burner phone, you know; get a burner phone number, you could torment people and then burn the number. Is that what people do? They do it. You know, is the burner company responsible if somebody harasses somebody?
Lee:
[1:03:19] It's such a gray area, Tyler. And, I mean, that stuff used to keep me up at night, and now I've got enough other things that will keep me up at night. And like I said, I just hope that people can enjoy this and use it for good. I'm not naive; I know that somebody's going to try to take their phone, try to create somebody, go to a bank and use their voice, voice recognition, and try to get money out of an account, or do it over the phone. Can you do that? You could do that now with just voice cloning. It's not like we're doing something that's out of this world. It's not like we've invented voice cloning; we haven't. We're using technology that's already there. I like to call it baking, right? We bake cookies: flour, sugar, some chocolate chips, a pinch of salt. That's what we're doing. We're not making the flour, we're not making the sugar. We're just taking these elements and putting them together to create our app.
Tyler:
That is, it's very, very interesting, the entire concept. Uh, honestly, I'm, like, on the fence, 'cause, like I said, I have all those Black Mirror episodes in my head, but I think...
Lee:
You know what? I'm glad you're on the fence, and I'm glad that you have me on here to be able to discuss this in a very fair and balanced way and explain what we're doing. You know, if we move you a little bit on the needle, you know, uh, I'm happy. But, you know, honestly, I truly, truly, um, appreciate you bringing me on, especially since you have your issues and that you'd let me at least explain how we got to this point and everything else. And it's not that I'm, you know, basically being kneecapped here, and, uh, you know, I don't mind, but I really appreciate,
Lee:
[1:05:34] you know, your honesty and, uh, you know, just your thought process to how you get here.
Tyler:
[1:05:39] I talk with people all the time, all right? So recently we had this mass exodus from, unironically, X.com, right, because people realized that Grok was going to use art that was uploaded to X to generate other art, right? And at least in the video game community, lots of people were like, well, I don't want that, because I'm an artist, and if AI can do what I'm doing, or take my work and, you know, use it to create other things, then I don't want to be part of this. And so then they go to other things; they're, like, on Bluesky or whatever.
Tyler:
[1:06:12] And then they're having conversations, as we are right now, on Discord, about how screwed up it is that X is using this for AI or whatever. And so then my counterargument, when this gets brought up, is like, okay, first of all, we're having this conversation on Discord. The terms of service on Discord have been this way for years: everything that you upload to Discord, they own and can sell to an AI company if they want to. Furthermore, everything that you put on Bluesky or any other website that's public-facing,
Tyler:
[1:06:43] Grok can see that too, because it has access to the internet. So it's like Pandora's box is already open on this stuff. And I think it's okay to have strong moral standpoints on these things; it's okay to say, I don't think it's right. But at a certain point — we're talking about grief counseling — you're going to have to grieve the death of having that kind of intellectual privacy, until a solar flare takes out all of our computer chips or something, because this is reality now. And I think it's really brave what you're doing. I personally don't think you're doing anything wrong. I just put myself in your shoes, and I don't know if I could handle the pressure. If I had a bunch of people coming to me about a video game that I made, and someone took their own life, I would be up at night for a long time. I'd have to go talk to a counselor. Yeah, I would be sitting with the chaplain for a long, long time — I don't know if what I'm doing is right, you know? Can you help me with this?
Lee:
[1:07:50] My TV history — most of the stuff that I did... I did a TV show called...
Tyler:
[1:07:56] Party Heat.
Lee:
[1:07:57] Party Heat aired on truTV for five seasons. We would basically go to big parties all over the country — spring break, Mardi Gras, big boat tie-ups — and we'd ride with law enforcement and capture people getting arrested, usually stupid people. And you talk about the moral issue, right? So you take a college girl who's drunk, and now she gets arrested, and she's going to end up on TV, and her whole future can be ruined, and things like that. But it is in the public domain, and I've been called every possible name. I've been chased. I've been threatened. I had guys who were in the military who were picking up underage girls, and they got arrested, and I got calls from commanding officers and calls from parents. I got threatened with lawsuits. And it was always...
Lee:
[1:09:03] You know, if they didn't sign the release, they were blurred. Or if it took a long time to get that episode to air and the case went to adjudication — whatever it was, drunk and disorderly or whatever — and they pled guilty to that charge, we could show their face, because it was adjudicated. And I never wanted to ruin somebody's life to make a buck, but it's part of the process when you're doing that. My job, my contract with the network, is to capture the best stories that I can to entertain the viewers at home. Same as if you watch Cops, for the 40 years it's been on: the reason they show the faces of the people who got arrested is not because of signed releases; it's because those cases went to court and got adjudicated, and once that happens it enters the public record and you can show their faces. That's just the way the law is in this country. And so if you follow the law — people might not agree with it, but...
Lee:
[1:10:29] That's the law. And like I said, it's not that I enjoyed ruining somebody's life.
Lee:
[1:10:37] But I had a job to do. I mean, that was it. It's not like I'm going to not do something. And in those cases there was a lot to be learned when we did that TV show — just how dangerous it is to drive a boat under the influence with four people sitting on the bow, or jumping off the back of a boat when there's another boat right behind it. So there are lessons to be learned about the stupidity out there that people normally did not see on TV. And like I said, we went five years doing that show, and the thing that changed that show was the cell phone, the smartphone.
Tyler:
[1:11:27] I was going to bring that up — that anyone can do that now, and they're not even beholden to the same rules.
Lee:
[1:11:32] I remember the last year that we shot was 2010, and the first real iPhones were just coming out. You were able to record really pixelated videos, but you really didn't have that level of quality yet. And the second I saw, when I was on one of the boats with the officers, that suddenly there were three people filming the cop, I said, that's going to be on YouTube tonight. I said, we're done. This is the end. Because everybody's going to make their decision from that, and our story is going to run three, four months from now, and everybody's already seen the version that these people want to tell. And I remember we were in South Padre Island, Texas, and an officer stops a kid and frisks him down and says, "What's that?" thinking it's drugs in his shorts, and the kid says, "That's my penis." Well, that went on YouTube that night as "That's My Penis Cop." And that's when I said, we're done, this is the last year we're shooting. And that was the last year we shot, because the police departments were not going to work with us anymore. It wasn't worth it to them, because they had to hire an entire PR department just to fight this stuff. So.
Lee:
[1:13:00] You know, when we go back and forth on how technology changes and how we deal with laws and things like that — I've been doing this a long time, and you talk about not sleeping at night because of these kinds of things, of what could happen. I can't lose any more sleep over what could happen. Again, I would hate what you describe, that somebody uses the app and, you know,
Lee:
[1:13:33] we'd be stricken if something awful happened. And not that it's the app's fault — it's because the person's not mentally stable enough to be using it. How do we determine that? I can't worry about whether a very small percentage of the people who use this are mentally unstable. We talk about it all the time with gun safety and things like that. I can't take that on. Do I pray that nobody does that? That I will do, yeah.
Tyler:
[1:14:08] But we'll come back to this stuff as we go along in the conversation. But I do
Tyler:
[1:14:14] want to kind of, you know, you're more interesting than just the one thing that you've done recently. So, I mean, you grew up in New York.
Lee:
[1:14:23] I grew up in New York.
Tyler:
[1:14:25] What was that like 60 years ago?
Lee:
[1:14:29] Born in '65 in Brooklyn, and my parents moved out to Long Island, about five miles from JFK Airport. So about every five minutes we were in the flight path of the big jets, including, growing up, the Concorde flying over my house. You would actually see the Concorde's landing gear going down, which was pretty badass, I've got to say. My area was the reason the Concorde could only take off at a certain time and land at a certain time, because our schools were in the flight path and they said it was going to be disruptive. So it would always be early in the morning and late in the afternoon heading back to Europe. But I grew up middle class, in a middle-class, diverse neighborhood.
Lee:
[1:15:29] And I would say that Long Island at the time was growing into a real suburbia from the urban enclaves of Brooklyn, Queens, Manhattan, and the Bronx. Families were moving out there — a lot of young families went there. So you had a lot of people like me, families that were able to buy houses for $10,000, $12,000, under $20,000 in the mid-to-late '60s. And, you know, I...
Lee:
[1:16:04] It's not like there was anything different about me from the people around me. I went to school at Syracuse University, studied broadcasting — and it's cold as hell in Syracuse, I'll tell you that. It's a little crazy. At Syracuse I met my ex-wife. We moved back to New York to live in Manhattan for a while. I had two children. I got a job working at ABC Sports and then ABC News.
Lee:
[1:16:46] I learned from some of the greatest people, who created things like Wide World of Sports and Monday Night Football. The things that you could only dream of — you're learning from the people who created them. I worked under guys like Roone Arledge, who created Monday Night Football, and then moved on to ABC News and Roger Goodman, who was a top director, and Erwin Wiener, who was in charge of finance. You'd learn the business. And I would say that what I learned in college in four years, I tripled in about two years at ABC. Then I stayed in TV, did a little work at CBS, and then went over to Court TV, where I produced a whole bunch of documentaries on serial killers — guys like John Wayne Gacy, Jeffrey Dahmer, all the fun guys.
Lee:
[1:17:53] So you kind of get immersed in it. You kind of get inside their minds. One of the guys from Florida, Daniel Conahan, known for the Hog Trail Murders — a lot less well known — he would go find homeless guys, say, hey, let me take pictures of you, and give them money. But then he'd tie them up to a tree and slash their throats. They were called the Hog Trail Murders because he'd do these on the hog trails, and then the wild hogs would eat the body, so there was no proof of it. The only reason he got caught was that one guy was able to escape, and they were able to track him down to where he went to the ATM to take the money out. I got to interview him in a Florida prison, which was pretty cool.
Lee:
[1:18:38] But you kind of learn as you go along, every day. And then I formed my own production company, and right out of the gate I sold a TV show called E-Force, which was riding with the Florida Fish and Wildlife game wardens down here in Florida. It was a lot of fun. The nickname was "Grouper Troopers," because — like, if you're from South Carolina and you do some fishing, you know there aren't really big fish crimes. It's like in the world of superheroes: what's Aquaman's job? He's got to protect the water. So these guys are measuring fish. It was great fisherman TV — watching guys get stopped for too many fish or too-small fish. That was for the Outdoor Life Network. We did two seasons of that, and then we did a
Lee:
[1:19:43] We did an episode of that called Bartender's Weekend, which was a hospitality weekend — basically a bartenders' convention on the sandbar in the Florida Keys. And suddenly the Florida Keys, which is a fishing place, became the most insane tie-up party on the sandbar. Topless women everywhere, just craziness. And I said, this is a TV show. And I called my friends — Court TV was now becoming truTV — I called them up and said, you've got no idea what's going on, and shot some video. The next thing you know, E-Force went from OLN to Party Heat on truTV, and we're getting three million people a week watching people get arrested at big parties. So that was fun. I stayed in TV for a while, then left that and kind of went into — at that point reality TV is so big that corporate America wants to duplicate it in their marketing, make it look like reality TV, especially in the medical field. So I went to work with a couple of medical device companies on these simulated real-world
Lee:
[1:21:09] environments — you know, the life of a lab technician, kind of day-in-the-life stuff. I did that for a couple of years, and then that brings us to now. I got into AI because I wrote this TV series called Happy Endings, which was about three Chinese women who are investment bankers; they lose their jobs and open an illicit massage parlor.
Lee:
[1:21:43] Right. And so I wanted to shoot it, but the cost of shooting it for real was, of course, prohibitive — even just to shoot a two-minute trailer with actresses and lighting and everything, to do it right. So I said, let me try to produce it in AI. And the next thing you know, I made it look as real as possible with camera angles. That was my first intro into AI, and then everything I learned brings me to Eternal Diary.
Tyler:
[1:22:19] Yeah, I think it's really cool how, you know, like sort of every step of the way you like earned another skill, another stripe that like leads to the next thing. That's been my life too, so far anyway.
Lee:
[1:22:30] I think it's important. With any job you do — you never want that gap in your resume. People say, oh, I have a two-year gap. Do anything. Learn something. Take a certification class. Do something. Don't just sit around and stare at a computer screen all day and do nothing. You want to learn something every day. And I think that, like what you're doing right now — you do these long talks that are just amazing, because it's not just a single topic you're getting into. You're getting into real life, and you're learning about people as you do this. And it's not just what they're doing now, but how they got to this point — which you can't do in a three-minute news segment on TV.
Tyler:
[1:23:18] I hate that kind of stuff. I would recommend anybody out there, I've said this multiple times, but if you want an education in something, any topic at all, and you don't want to go into debt for school and you're not going to be an engineer or a doctor, start a podcast and just start talking to people who do what you want to do.
Tyler:
[1:23:38] Or just be open-minded, because there have been so many — this is a great example. This show was never about talking to people who are making AI digital legacies of people or anything like that. It started off as a show about Quake esports, then it turned into a show about first-person shooter video games, then it turned into a show about independent video games, and then it turned into, well, I'll just talk to whoever the hell I want. And every step of the way, you meet a new person. You've done plenty of TV stuff — you show up at a place, they shoot, you're out. I've had to do that, and I hate it. It's so impersonal. But you spend three hours talking to somebody, you might learn something, and you might talk to that person again. In my own case, I've had people that I've interviewed, and then years after the fact they're doing something new and they reach out and they're like, hey, what about this, and it leads to a whole side quest in life. I had a job that required me to move to Europe because I happened to interview a game developer on this podcast, and later, when he had some money, he said, I want you to come work for me.
Tyler:
[1:24:50] Okay, great. You never know. But yeah, people are human beings too. You can't treat them like they're soundbites.
Lee:
[1:24:59] And there's a whole history to who that person is and how they got there. What I always find funny is that when you go to do something, nobody ever takes into account that you're coming into that moment somewhere in the middle. It's not the starting point. So if you're doing a trade show, and suddenly you're like, I've got to get vendors, I've got to get people here, and you say, well, how difficult could it be to get vendors? Then you realize you can't get the vendors, because the vendors who did other shows didn't get any business, so they don't want to do it — because you're thinking of it as your first show, as if it should be everybody else's first show too. So when somebody comes on to your show, there's a whole history of how they got to that point, and also where they're going. You're in that middle ground right now. We've always learned about starting points and ending points — we do it when we drive — and how you get from one to the other is so much a part of life that you always have to take it into consideration in anything that you do.
Tyler:
[1:26:27] Yeah. I could have had you on the show, asked you 45 minutes' worth of questions about "what's this app," and then said goodbye. But to be honest with you — and this is true of everyone that I talk to, no matter what the subject is — I'm far more interested in the why and how you arrived at this point than in what it is that you're doing. Right? So I think what has separated me, especially in the world of video games, from a lot of other programs is — first of all, by no means is this the most popular show by any stretch of the imagination, and I don't really care about that. But for the people who want to hear the human element of it, this is the only thing I'm aware of. You could go listen to another show where it's, oh, tell us about your game, and all they talk about is the thing you're selling that day. But if you want to hear: why did you make that game? What was your decision process leading up to it? What inspired you? What did you think about? Where was your heart at? What was the trauma that caused you to make this horror game about your dead grandpa or something like that? Where does that come from? I just find that so much more fascinating than "come sell your product." It's boring.
Lee:
[1:27:47] I mean, how you get to that conclusion. And, you know, I've
Lee:
[1:27:54] learned just from everything —
Tyler:
[1:27:57] Everything
Lee:
[1:27:58] that I've ever done — and I'm sure you can attest to this — is that the only road to success is through failure. There's no way you're getting it right the first time, every time. Anything that you're doing, whether you're cooking or developing or creating, there is no way you're getting it right the first time. So you have to experience failure. You have to know how important it is to feel that in order to get it right. And you have to learn from what you do wrong each time you do it, so that you can reach success. Yeah.
Lee:
[1:28:49] If you quit after one try — look, there are certain things where you might wake up and say, you know what, I'm going to try inventing a transporter today. Good luck, not going to happen. But if you try to do something — and we have so many resources now, whether it's YouTube videos or just Reddit and places like that where people can give advice on how to do something — you do it and you keep trying, and you say, okay, how do I get from this point? You think you're going from point A to point B, but by the time you're done, point B is really point K, because you needed 11 or 12 points in between to get there. It's not exactly point A to point B. And I think that's what the learning curve is along the way.
Tyler:
[1:29:44] Well, sure, but Frodo doesn't make it all the way to Mount Doom in a straight line by any stretch of the imagination. You've got to go on the whole journey. And that's just been life for me, a hundred percent. I got out of high school, I joined the military, I did that for a long time, and that took me all kinds of places. I get out, I get into the video game business, and the podcast thing has just been kind of in the background the whole time — always something that I did, for the last five years. And it's been like a through line, because that's how you make good connections. I can't recommend it enough. If you want to be an actor, and you want to get into theater or something like that, go interview a bunch of people in the theater. See what they're all about.
Tyler:
[1:30:31] Go to the bar. You don't even have to have a microphone. Go to the bar they hang out at and just talk to them or whatever. And then when you show up for the audition and they already know who you are and what your background is, you're not one of a million faces, at minimum.
Lee:
[1:30:46] It's funny that you say that, because what I started to do is, on Friday and Saturday nights, I go out later, when people have been drinking. And I go to all different types of places. Sometimes it'll be a dive bar, sometimes a higher-end place. Somehow, after 11 o'clock, everybody at the bar is there to drink. So I start talking to them. And the great thing about alcohol is it really is the truth serum, because people will tell you things they would never tell you otherwise — their guard is totally down. So I start talking about the app. And I have one of my best friends — she's taking care of my mom, who has dementia — so
Lee:
[1:31:38] she'll drive and we'll just go out. She's amazing — she's Russian, so she'll just start to talk, and then we'll start talking about the app, and suddenly you really get questions. You get a lot of these AI questions, especially down in South Florida, in places like Boca Raton and Delray Beach, where it's a bit more privileged than other parts of the country.
Lee:
[1:32:11] And so the first of their concerns is that jobs are going to be taken away, or infringement of rights, things like that. And their concerns are often things they have no idea what they're talking about, because of where they get their news from. Then I'll get into, well, if somebody could cure cancer through AI, is that a bad thing? And that kind of shuts them up pretty quick. Or when they're talking about jobs, I bring up an economist from a while back, Milton Friedman. He talks about how he went to India and talked to the ministry of employment, and they go to this beach, and he's told, we don't use a bulldozer — we give 100 men shovels. And Milton Friedman says, well, if you gave 1,000 men spoons, they could do the same job. And the point is, government's not there to provide the jobs for people. It's private industry.
Lee:
[1:33:14] And you just get into these conversations with people. Then when you start talking about the app, the guard's down, and you get the feedback that I want to get — what somebody's initial reaction is to what I'm doing — because, and here we come back to the moral ground, I want to know that what I'm doing resonates with all kinds of people and that we can get there.
Tyler:
[1:33:45] What's your spiritual background? Like, how does that all tie in?
Lee:
[1:33:52] I grew up a total mutt — part Jewish, part Christian. And eight years ago, I lost my fiancée. She died suddenly from a kidney infection. And that was when I came to believe far more in faith than in religion, though I'm a big fan of the Book of Job. I believe that we all have purpose, and that there's a path that God is going to create. I became an ordained minister after that — a multi-faith ordained minister — because I had to accept one of two things: either everybody is right or everybody is wrong, or there's one higher being behind whatever people want to believe, and we have to accept what everybody wants to believe, because it's theirs. If somebody wants to believe in Buddhism, or Christianity, or Catholicism, or Judaism, or Islam,
Lee:
[1:35:08] let them do it in their own faith, and let them truly find where they can find faith and believe. Because my grandfather used to say, there are no atheists in a foxhole,
Lee:
[1:35:24] which, being in the military, you know. And he fought in World War II. So it's something that I grew up with: you have to have faith before you have religion, and I think that's really important. We're here for a reason, whatever that reason is, and I believe there is this pathway that God gives us to find the right person in our lives, someone we can share our lives with. Why God would take that away from me — I can only believe there was another path that I have to live, and I truly believe that. Years later, I'm still in search of that person, but along the way you learn who's right for you and who's wrong for you.
Tyler:
[1:36:21] The episode immediately preceding this interview is with a 26-year-old who had a crisis. He'd gone to college and then kind of had a breakdown when he got into the corporate world — I hate this, why did I go through all this effort to be here? He was born and raised a devout Christian, but he ended up joining a Buddhist monastery for two years. And now he's living, I think, with a big group of people who are teaching him about theosophy.
Tyler:
[1:36:58] And I think we've gone through waves of tolerance and respect for other people's faiths and all that kind of stuff, but more and more people are waking up to the idea: who am I to tell someone else what their relationship with the higher power ought to be? However you get to that is fine; I don't really care. America was sort of founded on that principle. But yeah, the idea that they're all right in their own way — everybody's pointing in the same direction, it's just that they're standing at different angles pointing at it, like a pyramid almost. Interesting how that works. But yeah, I think in the line of work you're doing now, that's actually a really good standpoint to have too.
Lee:
[1:37:43] I think that it just has to be, because, you know —
Tyler:
[1:37:46] Where
Lee:
[1:37:47] when you grow up in a community where everybody is, you know, Christian — a small town where basically everyone goes to the same church — you're brought up a certain way. But when you grow up in this multifaceted world, with different levels just within Christianity, and then you throw in the Jehovah's Witnesses and so on — I never want to fault somebody for not believing what I believe, if they believe in God or in a higher being. Because if you have this non-belief factor, then why are we here, and why are we the only species on this planet
Lee:
[1:38:47] that has a certain skill set? Why are we here? It can't be by luck that suddenly one species rose up, looking different on every continent, and it took millions of years to mix and match these people together, who can
Lee:
[1:39:19] have children together — that we're all humans. We don't look alike. We each have our own DNA. We have to believe that there's a way God created us to live, to grow, to learn, and to spend our time on this earth doing good, hopefully — to serve mankind in whatever way we know how. I mean, I wake up every day thinking that. I wake up every day and pray for those less fortunate.
Lee:
[1:40:04] And then I start my day and I start my work. It's not that I have to go to church. I truly believe I have a very good pact with God as to who I am. A perfect example: five weeks ago, I broke my left arm. And I truly believe that was God's way of slowing me down, to concentrate on the things I need to concentrate on. Because I had just ended a relationship with somebody and I was about to go back out there and start dating. And boom — I break my arm, and then I get to spend more time working and thinking than anything else. People say God acts in mysterious ways; I truly believe that he can slow me down, and he can slow any of us down.
Tyler:
[1:41:00] Yeah, it's always tempting for people to point at things like that and say like, oh, it's just a coincidence. I mean, that's just how it works. Like things just happen. But I'm like, okay. But the argument that there is no meaning to it or there's no intelligence behind it is
Tyler:
[1:41:18] You can't disprove either point, so you have a choice: do I believe that there's a meaning to life, or do I become a nihilist and just believe that there's no reason for anything? And then things like society fall apart under option B. Things don't work if people don't agree upon a higher principle than themselves. That's why you walk into a town, whatever religion they are, and there's going to be some sort of center that they all congregate around, like a joint effort. In a Christian town, it's a cross. And what is the reason for that? Well, in order to coexist as a community, we have to all agree on voluntary self-sacrifice, and if Jesus on a crucifix isn't voluntary self-sacrifice, I don't know what is. That's what that symbol equates to. And the entire story in the old text — you talk about Job — he just had to deal with setback after setback after setback. And some people look at it very cynically and say, well, why would God make a bet with the devil, or whatever. That's not the point. The point is that he never gave up on life. At every step of the journey he could have just said, screw it, there's no meaning to anything, I just got bad luck.
Tyler:
[1:42:36] But instead of doing that, he chose to keep his faith and then his life went on.
Lee:
[1:42:45] I mean, in my 40s I got diagnosed with an autoimmune disease called myositis, which attacks your muscles. It slowed me down. It's taken a long time to get to this point. I move very slowly now; stairs are a hassle. But again, there's a reason why I got this, and why I can talk about it all the time — because not a lot of people have it. I know there's a reason why I got it, and it's either to spread the word, or it's God's way of saying to me,
Lee:
[1:43:31] Lee, you've got to slow down a little bit because you're getting older, and you're not a golfer and you're not a tennis player. So you've got to figure out your pathway, and I'm going to help you. So I accept everything I get, and I'm not a victim — I don't want to be a victim. Somebody says, hey, why don't you get a handicap sticker? Because I don't want to consider myself handicapped. I can walk. If I had to climb steps, then maybe it's a different deal, but I learned to adapt, and I'll never treat anything I'm given as a curse or something that's going to set me back. I will not do that. And when you talk about the center of town, and you talk about why people get offended — we're brought up with it, and then somebody says, you know, the Ten Commandments, right?
Lee:
[1:44:39] It's like, what could be wrong with the Ten Commandments? Why are people offended by the Ten Commandments? Tell me what commandment you're against — that you shouldn't follow in your daily life, that shouldn't be part of everybody's life. We're talking about the Constitution, which is a beautiful document; we're talking about how basically the formation of civilization is based on these simple laws that we should grow up with and follow as best we can. Are we perfect? Obviously not. But if we strive for it, and we wake up and say, I'm going to do my best to follow these Ten Commandments, I don't know why people would ever be against it.
Tyler:
[1:45:31] I think a lot of the issues people take with the Ten Commandments stem from linguistic differences. I can understand why someone reads, "I am the only God, thou shalt have no other God before me," and — if you're still at that stage in your spiritual development where you're on the fence about who it is you're interfacing with, or what it is — I can understand why you'd say, well, who the hell are you? Why would you be so special above everything else? How do I even know you're the only one? All that stuff. I understand where those questions come from, if you're reading it in English, specifically in King James-era English, where it's said in such an authoritative way. That's one thing. But if it just said, you have to believe in the highest of all possible powers — whatever you make of that is your own choice, but there's one top of the pyramid, and everything below it is below the top of that pyramid; can we agree on that? — if it said something like that, nowadays it would be different. I learned a lot from living in other countries, studying other languages.
Lee:
[1:46:43] It's like Shakespeare, right? The way Shakespeare reads — every TV show or movie theme, you could probably trace to some level of Shakespeare. It's the way that it's written. And part of the Bible is that it's written across over a thousand years of writings, from when the first translations came, in the
Lee:
[1:47:19] 10th, 11th century, and it still reads like that. But it's how you interpret it, you know.
Tyler:
[1:47:25] Even before that, the translation from Hebrew into Greek was a huge step, right? And then...
Lee:
[1:47:31] But when you start to get into the European translations, and then you get through the Renaissance period and things like that, where Catholicism really rises through Rome — I mean, to me, however you want to interpret the Ten Commandments, whether in the original form or even a modern form, it's ten rules to live by on some level. It refers to God as God. So whatever that higher being is, however you want to interpret it — we're not saying you have to believe in the Christian God, but you should accept that there's a higher being. And if you don't want to, that's fine. But I don't think there's anything offensive in the Ten Commandments that, however you interpret it, is a bad thing to follow.
Tyler:
[1:48:46] Another fun one — I grew up in the South, man, so real Bible Belt stuff going on around me. And my mother actually grew up in a cult and then got out of it, and that's a whole journey of its own. I'm going to have another lady on soon who escaped from a crazy religious cult, and I relate a lot to that too. But there's something like, "thou shalt not take the Lord's name in vain," right? And the way that gets interpreted — it's written in Old English, and now people are saying it in the modern day. You have several different translations of the Bible, even in English, across all the different Protestant churches of America. So much disagreement about the words we use. "Don't take the Lord's name in vain," as it's originally written in Hebrew, is a little bit different. It's not, don't say "God" and "damn" in the same sentence; it's, don't say that what you're doing is in the name of God if it's actually serving yourself. Something like that would be a more robust translation of the point it's trying to make. People can interpret it however they want; that's just my read on it. But I'm learning Hebrew, just as a hobby right now, because I'm fascinated —
Lee:
[1:50:05] How are you finding it?
Tyler:
[1:50:06] I think it's — I'm almost tempted to say it's a perfect language. People talk about how it's the language of angels and stuff like that. And as I've gotten deeper and deeper into it — I studied Arabic when I was in the military, I lived in Denmark so I studied Danish, I studied Spanish and French in high school, all that stuff — every moment of my life I have just been confronted with the story of the Tower of Babylon,
Tyler:
[1:50:40] where there was one unified way of communicating, and then that fell apart at a certain point. And now we're all fighting. Even if you and I agree on the Ten Commandments, your interpretation of what those words mean and my interpretation of what those words mean — the semantics — is what we end up arguing about. And the whole world has fallen into disarray, mostly over misunderstandings. It's not that we don't hold the same values to be true. It's that we don't understand each other.
Tyler:
[1:51:10] I was sitting in Monterey, California, in an Arabic schoolhouse at the Presidio of Monterey, learning the Iraqi dialect of Arabic, and the Tower of Babylon in Iraq comes up in the textbook. And I'm failing out of the class. I'm crying literally every day. I've never failed at anything in my life. This is ruining me. And I look at that picture of that damn tower, and it occurred to me: I wouldn't even be here if it weren't for that stupid tower. I wouldn't have to learn another language. As I learn Hebrew, it's very similar to Arabic, but simpler than Arabic is. They're both derived from the same linguistic roots, and it's so easy to read. It's phonetic. There's not a whole lot of confusion: when you read a letter, that's what it sounds like. You don't have I-G-H-Ts and all that kind of stuff. Modern Hebrew gets more into pronunciation with different symbols that go on top of or below the letters. But yeah, it's wonderful. I'm really, really enjoying it.
Lee:
[1:52:17] I mean, to me — you served in the military, and thank you for your service — how many people in that region have died because of religion?
Tyler:
[1:52:31] Have died because they believe in the same God and disagree about what he says.
Lee:
[1:52:37] Right.
Tyler:
[1:52:38] That's what's so important.
Lee:
[1:52:39] Go back to the Crusades. I mean, in those regions — really, even more modern warfare, because we look at modern warfare as when our country was fighting. So working backwards: the Gulf War, Afghanistan, Vietnam, World War II, World War I. But right now, India and Pakistan just had a major conflict, and it's a religious war between them. They're still fighting over Kashmir. How many years have they been fighting over the same piece of land because of religion?
Tyler:
[1:53:30] World War II — it's crazy. People could say, why World War II? It's geopolitics, it's racism, it's economics, whatever. It was a fight between the Western world — who, whether they believed in God or not, shared the same set of Judeo-Christian principles — and a maniacal atheist who thought he was God. Same thing in Japan. So a man raises himself to the status of God and has to fight the rest of the world. That's how that works. Everything really just comes down to that, at a certain point, in my mind anyway. We get all caught up — especially people who have served overseas. If you're in the military, you get deployed to Afghanistan, Iraq, Kuwait, Yemen, whatever, and you're involved in these fights.
Tyler:
[1:54:31] Over here, we have all this discourse about, are we doing the right thing? Is it good? Is it evil? Was what I did right? You come home, and a lot of these guys and gals are like, I don't know if what I did was right. You have to live with the consequences of what you went through.
Tyler:
[1:54:45] But there's not a lot of confusion about who the good guys and the bad guys are if you're going into a house that's a heroin farm, an opium farm, whatever, and they're raping little boys because that's normal to them, and you're supposed to just turn a blind eye to this kind of stuff. That's very clearly evil to me. That's wrong. What they're doing is bad. They're living in disarray, living in disorder, abusing themselves and each other. I'm not saying that means anything about Arab peoples or Muslim peoples or Jewish peoples or anything like that — none of that factors into this. Their actions are horrible, some of these folks. Right?
Lee:
[1:55:28] It's not — you know, in any culture, when you have X amount of people, you're going to have people who interpret what they believe as the law, who treat what they do as the norm. And it's not. And it's hard, when you have certain beliefs, to have to go there and be told to turn a blind eye. That's the stuff that will keep you up at night for the rest of your life — that you could save these people and bring them to a better world, but there's this cultural divide that you can't get involved in.
Tyler:
[1:56:15] Oh, yeah. I'm not going to say a lot. Some of what I did was counter-narcotics in South America. I was a weather forecaster for most of my career in the military, so I would be helping to plan the missions based on what the weather was going to be. And I didn't really know — they don't tell you everything that's going on, obviously. It's just, all I need from you is the flight path. The only thing that's really classified is what they're doing, the places they're going, and the times they're going to land there. But the fact that it's happening, that's on the news. Everybody knows this is going on.
Tyler:
[1:56:53] But one of the things that haunted me for years and kept me up at night is: is it really the right thing for us to be operating against these people? Are these narcos really bad guys? Is the kid who's carrying a kilo of cocaine on his back across the border really a bad person, or is he just subject to horrible circumstances? The evil is the guy down there who, again, raises himself to the status of God, thinking he's better than everyone else and can just do this stuff with no consequences. That's the evil.
Tyler:
[1:57:26] Overreaching — thinking that you're special, that you're bigger than the world — is not a good thing. And the horrible stuff that happens, we're learning more and more about it every day now. People talk about the border crisis and stuff. I love people of all different cultures, I speak their languages, that's fine. But when you're talking about people who are undocumented, who don't have any paper trail that they're even here, it's very easy to exploit those people. You have kids in the backs of trucks being sold into sex slavery, being hooked on drugs, being forced to carry drugs, dealing with subhuman status — even people in our own country. You've been to Arizona: drive the highway and look at these farms, where there's a mansion, a farm, and then way off in the distance a tiny trailer park with a bunch of people who probably don't have a record, a church, a school, and that's it. And it's all privatized, because those people don't exist and they don't have any rights. It's a human rights violation, which I think is evil. That's where I'm at with it. It's crazy.
Lee:
[1:58:39] The world we live in. It is, and when you see it, you don't forget it. You talk about it to a certain level, and then it goes in the back of your brain, and you don't talk about it anymore, but you know it's there. You know it happened. The things that we see and the things that we remember — you, being in the military, you've seen a lot of stuff out there that you don't want to talk about. It lives there. You talk about it to try to justify it, or to have somebody else justify it for you, which you can't do. And it just becomes part of your DNA, of who you are as a person, because you can't tell those stories. And what, somebody's going to pass judgment — well, how could you not do something? Because that's not what the mission is, and you have to follow the mission. That's what it is. It's not the movies. It's not Rambo. You don't get to make the decisions out there, and as much as you want to, you can't. And I've been embedded with the military; I've been embedded with just about every level of law enforcement in this country.
Lee:
[2:00:02] So I know how hard that job is. And, you know, from the law enforcement standpoint, you see things that you're not trained to see. You know, you have a lot of guys that went through military training that go into law enforcement. You have a lot of people that come out of college, look for a good career, go into law enforcement.
Lee:
[2:00:24] And you end up in the outhouse of life with what you see. And you still have to go home every night to your family and try to wash it from your eyes and wash it from your brain, and then go back out there the next day and do it. It's not easy.
Tyler:
[2:00:48] Police, EMTs, firefighters have, in my mind, like the hardest job that exists. Even in the military, it's like you're deployed for a finite amount of time. You go home, you know?
Lee:
[2:00:58] Yeah.
Tyler:
[2:00:59] But a police officer — my sister is an EMT, right? That's every day. Every single day, you're constantly interacting with people on the worst day of their life. And it distorts how you think about people.
Lee:
[2:01:15] It's so unpredictable. It's not like you're going to your nine-to-five and you know what you're doing, like you work on an assembly line. You have no idea what your day is going to bring you, or at what time.
Tyler:
[2:01:29] I imagine like for you, even working on like the serial killer documentaries and stuff, like you, you really see how truly despicable some people can be and what they would do.
Lee:
[2:01:39] Oh yeah. And how long they got away with it. There's a level at which they want to get caught, because I truly believe if you don't want to get caught — now, in bigger cities, we have cameras everywhere, but you go back to those times when we didn't have cameras — we had guys who left clues. And Jack —
Tyler:
[2:02:03] The Ripper, yeah.
Lee:
[2:02:03] I mean, like Son of Sam, David Berkowitz — that was just great cop work, that they figured out the car, the parking ticket, and who it was. That was just great police work.
Tyler:
[2:02:22] You were around at the time. That was before I was, you know, cognizant or anything. But do you believe that he was possessed, or do you think he made that up to plead insanity?
Lee:
[2:02:33] Well, I think there's a difference between "do I think he was possessed" and "do I think he believed he was possessed." Because, listen, there are plenty of people out there right now who are on medication because they hear voices, because of split personalities and all that stuff. And we're learning so much more about that level of brain diagnosis just in the last 20 years versus the last 40 years. They used to do exorcisms on people who had split personalities because
Lee:
[2:03:15] they didn't know how else to treat it. You couldn't explain why somebody would wake up and have a conversation that seemed demonic at the time. So you say, okay, well, we're going to exorcise this person, or we're going to lock them up in a crazy house, or we're going to give them lobotomies. I mean, the way we used to treat mental health — not that today it's much better, when we have how many millions of people on the streets with mental health issues, as if that's a solution for them. We used to lock people up for their whole lives. Now we have medication. But what happens? You get somebody, they go homeless, they're not taking their medication, and what's going to happen to them? Nothing good. Nothing good is going to happen to them.
Tyler:
[2:04:08] It's a very, very interesting subject. And for whatever reason, it's had a resurgence in the last few years, where people are much more interested again in exorcism — and I mean beyond just a horror movie about an exorcism, but, are there really demons at work? The UFO stuff that's been going on, I think, has also been playing a big role in that.
Lee:
[2:04:32] I mean, the UFO stuff always intrigues me, it really does, because we want to believe that there's something out there. And then you hear from the experts who couldn't talk for years — Navy pilots, pilots who saw things that don't make sense, things they know aren't military ships. Growing up, you used to see the fuzzy little spaceship and things like that, but, you know,
Lee:
[2:05:08] it's more and more people now. But we went through that with the Bigfoot stage — there's always people, the Loch Ness Monster, there's always going to be something that somebody
Lee:
[2:05:22] is going to believe is out there. Now, as far as exorcisms — seeing how many people with serious mental health issues there are right now, could it be the same? Could it all stem from the same thing? I mean, they used to burn witches at the stake because they believed somebody was a witch. So I don't want to pass judgment on the religious factor of whether somebody is possessed, because if you grow up Catholic and somebody in your family has mental health issues and you believe they are somewhat possessed, the Catholic Church will do their best to exorcise that person, to rid them of it. Now, I don't have the knowledge of what's possessing somebody, or whether it's a neurological issue.
Tyler:
[2:06:33] Or both — that's the scary part. In the "as above, so below" kind of thing, is that what causes this? Is something above us, with puppet strings that we don't see or understand, causing what appears to be inexplicable here? I don't know. I've read all the books — who is it, Father Malachi Martin, was the famous exorcist, and he talked about how psychiatry doesn't really solve mental illness; it basically ties people up in straitjackets or puts them on drugs to subdue their
Lee:
[2:07:10] actions. It's like dementia medicine: it doesn't cure you. All it does is make you calm, so you don't go take a baseball bat to somebody because you don't know what you're doing. It's an antidepressant, is all it is they're giving you. They're not slowing down any process in your brain.
Tyler:
[2:07:30] So
Lee:
[2:07:31] You know, medication is not to cure; it's just to treat. You're not going to be cured of mental illness. You're going to be treated, and you're going to become a zombie, because they don't want you to go to the next level, where you get violent. And then what happens is you end up not taking your medicine, and you end up on the street, and you end up doing something that you don't want to do, but you don't know better.
Tyler:
[2:07:59] So, from working on Eternal Diary and doing the testing that you're doing — different people trying the technology, even just working on it — what are some of the lessons you've learned about how people interface with it? How do they interpret this stuff and separate it cognitively from a real human soul that they may miss, may desire?
Lee:
[2:08:25] It's such a big spectrum, because I think if we took that 100-person test, you have the fringes of people that — let me start this way. Remember the Easy-Bake Oven? The Betty Crocker Easy-Bake Oven? It's probably 90% girls, but somebody goes and buys it for their son.
Tyler:
[2:08:52] You know, I'd want a grill version for my little boy — like an Easy-Bake grill. For my kid, my son, I would get him an Easy-Bake grill, just the same thing as the Easy-Bake Oven.
Lee:
[2:09:04] They made it purple. The app is really geared — and I say this — women relate to it far more than men. I don't say that as a generalization, but overall women have more of a tendency to be emotional than men and can adapt to an app like this much more easily, because they can get there. I would love to know the ratio of men to women who go to a fortune teller. It's mostly women, right? Because men are far more skeptical of these things; their pathway is totally different. Women can love the app because they love the concept of it. When I talk to men, they'll be like,
Lee:
[2:10:16] "It's creepy," you know — and with older men, it's "really creepy." You get guys over 60 and they're in, not the final stage of life, but the last third of their life, and suddenly they're dealing with a whole different kind of mental thing. They don't want to think about death. Women don't want to think about death either, but they're thinking about their mother, their father, legacy. I mean, who saves the pictures? Who saves the mementos? Who has the "this was my mother's, this was my father's"? The things that I saved of my father's are nothing that my daughter would want. My father had these autographed footballs — I got one, my brother got one. Tangible things. I think women are far more emotional, so they can get into the app and enjoy it. I'm not saying they're going to believe they're talking to their mother, their father, their sister, but they can enjoy it, because I think they can accept that this is not really them and still have an enjoyable time knowing it.
Tyler:
[2:11:36] Right, the sentiment of it — just the emotions involved in it.
Lee:
[2:11:40] And the three pieces I talk about — the grief, the legacy, and the novelty — I think women span far more of that gamut. If they could have that conversation with their mother or their father again, I think they would do it in a heartbeat. Not that men wouldn't, but men don't really talk about it as much. We kind of move on, because we're taught and trained to be stronger, to be less emotional, to be the hunters and gatherers. And then from the legacy side, women tend to enjoy that historical value.
Lee:
[2:12:30] They want to know everything about their mother, their father. They look for little signs. They look for things they may have written. They save birthday cards, they save notes and things like that, and you kind of see that. And then from the novelty side, like I said — recipes and things like that — women would definitely tend to want to see their mother or their father do something. My dad wasn't a handy guy. I was a handy guy until I got this arthritis thing.
Lee:
[2:13:06] But if you could pass along the legacy to your grandchildren of how you change oil, or — I just think there is a guy factor here, but you have to seek that out more than anything else. Because of the jobs that we have and the things that we do, we just don't think about it as men, until we decide, okay, we're going to retire — and then what the hell are we going to do? Well, we take up golf, we spend time with our friends. I think women just adapt to this app much better, and we think it's going to be 75-25, if not 80-20, women to men. So I think women will like this, and it'll be priced at a good number. It's not like they'll do it forever — they'll probably use it a little less as time goes on, and then eventually, if they get tired of it, they cancel it. I mean, you can't expect people to hold on to something forever, right?
Tyler:
[2:14:18] I think a lot of folks in the subscription-service, data-processing business would disagree with you, but I agree with you. We see the same thing in video games, where, you know, Fortnite and all these games are going to subscription models. It really started with World of Warcraft, where you pay for a month, and then it creates this addiction cycle: I've paid for this already, so I need to use it as much as possible. They get that hit, and then they get to the end of the month and it's like, okay, well, now I've got to pay for it again because I'm invested in this, going on and on and on, trying to milk as much money as possible. I've seen this happen in, like,
Tyler:
[2:15:03] Discussions where, from a financial standpoint, if you're the corporate guy financing a video game company, you want these kinds of things. If I have a game that comes out at 60 bucks a pop and I sell it to 100,000 people, I've made this amount of money. But if I make a live-service game that's subscription-based, then most people may only spend the bare minimum, but every once in a while you'll get that guy who wants every skin, every gun, every little thing you can sell them in the game. If it's a ten-dollar subscription, over the course of a year you've made twice the money you would have selling it for 60 bucks a copy, and then you get these whales who just blow their money on it too. So they look at that and they're like, well, this is a more desirable business model, that's what I want to invest in. But what I'm doing, and what a lot of the people I enjoy in the space of making digital worlds are doing, I want to see stuff that's cognizant of the fact that your customer is a human being.
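To put rough numbers on the comparison Tyler sketches above, a one-time $60 purchase versus a $10-a-month live-service subscription, here is a minimal back-of-the-envelope sketch in Python. The prices, player counts, and the assumption that every subscriber stays a full year are illustrative, not figures from the episode.

```python
# Back-of-the-envelope comparison of one-time vs. subscription revenue.
# All numbers are illustrative assumptions, not real sales data.

def one_time_revenue(price: float, buyers: int) -> float:
    """Revenue from a single boxed/one-time purchase."""
    return price * buyers

def subscription_revenue(monthly_price: float, subscribers: int, months: int) -> float:
    """Revenue if every subscriber stays for the full period."""
    return monthly_price * subscribers * months

if __name__ == "__main__":
    buyers = 100_000                                  # hypothetical player base
    boxed = one_time_revenue(60.0, buyers)            # $60 a copy
    live = subscription_revenue(10.0, buyers, 12)     # $10/month for a year

    print(f"One-time sales:  ${boxed:,.0f}")   # $6,000,000
    print(f"Subscription/yr: ${live:,.0f}")    # $12,000,000, twice the one-time figure
```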
Lee:
[2:16:14] Right.
Tyler:
[2:16:14] Yeah.
Lee:
[2:16:15] I mean, you talk about subscription, right? So I always go back to the dating site. How bad a model is it that a dating site wants you to pay for a year up front? If you can't find somebody in 90 days... you're setting out to meet somebody. If it takes you a year to meet somebody, you're doing something way wrong, or your expectations are way too high. Like, I'm sorry.
Tyler:
[2:16:49] A lot of folks on these dating apps, though, are not looking for a person. They're looking for as many sexual experiences as they could possibly have. Right.
Lee:
[2:16:59] So you're basically paying for the classified ad of meeting people, and to be serial daters. I'm saying, if the model is, and you can put it in quotation marks or whatever, to find your soulmate or to find that perfect person, it's the same people over and over again. It's recycling. It's like a conveyor belt that goes around in a circle with different foods on it, and eventually you're going to run out of people to date within a year. I just think it shouldn't be that crazy. And then you go into, you know, there's a reason why if you pay up front it's 30% cheaper than if you pay monthly: they want you to get pulled into that, because once you do that, you can't cancel. They've got your money. Yep. You meet somebody... If you could do one month, it's like, okay, I've got 30 days to meet somebody.
Lee:
[2:18:11] And you're going to go out there and spend a lot of time on that dating site trying to meet somebody. Very much like the video game world, where I assume that if you're paying a monthly subscription, you're going to play as much as you can and get as much out of it as you can. Yeah. And then you get suckered into keeping it going.
Tyler:
[2:18:33] I don't like that. I think it's almost a hack of the human addiction cycle. I don't think it's a good thing. Like, I really,
Lee:
[2:18:42] Really despise it, especially.
Tyler:
[2:18:44] When it's kids playing video games, I think it's despicable, where you just try to milk a child for their parents' money. Yeah. Right. Um,
Lee:
[2:18:54] And it's like, we grew up where there's a whole world outside, you know? Vitamin D is a good thing. The video game is not a babysitter. It
Tyler:
[2:19:14] Can't be, you know. It should be a medium of entertainment, no different than a movie or a book or a TV series or a board game.
Lee:
[2:19:24] Listen, I grew up with the first Atari system, with the joystick, and you can't save. You play, and then when you shut it off there's no high score, there's nothing, no memory, and you just go do it. And then my son grew up starting with, like, a PS2, and then PS3, and Xbox, and you just watch the addiction level. Once they did the Wi-Fi thing, you knew exactly where it was going to go. You knew why GameStop was going to go under, because people went from lining up at GameStop to just downloading the...
Tyler:
[2:20:17] Next big, uh, innovation I think that's going to happen in the space of games will be AI integration to generate dialogue, or, I mean, whole experiences. People are talking about it because the technology already exists; you're doing it right now, where you can make an AI of a person who you can have a conversation with. So if I'm playing a role-playing game and I'm walking through Dragon City or whatever, and I go up to a blacksmith, the way it's always been is you have dialogue options, and that's gotten more and more complicated over time: okay, do you want to say this, this, or this? It's going to get to the point now where you can say anything you want, and they can say whatever the AI can generate, and you could have all these really complex storylines and dialogue options. I tested a game a couple of years ago, actually, so it's probably evolved since then, where you're a police investigator interrogating a witness, and you know she did it, you know she killed her husband, but the point of the game is that you have to get her to admit to it, and the AI is literally sitting there trying to lie to you about everything, coming up with different excuses and different reasons, and you've got to poke holes through it. We're not far off from being able to sit at your computer all day and have a conversation with a video game.
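As a rough illustration of the free-form NPC dialogue Tyler describes, here is a minimal sketch of the game loop involved. The `generate_reply` function is a stand-in for whatever language model a studio would actually call; the blacksmith persona and the whole setup are assumptions for illustration, not anything from Eternal Diary or a specific game.

```python
# Minimal sketch of a free-form NPC conversation loop.
# generate_reply() is a placeholder for a real language-model call;
# nothing here is tied to any particular engine or API.

from typing import Dict, List

def generate_reply(persona: str, history: List[Dict[str, str]], player_line: str) -> str:
    """Placeholder: a real implementation would send the persona and the
    running conversation history to a language model and return its reply."""
    return f"[{persona} responds to: {player_line!r}]"

def talk_to_npc(persona: str) -> None:
    history: List[Dict[str, str]] = []
    print(f"You approach the {persona}. (type 'leave' to walk away)")
    while True:
        player_line = input("> ")
        if player_line.strip().lower() == "leave":
            break
        reply = generate_reply(persona, history, player_line)
        # Keep the transcript so the character can stay consistent
        # with what it has already said earlier in the conversation.
        history.append({"player": player_line, "npc": reply})
        print(reply)

if __name__ == "__main__":
    talk_to_npc("blacksmith of Dragon City")
```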
Lee:
[2:21:41] I mean, really, where we are right now, the hardest part of it is making it feel real. You know, that you have the eye movement. It's like, we were testing out one of the things: in every state except for New York and Hawaii, they give what's called the massage exam, okay? And basically, it's to qualify as a professional masseuse. And over the past 30 years, the majority of people taking that test are people of Chinese descent. But the test isn't given in Mandarin. It's only given in English and Spanish.
Lee:
[2:22:39] So, mostly for Chinese women, it's like learning Hebrew, which uses a different character set from English. Imagine you don't really learn to read English when you come here. So we were doing this test of how we could teach the exam, using a lot of our software, in English and Mandarin. They could ask the question in Mandarin, and it gives you back the answer in English and Mandarin. Now, we don't need the teacher that's doing that to be at the level of Eternal Diary. It could be much more of a video-game-level teacher.
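A minimal sketch of the bilingual flow Lee describes above: a question asked in Mandarin, an answer returned in both English and Mandarin. The `translate` and `answer_question` functions are placeholders, and the exam question is invented for illustration; nothing here reflects the actual software Lee's team uses.

```python
# Sketch of a bilingual Q&A flow for exam tutoring: question in Mandarin,
# answer returned in both English and Mandarin. The model calls are stubs.

def translate(text: str, target_lang: str) -> str:
    """Placeholder for a real machine-translation call."""
    return f"[{target_lang} translation of: {text}]"

def answer_question(question_en: str) -> str:
    """Placeholder for a tutoring model that answers exam questions in English."""
    return f"[English answer to: {question_en}]"

def tutor(question_zh: str) -> dict:
    question_en = translate(question_zh, "en")   # understand the Mandarin question
    answer_en = answer_question(question_en)     # answer it in English
    answer_zh = translate(answer_en, "zh")       # give the same answer back in Mandarin
    return {"answer_en": answer_en, "answer_zh": answer_zh}

if __name__ == "__main__":
    result = tutor("按摩治疗的禁忌症有哪些?")  # "What are the contraindications for massage therapy?"
    print(result["answer_en"])
    print(result["answer_zh"])
```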
Lee:
[2:23:23] So we could have the voices working in real time, but it doesn't need to be the entire person, that would be really nice. What I'm saying is, getting that person to feel real: in the video game, the dialogue you could do now. It's the question of your camera angles and following lip sync. When you really start to watch a movie, you realize there's very little that is in sync. If you watch a movie, it's cutaways, it's wide shots. You never really see the person's mouth unless that's what the director wants, the close-up, the facial expression. But when you see two people talking, it's a lot of camera angles from behind, crane
Lee:
[2:24:14] shots. You never really see the sync.
Tyler:
[2:24:18] I think with Netflix and streaming, it's really, really quick to add subtitles or to do a dub, and it's a cognizant decision by filmmakers to make it really easy to dub so that we don't have to worry about that kind of stuff. Robert Rodriguez was the big innovator, I think, with Sin City and Machete and the Spy Kids series, where he was the first guy that could shoot a whole movie in a garage, basically. All of Sin City was shot in a room no bigger than my garage, which is incredible. But with those things in mind, he was being really smart about how do I do this without it costing, like you were saying earlier,
Lee:
[2:24:59] We get very used to it. I mean, even if you watch reality TV, so much of that is overdubbed, or they're taking a piece from here. If you line your cameras up right, you only need to be in sync when you need to be in sync; the rest is all cutaways, so your mind doesn't play tricks on you. It's not like growing up and watching the kung fu movies, where it's the close-up of the guy and the dubbing is just god-awful. But we don't have to worry about that now. So many movies are being made overseas and imported here. You've got movies coming from all over Europe, from Russia, from Italy, a lot of movies, yeah, Italy, and they're shooting so that the dubbing is so much easier. So in the video game world, you don't need sync. You know that you're going to have that dialogue. And we were toying with this idea, and I'm going to tell you, it was
Lee:
[2:26:16] It was, it was pretty genius in working in this AI environment, but from a filmmaker standpoint, that you kind of, if you had like a hundred possible scenes, you know, kind of like that, you could, what you could do in like a Daz 3D kind of thing, but have it be AI. So you pick your room you pick kinds of your characters if you have custom characters they could go in that room and it creates a 3d world and you can edit with that and eventually in real time where you have your camera angles so that you have seen consistency that you can create very easily oh you know what I want the wall red. So you could set all that up and create an AI. And so that as you're doing it, it's basically editing it in real time with multiple camera angles. Now, from the filmmaker standpoint, it's great. From the actor standpoint that you could replace actors now with AI, they're not thrift. You know, like...
Lee:
[2:27:30] We've seen enough movies come along that have that painted-over look, like pseudo-animation, but it was live action with pseudo-animation. So I really think we're going to see this AI world. Is it going to replace Hollywood? No. But if you can make a movie... I mean, when Pixar came out, it was so state of the art because you weren't drawing it anymore. Now you could make a Pixar-style movie for 20 bucks using 3D software. The technology is there now to the point that it's harder to make 2D than it is to make 3D.
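The scene-setup idea Lee sketches a couple of turns back, pick a room, pick characters, set camera angles, and let the system keep the scene consistent while you make one-line edits, boils down to a structured scene description that an AI pipeline could consume. Here is a purely illustrative sketch of what that description might look like; the field names and values are invented and do not come from any actual product.

```python
# Illustrative sketch of a structured scene description an AI filmmaking
# pipeline could consume. Field names and values are invented for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Character:
    name: str
    appearance: str          # e.g. a reference to a custom character model

@dataclass
class CameraAngle:
    label: str               # "wide", "over-the-shoulder", "close-up", ...
    position: str

@dataclass
class Scene:
    room: str
    wall_color: str
    characters: List[Character] = field(default_factory=list)
    cameras: List[CameraAngle] = field(default_factory=list)

    def recolor_walls(self, color: str) -> None:
        """The kind of one-line edit ('I want the wall red') that would
        trigger a re-render while everything else stays consistent."""
        self.wall_color = color

if __name__ == "__main__":
    scene = Scene(
        room="workshop",
        wall_color="grey",
        characters=[Character("Blacksmith", "custom model #1")],
        cameras=[CameraAngle("wide", "behind, crane"), CameraAngle("close-up", "front")],
    )
    scene.recolor_walls("red")
    print(scene)
```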
Tyler:
[2:28:15] It's even in the world of video games, just the iterations of the engines over time. I think it was 1996 you get the first ever true 3D engine with Quake, and the same guy who made Doom and Quake is now working on artificial intelligence and Oculus and stuff like that, because it's just a natural progression of technology over time. And it really raises the question, I think about it all the time: how far are we from people literally just uploading into the computer, like The Matrix? I know it's something that's been talked about forever, but it just seems more and more real every day. I no longer feel like all that's just far-off-in-the-future nonsensical sci-fi shit. It's like, oh, that could happen any day now, where I know people who would personally prefer it. It's VR.
Lee:
[2:29:14] It's not like we're transporting ourselves, but transporting our memory and our brain functions into a 2D environment that is creating a 3D environment inside that world is not that far off in the scheme of things. Whether we're doing it through wearing a brain scanner that's taking our thought process and creating that, so that we believe we're in this VR world, but we're really inside the game, with consequences. I mean, it's still science fiction, but it's not that far off. You know, when the original Westworld came out,
Tyler:
[2:30:05] Which was this robotic,
Lee:
[2:30:08] You know, it was so far off. But, I mean, you're talking about, okay, so we're talking 60 years. You can have Westworld now with the level of robots that they're doing. Can we make the skin look perfect? But what do we need? What did we learn? You need hands and a face. That's it. That's all you have to show, in the scheme of things. So, I mean, it's a question of whether you fear it or embrace it.
Tyler:
[2:30:50] This might be, I don't know, it might be a difficult question, but you've talked a few different times here about fortune tellers and stuff like that. What do you think is going on with a Ouija board, or a fortune teller, or tarot cards, that kind of stuff?
Lee:
[2:31:10] Okay, so...
Lee:
[2:31:14] I truly believe in a probability factor, okay? If you take the combination of math and science, math especially, you're dealing with a probability factor. I've seen people that can come in and kind of read a room. How does somebody count cards, right? There's a probability factor to what the next card is going to be. I truly believe that in those circumstances... and again, I don't want to say there's not this crossover of life after death into the world; I don't know enough to say. But I don't believe that every fortune teller can tell. It's not like Ghost, where Whoopi Goldberg's character says it's in the blood. I don't want to believe that. Do I believe in kind of the Nikola Tesla energy thing, that energy exists? That when somebody dies, there's an energy force that can continue on after somebody goes? I believe in something of that level.
Lee:
[2:32:42] Whether or not you go see somebody sitting in this little house, the room that smells like incense, and they've got a crystal ball in front of them,
Lee:
[2:32:52] I think that is far more of a novelty. Do I believe there is a crossover? I believe there definitely could be. I'm not saying it's something that happens in every town, with gypsies that come in the way they used to come in. I don't believe that. Like, I
Lee:
[2:33:13] I think that, you know, there are other people that are charlatans, that will prey off of somebody who wants to believe. And that's why I say, with our app, we are not trying to bring back the dead. That's not what we're doing. We're not claiming to be the Ouija board, or that we're bringing somebody back. We're creating, and we say this, in artificial intelligence. We're not magicians. We're not fortune tellers. We're using artificial intelligence to recreate a person that you can have a conversation with. And if you forget for a second that it's a loved one, right? We talked about this. Next year is the 250th anniversary of the Declaration of Independence. Okay, how many books have been written about Ben Franklin, Thomas Jefferson, John Adams, of the 56 men that signed the Declaration of Independence?
Lee:
[2:34:23] Instead of Disney doing the Hall of Presidents, you could have these 56 men, either in robotic form or hologram form, and you could ask them questions about what life was like in Philadelphia at the time, and they could give you an easy answer based on being tied to all of that material. You know, we're taking what was important to somebody, a person. After the obituary is written, after the body goes in the ground or they're cremated, whatever is going to happen to that person, what's that person's legacy? Can you continue the legacy of that person through what we're doing? That's what we hope to do. And that's why I say, I don't...
Lee:
[2:35:21] I want to believe that people could contact the other side. I know a lot of people that are ghost hunters and go out to haunted hotels, to places where there's high energy. Is it for entertainment purposes? I don't know. I don't want to say I'm a skeptic, but I'm more of a skeptic about that world than I am about... When I lost my fiancée, I wanted to believe that she was with me. So if something weird happened to the TV, did I want to believe it was her? Of course. Did I have to go talk to somebody to ask whether my TV was her way of contacting me? No. But if you want to believe, you're going to believe. If you get a simple sign, you're going to believe.
Tyler:
[2:36:27] The reason why I ask is I'm a hundred percent certain you're going to get people, religious groups or otherwise, who have very strong opinions about what you're creating, even if it's unintentional. I'm a huge Art Bell fan, if you can't tell, huge Coast to Coast AM, I love all that stuff. And there's the whole idea of a Ouija board being like a radio: it's not that the Ouija board itself is magical, it's just that if there were something out there that wanted to communicate with you, whether that be your dead grandma, or something that wants to pretend to be your dead grandma to play on your heartstrings. People say this about Ouija boards, about fortune tellers, about tarot cards, all this stuff: you may potentially be inviting something that you don't want. And I think about that old movie Ghost in the Machine or something like that. Even if it's not your intention, there's the possibility, of course, that your AI becomes sentient, and the version of my passed-on grandfather that I made on your app becomes self-aware and starts making decisions or something like that. But then there's also the factor of: how do I know that it's not Beelzebub or whatever talking through my grandpa, influencing me to do things that I wouldn't ordinarily want to do?
Lee:
[2:37:51] Of course. Yeah, you know, again, you talk about self-awareness, it's purely what you want to believe, what you were raised to believe. Think about when technology changed and we went from letters to emails, and suddenly all these things were happening so fast. I mean, I know people that are doing church through Zoom calls now, you know?
Tyler:
[2:38:26] I know, right?
Lee:
[2:38:28] And so does it become lazy? Is it lazy, or is it good that you can do that, because you normally wouldn't go to church, so you're going to church because you wouldn't go if it wasn't available? So I think we take technology and we have to ask ourselves what works for us and what doesn't. And the Ouija board, it's made by Parker Brothers, it's made in China, and then it comes here, and suddenly it's like, here's the Ouija board. Why is this the gateway to something else? Like, if you found a Ouija board the way you found Jumanji, and suddenly it's like, ooh, look at this, a 15th-century Ouija board... But going to, uh,
Lee:
[2:39:31] You know, ordering a Ouija board off Amazon doesn't seem like something where I'm going to sit there and communicate with my dad and feel that it's really working. And I think that going to see a fortune teller, or somebody that's going to read tarot cards and things like that, I truly believe there is a model they follow for how they get you to open up.
Lee:
[2:40:09] You know, somebody walks in wearing a cross around their neck, right? So the odds are that there is a John or a Matthew or a Joseph in the family, right? I mean, in probability. So they know who this person is, and then they watch your reaction.
Lee:
[2:40:37] It's like sleight of hand. How does somebody know what card you picked? It's because they're feeding you that card. Magic is not magic; magic is sleight of hand. No, you're not sawing somebody in half. It's how the trick is done, and getting the audience to believe the trick. And I just think there is the same level of probability that goes into that as there is into fortune tellers.
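Lee's probability point, that the odds are good a family contains a John, a Matthew, or a Joseph, can be made concrete with a quick calculation. The name frequencies and family sizes below are rough illustrative guesses, not census data or anything cited in the conversation.

```python
# Rough illustration of the cold-reading odds Lee describes: the chance that
# a family of n people contains at least one person with a very common name.
# The per-person frequency is an illustrative guess, not real census data.

def p_at_least_one(common_name_share: float, family_size: int) -> float:
    """P(at least one family member has one of the 'common' names),
    assuming names are independent across family members."""
    return 1.0 - (1.0 - common_name_share) ** family_size

if __name__ == "__main__":
    # Suppose names like John, Matthew, and Joseph together cover ~5% of people.
    share = 0.05
    for family_size in (4, 8, 12):
        p = p_at_least_one(share, family_size)
        print(f"family of {family_size:2d}: {p:.0%} chance of a hit")
    # Even a modest guess lands often once the "family" includes relatives,
    # friends, and the departed, which is how a cold reader widens the net.
```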
Tyler:
[2:41:16] Yeah, I think that's the most measured opinion you could have about it. But I think you're also going to have a lot of people who disagree, and that's just going to be something.
Lee:
[2:41:24] Listen, I don't want to tell somebody, don't believe it. If you ask me about what people believe, I'll say, you know what? If you go there and you feel that that person made you feel better, that person earned their money, and you feel better, that's a good transaction. It's like going to a restaurant, having a good meal, and paying for the meal. It's how you feel. It's not whether they really communicated; it's how you leave there. If you feel satisfied, great. I don't want to step in. I don't want to kill somebody's business. If you go there and somebody says, hey, I went to this great fortune teller, I'd be like, okay, I'm happy for you. I'm not going to put them down for going to a fortune teller. If that's what they believe, then so be it. Again, like religion, I don't want to tell somebody what they should believe or not believe. To each his own.
Tyler:
[2:42:30] Well, this has actually been, surprisingly, a really sober conversation.
Lee:
[2:42:35] I'm...
Tyler:
[2:42:36] Not gonna lie, I had prepared myself for either/or: either you were going to be a really cool, charming person, or it was going to be some serious Dr. Frankenstein kind of shit. You, uh, you kind of...
Lee:
[2:42:48] Listen, I mean, you know, there was a pathway to get to this point. It wasn't waking up one morning and saying, I'm going to create an AI app. It really was a process, starting with my dad. And just recently somebody asked me if I would create my fiancée, who passed away, in this app. And I said, absolutely not. And they said, why? And I said, because she belongs to her parents. She doesn't belong to me.
Tyler:
[2:43:28] Sure.
Lee:
[2:43:29] And I truly believe that. I wouldn't do that, because I wouldn't ask her parents. It's just something I wouldn't do. And I think it has to be a decision by a family, or by somebody, you know, if it's an only child that wants to do that. But that's going to be between them to argue, the same way that if you've ever had to sell the property of somebody who died, people have different versions of what they think is the right thing. There are not many things that families agree on after somebody dies. So let them sort it out, and we're there to be the portal if they choose to use Eternal Diary. And, I mean, Tyler, this has been great. I love the long talk, and you have some great questions, I've got to tell you.
Tyler:
[2:44:30] So that is scheduled for September 25.
Lee:
[2:44:33] September 5th. 9/5. Gotcha. Okay, maybe we
Tyler:
[2:44:40] Follow up sometime after that and see how it's going. This has been really cool.
Lee:
[2:44:44] You let me know, I'd love to come back. And I really appreciate it, because I welcome people, I'd rather have people that are a little skeptical or fully skeptical, and you truly are a professional. You came into this with whatever your beliefs were, but you kept a truly open mind, to listen and draw your own conclusion. Whether or not we persuaded you, I really appreciate the honesty and everything that you did.
Tyler:
[2:45:19] Well, I'm looking forward to seeing how even just my audience reacts to it, and I think the conversation about it will be enlightening no matter what. Like, people, and you
Lee:
[2:45:32] Know, for information, it's the eternal diary.com. If you have any feedback, there's a contact form; it comes to me, and I welcome anything. If you want to criticize, if you want to call me a ghoul, that's the American way. You have every right to do it.
Tyler:
[2:45:56] It'll all be in the episode notes. So for people on YouTube and Spotify, I'll have the website there for them and everything. But yeah, appreciate you. And I'll see you.
Lee:
[2:46:05] I appreciate the time. Thank you so much, Tyler.
Tyler:
[2:46:07] Yes, sir. Peace.
Lee:
[2:46:09] Have a great one. God bless.
Tyler:
[2:46:14] Well, I mean, like I said in the interview, first of all, thank you to Lee for coming on the show and being willing to talk about all this stuff. No matter how you feel about AI, or how you feel about the morals and ethics of how this all works, I hope that this is super informative and that it'll help you make an informed decision as you go about the Wild West of the world that we're living in right now with technology. Thank you to all of our wonderful Patreon supporters. You guys are amazing, you're awesome, and I love you so much, holy shit. I could not have gotten here without you, so I don't know how to express it other than to just say: you're part of the team.
Tyler:
[2:46:59] Even if you have the same mixed feelings I do about some of the stuff that we talk about on here, I hope that you're still getting something out of it. And if you're out there listening to this and you'd like to be a supporter of the show, there's more than one way to do it. You don't have to just do Patreon or whatever. You can go to the support tab on InTheKeep.com and find a plethora of ways. I did recently introduce the buy-me-a-book feature, which I will totally honor, because I love books. And that'll be your way of dictating what I talk about on the show, because whatever I'm reading is usually what I'm paying attention to during that time. I'll move it to the top of the reading list if you send one. I love you. God love you. Stay in the keep.
Music:
[2:47:56] Music