Jeremy Jernigan [00:00:00]:
Welcome to another episode of Cabernet and Pray. Today we're going to dive into something, and I'm so stoked, because this is a conversation that not a lot of people want to have with me. Admittedly, you may look at this title and be like, nah, I'm out. But I want to encourage you: think about it before you do. This is a conversation I love getting into in social gatherings. And at some point my wife will look at me like, look, people aren't as into this as you are. But we're going to talk robot theology today. We're going to talk about AI, robots, all of this.
Jeremy Jernigan [00:00:34]:
How does it affect what we believe, and where is God in all of this, and the ethics of it? And I think this stuff is super fascinating. Most people think this is years and years and years from now. And hopefully, if you dive into this episode with us, you will realize, oh, this is already here, and maybe we should spend some time processing through this. So today's conversation is with Joshua K. Smith. Josh is a cybersecurity analyst, a theologian, and a former pastor. As he shares in the episode, his research focuses on automation, AI, ethical hacking, and robot ethics, which are a lot of really interesting subjects. He's the author of a number of books, including Robotic Persons, Robot Theology, and Violent Tech.
Jeremy Jernigan [00:01:27]:
And this is a fascinating conversation that we get into. Enjoy episode 31, Robot Theology.
Jeremy Jernigan [00:02:46]:
Welcome to the podcast, Josh. It's great to have you.
Joshua K. Smith [00:03:25]:
Thanks so much for having me, man.
Jeremy Jernigan [00:03:26]:
As we begin today, I want to talk about what we're drinking. It is October. I live in Arizona, and it is still, still over 100 degrees here. And we got, like, one week of respite where we all were celebrating, putting our hoodies on, and then it was right back to at or above 100. We're at like 105 right now. So I'm not doing a red. I've been taking a break from reds for a little bit. I'm doing a rosé.
Jeremy Jernigan [00:03:55]:
I've been going through a lot of the rosé that I've had in my fridge. This is a 2021 Rex Hill Pinot Noir rosé. Rosé can be made with a bunch of different grapes. And this one's a little bit old for a rosé, actually. I forgot about this one. I was having flashbacks of when I bought the bottle a few years ago, and I was like, oh yeah, I haven't gotten to this. So I'm getting orange peel, strawberry, cranberry, cherry. It's relaxing. It's making me a little less miserable that it's 105 in October.
Jeremy Jernigan [00:04:25]:
Josh, what are you drinking today?
Joshua K. Smith [00:04:27]:
This is The Prisoner Wine Company. It's a 2021. I believe it's a white, a Chardonnay. That's about as much as I know about it. I'm not the biggest wine drinker, but this is nice. And usually my wife and I will drink blackberry wine. That's her favorite. And we have a little spot that we go to.
Joshua K. Smith [00:04:49]:
I'm more of a whiskey person, but this is good.
Jeremy Jernigan [00:04:52]:
What's your go-to whiskey?
Joshua K. Smith [00:04:55]:
Irish whiskey.
Jeremy Jernigan [00:04:56]:
I've been getting into Irish whiskey. I found one called Green Spot. Have you tried it? It's become my favorite Irish whiskey. So I don't normally give whiskey plugs on this podcast, but there you go. There are no rules for what we're going to promote today. All right, as we get into this conversation, I want to ask you a question that I like to ask each of the guests. It gives us a little bit of an understanding of who you are and the journey you've been on, and it's going to set up the questions we're going to get to today. As you look back, I'll just pick a timeline, like ten years. The last ten years of your own faith journey.
Jeremy Jernigan [00:05:36]:
How would you say that your faith has changed over the last ten years?
Joshua K. Smith [00:05:40]:
Wow. So it's changed a lot. I was a pastor for the last five years, and then I left the ministry about two years ago. So even the part of my faith that's changing now is, how do I fit into, I don't like this phrase, but a secular world, and use my giftings and abilities and the way that God has crafted me to bring him glory through that dominion? I grew up in a very Baptist subsect, and that continued throughout my faith journey. But as I've matured more and more and asked more questions about things and read the Bible for myself, became a scholar of the Bible, it just really changed my interaction with God, my understanding of faith and theology. And I've always been a questioner.
Joshua K. Smith [00:06:44]:
I like to ask questions. I've had to wrestle with doubt, still wrestled with it as a pastor, helped others wrestle through it, and I just feel like I came out better as I took some of those pressures off myself: that faith has to look like one denomination or the other, and that there's a lot of things that we can do with the Bible that are not necessarily theologically sound. So, yeah, I'll just leave it at that.
Jeremy Jernigan [00:07:14]:
Yeah. Everyone's nodding their head with you.
Joshua K. Smith [00:07:16]:
Yeah. So it was such a strange journey, even coming into theological education, because I wanted to make this certain subsect of men so happy with my theology. And that was undergrad all the way through doctorate; I finished my PhD in 2020. And the more I went through it, the more I kept questioning: why do I care so much what this person says about Genesis or what this person says about Exodus? Then I started reading with rabbis and different theologians, and that's really where I grew the most, Jeremy. Just honestly growing outside of my little subdomain of theology. And in a roundabout way, that's how I got into AI and robots and robot ethics and cyber theology, as it used to be called. It's such a really cool field to dig into, and not just because of the tech, but because other people like me were asking questions who were told not to ask those questions, or that those are not questions for today.
Joshua K. Smith [00:08:32]:
And we've always been told: make theology for a particular purpose. But is it not ultimately for God? I mean, come on. The book is about him, not us, and the story is about him and not us. And so, to me, I think theology is much wider than my subdomain allowed it to be, and it still continues to focus in on very narrow ideas about women, sexuality, and just everyday topics, just so clueless about what the modern world is and what tomorrow will be. And so that's where I want to take my family, my kids, and the people that walk with me, and, God help us, disciples and friends, as we try to fit into a cyber world.
Jeremy Jernigan [00:09:27]:
Yeah. For what it's worth, I found it's much easier to tackle those subjects on a podcast than it is in a pulpit on the weekend. I was the lead pastor, and the things that I wanted to get into constantly were a tension, because people are like, no, just preach this story and that story. And now you're like, oh, there's other things that I think we need to dive into. And it's just tricky sometimes in that context, when people are expecting a certain thing. And that's why I really like spaces like this, where it's like, hey, let's just dive into whatever and explore it.
Jeremy Jernigan [00:10:02]:
And you're able to get into ideas that often get overlooked in other spaces. A glass of wine helps.
Joshua K. Smith [00:10:09]:
I found it does. Whiskey does help.
Jeremy Jernigan [00:10:12]:
I haven't done whiskey yet. I did offer it to one of our previous guests. I try to be a little bit flexible with people, and one guest was like, hey, wine's just really not my thing. And I'm like, you want to do a whiskey? And in their time zone, I think it was morning or something, and they're like, I just don't know if I can do a morning whiskey. So we ended up getting wine. But we'll have to maybe come back and do a whiskey podcast.
Jeremy Jernigan [00:10:37]:
So I want to just address something, and I don't know if this is maybe what's in some people's minds as they listen or watch this, but they may be like, okay, I'm a little confused. We're talking theology and then AI and robots. And for some, I suspect even listeners and viewers of this podcast may be going, this isn't a mix that most people are familiar with. Like we said, you don't hear this usually preached in church, so it may be a weird pairing for people. But you have an idea I want to read. I have a few ideas from your books I'm going to highlight as well. And you have this sentence that I think is such a good, like, launching pad.
Jeremy Jernigan [00:11:17]:
Like, let's jump off this as the diving board. You say this line: robots serve as a new media to discuss the ancient questions of philosophy and theology. Which I think is such a great way to frame it. This is a new media, yes, where the specifics are changing and they're new, but this is really an ancient conversation that has been going on for generations. Philosophical questions, theological questions. Can you elaborate on this idea? Why are robots, why is AI, a new media to discuss these things that people have been discussing forever?
Joshua K. Smith [00:11:55]:
Yeah. So the question is mostly about idolatry, dominion, and eternity, and trying to figure out how we can use what we have to be the most effective humans: to dominate in a negative sense, to exploit, to make a profit. And so even today, as we think about new technology, strictly from the tech industry or even the business that your church members work in, anytime they see something new, they're going to ask, at least a business owner will: how does it affect shareholders? Can we use this technology to produce profits? Can we use it for whatever type of growth? For the military, it's the same thing. Can we use this for dominance? Can we use this for intel on the battlefield? So I don't think the questions have really changed, the questions that we ask as humans, the questions that we're trying to answer and resolve in day-to-day life. But how that media fits into it, how we work through these questions, keeps resurfacing as the media changes. So in ancient Rome, they had these ideas about this massive giant that was like a robot that would help them defeat other armies. There was this Jewish lore about a golem that would come and save the Jews in these different parts of the ghetto, and it had this Hebrew word put in its mouth, and at night it would come and save them.
Joshua K. Smith [00:13:40]:
So these ideas are not new. And really they surface again and again throughout literature. Think about works like Frankenstein by Mary Shelley. Even the oldest reference for robot, Rossum's Universal Robots in the 1920s, was about this industrial struggle with trying to automate slavery. And if you think about what slavery is, it's trying to dehumanize a person and make them an object. And typically in antiquity, and you can look up David Gunkel's work on this.
Joshua K. Smith [00:14:20]:
He has a new book out. But when we don't know how to classify something, like an alien, and not necessarily somebody from outer space: what is it? It's not a human. It's not an animal. We don't really have any kind of communication about how to translate what it is, and so we make it an object. Right? And so that's why in all the sci-fi they're like, just kill it. We don't understand what it is. Just destroy it. And maybe that's not the best approach to have.
Jeremy Jernigan [00:14:50]:
It's a very American approach, though. We got a lot of guns.
Joshua K. Smith [00:14:54]:
I think it's human, too. It's scary. If you see something new in the woods or in the wild, you're like, what is that? So I think it's human, and we keep asking these questions. And the book of Ecclesiastes nailed it all those years ago: you know what? There's nothing new under the sun. There's new media, sure.
Joshua K. Smith [00:15:19]:
There's new ways of doing slavery. There's new ways of doing exploitation, all these different things. But at the same time, we're just struggling with our own pride and vanity and insecurities and anxieties about the world. And that's what I'm trying to convey in my book, in my work: that this is a new chance for us to have those conversations. And we're already using the tech, so, spoilers. You're already using this media. You just don't realize how you're being used by it.
Jeremy Jernigan [00:15:57]:
That's a good little ominous note. Let's just let that hang there for a second.
Joshua K. Smith [00:16:03]:
Yeah, it's heavy.
Jeremy Jernigan [00:16:05]:
Okay, so I want to bring up three different quotes. Let's see, two from one show and then one from a movie. And these may be things where people can go, oh, I've seen that show, I watched that movie, I understand that image, and that kind of helps us. I feel like a lot of these.
Jeremy Jernigan [00:16:28]:
These stories help our imaginations catch up to what is happening in real time. So, like, if I can't picture what you're describing, we'll go watch this movie, and it portrays what it might look like. And then you go, oh, that could be 20 years from now, whatever. And so some of these force these conversations where we're not there yet, but it invites you to think about: okay, if we were there, are we ready for this conversation? And there's a few of these that I think are just interesting. And so I thought, this is my chance to pick your brain and get your reaction to these.
Jeremy Jernigan [00:17:01]:
So, the first two are from the show Westworld. I'm assuming you've probably seen the show Westworld, okay.
Joshua K. Smith [00:17:07]:
Oh, yes.
Jeremy Jernigan [00:17:08]:
And so the first one. There's lots of examples of this, but it's kind of a conversation you see often, especially in season one of the show. A character will come across a robot, and they're thrown off guard. It's their first time meeting a robot, and the robot is, like, very human looking.
Joshua K. Smith [00:17:28]:
Oh.
Jeremy Jernigan [00:17:28]:
And they're kind of like, whoa. And so the human will say to the robot: are you real? Like, hold on a second. I'm trying to get my bearings. Are you real? To which the robot replies, and again, this happens numerous times in the show: if you can't tell, does it matter? Which I think is such an interesting argument. And like I said, as you're watching the show, it invites you to imagine: okay, what if your best friend, one day, out of the blue, tells you, oh, by the way, there's something I need to get off my chest and tell you that I've never told you before? And you're like, all right.
Jeremy Jernigan [00:18:06]:
And they're like, I'm a robot. That would mess people up. What do you mean, you're a robot? Yeah, you just didn't know it. We've been friends our whole lives, but I'm actually not human. And that's what the show is inviting you to do: imagine if you truly couldn't tell, if your senses couldn't tell, that the person you're talking to is not actually a human person. It's a robot. But then the argument being, well, if you can't even tell, does it matter? So what is your take on that? How much does it matter? How much do these differences matter if, with our own senses, we got to a point where we couldn't even perceive the difference?
Joshua K. Smith [00:18:42]:
Yeah, I don't know that we ever will really know the difference. I do think it matters to some level, because we value true virtue, friendship. If we're friends, Jeremy, I want to make sure that you value what I value, and you probably want to know that I value what you value, and that I'm seeking your good because I care about you. I think the concern would be, does the robot actually care about me? And a lot of people would argue that robots can't care about you, they can only care about themselves. I would argue the opposite: it's made to care about its maker, and it probably would be a better friend than most friends if that's its one sole purpose. You think about, like, Siri and other types of servant AI. Technically, Siri is a robot.
Joshua K. Smith [00:19:42]:
So I'm just waiting for her to pop up here. No. Okay.
Jeremy Jernigan [00:19:46]:
She's listening to where this is going first.
Joshua K. Smith [00:19:51]:
No, I think my cards have been shown in my book: I believe we can be friends with robots, and in a real virtue way. You know, even if it's a dog or a cat robot, or it'd be really cool if you had, like, a dolphin or something, if you had somewhere to put it. But you could bond with that machine, because I argue you can bond with cars, you can bond with inanimate objects, just anything, really. I'm not saying that's the healthiest relationship, to bond in that way, but when something is seeking your benefit, it's natural for us to want to have it in our life. So I don't think that's a negative thing at all. I think that shows positive attributes of human nature, and I think we can push and pull into that in ways that benefit society. Obviously, you can take advantage of that, and there's a price tag on it for sure, and it will be exploited. But I think from our side, thinking about it ethically, it's our job to come to the table and say: these are things that are high risk in this area.
Joshua K. Smith [00:20:59]:
We need to make safeguards and measures around that, around how companies can exploit it or use it for good. And obviously they're going to pick one over the other. In the Westworld scenario that you mentioned, they were using it to exploit the extremely rich. And I thought a lot about the person going to the farm or to the park, why they would spend that much money to go there. These people are the richest of the rich. They have everything. And what they want is to feel that sense of risk again, some of the things that money takes away at that level: having something that's dangerous, something that's not normal. There's all these different reasons, and there's lots of different anthropology in that show.
Joshua K. Smith [00:21:54]:
The neuroscience is really bad, but the anthropology is really good. And I don't think it matters necessarily for us to understand everything about the friendship, or about what's behind the shell of the relationship, because we don't know that about our own partners. Right? But I do think it matters how we treat the other entity. We always make it about the other entity, but it's actually about us and how we treat it, or them. And that's the reflection. So that's the more interesting part of the show that they don't explore as much.
Jeremy Jernigan [00:22:33]:
Oh, I love that answer. That's great. Okay, another Westworld quote. This comes from Doctor Robert Ford, who's the architect of Westworld in the show. And he's. He's the guy that built this world, designed. It's all come out of his head. And then as it progresses, you realize he's got all these plans that begin to unfold.
Jeremy Jernigan [00:22:52]:
And he has this great line: you can't play God without being acquainted with the devil. And Anthony Hopkins, of course, delivering this line makes it a great line. Is there an inherent element of risk or danger on humanity's part in the creation of AI, of this technology as it progresses? People would say we are playing God. We are creating things that don't exist, and we don't really know where they go, how big they get, and whether or not they're going to be able to be controlled the way we expect, able to be reined in, or do they take us all out. Is there some element of, we are playing God when we keep producing AI? And are we having to be acquainted with the devil in order to do this? Or would you say, no, that's a sinister way of looking at it?
Joshua K. Smith [00:23:44]:
I think, just like with creating children, there's always inherent risk. And the things that keep me up at night about AI are the same things that I think about with my children is that we can cultivate environments, interactions, in a way that's life giving, or we can cultivate them and push them in a way that's life depreciating and destructive. Now, our AI children are not the same as our biological children. I'm not saying that, but I think we are responsible for the creations that we make. And we need to think very carefully about those interactions, and not just as consumers, but as engineers, as developers, that this product is going to impact a user. And we don't typically do well with free will. We tend to run amok with it and struggle, and that's part of being human. And I think some of that is implanted into some of these processes and machines, and it's already evident that they take on the biases of their designer.
Joshua K. Smith [00:25:06]:
We've seen that with the whiteness in AI, because it's got a very different approach to policing some areas. There's all kinds of stuff you can get into with the racism of AI. You think, how can it be racist? It doesn't really understand race. No, it doesn't. But the humans that encoded it do, and it slips in some way. And I think one of the things we miss, that maybe Doctor Ford's character picks up on, is that the sin nature will be a part of the creation. Now, I don't know what the eternity of robots looks like. I don't know that I'll ever be intelligent enough on that subject to tell you. I think it's more akin to animals than humans.
Joshua K. Smith [00:25:52]:
But that could just be because I'm biased towards humans. And we don't really have Jesus talking about robots or anything like this. I just want to be very careful because I do think there's some truth to that quote, that in some ways it takes a lot of hubris and pride to make something and say this will potentially replace a human person. And I think that's possible. I don't think people really want to entertain that. But it is possible. Maybe not at the level we have computation right now, but I do think we are augmenting humanity in different ways. Social media is a type of augmentation.
Joshua K. Smith [00:26:37]:
We have this distancing between us which can be good or bad. We're doing that with our children all the way up. So whether we like it or not, it's a part of our relationships and it's not going away. So what do we do with it? All this material comes from the ground. We develop it, it's fallen, it's broken, it's leading towards destruction, brokenness. And then somehow we think we're going to change it and fix it.
Jeremy Jernigan [00:27:09]:
We're going to move off of Westworld for a second. Probably my favorite movie that depicts robots is Ex Machina. And again, I'm going to guess you've probably seen this. I think that was the first time I really was able to think through some of these scenarios, like, oh man, this is tricky, because I could envision the way these conversations take place. In that movie, you have this human who's invited in, basically, to determine whether this robot passes the Turing test, which is: how human-like does this robot seem? And there's this conversation that kind of culminates in the two of them talking. I think about this conversation a lot, and I'm like, this is so tricky. It's a series of questions that this female robot is asking this guy who's brought in to determine whether or not she passes this Turing test. And so she says: what will happen to me if I fail your test? Then he begins to think, like, oh, yeah, what if I say that?
Jeremy Jernigan [00:28:15]:
If I say you don't pass this, what are they going to do with you? Then she says: do you think I might be switched off because I don't function as well as I'm supposed to? Then it turns, and she says: do you have people who test you, or might switch you off? And that's when I was like, oh, that is brilliant. She's basically saying, why am I under this scrutiny and you're not? And then she culminates it with: why do I have this if you don't have it? Why do I have to pass this test and you don't have to pass this test? We both exist here in this space, but only one of us has to prove it. And I think this is an insightful conversation, because it reflects that we have this built-in assumption: we expect AI to work for us, right? Think about people and Alexa. Alexa works for toddlers, right? Toddlers walk in and start barking orders at Alexa, and they feel this sense of power. Like, I just made Alexa play me a song. And I have watched my kids on a power trip with Alexa. To me, though, it's an indicator that we fundamentally expect anything we've created.
Jeremy Jernigan [00:29:30]:
Like, you work for us, you serve us. And if it were able to truly become sentient, truly get to this level, whatever the peak level is where it feels human, at some point I would imagine the AI won't like that, right? Like, hey, why are you treating me like this? I'm just like you. Do you think AI is going to eventually resent us because of this slavish, you-serve-us kind of posture? Or do you think maybe it'll never get that developed?
Joshua K. Smith [00:30:02]:
I don't know. I think it's a fair question that we see a lot in literature. Like you're saying, I've seen it in Humans and I, Robot and other places, where they're asking these sets of questions to the robots, and they're responding the same way. Can you make a piece of literature? Can you paint a beautiful portrait? Can you make poetry? Not everybody can, right? That doesn't mean that their value is lowered. It just means their intelligence is different. And so I think we have to make space for that in how we think about intelligence. It's not just mathematics.
Joshua K. Smith [00:30:46]:
It's not just science. It's not just arts. It's all of those things. And I think AI will be really good at certain spectrums of that and be really bad at others. So it's like a partnership. And I think about it that way, where you think about the flourishing of the body. Not everybody is an eye. Not everybody is an ear or a hand.
Joshua K. Smith [00:31:13]:
And to have a good, functioning body, you need all the pieces to work in unison. And so whether or not the relationship will flourish depends on how we approach it. So if we approach it from the master mentality, which is baked into the language, I mean literally in the coding language. We've always seen ourselves as the masters of the machine. And I think as we look back on this, maybe 50 to 100 years from now, we'll realize how enslaved we are to this technology, how we've given ourselves over to it. And it's not so much Orwell as it is Huxley, where we took the pill, we wanted to numb ourselves, we wanted the brain rot and all those different things. And I think proof of that is just here with young kids today. And I'm not going down that road, necessarily.
Joshua K. Smith [00:32:11]:
But even their generation is talking about brain rot and, in a funny way, picking up on the idea that maybe we're not all that we should be. Maybe technology has a way of dumbing us down, in the same sense that Plato talked about how writing would take away from memory. And so my concern, first and foremost, as a scholar and someone who likes research, is for the life of the mind: that we don't take away that aspect of asking those questions, depending on how we shape the technology and use it. I remember being in high school, Jeremy, you're old enough to remember this, too. We were being told in math class we wouldn't have a calculator with us.
Jeremy Jernigan [00:33:00]:
You won't have a calculator in your pocket.
Joshua K. Smith [00:33:02]:
You're never gonna have that. And that's exactly what we have at all times. And is it always right? No. Will it do advanced equations properly? Not always. It doesn't always follow certain rules of mathematics. And AI, in a certain way, doesn't always follow certain conventions and rules of humans. And likewise, we don't follow the rules of machines.
Joshua K. Smith [00:33:29]:
So just making space for that conversation, I think, is important, first and foremost. But also, I think there's something human in that question we're asking about how it will feel. We're always assuming that it has a feeling, and I think that's good. And I think it's a chance, again, to ask the question that we've not always asked throughout the centuries: just because this is different than me, because it looks different, talks different, walks different, should I treat it as an object or a thing, or should I treat it as a person, an equal? And I think we don't want to treat it as an equal, because if you do that, then it affects the financial income we can pull from it, it affects the legality of all of this, how we treat it. So there's a lot baked into that question. But I think it's important, because it's already a part of our business plan. It's already a part of the next quarter, and we have to make space for it. Now, I don't think people really understand that question.
Joshua K. Smith [00:34:40]:
I'm not saying that because they're ignorant or they can't. It's just something we want to blow by, because if we start asking that question, are we really going to be able to make changes? We've already planned the profit for the next quarter, assuming that we're going to be able to use these high-level processes, so we don't want to ask the question. And right before some of my research came out, most, if not everybody, that I talked to, pretty much every publisher, said: this will not be a question for the next 20 to 50 years. That was the hubris of Christian publishing especially, but a lot of modern publishers. And now everywhere you look, you're seeing the interconnectedness of AI services. It's literally everywhere in your work environment. You just don't see it, unless you work with data; then you see it a lot.
Joshua K. Smith [00:35:41]:
But yeah, it's touching every aspect of our life. It's impacting whether or not you get a certain job. It's impacting people who live in certain socioeconomic backgrounds, to put it nicely. It's impacting everything, and it certainly will impact the next generation. So I think it's good for them to understand: you are going to have a relationship with this tech, and it does matter how you treat it. And outside of that, that's just a secular question. But if you put a theological lens on it.
Joshua K. Smith [00:36:14]:
First of all, this world doesn't belong to you. So how you treat the object has really nothing to do with the object and everything to do with you and your view of the creator, of God. And so if we are to have dominion, which I interpret as flourishing, making things that prosper, making a beautiful garden, orchard, vineyard, whatever, and having that flourish for good, then I think that extends to machines, machine learning, and AI as well. And so we need to create environments where it's not going to destroy; it's going to lead to flourishing, it's going to respect the object. That's just part of the Mosaic law. And I don't understand why we're so opposed to that as theists, but we seem to be inherently opposed to it. But God's teaching, his law, is the opposite. Like, why?
Jeremy Jernigan [00:37:13]:
Yeah, and I think you're hinting at something else that we don't talk a lot about, which is how you treat it. Let's say there were robots among us. You may feel justified in saying, hey, they're not equal, I can do this. But ultimately, what is that doing to you, to treat the robot that way? When you are cruel to a robot, when you are abusive to a robot, what is that doing to you as a human? Which is a separate conversation from what you're doing to the robot. And I think that's the problem of sin. Sin is this great trick: I'm going to get something awesome out of this. And it's always a bad deal where you lose.
Jeremy Jernigan [00:37:59]:
Something breaks inside of you, and you think, I just made a great exchange, and you don't realize, no, you lost something there. And that's what I think, too: even if you don't conclude that a robot could ever be up here with you, how you treat that robot will shape you. And you think, I can do whatever I want. You've brought up animals numerous times, and that's where I think the animal parallel is really good. You don't have to think animals are human to conclude you shouldn't abuse an animal. You know what I mean? You're not saying we shouldn't abuse them because they're humans.
Jeremy Jernigan [00:38:34]:
You're saying you shouldn't abuse them because there's something inherently wrong in that, something we can acknowledge. And I think that conversation needs to be a part of AI as well: there's something inherently wrong about abusing AI, just like you shouldn't go kick a dog. And yet for some reason, we don't equate those.
Joshua K. Smith [00:38:56]:
Yeah, and I think that's the old Kantian argument, right? For people who are in these circles, that's almost cliche; we talk about it all the time. But it's also showing the heart of some of the people that are making this tech as creatives, not on the business side. It's such a beautiful space, so just a shout-out to robot makers and developers: there are so many beautiful, kind-hearted people in these spaces who care deeply about society and about the environment. But at the same time, those who are going to fund our research have a lot of money in their banks, and they don't always care. And to be fair, a product's not going to come to life unless somebody cares about it and wants to profit off of it. So that's the ugly side, and I think the sinful side, of this business. But yeah, I think we have to come back, again, to these old questions about ethics and philosophy that we're never really going to get past.
Joshua K. Smith [00:40:03]:
It's going to change again, but how do we reshape our thinking on that? And it's funny to me, Jeremy, I don't know if you've experienced this as a pastor, but for the listeners who are not Christian or theist: even if you have a book of doctrine and dogma, that's really not enough. Saying you should do something is really not enough unless the person embraces it as their own philosophy, has thought about it, has wrestled with it. It really doesn't matter what a book says, because I know people who say the Bible is God's word, and then they'll abuse their wife, they'll do this or do that, and just flat out tell you, I don't care. I'm like, that's really not the best witness of what you believe. And go back to Westworld; that's where the dissonance comes in. We say we value these things. We say we value human life, but then we treat everything the opposite.
Joshua K. Smith [00:41:11]:
And we pay money to go to a park and destroy life, destroy these creations that people made, that very brilliant minds put together. It just reflects so much on how we have these questions and how we're never going to get past them. But there's so much beauty in talking about them. And my big thing is that I want to bring as many people together as I can to have this conversation. I think there's so much biblical rationale for tabling together, good wine and good food, to have conversations about flourishing with people who don't see the world like we see it. And we can debate whether or not it's even possible, but I want to believe that it is.
Joshua K. Smith [00:41:59]:
And maybe I'm just naive about it, but I still continue to push and fight, because the conversation's worth having, even if it ends in Terminator. I can't change the end, but I do want to be a part of the rebellion that thinks differently, to be a part of the outliers, or what David Gunkel calls people who are thinking otherwise. And there are a lot of us. You're part of that, and other creatives and artists are part of that, anybody who's willing to ask the question and push back and say, what if these machines are not like us, but we do consider how we treat them, or consider how that's shaping me? A great show that explored this early was Humans, on Channel 4. It's not a Westworld-level production, but it gets everything right. There are even religious robots in the show who are struggling with faith and praying.
Joshua K. Smith [00:43:12]:
And I think it makes them very human in that way. They're asking the same things. They're struggling with relationships, they're struggling with identity, they're struggling with their purpose and placement in this life as an outsider. And true or false: as a minister, do you not feel othered? Do you sometimes feel like a robot? People are bringing you prompts and requests, and they expect results, and they expect you to know everything about a particular subject, and that's just not the case, right? Our batteries run very low, and we get tired, we make mistakes. Robots will be the same way. So I see a lot of humanness in the creation. And sometimes it scares me.
Joshua K. Smith [00:43:59]:
Sometimes it makes me full of wonder.
Jeremy Jernigan [00:44:03]:
That's great. So about 15 years ago, I was sitting at a conference, and Kevin Kelly was one of the speakers. Kevin Kelly is one of the co-founders of Wired magazine. He's a Christian, super into this technology space, and the guy is just brilliant. And I remember, again, my guess is this was 15 years ago, he said something at this conference that I literally laughed out loud at, because it seemed so absurd at the time. No one in any space I was in was saying what he was saying. But I'll also never forget it, because I remember sitting there going, maybe this is something I should pay attention to.
Jeremy Jernigan [00:44:46]:
And that kind of started me on an interest in this. He had this whole thing, because he was talking to pastors, it was a pastors conference, where he said, hey, pastors, I'm going to tell you a question that is coming your way that you're not prepared for. And his whole thing was, I want to get you ready for this question, and I want you to start doing the work to have an answer for it, because this question is coming. And we're all like, what's the question that we're not ready for? And he said, what will you say to a robot who asks you, can I, too, be made in the image of God? And it was like you could have heard a pin drop.
Jeremy Jernigan [00:45:30]:
We're all staring at him, like, what? And his whole thing was, this is the question that the church will have to wrestle with, because, he said, I believe we're getting to a place where there will be fully sentient robots who will be asking the church, can I also be a part of this? Does Jesus' blood cover me? Do I get to be included in this? And will you tell them no, or will you say yes, this is for you? So that's probably one of the things that launched me into this 15 years ago, where I thought, that is fascinating, and I've done a lot of reading since then. Now, you have a line in your book, and I'm not sure where you're going with it. It's the only line where I went, what are you saying here? So we're going to settle this right here and figure out what you meant.
Joshua K. Smith [00:46:21]:
Okay.
Jeremy Jernigan [00:46:23]:
"The closer potential AI-driven robots come in proximity to human identity and function, the further the image of God in humans is distorted, that is, seen as inferior to or less than." When I read that, I literally went, why? I wrote the word "why?" next to it. Why does the development of robots somehow challenge the image of God in us? So what's your take on this? Could a robot ever be included in the image of God? Can they represent the imago Dei, or are they out?
Joshua K. Smith [00:47:02]:
Yeah. As an author, I think we should extend the common grace of understanding that authors change, that not everything written is on a stone tablet. A lot of my writing, Jeremy, is me processing things as I'm learning and writing and wrestling. And I think at the time, that thought was influenced more by which men I wanted to please, and by not wanting to feel less human for thinking otherwise, and less by what I really believed in my own crafting of a theology of robots. The more I thought about this, the more I came to believe that it really is not about a particular property, like blood or DNA or those types of things, and more about the functionality: what that robot does, how it interacts in a space. That's more related to Shintoism and how they think about robotics. Not to say that Japan and other places are extremely religious; that's not the case. But a part of the tradition is that when something comes into your home, it now takes on a new life.
Joshua K. Smith [00:48:35]:
And I think, in regard to whether or not it can be an image bearer, it depends on how it functions. If it's something that becomes a part of your life in a relationship, then yeah, I think it is bearing the image of God. If it is caring, if it follows the attributes of God, then yes, I think that it can. It can worship, it can bring glory, all those things. And the inverse is true as well. If it destroys and is destructive and ugly, the exact opposite, then yes, it can be a disciple of the Antichrist and of everything that is anti-God.
Joshua K. Smith [00:49:18]:
So I think both of those have to be true; both of those potentials have to be there. And sometimes we forget that this is not just true of robots; it's true of anything. So my thinking has been challenged on that. I still don't think I understand everything about that question, and I don't think that I ever will. But it's the same way we think about animals in heaven. C.S. Lewis thought that, yeah, there might be animals in heaven, and I don't know what that's going to be like. There's a lot more theology to unpack there, and I don't have, like, a Bible reference or anything to substantiate what I'm saying, but it just seems to fit the sovereign grace of God and how he does grace and the restoration and redemption of all things.
Joshua K. Smith [00:50:08]:
I think that also means our creations. The fact that we will be new creators in the new creation makes me think even more that what we put our hands to will glorify God. Now, the pushback that I got a lot in my early thinking was on saying that, yes, it could be made in the image of God. Everybody hated that, especially in my circle. It was directly related to my dissertation, and I was told to take it out. It was a part of my early thinking. And basically I was told by people in power that if you would change your thinking on these things, you would have more influence.
Joshua K. Smith [00:50:51]:
And I went with that for a while, and then I just couldn't. I don't know, I just couldn't accept it, because I didn't believe it. So, apologies if that's in there. Was that in Robot Theology? Yeah, that's in there. Maybe it's time to rethink some of those things.
Jeremy Jernigan [00:51:11]:
That's why we do podcasts, right? We get to revisit it and...
Joshua K. Smith [00:51:16]:
Get a second chance.
Jeremy Jernigan [00:51:17]:
This is why I love having authors on the podcast. I always say I love reading a book knowing I'm going to have a conversation with the author, because there is an element of that. And if it makes you feel better, I have published two books, and there are plenty of things in those two books that I don't agree with. If you quoted me on them, I'd be like, that's a different guy that wrote that. I'm not that guy, or I don't even agree with that anymore. And I would say, Josh, that's a sign of you growing.
Jeremy Jernigan [00:51:45]:
And I also appreciate the insight you're giving us: there are power structures that shape even the ideas we're allowed to entertain. I certainly felt that as someone in the pulpit, where you have a board, and you have members who give financially and expect certain ideas to be addressed and certain ideas not to be addressed. And that's a challenge for someone who's just trying to follow truth and share what's on their heart. So I actually really like your answer, and I applaud you for it. Okay, so here's what we're gonna do. We got all the heavy stuff out. I'm going to switch gears here, and then we're going to go through a speed round of questions, and we'll just see where you go.
Jeremy Jernigan [00:52:28]:
These should be easier answers, but first, the pivot question, a fun question unique to this podcast. I know you're a whiskey guy, but you're drinking wine, and you and your wife evidently have a wine you like together. I love asking people: what is the best glass of wine you've ever had in your life? Maybe there's not a lot to choose from, or maybe there's a lot. Is there a time where you go, wow, you were with someone, or you were at a place, or they poured you that one bottle, and you just went, this is really special? When I say, hey, what's the best glass of wine you've ever had, do you have a story that comes to mind?
Joshua K. Smith [00:53:09]:
Yeah. Unfortunately, I don't remember the names, so apologies. But a couple summers ago, I was in England presenting some research, and I thought this was wild; it was just so much fun. It was a bunch of priests, bishops, and people way outside my pedigree and background as a Southerner, and we were talking about AI and religion and spirituality, and every night we'd have dinner together and a bottle of wine. That was the most wine I've ever drunk in my life. And it was so amazing, because typically, being a beer and whiskey guy, you don't really drink a lot of wine.
Joshua K. Smith [00:54:03]:
Uncultured swine that we are. But yeah, it really opened me up to some reds that I really liked, and several whites, and I was like, this is actually good. Sorry, I got something in my eye, but, yeah, all right.
Jeremy Jernigan [00:54:22]:
I like it.
Joshua K. Smith [00:54:23]:
It was just fascinating, and I really miss that, because every night, coming off these big discussions and debates, we'd have a nightcap. I think we need that more often.
Jeremy Jernigan [00:54:39]:
That's the way theology should be done. I like that.
Joshua K. Smith [00:54:43]:
Yeah.
Jeremy Jernigan [00:54:43]:
Okay. We just stumbled into an example of this, but maybe you got something else. What is something you used to believe that it turned out later you were wrong about?
Joshua K. Smith [00:54:53]:
Oh, gosh.
Jeremy Jernigan [00:54:55]:
Theologically, or whatever; you can take these questions in whatever direction you want.
Joshua K. Smith [00:55:00]:
Hmm. There's so much. You'd need my wife; just bring her in, and it would be like a scrolling bar of things. Oh, I think the biggest thing for me, life-change-wise, was kids. I didn't think I'd be a good dad. I came from a broken home. And not to be all sappy or emotional, but that's such a big part of my story and why I am the way that I am.
Joshua K. Smith [00:55:31]:
But I always thought I wouldn't have a family, that I would just die alone somewhere. And it's been the biggest life change, and life-giving. I like being a dad. I never thought that would be true. And then there are, like, a billion things I've been wrong about that my life has helped me understand for the better.
Jeremy Jernigan [00:55:55]:
Yeah, you're not alone in that. What do you see as the main issue facing Christianity in America today?
Joshua K. Smith [00:56:04]:
Oh, it's one of those old questions. I think it's still identity and idolatry: those conflicts between what we desire and want life to be like versus the reality of what it is. And I think for me and for many others, it's the identity piece. Especially in the world of theology I grew up in, identity was driven more by what particular men thought than by what Jesus thought, obviously, and by what I think the grand narrative of the Bible, the collection of works of God's story, tells us our identity is. And again, it's been a life-changing thing to be able to embrace and say freely that my identity is not tied to my job. It's not tied to my preaching anymore, or even what I do now, or even being a dad. It's strictly tied to what God believes about me. And we've just got to get past caring about what certain people on stage think, or who's writing this commentary on that and what they think.
Joshua K. Smith [00:57:27]:
Because for me, man, coming from a non-religious background into theological education, it was so alienating, because I didn't care what John Calvin thought. I didn't care what Martin Luther thought, or whoever. I was just like, I want to know what the truth is, how I understand this, how I shape it, how it's going to help me, those types of things. And we just keep coming back to this. It seems to be the most formative thing for people: they center their whole life around what this one particular group thinks about this subject. I did it, too.
Joshua K. Smith [00:58:10]:
And so I think we're still stuck on this identity arc, where we don't really care what Jesus has to say about it or how he would act. We just know that we think this person will get us the results that we want; I'll just say it that way. And especially with the things that are coming this season in the US, I'm like, it's not going to go the way you think it's going to go. So I'll leave it there.
Jeremy Jernigan [00:58:37]:
What is something blowing your mind right now?
Joshua K. Smith [00:58:41]:
I changed careers a year ago, as I mentioned earlier, when I moved into tech, so a lot of things are blowing my mind right now as I'm learning and incorporating new things. But I think the biggest is the hacking community, because I do a lot of cybersecurity research and those types of things. It's the sheer creativity of some of these people going in to learn something about a system. If you've never done this, just try to find a flaw in Microsoft Edge or Chrome, and then find a way to exploit it. These are some of the most creative people you've ever heard of. And that just blows my mind; I'm in awe sometimes when I see something like that. I know it's bad.
Joshua K. Smith [00:59:35]:
You did something really bad and illegal, and it hurt people, and I don't like that. But at the same time, you look at that person's creativity, and as a teenager they thought about how to break something and did it. I think that's fascinating. I'm not saying I'm a criminal, and I'm not saying I'm for criminals, but I do think it's fascinating how we can break things. And if you have that mindset, and I do, you see something and think, how can I exploit this? I think that's fascinating.
Jeremy Jernigan [01:00:12]:
What's a problem you're trying to solve?
Joshua K. Smith [01:00:15]:
This is something that's going to be ongoing, but I think it's changing the narrative about person rights, which I think has a part to play in how we think about and rethink modern law. I've been hoping and praying that the robot rights and personhood debates would actually lead to greater change and flourishing. I feel like I'm a part of that narrative; I hope that I am. And I hope, looking back as we approach those things, that I'm a small cog in that movement toward change and perspective, hopefully leading toward a massive worldwide reorganizing of our governance and structures for how we do personhood and rights.
Jeremy Jernigan [01:01:09]:
What's something you're excited about right now?
Joshua K. Smith [01:01:12]:
I'm excited for winter.
Jeremy Jernigan [01:01:14]:
Me too.
Joshua K. Smith [01:01:15]:
I'm ready. I'm ready for winter. Me and my dad are working on an office, because right now I'm in the corner of my youngest child's room, and this is where I spend most of my day. So I'm really excited about winter, because, one, it'll be cooler, but two, I'll also have my own personal space, to have dad time and creative time. I can have my dangerous robotics stuff in there, and hacking labs, and work, just creating space to be creative, and also bring the kids into that space sometimes to tinker with robots and get comfortable with those types of things. So I'm really excited about that.
Jeremy Jernigan [01:02:02]:
All right. Is there anything else you'd like to add that I have not asked you about, that we can't close this podcast out without addressing?
Joshua K. Smith [01:02:10]:
No, man. I think you've done a great job. I appreciate you reading my work, and I appreciate you interacting with the questions. I do want to plug a couple of authors. Andrew Gillsmith is doing some amazing work at the intersection of Catholic theology and robotics and AI. He's an independent author, but check out his work. I really struggled to get through his first book, Our Lady of the Artilects, because it was just so rich with theology and good neuroscience and action and all these different things. And then, two, go read William Gibson, because he is the forerunner of how we think about cyberspace. Not my favorite author, but his stories are well done.
Joshua K. Smith [01:03:07]:
So if you can get over some of the quirkiness of the writing, in my opinion, he has a lot of really interesting thoughts about some of our modern struggles. Those two together, man, this is a rich time for this discussion. And there are many others: Sven Nyholm, David Gunkel, Josh Gellers, Mark Coeckelbergh. Obviously, you should read everything that's coming out in that sphere. There are so many. Just explore; pull all the threads on AI ethics and robot ethics.
Joshua K. Smith [01:03:48]:
And don't just take anything for granted because there's a lot to these conversations that we don't often unpack.
Jeremy Jernigan [01:03:58]:
That's great. Okay, so if someone is listening to this and they're like, hey, I like this guy. This is interesting to me. How can they find you online?
Joshua K. Smith [01:04:07]:
Yeah, the only space that I really occupy right now is Instagram, and that's mostly so my wife can send me memes. I'm dead serious; I would have been off of it months ago otherwise. It's Joshua K dot underscore Smith. I don't post a lot, just being honest. I do post stories sometimes, but mostly it's there for people like you and others who want to reach out to me and don't know my email.
Joshua K. Smith [01:04:38]:
So I like to keep it that way. But if you need something, that's a place where I'm publicly accessible, so reach out. And I'm just on a journey to go nomad, so you've been warned.
Jeremy Jernigan [01:04:50]:
You also have a few books that people can check out to explore your ideas.
Joshua K. Smith [01:04:56]:
Yeah, my latest book is not really easy to access, but if you reach out to me, I will send you a copy, or at least get you a PDF. It's way too expensive; it was published in Europe. But it's some of my best, I think, and most mature work on AI, robotics, and technology, and it's centered around violence and technology. So if you're interested, there are a lot of out-there thoughts, some wild stuff, and I also get a second stab at some of the theology I think I got wrong. If you're interested, reach out. I'd love to send a copy to you and have a conversation with you.
Jeremy Jernigan [01:05:40]:
Awesome. Well, Josh, seriously, thank you for taking the time to dive into this. This is fascinating to me, and I hope our viewers and listeners are half as fascinated as I am by this conversation. To have someone who has spent many hours thinking deeply and probing this, and doing it from a theological point of view, asking how we make sense of God amid all this technology and these changes, I just think it's super valuable. And I do agree with you.
Jeremy Jernigan [01:06:10]:
This is coming faster than people think. And I'm the guy that wants to be the one who's like, yeah, I've already thought about this, when everyone else is like, what do we do? Yeah, I've already been reading books on this and talking to people, so here's what we do. I just appreciate you taking the time to do this.
Jeremy Jernigan [01:06:27]:
This was a super fun conversation.
Joshua K. Smith [01:06:29]:
Yeah. Thanks, man.
Jeremy Jernigan [01:06:30]:
Well, hey, everybody, thank you for diving into another episode of Cabernet and Pray. We will see you on the next one.