Participants:
Series Code: AU
Program Code: AU000022S
00:00 - What if the world as you experience it
00:01 is not exactly the way you think it is? 00:05 What if the evidence of your senses has been lying to you? 00:09 And what if the things you hear and see 00:11 as you make your way through life 00:13 aren't actually real? 00:15 Don't think that could happen? 00:16 Stick around because you're about to have 00:19 an authentic experience. 00:21 [upbeat music] 00:42 I don't know if you're getting the same online ads 00:44 that I am, but I keep getting these commercials 00:46 for computer-generated voiceover services 00:49 that allow you to narrate a video 00:51 without actually having to use a voice actor. 00:54 And I guess computerized voices 00:56 are nothing particularly new 00:58 because, well, maybe you remember this. 01:01 The Chrysler Corporation actually introduced talking cars 01:04 back in the 1980s. 01:06 Right about the time that "Knight Rider" 01:08 was a really popular TV show. 01:10 And these Chryslers had all these prerecorded notifications 01:15 like "please fasten your seatbelt" 01:17 or "a door is ajar," and a lot of other, well, naggy messages. 01:23 And of course, as a kid, 01:24 I thought that talking cars were pretty cool. 01:27 But now maybe you've noticed you can't find them anywhere 01:30 because talking cars have disappeared. 01:32 It's been 40 years since they first hit the market. 01:36 And my guess is that manufacturers 01:39 found that people don't want their cars talking to them. 01:42 There's just something about artificial voices 01:45 and artificial people that doesn't sit well with most of us. 01:50 [upbeat music] 01:51 You see, no matter how good you are 01:52 at creating the illusion of human intelligence 01:55 coming out of a machine, 01:57 most of our efforts don't pass the consumer's sniff test, 02:01 even though a lot of people can't really put their finger 02:03 on why they don't like it. 02:06 For example, sometimes you get a computer-generated voice 02:09 answering your phone calls. 02:11 And most of us don't like that. 02:14 Back in 2018, 02:15 Google unveiled a new service that actually calls businesses 02:18 for you to make appointments. 02:20 So if you need a haircut, 02:22 let's say, you just push a button 02:23 and Google calls the hairdresser 02:25 and lines up your appointment. 02:27 And from what I gather, 02:28 a lot of the people who took those calls 02:31 couldn't tell it was a computer talking at them. 02:34 Then on the heels of that, 02:36 Google introduced a virtual call center 02:39 where they answer the phone for you 02:41 and talk to your customers. 02:42 And again, I've got to admit, these computer-generated voices 02:46 were really pretty convincing, 02:48 at least for a couple of minutes. 02:51 For a few minutes, 02:52 you might think you're talking to a real human being. 02:56 But if you keep listening to those artificial voices, 02:58 there's always a tell. 03:00 There's always some little nuance that gives it away, 03:03 because human communication is very complex, 03:07 and our brains have gotten good 03:09 at picking up little details 03:10 to help us interpret what people are saying. 03:13 You see, we don't just communicate with words. 03:16 We also communicate by using little inflections 03:19 or silences, or facial expressions, or hand gestures. 03:26 And for computers to mimic all of that? 03:28 Well, they'd have to be able to read 03:30 and interpret the emotional state of the listener, 03:33 and that's not an easy task.
03:37 So at some point, 03:39 when you're listening to computer-generated voices, 03:41 you are going to hear the difference. 03:44 The computer's going to get something wrong 03:47 ever so slightly. 03:49 And that's gonna make you feel 03:50 just a little bit uncomfortable. 03:53 It's a phenomenon we call the uncanny valley. 03:57 There's just something about a phony person, 03:59 a phony voice, that triggers a very uncomfortable feeling 04:03 in most of us. 04:05 Now on the phone, 04:06 it's pretty easy to fool people 04:08 because you can't actually see someone's facial expressions. 04:12 But when you turn on video, 04:14 your face in essence becomes your second voice. 04:17 In fact, something like 80% of communication 04:20 isn't done with words. 04:22 When you and I talk through FaceTime or on Zoom, 04:25 I can see the sparkle in your eye 04:27 or the wrinkle forming on your forehead 04:29 or the look of fear on your face 04:31 when you tell me what just happened to you at the hospital. 04:34 What I see adds a lot of meaning to what you say. 04:39 And so far, 04:40 we haven't been able to build a computer 04:42 that can actually master the complete art 04:44 of human communication, 04:46 because communication is very intuitive. 04:49 It's a two-way street. 04:52 Real human beings will change what they're saying 04:55 or how they're saying it 04:56 based on the visual feedback they get 04:58 from somebody's face and so forth. 05:01 A silicon chip doesn't know how to do that. 05:05 At least not very well. 05:08 Some visual artists and moviemakers have caught on to this. 05:11 And they're actually abandoning the idea 05:13 of trying to use computer-generated people in their movies. 05:17 They'd rather make cartoons, 05:19 something that doesn't try to look like a real person, 05:22 because, well, audiences often reject 05:25 the more realistic-looking characters. 05:28 They don't like them. 05:29 What we're discovering is that if you try to get too close, 05:33 too much like a real person, it turns people off. 05:37 They find it creepy. 05:39 It's kind of like those Japanese robots. 05:41 They get really close to looking like real people. 05:45 But when they start moving and talking, 05:47 it gives you the willies, 05:48 because you can tell something's wrong. 05:51 You might not know what it is, 05:53 but you know something's wrong. 05:55 That is the uncanny valley. 05:58 Of course, some people are still working on this, 06:00 and they're getting closer and closer 06:02 to creating a believable fake. 06:05 When Google rolled out 06:06 their artificial operators, for example, 06:09 a lot of people were fooled, 06:11 but still there's a significant barrier 06:14 to making fakes look really, really good. 06:17 And that's the fact that the sheer volume of data 06:20 that passes back and forth when you and I are talking 06:23 is so massive that programmers 06:26 have to anticipate millions of possibilities, 06:29 and that's a pretty daunting job. 06:31 So of course the next step is to teach computers 06:34 to start learning, so they can learn all the possibilities 06:38 found in human exchanges. 06:39 And then they could anticipate absolutely anything. 06:44 So who knows, maybe we'll get there someday. 06:47 Maybe we'll build the perfect illusion, 06:50 an artificial human that can actually fool all of us. 06:53 Now, personally, I kind of doubt it. 06:55 I could be wrong, but I doubt it.
06:58 So, as I was saying, 07:00 I've been getting these ads 07:01 for a computer-generated voiceover service, 07:04 and they tell me I can create instructional videos 07:07 or sales presentations by using an artificial voice. 07:11 And they insist that nobody will be able to tell. 07:15 So who knows, 07:16 maybe someday I can do this show 07:17 by having a computer-generated Shawn sit in this chair. 07:21 Now, again, I'm not very hopeful, 07:23 because when I listened to the samples from that service, 07:26 I could tell the difference. 07:28 So you can just call me skeptical. 07:30 I'm not sure it's going to happen. 07:32 Now just a few weeks ago, 07:34 I had this article pop up on my newsfeed, 07:36 and it's talking about the dangers of deep fake videos. 07:41 These are videos that put words into real people's mouths, 07:44 words they never actually said. 07:47 And some of them look pretty good. 07:49 Maybe you saw that State Farm commercial, 07:51 where they made it look like a sportscaster 07:53 back in 1998 was predicting stuff in the year 2020. 07:57 And he was getting absolutely everything right. 08:01 Of course, this never actually happened. 08:02 It was a computer-generated fake, 08:05 but I've got to admit, 08:06 if you weren't really watching carefully, 08:07 this one was pretty good. 08:09 Now I don't know if I would have fallen for it, honestly, 08:11 because unfortunately I already knew it was a fake 08:14 before I watched it. 08:16 So I was on high alert and I was being careful. 08:18 I was watching for inconsistencies. 08:21 And of course, because I was watching, I found them. 08:25 But what if I didn't know in advance? 08:27 What if I wasn't paying attention? 08:29 What if I was watching this out of the corner of my eye 08:32 while I was busy doing something else? 08:34 Because, well, that's usually the way I watch TV. 08:37 Who knows, I might have fallen for it. 08:40 But I'll tell you what really bothers me: 08:42 the possibility that people might use 08:44 this kind of technology 08:46 to persuade the public of things that never really happened. 08:51 And sadly, I think there are people who would do this. 08:54 There are people who will use this technology 08:56 to put words in people's mouths 08:58 in order to destroy those people and to sell us a lie. 09:02 After all, even though we didn't have deep fake technology 09:05 in the past, 09:06 we still used print media and other modes of communication 09:10 to run nationwide propaganda campaigns. 09:13 The clearest example we have of that 09:15 in recent history, of course, is the way that the Nazis 09:18 or the Stalinists managed to distort reality 09:22 through a deliberate campaign of misinformation. 09:26 So we know that people will do this. 09:29 We know that deep fakes can be very dangerous. 09:32 In recent years, 09:34 we've already seen deep fake artists 09:35 put words into famous politicians' mouths, 09:38 creating videos where they appear 09:40 to be saying something they never did. 09:42 And now we find ourselves standing on the front edge 09:45 of a brand new problem. 09:47 If somebody doesn't like who I am 09:49 or what I say, and trust me, 09:52 there's always somebody who doesn't like what I say, 09:55 they could potentially ruin my life 09:57 by manipulating a video and putting words in my mouth. 10:01 Let's say someone wants a promotion at work 10:03 and they're afraid you might get it. 10:05 So they sit at their computer and make a video 10:07 where it sounds like you're bad-mouthing the boss.
10:10 How do you defend yourself against something like that? 10:15 You know, this is something that's already happening. 10:17 Somebody made a video making it look like President Obama 10:20 was cussing about President Trump, 10:22 calling him rude names that I could never repeat on the air. 10:26 It was a fake. 10:28 Somebody did it to Mark Zuckerberg too; 10:30 they made a video where it looked like he was saying 10:32 he planned to manipulate the whole planet using Facebook. 10:35 And if I'm perfectly straight with you, 10:37 I kind of wanted to believe that one, 10:39 because Facebook's been rubbing my fur the wrong way 10:42 for a long time. 10:44 That's what makes these things so dangerous. 10:47 If you come across a deep fake video 10:49 of someone you don't like, you're gonna want to believe it. 10:53 So of course, what in the world can you do about this? 10:57 Well, I guess the first thing 10:58 is to take everything you see with a grain of salt. 11:00 I don't want to live like a cynic, 11:02 and it would be nice to trust everybody, 11:04 but that's getting harder to do. 11:06 A world with deep fake technology is a world 11:08 where conspiracy theorists 11:10 are probably gonna have a heyday. 11:12 I mean, some of these guys 11:13 are already trying to tell us 11:14 that nobody ever landed on the moon 11:16 and the world is flat. 11:18 And now they might just have the technology to show you. 11:22 So you've got to wonder what it's gonna do to us 11:24 if we all become distrustful and apprehensive all the time. 11:29 And unfortunately this new problem 11:31 runs a whole lot deeper than some people think. 11:34 I'll be right back to tell you why. 11:37 [upbeat music] 11:38 - [Narrator] Life can throw a lot at us. 11:40 Sometimes we don't have all the answers, 11:43 but that's where the Bible comes in. 11:46 It's our guide to a more fulfilling life. 11:49 Here at the Voice of Prophecy, 11:50 we've created the Discover Bible guides 11:53 to be your guide to the Bible. 11:54 They're designed to be simple, easy to use, 11:57 and provide answers to many of life's toughest questions. 12:00 And they're absolutely free. 12:02 So jump online now, 12:03 or give us a call and start your journey of discovery. 12:08 - Unfortunately, we now find ourselves 12:10 living in a world where it's getting harder and harder 12:12 to figure out if some things are actually real. 12:16 Well, we have a lot of manufactured reality 12:18 in the 21st century, 12:20 and the sheer volume of information 12:21 pushed in your direction every single day, 12:24 and the speed at which you are asked 12:26 to absorb that information, make it impossible 12:29 to carefully evaluate everything you hear and see. 12:33 I don't know if you've ever noticed this, 12:34 but when you watch too much TV, 12:35 binge on it, 12:37 it kind of leaves your brain in a bit of a fog. 12:39 You plop yourself down on the couch 12:41 and you watch a TV show over and over for a whole afternoon. 12:44 And then when you're finished 12:46 and you try to do something intelligent, like read a book, 12:49 you find it really hard to concentrate. 12:52 Now, this isn't exactly hard science, 12:54 but some people think that the reason this happens, 12:57 at least in part, is that all that information from the TV 13:00 was being fed into your mind so quickly 13:03 that you actually turned off your capacity 13:05 for careful judgment. 13:06 You just shut down your mental filters 13:08 and let all the information come in. 13:11 Now I'm not a neuroscientist.
13:14 And that's just my way of describing the process. 13:17 But it's an undeniable fact that when you're presented 13:20 with way too much information far too quickly, 13:23 you do have a tendency to shut down. 13:25 You just absorb stuff without thinking about it. 13:28 So here we are, living in a time 13:31 when we have more information 13:32 than any other generation that's ever lived on this planet. 13:35 Over the last couple of years, 13:37 human beings have added 2.5 quintillion bytes of data 13:41 to the internet every 24 hours. 13:44 Just on Instagram, 13:45 there are 95 million new photos every single day. 13:49 It's a lot of information. 13:52 And while you certainly don't even see a fraction 13:54 of that personally, 13:55 the amount of data that you are exposed to is unbelievable. 13:59 So of course, 14:01 there's no way to absorb all of that critically. 14:03 And what you do is filter the information. 14:06 You pick and choose 14:07 what you're going to expose your mind to. 14:09 But unfortunately your mind has a habit 14:11 of preferring the information 14:13 that already agrees with what you believe. 14:15 That's called confirmation bias. 14:18 And that's not a bad thing, 14:19 because it actually allows you to build a working model 14:22 for how you're gonna navigate the world. 14:25 But sometimes what happens 14:26 is that you're adding bad information to bad information, 14:30 and you're actually confirming ideas 14:31 that were never valid in the first place. 14:35 Let me see if I can illustrate. 14:37 Once upon a time, most of us believed 14:39 that the earth was the center of the solar system. 14:41 And we had really bright people telling us this was true. 14:44 People like Ptolemy, the great North African astronomer. 14:48 And once we accepted the basic premise, 14:50 it suddenly looked like all the data supported this idea. 14:54 The Ptolemaic model of the solar system 14:56 was actually pretty useful for making predictions. 14:58 It worked. 15:00 So we kept on adding to a body of knowledge 15:02 that at its core was wrong. 15:05 It's kind of the same with the flat earth crowd. 15:08 Talking to a flat earther 15:09 can be a really frustrating experience, 15:12 because once they've come to the conclusion 15:14 that the earth is flat, 15:16 they start to see evidence absolutely everywhere. 15:18 They have confirmation bias. 15:20 And over time they collect piles and piles of evidence 15:23 to support an idea that was never true. 15:26 You see, the human brain simply can't handle everything. 15:29 So it accepts the information that appears useful 15:32 and the information that confirms what it already believes, 15:35 and it pretty much ignores everything else. 15:40 So once you absorb one lie, one convincing falsehood, 15:44 it becomes very easy 15:45 to make a mental fortress out of that idea. 15:48 You keep collecting evidence 15:49 that what you believe is true, 15:51 and you use that evidence to put up walls 15:54 to protect your idea. 15:56 Now, again, that's not all bad, 15:57 because you have to pick and choose the information 16:00 that you're going to keep. 16:01 And you really don't wanna start from scratch 16:03 with every new experience. 16:05 If you had to analyze absolutely everything all the time, 16:09 it would paralyze you. 16:10 You'd never get anything done. 16:12 So you want your brain to do this.
16:14 You want it to build shortcuts, 16:17 but at the same time, in a world 16:18 where people are deliberately fabricating stuff, 16:21 it becomes more and more important 16:23 to know what you're going to believe 16:26 and why you wanna believe it. 16:28 We have no choice but to examine our assumptions, 16:31 and we have to do it more often, 16:33 because we're living in a sea of very bad information. 16:39 So let's say you watch a video on YouTube 16:42 and it's telling you that the government 16:43 has ordered a hundred million caskets 16:46 because it's gonna kill off a hundred million people. 16:49 And then somebody actually shows you these caskets. 16:52 Now you've gotta ask yourself a very important question. 16:55 How likely is this to be true? 16:58 I mean, would anybody have a reason 16:59 to make a story like that up? 17:01 Is it possible they're doing this 17:03 to build a YouTube following? 17:06 And then after that, 17:07 you've got to ask yourself 17:08 another really important question. 17:10 Even if this is true, so what? 17:12 I mean, let's suppose for a moment 17:14 the earth really is flat. 17:16 Now, I don't believe that. 17:17 So don't run around quoting me, 17:18 but just for the sake of argument, let's pretend. 17:21 So what? 17:22 What difference does it make? 17:24 Are you planning to go on a space voyage? 17:26 Why does this information matter? 17:29 It doesn't. 17:31 Now please don't write me letters. 17:32 I'm not interested in the flat earth theory. 17:34 I reject it. 17:36 And I digress. 17:38 Human beings tragically have a long record of being liars, 17:42 and now we have the digital tools 17:44 to make our lies seem, well, very compelling. 17:48 So how in the world are you supposed to navigate this? 17:51 I'll be right back 17:52 to take a stab at answering that. 17:55 [upbeat music] 17:56 - [Narrator] Here at the Voice of Prophecy, 17:58 we're committed to creating top-quality programming 17:59 for the whole family. 18:01 Like our audio adventure series, "Discovery Mountain." 18:04 "Discovery Mountain" is a Bible-based program 18:07 for kids of all ages and backgrounds. 18:09 Your family will enjoy the faith-building stories 18:12 from this small mountain summer camp and town. 18:15 With 24 seasonal episodes every year 18:17 and fresh content every week, 18:19 there's always a new adventure just on the horizon. 18:26 - There is no way for a human being 18:29 to process absolutely every bit of information 18:31 that pours into your brain. 18:35 So we have to build models for how we think the world works, 18:39 because you can't know everything. 18:41 And the models we build require 18:44 that we deliberately ignore a lot of information. 18:47 You've got no choice. 18:48 You have to pick and choose. 18:50 So the question is, how do you build the model? 18:53 What information are you going to choose 18:55 to help you understand the world? 18:59 You know, there's an old story 19:00 about the way they train bank tellers 19:01 to spot a counterfeit bill. 19:04 There are so many different kinds 19:05 of counterfeits in circulation 19:07 that you can't study them all. 19:09 So what they do is give bank tellers a real bank note, 19:13 and they tell them to study it: turn it over in your hands, 19:16 remember how it looks, how it feels, 19:18 think about how it smells. 19:20 And one day when you get handed a fake, 19:22 you're just going to know it. 19:25 It's kind of like the uncanny valley.
19:28 You might not be able to define exactly what's wrong, 19:31 but you'll know something is. 19:35 So how do you tell if something is true? 19:37 You spend a lot of time studying the truth. 19:41 Now, unfortunately, we are part of a generation 19:43 that says there is no truth, 19:45 there is no objective reality. 19:48 But historically we need to understand 19:49 that's a brand new concept, 19:51 and it flies in the face of thousands of years 19:54 of diligent human observation. 19:57 There is a real world out there, 19:59 and there are things you can count on. 20:01 You can tell yourself, for example, 20:03 all you want that gravity is an illusion, 20:06 but the next time you fall off a ladder, 20:08 I promise you will know that gravity is real. 20:11 There is an objective world, 20:14 and you can use your senses to discover it and study it. 20:18 But even then, the sharpest senses do have some limits, 20:22 and the way your brain works, unfortunately, 20:24 your senses will sometimes deceive you. 20:26 Eyes and ears do play tricks. 20:30 So how do you evaluate the stuff you perceive 20:33 with your senses? 20:35 What you need is an outside source of authority. 20:39 You need a way to step outside your own perspective 20:41 and see the situation from somewhere else. 20:44 To accurately measure the universe, 20:46 you have to compare it to something, 20:47 something you know for sure. 20:50 And here's where an ancient book like the Bible 20:54 suddenly has some really useful things to say, 20:57 because it's terribly honest 20:58 about the limits of human perception. 21:01 Over the last 100 years or so, 21:03 we've come to realize that we have some real limitations 21:05 on our ability to assess the universe. 21:08 But the people who wrote this book 21:10 have been saying that for thousands of years. 21:13 Just listen to this from the book of Proverbs: 21:17 "There is a way that seems right to a man, 21:19 but its end is the way of death." 21:23 So let me ask you this. 21:25 How many times have you been absolutely convinced 21:28 of something, only to find out later 21:30 that you were wrong because you didn't have all the data? 21:34 I mean, I can be honest enough to admit 21:35 that I've done it many times. 21:37 I have forged ahead with arrogant confidence, 21:40 knowing I was absolutely right, 21:44 only to be humiliated 21:45 when the gaps in my personal understanding 21:47 suddenly showed up. 21:49 I mean, how many times have I said, "I don't need a map. 21:53 I can find this place without one," 21:55 only to have to get a map 21:56 when it turns out I was completely wrong? 22:00 So here's what you have in this book. 22:03 You have a man who claims to be the Son of God. 22:06 "Have a look at me," he said. 22:07 "If you have seen me, 22:09 you have seen the Father." 22:11 And when Jesus finally stood trial 22:13 in front of the religious authorities of his day, 22:15 because they thought of him as a political threat, 22:18 they asked him to explain what he believed. 22:22 And here's what he said: 22:23 "Jesus answered him, 22:25 'I spoke openly to the world. 22:27 I always taught in synagogues and in the temple, 22:29 where the Jews always meet, 22:30 and in secret I have said nothing. 22:33 Why do you ask me? 22:34 Ask those who have heard me what I said to them. 22:36 Indeed, they know what I said.' 22:38 And when he had said these things, 22:39 one of the officers who stood by 22:42 struck Jesus with the palm of his hand, saying, 22:44 'Do you answer the high priest like that?'
22:47 Jesus answered him, 22:49 'If I have spoken evil, bear witness of the evil; 22:51 but if well, why do you strike me?'" 22:55 On another occasion the same man 22:57 stood in front of the Roman governor and said, 22:59 "Everyone who is of the truth 23:01 hears my voice." 23:03 And Pontius Pilate responded 23:04 with maybe the most important question ever asked: 23:08 "What is truth?" 23:11 Now, I understand a lot of people have trouble believing 23:13 that real truth actually exists. 23:16 You and I grew up in a generation that was taught to say 23:19 we should talk about truthiness instead of truth, 23:21 because, well, we all have our own idea of what truth is. 23:24 And we believe now 23:26 that we get to make up our own sets of facts. 23:29 So when Jesus says, 23:31 "I am the way, the truth, and the life," 23:33 it leaves us just a little bit skeptical. 23:36 But at the very least, you do owe it to yourself 23:38 to take an honest look. 23:40 Go get a copy of this ancient book and read it for yourself. 23:45 Forget what all the religious people say about this, 23:46 because, well, I've gone around the block enough times 23:49 to know that some of those religious folks you see on TV, 23:52 they're also in the business of creating deep fakes. 23:55 They really are. 23:57 So go to the source, 23:58 read what the people 24:00 who were actually there 2000 years ago said, 24:03 and see what you find. 24:05 Because I'm telling you, there is truth, 24:07 and there is something out there you can count on for sure. 24:10 Something you can believe. 24:12 And when you find that, you will suddenly have the ability 24:15 to detect a fake, 24:17 because the lies will leave you in the uncanny valley. 24:21 I mean, what if Jesus really did live the perfect life? 24:25 If that's true, 24:27 it seems to me that this would be a really good place 24:30 to find a little bit of objectivity. 24:32 And honestly, you've got nothing to lose just by looking. 24:37 I'll be right back after this. 24:40 - [Narrator] Dragons, beasts, cryptic statues. 24:45 Bible prophecy can be incredibly vivid and confusing. 24:49 If you've ever read Daniel or Revelation 24:52 and come away scratching your head, you're not alone. 24:55 Our free Focus on Prophecy guides 24:57 are designed to help you unlock the mysteries 24:59 of the Bible and deepen your understanding 25:01 of God's plan for you and our world. 25:04 Study online or request them by mail 25:06 and start bringing prophecy into focus today. 25:12 [upbeat music] 25:13 - [Narrator] Are you searching for answers 25:14 to life's toughest questions? 25:16 Like, where is God when we suffer? 25:18 Can I find real happiness? 25:20 Or is there any hope for our chaotic world? 25:23 The Discover Bible guides 25:25 will help you find the answers you're looking for. 25:27 Visit us at biblestudies.com 25:30 or give us a call at 888-456-7933 25:35 for your free Discover Bible guides. 25:37 Study online on our secure website 25:40 or have the free guides mailed right to your home. 25:43 There is never a cost or obligation. 25:46 The Discover Bible guides are our free gift to you. 25:49 Find answers in guides like 25:50 Does My Life Really Matter to God? 25:52 and A Second Chance at Life. 25:55 You'll find answers to the things that matter most to you 25:57 in each of the 26 Discover Bible guides. 26:00 Visit biblestudies.com and begin your journey today 26:04 to discover answers to life's deepest questions.
26:13 - Look, for thousands of years, 26:15 the human race had no problem 26:17 believing in objective reality, 26:18 that there was something noble and real out there. 26:23 And for the most part, 26:25 most of us knew what that reality was. 26:28 But now we live in a world full of many strange ideas, 26:31 and it's hard to tell what's real, what's authentic. 26:35 Our natural reaction is to throw up our hands 26:36 and just give up. 26:38 But at some point, it's going to be important 26:41 to know the real from the fake. 26:43 One of the authors of the Bible wrote 26:45 that Satan transforms himself into an angel of light. 26:49 So before this is all over, 26:51 the lies are gonna get very convincing. 26:54 But if you marinate yourself in the truth, 26:58 there's always going to be something, 26:59 some little tell from the uncanny valley, 27:02 that reveals the lie for what it is. 27:05 The time has come to get yourself a real bank note. 27:08 Study this day and night, 27:10 so that the deep fakes will be really obvious. 27:14 I'm Shawn Boonstra. Thanks for joining me. 27:17 This has been "Authentic." 27:18 [upbeat music]
Revised 2021-09-08